2026-03-09T16:04:51.463 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a 2026-03-09T16:04:51.472 DEBUG:teuthology.report:Pushing job info to http://localhost:8080 2026-03-09T16:04:51.520 INFO:teuthology.run:Config: archive_path: /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/542 branch: squid description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/yes kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{reef} 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/no 3-inline/yes 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}} email: null first_in_suite: false flavor: default job_id: '542' last_in_suite: false machine_type: vps meta: - desc: 'setup ceph/reef ' name: kyr-2026-03-09_11:23:05-orch-squid-none-default-vps no_nested_subset: false os_type: centos os_version: 9.stream overrides: admin_socket: branch: squid ansible.cephlab: branch: main skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs vars: timezone: UTC ceph: cluster-conf: mgr: client mount timeout: 30 debug client: 20 debug mgr: 20 debug ms: 1 mon warn on pool no app: false conf: client: client mount timeout: 600 debug client: 20 debug ms: 1 rados mon op timeout: 900 rados osd op timeout: 900 global: mon pg warn min per osd: 0 mds: debug mds: 20 debug mds balancer: 20 debug ms: 1 mds debug frag: true mds debug scatterstat: true mds op complaint time: 180 mds verify scatter: true osd op complaint time: 180 rados mon op timeout: 900 rados osd op timeout: 900 mgr: debug mgr: 20 debug ms: 1 mon: debug mon: 20 debug ms: 1 debug paxos: 20 mon down mkfs grace: 300 mon op complaint time: 120 osd: bdev async discard: true bdev enable discard: true bluestore allocator: bitmap bluestore block size: 96636764160 bluestore fsck on mount: true debug bluefs: 1/20 debug bluestore: 1/20 debug ms: 1 debug osd: 20 debug rocksdb: 4/10 mon osd backfillfull_ratio: 0.85 mon osd full ratio: 0.9 mon osd nearfull ratio: 0.8 osd failsafe full ratio: 0.95 osd mclock iops capacity threshold hdd: 49000 osd objectstore: bluestore osd op complaint time: 180 flavor: default fs: xfs log-ignorelist: - \(MDS_ALL_DOWN\) - \(MDS_UP_LESS_THAN_MAX\) - FS_DEGRADED - filesystem is degraded - FS_INLINE_DATA_DEPRECATED - FS_WITH_FAILED_MDS - MDS_ALL_DOWN - filesystem is offline - is offline because no MDS - MDS_DAMAGE - MDS_DEGRADED - MDS_FAILED - MDS_INSUFFICIENT_STANDBY - MDS_UP_LESS_THAN_MAX - online, but wants - filesystem is online with fewer MDS than max_mds - POOL_APP_NOT_ENABLED - do not have an application enabled - overall HEALTH_ - Replacing daemon - deprecated feature inline_data - MGR_MODULE_ERROR - OSD_DOWN - osds down - overall HEALTH_ - \(OSD_DOWN\) - \(OSD_ - but it is still running - is not responding - MON_DOWN - PG_AVAILABILITY - PG_DEGRADED - Reduced data availability - Degraded data redundancy - pg .* is stuck inactive - pg .* is .*degraded - pg .* is stuck peering sha1: e911bdebe5c8faa3800735d1568fcdca65db60df ceph-deploy: bluestore: true conf: client: log file: /var/log/ceph/ceph-$name.$pid.log mon: {} osd: bdev async discard: true bdev enable discard: true bluestore block size: 96636764160 bluestore fsck on mount: true debug bluefs: 1/20 debug bluestore: 1/20 debug rocksdb: 4/10 mon osd backfillfull_ratio: 0.85 mon osd full ratio: 0.9 mon osd nearfull ratio: 0.8 osd failsafe 
full ratio: 0.95 osd objectstore: bluestore fs: xfs install: ceph: flavor: default sha1: e911bdebe5c8faa3800735d1568fcdca65db60df extra_system_packages: deb: - python3-xmltodict - python3-jmespath rpm: - bzip2 - perl-Test-Harness - python3-xmltodict - python3-jmespath kclient: syntax: v1 selinux: allowlist: - scontext=system_u:system_r:logrotate_t:s0 - scontext=system_u:system_r:getty_t:s0 thrashosds: bdev_inject_crash: 2 bdev_inject_crash_probability: 0.5 workunit: branch: tt-squid sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589 owner: kyr priority: 1000 repo: https://github.com/ceph/ceph.git roles: - - host.a - client.0 - osd.0 - osd.1 - osd.2 - - host.b - client.1 - osd.3 - osd.4 - osd.5 seed: 3443 sha1: e911bdebe5c8faa3800735d1568fcdca65db60df sleep_before_teardown: 0 subset: 1/64 suite: orch suite_branch: tt-squid suite_path: /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa suite_relpath: qa suite_repo: https://github.com/kshtsk/ceph.git suite_sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589 targets: vm03.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOIZEBnfGxvj390nA0VMx0tqzlYmYniA+x2hbaS4sKZfaa0kPjHYQFAn/jvyh3g/XGN7Eo9SToucDP2twAe+3Sc= vm05.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP/Eb7hVD9vGpiSA4NlO/5cAKZlDWB9nnLdUlhZF5ziZ71R0gdmZ9YVpK538QkFMXvZwFJbWWaLeQbxyGK4U214= tasks: - install: branch: reef exclude_packages: - ceph-volume - print: '**** done install task...' - cephadm: compiled_cephadm_branch: reef conf: osd: osd_class_default_list: '*' osd_class_load_list: '*' image: quay.ceph.io/ceph-ci/ceph:reef roleless: true - print: '**** done end installing reef cephadm ...' - cephadm.shell: host.a: - ceph config set mgr mgr/cephadm/use_repo_digest true --force - print: '**** done cephadm.shell ceph config set mgr...' - cephadm.shell: host.a: - ceph orch status - ceph orch ps - ceph orch ls - ceph orch host ls - ceph orch device ls - cephadm.shell: host.a: - ceph fs volume create cephfs --placement=4 - ceph fs dump - cephadm.shell: host.a: - ceph fs set cephfs max_mds 1 - cephadm.shell: host.a: - ceph fs set cephfs allow_standby_replay false - cephadm.shell: host.a: - ceph fs set cephfs inline_data true --yes-i-really-really-mean-it - cephadm.shell: host.a: - ceph fs dump - ceph --format=json fs dump | jq -e ".filesystems | length == 1" - while ! ceph --format=json mds versions | jq -e ". 
| add == 4"; do sleep 1; done - fs.pre_upgrade_save: null - ceph-fuse: null - print: '**** done client' - parallel: - upgrade-tasks - workload-tasks - cephadm.shell: host.a: - ceph fs dump - fs.post_upgrade_checks: null teuthology: fragments_dropped: [] meta: {} postmerge: - "local kernel = py_attrgetter(yaml).get('kernel')\nif kernel ~= nil then\n local\ \ branch = py_attrgetter(kernel).get('branch')\n if branch and not kernel.branch:find\ \ \"-all$\" then\n log.debug(\"removing default kernel specification: %s\"\ , kernel)\n py_attrgetter(kernel).pop('branch', nil)\n py_attrgetter(kernel).pop('deb',\ \ nil)\n py_attrgetter(kernel).pop('flavor', nil)\n py_attrgetter(kernel).pop('kdb',\ \ nil)\n py_attrgetter(kernel).pop('koji', nil)\n py_attrgetter(kernel).pop('koji_task',\ \ nil)\n py_attrgetter(kernel).pop('rpm', nil)\n py_attrgetter(kernel).pop('sha1',\ \ nil)\n py_attrgetter(kernel).pop('tag', nil)\n end\nend\n" variables: fail_fs: true teuthology_branch: clyso-debian-13 teuthology_repo: https://github.com/clyso/teuthology teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444 timestamp: 2026-03-09_11:23:05 tube: vps upgrade-tasks: sequential: - cephadm.shell: env: - sha1 host.a: - ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force - ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force - ceph config set global log_to_journald false --force - ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1 --daemon-types mgr - while ceph orch upgrade status | jq '.in_progress' | grep true && ! ceph orch upgrade status | jq '.message' | grep Error ; do ceph orch ps ; ceph versions ; ceph orch upgrade status ; sleep 30 ; done - ceph versions | jq -e '.mgr | length == 1' - ceph versions | jq -e '.mgr | keys' | grep $sha1 - ceph versions | jq -e '.overall | length == 2' - ceph orch upgrade check quay.ceph.io/ceph-ci/ceph:$sha1 | jq -e '.up_to_date | length == 2' - ceph orch ps - cephadm.shell: env: - sha1 host.a: - ceph config set mgr mgr/orchestrator/fail_fs true - cephadm.shell: env: - sha1 host.a: - ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force - ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force - ceph config set global log_to_journald false --force - ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1 - cephadm.shell: env: - sha1 host.a: - while ceph orch upgrade status | jq '.in_progress' | grep true && ! ceph orch upgrade status | jq '.message' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done - ceph orch ps - ceph orch upgrade status - ceph health detail - ceph versions - echo "wait for servicemap items w/ changing names to refresh" - sleep 60 - ceph orch ps - ceph versions - ceph versions | jq -e '.overall | length == 1' - ceph versions | jq -e '.overall | keys' | grep $sha1 user: kyr verbose: false worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473 workload-tasks: sequential: - workunit: clients: all: - suites/fsstress.sh 2026-03-09T16:04:51.520 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa; will attempt to use it 2026-03-09T16:04:51.520 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks 2026-03-09T16:04:51.520 INFO:teuthology.run_tasks:Running task internal.check_packages... 
2026-03-09T16:04:51.521 INFO:teuthology.task.internal:Checking packages... 2026-03-09T16:04:51.521 INFO:teuthology.task.internal:Checking packages for os_type 'centos', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df' 2026-03-09T16:04:51.521 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch 2026-03-09T16:04:51.521 INFO:teuthology.packaging:ref: None 2026-03-09T16:04:51.521 INFO:teuthology.packaging:tag: None 2026-03-09T16:04:51.521 INFO:teuthology.packaging:branch: squid 2026-03-09T16:04:51.521 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:04:51.521 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=squid 2026-03-09T16:04:52.262 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678.ge911bdeb 2026-03-09T16:04:52.263 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep... 2026-03-09T16:04:52.263 INFO:teuthology.task.internal:no buildpackages task found 2026-03-09T16:04:52.263 INFO:teuthology.run_tasks:Running task internal.save_config... 2026-03-09T16:04:52.264 INFO:teuthology.task.internal:Saving configuration 2026-03-09T16:04:52.273 INFO:teuthology.run_tasks:Running task internal.check_lock... 2026-03-09T16:04:52.273 INFO:teuthology.task.internal.check_lock:Checking locks... 2026-03-09T16:04:52.280 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm03.local', 'description': '/archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/542', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-09 16:03:19.640931', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:03', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOIZEBnfGxvj390nA0VMx0tqzlYmYniA+x2hbaS4sKZfaa0kPjHYQFAn/jvyh3g/XGN7Eo9SToucDP2twAe+3Sc='} 2026-03-09T16:04:52.286 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm05.local', 'description': '/archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/542', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-09 16:03:19.640550', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:05', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP/Eb7hVD9vGpiSA4NlO/5cAKZlDWB9nnLdUlhZF5ziZ71R0gdmZ9YVpK538QkFMXvZwFJbWWaLeQbxyGK4U214='} 2026-03-09T16:04:52.286 INFO:teuthology.run_tasks:Running task internal.add_remotes... 
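The check_packages task above resolves the squid branch to ready builds by querying shaman at the logged URL. A minimal sketch of that lookup is below, with the query parameters taken verbatim from the log; the response field names used (`sha1`, `url`) are assumptions about shaman's JSON payload, not something this log confirms.

```python
#!/usr/bin/env python3
"""Minimal sketch of the shaman search logged above (parameters copied from the URL).
Response field names below are assumed, not verified against the actual payload."""
import requests

SHAMAN = "https://shaman.ceph.com/api/search"
params = {
    "status": "ready",
    "project": "ceph",
    "flavor": "default",
    "distros": "centos/9/x86_64",
    "ref": "squid",
}

resp = requests.get(SHAMAN, params=params, timeout=30)
resp.raise_for_status()
builds = resp.json()  # expected: a list of ready build/repo records

print(f"{len(builds)} ready build(s) for ref=squid")
for build in builds:
    # Field names are assumptions; adjust to the real shaman schema.
    print(build.get("sha1"), build.get("url"))
```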
2026-03-09T16:04:52.287 INFO:teuthology.task.internal:roles: ubuntu@vm03.local - ['host.a', 'client.0', 'osd.0', 'osd.1', 'osd.2'] 2026-03-09T16:04:52.287 INFO:teuthology.task.internal:roles: ubuntu@vm05.local - ['host.b', 'client.1', 'osd.3', 'osd.4', 'osd.5'] 2026-03-09T16:04:52.287 INFO:teuthology.run_tasks:Running task console_log... 2026-03-09T16:04:52.293 DEBUG:teuthology.task.console_log:vm03 does not support IPMI; excluding 2026-03-09T16:04:52.300 DEBUG:teuthology.task.console_log:vm05 does not support IPMI; excluding 2026-03-09T16:04:52.300 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f001f372170>, signals=[15]) 2026-03-09T16:04:52.300 INFO:teuthology.run_tasks:Running task internal.connect... 2026-03-09T16:04:52.301 INFO:teuthology.task.internal:Opening connections... 2026-03-09T16:04:52.301 DEBUG:teuthology.task.internal:connecting to ubuntu@vm03.local 2026-03-09T16:04:52.301 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm03.local', 'username': 'ubuntu', 'timeout': 60} 2026-03-09T16:04:52.364 DEBUG:teuthology.task.internal:connecting to ubuntu@vm05.local 2026-03-09T16:04:52.365 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm05.local', 'username': 'ubuntu', 'timeout': 60} 2026-03-09T16:04:52.426 INFO:teuthology.run_tasks:Running task internal.push_inventory... 2026-03-09T16:04:52.427 DEBUG:teuthology.orchestra.run.vm03:> uname -m 2026-03-09T16:04:52.481 INFO:teuthology.orchestra.run.vm03.stdout:x86_64 2026-03-09T16:04:52.481 DEBUG:teuthology.orchestra.run.vm03:> cat /etc/os-release 2026-03-09T16:04:52.537 INFO:teuthology.orchestra.run.vm03.stdout:NAME="CentOS Stream" 2026-03-09T16:04:52.537 INFO:teuthology.orchestra.run.vm03.stdout:VERSION="9" 2026-03-09T16:04:52.537 INFO:teuthology.orchestra.run.vm03.stdout:ID="centos" 2026-03-09T16:04:52.537 INFO:teuthology.orchestra.run.vm03.stdout:ID_LIKE="rhel fedora" 2026-03-09T16:04:52.537 INFO:teuthology.orchestra.run.vm03.stdout:VERSION_ID="9" 2026-03-09T16:04:52.537 INFO:teuthology.orchestra.run.vm03.stdout:PLATFORM_ID="platform:el9" 2026-03-09T16:04:52.537 INFO:teuthology.orchestra.run.vm03.stdout:PRETTY_NAME="CentOS Stream 9" 2026-03-09T16:04:52.537 INFO:teuthology.orchestra.run.vm03.stdout:ANSI_COLOR="0;31" 2026-03-09T16:04:52.537 INFO:teuthology.orchestra.run.vm03.stdout:LOGO="fedora-logo-icon" 2026-03-09T16:04:52.537 INFO:teuthology.orchestra.run.vm03.stdout:CPE_NAME="cpe:/o:centos:centos:9" 2026-03-09T16:04:52.537 INFO:teuthology.orchestra.run.vm03.stdout:HOME_URL="https://centos.org/" 2026-03-09T16:04:52.537 INFO:teuthology.orchestra.run.vm03.stdout:BUG_REPORT_URL="https://issues.redhat.com/" 2026-03-09T16:04:52.537 INFO:teuthology.orchestra.run.vm03.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9" 2026-03-09T16:04:52.537 INFO:teuthology.orchestra.run.vm03.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream" 2026-03-09T16:04:52.537 INFO:teuthology.lock.ops:Updating vm03.local on lock server 2026-03-09T16:04:52.542 DEBUG:teuthology.orchestra.run.vm05:> uname -m 2026-03-09T16:04:52.559 INFO:teuthology.orchestra.run.vm05.stdout:x86_64 2026-03-09T16:04:52.559 DEBUG:teuthology.orchestra.run.vm05:> cat /etc/os-release 2026-03-09T16:04:52.616 INFO:teuthology.orchestra.run.vm05.stdout:NAME="CentOS Stream" 2026-03-09T16:04:52.616 INFO:teuthology.orchestra.run.vm05.stdout:VERSION="9" 2026-03-09T16:04:52.616 INFO:teuthology.orchestra.run.vm05.stdout:ID="centos" 2026-03-09T16:04:52.616 INFO:teuthology.orchestra.run.vm05.stdout:ID_LIKE="rhel fedora" 
2026-03-09T16:04:52.616 INFO:teuthology.orchestra.run.vm05.stdout:VERSION_ID="9" 2026-03-09T16:04:52.616 INFO:teuthology.orchestra.run.vm05.stdout:PLATFORM_ID="platform:el9" 2026-03-09T16:04:52.616 INFO:teuthology.orchestra.run.vm05.stdout:PRETTY_NAME="CentOS Stream 9" 2026-03-09T16:04:52.616 INFO:teuthology.orchestra.run.vm05.stdout:ANSI_COLOR="0;31" 2026-03-09T16:04:52.616 INFO:teuthology.orchestra.run.vm05.stdout:LOGO="fedora-logo-icon" 2026-03-09T16:04:52.616 INFO:teuthology.orchestra.run.vm05.stdout:CPE_NAME="cpe:/o:centos:centos:9" 2026-03-09T16:04:52.616 INFO:teuthology.orchestra.run.vm05.stdout:HOME_URL="https://centos.org/" 2026-03-09T16:04:52.616 INFO:teuthology.orchestra.run.vm05.stdout:BUG_REPORT_URL="https://issues.redhat.com/" 2026-03-09T16:04:52.616 INFO:teuthology.orchestra.run.vm05.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9" 2026-03-09T16:04:52.616 INFO:teuthology.orchestra.run.vm05.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream" 2026-03-09T16:04:52.616 INFO:teuthology.lock.ops:Updating vm05.local on lock server 2026-03-09T16:04:52.621 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles... 2026-03-09T16:04:52.623 INFO:teuthology.run_tasks:Running task internal.check_conflict... 2026-03-09T16:04:52.624 INFO:teuthology.task.internal:Checking for old test directory... 2026-03-09T16:04:52.624 DEBUG:teuthology.orchestra.run.vm03:> test '!' -e /home/ubuntu/cephtest 2026-03-09T16:04:52.625 DEBUG:teuthology.orchestra.run.vm05:> test '!' -e /home/ubuntu/cephtest 2026-03-09T16:04:52.672 INFO:teuthology.run_tasks:Running task internal.check_ceph_data... 2026-03-09T16:04:52.673 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph... 2026-03-09T16:04:52.673 DEBUG:teuthology.orchestra.run.vm03:> test -z $(ls -A /var/lib/ceph) 2026-03-09T16:04:52.680 DEBUG:teuthology.orchestra.run.vm05:> test -z $(ls -A /var/lib/ceph) 2026-03-09T16:04:52.692 INFO:teuthology.orchestra.run.vm03.stderr:ls: cannot access '/var/lib/ceph': No such file or directory 2026-03-09T16:04:52.730 INFO:teuthology.orchestra.run.vm05.stderr:ls: cannot access '/var/lib/ceph': No such file or directory 2026-03-09T16:04:52.730 INFO:teuthology.run_tasks:Running task internal.vm_setup... 2026-03-09T16:04:52.738 DEBUG:teuthology.orchestra.run.vm03:> test -e /ceph-qa-ready 2026-03-09T16:04:52.755 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T16:04:52.964 DEBUG:teuthology.orchestra.run.vm05:> test -e /ceph-qa-ready 2026-03-09T16:04:52.979 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T16:04:53.173 INFO:teuthology.run_tasks:Running task internal.base... 2026-03-09T16:04:53.174 INFO:teuthology.task.internal:Creating test directory... 2026-03-09T16:04:53.174 DEBUG:teuthology.orchestra.run.vm03:> mkdir -p -m0755 -- /home/ubuntu/cephtest 2026-03-09T16:04:53.176 DEBUG:teuthology.orchestra.run.vm05:> mkdir -p -m0755 -- /home/ubuntu/cephtest 2026-03-09T16:04:53.193 INFO:teuthology.run_tasks:Running task internal.archive_upload... 2026-03-09T16:04:53.194 INFO:teuthology.run_tasks:Running task internal.archive... 2026-03-09T16:04:53.196 INFO:teuthology.task.internal:Creating archive directory... 2026-03-09T16:04:53.196 DEBUG:teuthology.orchestra.run.vm03:> install -d -m0755 -- /home/ubuntu/cephtest/archive 2026-03-09T16:04:53.235 DEBUG:teuthology.orchestra.run.vm05:> install -d -m0755 -- /home/ubuntu/cephtest/archive 2026-03-09T16:04:53.255 INFO:teuthology.run_tasks:Running task internal.coredump... 
2026-03-09T16:04:53.256 INFO:teuthology.task.internal:Enabling coredump saving... 2026-03-09T16:04:53.256 DEBUG:teuthology.orchestra.run.vm03:> test -f /run/.containerenv -o -f /.dockerenv 2026-03-09T16:04:53.308 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T16:04:53.308 DEBUG:teuthology.orchestra.run.vm05:> test -f /run/.containerenv -o -f /.dockerenv 2026-03-09T16:04:53.323 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T16:04:53.324 DEBUG:teuthology.orchestra.run.vm03:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf 2026-03-09T16:04:53.351 DEBUG:teuthology.orchestra.run.vm05:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf 2026-03-09T16:04:53.378 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-09T16:04:53.388 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-09T16:04:53.390 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-09T16:04:53.400 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-09T16:04:53.401 INFO:teuthology.run_tasks:Running task internal.sudo... 2026-03-09T16:04:53.405 INFO:teuthology.task.internal:Configuring sudo... 2026-03-09T16:04:53.405 DEBUG:teuthology.orchestra.run.vm03:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers 2026-03-09T16:04:53.433 DEBUG:teuthology.orchestra.run.vm05:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers 2026-03-09T16:04:53.469 INFO:teuthology.run_tasks:Running task internal.syslog... 2026-03-09T16:04:53.471 INFO:teuthology.task.internal.syslog:Starting syslog monitoring... 
2026-03-09T16:04:53.471 DEBUG:teuthology.orchestra.run.vm03:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog 2026-03-09T16:04:53.500 DEBUG:teuthology.orchestra.run.vm05:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog 2026-03-09T16:04:53.528 DEBUG:teuthology.orchestra.run.vm03:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log 2026-03-09T16:04:53.577 DEBUG:teuthology.orchestra.run.vm03:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log 2026-03-09T16:04:53.636 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:04:53.636 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf 2026-03-09T16:04:53.698 DEBUG:teuthology.orchestra.run.vm05:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log 2026-03-09T16:04:53.724 DEBUG:teuthology.orchestra.run.vm05:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log 2026-03-09T16:04:53.782 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-09T16:04:53.782 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf 2026-03-09T16:04:53.844 DEBUG:teuthology.orchestra.run.vm03:> sudo service rsyslog restart 2026-03-09T16:04:53.846 DEBUG:teuthology.orchestra.run.vm05:> sudo service rsyslog restart 2026-03-09T16:04:53.875 INFO:teuthology.orchestra.run.vm03.stderr:Redirecting to /bin/systemctl restart rsyslog.service 2026-03-09T16:04:53.915 INFO:teuthology.orchestra.run.vm05.stderr:Redirecting to /bin/systemctl restart rsyslog.service 2026-03-09T16:04:54.302 INFO:teuthology.run_tasks:Running task internal.timer... 2026-03-09T16:04:54.304 INFO:teuthology.task.internal:Starting timer... 2026-03-09T16:04:54.304 INFO:teuthology.run_tasks:Running task pcp... 2026-03-09T16:04:54.306 INFO:teuthology.run_tasks:Running task selinux... 2026-03-09T16:04:54.308 DEBUG:teuthology.task:Applying overrides for task selinux: {'allowlist': ['scontext=system_u:system_r:logrotate_t:s0', 'scontext=system_u:system_r:getty_t:s0']} 2026-03-09T16:04:54.308 INFO:teuthology.task.selinux:Excluding vm03: VMs are not yet supported 2026-03-09T16:04:54.308 INFO:teuthology.task.selinux:Excluding vm05: VMs are not yet supported 2026-03-09T16:04:54.308 DEBUG:teuthology.task.selinux:Getting current SELinux state 2026-03-09T16:04:54.308 DEBUG:teuthology.task.selinux:Existing SELinux modes: {} 2026-03-09T16:04:54.308 INFO:teuthology.task.selinux:Putting SELinux into permissive mode 2026-03-09T16:04:54.308 INFO:teuthology.run_tasks:Running task ansible.cephlab... 
2026-03-09T16:04:54.310 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}} 2026-03-09T16:04:54.310 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/ceph/ceph-cm-ansible.git 2026-03-09T16:04:54.311 INFO:teuthology.repo_utils:Fetching github.com_ceph_ceph-cm-ansible_main from origin 2026-03-09T16:04:55.075 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main 2026-03-09T16:04:55.081 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}] 2026-03-09T16:04:55.081 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventoryzzudscfb --limit vm03.local,vm05.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs 2026-03-09T16:06:58.212 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm03.local'), Remote(name='ubuntu@vm05.local')] 2026-03-09T16:06:58.213 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm03.local' 2026-03-09T16:06:58.213 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm03.local', 'username': 'ubuntu', 'timeout': 60} 2026-03-09T16:06:58.282 DEBUG:teuthology.orchestra.run.vm03:> true 2026-03-09T16:06:58.353 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm03.local' 2026-03-09T16:06:58.353 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm05.local' 2026-03-09T16:06:58.353 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm05.local', 'username': 'ubuntu', 'timeout': 60} 2026-03-09T16:06:58.414 DEBUG:teuthology.orchestra.run.vm05:> true 2026-03-09T16:06:58.491 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm05.local' 2026-03-09T16:06:58.491 INFO:teuthology.run_tasks:Running task clock... 2026-03-09T16:06:58.493 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew... 
2026-03-09T16:06:58.493 INFO:teuthology.orchestra.run:Running command with timeout 360 2026-03-09T16:06:58.494 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true 2026-03-09T16:06:58.495 INFO:teuthology.orchestra.run:Running command with timeout 360 2026-03-09T16:06:58.495 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true 2026-03-09T16:06:58.535 INFO:teuthology.orchestra.run.vm03.stderr:Failed to stop ntp.service: Unit ntp.service not loaded. 2026-03-09T16:06:58.555 INFO:teuthology.orchestra.run.vm03.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded. 2026-03-09T16:06:58.570 INFO:teuthology.orchestra.run.vm05.stderr:Failed to stop ntp.service: Unit ntp.service not loaded. 2026-03-09T16:06:58.585 INFO:teuthology.orchestra.run.vm03.stderr:sudo: ntpd: command not found 2026-03-09T16:06:58.588 INFO:teuthology.orchestra.run.vm05.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded. 2026-03-09T16:06:58.600 INFO:teuthology.orchestra.run.vm03.stdout:506 Cannot talk to daemon 2026-03-09T16:06:58.620 INFO:teuthology.orchestra.run.vm05.stderr:sudo: ntpd: command not found 2026-03-09T16:06:58.623 INFO:teuthology.orchestra.run.vm03.stderr:Failed to start ntp.service: Unit ntp.service not found. 2026-03-09T16:06:58.636 INFO:teuthology.orchestra.run.vm05.stdout:506 Cannot talk to daemon 2026-03-09T16:06:58.642 INFO:teuthology.orchestra.run.vm03.stderr:Failed to start ntpd.service: Unit ntpd.service not found. 2026-03-09T16:06:58.658 INFO:teuthology.orchestra.run.vm05.stderr:Failed to start ntp.service: Unit ntp.service not found. 2026-03-09T16:06:58.676 INFO:teuthology.orchestra.run.vm05.stderr:Failed to start ntpd.service: Unit ntpd.service not found. 2026-03-09T16:06:58.699 INFO:teuthology.orchestra.run.vm03.stderr:bash: line 1: ntpq: command not found 2026-03-09T16:06:58.731 INFO:teuthology.orchestra.run.vm05.stderr:bash: line 1: ntpq: command not found 2026-03-09T16:06:58.819 INFO:teuthology.orchestra.run.vm03.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample 2026-03-09T16:06:58.819 INFO:teuthology.orchestra.run.vm03.stdout:=============================================================================== 2026-03-09T16:06:58.819 INFO:teuthology.orchestra.run.vm03.stdout:^? formularfetischisten.de 0 6 0 - +0ns[ +0ns] +/- 0ns 2026-03-09T16:06:58.819 INFO:teuthology.orchestra.run.vm03.stdout:^? sv1.ggsrv.de 0 6 0 - +0ns[ +0ns] +/- 0ns 2026-03-09T16:06:58.819 INFO:teuthology.orchestra.run.vm03.stdout:^? 217.145.111.106 0 6 0 - +0ns[ +0ns] +/- 0ns 2026-03-09T16:06:58.819 INFO:teuthology.orchestra.run.vm03.stdout:^? 
mx03.fischl-online.de 0 6 0 - +0ns[ +0ns] +/- 0ns 2026-03-09T16:06:58.820 INFO:teuthology.orchestra.run.vm05.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample 2026-03-09T16:06:58.820 INFO:teuthology.orchestra.run.vm05.stdout:=============================================================================== 2026-03-09T16:06:58.820 INFO:teuthology.orchestra.run.vm05.stdout:^? formularfetischisten.de 0 6 0 - +0ns[ +0ns] +/- 0ns 2026-03-09T16:06:58.820 INFO:teuthology.orchestra.run.vm05.stdout:^? sv1.ggsrv.de 0 6 0 - +0ns[ +0ns] +/- 0ns 2026-03-09T16:06:58.820 INFO:teuthology.orchestra.run.vm05.stdout:^? 217.145.111.106 0 6 0 - +0ns[ +0ns] +/- 0ns 2026-03-09T16:06:58.820 INFO:teuthology.orchestra.run.vm05.stdout:^? mx03.fischl-online.de 0 6 0 - +0ns[ +0ns] +/- 0ns 2026-03-09T16:06:58.820 INFO:teuthology.run_tasks:Running task install... 2026-03-09T16:06:58.822 DEBUG:teuthology.task.install:project ceph 2026-03-09T16:06:58.822 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}} 2026-03-09T16:06:58.822 DEBUG:teuthology.task.install:config {'branch': 'reef', 'exclude_packages': ['ceph-volume'], 'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}} 2026-03-09T16:06:58.822 INFO:teuthology.task.install:Using flavor: default 2026-03-09T16:06:58.824 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']} 2026-03-09T16:06:58.825 INFO:teuthology.task.install:extra packages: [] 2026-03-09T16:06:58.825 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': 'reef', 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': None, 'wait_for_package': False} 2026-03-09T16:06:58.825 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch 2026-03-09T16:06:58.825 INFO:teuthology.packaging:ref: None 2026-03-09T16:06:58.825 INFO:teuthology.packaging:tag: None 2026-03-09T16:06:58.825 INFO:teuthology.packaging:branch: reef 2026-03-09T16:06:58.825 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:06:58.825 DEBUG:teuthology.packaging:Querying 
https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=reef 2026-03-09T16:06:58.826 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': 'reef', 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': None, 'wait_for_package': False} 2026-03-09T16:06:58.826 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch 2026-03-09T16:06:58.826 INFO:teuthology.packaging:ref: None 2026-03-09T16:06:58.826 INFO:teuthology.packaging:tag: None 2026-03-09T16:06:58.826 INFO:teuthology.packaging:branch: reef 2026-03-09T16:06:58.826 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:06:58.826 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=reef 2026-03-09T16:06:59.579 INFO:teuthology.task.install.rpm:Pulling from https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/ 2026-03-09T16:06:59.579 INFO:teuthology.task.install.rpm:Package version is 18.2.7-1055.gab47f43c 2026-03-09T16:06:59.601 INFO:teuthology.task.install.rpm:Pulling from https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/ 2026-03-09T16:06:59.601 INFO:teuthology.task.install.rpm:Package version is 18.2.7-1055.gab47f43c 2026-03-09T16:07:00.088 INFO:teuthology.packaging:Writing yum repo: [ceph] name=ceph packages for $basearch baseurl=https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/$basearch enabled=1 gpgcheck=0 type=rpm-md [ceph-noarch] name=ceph noarch packages baseurl=https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/noarch enabled=1 gpgcheck=0 type=rpm-md [ceph-source] name=ceph source packages baseurl=https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/SRPMS enabled=1 gpgcheck=0 type=rpm-md 2026-03-09T16:07:00.088 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:07:00.088 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/yum.repos.d/ceph.repo 2026-03-09T16:07:00.104 INFO:teuthology.packaging:Writing yum repo: [ceph] name=ceph packages for $basearch baseurl=https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/$basearch enabled=1 gpgcheck=0 type=rpm-md [ceph-noarch] name=ceph noarch packages baseurl=https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/noarch enabled=1 gpgcheck=0 type=rpm-md [ceph-source] name=ceph source packages baseurl=https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/SRPMS enabled=1 gpgcheck=0 type=rpm-md 2026-03-09T16:07:00.104 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-09T16:07:00.104 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/yum.repos.d/ceph.repo 2026-03-09T16:07:00.130 
INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64 2026-03-09T16:07:00.130 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch 2026-03-09T16:07:00.130 INFO:teuthology.packaging:ref: None 2026-03-09T16:07:00.130 INFO:teuthology.packaging:tag: None 2026-03-09T16:07:00.130 INFO:teuthology.packaging:branch: reef 2026-03-09T16:07:00.131 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:07:00.131 DEBUG:teuthology.orchestra.run.vm03:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/reef/;g' /etc/yum.repos.d/ceph.repo ; fi 2026-03-09T16:07:00.141 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64 2026-03-09T16:07:00.141 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch 2026-03-09T16:07:00.141 INFO:teuthology.packaging:ref: None 2026-03-09T16:07:00.141 INFO:teuthology.packaging:tag: None 2026-03-09T16:07:00.141 INFO:teuthology.packaging:branch: reef 2026-03-09T16:07:00.141 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:07:00.141 DEBUG:teuthology.orchestra.run.vm05:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/reef/;g' /etc/yum.repos.d/ceph.repo ; fi 2026-03-09T16:07:00.208 DEBUG:teuthology.orchestra.run.vm03:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig 2026-03-09T16:07:00.220 DEBUG:teuthology.orchestra.run.vm05:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig 2026-03-09T16:07:00.304 DEBUG:teuthology.orchestra.run.vm03:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf 2026-03-09T16:07:00.309 DEBUG:teuthology.orchestra.run.vm05:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf 2026-03-09T16:07:00.338 INFO:teuthology.orchestra.run.vm03.stdout:check_obsoletes = 1 2026-03-09T16:07:00.339 
INFO:teuthology.orchestra.run.vm05.stdout:check_obsoletes = 1 2026-03-09T16:07:00.340 DEBUG:teuthology.orchestra.run.vm05:> sudo yum clean all 2026-03-09T16:07:00.341 DEBUG:teuthology.orchestra.run.vm03:> sudo yum clean all 2026-03-09T16:07:00.561 INFO:teuthology.orchestra.run.vm03.stdout:41 files removed 2026-03-09T16:07:00.577 INFO:teuthology.orchestra.run.vm05.stdout:41 files removed 2026-03-09T16:07:00.611 DEBUG:teuthology.orchestra.run.vm03:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath 2026-03-09T16:07:00.626 DEBUG:teuthology.orchestra.run.vm05:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath 2026-03-09T16:07:02.035 INFO:teuthology.orchestra.run.vm03.stdout:ceph packages for x86_64 64 kB/s | 77 kB 00:01 2026-03-09T16:07:02.043 INFO:teuthology.orchestra.run.vm05.stdout:ceph packages for x86_64 64 kB/s | 77 kB 00:01 2026-03-09T16:07:03.006 INFO:teuthology.orchestra.run.vm03.stdout:ceph noarch packages 12 kB/s | 11 kB 00:00 2026-03-09T16:07:03.039 INFO:teuthology.orchestra.run.vm05.stdout:ceph noarch packages 12 kB/s | 11 kB 00:00 2026-03-09T16:07:03.999 INFO:teuthology.orchestra.run.vm03.stdout:ceph source packages 1.9 kB/s | 1.9 kB 00:00 2026-03-09T16:07:04.032 INFO:teuthology.orchestra.run.vm05.stdout:ceph source packages 1.9 kB/s | 1.9 kB 00:00 2026-03-09T16:07:05.045 INFO:teuthology.orchestra.run.vm03.stdout:CentOS Stream 9 - BaseOS 8.7 MB/s | 8.9 MB 00:01 2026-03-09T16:07:05.438 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - BaseOS 6.4 MB/s | 8.9 MB 00:01 2026-03-09T16:07:06.763 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - AppStream 58 MB/s | 27 MB 00:00 2026-03-09T16:07:11.149 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - CRB 8.5 MB/s | 8.0 MB 00:00 2026-03-09T16:07:13.216 INFO:teuthology.orchestra.run.vm03.stdout:CentOS Stream 9 - AppStream 3.7 MB/s | 27 MB 00:07 2026-03-09T16:07:13.664 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - Extras packages 13 kB/s | 20 kB 00:01 2026-03-09T16:07:14.585 INFO:teuthology.orchestra.run.vm05.stdout:Extra Packages for Enterprise Linux 24 MB/s | 20 MB 00:00 2026-03-09T16:07:19.154 INFO:teuthology.orchestra.run.vm05.stdout:lab-extras 63 kB/s | 50 kB 00:00 2026-03-09T16:07:20.525 INFO:teuthology.orchestra.run.vm05.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-09T16:07:20.525 INFO:teuthology.orchestra.run.vm05.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-09T16:07:20.529 INFO:teuthology.orchestra.run.vm05.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed. 2026-03-09T16:07:20.529 INFO:teuthology.orchestra.run.vm05.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed. 2026-03-09T16:07:20.555 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 
2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout:======================================================================================= 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout:======================================================================================= 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout:Installing: 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: ceph x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 6.5 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 5.1 M 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 850 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 143 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 1.5 M 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 140 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 3.5 M 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 7.4 M 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 49 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 7.8 M 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 36 M 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: cephadm noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 226 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 31 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 710 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: librados-devel x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 126 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 162 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 322 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 302 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 100 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 87 k 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 3.0 M 2026-03-09T16:07:20.559 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 172 k 
2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout:Upgrading: 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: librados2 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 3.3 M 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: librbd1 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 3.0 M 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout:Installing dependencies: 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 18 M 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 24 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mds x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 2.1 M 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 248 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 4.7 M 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: ceph-osd x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 17 M 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 17 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 25 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 166 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 475 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k 2026-03-09T16:07:20.560 
INFO:teuthology.orchestra.run.vm05.stdout: librgw2 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 4.5 M 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 45 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 130 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k 2026-03-09T16:07:20.560 INFO:teuthology.orchestra.run.vm05.stdout: 
python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k 
2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout:======================================================================================= 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout:Install 115 Packages 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout:Upgrade 2 Packages 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout:Total download size: 181 M 2026-03-09T16:07:20.561 INFO:teuthology.orchestra.run.vm05.stdout:Downloading Packages: 2026-03-09T16:07:22.025 INFO:teuthology.orchestra.run.vm05.stdout:(1/117): ceph-18.2.7-1055.gab47f43c.el9.x86_64. 9.5 kB/s | 6.5 kB 00:00 2026-03-09T16:07:22.730 INFO:teuthology.orchestra.run.vm05.stdout:(2/117): ceph-fuse-18.2.7-1055.gab47f43c.el9.x8 1.2 MB/s | 850 kB 00:00 2026-03-09T16:07:22.870 INFO:teuthology.orchestra.run.vm05.stdout:(3/117): ceph-immutable-object-cache-18.2.7-105 1.0 MB/s | 143 kB 00:00 2026-03-09T16:07:23.227 INFO:teuthology.orchestra.run.vm05.stdout:(4/117): ceph-base-18.2.7-1055.gab47f43c.el9.x8 2.7 MB/s | 5.1 MB 00:01 2026-03-09T16:07:23.480 INFO:teuthology.orchestra.run.vm05.stdout:(5/117): ceph-mds-18.2.7-1055.gab47f43c.el9.x86 3.4 MB/s | 2.1 MB 00:00 2026-03-09T16:07:23.599 INFO:teuthology.orchestra.run.vm05.stdout:(6/117): ceph-mgr-18.2.7-1055.gab47f43c.el9.x86 3.9 MB/s | 1.5 MB 00:00 2026-03-09T16:07:24.131 INFO:teuthology.orchestra.run.vm05.stdout:(7/117): ceph-mon-18.2.7-1055.gab47f43c.el9.x86 7.2 MB/s | 4.7 MB 00:00 2026-03-09T16:07:24.847 INFO:teuthology.orchestra.run.vm03.stdout:CentOS Stream 9 - CRB 909 kB/s | 8.0 MB 00:08 2026-03-09T16:07:25.185 INFO:teuthology.orchestra.run.vm05.stdout:(8/117): ceph-common-18.2.7-1055.gab47f43c.el9. 4.8 MB/s | 18 MB 00:03 2026-03-09T16:07:25.540 INFO:teuthology.orchestra.run.vm05.stdout:(9/117): ceph-radosgw-18.2.7-1055.gab47f43c.el9 5.5 MB/s | 7.8 MB 00:01 2026-03-09T16:07:25.598 INFO:teuthology.orchestra.run.vm05.stdout:(10/117): ceph-selinux-18.2.7-1055.gab47f43c.el 61 kB/s | 25 kB 00:00 2026-03-09T16:07:25.750 INFO:teuthology.orchestra.run.vm05.stdout:(11/117): ceph-osd-18.2.7-1055.gab47f43c.el9.x8 7.7 MB/s | 17 MB 00:02 2026-03-09T16:07:25.871 INFO:teuthology.orchestra.run.vm05.stdout:(12/117): libcephfs-devel-18.2.7-1055.gab47f43c 114 kB/s | 31 kB 00:00 2026-03-09T16:07:25.914 INFO:teuthology.orchestra.run.vm05.stdout:(13/117): libcephfs2-18.2.7-1055.gab47f43c.el9. 4.2 MB/s | 710 kB 00:00 2026-03-09T16:07:26.007 INFO:teuthology.orchestra.run.vm05.stdout:(14/117): libcephsqlite-18.2.7-1055.gab47f43c.e 1.2 MB/s | 166 kB 00:00 2026-03-09T16:07:26.046 INFO:teuthology.orchestra.run.vm05.stdout:(15/117): librados-devel-18.2.7-1055.gab47f43c. 
961 kB/s | 126 kB 00:00 2026-03-09T16:07:26.249 INFO:teuthology.orchestra.run.vm05.stdout:(16/117): libradosstriper1-18.2.7-1055.gab47f43 1.9 MB/s | 475 kB 00:00 2026-03-09T16:07:26.281 INFO:teuthology.orchestra.run.vm03.stdout:CentOS Stream 9 - Extras packages 33 kB/s | 20 kB 00:00 2026-03-09T16:07:26.473 INFO:teuthology.orchestra.run.vm05.stdout:(17/117): python3-ceph-argparse-18.2.7-1055.gab 201 kB/s | 45 kB 00:00 2026-03-09T16:07:26.665 INFO:teuthology.orchestra.run.vm05.stdout:(18/117): python3-ceph-common-18.2.7-1055.gab47 678 kB/s | 130 kB 00:00 2026-03-09T16:07:26.790 INFO:teuthology.orchestra.run.vm05.stdout:(19/117): python3-cephfs-18.2.7-1055.gab47f43c. 1.3 MB/s | 162 kB 00:00 2026-03-09T16:07:26.835 INFO:teuthology.orchestra.run.vm03.stdout:Extra Packages for Enterprise Linux 43 MB/s | 20 MB 00:00 2026-03-09T16:07:26.915 INFO:teuthology.orchestra.run.vm05.stdout:(20/117): python3-rados-18.2.7-1055.gab47f43c.e 2.5 MB/s | 322 kB 00:00 2026-03-09T16:07:27.041 INFO:teuthology.orchestra.run.vm05.stdout:(21/117): python3-rbd-18.2.7-1055.gab47f43c.el9 2.3 MB/s | 302 kB 00:00 2026-03-09T16:07:27.055 INFO:teuthology.orchestra.run.vm05.stdout:(22/117): librgw2-18.2.7-1055.gab47f43c.el9.x86 4.5 MB/s | 4.5 MB 00:01 2026-03-09T16:07:27.163 INFO:teuthology.orchestra.run.vm05.stdout:(23/117): python3-rgw-18.2.7-1055.gab47f43c.el9 817 kB/s | 100 kB 00:00 2026-03-09T16:07:27.174 INFO:teuthology.orchestra.run.vm05.stdout:(24/117): rbd-fuse-18.2.7-1055.gab47f43c.el9.x8 733 kB/s | 87 kB 00:00 2026-03-09T16:07:27.550 INFO:teuthology.orchestra.run.vm05.stdout:(25/117): rbd-nbd-18.2.7-1055.gab47f43c.el9.x86 457 kB/s | 172 kB 00:00 2026-03-09T16:07:27.683 INFO:teuthology.orchestra.run.vm05.stdout:(26/117): ceph-grafana-dashboards-18.2.7-1055.g 182 kB/s | 24 kB 00:00 2026-03-09T16:07:27.795 INFO:teuthology.orchestra.run.vm05.stdout:(27/117): rbd-mirror-18.2.7-1055.gab47f43c.el9. 
4.7 MB/s | 3.0 MB 00:00 2026-03-09T16:07:27.804 INFO:teuthology.orchestra.run.vm05.stdout:(28/117): ceph-mgr-cephadm-18.2.7-1055.gab47f43 1.1 MB/s | 140 kB 00:00 2026-03-09T16:07:28.408 INFO:teuthology.orchestra.run.vm05.stdout:(29/117): ceph-mgr-dashboard-18.2.7-1055.gab47f 5.8 MB/s | 3.5 MB 00:00 2026-03-09T16:07:28.539 INFO:teuthology.orchestra.run.vm05.stdout:(30/117): ceph-mgr-modules-core-18.2.7-1055.gab 1.9 MB/s | 248 kB 00:00 2026-03-09T16:07:28.660 INFO:teuthology.orchestra.run.vm05.stdout:(31/117): ceph-mgr-rook-18.2.7-1055.gab47f43c.e 405 kB/s | 49 kB 00:00 2026-03-09T16:07:28.781 INFO:teuthology.orchestra.run.vm05.stdout:(32/117): ceph-prometheus-alerts-18.2.7-1055.ga 139 kB/s | 17 kB 00:00 2026-03-09T16:07:28.995 INFO:teuthology.orchestra.run.vm05.stdout:(33/117): cephadm-18.2.7-1055.gab47f43c.el9.noa 1.0 MB/s | 226 kB 00:00 2026-03-09T16:07:29.083 INFO:teuthology.orchestra.run.vm05.stdout:(34/117): ledmon-libs-1.1.0-3.el9.x86_64.rpm 464 kB/s | 40 kB 00:00 2026-03-09T16:07:29.142 INFO:teuthology.orchestra.run.vm05.stdout:(35/117): libconfig-1.7.2-9.el9.x86_64.rpm 1.2 MB/s | 72 kB 00:00 2026-03-09T16:07:29.235 INFO:teuthology.orchestra.run.vm05.stdout:(36/117): libgfortran-11.5.0-14.el9.x86_64.rpm 8.4 MB/s | 794 kB 00:00 2026-03-09T16:07:29.266 INFO:teuthology.orchestra.run.vm05.stdout:(37/117): libquadmath-11.5.0-14.el9.x86_64.rpm 5.8 MB/s | 184 kB 00:00 2026-03-09T16:07:29.296 INFO:teuthology.orchestra.run.vm05.stdout:(38/117): mailcap-2.1.49-5.el9.noarch.rpm 1.1 MB/s | 33 kB 00:00 2026-03-09T16:07:29.328 INFO:teuthology.orchestra.run.vm05.stdout:(39/117): python3-cffi-1.14.5-5.el9.x86_64.rpm 7.7 MB/s | 253 kB 00:00 2026-03-09T16:07:29.355 INFO:teuthology.orchestra.run.vm05.stdout:(40/117): ceph-mgr-diskprediction-local-18.2.7- 4.8 MB/s | 7.4 MB 00:01 2026-03-09T16:07:29.419 INFO:teuthology.orchestra.run.vm05.stdout:(41/117): python3-cryptography-36.0.1-5.el9.x86 14 MB/s | 1.2 MB 00:00 2026-03-09T16:07:29.450 INFO:teuthology.orchestra.run.vm05.stdout:(42/117): python3-pycparser-2.20-6.el9.noarch.r 4.3 MB/s | 135 kB 00:00 2026-03-09T16:07:29.481 INFO:teuthology.orchestra.run.vm05.stdout:(43/117): python3-requests-2.25.1-10.el9.noarch 4.0 MB/s | 126 kB 00:00 2026-03-09T16:07:29.498 INFO:teuthology.orchestra.run.vm05.stdout:(44/117): python3-ply-3.11-14.el9.noarch.rpm 744 kB/s | 106 kB 00:00 2026-03-09T16:07:29.513 INFO:teuthology.orchestra.run.vm05.stdout:(45/117): python3-urllib3-1.26.5-7.el9.noarch.r 6.7 MB/s | 218 kB 00:00 2026-03-09T16:07:29.635 INFO:teuthology.orchestra.run.vm05.stdout:(46/117): flexiblas-3.0.4-9.el9.x86_64.rpm 243 kB/s | 30 kB 00:00 2026-03-09T16:07:29.694 INFO:teuthology.orchestra.run.vm05.stdout:(47/117): boost-program-options-1.75.0-13.el9.x 532 kB/s | 104 kB 00:00 2026-03-09T16:07:29.724 INFO:teuthology.orchestra.run.vm05.stdout:(48/117): flexiblas-openblas-openmp-3.0.4-9.el9 501 kB/s | 15 kB 00:00 2026-03-09T16:07:29.784 INFO:teuthology.orchestra.run.vm05.stdout:(49/117): libpmemobj-1.12.1-1.el9.x86_64.rpm 2.6 MB/s | 160 kB 00:00 2026-03-09T16:07:29.815 INFO:teuthology.orchestra.run.vm05.stdout:(50/117): librabbitmq-0.11.0-7.el9.x86_64.rpm 1.4 MB/s | 45 kB 00:00 2026-03-09T16:07:29.869 INFO:teuthology.orchestra.run.vm05.stdout:(51/117): flexiblas-netlib-3.0.4-9.el9.x86_64.r 13 MB/s | 3.0 MB 00:00 2026-03-09T16:07:29.877 INFO:teuthology.orchestra.run.vm05.stdout:(52/117): librdkafka-1.6.1-102.el9.x86_64.rpm 11 MB/s | 662 kB 00:00 2026-03-09T16:07:29.901 INFO:teuthology.orchestra.run.vm05.stdout:(53/117): libstoragemgmt-1.10.1-1.el9.x86_64.rp 7.6 MB/s 
| 246 kB 00:00 2026-03-09T16:07:29.908 INFO:teuthology.orchestra.run.vm05.stdout:(54/117): libxslt-1.1.34-12.el9.x86_64.rpm 7.4 MB/s | 233 kB 00:00 2026-03-09T16:07:29.934 INFO:teuthology.orchestra.run.vm05.stdout:(55/117): lttng-ust-2.12.0-6.el9.x86_64.rpm 9.0 MB/s | 292 kB 00:00 2026-03-09T16:07:29.937 INFO:teuthology.orchestra.run.vm05.stdout:(56/117): openblas-0.3.29-1.el9.x86_64.rpm 1.4 MB/s | 42 kB 00:00 2026-03-09T16:07:30.123 INFO:teuthology.orchestra.run.vm05.stdout:(57/117): openblas-openmp-0.3.29-1.el9.x86_64.r 28 MB/s | 5.3 MB 00:00 2026-03-09T16:07:30.158 INFO:teuthology.orchestra.run.vm05.stdout:(58/117): python3-devel-3.9.25-3.el9.x86_64.rpm 7.3 MB/s | 244 kB 00:00 2026-03-09T16:07:30.191 INFO:teuthology.orchestra.run.vm05.stdout:(59/117): python3-babel-2.9.1-2.el9.noarch.rpm 24 MB/s | 6.0 MB 00:00 2026-03-09T16:07:30.195 INFO:teuthology.orchestra.run.vm05.stdout:(60/117): python3-jinja2-2.11.3-8.el9.noarch.rp 6.3 MB/s | 249 kB 00:00 2026-03-09T16:07:30.221 INFO:teuthology.orchestra.run.vm05.stdout:(61/117): python3-jmespath-1.0.1-1.el9.noarch.r 1.6 MB/s | 48 kB 00:00 2026-03-09T16:07:30.226 INFO:teuthology.orchestra.run.vm05.stdout:(62/117): python3-libstoragemgmt-1.10.1-1.el9.x 5.6 MB/s | 177 kB 00:00 2026-03-09T16:07:30.253 INFO:teuthology.orchestra.run.vm05.stdout:(63/117): python3-mako-1.1.4-6.el9.noarch.rpm 5.4 MB/s | 172 kB 00:00 2026-03-09T16:07:30.255 INFO:teuthology.orchestra.run.vm05.stdout:(64/117): python3-markupsafe-1.1.1-12.el9.x86_6 1.2 MB/s | 35 kB 00:00 2026-03-09T16:07:30.294 INFO:teuthology.orchestra.run.vm05.stdout:(65/117): python3-numpy-f2py-1.23.5-2.el9.x86_6 11 MB/s | 442 kB 00:00 2026-03-09T16:07:30.326 INFO:teuthology.orchestra.run.vm05.stdout:(66/117): python3-pyasn1-0.4.8-7.el9.noarch.rpm 5.0 MB/s | 157 kB 00:00 2026-03-09T16:07:30.358 INFO:teuthology.orchestra.run.vm05.stdout:(67/117): python3-pyasn1-modules-0.4.8-7.el9.no 8.5 MB/s | 277 kB 00:00 2026-03-09T16:07:30.387 INFO:teuthology.orchestra.run.vm05.stdout:(68/117): python3-requests-oauthlib-1.3.0-12.el 1.8 MB/s | 54 kB 00:00 2026-03-09T16:07:30.576 INFO:teuthology.orchestra.run.vm05.stdout:(69/117): python3-numpy-1.23.5-2.el9.x86_64.rpm 19 MB/s | 6.1 MB 00:00 2026-03-09T16:07:30.607 INFO:teuthology.orchestra.run.vm05.stdout:(70/117): python3-toml-0.10.2-6.el9.noarch.rpm 1.3 MB/s | 42 kB 00:00 2026-03-09T16:07:30.640 INFO:teuthology.orchestra.run.vm05.stdout:(71/117): socat-1.7.4.1-8.el9.x86_64.rpm 9.2 MB/s | 303 kB 00:00 2026-03-09T16:07:30.686 INFO:teuthology.orchestra.run.vm05.stdout:(72/117): xmlstarlet-1.6.1-20.el9.x86_64.rpm 1.4 MB/s | 64 kB 00:00 2026-03-09T16:07:30.713 INFO:teuthology.orchestra.run.vm05.stdout:(73/117): fmt-8.1.1-5.el9.x86_64.rpm 4.0 MB/s | 111 kB 00:00 2026-03-09T16:07:30.744 INFO:teuthology.orchestra.run.vm05.stdout:(74/117): gperftools-libs-2.9.1-3.el9.x86_64.rp 9.6 MB/s | 308 kB 00:00 2026-03-09T16:07:30.820 INFO:teuthology.orchestra.run.vm05.stdout:(75/117): libarrow-9.0.0-15.el9.x86_64.rpm 58 MB/s | 4.4 MB 00:00 2026-03-09T16:07:30.823 INFO:teuthology.orchestra.run.vm05.stdout:(76/117): libarrow-doc-9.0.0-15.el9.noarch.rpm 9.2 MB/s | 25 kB 00:00 2026-03-09T16:07:30.826 INFO:teuthology.orchestra.run.vm05.stdout:(77/117): liboath-2.6.12-1.el9.x86_64.rpm 19 MB/s | 49 kB 00:00 2026-03-09T16:07:30.829 INFO:teuthology.orchestra.run.vm05.stdout:(78/117): libunwind-1.6.2-1.el9.x86_64.rpm 24 MB/s | 67 kB 00:00 2026-03-09T16:07:30.842 INFO:teuthology.orchestra.run.vm05.stdout:(79/117): parquet-libs-9.0.0-15.el9.x86_64.rpm 68 MB/s | 838 kB 00:00 2026-03-09T16:07:30.852 
INFO:teuthology.orchestra.run.vm05.stdout:(80/117): python3-asyncssh-2.13.2-5.el9.noarch. 55 MB/s | 548 kB 00:00 2026-03-09T16:07:30.854 INFO:teuthology.orchestra.run.vm05.stdout:(81/117): python3-autocommand-2.2.2-8.el9.noarc 12 MB/s | 29 kB 00:00 2026-03-09T16:07:30.858 INFO:teuthology.orchestra.run.vm05.stdout:(82/117): python3-backports-tarfile-1.2.0-1.el9 19 MB/s | 60 kB 00:00 2026-03-09T16:07:30.860 INFO:teuthology.orchestra.run.vm05.stdout:(83/117): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 17 MB/s | 43 kB 00:00 2026-03-09T16:07:30.863 INFO:teuthology.orchestra.run.vm05.stdout:(84/117): python3-cachetools-4.2.4-1.el9.noarch 13 MB/s | 32 kB 00:00 2026-03-09T16:07:30.866 INFO:teuthology.orchestra.run.vm05.stdout:(85/117): python3-certifi-2023.05.07-4.el9.noar 5.5 MB/s | 14 kB 00:00 2026-03-09T16:07:30.870 INFO:teuthology.orchestra.run.vm05.stdout:(86/117): python3-cheroot-10.0.1-4.el9.noarch.r 42 MB/s | 173 kB 00:00 2026-03-09T16:07:30.890 INFO:teuthology.orchestra.run.vm05.stdout:(87/117): python3-cherrypy-18.6.1-2.el9.noarch. 18 MB/s | 358 kB 00:00 2026-03-09T16:07:30.896 INFO:teuthology.orchestra.run.vm05.stdout:(88/117): python3-google-auth-2.45.0-1.el9.noar 48 MB/s | 254 kB 00:00 2026-03-09T16:07:30.899 INFO:teuthology.orchestra.run.vm05.stdout:(89/117): python3-jaraco-8.2.1-3.el9.noarch.rpm 3.8 MB/s | 11 kB 00:00 2026-03-09T16:07:30.902 INFO:teuthology.orchestra.run.vm05.stdout:(90/117): python3-jaraco-classes-3.2.1-5.el9.no 8.2 MB/s | 18 kB 00:00 2026-03-09T16:07:30.904 INFO:teuthology.orchestra.run.vm05.stdout:(91/117): python3-jaraco-collections-3.0.0-8.el 9.8 MB/s | 23 kB 00:00 2026-03-09T16:07:30.907 INFO:teuthology.orchestra.run.vm05.stdout:(92/117): python3-jaraco-context-6.0.1-3.el9.no 8.4 MB/s | 20 kB 00:00 2026-03-09T16:07:30.910 INFO:teuthology.orchestra.run.vm05.stdout:(93/117): python3-jaraco-functools-3.5.0-2.el9. 6.8 MB/s | 19 kB 00:00 2026-03-09T16:07:30.913 INFO:teuthology.orchestra.run.vm05.stdout:(94/117): python3-jaraco-text-4.0.0-2.el9.noarc 11 MB/s | 26 kB 00:00 2026-03-09T16:07:30.929 INFO:teuthology.orchestra.run.vm05.stdout:(95/117): python3-kubernetes-26.1.0-3.el9.noarc 64 MB/s | 1.0 MB 00:00 2026-03-09T16:07:30.932 INFO:teuthology.orchestra.run.vm05.stdout:(96/117): python3-logutils-0.3.5-21.el9.noarch. 
16 MB/s | 46 kB 00:00 2026-03-09T16:07:30.936 INFO:teuthology.orchestra.run.vm05.stdout:(97/117): python3-more-itertools-8.12.0-2.el9.n 26 MB/s | 79 kB 00:00 2026-03-09T16:07:30.939 INFO:teuthology.orchestra.run.vm05.stdout:(98/117): python3-natsort-7.1.1-5.el9.noarch.rp 20 MB/s | 58 kB 00:00 2026-03-09T16:07:30.945 INFO:teuthology.orchestra.run.vm05.stdout:(99/117): python3-pecan-1.4.2-3.el9.noarch.rpm 49 MB/s | 272 kB 00:00 2026-03-09T16:07:30.948 INFO:teuthology.orchestra.run.vm05.stdout:(100/117): python3-portend-3.1.0-2.el9.noarch.r 7.7 MB/s | 16 kB 00:00 2026-03-09T16:07:30.951 INFO:teuthology.orchestra.run.vm05.stdout:(101/117): python3-pyOpenSSL-21.0.0-1.el9.noarc 28 MB/s | 90 kB 00:00 2026-03-09T16:07:30.954 INFO:teuthology.orchestra.run.vm05.stdout:(102/117): python3-repoze-lru-0.7-16.el9.noarch 11 MB/s | 31 kB 00:00 2026-03-09T16:07:30.960 INFO:teuthology.orchestra.run.vm05.stdout:(103/117): python3-routes-2.5.1-5.el9.noarch.rp 33 MB/s | 188 kB 00:00 2026-03-09T16:07:30.966 INFO:teuthology.orchestra.run.vm05.stdout:(104/117): python3-rsa-4.9-2.el9.noarch.rpm 9.7 MB/s | 59 kB 00:00 2026-03-09T16:07:30.983 INFO:teuthology.orchestra.run.vm05.stdout:(105/117): python3-tempora-5.0.0-2.el9.noarch.r 2.1 MB/s | 36 kB 00:00 2026-03-09T16:07:30.987 INFO:teuthology.orchestra.run.vm05.stdout:(106/117): python3-typing-extensions-4.15.0-1.e 19 MB/s | 86 kB 00:00 2026-03-09T16:07:30.995 INFO:teuthology.orchestra.run.vm05.stdout:(107/117): python3-webob-1.8.8-2.el9.noarch.rpm 34 MB/s | 230 kB 00:00 2026-03-09T16:07:30.998 INFO:teuthology.orchestra.run.vm05.stdout:(108/117): python3-websocket-client-1.2.3-2.el9 25 MB/s | 90 kB 00:00 2026-03-09T16:07:31.008 INFO:teuthology.orchestra.run.vm05.stdout:(109/117): python3-werkzeug-2.0.3-3.el9.1.noarc 45 MB/s | 427 kB 00:00 2026-03-09T16:07:31.010 INFO:teuthology.orchestra.run.vm05.stdout:(110/117): python3-xmltodict-0.12.0-15.el9.noar 8.2 MB/s | 22 kB 00:00 2026-03-09T16:07:31.013 INFO:teuthology.orchestra.run.vm05.stdout:(111/117): python3-zc-lockfile-2.0-10.el9.noarc 8.7 MB/s | 20 kB 00:00 2026-03-09T16:07:31.020 INFO:teuthology.orchestra.run.vm05.stdout:(112/117): re2-20211101-20.el9.x86_64.rpm 30 MB/s | 191 kB 00:00 2026-03-09T16:07:31.045 INFO:teuthology.orchestra.run.vm05.stdout:(113/117): thrift-0.15.0-4.el9.x86_64.rpm 62 MB/s | 1.6 MB 00:00 2026-03-09T16:07:31.147 INFO:teuthology.orchestra.run.vm05.stdout:(114/117): python3-scipy-1.9.3-2.el9.x86_64.rpm 25 MB/s | 19 MB 00:00 2026-03-09T16:07:31.390 INFO:teuthology.orchestra.run.vm03.stdout:lab-extras 64 kB/s | 50 kB 00:00 2026-03-09T16:07:32.217 INFO:teuthology.orchestra.run.vm05.stdout:(115/117): librados2-18.2.7-1055.gab47f43c.el9. 2.8 MB/s | 3.3 MB 00:01 2026-03-09T16:07:32.248 INFO:teuthology.orchestra.run.vm05.stdout:(116/117): librbd1-18.2.7-1055.gab47f43c.el9.x8 2.7 MB/s | 3.0 MB 00:01 2026-03-09T16:07:32.747 INFO:teuthology.orchestra.run.vm03.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-09T16:07:32.748 INFO:teuthology.orchestra.run.vm03.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-09T16:07:32.753 INFO:teuthology.orchestra.run.vm03.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed. 2026-03-09T16:07:32.753 INFO:teuthology.orchestra.run.vm03.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed. 2026-03-09T16:07:32.783 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 
2026-03-09T16:07:32.787 INFO:teuthology.orchestra.run.vm03.stdout:======================================================================================= 2026-03-09T16:07:32.787 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size 2026-03-09T16:07:32.787 INFO:teuthology.orchestra.run.vm03.stdout:======================================================================================= 2026-03-09T16:07:32.787 INFO:teuthology.orchestra.run.vm03.stdout:Installing: 2026-03-09T16:07:32.787 INFO:teuthology.orchestra.run.vm03.stdout: ceph x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 6.5 k 2026-03-09T16:07:32.787 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 5.1 M 2026-03-09T16:07:32.787 INFO:teuthology.orchestra.run.vm03.stdout: ceph-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 850 k 2026-03-09T16:07:32.787 INFO:teuthology.orchestra.run.vm03.stdout: ceph-immutable-object-cache x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 143 k 2026-03-09T16:07:32.787 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 1.5 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-cephadm noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 140 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-dashboard noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 3.5 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 7.4 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-rook noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 49 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ceph-radosgw x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 7.8 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ceph-test x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 36 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: cephadm noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 226 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-devel x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 31 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs2 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 710 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: librados-devel x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 126 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: python3-cephfs x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 162 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: python3-rados x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 322 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: python3-rbd x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 302 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: python3-rgw x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 100 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: rbd-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 87 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: rbd-mirror x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 3.0 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: rbd-nbd x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 172 k 
2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout:Upgrading: 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: librados2 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 3.3 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: librbd1 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 3.0 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout:Installing dependencies: 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 18 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ceph-grafana-dashboards noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 24 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mds x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 2.1 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 248 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 4.7 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ceph-osd x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 17 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ceph-prometheus-alerts noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 17 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ceph-selinux x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 25 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: libcephsqlite x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 166 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 475 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k 2026-03-09T16:07:32.788 
INFO:teuthology.orchestra.run.vm03.stdout: librgw2 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 4.5 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k 2026-03-09T16:07:32.788 INFO:teuthology.orchestra.run.vm03.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 45 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 130 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: 
python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k 
2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout:======================================================================================= 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout:Install 115 Packages 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout:Upgrade 2 Packages 2026-03-09T16:07:32.789 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:07:32.790 INFO:teuthology.orchestra.run.vm03.stdout:Total download size: 181 M 2026-03-09T16:07:32.790 INFO:teuthology.orchestra.run.vm03.stdout:Downloading Packages: 2026-03-09T16:07:34.888 INFO:teuthology.orchestra.run.vm03.stdout:(1/117): ceph-18.2.7-1055.gab47f43c.el9.x86_64. 13 kB/s | 6.5 kB 00:00 2026-03-09T16:07:35.031 INFO:teuthology.orchestra.run.vm05.stdout:(117/117): ceph-test-18.2.7-1055.gab47f43c.el9. 3.8 MB/s | 36 MB 00:09 2026-03-09T16:07:35.034 INFO:teuthology.orchestra.run.vm05.stdout:-------------------------------------------------------------------------------- 2026-03-09T16:07:35.034 INFO:teuthology.orchestra.run.vm05.stdout:Total 12 MB/s | 181 MB 00:14 2026-03-09T16:07:35.513 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-09T16:07:35.554 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-09T16:07:35.554 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-09T16:07:35.731 INFO:teuthology.orchestra.run.vm03.stdout:(2/117): ceph-fuse-18.2.7-1055.gab47f43c.el9.x8 1.0 MB/s | 850 kB 00:00 2026-03-09T16:07:35.853 INFO:teuthology.orchestra.run.vm03.stdout:(3/117): ceph-immutable-object-cache-18.2.7-105 1.1 MB/s | 143 kB 00:00 2026-03-09T16:07:36.072 INFO:teuthology.orchestra.run.vm03.stdout:(4/117): ceph-base-18.2.7-1055.gab47f43c.el9.x8 3.1 MB/s | 5.1 MB 00:01 2026-03-09T16:07:36.233 INFO:teuthology.orchestra.run.vm03.stdout:(5/117): ceph-mds-18.2.7-1055.gab47f43c.el9.x86 5.5 MB/s | 2.1 MB 00:00 2026-03-09T16:07:36.269 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 
2026-03-09T16:07:36.269 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-09T16:07:36.325 INFO:teuthology.orchestra.run.vm03.stdout:(6/117): ceph-mgr-18.2.7-1055.gab47f43c.el9.x86 5.8 MB/s | 1.5 MB 00:00 2026-03-09T16:07:36.736 INFO:teuthology.orchestra.run.vm03.stdout:(7/117): ceph-mon-18.2.7-1055.gab47f43c.el9.x86 9.3 MB/s | 4.7 MB 00:00 2026-03-09T16:07:37.080 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-09T16:07:37.099 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 1/119 2026-03-09T16:07:37.112 INFO:teuthology.orchestra.run.vm05.stdout: Installing : thrift-0.15.0-4.el9.x86_64 2/119 2026-03-09T16:07:37.291 INFO:teuthology.orchestra.run.vm05.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/119 2026-03-09T16:07:37.292 INFO:teuthology.orchestra.run.vm05.stdout: Upgrading : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/119 2026-03-09T16:07:37.345 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/119 2026-03-09T16:07:37.346 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 5/119 2026-03-09T16:07:37.378 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 5/119 2026-03-09T16:07:37.389 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 6/119 2026-03-09T16:07:37.394 INFO:teuthology.orchestra.run.vm05.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/119 2026-03-09T16:07:37.395 INFO:teuthology.orchestra.run.vm05.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/119 2026-03-09T16:07:37.405 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/119 2026-03-09T16:07:37.406 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 10/119 2026-03-09T16:07:37.462 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 10/119 2026-03-09T16:07:37.464 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86 11/119 2026-03-09T16:07:37.515 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86 11/119 2026-03-09T16:07:37.520 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/119 2026-03-09T16:07:37.545 INFO:teuthology.orchestra.run.vm05.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/119 2026-03-09T16:07:37.555 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/119 2026-03-09T16:07:37.558 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/119 2026-03-09T16:07:37.586 INFO:teuthology.orchestra.run.vm05.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/119 2026-03-09T16:07:37.603 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/119 2026-03-09T16:07:37.608 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/119 2026-03-09T16:07:37.616 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/119 2026-03-09T16:07:37.619 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/119 2026-03-09T16:07:37.624 INFO:teuthology.orchestra.run.vm05.stdout: 
Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/119 2026-03-09T16:07:37.635 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el 22/119 2026-03-09T16:07:37.655 INFO:teuthology.orchestra.run.vm03.stdout:(8/117): ceph-common-18.2.7-1055.gab47f43c.el9. 5.7 MB/s | 18 MB 00:03 2026-03-09T16:07:37.668 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_6 23/119 2026-03-09T16:07:37.698 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/119 2026-03-09T16:07:37.721 INFO:teuthology.orchestra.run.vm03.stdout:(9/117): ceph-radosgw-18.2.7-1055.gab47f43c.el9 7.9 MB/s | 7.8 MB 00:00 2026-03-09T16:07:37.761 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/119 2026-03-09T16:07:37.776 INFO:teuthology.orchestra.run.vm03.stdout:(10/117): ceph-selinux-18.2.7-1055.gab47f43c.el 208 kB/s | 25 kB 00:00 2026-03-09T16:07:37.778 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/119 2026-03-09T16:07:37.786 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/119 2026-03-09T16:07:37.794 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/119 2026-03-09T16:07:37.799 INFO:teuthology.orchestra.run.vm05.stdout: Installing : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_6 29/119 2026-03-09T16:07:37.835 INFO:teuthology.orchestra.run.vm05.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/119 2026-03-09T16:07:37.842 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/119 2026-03-09T16:07:37.861 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/119 2026-03-09T16:07:37.887 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/119 2026-03-09T16:07:37.894 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/119 2026-03-09T16:07:37.895 INFO:teuthology.orchestra.run.vm03.stdout:(11/117): libcephfs-devel-18.2.7-1055.gab47f43c 261 kB/s | 31 kB 00:00 2026-03-09T16:07:37.901 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/119 2026-03-09T16:07:37.915 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/119 2026-03-09T16:07:37.927 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/119 2026-03-09T16:07:37.938 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/119 2026-03-09T16:07:38.001 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/119 2026-03-09T16:07:38.011 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/119 2026-03-09T16:07:38.021 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/119 2026-03-09T16:07:38.072 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/119 2026-03-09T16:07:38.483 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/119 2026-03-09T16:07:38.495 INFO:teuthology.orchestra.run.vm03.stdout:(12/117): ceph-osd-18.2.7-1055.gab47f43c.el9.x8 7.7 MB/s | 17 MB 00:02 2026-03-09T16:07:38.506 
INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/119 2026-03-09T16:07:38.512 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/119 2026-03-09T16:07:38.520 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/119 2026-03-09T16:07:38.525 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/119 2026-03-09T16:07:38.532 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/119 2026-03-09T16:07:38.536 INFO:teuthology.orchestra.run.vm05.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/119 2026-03-09T16:07:38.539 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/119 2026-03-09T16:07:38.551 INFO:teuthology.orchestra.run.vm05.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/119 2026-03-09T16:07:38.559 INFO:teuthology.orchestra.run.vm05.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/119 2026-03-09T16:07:38.564 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/119 2026-03-09T16:07:38.574 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/119 2026-03-09T16:07:38.579 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/119 2026-03-09T16:07:38.589 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/119 2026-03-09T16:07:38.594 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/119 2026-03-09T16:07:38.616 INFO:teuthology.orchestra.run.vm03.stdout:(13/117): libcephsqlite-18.2.7-1055.gab47f43c.e 1.3 MB/s | 166 kB 00:00 2026-03-09T16:07:38.637 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 58/119 2026-03-09T16:07:38.735 INFO:teuthology.orchestra.run.vm03.stdout:(14/117): librados-devel-18.2.7-1055.gab47f43c. 1.0 MB/s | 126 kB 00:00 2026-03-09T16:07:38.771 INFO:teuthology.orchestra.run.vm03.stdout:(15/117): libcephfs2-18.2.7-1055.gab47f43c.el9. 811 kB/s | 710 kB 00:00 2026-03-09T16:07:38.858 INFO:teuthology.orchestra.run.vm03.stdout:(16/117): libradosstriper1-18.2.7-1055.gab47f43 3.8 MB/s | 475 kB 00:00 2026-03-09T16:07:38.925 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/119 2026-03-09T16:07:38.958 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/119 2026-03-09T16:07:38.965 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/119 2026-03-09T16:07:38.977 INFO:teuthology.orchestra.run.vm03.stdout:(17/117): python3-ceph-argparse-18.2.7-1055.gab 379 kB/s | 45 kB 00:00 2026-03-09T16:07:39.036 INFO:teuthology.orchestra.run.vm05.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/119 2026-03-09T16:07:39.040 INFO:teuthology.orchestra.run.vm05.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/119 2026-03-09T16:07:39.068 INFO:teuthology.orchestra.run.vm05.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/119 2026-03-09T16:07:39.097 INFO:teuthology.orchestra.run.vm03.stdout:(18/117): python3-ceph-common-18.2.7-1055.gab47 1.1 MB/s | 130 kB 00:00 2026-03-09T16:07:39.217 INFO:teuthology.orchestra.run.vm03.stdout:(19/117): python3-cephfs-18.2.7-1055.gab47f43c. 
1.3 MB/s | 162 kB 00:00 2026-03-09T16:07:39.338 INFO:teuthology.orchestra.run.vm03.stdout:(20/117): python3-rados-18.2.7-1055.gab47f43c.e 2.6 MB/s | 322 kB 00:00 2026-03-09T16:07:39.465 INFO:teuthology.orchestra.run.vm03.stdout:(21/117): python3-rbd-18.2.7-1055.gab47f43c.el9 2.3 MB/s | 302 kB 00:00 2026-03-09T16:07:39.470 INFO:teuthology.orchestra.run.vm05.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/119 2026-03-09T16:07:39.557 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/119 2026-03-09T16:07:39.585 INFO:teuthology.orchestra.run.vm03.stdout:(22/117): python3-rgw-18.2.7-1055.gab47f43c.el9 837 kB/s | 100 kB 00:00 2026-03-09T16:07:39.705 INFO:teuthology.orchestra.run.vm03.stdout:(23/117): rbd-fuse-18.2.7-1055.gab47f43c.el9.x8 723 kB/s | 87 kB 00:00 2026-03-09T16:07:39.734 INFO:teuthology.orchestra.run.vm03.stdout:(24/117): librgw2-18.2.7-1055.gab47f43c.el9.x86 4.7 MB/s | 4.5 MB 00:00 2026-03-09T16:07:39.855 INFO:teuthology.orchestra.run.vm03.stdout:(25/117): rbd-nbd-18.2.7-1055.gab47f43c.el9.x86 1.4 MB/s | 172 kB 00:00 2026-03-09T16:07:39.974 INFO:teuthology.orchestra.run.vm03.stdout:(26/117): ceph-grafana-dashboards-18.2.7-1055.g 204 kB/s | 24 kB 00:00 2026-03-09T16:07:40.095 INFO:teuthology.orchestra.run.vm03.stdout:(27/117): ceph-mgr-cephadm-18.2.7-1055.gab47f43 1.1 MB/s | 140 kB 00:00 2026-03-09T16:07:40.305 INFO:teuthology.orchestra.run.vm03.stdout:(28/117): rbd-mirror-18.2.7-1055.gab47f43c.el9. 5.0 MB/s | 3.0 MB 00:00 2026-03-09T16:07:40.351 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/119 2026-03-09T16:07:40.377 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/119 2026-03-09T16:07:40.383 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/119 2026-03-09T16:07:40.388 INFO:teuthology.orchestra.run.vm05.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/119 2026-03-09T16:07:40.537 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/119 2026-03-09T16:07:40.540 INFO:teuthology.orchestra.run.vm05.stdout: Upgrading : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 72/119 2026-03-09T16:07:40.570 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 72/119 2026-03-09T16:07:40.573 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 73/119 2026-03-09T16:07:40.581 INFO:teuthology.orchestra.run.vm05.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/119 2026-03-09T16:07:40.794 INFO:teuthology.orchestra.run.vm05.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/119 2026-03-09T16:07:40.797 INFO:teuthology.orchestra.run.vm05.stdout: Installing : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 76/119 2026-03-09T16:07:40.815 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 76/119 2026-03-09T16:07:40.823 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 77/119 2026-03-09T16:07:40.839 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/119 2026-03-09T16:07:40.858 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/119 2026-03-09T16:07:40.932 INFO:teuthology.orchestra.run.vm03.stdout:(29/117): ceph-mgr-dashboard-18.2.7-1055.gab47f 4.2 MB/s | 3.5 MB 00:00 
2026-03-09T16:07:40.946 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/119 2026-03-09T16:07:40.959 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/119 2026-03-09T16:07:40.986 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/119 2026-03-09T16:07:41.021 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/119 2026-03-09T16:07:41.054 INFO:teuthology.orchestra.run.vm03.stdout:(30/117): ceph-mgr-modules-core-18.2.7-1055.gab 2.0 MB/s | 248 kB 00:00 2026-03-09T16:07:41.081 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/119 2026-03-09T16:07:41.091 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/119 2026-03-09T16:07:41.095 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 86/119 2026-03-09T16:07:41.100 INFO:teuthology.orchestra.run.vm05.stdout: Installing : mailcap-2.1.49-5.el9.noarch 87/119 2026-03-09T16:07:41.103 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 88/119 2026-03-09T16:07:41.122 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 89/119 2026-03-09T16:07:41.122 INFO:teuthology.orchestra.run.vm05.stdout:Creating group 'libstoragemgmt' with GID 994. 2026-03-09T16:07:41.122 INFO:teuthology.orchestra.run.vm05.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994. 2026-03-09T16:07:41.122 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:41.134 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 89/119 2026-03-09T16:07:41.167 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 89/119 2026-03-09T16:07:41.167 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service. 2026-03-09T16:07:41.167 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:41.174 INFO:teuthology.orchestra.run.vm03.stdout:(31/117): ceph-mgr-rook-18.2.7-1055.gab47f43c.e 412 kB/s | 49 kB 00:00 2026-03-09T16:07:41.184 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 90/119 2026-03-09T16:07:41.246 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 91/119 2026-03-09T16:07:41.250 INFO:teuthology.orchestra.run.vm05.stdout: Installing : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 91/119 2026-03-09T16:07:41.256 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.e 92/119 2026-03-09T16:07:41.288 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c. 93/119 2026-03-09T16:07:41.292 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9. 
94/119 2026-03-09T16:07:41.293 INFO:teuthology.orchestra.run.vm03.stdout:(32/117): ceph-prometheus-alerts-18.2.7-1055.ga 140 kB/s | 17 kB 00:00 2026-03-09T16:07:41.415 INFO:teuthology.orchestra.run.vm03.stdout:(33/117): cephadm-18.2.7-1055.gab47f43c.el9.noa 1.8 MB/s | 226 kB 00:00 2026-03-09T16:07:41.599 INFO:teuthology.orchestra.run.vm03.stdout:(34/117): ledmon-libs-1.1.0-3.el9.x86_64.rpm 220 kB/s | 40 kB 00:00 2026-03-09T16:07:41.707 INFO:teuthology.orchestra.run.vm03.stdout:(35/117): libconfig-1.7.2-9.el9.x86_64.rpm 666 kB/s | 72 kB 00:00 2026-03-09T16:07:41.855 INFO:teuthology.orchestra.run.vm03.stdout:(36/117): ceph-mgr-diskprediction-local-18.2.7- 4.8 MB/s | 7.4 MB 00:01 2026-03-09T16:07:42.011 INFO:teuthology.orchestra.run.vm03.stdout:(37/117): libgfortran-11.5.0-14.el9.x86_64.rpm 2.6 MB/s | 794 kB 00:00 2026-03-09T16:07:42.086 INFO:teuthology.orchestra.run.vm03.stdout:(38/117): libquadmath-11.5.0-14.el9.x86_64.rpm 800 kB/s | 184 kB 00:00 2026-03-09T16:07:42.086 INFO:teuthology.orchestra.run.vm03.stdout:(39/117): mailcap-2.1.49-5.el9.noarch.rpm 442 kB/s | 33 kB 00:00 2026-03-09T16:07:42.146 INFO:teuthology.orchestra.run.vm03.stdout:(40/117): python3-cffi-1.14.5-5.el9.x86_64.rpm 4.2 MB/s | 253 kB 00:00 2026-03-09T16:07:42.201 INFO:teuthology.orchestra.run.vm03.stdout:(41/117): python3-cryptography-36.0.1-5.el9.x86 11 MB/s | 1.2 MB 00:00 2026-03-09T16:07:42.201 INFO:teuthology.orchestra.run.vm03.stdout:(42/117): python3-ply-3.11-14.el9.noarch.rpm 1.9 MB/s | 106 kB 00:00 2026-03-09T16:07:42.231 INFO:teuthology.orchestra.run.vm03.stdout:(43/117): python3-pycparser-2.20-6.el9.noarch.r 4.4 MB/s | 135 kB 00:00 2026-03-09T16:07:42.330 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 95/119 2026-03-09T16:07:42.335 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 95/119 2026-03-09T16:07:42.351 INFO:teuthology.orchestra.run.vm03.stdout:(44/117): python3-requests-2.25.1-10.el9.noarch 848 kB/s | 126 kB 00:00 2026-03-09T16:07:42.454 INFO:teuthology.orchestra.run.vm03.stdout:(45/117): boost-program-options-1.75.0-13.el9.x 1.0 MB/s | 104 kB 00:00 2026-03-09T16:07:42.508 INFO:teuthology.orchestra.run.vm03.stdout:(46/117): flexiblas-3.0.4-9.el9.x86_64.rpm 554 kB/s | 30 kB 00:00 2026-03-09T16:07:42.523 INFO:teuthology.orchestra.run.vm03.stdout:(47/117): python3-urllib3-1.26.5-7.el9.noarch.r 747 kB/s | 218 kB 00:00 2026-03-09T16:07:42.603 INFO:teuthology.orchestra.run.vm03.stdout:(48/117): flexiblas-openblas-openmp-3.0.4-9.el9 186 kB/s | 15 kB 00:00 2026-03-09T16:07:42.644 INFO:teuthology.orchestra.run.vm03.stdout:(49/117): flexiblas-netlib-3.0.4-9.el9.x86_64.r 22 MB/s | 3.0 MB 00:00 2026-03-09T16:07:42.653 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 95/119 2026-03-09T16:07:42.699 INFO:teuthology.orchestra.run.vm03.stdout:(50/117): libpmemobj-1.12.1-1.el9.x86_64.rpm 1.6 MB/s | 160 kB 00:00 2026-03-09T16:07:42.700 INFO:teuthology.orchestra.run.vm03.stdout:(51/117): librabbitmq-0.11.0-7.el9.x86_64.rpm 808 kB/s | 45 kB 00:00 2026-03-09T16:07:42.703 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 96/119 2026-03-09T16:07:42.729 INFO:teuthology.orchestra.run.vm03.stdout:(52/117): libstoragemgmt-1.10.1-1.el9.x86_64.rp 8.5 MB/s | 246 kB 00:00 2026-03-09T16:07:42.747 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 96/119 
2026-03-09T16:07:42.748 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target. 2026-03-09T16:07:42.748 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service. 2026-03-09T16:07:42.748 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:42.750 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 97/119 2026-03-09T16:07:42.758 INFO:teuthology.orchestra.run.vm03.stdout:(53/117): libxslt-1.1.34-12.el9.x86_64.rpm 7.8 MB/s | 233 kB 00:00 2026-03-09T16:07:42.769 INFO:teuthology.orchestra.run.vm03.stdout:(54/117): librdkafka-1.6.1-102.el9.x86_64.rpm 9.3 MB/s | 662 kB 00:00 2026-03-09T16:07:42.788 INFO:teuthology.orchestra.run.vm03.stdout:(55/117): lttng-ust-2.12.0-6.el9.x86_64.rpm 9.8 MB/s | 292 kB 00:00 2026-03-09T16:07:42.795 INFO:teuthology.orchestra.run.vm03.stdout:(56/117): openblas-0.3.29-1.el9.x86_64.rpm 1.6 MB/s | 42 kB 00:00 2026-03-09T16:07:42.956 INFO:teuthology.orchestra.run.vm03.stdout:(57/117): openblas-openmp-0.3.29-1.el9.x86_64.r 32 MB/s | 5.3 MB 00:00 2026-03-09T16:07:42.972 INFO:teuthology.orchestra.run.vm03.stdout:(58/117): python3-babel-2.9.1-2.el9.noarch.rpm 34 MB/s | 6.0 MB 00:00 2026-03-09T16:07:42.991 INFO:teuthology.orchestra.run.vm03.stdout:(59/117): python3-devel-3.9.25-3.el9.x86_64.rpm 6.9 MB/s | 244 kB 00:00 2026-03-09T16:07:42.998 INFO:teuthology.orchestra.run.vm03.stdout:(60/117): python3-jinja2-2.11.3-8.el9.noarch.rp 9.5 MB/s | 249 kB 00:00 2026-03-09T16:07:43.019 INFO:teuthology.orchestra.run.vm03.stdout:(61/117): python3-libstoragemgmt-1.10.1-1.el9.x 8.0 MB/s | 177 kB 00:00 2026-03-09T16:07:43.022 INFO:teuthology.orchestra.run.vm03.stdout:(62/117): python3-jmespath-1.0.1-1.el9.noarch.r 1.5 MB/s | 48 kB 00:00 2026-03-09T16:07:43.047 INFO:teuthology.orchestra.run.vm03.stdout:(63/117): python3-mako-1.1.4-6.el9.noarch.rpm 6.2 MB/s | 172 kB 00:00 2026-03-09T16:07:43.048 INFO:teuthology.orchestra.run.vm03.stdout:(64/117): python3-markupsafe-1.1.1-12.el9.x86_6 1.3 MB/s | 35 kB 00:00 2026-03-09T16:07:43.200 INFO:teuthology.orchestra.run.vm03.stdout:(65/117): ceph-test-18.2.7-1055.gab47f43c.el9.x 6.6 MB/s | 36 MB 00:05 2026-03-09T16:07:43.201 INFO:teuthology.orchestra.run.vm03.stdout:(66/117): python3-numpy-f2py-1.23.5-2.el9.x86_6 2.8 MB/s | 442 kB 00:00 2026-03-09T16:07:43.236 INFO:teuthology.orchestra.run.vm03.stdout:(67/117): python3-pyasn1-modules-0.4.8-7.el9.no 7.9 MB/s | 277 kB 00:00 2026-03-09T16:07:43.253 INFO:teuthology.orchestra.run.vm03.stdout:(68/117): python3-numpy-1.23.5-2.el9.x86_64.rpm 30 MB/s | 6.1 MB 00:00 2026-03-09T16:07:43.268 INFO:teuthology.orchestra.run.vm03.stdout:(69/117): python3-requests-oauthlib-1.3.0-12.el 1.7 MB/s | 54 kB 00:00 2026-03-09T16:07:43.306 INFO:teuthology.orchestra.run.vm03.stdout:(70/117): python3-toml-0.10.2-6.el9.noarch.rpm 1.1 MB/s | 42 kB 00:00 2026-03-09T16:07:43.327 INFO:teuthology.orchestra.run.vm03.stdout:(71/117): python3-pyasn1-0.4.8-7.el9.noarch.rpm 1.2 MB/s | 157 kB 00:00 2026-03-09T16:07:43.335 INFO:teuthology.orchestra.run.vm03.stdout:(72/117): socat-1.7.4.1-8.el9.x86_64.rpm 10 MB/s | 303 kB 00:00 2026-03-09T16:07:43.343 INFO:teuthology.orchestra.run.vm03.stdout:(73/117): fmt-8.1.1-5.el9.x86_64.rpm 14 MB/s | 111 kB 00:00 2026-03-09T16:07:43.350 INFO:teuthology.orchestra.run.vm03.stdout:(74/117): xmlstarlet-1.6.1-20.el9.x86_64.rpm 2.8 MB/s | 64 kB 00:00 
2026-03-09T16:07:43.351 INFO:teuthology.orchestra.run.vm03.stdout:(75/117): gperftools-libs-2.9.1-3.el9.x86_64.rp 35 MB/s | 308 kB 00:00 2026-03-09T16:07:43.354 INFO:teuthology.orchestra.run.vm03.stdout:(76/117): libarrow-doc-9.0.0-15.el9.noarch.rpm 11 MB/s | 25 kB 00:00 2026-03-09T16:07:43.357 INFO:teuthology.orchestra.run.vm03.stdout:(77/117): liboath-2.6.12-1.el9.x86_64.rpm 16 MB/s | 49 kB 00:00 2026-03-09T16:07:43.361 INFO:teuthology.orchestra.run.vm03.stdout:(78/117): libunwind-1.6.2-1.el9.x86_64.rpm 18 MB/s | 67 kB 00:00 2026-03-09T16:07:43.399 INFO:teuthology.orchestra.run.vm03.stdout:(79/117): parquet-libs-9.0.0-15.el9.x86_64.rpm 22 MB/s | 838 kB 00:00 2026-03-09T16:07:43.485 INFO:teuthology.orchestra.run.vm03.stdout:(80/117): libarrow-9.0.0-15.el9.x86_64.rpm 33 MB/s | 4.4 MB 00:00 2026-03-09T16:07:43.487 INFO:teuthology.orchestra.run.vm03.stdout:(81/117): python3-autocommand-2.2.2-8.el9.noarc 13 MB/s | 29 kB 00:00 2026-03-09T16:07:43.490 INFO:teuthology.orchestra.run.vm03.stdout:(82/117): python3-backports-tarfile-1.2.0-1.el9 26 MB/s | 60 kB 00:00 2026-03-09T16:07:43.493 INFO:teuthology.orchestra.run.vm03.stdout:(83/117): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 16 MB/s | 43 kB 00:00 2026-03-09T16:07:43.495 INFO:teuthology.orchestra.run.vm03.stdout:(84/117): python3-cachetools-4.2.4-1.el9.noarch 14 MB/s | 32 kB 00:00 2026-03-09T16:07:43.497 INFO:teuthology.orchestra.run.vm03.stdout:(85/117): python3-certifi-2023.05.07-4.el9.noar 7.4 MB/s | 14 kB 00:00 2026-03-09T16:07:43.501 INFO:teuthology.orchestra.run.vm03.stdout:(86/117): python3-cheroot-10.0.1-4.el9.noarch.r 40 MB/s | 173 kB 00:00 2026-03-09T16:07:43.510 INFO:teuthology.orchestra.run.vm03.stdout:(87/117): python3-cherrypy-18.6.1-2.el9.noarch. 43 MB/s | 358 kB 00:00 2026-03-09T16:07:43.515 INFO:teuthology.orchestra.run.vm03.stdout:(88/117): python3-google-auth-2.45.0-1.el9.noar 43 MB/s | 254 kB 00:00 2026-03-09T16:07:43.518 INFO:teuthology.orchestra.run.vm03.stdout:(89/117): python3-jaraco-8.2.1-3.el9.noarch.rpm 3.9 MB/s | 11 kB 00:00 2026-03-09T16:07:43.520 INFO:teuthology.orchestra.run.vm03.stdout:(90/117): python3-jaraco-classes-3.2.1-5.el9.no 8.4 MB/s | 18 kB 00:00 2026-03-09T16:07:43.523 INFO:teuthology.orchestra.run.vm03.stdout:(91/117): python3-jaraco-collections-3.0.0-8.el 11 MB/s | 23 kB 00:00 2026-03-09T16:07:43.525 INFO:teuthology.orchestra.run.vm03.stdout:(92/117): python3-jaraco-context-6.0.1-3.el9.no 9.8 MB/s | 20 kB 00:00 2026-03-09T16:07:43.527 INFO:teuthology.orchestra.run.vm03.stdout:(93/117): python3-jaraco-functools-3.5.0-2.el9. 9.5 MB/s | 19 kB 00:00 2026-03-09T16:07:43.529 INFO:teuthology.orchestra.run.vm03.stdout:(94/117): python3-jaraco-text-4.0.0-2.el9.noarc 11 MB/s | 26 kB 00:00 2026-03-09T16:07:43.549 INFO:teuthology.orchestra.run.vm03.stdout:(95/117): python3-kubernetes-26.1.0-3.el9.noarc 53 MB/s | 1.0 MB 00:00 2026-03-09T16:07:43.553 INFO:teuthology.orchestra.run.vm03.stdout:(96/117): python3-logutils-0.3.5-21.el9.noarch. 
14 MB/s | 46 kB 00:00 2026-03-09T16:07:43.555 INFO:teuthology.orchestra.run.vm03.stdout:(97/117): python3-more-itertools-8.12.0-2.el9.n 29 MB/s | 79 kB 00:00 2026-03-09T16:07:43.558 INFO:teuthology.orchestra.run.vm03.stdout:(98/117): python3-natsort-7.1.1-5.el9.noarch.rp 23 MB/s | 58 kB 00:00 2026-03-09T16:07:43.563 INFO:teuthology.orchestra.run.vm03.stdout:(99/117): python3-pecan-1.4.2-3.el9.noarch.rpm 51 MB/s | 272 kB 00:00 2026-03-09T16:07:43.566 INFO:teuthology.orchestra.run.vm03.stdout:(100/117): python3-portend-3.1.0-2.el9.noarch.r 7.0 MB/s | 16 kB 00:00 2026-03-09T16:07:43.569 INFO:teuthology.orchestra.run.vm03.stdout:(101/117): python3-pyOpenSSL-21.0.0-1.el9.noarc 30 MB/s | 90 kB 00:00 2026-03-09T16:07:43.571 INFO:teuthology.orchestra.run.vm03.stdout:(102/117): python3-repoze-lru-0.7-16.el9.noarch 13 MB/s | 31 kB 00:00 2026-03-09T16:07:43.576 INFO:teuthology.orchestra.run.vm03.stdout:(103/117): python3-routes-2.5.1-5.el9.noarch.rp 45 MB/s | 188 kB 00:00 2026-03-09T16:07:43.579 INFO:teuthology.orchestra.run.vm03.stdout:(104/117): python3-rsa-4.9-2.el9.noarch.rpm 20 MB/s | 59 kB 00:00 2026-03-09T16:07:43.581 INFO:teuthology.orchestra.run.vm03.stdout:(105/117): python3-tempora-5.0.0-2.el9.noarch.r 15 MB/s | 36 kB 00:00 2026-03-09T16:07:43.584 INFO:teuthology.orchestra.run.vm03.stdout:(106/117): python3-typing-extensions-4.15.0-1.e 33 MB/s | 86 kB 00:00 2026-03-09T16:07:43.590 INFO:teuthology.orchestra.run.vm03.stdout:(107/117): python3-webob-1.8.8-2.el9.noarch.rpm 42 MB/s | 230 kB 00:00 2026-03-09T16:07:43.593 INFO:teuthology.orchestra.run.vm03.stdout:(108/117): python3-websocket-client-1.2.3-2.el9 31 MB/s | 90 kB 00:00 2026-03-09T16:07:43.602 INFO:teuthology.orchestra.run.vm03.stdout:(109/117): python3-werkzeug-2.0.3-3.el9.1.noarc 51 MB/s | 427 kB 00:00 2026-03-09T16:07:43.604 INFO:teuthology.orchestra.run.vm03.stdout:(110/117): python3-xmltodict-0.12.0-15.el9.noar 9.7 MB/s | 22 kB 00:00 2026-03-09T16:07:43.609 INFO:teuthology.orchestra.run.vm03.stdout:(111/117): python3-zc-lockfile-2.0-10.el9.noarc 4.0 MB/s | 20 kB 00:00 2026-03-09T16:07:43.617 INFO:teuthology.orchestra.run.vm03.stdout:(112/117): re2-20211101-20.el9.x86_64.rpm 22 MB/s | 191 kB 00:00 2026-03-09T16:07:43.620 INFO:teuthology.orchestra.run.vm03.stdout:(113/117): python3-asyncssh-2.13.2-5.el9.noarch 2.4 MB/s | 548 kB 00:00 2026-03-09T16:07:43.653 INFO:teuthology.orchestra.run.vm03.stdout:(114/117): thrift-0.15.0-4.el9.x86_64.rpm 45 MB/s | 1.6 MB 00:00 2026-03-09T16:07:43.964 INFO:teuthology.orchestra.run.vm03.stdout:(115/117): python3-scipy-1.9.3-2.el9.x86_64.rpm 27 MB/s | 19 MB 00:00 2026-03-09T16:07:44.708 INFO:teuthology.orchestra.run.vm03.stdout:(116/117): librados2-18.2.7-1055.gab47f43c.el9. 3.0 MB/s | 3.3 MB 00:01 2026-03-09T16:07:44.837 INFO:teuthology.orchestra.run.vm03.stdout:(117/117): librbd1-18.2.7-1055.gab47f43c.el9.x8 2.5 MB/s | 3.0 MB 00:01 2026-03-09T16:07:44.841 INFO:teuthology.orchestra.run.vm03.stdout:-------------------------------------------------------------------------------- 2026-03-09T16:07:44.841 INFO:teuthology.orchestra.run.vm03.stdout:Total 15 MB/s | 181 MB 00:12 2026-03-09T16:07:45.351 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T16:07:45.398 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T16:07:45.398 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T16:07:46.174 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 
2026-03-09T16:07:46.174 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T16:07:46.995 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T16:07:47.011 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 1/119 2026-03-09T16:07:47.051 INFO:teuthology.orchestra.run.vm03.stdout: Installing : thrift-0.15.0-4.el9.x86_64 2/119 2026-03-09T16:07:47.229 INFO:teuthology.orchestra.run.vm03.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/119 2026-03-09T16:07:47.243 INFO:teuthology.orchestra.run.vm03.stdout: Upgrading : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/119 2026-03-09T16:07:47.291 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/119 2026-03-09T16:07:47.292 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 5/119 2026-03-09T16:07:47.321 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 5/119 2026-03-09T16:07:47.332 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 6/119 2026-03-09T16:07:47.335 INFO:teuthology.orchestra.run.vm03.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/119 2026-03-09T16:07:47.337 INFO:teuthology.orchestra.run.vm03.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/119 2026-03-09T16:07:47.352 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/119 2026-03-09T16:07:47.391 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 10/119 2026-03-09T16:07:47.427 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 10/119 2026-03-09T16:07:47.429 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86 11/119 2026-03-09T16:07:47.478 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86 11/119 2026-03-09T16:07:47.484 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/119 2026-03-09T16:07:47.511 INFO:teuthology.orchestra.run.vm03.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/119 2026-03-09T16:07:47.520 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/119 2026-03-09T16:07:47.525 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/119 2026-03-09T16:07:47.553 INFO:teuthology.orchestra.run.vm03.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/119 2026-03-09T16:07:47.570 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/119 2026-03-09T16:07:47.575 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/119 2026-03-09T16:07:47.583 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/119 2026-03-09T16:07:47.585 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/119 2026-03-09T16:07:47.590 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/119 2026-03-09T16:07:47.601 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el 22/119 2026-03-09T16:07:47.615 INFO:teuthology.orchestra.run.vm03.stdout: Installing : 
python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_6 23/119 2026-03-09T16:07:47.645 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/119 2026-03-09T16:07:47.710 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/119 2026-03-09T16:07:47.734 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/119 2026-03-09T16:07:47.743 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/119 2026-03-09T16:07:47.753 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/119 2026-03-09T16:07:47.758 INFO:teuthology.orchestra.run.vm03.stdout: Installing : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_6 29/119 2026-03-09T16:07:47.792 INFO:teuthology.orchestra.run.vm03.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/119 2026-03-09T16:07:47.799 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/119 2026-03-09T16:07:47.820 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/119 2026-03-09T16:07:47.850 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/119 2026-03-09T16:07:47.858 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/119 2026-03-09T16:07:47.866 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/119 2026-03-09T16:07:47.880 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/119 2026-03-09T16:07:47.894 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/119 2026-03-09T16:07:47.907 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/119 2026-03-09T16:07:47.979 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/119 2026-03-09T16:07:47.987 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/119 2026-03-09T16:07:47.997 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/119 2026-03-09T16:07:48.045 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/119 2026-03-09T16:07:48.468 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/119 2026-03-09T16:07:48.517 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/119 2026-03-09T16:07:48.553 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/119 2026-03-09T16:07:48.576 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/119 2026-03-09T16:07:48.624 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/119 2026-03-09T16:07:48.678 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/119 2026-03-09T16:07:48.712 INFO:teuthology.orchestra.run.vm03.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/119 2026-03-09T16:07:48.740 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/119 2026-03-09T16:07:48.804 INFO:teuthology.orchestra.run.vm03.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/119 
2026-03-09T16:07:48.848 INFO:teuthology.orchestra.run.vm03.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/119 2026-03-09T16:07:48.888 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/119 2026-03-09T16:07:48.957 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/119 2026-03-09T16:07:48.986 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/119 2026-03-09T16:07:48.997 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/119 2026-03-09T16:07:49.004 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/119 2026-03-09T16:07:49.053 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 58/119 2026-03-09T16:07:49.362 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/119 2026-03-09T16:07:49.394 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/119 2026-03-09T16:07:49.401 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/119 2026-03-09T16:07:49.465 INFO:teuthology.orchestra.run.vm03.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/119 2026-03-09T16:07:49.469 INFO:teuthology.orchestra.run.vm03.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/119 2026-03-09T16:07:49.496 INFO:teuthology.orchestra.run.vm03.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/119 2026-03-09T16:07:49.614 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 97/119 2026-03-09T16:07:49.614 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /sys 2026-03-09T16:07:49.614 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /proc 2026-03-09T16:07:49.614 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /mnt 2026-03-09T16:07:49.614 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /var/tmp 2026-03-09T16:07:49.614 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /home 2026-03-09T16:07:49.614 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /root 2026-03-09T16:07:49.614 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /tmp 2026-03-09T16:07:49.614 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:49.650 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noa 98/119 2026-03-09T16:07:49.911 INFO:teuthology.orchestra.run.vm03.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/119 2026-03-09T16:07:50.004 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/119 2026-03-09T16:07:50.241 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noa 98/119 2026-03-09T16:07:50.249 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.n 99/119 2026-03-09T16:07:50.886 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.n 99/119 2026-03-09T16:07:50.890 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab4 100/119 2026-03-09T16:07:50.914 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/119 
2026-03-09T16:07:50.945 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/119 2026-03-09T16:07:50.955 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/119 2026-03-09T16:07:50.962 INFO:teuthology.orchestra.run.vm03.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/119 2026-03-09T16:07:50.971 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab4 100/119 2026-03-09T16:07:51.065 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el 101/119 2026-03-09T16:07:51.068 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 102/119 2026-03-09T16:07:51.095 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 102/119 2026-03-09T16:07:51.095 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:07:51.096 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-09T16:07:51.096 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-09T16:07:51.096 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-09T16:07:51.096 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:51.113 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 103/119 2026-03-09T16:07:51.144 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/119 2026-03-09T16:07:51.147 INFO:teuthology.orchestra.run.vm03.stdout: Upgrading : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 72/119 2026-03-09T16:07:51.183 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 72/119 2026-03-09T16:07:51.188 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 73/119 2026-03-09T16:07:51.196 INFO:teuthology.orchestra.run.vm03.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/119 2026-03-09T16:07:51.241 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 103/119 2026-03-09T16:07:51.244 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 104/119 2026-03-09T16:07:51.271 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 104/119 2026-03-09T16:07:51.271 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:07:51.271 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-09T16:07:51.271 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-09T16:07:51.271 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 
2026-03-09T16:07:51.271 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:51.440 INFO:teuthology.orchestra.run.vm03.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/119 2026-03-09T16:07:51.442 INFO:teuthology.orchestra.run.vm03.stdout: Installing : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 76/119 2026-03-09T16:07:51.463 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 76/119 2026-03-09T16:07:51.473 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 77/119 2026-03-09T16:07:51.492 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/119 2026-03-09T16:07:51.517 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/119 2026-03-09T16:07:51.534 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 105/119 2026-03-09T16:07:51.563 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 105/119 2026-03-09T16:07:51.563 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:07:51.563 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-09T16:07:51.564 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-09T16:07:51.564 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-09T16:07:51.564 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:51.618 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/119 2026-03-09T16:07:51.633 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/119 2026-03-09T16:07:51.664 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/119 2026-03-09T16:07:51.709 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/119 2026-03-09T16:07:51.780 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/119 2026-03-09T16:07:51.791 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/119 2026-03-09T16:07:51.795 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 86/119 2026-03-09T16:07:51.799 INFO:teuthology.orchestra.run.vm03.stdout: Installing : mailcap-2.1.49-5.el9.noarch 87/119 2026-03-09T16:07:51.802 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 88/119 2026-03-09T16:07:51.822 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 89/119 2026-03-09T16:07:51.822 INFO:teuthology.orchestra.run.vm03.stdout:Creating group 'libstoragemgmt' with GID 994. 2026-03-09T16:07:51.822 INFO:teuthology.orchestra.run.vm03.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994. 
2026-03-09T16:07:51.822 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:07:51.835 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 89/119 2026-03-09T16:07:51.864 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 89/119 2026-03-09T16:07:51.864 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service. 2026-03-09T16:07:51.864 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:07:51.884 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 90/119 2026-03-09T16:07:51.947 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 91/119 2026-03-09T16:07:51.951 INFO:teuthology.orchestra.run.vm03.stdout: Installing : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 91/119 2026-03-09T16:07:51.959 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.e 92/119 2026-03-09T16:07:51.999 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c. 93/119 2026-03-09T16:07:52.006 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9. 94/119 2026-03-09T16:07:52.425 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 106/119 2026-03-09T16:07:52.454 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 106/119 2026-03-09T16:07:52.455 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:07:52.455 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-09T16:07:52.455 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-09T16:07:52.455 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-09T16:07:52.455 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:52.883 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 107/119 2026-03-09T16:07:52.890 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 108/119 2026-03-09T16:07:52.916 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 108/119 2026-03-09T16:07:52.916 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:07:52.916 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-09T16:07:52.916 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-09T16:07:52.916 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 
2026-03-09T16:07:52.916 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:52.929 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-immutable-object-cache-2:18.2.7-1055.gab47f 109/119 2026-03-09T16:07:52.961 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.7-1055.gab47f 109/119 2026-03-09T16:07:52.961 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:07:52.961 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-09T16:07:52.961 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:53.088 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 95/119 2026-03-09T16:07:53.186 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 95/119 2026-03-09T16:07:53.189 INFO:teuthology.orchestra.run.vm05.stdout: Installing : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 110/119 2026-03-09T16:07:53.218 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 110/119 2026-03-09T16:07:53.219 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:07:53.219 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-09T16:07:53.219 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-09T16:07:53.219 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-09T16:07:53.219 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:53.517 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 95/119 2026-03-09T16:07:53.525 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 96/119 2026-03-09T16:07:53.573 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 96/119 2026-03-09T16:07:53.573 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target. 2026-03-09T16:07:53.573 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service. 
2026-03-09T16:07:53.573 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:07:53.579 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 97/119 2026-03-09T16:07:55.268 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 111/119 2026-03-09T16:07:55.281 INFO:teuthology.orchestra.run.vm05.stdout: Installing : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 112/119 2026-03-09T16:07:55.289 INFO:teuthology.orchestra.run.vm05.stdout: Installing : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 113/119 2026-03-09T16:07:55.336 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_ 114/119 2026-03-09T16:07:55.343 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 115/119 2026-03-09T16:07:55.354 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 116/119 2026-03-09T16:07:55.360 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 117/119 2026-03-09T16:07:55.360 INFO:teuthology.orchestra.run.vm05.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 118/119 2026-03-09T16:07:55.379 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 118/119 2026-03-09T16:07:55.379 INFO:teuthology.orchestra.run.vm05.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 119/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 119/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 1/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 2/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 3/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 4/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-immutable-object-cache-2:18.2.7-1055.gab47f 5/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 6/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 7/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 9/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 10/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 11/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 12/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_ 13/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 14/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : 
libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 15/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_6 16/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86 17/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 18/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el 19/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9. 20/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_6 21/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 22/119 2026-03-09T16:07:56.625 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 23/119 2026-03-09T16:07:56.626 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 24/119 2026-03-09T16:07:56.626 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 25/119 2026-03-09T16:07:56.626 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 26/119 2026-03-09T16:07:56.626 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 27/119 2026-03-09T16:07:56.626 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c. 
28/119 2026-03-09T16:07:56.626 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noa 29/119 2026-03-09T16:07:56.626 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.n 30/119 2026-03-09T16:07:56.626 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab4 31/119 2026-03-09T16:07:56.626 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el 32/119 2026-03-09T16:07:56.626 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 33/119 2026-03-09T16:07:56.626 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.e 34/119 2026-03-09T16:07:56.626 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 35/119 2026-03-09T16:07:56.626 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 54/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : 
openblas-0.3.29-1.el9.x86_64 57/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/119 2026-03-09T16:07:56.628 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/119 
2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 97/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 98/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 99/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 100/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 101/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 102/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 103/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 104/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 105/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 106/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 107/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 108/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 109/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 110/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 111/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 112/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 113/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : re2-1:20211101-20.el9.x86_64 114/119 2026-03-09T16:07:56.629 
INFO:teuthology.orchestra.run.vm05.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 115/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 116/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 117/119 2026-03-09T16:07:56.629 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 118/119 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 119/119 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout:Upgraded: 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout:Installed: 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: 
flexiblas-3.0.4-9.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs-1.1.0-3.el9.x86_64 2026-03-09T16:07:56.746 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: libconfig-1.7.2-9.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: libxslt-1.1.34-12.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: mailcap-2.1.49-5.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-09T16:07:56.747 
INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-jmespath-1.0.1-1.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply-3.11-14.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-09T16:07:56.747 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-09T16:07:56.748 
INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-xmltodict-0.12.0-15.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: re2-1:20211101-20.el9.x86_64 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: socat-1.7.4.1-8.el9.x86_64 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: xmlstarlet-1.6.1-20.el9.x86_64 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:07:56.748 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 
2026-03-09T16:07:56.870 DEBUG:teuthology.parallel:result is None 2026-03-09T16:08:00.842 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 97/119 2026-03-09T16:08:00.843 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /sys 2026-03-09T16:08:00.843 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /proc 2026-03-09T16:08:00.843 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /mnt 2026-03-09T16:08:00.843 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /var/tmp 2026-03-09T16:08:00.843 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /home 2026-03-09T16:08:00.843 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /root 2026-03-09T16:08:00.843 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /tmp 2026-03-09T16:08:00.843 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:08:00.890 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noa 98/119 2026-03-09T16:08:01.471 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noa 98/119 2026-03-09T16:08:01.479 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.n 99/119 2026-03-09T16:08:02.075 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.n 99/119 2026-03-09T16:08:02.078 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab4 100/119 2026-03-09T16:08:02.144 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab4 100/119 2026-03-09T16:08:02.223 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el 101/119 2026-03-09T16:08:02.225 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 102/119 2026-03-09T16:08:02.253 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 102/119 2026-03-09T16:08:02.254 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:08:02.254 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-09T16:08:02.254 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-09T16:08:02.254 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 
2026-03-09T16:08:02.254 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:08:02.270 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 103/119 2026-03-09T16:08:02.388 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 103/119 2026-03-09T16:08:02.391 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 104/119 2026-03-09T16:08:02.435 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 104/119 2026-03-09T16:08:02.436 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:08:02.436 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-09T16:08:02.436 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-09T16:08:02.436 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-09T16:08:02.436 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:08:02.704 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 105/119 2026-03-09T16:08:02.733 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 105/119 2026-03-09T16:08:02.733 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:08:02.733 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-09T16:08:02.733 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-09T16:08:02.733 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-09T16:08:02.733 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:08:03.615 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 106/119 2026-03-09T16:08:03.644 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 106/119 2026-03-09T16:08:03.644 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:08:03.644 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-09T16:08:03.644 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-09T16:08:03.644 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 
2026-03-09T16:08:03.644 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:08:04.074 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 107/119 2026-03-09T16:08:04.079 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 108/119 2026-03-09T16:08:04.105 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 108/119 2026-03-09T16:08:04.105 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:08:04.105 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-09T16:08:04.105 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-09T16:08:04.105 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-09T16:08:04.105 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:08:04.119 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-immutable-object-cache-2:18.2.7-1055.gab47f 109/119 2026-03-09T16:08:04.148 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.7-1055.gab47f 109/119 2026-03-09T16:08:04.148 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:08:04.148 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-09T16:08:04.148 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:08:04.312 INFO:teuthology.orchestra.run.vm03.stdout: Installing : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 110/119 2026-03-09T16:08:04.337 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 110/119 2026-03-09T16:08:04.337 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:08:04.337 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-09T16:08:04.337 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-09T16:08:04.337 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 
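Note: the repeated "Glob pattern passed to enable" / "Invalid unit name" messages above come from the package scriptlets handing templated unit names such as ceph-osd@*.service to systemctl; the per-daemon targets are what actually get enabled, which is what the "Created symlink" lines show. A quick way to confirm the targets ended up enabled on the node (illustrative sketch, run as root):

    systemctl is-enabled ceph-mon.target ceph-mgr.target ceph-mds.target ceph-osd.target
    systemctl list-dependencies ceph.target   # the per-daemon targets should be hooked under ceph.target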
2026-03-09T16:08:04.337 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:08:06.382 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 111/119 2026-03-09T16:08:06.395 INFO:teuthology.orchestra.run.vm03.stdout: Installing : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 112/119 2026-03-09T16:08:06.401 INFO:teuthology.orchestra.run.vm03.stdout: Installing : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 113/119 2026-03-09T16:08:06.450 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_ 114/119 2026-03-09T16:08:06.457 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 115/119 2026-03-09T16:08:06.469 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 116/119 2026-03-09T16:08:06.474 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 117/119 2026-03-09T16:08:06.474 INFO:teuthology.orchestra.run.vm03.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 118/119 2026-03-09T16:08:06.496 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 118/119 2026-03-09T16:08:06.497 INFO:teuthology.orchestra.run.vm03.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 119/119 2026-03-09T16:08:07.815 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 119/119 2026-03-09T16:08:07.820 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 1/119 2026-03-09T16:08:07.821 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 2/119 2026-03-09T16:08:07.821 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 3/119 2026-03-09T16:08:07.821 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 4/119 2026-03-09T16:08:07.821 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-immutable-object-cache-2:18.2.7-1055.gab47f 5/119 2026-03-09T16:08:07.821 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 6/119 2026-03-09T16:08:07.821 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 7/119 2026-03-09T16:08:07.824 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/119 2026-03-09T16:08:07.824 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 9/119 2026-03-09T16:08:07.824 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 10/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 11/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 12/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_ 13/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 14/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 15/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : 
librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_6 16/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86 17/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 18/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el 19/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9. 20/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_6 21/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 22/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 23/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 24/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 25/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 26/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 27/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c. 28/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noa 29/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.n 30/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab4 31/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el 32/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 33/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.e 34/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 35/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/119 
2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/119 2026-03-09T16:08:07.825 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 54/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/119 2026-03-09T16:08:07.826 
INFO:teuthology.orchestra.run.vm03.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 97/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 98/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 99/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 100/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 101/119 2026-03-09T16:08:07.826 
INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 102/119 2026-03-09T16:08:07.826 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 103/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 104/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 105/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 106/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 107/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 108/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 109/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 110/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 111/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 112/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 113/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : re2-1:20211101-20.el9.x86_64 114/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 115/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 116/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 117/119 2026-03-09T16:08:07.827 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 118/119 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 119/119 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout:Upgraded: 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout:Installed: 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: 
ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:08:07.958 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: ledmon-libs-1.1.0-3.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: libconfig-1.7.2-9.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_64 
2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: libxslt-1.1.34-12.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: mailcap-2.1.49-5.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-09T16:08:07.959 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: 
python3-jmespath-1.0.1-1.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-ply-3.11-14.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: python3-xmltodict-0.12.0-15.el9.noarch 2026-03-09T16:08:07.960 
INFO:teuthology.orchestra.run.vm03.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: re2-1:20211101-20.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: socat-1.7.4.1-8.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet-1.6.1-20.el9.x86_64 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:08:07.960 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:08:08.112 DEBUG:teuthology.parallel:result is None 2026-03-09T16:08:08.112 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch 2026-03-09T16:08:08.112 INFO:teuthology.packaging:ref: None 2026-03-09T16:08:08.112 INFO:teuthology.packaging:tag: None 2026-03-09T16:08:08.112 INFO:teuthology.packaging:branch: reef 2026-03-09T16:08:08.112 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:08:08.112 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=reef 2026-03-09T16:08:08.834 DEBUG:teuthology.orchestra.run.vm03:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}' 2026-03-09T16:08:08.861 INFO:teuthology.orchestra.run.vm03.stdout:18.2.7-1055.gab47f43c.el9 2026-03-09T16:08:08.861 INFO:teuthology.packaging:The installed version of ceph is 18.2.7-1055.gab47f43c.el9 2026-03-09T16:08:08.861 INFO:teuthology.task.install:The correct ceph version 18.2.7-1055.gab47f43c is installed. 2026-03-09T16:08:08.862 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch 2026-03-09T16:08:08.863 INFO:teuthology.packaging:ref: None 2026-03-09T16:08:08.863 INFO:teuthology.packaging:tag: None 2026-03-09T16:08:08.863 INFO:teuthology.packaging:branch: reef 2026-03-09T16:08:08.863 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:08:08.863 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=reef 2026-03-09T16:08:09.633 DEBUG:teuthology.orchestra.run.vm05:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}' 2026-03-09T16:08:09.665 INFO:teuthology.orchestra.run.vm05.stdout:18.2.7-1055.gab47f43c.el9 2026-03-09T16:08:09.665 INFO:teuthology.packaging:The installed version of ceph is 18.2.7-1055.gab47f43c.el9 2026-03-09T16:08:09.665 INFO:teuthology.task.install:The correct ceph version 18.2.7-1055.gab47f43c is installed. 2026-03-09T16:08:09.666 INFO:teuthology.task.install.util:Shipping valgrind.supp... 2026-03-09T16:08:09.666 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:08:09.666 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp 2026-03-09T16:08:09.701 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-09T16:08:09.701 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp 2026-03-09T16:08:09.735 INFO:teuthology.task.install.util:Shipping 'daemon-helper'... 
2026-03-09T16:08:09.735 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:08:09.735 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/daemon-helper 2026-03-09T16:08:09.773 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/daemon-helper 2026-03-09T16:08:09.841 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-09T16:08:09.841 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/daemon-helper 2026-03-09T16:08:09.872 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/daemon-helper 2026-03-09T16:08:09.948 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'... 2026-03-09T16:08:09.948 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:08:09.948 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/adjust-ulimits 2026-03-09T16:08:09.978 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/adjust-ulimits 2026-03-09T16:08:10.055 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-09T16:08:10.055 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/adjust-ulimits 2026-03-09T16:08:10.086 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/adjust-ulimits 2026-03-09T16:08:10.156 INFO:teuthology.task.install.util:Shipping 'stdin-killer'... 2026-03-09T16:08:10.156 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:08:10.156 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/stdin-killer 2026-03-09T16:08:10.185 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/stdin-killer 2026-03-09T16:08:10.259 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-09T16:08:10.259 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/stdin-killer 2026-03-09T16:08:10.288 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/stdin-killer 2026-03-09T16:08:10.359 INFO:teuthology.run_tasks:Running task print... 2026-03-09T16:08:10.362 INFO:teuthology.task.print:**** done install task... 2026-03-09T16:08:10.362 INFO:teuthology.run_tasks:Running task cephadm... 
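Note: the helper shipping above (valgrind.supp, daemon-helper, adjust-ulimits, stdin-killer) follows one pattern on both nodes: the file contents are streamed over the connection into "sudo dd of=<dest>" and the destination is then made executable. Reproduced by hand it would look roughly like this (illustrative sketch; assumes plain ssh access to the node, passwordless sudo, and a local copy of the script):

    ssh vm03 'sudo dd of=/usr/bin/adjust-ulimits' < adjust-ulimits
    ssh vm03 'sudo chmod a=rx -- /usr/bin/adjust-ulimits'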
2026-03-09T16:08:10.408 INFO:tasks.cephadm:Config: {'compiled_cephadm_branch': 'reef', 'conf': {'osd': {'osd_class_default_list': '*', 'osd_class_load_list': '*', 'bdev async discard': True, 'bdev enable discard': True, 'bluestore allocator': 'bitmap', 'bluestore block size': 96636764160, 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd op complaint time': 180}, 'client': {'client mount timeout': 600, 'debug client': 20, 'debug ms': 1, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'global': {'mon pg warn min per osd': 0}, 'mds': {'debug mds': 20, 'debug mds balancer': 20, 'debug ms': 1, 'mds debug frag': True, 'mds debug scatterstat': True, 'mds op complaint time': 180, 'mds verify scatter': True, 'osd op complaint time': 180, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20, 'mon down mkfs grace': 300, 'mon op complaint time': 120}}, 'image': 'quay.ceph.io/ceph-ci/ceph:reef', 'roleless': True, 'cluster-conf': {'mgr': {'client mount timeout': 30, 'debug client': 20, 'debug mgr': 20, 'debug ms': 1, 'mon warn on pool no app': False}}, 'flavor': 'default', 'fs': 'xfs', 'log-ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', 'FS_DEGRADED', 'filesystem is degraded', 'FS_INLINE_DATA_DEPRECATED', 'FS_WITH_FAILED_MDS', 'MDS_ALL_DOWN', 'filesystem is offline', 'is offline because no MDS', 'MDS_DAMAGE', 'MDS_DEGRADED', 'MDS_FAILED', 'MDS_INSUFFICIENT_STANDBY', 'MDS_UP_LESS_THAN_MAX', 'online, but wants', 'filesystem is online with fewer MDS than max_mds', 'POOL_APP_NOT_ENABLED', 'do not have an application enabled', 'overall HEALTH_', 'Replacing daemon', 'deprecated feature inline_data', 'MGR_MODULE_ERROR', 'OSD_DOWN', 'osds down', 'overall HEALTH_', '\\(OSD_DOWN\\)', '\\(OSD_', 'but it is still running', 'is not responding', 'MON_DOWN', 'PG_AVAILABILITY', 'PG_DEGRADED', 'Reduced data availability', 'Degraded data redundancy', 'pg .* is stuck inactive', 'pg .* is .*degraded', 'pg .* is stuck peering'], 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'} 2026-03-09T16:08:10.408 INFO:tasks.cephadm:Cluster image is quay.ceph.io/ceph-ci/ceph:reef 2026-03-09T16:08:10.408 INFO:tasks.cephadm:Cluster fsid is 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:08:10.408 INFO:tasks.cephadm:Choosing monitor IPs and ports... 2026-03-09T16:08:10.408 INFO:tasks.cephadm:No mon roles; fabricating mons 2026-03-09T16:08:10.408 INFO:tasks.cephadm:Monitor IPs: {'mon.vm03': '192.168.123.103', 'mon.vm05': '192.168.123.105'} 2026-03-09T16:08:10.408 INFO:tasks.cephadm:Normalizing hostnames... 
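Note: the "Normalizing hostnames" step that follows reduces each node to its short hostname, since cephadm generally keys hosts by the bare name rather than the FQDN. A quick check that a node is already in the expected state (illustrative sketch):

    hostname -s    # the short name the orchestrator will see
    hostnamectl    # confirms static and transient hostnames agree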
2026-03-09T16:08:10.408 DEBUG:teuthology.orchestra.run.vm03:> sudo hostname $(hostname -s) 2026-03-09T16:08:10.437 DEBUG:teuthology.orchestra.run.vm05:> sudo hostname $(hostname -s) 2026-03-09T16:08:10.463 INFO:tasks.cephadm:Downloading "compiled" cephadm from cachra for reef 2026-03-09T16:08:10.463 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:08:11.068 INFO:tasks.cephadm:builder_project result: [{'url': 'https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'chacra_url': 'https://3.chacra.ceph.com/repos/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'ref': 'squid', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'distro': 'centos', 'distro_version': '9', 'distro_codename': None, 'modified': '2026-02-25 18:55:15.146628', 'status': 'ready', 'flavor': 'default', 'project': 'ceph', 'archs': ['source', 'x86_64'], 'extra': {'version': '19.2.3-678-ge911bdeb', 'package_manager_version': '19.2.3-678.ge911bdeb', 'build_url': 'https://jenkins.ceph.com/job/ceph-dev-pipeline/3275/', 'root_build_cause': '', 'node_name': '10.20.192.26+soko16', 'job_name': 'ceph-dev-pipeline'}}] 2026-03-09T16:08:11.923 INFO:tasks.util.chacra:got chacra host 3.chacra.ceph.com, ref reef, sha1 ab47f43c099b2cbae6e21342fe673ce251da54d6 from https://shaman.ceph.com/api/search/?project=ceph&distros=centos%2F9%2Fx86_64&flavor=default&ref=reef 2026-03-09T16:08:11.924 INFO:tasks.cephadm:Discovered cachra url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm 2026-03-09T16:08:11.924 INFO:tasks.cephadm:Downloading cephadm from url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm 2026-03-09T16:08:11.924 DEBUG:teuthology.orchestra.run.vm03:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm 2026-03-09T16:08:13.132 INFO:teuthology.orchestra.run.vm03.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 9 16:08 /home/ubuntu/cephtest/cephadm 2026-03-09T16:08:13.132 DEBUG:teuthology.orchestra.run.vm05:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm 2026-03-09T16:08:14.432 INFO:teuthology.orchestra.run.vm05.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 9 16:08 /home/ubuntu/cephtest/cephadm 2026-03-09T16:08:14.432 DEBUG:teuthology.orchestra.run.vm03:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm 2026-03-09T16:08:14.450 DEBUG:teuthology.orchestra.run.vm05:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm 2026-03-09T16:08:14.471 INFO:tasks.cephadm:Pulling image quay.ceph.io/ceph-ci/ceph:reef on all hosts... 
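Note: the cephadm binary fetch above is a plain curl of the resolved chacra URL followed by a size sanity check and chmod, so a truncated or empty download is caught before bootstrap ever runs. The same check by hand (illustrative sketch; the URL is the one resolved in the log above):

    curl --silent -L \
      https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm \
      > cephadm
    test -s cephadm && test $(stat -c%s cephadm) -gt 1000 && chmod +x cephadm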
2026-03-09T16:08:14.471 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef pull 2026-03-09T16:08:14.492 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef pull 2026-03-09T16:08:14.642 INFO:teuthology.orchestra.run.vm03.stderr:Pulling container image quay.ceph.io/ceph-ci/ceph:reef... 2026-03-09T16:08:14.648 INFO:teuthology.orchestra.run.vm05.stderr:Pulling container image quay.ceph.io/ceph-ci/ceph:reef... 2026-03-09T16:09:28.353 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:09:28.353 INFO:teuthology.orchestra.run.vm03.stdout: "ceph_version": "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)", 2026-03-09T16:09:28.353 INFO:teuthology.orchestra.run.vm03.stdout: "image_id": "b6fe7eb6a9d0f9b143033c43e1cdf7ef0918719fc7cff0dd0e2c113bb482fdd6", 2026-03-09T16:09:28.353 INFO:teuthology.orchestra.run.vm03.stdout: "repo_digests": [ 2026-03-09T16:09:28.353 INFO:teuthology.orchestra.run.vm03.stdout: "quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40" 2026-03-09T16:09:28.353 INFO:teuthology.orchestra.run.vm03.stdout: ] 2026-03-09T16:09:28.353 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:09:29.939 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-09T16:09:29.939 INFO:teuthology.orchestra.run.vm05.stdout: "ceph_version": "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)", 2026-03-09T16:09:29.939 INFO:teuthology.orchestra.run.vm05.stdout: "image_id": "b6fe7eb6a9d0f9b143033c43e1cdf7ef0918719fc7cff0dd0e2c113bb482fdd6", 2026-03-09T16:09:29.939 INFO:teuthology.orchestra.run.vm05.stdout: "repo_digests": [ 2026-03-09T16:09:29.939 INFO:teuthology.orchestra.run.vm05.stdout: "quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40" 2026-03-09T16:09:29.939 INFO:teuthology.orchestra.run.vm05.stdout: ] 2026-03-09T16:09:29.939 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-09T16:09:29.952 DEBUG:teuthology.orchestra.run.vm03:> sudo mkdir -p /etc/ceph 2026-03-09T16:09:29.978 DEBUG:teuthology.orchestra.run.vm05:> sudo mkdir -p /etc/ceph 2026-03-09T16:09:30.014 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 777 /etc/ceph 2026-03-09T16:09:30.043 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod 777 /etc/ceph 2026-03-09T16:09:30.086 INFO:tasks.cephadm:Writing seed config... 
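Note: the JSON printed by "cephadm ... pull" above is enough to confirm both nodes got the same image: ceph_version carries the sha1 (ab47f43c...) that chacra resolved, and image_id/repo_digests should be identical on vm03 and vm05. A minimal comparison (illustrative sketch; assumes jq is present on the node):

    sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef pull \
      | jq -r '.ceph_version, .image_id'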
2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] osd_class_default_list = * 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] osd_class_load_list = * 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] bdev async discard = True 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] bdev enable discard = True 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] bluestore allocator = bitmap 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] bluestore block size = 96636764160 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] bluestore fsck on mount = True 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] debug bluefs = 1/20 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] debug bluestore = 1/20 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] debug ms = 1 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] debug osd = 20 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] debug rocksdb = 4/10 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] mon osd backfillfull_ratio = 0.85 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] mon osd full ratio = 0.9 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] mon osd nearfull ratio = 0.8 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] osd failsafe full ratio = 0.95 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] osd mclock iops capacity threshold hdd = 49000 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] osd objectstore = bluestore 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [osd] osd op complaint time = 180 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [client] client mount timeout = 600 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [client] debug client = 20 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [client] debug ms = 1 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [client] rados mon op timeout = 900 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [client] rados osd op timeout = 900 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [global] mon pg warn min per osd = 0 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [mds] debug mds = 20 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [mds] debug mds balancer = 20 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [mds] debug ms = 1 2026-03-09T16:09:30.087 INFO:tasks.cephadm: override: [mds] mds debug frag = True 2026-03-09T16:09:30.088 INFO:tasks.cephadm: override: [mds] mds debug scatterstat = True 2026-03-09T16:09:30.088 INFO:tasks.cephadm: override: [mds] mds op complaint time = 180 2026-03-09T16:09:30.088 INFO:tasks.cephadm: override: [mds] mds verify scatter = True 2026-03-09T16:09:30.088 INFO:tasks.cephadm: override: [mds] osd op complaint time = 180 2026-03-09T16:09:30.088 INFO:tasks.cephadm: override: [mds] rados mon op timeout = 900 2026-03-09T16:09:30.088 INFO:tasks.cephadm: override: [mds] rados osd op timeout = 900 2026-03-09T16:09:30.088 INFO:tasks.cephadm: override: [mgr] debug mgr = 20 2026-03-09T16:09:30.088 INFO:tasks.cephadm: override: [mgr] debug ms = 1 2026-03-09T16:09:30.088 INFO:tasks.cephadm: override: [mon] debug mon = 20 2026-03-09T16:09:30.088 INFO:tasks.cephadm: override: [mon] debug ms = 1 2026-03-09T16:09:30.088 INFO:tasks.cephadm: override: [mon] debug paxos = 20 2026-03-09T16:09:30.088 INFO:tasks.cephadm: override: [mon] mon down mkfs grace = 300 2026-03-09T16:09:30.088 INFO:tasks.cephadm: override: [mon] mon op complaint time = 
120 2026-03-09T16:09:30.088 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:09:30.088 DEBUG:teuthology.orchestra.run.vm03:> dd of=/home/ubuntu/cephtest/seed.ceph.conf 2026-03-09T16:09:30.105 DEBUG:tasks.cephadm:Final config: [global] # make logging friendly to teuthology log_to_file = true log_to_stderr = false log to journald = false mon cluster log to file = true mon cluster log file level = debug mon clock drift allowed = 1.000 # replicate across OSDs, not hosts osd crush chooseleaf type = 0 #osd pool default size = 2 osd pool default erasure code profile = plugin=jerasure technique=reed_sol_van k=2 m=1 crush-failure-domain=osd # enable some debugging auth debug = true ms die on old message = true ms die on bug = true debug asserts on shutdown = true # adjust warnings mon max pg per osd = 10000# >= luminous mon pg warn max object skew = 0 mon osd allow primary affinity = true mon osd allow pg remap = true mon warn on legacy crush tunables = false mon warn on crush straw calc version zero = false mon warn on no sortbitwise = false mon warn on osd down out interval zero = false mon warn on too few osds = false mon_warn_on_pool_pg_num_not_power_of_two = false # disable pg_autoscaler by default for new pools osd_pool_default_pg_autoscale_mode = off # tests delete pools mon allow pool delete = true fsid = 2b05df78-1bd2-11f1-83c0-c950214d6edc mon pg warn min per osd = 0 [osd] osd scrub load threshold = 5.0 osd scrub max interval = 600 osd mclock profile = high_recovery_ops osd recover clone overlap = true osd recovery max chunk = 1048576 osd deep scrub update digest min age = 30 osd map max advance = 10 osd memory target autotune = true # debugging osd debug shutdown = true osd debug op order = true osd debug verify stray on activate = true osd debug pg log writeout = true osd debug verify cached snaps = true osd debug verify missing on start = true osd debug misdirected ops = true osd op queue = debug_random osd op queue cut off = debug_random osd shutdown pgref assert = true bdev debug aio = true osd sloppy crc = true osd_class_default_list = * osd_class_load_list = * bdev async discard = True bdev enable discard = True bluestore allocator = bitmap bluestore block size = 96636764160 bluestore fsck on mount = True debug bluefs = 1/20 debug bluestore = 1/20 debug ms = 1 debug osd = 20 debug rocksdb = 4/10 mon osd backfillfull_ratio = 0.85 mon osd full ratio = 0.9 mon osd nearfull ratio = 0.8 osd failsafe full ratio = 0.95 osd mclock iops capacity threshold hdd = 49000 osd objectstore = bluestore osd op complaint time = 180 [mgr] mon reweight min pgs per osd = 4 mon reweight min bytes per osd = 10 mgr/telemetry/nag = false debug mgr = 20 debug ms = 1 [mon] mon data avail warn = 5 mon mgr mkfs grace = 240 mon reweight min pgs per osd = 4 mon osd reporter subtree level = osd mon osd prime pg temp = true mon reweight min bytes per osd = 10 # rotate auth tickets quickly to exercise renewal paths auth mon ticket ttl = 660# 11m auth service ticket ttl = 240# 4m # don't complain about global id reclaim mon_warn_on_insecure_global_id_reclaim = false mon_warn_on_insecure_global_id_reclaim_allowed = false debug mon = 20 debug ms = 1 debug paxos = 20 mon down mkfs grace = 300 mon op complaint time = 120 [client.rgw] rgw cache enabled = true rgw enable ops log = true rgw enable usage log = true [client] client mount timeout = 600 debug client = 20 debug ms = 1 rados mon op timeout = 900 rados osd op timeout = 900 [mds] debug mds = 20 debug mds balancer = 20 debug ms = 1 mds debug frag = True 
mds debug scatterstat = True mds op complaint time = 180 mds verify scatter = True osd op complaint time = 180 rados mon op timeout = 900 rados osd op timeout = 900 2026-03-09T16:09:30.105 DEBUG:teuthology.orchestra.run.vm03:mon.vm03> sudo journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm03.service 2026-03-09T16:09:30.147 INFO:tasks.cephadm:Bootstrapping... 2026-03-09T16:09:30.147 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef -v bootstrap --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc --config /home/ubuntu/cephtest/seed.ceph.conf --output-config /etc/ceph/ceph.conf --output-keyring /etc/ceph/ceph.client.admin.keyring --output-pub-ssh-key /home/ubuntu/cephtest/ceph.pub --mon-ip 192.168.123.103 --skip-admin-label && sudo chmod +r /etc/ceph/ceph.client.admin.keyring 2026-03-09T16:09:30.265 INFO:teuthology.orchestra.run.vm03.stdout:-------------------------------------------------------------------------------- 2026-03-09T16:09:30.265 INFO:teuthology.orchestra.run.vm03.stdout:cephadm ['--image', 'quay.ceph.io/ceph-ci/ceph:reef', '-v', 'bootstrap', '--fsid', '2b05df78-1bd2-11f1-83c0-c950214d6edc', '--config', '/home/ubuntu/cephtest/seed.ceph.conf', '--output-config', '/etc/ceph/ceph.conf', '--output-keyring', '/etc/ceph/ceph.client.admin.keyring', '--output-pub-ssh-key', '/home/ubuntu/cephtest/ceph.pub', '--mon-ip', '192.168.123.103', '--skip-admin-label'] 2026-03-09T16:09:30.287 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stdout 5.8.0 2026-03-09T16:09:30.287 INFO:teuthology.orchestra.run.vm03.stderr:Specifying an fsid for your cluster offers no advantages and may increase the likelihood of fsid conflicts. 2026-03-09T16:09:30.287 INFO:teuthology.orchestra.run.vm03.stdout:Verifying podman|docker is present... 2026-03-09T16:09:30.304 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stdout 5.8.0 2026-03-09T16:09:30.304 INFO:teuthology.orchestra.run.vm03.stdout:Verifying lvm2 is present... 2026-03-09T16:09:30.304 INFO:teuthology.orchestra.run.vm03.stdout:Verifying time synchronization is in place... 2026-03-09T16:09:30.310 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service 2026-03-09T16:09:30.310 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory 2026-03-09T16:09:30.315 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 3 from systemctl is-active chrony.service 2026-03-09T16:09:30.315 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout inactive 2026-03-09T16:09:30.320 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout enabled 2026-03-09T16:09:30.325 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout active 2026-03-09T16:09:30.325 INFO:teuthology.orchestra.run.vm03.stdout:Unit chronyd.service is enabled and running 2026-03-09T16:09:30.325 INFO:teuthology.orchestra.run.vm03.stdout:Repeating the final host check... 
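For reference, the bootstrap invocation logged above reduces to the following cephadm command; the image, fsid, seed config and output paths are the values used in this run, and --skip-admin-label is a teuthology-specific choice:

    # sketch of the bootstrap step logged above
    sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef -v bootstrap \
        --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc \
        --config /home/ubuntu/cephtest/seed.ceph.conf \
        --output-config /etc/ceph/ceph.conf \
        --output-keyring /etc/ceph/ceph.client.admin.keyring \
        --output-pub-ssh-key /home/ubuntu/cephtest/ceph.pub \
        --mon-ip 192.168.123.103 \
        --skip-admin-label
    # the test then makes the admin keyring world-readable:
    sudo chmod +r /etc/ceph/ceph.client.admin.keyring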
2026-03-09T16:09:30.341 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stdout 5.8.0 2026-03-09T16:09:30.341 INFO:teuthology.orchestra.run.vm03.stdout:podman (/bin/podman) version 5.8.0 is present 2026-03-09T16:09:30.341 INFO:teuthology.orchestra.run.vm03.stdout:systemctl is present 2026-03-09T16:09:30.341 INFO:teuthology.orchestra.run.vm03.stdout:lvcreate is present 2026-03-09T16:09:30.345 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service 2026-03-09T16:09:30.346 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory 2026-03-09T16:09:30.350 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 3 from systemctl is-active chrony.service 2026-03-09T16:09:30.350 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout inactive 2026-03-09T16:09:30.356 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout enabled 2026-03-09T16:09:30.364 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout active 2026-03-09T16:09:30.364 INFO:teuthology.orchestra.run.vm03.stdout:Unit chronyd.service is enabled and running 2026-03-09T16:09:30.364 INFO:teuthology.orchestra.run.vm03.stdout:Host looks OK 2026-03-09T16:09:30.364 INFO:teuthology.orchestra.run.vm03.stdout:Cluster fsid: 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:09:30.364 INFO:teuthology.orchestra.run.vm03.stdout:Acquiring lock 140433623145344 on /run/cephadm/2b05df78-1bd2-11f1-83c0-c950214d6edc.lock 2026-03-09T16:09:30.365 INFO:teuthology.orchestra.run.vm03.stdout:Lock 140433623145344 acquired on /run/cephadm/2b05df78-1bd2-11f1-83c0-c950214d6edc.lock 2026-03-09T16:09:30.365 INFO:teuthology.orchestra.run.vm03.stdout:Verifying IP 192.168.123.103 port 3300 ... 2026-03-09T16:09:30.365 INFO:teuthology.orchestra.run.vm03.stdout:Verifying IP 192.168.123.103 port 6789 ... 
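The "final host check" above verifies the container engine, LVM and time synchronization; the same prerequisites can be checked by hand. The missing chrony.service is harmless here, since chronyd.service is enabled and active:

    # roughly the prerequisite checks cephadm performs above
    podman --version                      # container engine present (5.8.0 in this run)
    command -v lvcreate                   # lvm2 installed
    systemctl is-enabled chronyd.service  # time-sync unit enabled
    systemctl is-active chronyd.service   # ...and running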
2026-03-09T16:09:30.365 INFO:teuthology.orchestra.run.vm03.stdout:Base mon IP(s) is [192.168.123.103:3300, 192.168.123.103:6789], mon addrv is [v2:192.168.123.103:3300,v1:192.168.123.103:6789] 2026-03-09T16:09:30.368 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.103 metric 100 2026-03-09T16:09:30.368 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout 192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.103 metric 100 2026-03-09T16:09:30.371 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout ::1 dev lo proto kernel metric 256 pref medium 2026-03-09T16:09:30.371 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout fe80::/64 dev eth0 proto kernel metric 1024 pref medium 2026-03-09T16:09:30.373 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout 1: lo: mtu 65536 state UNKNOWN qlen 1000 2026-03-09T16:09:30.373 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout inet6 ::1/128 scope host 2026-03-09T16:09:30.373 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever 2026-03-09T16:09:30.373 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout 2: eth0: mtu 1500 state UP qlen 1000 2026-03-09T16:09:30.373 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout inet6 fe80::5055:ff:fe00:3/64 scope link noprefixroute 2026-03-09T16:09:30.373 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever 2026-03-09T16:09:30.373 INFO:teuthology.orchestra.run.vm03.stdout:Mon IP `192.168.123.103` is in CIDR network `192.168.123.0/24` 2026-03-09T16:09:30.373 INFO:teuthology.orchestra.run.vm03.stdout:Mon IP `192.168.123.103` is in CIDR network `192.168.123.0/24` 2026-03-09T16:09:30.373 INFO:teuthology.orchestra.run.vm03.stdout:Inferred mon public CIDR from local network configuration ['192.168.123.0/24', '192.168.123.0/24'] 2026-03-09T16:09:30.374 INFO:teuthology.orchestra.run.vm03.stdout:Internal network (--cluster-network) has not been provided, OSD replication will default to the public_network 2026-03-09T16:09:30.374 INFO:teuthology.orchestra.run.vm03.stdout:Pulling container image quay.ceph.io/ceph-ci/ceph:reef... 2026-03-09T16:09:31.736 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stdout b6fe7eb6a9d0f9b143033c43e1cdf7ef0918719fc7cff0dd0e2c113bb482fdd6 2026-03-09T16:09:31.736 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Trying to pull quay.ceph.io/ceph-ci/ceph:reef... 
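The mon public CIDR is inferred from the host routing table shown above; the same information is available via ip route, and the network that bootstrap later records (see "Setting public_network to 192.168.123.0/24" further down) could, approximately, be set by hand:

    ip route show    # 192.168.123.0/24 dev eth0 ... src 192.168.123.103
    # roughly what the later "Setting public_network ..." step amounts to:
    sudo cephadm shell -- ceph config set global public_network 192.168.123.0/24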
2026-03-09T16:09:31.736 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Getting image source signatures 2026-03-09T16:09:31.736 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Copying blob sha256:8c0f38fb8a72d42ac81f075843e5360929f695c9f93c12951e7539b9ed9b1b5f 2026-03-09T16:09:31.736 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Copying blob sha256:8e380faede39ebd4286247457b408d979ab568aafd8389c42ec304b8cfba4e92 2026-03-09T16:09:31.736 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Copying config sha256:b6fe7eb6a9d0f9b143033c43e1cdf7ef0918719fc7cff0dd0e2c113bb482fdd6 2026-03-09T16:09:31.736 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Writing manifest to image destination 2026-03-09T16:09:31.882 INFO:teuthology.orchestra.run.vm03.stdout:ceph: stdout ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable) 2026-03-09T16:09:31.882 INFO:teuthology.orchestra.run.vm03.stdout:Ceph version: ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable) 2026-03-09T16:09:31.882 INFO:teuthology.orchestra.run.vm03.stdout:Extracting ceph user uid/gid from container image... 2026-03-09T16:09:31.972 INFO:teuthology.orchestra.run.vm03.stdout:stat: stdout 167 167 2026-03-09T16:09:31.973 INFO:teuthology.orchestra.run.vm03.stdout:Creating initial keys... 2026-03-09T16:09:32.070 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph-authtool: stdout AQC88K5prJEbAxAAxRB/2ATWpTUj+irhL20PYw== 2026-03-09T16:09:32.160 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph-authtool: stdout AQC88K5pTb1eCBAAX8gsPyOnMv7qmIr+P301OA== 2026-03-09T16:09:32.258 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph-authtool: stdout AQC88K5p92hKDhAA+VvXOtfa7EY1ircf/2AuVQ== 2026-03-09T16:09:32.259 INFO:teuthology.orchestra.run.vm03.stdout:Creating initial monmap... 2026-03-09T16:09:33.281 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: monmap file /tmp/monmap 2026-03-09T16:09:33.281 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: stdout setting min_mon_release = pacific 2026-03-09T16:09:33.281 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: set fsid to 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:09:33.281 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors) 2026-03-09T16:09:33.281 INFO:teuthology.orchestra.run.vm03.stdout:monmaptool for vm03 [v2:192.168.123.103:3300,v1:192.168.123.103:6789] on /usr/bin/monmaptool: monmap file /tmp/monmap 2026-03-09T16:09:33.281 INFO:teuthology.orchestra.run.vm03.stdout:setting min_mon_release = pacific 2026-03-09T16:09:33.281 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: set fsid to 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:09:33.281 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors) 2026-03-09T16:09:33.281 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:09:33.281 INFO:teuthology.orchestra.run.vm03.stdout:Creating mon... 2026-03-09T16:09:33.549 INFO:teuthology.orchestra.run.vm03.stdout:create mon.vm03 on 2026-03-09T16:09:33.726 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Removed "/etc/systemd/system/multi-user.target.wants/ceph.target". 
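The initial single-mon monmap created above can be reproduced or inspected with monmaptool directly (run inside a ceph container, e.g. via cephadm shell); the address vector and fsid are the ones from this run:

    # approximately what the monmap creation above does
    monmaptool --create --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc \
        --addv vm03 '[v2:192.168.123.103:3300,v1:192.168.123.103:6789]' /tmp/monmap
    monmaptool --print /tmp/monmap    # verify the epoch-0 map with one monitor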
2026-03-09T16:09:33.856 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /etc/systemd/system/ceph.target. 2026-03-09T16:09:34.040 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc.target → /etc/systemd/system/ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc.target. 2026-03-09T16:09:34.040 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph.target.wants/ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc.target → /etc/systemd/system/ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc.target. 2026-03-09T16:09:34.238 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm03 2026-03-09T16:09:34.238 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Failed to reset failed state of unit ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm03.service: Unit ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm03.service not loaded. 2026-03-09T16:09:34.390 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc.target.wants/ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm03.service → /etc/systemd/system/ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@.service. 2026-03-09T16:09:34.672 INFO:teuthology.orchestra.run.vm03.stdout:firewalld does not appear to be present 2026-03-09T16:09:34.672 INFO:teuthology.orchestra.run.vm03.stdout:Not possible to enable service . firewalld.service is not available 2026-03-09T16:09:34.672 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mon to start... 2026-03-09T16:09:34.672 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mon... 
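The monitor now runs as a containerized systemd unit named after the cluster fsid; its state and logs can be followed with systemctl and journalctl (the test itself tails the same unit, as shown earlier):

    systemctl status ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm03.service
    sudo journalctl -f -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm03.service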
2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout cluster: 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout id: 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout health: HEALTH_OK 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout services: 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon: 1 daemons, quorum vm03 (age 0.152681s) 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mgr: no daemons active 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd: 0 osds: 0 up, 0 in 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout data: 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout pools: 0 pools, 0 pgs 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout objects: 0 objects, 0 B 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout usage: 0 B used, 0 B / 0 B avail 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout pgs: 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.807+0000 7f927c7e8640 1 Processor -- start 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.808+0000 7f927c7e8640 1 -- start start 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.808+0000 7f927c7e8640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9274104850 0x7f9274106c60 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.808+0000 7f927c7e8640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9274079ec0 con 0x7f9274104850 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.809+0000 7f927a55d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9274104850 0x7f9274106c60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:34.901 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.809+0000 7f927a55d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9274104850 0x7f9274106c60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59020/0 (socket says 192.168.123.103:59020) 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.809+0000 7f927a55d640 1 -- 192.168.123.103:0/1955469574 learned_addr learned my addr 192.168.123.103:0/1955469574 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:34.902 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.811+0000 7f927a55d640 1 -- 192.168.123.103:0/1955469574 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f92741071a0 con 0x7f9274104850 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.812+0000 7f927a55d640 1 --2- 192.168.123.103:0/1955469574 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9274104850 0x7f9274106c60 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f9264009b80 tx=0x7f926402f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=17409702c5f2895f server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.812+0000 7f927955b640 1 -- 192.168.123.103:0/1955469574 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f926402fa10 con 0x7f9274104850 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.812+0000 7f927955b640 1 -- 192.168.123.103:0/1955469574 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f926402fb70 con 0x7f9274104850 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.812+0000 7f927955b640 1 -- 192.168.123.103:0/1955469574 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f926402fe40 con 0x7f9274104850 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.813+0000 7f927c7e8640 1 -- 192.168.123.103:0/1955469574 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9274104850 msgr2=0x7f9274106c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.813+0000 7f927c7e8640 1 --2- 192.168.123.103:0/1955469574 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9274104850 0x7f9274106c60 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f9264009b80 tx=0x7f926402f190 comp rx=0 tx=0).stop 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.813+0000 7f927c7e8640 1 -- 192.168.123.103:0/1955469574 shutdown_connections 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.813+0000 7f927c7e8640 1 --2- 192.168.123.103:0/1955469574 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9274104850 0x7f9274106c60 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.813+0000 7f927c7e8640 1 -- 192.168.123.103:0/1955469574 >> 192.168.123.103:0/1955469574 conn(0x7f9274100680 msgr2=0x7f9274102ac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.813+0000 7f927c7e8640 1 -- 192.168.123.103:0/1955469574 shutdown_connections 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.813+0000 7f927c7e8640 1 -- 192.168.123.103:0/1955469574 wait complete. 
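The status output above (one mon in quorum, no mgr, no OSDs yet) is what bootstrap polls during "Waiting for mon"; the same status can be queried by hand once the admin config and keyring are in place:

    sudo cephadm shell -- ceph -s    # run the CLI from the ceph container
    sudo ceph -s                     # or directly, if ceph-common is installed on the host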
2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.814+0000 7f927c7e8640 1 Processor -- start 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.814+0000 7f927c7e8640 1 -- start start 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.814+0000 7f927c7e8640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9274071fb0 0x7f92740723d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.814+0000 7f927c7e8640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9274072910 con 0x7f9274071fb0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.814+0000 7f927a55d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9274071fb0 0x7f92740723d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.814+0000 7f927a55d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9274071fb0 0x7f92740723d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59034/0 (socket says 192.168.123.103:59034) 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.815+0000 7f927a55d640 1 -- 192.168.123.103:0/414884885 learned_addr learned my addr 192.168.123.103:0/414884885 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.815+0000 7f927a55d640 1 -- 192.168.123.103:0/414884885 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f92640095d0 con 0x7f9274071fb0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.815+0000 7f927a55d640 1 --2- 192.168.123.103:0/414884885 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9274071fb0 0x7f92740723d0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f9264002410 tx=0x7f92640047c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.816+0000 7f92637fe640 1 -- 192.168.123.103:0/414884885 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f926402fa10 con 0x7f9274071fb0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.816+0000 7f927c7e8640 1 -- 192.168.123.103:0/414884885 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9274072b10 con 0x7f9274071fb0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.816+0000 7f927c7e8640 1 -- 192.168.123.103:0/414884885 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f92741add70 con 0x7f9274071fb0 2026-03-09T16:09:34.902 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.816+0000 7f92637fe640 1 -- 192.168.123.103:0/414884885 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f9264003cc0 con 0x7f9274071fb0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.816+0000 7f92637fe640 1 -- 192.168.123.103:0/414884885 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9264037560 con 0x7f9274071fb0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.816+0000 7f92637fe640 1 -- 192.168.123.103:0/414884885 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 672+0+0 (secure 0 0 0) 0x7f9264037a00 con 0x7f9274071fb0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.817+0000 7f92637fe640 1 -- 192.168.123.103:0/414884885 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f9264040b50 con 0x7f9274071fb0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.817+0000 7f927c7e8640 1 -- 192.168.123.103:0/414884885 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9274107510 con 0x7f9274071fb0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.819+0000 7f92637fe640 1 -- 192.168.123.103:0/414884885 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+74806 (secure 0 0 0) 0x7f9264046020 con 0x7f9274071fb0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.856+0000 7f927c7e8640 1 -- 192.168.123.103:0/414884885 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "status"} v 0) v1 -- 0x7f92741adf60 con 0x7f9274071fb0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.860+0000 7f92637fe640 1 -- 192.168.123.103:0/414884885 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "status"}]=0 v0) v1 ==== 54+0+320 (secure 0 0 0) 0x7f926403c070 con 0x7f9274071fb0 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.861+0000 7f927c7e8640 1 -- 192.168.123.103:0/414884885 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9274071fb0 msgr2=0x7f92740723d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.861+0000 7f927c7e8640 1 --2- 192.168.123.103:0/414884885 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9274071fb0 0x7f92740723d0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f9264002410 tx=0x7f92640047c0 comp rx=0 tx=0).stop 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.862+0000 7f927c7e8640 1 -- 192.168.123.103:0/414884885 shutdown_connections 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.862+0000 7f927c7e8640 1 --2- 192.168.123.103:0/414884885 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9274071fb0 0x7f92740723d0 unknown :-1 s=CLOSED pgs=2 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.862+0000 7f927c7e8640 1 -- 192.168.123.103:0/414884885 >> 192.168.123.103:0/414884885 conn(0x7f9274100680 msgr2=0x7f9274101350 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.862+0000 7f927c7e8640 1 -- 192.168.123.103:0/414884885 shutdown_connections 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:34.862+0000 7f927c7e8640 1 -- 192.168.123.103:0/414884885 wait complete. 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:mon is available 2026-03-09T16:09:34.902 INFO:teuthology.orchestra.run.vm03.stdout:Assimilating anything we can from ceph.conf... 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [global] 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout fsid = 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_cluster_log_file_level = debug 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_host = [v2:192.168.123.103:3300,v1:192.168.123.103:6789] 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [mgr] 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [osd] 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.022+0000 7fb13a2da640 1 Processor -- start 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.023+0000 7fb13a2da640 1 -- start start 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.023+0000 7fb13a2da640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb134106130 0x7fb134106530 unknown :-1 
s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.023+0000 7fb13a2da640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb134106b00 con 0x7fb134106130 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.024+0000 7fb133fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb134106130 0x7fb134106530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.024+0000 7fb133fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb134106130 0x7fb134106530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59044/0 (socket says 192.168.123.103:59044) 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.024+0000 7fb133fff640 1 -- 192.168.123.103:0/3468963021 learned_addr learned my addr 192.168.123.103:0/3468963021 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.024+0000 7fb133fff640 1 -- 192.168.123.103:0/3468963021 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb134107290 con 0x7fb134106130 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.025+0000 7fb133fff640 1 --2- 192.168.123.103:0/3468963021 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb134106130 0x7fb134106530 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7fb120009b80 tx=0x7fb12002f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=b22c31c21e3f6c07 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:35.133 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.025+0000 7fb132ffd640 1 -- 192.168.123.103:0/3468963021 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb12002fa10 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.025+0000 7fb132ffd640 1 -- 192.168.123.103:0/3468963021 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7fb12002fb70 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.025+0000 7fb132ffd640 1 -- 192.168.123.103:0/3468963021 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb12002fe40 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.028+0000 7fb13a2da640 1 -- 192.168.123.103:0/3468963021 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb134106130 msgr2=0x7fb134106530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.028+0000 7fb13a2da640 1 --2- 192.168.123.103:0/3468963021 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7fb134106130 0x7fb134106530 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7fb120009b80 tx=0x7fb12002f190 comp rx=0 tx=0).stop 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.028+0000 7fb13a2da640 1 -- 192.168.123.103:0/3468963021 shutdown_connections 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.028+0000 7fb13a2da640 1 --2- 192.168.123.103:0/3468963021 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb134106130 0x7fb134106530 unknown :-1 s=CLOSED pgs=3 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.028+0000 7fb13a2da640 1 -- 192.168.123.103:0/3468963021 >> 192.168.123.103:0/3468963021 conn(0x7fb134101960 msgr2=0x7fb134103d80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.028+0000 7fb13a2da640 1 -- 192.168.123.103:0/3468963021 shutdown_connections 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.028+0000 7fb13a2da640 1 -- 192.168.123.103:0/3468963021 wait complete. 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.029+0000 7fb13a2da640 1 Processor -- start 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.029+0000 7fb13a2da640 1 -- start start 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.029+0000 7fb13a2da640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb134106130 0x7fb1341990b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.029+0000 7fb13a2da640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb1341995f0 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.030+0000 7fb133fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb134106130 0x7fb1341990b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.030+0000 7fb133fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb134106130 0x7fb1341990b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59050/0 (socket says 192.168.123.103:59050) 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.030+0000 7fb133fff640 1 -- 192.168.123.103:0/2988511696 learned_addr learned my addr 192.168.123.103:0/2988511696 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.030+0000 7fb133fff640 1 -- 192.168.123.103:0/2988511696 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb1200095d0 con 0x7fb134106130 
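The "Assimilating anything we can from ceph.conf" step above pushes the seed ceph.conf into the monitor's centralized config database; the CLI equivalent is ceph config assimilate-conf, which prints back a minimal conf containing whatever it could not store centrally (the [global]/[mgr]/[osd] snippet shown above):

    # roughly the assimilation step logged above
    sudo cephadm shell -- ceph config assimilate-conf -i /etc/ceph/ceph.conf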
2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.030+0000 7fb133fff640 1 --2- 192.168.123.103:0/2988511696 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb134106130 0x7fb1341990b0 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7fb120002410 tx=0x7fb1200047c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.031+0000 7fb1317fa640 1 -- 192.168.123.103:0/2988511696 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb120047070 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.031+0000 7fb1317fa640 1 -- 192.168.123.103:0/2988511696 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7fb120003cc0 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.031+0000 7fb1317fa640 1 -- 192.168.123.103:0/2988511696 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb120037560 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.032+0000 7fb13a2da640 1 -- 192.168.123.103:0/2988511696 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb1341997f0 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.032+0000 7fb13a2da640 1 -- 192.168.123.103:0/2988511696 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb134199c90 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.033+0000 7fb1317fa640 1 -- 192.168.123.103:0/2988511696 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 672+0+0 (secure 0 0 0) 0x7fb1200379c0 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.033+0000 7fb1317fa640 1 -- 192.168.123.103:0/2988511696 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fb120043a40 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.033+0000 7fb13a2da640 1 -- 192.168.123.103:0/2988511696 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb1341074e0 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.035+0000 7fb1317fa640 1 -- 192.168.123.103:0/2988511696 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+74806 (secure 0 0 0) 0x7fb12003c070 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.068+0000 7fb13a2da640 1 -- 192.168.123.103:0/2988511696 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7fb1341076a0 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.073+0000 
7fb1317fa640 1 -- 192.168.123.103:0/2988511696 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v2) v1 ==== 70+0+471 (secure 0 0 0) 0x7fb12005bd80 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.073+0000 7fb1317fa640 1 -- 192.168.123.103:0/2988511696 <== mon.0 v2:192.168.123.103:3300/0 8 ==== config(26 keys) v1 ==== 1003+0+0 (secure 0 0 0) 0x7fb120037da0 con 0x7fb134106130 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.074+0000 7fb13a2da640 1 -- 192.168.123.103:0/2988511696 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb134106130 msgr2=0x7fb1341990b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.074+0000 7fb13a2da640 1 --2- 192.168.123.103:0/2988511696 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb134106130 0x7fb1341990b0 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7fb120002410 tx=0x7fb1200047c0 comp rx=0 tx=0).stop 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.075+0000 7fb13a2da640 1 -- 192.168.123.103:0/2988511696 shutdown_connections 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.075+0000 7fb13a2da640 1 --2- 192.168.123.103:0/2988511696 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb134106130 0x7fb1341990b0 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.075+0000 7fb13a2da640 1 -- 192.168.123.103:0/2988511696 >> 192.168.123.103:0/2988511696 conn(0x7fb134101960 msgr2=0x7fb1341025b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.075+0000 7fb13a2da640 1 -- 192.168.123.103:0/2988511696 shutdown_connections 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.075+0000 7fb13a2da640 1 -- 192.168.123.103:0/2988511696 wait complete. 2026-03-09T16:09:35.134 INFO:teuthology.orchestra.run.vm03.stdout:Generating new minimal ceph.conf... 
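After assimilation, bootstrap replaces the seed file with a minimal ceph.conf; the same file can be regenerated at any time and typically contains little more than the fsid and mon_host:

    # regenerate the minimal client config that bootstrap writes here
    sudo cephadm shell -- ceph config generate-minimal-conf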
2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.260+0000 7f2b5a7f9640 1 Processor -- start 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.260+0000 7f2b5a7f9640 1 -- start start 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.261+0000 7f2b5a7f9640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b541082f0 0x7f2b541086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.261+0000 7f2b5a7f9640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2b54108cc0 con 0x7f2b541082f0 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.261+0000 7f2b53fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b541082f0 0x7f2b541086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.261+0000 7f2b53fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b541082f0 0x7f2b541086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59058/0 (socket says 192.168.123.103:59058) 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.261+0000 7f2b53fff640 1 -- 192.168.123.103:0/837442012 learned_addr learned my addr 192.168.123.103:0/837442012 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.262+0000 7f2b53fff640 1 -- 192.168.123.103:0/837442012 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2b541094a0 con 0x7f2b541082f0 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.262+0000 7f2b53fff640 1 --2- 192.168.123.103:0/837442012 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b541082f0 0x7f2b541086f0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7f2b40009b80 tx=0x7f2b4002f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=53d831105d9afcdf server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.263+0000 7f2b52ffd640 1 -- 192.168.123.103:0/837442012 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2b4002fa10 con 0x7f2b541082f0 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.263+0000 7f2b52ffd640 1 -- 192.168.123.103:0/837442012 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(26 keys) v1 ==== 1003+0+0 (secure 0 0 0) 0x7f2b40037440 con 0x7f2b541082f0 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.263+0000 7f2b52ffd640 1 -- 192.168.123.103:0/837442012 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2b400354e0 con 0x7f2b541082f0 
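Whether the test's overrides actually landed in the monitor's config database can be spot-checked per option or dumped wholesale:

    sudo cephadm shell -- ceph config dump | grep -e bluestore -e complaint
    sudo cephadm shell -- ceph config get osd osd_objectstore   # expect bluestore, per the overrides above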
2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.264+0000 7f2b5a7f9640 1 -- 192.168.123.103:0/837442012 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b541082f0 msgr2=0x7f2b541086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.264+0000 7f2b5a7f9640 1 --2- 192.168.123.103:0/837442012 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b541082f0 0x7f2b541086f0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7f2b40009b80 tx=0x7f2b4002f190 comp rx=0 tx=0).stop 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.264+0000 7f2b5a7f9640 1 -- 192.168.123.103:0/837442012 shutdown_connections 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.264+0000 7f2b5a7f9640 1 --2- 192.168.123.103:0/837442012 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b541082f0 0x7f2b541086f0 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.264+0000 7f2b5a7f9640 1 -- 192.168.123.103:0/837442012 >> 192.168.123.103:0/837442012 conn(0x7f2b5407ba00 msgr2=0x7f2b541066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.264+0000 7f2b5a7f9640 1 -- 192.168.123.103:0/837442012 shutdown_connections 2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.264+0000 7f2b5a7f9640 1 -- 192.168.123.103:0/837442012 wait complete. 
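The config database view is cluster-wide; what a particular running daemon actually sees (database values plus any local overrides) may be reported with ceph config show, e.g. for the freshly bootstrapped monitor:

    sudo cephadm shell -- ceph config show mon.vm03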
2026-03-09T16:09:35.357 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.265+0000 7f2b5a7f9640 1 Processor -- start 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.265+0000 7f2b5a7f9640 1 -- start start 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.265+0000 7f2b5a7f9640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b541082f0 0x7f2b5419ddd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.266+0000 7f2b5a7f9640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2b5419e310 con 0x7f2b541082f0 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.266+0000 7f2b53fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b541082f0 0x7f2b5419ddd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.266+0000 7f2b53fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b541082f0 0x7f2b5419ddd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59072/0 (socket says 192.168.123.103:59072) 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.266+0000 7f2b53fff640 1 -- 192.168.123.103:0/1207513903 learned_addr learned my addr 192.168.123.103:0/1207513903 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.266+0000 7f2b53fff640 1 -- 192.168.123.103:0/1207513903 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2b400095d0 con 0x7f2b541082f0 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.267+0000 7f2b53fff640 1 --2- 192.168.123.103:0/1207513903 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b541082f0 0x7f2b5419ddd0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f2b40009b50 tx=0x7f2b40037920 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.267+0000 7f2b517fa640 1 -- 192.168.123.103:0/1207513903 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2b40037b30 con 0x7f2b541082f0 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.267+0000 7f2b517fa640 1 -- 192.168.123.103:0/1207513903 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(26 keys) v1 ==== 1003+0+0 (secure 0 0 0) 0x7f2b40035bf0 con 0x7f2b541082f0 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.267+0000 7f2b517fa640 1 -- 192.168.123.103:0/1207513903 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2b40042ca0 con 0x7f2b541082f0 2026-03-09T16:09:35.358 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.268+0000 7f2b5a7f9640 1 -- 192.168.123.103:0/1207513903 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2b5419e510 con 0x7f2b541082f0 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.268+0000 7f2b5a7f9640 1 -- 192.168.123.103:0/1207513903 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2b5419e950 con 0x7f2b541082f0 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.268+0000 7f2b517fa640 1 -- 192.168.123.103:0/1207513903 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 672+0+0 (secure 0 0 0) 0x7f2b4003e070 con 0x7f2b541082f0 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.268+0000 7f2b517fa640 1 -- 192.168.123.103:0/1207513903 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f2b4004baf0 con 0x7f2b541082f0 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.269+0000 7f2b5a7f9640 1 -- 192.168.123.103:0/1207513903 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2b5410c6d0 con 0x7f2b541082f0 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.271+0000 7f2b517fa640 1 -- 192.168.123.103:0/1207513903 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+74806 (secure 0 0 0) 0x7f2b4003c070 con 0x7f2b541082f0 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.304+0000 7f2b5a7f9640 1 -- 192.168.123.103:0/1207513903 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7f2b541a1290 con 0x7f2b541082f0 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.305+0000 7f2b517fa640 1 -- 192.168.123.103:0/1207513903 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v2) v1 ==== 76+0+181 (secure 0 0 0) 0x7f2b40035d60 con 0x7f2b541082f0 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.307+0000 7f2b5a7f9640 1 -- 192.168.123.103:0/1207513903 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b541082f0 msgr2=0x7f2b5419ddd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.307+0000 7f2b5a7f9640 1 --2- 192.168.123.103:0/1207513903 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b541082f0 0x7f2b5419ddd0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f2b40009b50 tx=0x7f2b40037920 comp rx=0 tx=0).stop 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.307+0000 7f2b5a7f9640 1 -- 192.168.123.103:0/1207513903 shutdown_connections 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.307+0000 7f2b5a7f9640 1 --2- 192.168.123.103:0/1207513903 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b541082f0 0x7f2b5419ddd0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.307+0000 7f2b5a7f9640 1 -- 192.168.123.103:0/1207513903 >> 192.168.123.103:0/1207513903 conn(0x7f2b5407ba00 msgr2=0x7f2b54105d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.308+0000 7f2b5a7f9640 1 -- 192.168.123.103:0/1207513903 shutdown_connections 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.308+0000 7f2b5a7f9640 1 -- 192.168.123.103:0/1207513903 wait complete. 2026-03-09T16:09:35.358 INFO:teuthology.orchestra.run.vm03.stdout:Restarting the monitor... 2026-03-09T16:09:35.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 systemd[1]: Starting Ceph mon.vm03 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 2026-03-09T16:09:35.744 INFO:teuthology.orchestra.run.vm03.stdout:Setting public_network to 192.168.123.0/24 in global config section 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 podman[51005]: 2026-03-09 16:09:35.699938993 +0000 UTC m=+0.016140640 container create b86752d320b61b3ceca5109a3888bfe85ef5a66fbb23f1bd16a00fa292da0bd4 (image=quay.ceph.io/ceph-ci/ceph:reef, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, CEPH_REF=reef) 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 podman[51005]: 2026-03-09 16:09:35.72992374 +0000 UTC m=+0.046125397 container init b86752d320b61b3ceca5109a3888bfe85ef5a66fbb23f1bd16a00fa292da0bd4 (image=quay.ceph.io/ceph-ci/ceph:reef, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 podman[51005]: 2026-03-09 16:09:35.734318928 +0000 UTC m=+0.050520575 container start b86752d320b61b3ceca5109a3888bfe85ef5a66fbb23f1bd16a00fa292da0bd4 (image=quay.ceph.io/ceph-ci/ceph:reef, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, 
OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 bash[51005]: b86752d320b61b3ceca5109a3888bfe85ef5a66fbb23f1bd16a00fa292da0bd4 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 podman[51005]: 2026-03-09 16:09:35.6933944 +0000 UTC m=+0.009596047 image pull b6fe7eb6a9d0f9b143033c43e1cdf7ef0918719fc7cff0dd0e2c113bb482fdd6 quay.ceph.io/ceph-ci/ceph:reef 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 systemd[1]: Started Ceph mon.vm03 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: set uid:gid to 167:167 (ceph:ceph) 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable), process ceph-mon, pid 2 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: pidfile_write: ignore empty --pid-file 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: load: jerasure load: lrc 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: RocksDB version: 7.9.2 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Git sha 0 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Compile date 2026-02-26 02:56:47 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: DB SUMMARY 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: DB Session ID: U19G4FBGAC2RIN6CFHIG 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: CURRENT file: CURRENT 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: IDENTITY file: IDENTITY 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: MANIFEST file: MANIFEST-000010 size: 179 Bytes 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm03/store.db dir, Total Num: 1, files: 000008.sst 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm03/store.db: 000009.log size: 88970 ; 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.error_if_exists: 0 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: 
Options.create_if_missing: 0 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.paranoid_checks: 1 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-09T16:09:35.962 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.env: 0x5641fda3fee0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.fs: PosixFileSystem 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.info_log: 0x5641ff4a41e0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_file_opening_threads: 16 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.statistics: (nil) 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.use_fsync: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_log_file_size: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.keep_log_file_num: 1000 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.recycle_log_file_num: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.allow_fallocate: 1 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.allow_mmap_reads: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.allow_mmap_writes: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.use_direct_reads: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.create_missing_column_families: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.db_log_dir: 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.wal_dir: 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-09T16:09:35.963 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.advise_random_on_open: 1 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.db_write_buffer_size: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.write_buffer_manager: 0x5641ff4b43c0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.rate_limiter: (nil) 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.wal_recovery_mode: 2 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.enable_thread_tracking: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.enable_pipelined_write: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.unordered_write: 0 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-09T16:09:35.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.row_cache: None 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.wal_filter: None 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.avoid_flush_during_recovery: 
0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.allow_ingest_behind: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.two_write_queues: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.manual_wal_flush: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.wal_compression: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.atomic_flush: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.log_readahead_size: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.best_efforts_recovery: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.allow_data_in_errors: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.db_host_id: __hostname__ 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_background_jobs: 2 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_background_compactions: -1 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_subcompactions: 1 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_total_wal_size: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: 
Options.delete_obsolete_files_period_micros: 21600000000 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_open_files: -1 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.bytes_per_sync: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compaction_readahead_size: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_background_flushes: -1 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Compression algorithms supported: 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: kZSTD supported: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: kXpressCompression supported: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: kBZip2Compression supported: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-09T16:09:35.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: kLZ4Compression supported: 1 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: kZlibCompression supported: 1 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: kLZ4HCCompression supported: 1 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: kSnappyCompression supported: 1 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm03/store.db/MANIFEST-000010 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.comparator: leveldb.BytewiseComparator 
2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.merge_operator: 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compaction_filter: None 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compaction_filter_factory: None 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.sst_partitioner_factory: None 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5641ff4f5c80) 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: cache_index_and_filter_blocks: 1 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: pin_top_level_index_and_filter: 1 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_type: 0 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: data_block_index_type: 0 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_shortening: 1 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: checksum: 4 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: no_block_cache: 0 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache: 0x5641ff4c7350 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_name: BinnedLRUCache 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_options: 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: capacity : 536870912 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: num_shard_bits : 4 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: strict_capacity_limit : 0 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: high_pri_pool_ratio: 0.000 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_compressed: (nil) 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: persistent_cache: (nil) 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_size: 4096 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_size_deviation: 10 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_restart_interval: 16 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_block_restart_interval: 1 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: metadata_block_size: 4096 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: partition_filters: 0 2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: use_delta_encoding: 1 
2026-03-09T16:09:35.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout: filter_policy: bloomfilter 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout: whole_key_filtering: 1 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout: verify_compression: 0 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout: read_amp_bytes_per_bit: 0 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout: format_version: 5 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout: enable_index_compression: 1 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_align: 0 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout: max_auto_readahead_size: 262144 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout: prepopulate_block_cache: 0 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout: initial_auto_readahead_size: 8192 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout: num_file_reads_for_auto_readahead: 2 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.write_buffer_size: 33554432 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_write_buffer_number: 2 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compression: NoCompression 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.bottommost_compression: Disabled 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.prefix_extractor: nullptr 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.num_levels: 7 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 
2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compression_opts.level: 32767 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compression_opts.strategy: 0 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compression_opts.enabled: false 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.target_file_size_base: 67108864 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-09T16:09:35.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-09T16:09:35.967 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.arena_block_size: 1048576 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.disable_auto_compactions: 0 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 
16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.inplace_update_support: 0 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.bloom_locality: 0 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.max_successive_merges: 0 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.paranoid_file_checks: 0 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.force_consistency_checks: 1 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.report_bg_io_stats: 0 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.ttl: 2592000 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-09T16:09:35.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.enable_blob_files: false 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.min_blob_size: 0 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.blob_file_size: 268435456 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-09T16:09:35.968 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.blob_file_starting_level: 0 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm03/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 12, last_sequence is 5, log_number is 5,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 5 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 0b715cc5-6ec8-4c80-a4cb-77ec8267e53d 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773072575768484, "job": 1, "event": "recovery_started", "wal_files": [9]} 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #9 mode 2 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773072575770651, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 13, "file_size": 84616, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8, "largest_seqno": 285, "table_properties": {"data_size": 82738, "index_size": 203, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 645, "raw_key_size": 13151, "raw_average_key_size": 51, "raw_value_size": 75663, "raw_average_value_size": 295, "num_data_blocks": 9, "num_entries": 256, "num_filter_entries": 256, "num_deletions": 3, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773072575, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0b715cc5-6ec8-4c80-a4cb-77ec8267e53d", "db_session_id": "U19G4FBGAC2RIN6CFHIG", "orig_file_number": 13, "seqno_to_time_mapping": "N/A"}} 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773072575770748, "job": 1, "event": 
"recovery_finished"} 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: [db/version_set.cc:5047] Creating manifest 15 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm03/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5641ff4c8e00 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: rocksdb: DB pointer 0x5641ff5d6000 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: starting mon.vm03 rank 0 at public addrs [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] at bind addrs [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon_data /var/lib/ceph/mon/ceph-vm03 fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: mon.vm03@-1(???) e1 preinit fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: mon.vm03@-1(???).mds e1 new map 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: mon.vm03@-1(???).mds e1 print_map 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout: e1 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout: legacy client fscid: -1 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout: No filesystems configured 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: mon.vm03@-1(???).osd e1 crush map has features 3314932999778484224, adjusting msgr requires 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: mon.vm03@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: mon.vm03@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: mon.vm03@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: mon.vm03@-1(???).paxosservice(auth 1..2) refresh upgraded, format 0 -> 3 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: mon.vm03 is new leader, mons vm03 in quorum (ranks 0) 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 
ceph-mon[51019]: monmap e1: 1 mons at {vm03=[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0]} removed_ranks: {} disallowed_leaders: {} 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: fsmap 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: osdmap e1: 0 total, 0 up, 0 in 2026-03-09T16:09:35.968 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:35 vm03 ceph-mon[51019]: mgrmap e1: no daemons active 2026-03-09T16:09:36.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.896+0000 7f5048c83640 1 Processor -- start 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.896+0000 7f5048c83640 1 -- start start 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.896+0000 7f5048c83640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5044103e20 0x7f5044104220 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.896+0000 7f5048c83640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f50441047f0 con 0x7f5044103e20 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.897+0000 7f5042575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5044103e20 0x7f5044104220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.897+0000 7f5042575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5044103e20 0x7f5044104220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59080/0 (socket says 192.168.123.103:59080) 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.897+0000 7f5042575640 1 -- 192.168.123.103:0/2584942840 learned_addr learned my addr 192.168.123.103:0/2584942840 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.899+0000 7f5042575640 1 -- 192.168.123.103:0/2584942840 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5044104fb0 con 0x7f5044103e20 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.899+0000 7f5042575640 1 --2- 192.168.123.103:0/2584942840 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5044103e20 0x7f5044104220 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f5038009b80 tx=0x7f503802f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=638064ca76b27058 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.899+0000 7f5041573640 1 -- 192.168.123.103:0/2584942840 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f503802fa10 con 0x7f5044103e20 2026-03-09T16:09:36.005 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.899+0000 7f5041573640 1 -- 192.168.123.103:0/2584942840 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(26 keys) v1 ==== 1003+0+0 (secure 0 0 0) 0x7f5038037440 con 0x7f5044103e20 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.899+0000 7f5041573640 1 -- 192.168.123.103:0/2584942840 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f50380354e0 con 0x7f5044103e20 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.900+0000 7f5048c83640 1 -- 192.168.123.103:0/2584942840 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5044103e20 msgr2=0x7f5044104220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.900+0000 7f5048c83640 1 --2- 192.168.123.103:0/2584942840 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5044103e20 0x7f5044104220 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f5038009b80 tx=0x7f503802f190 comp rx=0 tx=0).stop 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.900+0000 7f5048c83640 1 -- 192.168.123.103:0/2584942840 shutdown_connections 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.900+0000 7f5048c83640 1 --2- 192.168.123.103:0/2584942840 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5044103e20 0x7f5044104220 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.900+0000 7f5048c83640 1 -- 192.168.123.103:0/2584942840 >> 192.168.123.103:0/2584942840 conn(0x7f50440ff960 msgr2=0x7f5044101dc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.901+0000 7f5048c83640 1 -- 192.168.123.103:0/2584942840 shutdown_connections 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.901+0000 7f5048c83640 1 -- 192.168.123.103:0/2584942840 wait complete. 
2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.901+0000 7f5048c83640 1 Processor -- start 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.901+0000 7f5048c83640 1 -- start start 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.901+0000 7f5048c83640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5044103e20 0x7f5044112a00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.901+0000 7f5048c83640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f50441142d0 con 0x7f5044103e20 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.902+0000 7f5042575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5044103e20 0x7f5044112a00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.902+0000 7f5042575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5044103e20 0x7f5044112a00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59088/0 (socket says 192.168.123.103:59088) 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.902+0000 7f5042575640 1 -- 192.168.123.103:0/1923006575 learned_addr learned my addr 192.168.123.103:0/1923006575 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.902+0000 7f5042575640 1 -- 192.168.123.103:0/1923006575 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f50380095d0 con 0x7f5044103e20 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.902+0000 7f5042575640 1 --2- 192.168.123.103:0/1923006575 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5044103e20 0x7f5044112a00 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f5038037b50 tx=0x7f5038037b80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.902+0000 7f50337fe640 1 -- 192.168.123.103:0/1923006575 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5038035820 con 0x7f5044103e20 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.903+0000 7f50337fe640 1 -- 192.168.123.103:0/1923006575 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(26 keys) v1 ==== 1003+0+0 (secure 0 0 0) 0x7f5038035dd0 con 0x7f5044103e20 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.903+0000 7f50337fe640 1 -- 192.168.123.103:0/1923006575 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5038042e20 con 0x7f5044103e20 2026-03-09T16:09:36.005 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.903+0000 7f5048c83640 1 -- 192.168.123.103:0/1923006575 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5044112f40 con 0x7f5044103e20 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.903+0000 7f5048c83640 1 -- 192.168.123.103:0/1923006575 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f50441133e0 con 0x7f5044103e20 2026-03-09T16:09:36.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.904+0000 7f50337fe640 1 -- 192.168.123.103:0/1923006575 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 672+0+0 (secure 0 0 0) 0x7f503803e070 con 0x7f5044103e20 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.904+0000 7f50337fe640 1 -- 192.168.123.103:0/1923006575 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f503803f940 con 0x7f5044103e20 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.904+0000 7f5048c83640 1 -- 192.168.123.103:0/1923006575 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5008005350 con 0x7f5044103e20 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.906+0000 7f50337fe640 1 -- 192.168.123.103:0/1923006575 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+74806 (secure 0 0 0) 0x7f5038047070 con 0x7f5044103e20 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.938+0000 7f5048c83640 1 -- 192.168.123.103:0/1923006575 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=public_network}] v 0) v1 -- 0x7f50080058d0 con 0x7f5044103e20 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.946+0000 7f50337fe640 1 -- 192.168.123.103:0/1923006575 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=public_network}]=0 v3) v1 ==== 130+0+0 (secure 0 0 0) 0x7f503803c070 con 0x7f5044103e20 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.946+0000 7f50337fe640 1 -- 192.168.123.103:0/1923006575 <== mon.0 v2:192.168.123.103:3300/0 8 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f5038042440 con 0x7f5044103e20 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.947+0000 7f5048c83640 1 -- 192.168.123.103:0/1923006575 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5044103e20 msgr2=0x7f5044112a00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.947+0000 7f5048c83640 1 --2- 192.168.123.103:0/1923006575 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5044103e20 0x7f5044112a00 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f5038037b50 tx=0x7f5038037b80 comp rx=0 tx=0).stop 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr
2026-03-09T16:09:35.948+0000 7f5048c83640 1 -- 192.168.123.103:0/1923006575 shutdown_connections 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.948+0000 7f5048c83640 1 --2- 192.168.123.103:0/1923006575 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5044103e20 0x7f5044112a00 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.948+0000 7f5048c83640 1 -- 192.168.123.103:0/1923006575 >> 192.168.123.103:0/1923006575 conn(0x7f50440ff960 msgr2=0x7f5044100210 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.948+0000 7f5048c83640 1 -- 192.168.123.103:0/1923006575 shutdown_connections 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:35.948+0000 7f5048c83640 1 -- 192.168.123.103:0/1923006575 wait complete. 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:Wrote config to /etc/ceph/ceph.conf 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:Wrote keyring to /etc/ceph/ceph.client.admin.keyring 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:Creating mgr... 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:Verifying port 0.0.0.0:9283 ... 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:Verifying port 0.0.0.0:8765 ... 2026-03-09T16:09:36.006 INFO:teuthology.orchestra.run.vm03.stdout:Verifying port 0.0.0.0:8443 ... 2026-03-09T16:09:36.153 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mgr.vm03.gbgzmu 2026-03-09T16:09:36.154 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Failed to reset failed state of unit ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mgr.vm03.gbgzmu.service: Unit ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mgr.vm03.gbgzmu.service not loaded. 2026-03-09T16:09:36.299 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc.target.wants/ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mgr.vm03.gbgzmu.service → /etc/systemd/system/ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@.service. 2026-03-09T16:09:36.501 INFO:teuthology.orchestra.run.vm03.stdout:firewalld does not appear to be present 2026-03-09T16:09:36.501 INFO:teuthology.orchestra.run.vm03.stdout:Not possible to enable service . firewalld.service is not available 2026-03-09T16:09:36.501 INFO:teuthology.orchestra.run.vm03.stdout:firewalld does not appear to be present 2026-03-09T16:09:36.501 INFO:teuthology.orchestra.run.vm03.stdout:Not possible to open ports <[9283, 8765, 8443]>. firewalld.service is not available 2026-03-09T16:09:36.501 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mgr to start... 2026-03-09T16:09:36.501 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mgr... 
2026-03-09T16:09:36.771 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsid": "2b05df78-1bd2-11f1-83c0-c950214d6edc", 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "health": { 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 0 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "vm03" 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_age": 0, 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-09T16:09:36.772 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 
"num_objects": 0, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "restful" 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T16:09:36.774 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modified": "2026-03-09T16:09:34.709559+0000", 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.661+0000 7f2761fb4640 1 Processor -- start 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.662+0000 7f2761fb4640 1 -- start start 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.662+0000 7f2761fb4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f275c071840 0x7f275c071c40 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.662+0000 7f2761fb4640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
-- mon_getmap magic: 0 v1 -- 0x7f275c072210 con 0x7f275c071840 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.663+0000 7f2760fb2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f275c071840 0x7f275c071c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.663+0000 7f2760fb2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f275c071840 0x7f275c071c40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59114/0 (socket says 192.168.123.103:59114) 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.663+0000 7f2760fb2640 1 -- 192.168.123.103:0/2910122274 learned_addr learned my addr 192.168.123.103:0/2910122274 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.663+0000 7f2760fb2640 1 -- 192.168.123.103:0/2910122274 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f275c072350 con 0x7f275c071840 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.664+0000 7f2760fb2640 1 --2- 192.168.123.103:0/2910122274 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f275c071840 0x7f275c071c40 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7f274c00a9c0 tx=0x7f274c033650 comp rx=0 tx=0).ready entity=mon.0 client_cookie=b928a3a651804cbf server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.665+0000 7f275b7fe640 1 -- 192.168.123.103:0/2910122274 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f274c037450 con 0x7f275c071840 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.665+0000 7f275b7fe640 1 -- 192.168.123.103:0/2910122274 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f274c036030 con 0x7f275c071840 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.666+0000 7f2761fb4640 1 -- 192.168.123.103:0/2910122274 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f275c071840 msgr2=0x7f275c071c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.666+0000 7f2761fb4640 1 --2- 192.168.123.103:0/2910122274 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f275c071840 0x7f275c071c40 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7f274c00a9c0 tx=0x7f274c033650 comp rx=0 tx=0).stop 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.666+0000 7f2761fb4640 1 -- 192.168.123.103:0/2910122274 shutdown_connections 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.666+0000 7f2761fb4640 1 --2- 192.168.123.103:0/2910122274 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7f275c071840 0x7f275c071c40 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.666+0000 7f2761fb4640 1 -- 192.168.123.103:0/2910122274 >> 192.168.123.103:0/2910122274 conn(0x7f275c06d080 msgr2=0x7f275c06f4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.666+0000 7f2761fb4640 1 -- 192.168.123.103:0/2910122274 shutdown_connections 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.666+0000 7f2761fb4640 1 -- 192.168.123.103:0/2910122274 wait complete. 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.667+0000 7f2761fb4640 1 Processor -- start 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.667+0000 7f2761fb4640 1 -- start start 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.667+0000 7f2761fb4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f275c1b3740 0x7f275c1b3b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.667+0000 7f2761fb4640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f274c03c860 con 0x7f275c1b3740 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.668+0000 7f2760fb2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f275c1b3740 0x7f275c1b3b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.668+0000 7f2760fb2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f275c1b3740 0x7f275c1b3b60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59122/0 (socket says 192.168.123.103:59122) 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.668+0000 7f2760fb2640 1 -- 192.168.123.103:0/297711960 learned_addr learned my addr 192.168.123.103:0/297711960 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.668+0000 7f2760fb2640 1 -- 192.168.123.103:0/297711960 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f274c00a670 con 0x7f275c1b3740 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.668+0000 7f2760fb2640 1 --2- 192.168.123.103:0/297711960 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f275c1b3740 0x7f275c1b3b60 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f274c03d9e0 tx=0x7f274c03da10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.669+0000 7f2759ffb640 1 -- 
192.168.123.103:0/297711960 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f274c010380 con 0x7f275c1b3740 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.669+0000 7f2759ffb640 1 -- 192.168.123.103:0/297711960 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f274c00f070 con 0x7f275c1b3740 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.669+0000 7f2759ffb640 1 -- 192.168.123.103:0/297711960 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f274c045df0 con 0x7f275c1b3740 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.669+0000 7f2761fb4640 1 -- 192.168.123.103:0/297711960 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f275c1b4100 con 0x7f275c1b3740 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.669+0000 7f2761fb4640 1 -- 192.168.123.103:0/297711960 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f275c1b6c40 con 0x7f275c1b3740 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.672+0000 7f273b7fe640 1 -- 192.168.123.103:0/297711960 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2728005350 con 0x7f275c1b3740 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.672+0000 7f2759ffb640 1 -- 192.168.123.103:0/297711960 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 672+0+0 (secure 0 0 0) 0x7f274c04f430 con 0x7f275c1b3740 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.672+0000 7f2759ffb640 1 -- 192.168.123.103:0/297711960 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f274c04e3e0 con 0x7f275c1b3740 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.676+0000 7f2759ffb640 1 -- 192.168.123.103:0/297711960 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+74806 (secure 0 0 0) 0x7f274c0453f0 con 0x7f275c1b3740 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.710+0000 7f273b7fe640 1 -- 192.168.123.103:0/297711960 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f27280058d0 con 0x7f275c1b3740 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.711+0000 7f2759ffb640 1 -- 192.168.123.103:0/297711960 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7f274c045590 con 0x7f275c1b3740 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.714+0000 7f2761fb4640 1 -- 192.168.123.103:0/297711960 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f275c1b3740 msgr2=0x7f275c1b3b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.714+0000 7f2761fb4640 1 --2- 192.168.123.103:0/297711960 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f275c1b3740 0x7f275c1b3b60 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f274c03d9e0 tx=0x7f274c03da10 comp rx=0 tx=0).stop 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.714+0000 7f2761fb4640 1 -- 192.168.123.103:0/297711960 shutdown_connections 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.714+0000 7f2761fb4640 1 --2- 192.168.123.103:0/297711960 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f275c1b3740 0x7f275c1b3b60 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.714+0000 7f2761fb4640 1 -- 192.168.123.103:0/297711960 >> 192.168.123.103:0/297711960 conn(0x7f275c06d080 msgr2=0x7f275c06e960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.715+0000 7f2761fb4640 1 -- 192.168.123.103:0/297711960 shutdown_connections 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:36.715+0000 7f2761fb4640 1 -- 192.168.123.103:0/297711960 wait complete. 2026-03-09T16:09:36.775 INFO:teuthology.orchestra.run.vm03.stdout:mgr not available, waiting (1/15)... 2026-03-09T16:09:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:36 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/1923006575' entity='client.admin' 2026-03-09T16:09:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:36 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/297711960' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-09T16:09:39.021 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:39 vm03 ceph-mon[51019]: from='client.? 
192.168.123.103:0/2753741757' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsid": "2b05df78-1bd2-11f1-83c0-c950214d6edc", 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "health": { 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 0 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "vm03" 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_age": 3, 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-09T16:09:39.038 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stdout "num_pools": 0, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "restful" 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modified": "2026-03-09T16:09:34.709559+0000", 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-09T16:09:39.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.904+0000 7ff667577640 1 Processor -- start 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.904+0000 7ff667577640 1 -- start start 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.904+0000 7ff667577640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff668072f30 0x7ff668071440 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stderr 2026-03-09T16:09:38.904+0000 7ff667577640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff668071a10 con 0x7ff668072f30 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.905+0000 7ff666575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff668072f30 0x7ff668071440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.905+0000 7ff666575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff668072f30 0x7ff668071440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:51866/0 (socket says 192.168.123.103:51866) 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.905+0000 7ff666575640 1 -- 192.168.123.103:0/555685379 learned_addr learned my addr 192.168.123.103:0/555685379 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.906+0000 7ff666575640 1 -- 192.168.123.103:0/555685379 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff668071b50 con 0x7ff668072f30 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.906+0000 7ff666575640 1 --2- 192.168.123.103:0/555685379 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff668072f30 0x7ff668071440 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7ff65c009920 tx=0x7ff65c02ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=40fce166b071e463 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.906+0000 7ff665573640 1 -- 192.168.123.103:0/555685379 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff65c02f9b0 con 0x7ff668072f30 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.907+0000 7ff665573640 1 -- 192.168.123.103:0/555685379 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7ff65c037440 con 0x7ff668072f30 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.907+0000 7ff667577640 1 -- 192.168.123.103:0/555685379 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff668072f30 msgr2=0x7ff668071440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.907+0000 7ff667577640 1 --2- 192.168.123.103:0/555685379 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff668072f30 0x7ff668071440 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7ff65c009920 tx=0x7ff65c02ef20 comp rx=0 tx=0).stop 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.907+0000 7ff667577640 1 -- 192.168.123.103:0/555685379 shutdown_connections 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.907+0000 7ff667577640 1 
--2- 192.168.123.103:0/555685379 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff668072f30 0x7ff668071440 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.907+0000 7ff667577640 1 -- 192.168.123.103:0/555685379 >> 192.168.123.103:0/555685379 conn(0x7ff66806d060 msgr2=0x7ff66806f480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.908+0000 7ff667577640 1 -- 192.168.123.103:0/555685379 shutdown_connections 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.908+0000 7ff667577640 1 -- 192.168.123.103:0/555685379 wait complete. 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.908+0000 7ff667577640 1 Processor -- start 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.908+0000 7ff667577640 1 -- start start 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.908+0000 7ff667577640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6681aadc0 0x7ff6681ab1e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.908+0000 7ff667577640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff65c035340 con 0x7ff6681aadc0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.909+0000 7ff666575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6681aadc0 0x7ff6681ab1e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.909+0000 7ff666575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6681aadc0 0x7ff6681ab1e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:51878/0 (socket says 192.168.123.103:51878) 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.909+0000 7ff666575640 1 -- 192.168.123.103:0/2753741757 learned_addr learned my addr 192.168.123.103:0/2753741757 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.909+0000 7ff666575640 1 -- 192.168.123.103:0/2753741757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff65c0095d0 con 0x7ff6681aadc0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.909+0000 7ff666575640 1 --2- 192.168.123.103:0/2753741757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6681aadc0 0x7ff6681ab1e0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7ff65c0098f0 tx=0x7ff65c0377b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:39.041 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.912+0000 7ff6577fe640 1 -- 192.168.123.103:0/2753741757 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff65c037b50 con 0x7ff6681aadc0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.912+0000 7ff6577fe640 1 -- 192.168.123.103:0/2753741757 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7ff65c037cb0 con 0x7ff6681aadc0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.912+0000 7ff6577fe640 1 -- 192.168.123.103:0/2753741757 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff65c042e50 con 0x7ff6681aadc0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.912+0000 7ff667577640 1 -- 192.168.123.103:0/2753741757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff6681ab720 con 0x7ff6681aadc0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.912+0000 7ff667577640 1 -- 192.168.123.103:0/2753741757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff66807aff0 con 0x7ff6681aadc0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.918+0000 7ff6577fe640 1 -- 192.168.123.103:0/2753741757 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 672+0+0 (secure 0 0 0) 0x7ff65c04c730 con 0x7ff6681aadc0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.918+0000 7ff6577fe640 1 -- 192.168.123.103:0/2753741757 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7ff65c04bae0 con 0x7ff6681aadc0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.918+0000 7ff667577640 1 -- 192.168.123.103:0/2753741757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff62c005350 con 0x7ff6681aadc0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.921+0000 7ff6577fe640 1 -- 192.168.123.103:0/2753741757 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+74806 (secure 0 0 0) 0x7ff65c051050 con 0x7ff6681aadc0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.958+0000 7ff667577640 1 -- 192.168.123.103:0/2753741757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7ff62c0051c0 con 0x7ff6681aadc0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.959+0000 7ff6577fe640 1 -- 192.168.123.103:0/2753741757 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7ff65c047070 con 0x7ff6681aadc0 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.960+0000 7ff6557fa640 1 -- 192.168.123.103:0/2753741757 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6681aadc0 msgr2=0x7ff6681ab1e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.961+0000 7ff6557fa640 1 --2- 192.168.123.103:0/2753741757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6681aadc0 0x7ff6681ab1e0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7ff65c0098f0 tx=0x7ff65c0377b0 comp rx=0 tx=0).stop 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.961+0000 7ff6557fa640 1 -- 192.168.123.103:0/2753741757 shutdown_connections 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.961+0000 7ff6557fa640 1 --2- 192.168.123.103:0/2753741757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6681aadc0 0x7ff6681ab1e0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.961+0000 7ff6557fa640 1 -- 192.168.123.103:0/2753741757 >> 192.168.123.103:0/2753741757 conn(0x7ff66806d060 msgr2=0x7ff668112780 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.961+0000 7ff6557fa640 1 -- 192.168.123.103:0/2753741757 shutdown_connections 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:38.961+0000 7ff6557fa640 1 -- 192.168.123.103:0/2753741757 wait complete. 2026-03-09T16:09:39.041 INFO:teuthology.orchestra.run.vm03.stdout:mgr not available, waiting (2/15)... 
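Annotation: the "mgr not available, waiting (1/15)..." and "(2/15)..." records bracket repeated `ceph status --format json-pretty` calls; the bootstrap keeps polling until the reported `mgrmap.available` flips to true or the 15 attempts are exhausted. A minimal sketch of that loop is below, assuming the `ceph` CLI and an admin keyring are on the host; `wait_for_mgr` and the 2-second delay are illustrative assumptions (the excerpt shows polls roughly 2-3 seconds apart but does not state the interval).

```python
import json
import subprocess
import time

def wait_for_mgr(attempts: int = 15, delay: float = 2.0) -> bool:
    """Poll `ceph status` until the mgrmap reports an available active mgr."""
    for i in range(1, attempts + 1):
        out = subprocess.run(
            ["ceph", "status", "--format", "json-pretty"],
            check=True, capture_output=True, text=True,
        ).stdout
        if json.loads(out).get("mgrmap", {}).get("available"):
            return True
        print(f"mgr not available, waiting ({i}/{attempts})...")
        time.sleep(delay)
    return False
```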
2026-03-09T16:09:40.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:40 vm03 ceph-mon[51019]: Activating manager daemon vm03.gbgzmu 2026-03-09T16:09:40.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:40 vm03 ceph-mon[51019]: mgrmap e2: vm03.gbgzmu(active, starting, since 0.00518292s) 2026-03-09T16:09:40.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:40 vm03 ceph-mon[51019]: from='mgr.14100 192.168.123.103:0/1717575342' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T16:09:40.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:40 vm03 ceph-mon[51019]: from='mgr.14100 192.168.123.103:0/1717575342' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T16:09:40.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:40 vm03 ceph-mon[51019]: from='mgr.14100 192.168.123.103:0/1717575342' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T16:09:40.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:40 vm03 ceph-mon[51019]: from='mgr.14100 192.168.123.103:0/1717575342' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:09:40.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:40 vm03 ceph-mon[51019]: from='mgr.14100 192.168.123.103:0/1717575342' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr metadata", "who": "vm03.gbgzmu", "id": "vm03.gbgzmu"}]: dispatch 2026-03-09T16:09:40.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:40 vm03 ceph-mon[51019]: Manager daemon vm03.gbgzmu is now available 2026-03-09T16:09:40.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:40 vm03 ceph-mon[51019]: from='mgr.14100 192.168.123.103:0/1717575342' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/mirror_snapshot_schedule"}]: dispatch 2026-03-09T16:09:40.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:40 vm03 ceph-mon[51019]: from='mgr.14100 192.168.123.103:0/1717575342' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/trash_purge_schedule"}]: dispatch 2026-03-09T16:09:40.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:40 vm03 ceph-mon[51019]: from='mgr.14100 192.168.123.103:0/1717575342' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:40.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:40 vm03 ceph-mon[51019]: from='mgr.14100 192.168.123.103:0/1717575342' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:40.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:40 vm03 ceph-mon[51019]: from='mgr.14100 192.168.123.103:0/1717575342' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:41.350 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-09T16:09:41.350 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-09T16:09:41.350 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsid": "2b05df78-1bd2-11f1-83c0-c950214d6edc", 2026-03-09T16:09:41.350 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "health": { 2026-03-09T16:09:41.350 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-09T16:09:41.350 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-09T16:09:41.350 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-09T16:09:41.350 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:41.350 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-09T16:09:41.350 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-09T16:09:41.350 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 0 2026-03-09T16:09:41.350 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "vm03" 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_age": 5, 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-09T16:09:41.351 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "by_rank": [], 
2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "restful" 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modified": "2026-03-09T16:09:34.709559+0000", 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.159+0000 7f576aa0c640 1 Processor -- start 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.159+0000 7f576aa0c640 1 -- start start 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.159+0000 7f576aa0c640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57641082f0 0x7f57641086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.159+0000 7f576aa0c640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5764108cc0 con 0x7f57641082f0 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.160+0000 7f5763fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57641082f0 0x7f57641086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.160+0000 7f5763fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57641082f0 0x7f57641086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:51942/0 (socket 
says 192.168.123.103:51942) 2026-03-09T16:09:41.353 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.160+0000 7f5763fff640 1 -- 192.168.123.103:0/3605625968 learned_addr learned my addr 192.168.123.103:0/3605625968 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.160+0000 7f5763fff640 1 -- 192.168.123.103:0/3605625968 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f57641094a0 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.161+0000 7f5763fff640 1 --2- 192.168.123.103:0/3605625968 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57641082f0 0x7f57641086f0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f5754009920 tx=0x7f575402ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=4c55928a5aba906f server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.161+0000 7f5762ffd640 1 -- 192.168.123.103:0/3605625968 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f575402f9b0 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.161+0000 7f5762ffd640 1 -- 192.168.123.103:0/3605625968 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f5754037440 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.161+0000 7f576aa0c640 1 -- 192.168.123.103:0/3605625968 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57641082f0 msgr2=0x7f57641086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.161+0000 7f576aa0c640 1 --2- 192.168.123.103:0/3605625968 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57641082f0 0x7f57641086f0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f5754009920 tx=0x7f575402ef20 comp rx=0 tx=0).stop 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.161+0000 7f576aa0c640 1 -- 192.168.123.103:0/3605625968 shutdown_connections 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.161+0000 7f576aa0c640 1 --2- 192.168.123.103:0/3605625968 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57641082f0 0x7f57641086f0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.161+0000 7f576aa0c640 1 -- 192.168.123.103:0/3605625968 >> 192.168.123.103:0/3605625968 conn(0x7f576407ba00 msgr2=0x7f57641066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.162+0000 7f576aa0c640 1 -- 192.168.123.103:0/3605625968 shutdown_connections 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.162+0000 7f576aa0c640 1 -- 192.168.123.103:0/3605625968 wait complete. 
2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.162+0000 7f576aa0c640 1 Processor -- start 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.162+0000 7f576aa0c640 1 -- start start 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.162+0000 7f576aa0c640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57641082f0 0x7f576419e4c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.162+0000 7f576aa0c640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f576419ea00 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.162+0000 7f5763fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57641082f0 0x7f576419e4c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.162+0000 7f5763fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57641082f0 0x7f576419e4c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:51954/0 (socket says 192.168.123.103:51954) 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.162+0000 7f5763fff640 1 -- 192.168.123.103:0/2144482759 learned_addr learned my addr 192.168.123.103:0/2144482759 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.163+0000 7f5763fff640 1 -- 192.168.123.103:0/2144482759 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f57540095d0 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.163+0000 7f5763fff640 1 --2- 192.168.123.103:0/2144482759 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57641082f0 0x7f576419e4c0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f57540098f0 tx=0x7f5754037a70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.163+0000 7f57617fa640 1 -- 192.168.123.103:0/2144482759 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f575403f5e0 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.163+0000 7f576aa0c640 1 -- 192.168.123.103:0/2144482759 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f576419ec00 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.163+0000 7f576aa0c640 1 -- 192.168.123.103:0/2144482759 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f576419f0a0 con 0x7f57641082f0 2026-03-09T16:09:41.354 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.163+0000 7f57617fa640 1 -- 192.168.123.103:0/2144482759 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f575403fbc0 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.163+0000 7f57617fa640 1 -- 192.168.123.103:0/2144482759 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5754040b30 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.164+0000 7f57617fa640 1 -- 192.168.123.103:0/2144482759 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 3) v1 ==== 49253+0+0 (secure 0 0 0) 0x7f575403e050 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.164+0000 7f57617fa640 1 --2- 192.168.123.103:0/2144482759 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f572c03d070 0x7f572c03f530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.164+0000 7f57617fa640 1 -- 192.168.123.103:0/2144482759 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f5754076010 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.164+0000 7f57637fe640 1 --2- 192.168.123.103:0/2144482759 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f572c03d070 0x7f572c03f530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.165+0000 7f576aa0c640 1 -- 192.168.123.103:0/2144482759 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5764108770 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.168+0000 7f57637fe640 1 --2- 192.168.123.103:0/2144482759 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f572c03d070 0x7f572c03f530 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f5750009a10 tx=0x7f5750006eb0 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.168+0000 7f57617fa640 1 -- 192.168.123.103:0/2144482759 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f5754036430 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.300+0000 7f576aa0c640 1 -- 192.168.123.103:0/2144482759 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f576410c6d0 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.300+0000 7f57617fa640 1 -- 192.168.123.103:0/2144482759 <== mon.0 v2:192.168.123.103:3300/0 7 
==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1240 (secure 0 0 0) 0x7f575403ced0 con 0x7f57641082f0 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.304+0000 7f576aa0c640 1 -- 192.168.123.103:0/2144482759 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f572c03d070 msgr2=0x7f572c03f530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.304+0000 7f576aa0c640 1 --2- 192.168.123.103:0/2144482759 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f572c03d070 0x7f572c03f530 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f5750009a10 tx=0x7f5750006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.304+0000 7f576aa0c640 1 -- 192.168.123.103:0/2144482759 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57641082f0 msgr2=0x7f576419e4c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.304+0000 7f576aa0c640 1 --2- 192.168.123.103:0/2144482759 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57641082f0 0x7f576419e4c0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f57540098f0 tx=0x7f5754037a70 comp rx=0 tx=0).stop 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.304+0000 7f576aa0c640 1 -- 192.168.123.103:0/2144482759 shutdown_connections 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.304+0000 7f576aa0c640 1 --2- 192.168.123.103:0/2144482759 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f572c03d070 0x7f572c03f530 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.304+0000 7f576aa0c640 1 --2- 192.168.123.103:0/2144482759 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57641082f0 0x7f576419e4c0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.304+0000 7f576aa0c640 1 -- 192.168.123.103:0/2144482759 >> 192.168.123.103:0/2144482759 conn(0x7f576407ba00 msgr2=0x7f5764105f60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:41.354 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.304+0000 7f576aa0c640 1 -- 192.168.123.103:0/2144482759 shutdown_connections 2026-03-09T16:09:41.355 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.305+0000 7f576aa0c640 1 -- 192.168.123.103:0/2144482759 wait complete. 
2026-03-09T16:09:41.355 INFO:teuthology.orchestra.run.vm03.stdout:mgr is available 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [global] 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout fsid = 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_cluster_log_file_level = debug 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [mgr] 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [osd] 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.472+0000 7f4c01e69640 1 Processor -- start 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.472+0000 7f4c01e69640 1 -- start start 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.473+0000 7f4c01e69640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bfc1082f0 0x7f4bfc1086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.473+0000 7f4c01e69640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4bfc108cc0 con 0x7f4bfc1082f0 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.473+0000 7f4bfb7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bfc1082f0 0x7f4bfc1086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.473+0000 7f4bfb7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bfc1082f0 0x7f4bfc1086f0 unknown :-1 s=HELLO_CONNECTING 
pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:51968/0 (socket says 192.168.123.103:51968) 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.473+0000 7f4bfb7fe640 1 -- 192.168.123.103:0/1503390979 learned_addr learned my addr 192.168.123.103:0/1503390979 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.473+0000 7f4bfb7fe640 1 -- 192.168.123.103:0/1503390979 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4bfc1094a0 con 0x7f4bfc1082f0 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.474+0000 7f4bfb7fe640 1 --2- 192.168.123.103:0/1503390979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bfc1082f0 0x7f4bfc1086f0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f4be0009920 tx=0x7f4be002ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=d7db5b1e250c6039 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.475+0000 7f4bfa7fc640 1 -- 192.168.123.103:0/1503390979 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4be002f9b0 con 0x7f4bfc1082f0 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.475+0000 7f4bfa7fc640 1 -- 192.168.123.103:0/1503390979 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f4be0037440 con 0x7f4bfc1082f0 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.475+0000 7f4c01e69640 1 -- 192.168.123.103:0/1503390979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bfc1082f0 msgr2=0x7f4bfc1086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.475+0000 7f4c01e69640 1 --2- 192.168.123.103:0/1503390979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bfc1082f0 0x7f4bfc1086f0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f4be0009920 tx=0x7f4be002ef20 comp rx=0 tx=0).stop 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.475+0000 7f4c01e69640 1 -- 192.168.123.103:0/1503390979 shutdown_connections 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.475+0000 7f4c01e69640 1 --2- 192.168.123.103:0/1503390979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bfc1082f0 0x7f4bfc1086f0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.475+0000 7f4c01e69640 1 -- 192.168.123.103:0/1503390979 >> 192.168.123.103:0/1503390979 conn(0x7f4bfc07ba00 msgr2=0x7f4bfc1066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.476+0000 7f4c01e69640 1 -- 192.168.123.103:0/1503390979 shutdown_connections 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.476+0000 
7f4c01e69640 1 -- 192.168.123.103:0/1503390979 wait complete. 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.476+0000 7f4c01e69640 1 Processor -- start 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.476+0000 7f4c01e69640 1 -- start start 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.476+0000 7f4c01e69640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bfc1082f0 0x7f4bfc19e360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.476+0000 7f4c01e69640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4be0035340 con 0x7f4bfc1082f0 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.477+0000 7f4bfb7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bfc1082f0 0x7f4bfc19e360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:41.637 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.477+0000 7f4bfb7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bfc1082f0 0x7f4bfc19e360 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:51976/0 (socket says 192.168.123.103:51976) 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.477+0000 7f4bfb7fe640 1 -- 192.168.123.103:0/3877465263 learned_addr learned my addr 192.168.123.103:0/3877465263 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.477+0000 7f4bfb7fe640 1 -- 192.168.123.103:0/3877465263 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4be00095d0 con 0x7f4bfc1082f0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.477+0000 7f4bfb7fe640 1 --2- 192.168.123.103:0/3877465263 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bfc1082f0 0x7f4bfc19e360 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f4be00098f0 tx=0x7f4be0035ee0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.478+0000 7f4bf8ff9640 1 -- 192.168.123.103:0/3877465263 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4be0037650 con 0x7f4bfc1082f0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.478+0000 7f4bf8ff9640 1 -- 192.168.123.103:0/3877465263 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f4be0037c30 con 0x7f4bfc1082f0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.478+0000 7f4c01e69640 1 -- 192.168.123.103:0/3877465263 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 
-- 0x7f4bfc19e8a0 con 0x7f4bfc1082f0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.478+0000 7f4bf8ff9640 1 -- 192.168.123.103:0/3877465263 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4be003f3d0 con 0x7f4bfc1082f0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.478+0000 7f4c01e69640 1 -- 192.168.123.103:0/3877465263 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4bfc19ed40 con 0x7f4bfc1082f0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.479+0000 7f4bf8ff9640 1 -- 192.168.123.103:0/3877465263 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 3) v1 ==== 49253+0+0 (secure 0 0 0) 0x7f4be003e050 con 0x7f4bfc1082f0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.479+0000 7f4c01e69640 1 -- 192.168.123.103:0/3877465263 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4bc4005350 con 0x7f4bfc1082f0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.479+0000 7f4bf8ff9640 1 --2- 192.168.123.103:0/3877465263 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f4bd003d0c0 0x7f4bd003f580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.479+0000 7f4bf8ff9640 1 -- 192.168.123.103:0/3877465263 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f4be00752f0 con 0x7f4bfc1082f0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.479+0000 7f4bfaffd640 1 --2- 192.168.123.103:0/3877465263 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f4bd003d0c0 0x7f4bd003f580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.482+0000 7f4bfaffd640 1 --2- 192.168.123.103:0/3877465263 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f4bd003d0c0 0x7f4bd003f580 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f4be80099c0 tx=0x7f4be8006eb0 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.483+0000 7f4bf8ff9640 1 -- 192.168.123.103:0/3877465263 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f4be0036b80 con 0x7f4bfc1082f0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.582+0000 7f4c01e69640 1 -- 192.168.123.103:0/3877465263 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7f4bc4005b80 con 0x7f4bfc1082f0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.587+0000 7f4bf8ff9640 1 -- 
192.168.123.103:0/3877465263 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v3) v1 ==== 70+0+409 (secure 0 0 0) 0x7f4be003c030 con 0x7f4bfc1082f0 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.589+0000 7f4c01e69640 1 -- 192.168.123.103:0/3877465263 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f4bd003d0c0 msgr2=0x7f4bd003f580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.589+0000 7f4c01e69640 1 --2- 192.168.123.103:0/3877465263 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f4bd003d0c0 0x7f4bd003f580 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f4be80099c0 tx=0x7f4be8006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.589+0000 7f4c01e69640 1 -- 192.168.123.103:0/3877465263 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bfc1082f0 msgr2=0x7f4bfc19e360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.589+0000 7f4c01e69640 1 --2- 192.168.123.103:0/3877465263 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bfc1082f0 0x7f4bfc19e360 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f4be00098f0 tx=0x7f4be0035ee0 comp rx=0 tx=0).stop 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.589+0000 7f4c01e69640 1 -- 192.168.123.103:0/3877465263 shutdown_connections 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.589+0000 7f4c01e69640 1 --2- 192.168.123.103:0/3877465263 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f4bd003d0c0 0x7f4bd003f580 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.589+0000 7f4c01e69640 1 --2- 192.168.123.103:0/3877465263 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bfc1082f0 0x7f4bfc19e360 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.589+0000 7f4c01e69640 1 -- 192.168.123.103:0/3877465263 >> 192.168.123.103:0/3877465263 conn(0x7f4bfc07ba00 msgr2=0x7f4bfc105df0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.589+0000 7f4c01e69640 1 -- 192.168.123.103:0/3877465263 shutdown_connections 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.589+0000 7f4c01e69640 1 -- 192.168.123.103:0/3877465263 wait complete. 2026-03-09T16:09:41.638 INFO:teuthology.orchestra.run.vm03.stdout:Enabling cephadm module... 
2026-03-09T16:09:42.029 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.749+0000 7ff6c2e2d640 1 Processor -- start 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.749+0000 7ff6c2e2d640 1 -- start start 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.750+0000 7ff6c2e2d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6bc1082f0 0x7ff6bc1086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.750+0000 7ff6c2e2d640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff6bc108cc0 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.750+0000 7ff6c0ba2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6bc1082f0 0x7ff6bc1086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.750+0000 7ff6c0ba2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6bc1082f0 0x7ff6bc1086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:51982/0 (socket says 192.168.123.103:51982) 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.750+0000 7ff6c0ba2640 1 -- 192.168.123.103:0/3444089909 learned_addr learned my addr 192.168.123.103:0/3444089909 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.750+0000 7ff6c0ba2640 1 -- 192.168.123.103:0/3444089909 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6bc109490 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.751+0000 7ff6c0ba2640 1 --2- 192.168.123.103:0/3444089909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6bc1082f0 0x7ff6bc1086f0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7ff6a4009b80 tx=0x7ff6a402f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=d452b86bb44bfd80 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.751+0000 7ff6b37fe640 1 -- 192.168.123.103:0/3444089909 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff6a402fc20 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.751+0000 7ff6b37fe640 1 -- 192.168.123.103:0/3444089909 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7ff6a402fd80 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.751+0000 7ff6b37fe640 1 -- 192.168.123.103:0/3444089909 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff6a4035750 con 0x7ff6bc1082f0 
2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.752+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/3444089909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6bc1082f0 msgr2=0x7ff6bc1086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.752+0000 7ff6c2e2d640 1 --2- 192.168.123.103:0/3444089909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6bc1082f0 0x7ff6bc1086f0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7ff6a4009b80 tx=0x7ff6a402f190 comp rx=0 tx=0).stop 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.752+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/3444089909 shutdown_connections 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.752+0000 7ff6c2e2d640 1 --2- 192.168.123.103:0/3444089909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6bc1082f0 0x7ff6bc1086f0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.752+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/3444089909 >> 192.168.123.103:0/3444089909 conn(0x7ff6bc07b8c0 msgr2=0x7ff6bc1066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.752+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/3444089909 shutdown_connections 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.752+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/3444089909 wait complete. 
2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.753+0000 7ff6c2e2d640 1 Processor -- start 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.753+0000 7ff6c2e2d640 1 -- start start 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.753+0000 7ff6c2e2d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6bc1082f0 0x7ff6bc19e0a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.753+0000 7ff6c2e2d640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff6bc19e5e0 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.753+0000 7ff6c0ba2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6bc1082f0 0x7ff6bc19e0a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.753+0000 7ff6c0ba2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6bc1082f0 0x7ff6bc19e0a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:51996/0 (socket says 192.168.123.103:51996) 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.753+0000 7ff6c0ba2640 1 -- 192.168.123.103:0/625656058 learned_addr learned my addr 192.168.123.103:0/625656058 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.753+0000 7ff6c0ba2640 1 -- 192.168.123.103:0/625656058 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6a40095d0 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.754+0000 7ff6c0ba2640 1 --2- 192.168.123.103:0/625656058 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6bc1082f0 0x7ff6bc19e0a0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7ff6a4002410 tx=0x7ff6a40379e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.754+0000 7ff6b1ffb640 1 -- 192.168.123.103:0/625656058 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff6a4035e60 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.754+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/625656058 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff6bc19e7e0 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.754+0000 7ff6b1ffb640 1 -- 192.168.123.103:0/625656058 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7ff6a4037b20 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.754+0000 7ff6b1ffb640 1 -- 192.168.123.103:0/625656058 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff6a403f550 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.754+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/625656058 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff6bc19ec80 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.755+0000 7ff6b1ffb640 1 -- 192.168.123.103:0/625656058 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 3) v1 ==== 49253+0+0 (secure 0 0 0) 0x7ff6a403e030 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.755+0000 7ff6b1ffb640 1 --2- 192.168.123.103:0/625656058 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7ff69803d070 0x7ff69803f530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.755+0000 7ff6b1ffb640 1 -- 192.168.123.103:0/625656058 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7ff6a4076800 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.755+0000 7ff6b3fff640 1 --2- 192.168.123.103:0/625656058 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7ff69803d070 0x7ff69803f530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.756+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/625656058 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff684005350 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.758+0000 7ff6b3fff640 1 --2- 192.168.123.103:0/625656058 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7ff69803d070 0x7ff69803f530 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7ff6ac009a10 tx=0x7ff6ac006eb0 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.760+0000 7ff6b1ffb640 1 -- 192.168.123.103:0/625656058 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7ff6a403c070 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.878+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/625656058 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "cephadm"} v 0) v1 -- 0x7ff6840051c0 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.992+0000 7ff6b1ffb640 1 -- 192.168.123.103:0/625656058 <== mon.0 v2:192.168.123.103:3300/0 7 ==== 
mon_command_ack([{"prefix": "mgr module enable", "module": "cephadm"}]=0 v4) v1 ==== 86+0+0 (secure 0 0 0) 0x7ff6a403ed50 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.994+0000 7ff6b1ffb640 1 -- 192.168.123.103:0/625656058 <== mon.0 v2:192.168.123.103:3300/0 8 ==== mgrmap(e 4) v1 ==== 49370+0+0 (secure 0 0 0) 0x7ff6a4041910 con 0x7ff6bc1082f0 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.998+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/625656058 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7ff69803d070 msgr2=0x7ff69803f530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.998+0000 7ff6c2e2d640 1 --2- 192.168.123.103:0/625656058 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7ff69803d070 0x7ff69803f530 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7ff6ac009a10 tx=0x7ff6ac006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.998+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/625656058 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6bc1082f0 msgr2=0x7ff6bc19e0a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.998+0000 7ff6c2e2d640 1 --2- 192.168.123.103:0/625656058 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6bc1082f0 0x7ff6bc19e0a0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7ff6a4002410 tx=0x7ff6a40379e0 comp rx=0 tx=0).stop 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.999+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/625656058 shutdown_connections 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.999+0000 7ff6c2e2d640 1 --2- 192.168.123.103:0/625656058 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7ff69803d070 0x7ff69803f530 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.999+0000 7ff6c2e2d640 1 --2- 192.168.123.103:0/625656058 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6bc1082f0 0x7ff6bc19e0a0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.999+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/625656058 >> 192.168.123.103:0/625656058 conn(0x7ff6bc07b8c0 msgr2=0x7ff6bc105ca0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.999+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/625656058 shutdown_connections 2026-03-09T16:09:42.030 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:41.999+0000 7ff6c2e2d640 1 -- 192.168.123.103:0/625656058 wait complete. 
2026-03-09T16:09:42.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:41 vm03 ceph-mon[51019]: mgrmap e3: vm03.gbgzmu(active, since 1.03083s) 2026-03-09T16:09:42.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:41 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/2144482759' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-09T16:09:42.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:41 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/3877465263' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch 2026-03-09T16:09:42.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:41 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/625656058' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "cephadm"}]: dispatch 2026-03-09T16:09:42.377 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-09T16:09:42.377 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 4, 2026-03-09T16:09:42.377 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-09T16:09:42.377 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "active_name": "vm03.gbgzmu", 2026-03-09T16:09:42.377 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-09T16:09:42.377 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-09T16:09:42.377 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.167+0000 7f220d13e640 1 Processor -- start 2026-03-09T16:09:42.377 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.167+0000 7f220d13e640 1 -- start start 2026-03-09T16:09:42.377 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.167+0000 7f220d13e640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2208072f70 0x7f2208071480 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:42.377 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.167+0000 7f220d13e640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2208071a50 con 0x7f2208072f70 2026-03-09T16:09:42.377 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.168+0000 7f2206d76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2208072f70 0x7f2208071480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:42.377 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.168+0000 7f2206d76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2208072f70 0x7f2208071480 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52028/0 (socket says 192.168.123.103:52028) 2026-03-09T16:09:42.377 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.168+0000 7f2206d76640 1 -- 192.168.123.103:0/189110924 learned_addr learned my addr 192.168.123.103:0/189110924 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:42.378 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.168+0000 7f2206d76640 1 -- 192.168.123.103:0/189110924 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2208071b90 con 0x7f2208072f70 2026-03-09T16:09:42.378 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.169+0000 7f2206d76640 1 --2- 192.168.123.103:0/189110924 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2208072f70 0x7f2208071480 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f21fc009920 tx=0x7f21fc02ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=b2847b3a655d52b7 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:42.378 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.169+0000 7f2205d74640 1 -- 192.168.123.103:0/189110924 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f21fc02f9b0 con 0x7f2208072f70 2026-03-09T16:09:42.378 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.169+0000 7f2205d74640 1 -- 192.168.123.103:0/189110924 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f21fc037440 con 0x7f2208072f70 2026-03-09T16:09:42.378 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.169+0000 7f2205d74640 1 -- 192.168.123.103:0/189110924 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f21fc0354e0 con 0x7f2208072f70 2026-03-09T16:09:42.378 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.170+0000 7f220d13e640 1 -- 192.168.123.103:0/189110924 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2208072f70 msgr2=0x7f2208071480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:42.378 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.170+0000 7f220d13e640 1 --2- 192.168.123.103:0/189110924 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2208072f70 0x7f2208071480 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f21fc009920 tx=0x7f21fc02ef20 comp rx=0 tx=0).stop 2026-03-09T16:09:42.378 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.171+0000 7f220d13e640 1 -- 192.168.123.103:0/189110924 shutdown_connections 2026-03-09T16:09:42.378 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.171+0000 7f220d13e640 1 --2- 192.168.123.103:0/189110924 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2208072f70 0x7f2208071480 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:42.378 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.171+0000 7f220d13e640 1 -- 192.168.123.103:0/189110924 >> 192.168.123.103:0/189110924 conn(0x7f220806d080 msgr2=0x7f220806f4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:42.378 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.171+0000 7f220d13e640 1 -- 192.168.123.103:0/189110924 shutdown_connections 2026-03-09T16:09:42.378 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.171+0000 7f220d13e640 1 -- 192.168.123.103:0/189110924 wait complete. 
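Most CLI calls in this log show the pattern that just ended: an initial short session that only fetches config(27 keys) and the monmap, then a second session (the next "Processor -- start") that runs the actual command, so the interesting payloads are easy to lose in the messenger traces. A small, hypothetical helper for skimming a teuthology log like this one; it is not part of teuthology, and it assumes the usual one-record-per-line log layout:

#!/usr/bin/env python3
# Hypothetical helper (not part of teuthology): print only the mon_command
# payloads and the plain progress messages from a teuthology log, skipping
# the messenger debug traces.
import re
import sys

MON_CMD = re.compile(r'mon_command\((\{.*?\}) v \d+\)')   # e.g. {"prefix": "mgr stat"}
PROGRESS = re.compile(
    r'INFO:teuthology\.orchestra\.run\.\S+\.stdout:(?!/usr/bin/ceph)(.+)$'
)

def summarize(lines):
    for line in lines:
        for payload in MON_CMD.findall(line):
            print("command:", payload)
        m = PROGRESS.search(line)
        if m:
            print("progress:", m.group(1).strip())

if __name__ == "__main__":
    with open(sys.argv[1]) as f:
        summarize(f)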
2026-03-09T16:09:42.378 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.171+0000 7f220d13e640 1 Processor -- start 2026-03-09T16:09:42.378 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.171+0000 7f220d13e640 1 -- start start 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.171+0000 7f220d13e640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2208072f70 0x7f22080793b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.171+0000 7f220d13e640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f22080798f0 con 0x7f2208072f70 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.172+0000 7f2206d76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2208072f70 0x7f22080793b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.172+0000 7f2206d76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2208072f70 0x7f22080793b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52040/0 (socket says 192.168.123.103:52040) 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.172+0000 7f2206d76640 1 -- 192.168.123.103:0/747242487 learned_addr learned my addr 192.168.123.103:0/747242487 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.173+0000 7f2206d76640 1 -- 192.168.123.103:0/747242487 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f21fc0095d0 con 0x7f2208072f70 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.173+0000 7f2206d76640 1 --2- 192.168.123.103:0/747242487 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2208072f70 0x7f22080793b0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f21fc0098f0 tx=0x7f21fc0359b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.173+0000 7f21e7fff640 1 -- 192.168.123.103:0/747242487 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f21fc035d00 con 0x7f2208072f70 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.173+0000 7f220d13e640 1 -- 192.168.123.103:0/747242487 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2208079af0 con 0x7f2208072f70 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.173+0000 7f220d13e640 1 -- 192.168.123.103:0/747242487 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2208079f90 con 0x7f2208072f70 2026-03-09T16:09:42.379 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.174+0000 7f21e7fff640 1 -- 192.168.123.103:0/747242487 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f21fc035e60 con 0x7f2208072f70 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.175+0000 7f21e7fff640 1 -- 192.168.123.103:0/747242487 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f21fc02faf0 con 0x7f2208072f70 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.175+0000 7f21e7fff640 1 -- 192.168.123.103:0/747242487 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 4) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f21fc040500 con 0x7f2208072f70 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.175+0000 7f21e7fff640 1 --2- 192.168.123.103:0/747242487 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f21ec03d170 0x7f21ec03f630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.176+0000 7f2206575640 1 -- 192.168.123.103:0/747242487 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f21ec03d170 msgr2=0x7f21ec03f630 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/4159093290 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.176+0000 7f2206575640 1 --2- 192.168.123.103:0/747242487 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f21ec03d170 0x7f21ec03f630 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.176+0000 7f21e7fff640 1 -- 192.168.123.103:0/747242487 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f21fc040b00 con 0x7f2208072f70 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.177+0000 7f220d13e640 1 -- 192.168.123.103:0/747242487 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f21d4005350 con 0x7f2208072f70 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.180+0000 7f21e7fff640 1 -- 192.168.123.103:0/747242487 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f21fc03c030 con 0x7f2208072f70 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.302+0000 7f220d13e640 1 -- 192.168.123.103:0/747242487 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7f21d4005e10 con 0x7f2208072f70 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.303+0000 7f21e7fff640 1 -- 192.168.123.103:0/747242487 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v4) v1 ==== 56+0+98 (secure 0 0 0) 0x7f21fc047910 con 0x7f2208072f70 
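`mgr stat` returns the tiny JSON document printed on stdout a little earlier (epoch, available, active_name, num_standby), and the bootstrap records the current epoch before waiting for it to advance: the following records show the connections being torn down and then "Waiting for the mgr to restart... Waiting for mgr epoch 4...". A sketch of that wait loop, assuming the ceph CLI; the helper name, timeout, and polling interval are illustrative, while the JSON keys match the object shown in this log:

# Sketch: poll `ceph mgr stat` until the mgr map reaches a target epoch and an
# active mgr reports available again. Key names ("epoch", "available") match
# the JSON shown in the log; the loop itself is illustrative, not cephadm's
# exact code.
import json
import subprocess
import time

def wait_for_mgr_epoch(target_epoch, timeout=300, interval=5):
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        out = subprocess.run(
            ["ceph", "mgr", "stat", "--format", "json"],
            capture_output=True, text=True,
        )
        if out.returncode == 0:
            stat = json.loads(out.stdout)
            if stat.get("epoch", 0) >= target_epoch and stat.get("available"):
                return stat
        time.sleep(interval)
    raise TimeoutError(f"mgr map never reached epoch {target_epoch}")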
2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.306+0000 7f21e5ffb640 1 -- 192.168.123.103:0/747242487 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f21ec03d170 msgr2=0x7f21ec03f630 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.306+0000 7f21e5ffb640 1 --2- 192.168.123.103:0/747242487 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f21ec03d170 0x7f21ec03f630 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.306+0000 7f21e5ffb640 1 -- 192.168.123.103:0/747242487 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2208072f70 msgr2=0x7f22080793b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.307+0000 7f21e5ffb640 1 --2- 192.168.123.103:0/747242487 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2208072f70 0x7f22080793b0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f21fc0098f0 tx=0x7f21fc0359b0 comp rx=0 tx=0).stop 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.307+0000 7f21e5ffb640 1 -- 192.168.123.103:0/747242487 shutdown_connections 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.307+0000 7f21e5ffb640 1 --2- 192.168.123.103:0/747242487 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f21ec03d170 0x7f21ec03f630 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.308+0000 7f21e5ffb640 1 --2- 192.168.123.103:0/747242487 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2208072f70 0x7f22080793b0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.308+0000 7f21e5ffb640 1 -- 192.168.123.103:0/747242487 >> 192.168.123.103:0/747242487 conn(0x7f220806d080 msgr2=0x7f220806dc10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.308+0000 7f21e5ffb640 1 -- 192.168.123.103:0/747242487 shutdown_connections 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.308+0000 7f21e5ffb640 1 -- 192.168.123.103:0/747242487 wait complete. 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for the mgr to restart... 2026-03-09T16:09:42.379 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mgr epoch 4... 2026-03-09T16:09:43.271 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:42 vm03 ceph-mon[51019]: from='client.? 
192.168.123.103:0/625656058' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished 2026-03-09T16:09:43.271 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:42 vm03 ceph-mon[51019]: mgrmap e4: vm03.gbgzmu(active, since 2s) 2026-03-09T16:09:43.271 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:42 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/747242487' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-09T16:09:45.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: Active manager daemon vm03.gbgzmu restarted 2026-03-09T16:09:45.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: Activating manager daemon vm03.gbgzmu 2026-03-09T16:09:45.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: osdmap e2: 0 total, 0 up, 0 in 2026-03-09T16:09:45.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: mgrmap e5: vm03.gbgzmu(active, starting, since 0.00747778s) 2026-03-09T16:09:45.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:09:45.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr metadata", "who": "vm03.gbgzmu", "id": "vm03.gbgzmu"}]: dispatch 2026-03-09T16:09:45.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T16:09:45.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T16:09:45.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T16:09:45.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: Manager daemon vm03.gbgzmu is now available 2026-03-09T16:09:45.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:45.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:45.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:09:45.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:09:45.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/mirror_snapshot_schedule"}]: dispatch 2026-03-09T16:09:45.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:45 vm03 
ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/trash_purge_schedule"}]: dispatch 2026-03-09T16:09:46.577 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-09T16:09:46.577 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 6, 2026-03-09T16:09:46.578 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-09T16:09:46.578 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-09T16:09:46.578 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.505+0000 7f89f1090640 1 Processor -- start 2026-03-09T16:09:46.578 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.505+0000 7f89f1090640 1 -- start start 2026-03-09T16:09:46.578 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.505+0000 7f89f1090640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89ec0715c0 0x7f89ec0719c0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:46.578 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.505+0000 7f89f1090640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89ec0731a0 con 0x7f89ec0715c0 2026-03-09T16:09:46.578 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.505+0000 7f89ead76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89ec0715c0 0x7f89ec0719c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:46.578 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.506+0000 7f89ead76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89ec0715c0 0x7f89ec0719c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52050/0 (socket says 192.168.123.103:52050) 2026-03-09T16:09:46.578 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.506+0000 7f89ead76640 1 -- 192.168.123.103:0/4208488466 learned_addr learned my addr 192.168.123.103:0/4208488466 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:46.578 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.506+0000 7f89ead76640 1 -- 192.168.123.103:0/4208488466 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f89ec071f00 con 0x7f89ec0715c0 2026-03-09T16:09:46.578 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.507+0000 7f89ead76640 1 --2- 192.168.123.103:0/4208488466 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89ec0715c0 0x7f89ec0719c0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f89e400a990 tx=0x7f89e40334e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a8db642313ab9583 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:46.578 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.507+0000 7f89e9d74640 1 -- 192.168.123.103:0/4208488466 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f89e4039480 con 0x7f89ec0715c0 
2026-03-09T16:09:46.578 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.507+0000 7f89e9d74640 1 -- 192.168.123.103:0/4208488466 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f89e4036030 con 0x7f89ec0715c0 2026-03-09T16:09:46.578 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.507+0000 7f89e9d74640 1 -- 192.168.123.103:0/4208488466 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f89e403ca70 con 0x7f89ec0715c0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.508+0000 7f89f1090640 1 -- 192.168.123.103:0/4208488466 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89ec0715c0 msgr2=0x7f89ec0719c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.508+0000 7f89f1090640 1 --2- 192.168.123.103:0/4208488466 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89ec0715c0 0x7f89ec0719c0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f89e400a990 tx=0x7f89e40334e0 comp rx=0 tx=0).stop 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.508+0000 7f89f1090640 1 -- 192.168.123.103:0/4208488466 shutdown_connections 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.508+0000 7f89f1090640 1 --2- 192.168.123.103:0/4208488466 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89ec0715c0 0x7f89ec0719c0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.508+0000 7f89f1090640 1 -- 192.168.123.103:0/4208488466 >> 192.168.123.103:0/4208488466 conn(0x7f89ec06d2a0 msgr2=0x7f89ec06f6e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.508+0000 7f89f1090640 1 -- 192.168.123.103:0/4208488466 shutdown_connections 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.508+0000 7f89f1090640 1 -- 192.168.123.103:0/4208488466 wait complete. 
2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.508+0000 7f89f1090640 1 Processor -- start 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.508+0000 7f89f1090640 1 -- start start 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.509+0000 7f89f1090640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89ec0715c0 0x7f89ec089c20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.509+0000 7f89f1090640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89ec08a160 con 0x7f89ec0715c0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.509+0000 7f89ead76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89ec0715c0 0x7f89ec089c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.509+0000 7f89ead76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89ec0715c0 0x7f89ec089c20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52060/0 (socket says 192.168.123.103:52060) 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.509+0000 7f89ead76640 1 -- 192.168.123.103:0/609004576 learned_addr learned my addr 192.168.123.103:0/609004576 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.509+0000 7f89ead76640 1 -- 192.168.123.103:0/609004576 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f89e400a670 con 0x7f89ec0715c0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.509+0000 7f89ead76640 1 --2- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89ec0715c0 0x7f89ec089c20 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f89e403c5c0 tx=0x7f89e4039b60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.510+0000 7f89cbfff640 1 -- 192.168.123.103:0/609004576 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f89e4039db0 con 0x7f89ec0715c0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.510+0000 7f89cbfff640 1 -- 192.168.123.103:0/609004576 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f89e4045aa0 con 0x7f89ec0715c0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.510+0000 7f89cbfff640 1 -- 192.168.123.103:0/609004576 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f89e4044910 con 0x7f89ec0715c0 2026-03-09T16:09:46.580 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.510+0000 7f89f1090640 1 -- 192.168.123.103:0/609004576 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f89ec086760 con 0x7f89ec0715c0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.511+0000 7f89f1090640 1 -- 192.168.123.103:0/609004576 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f89ec086c00 con 0x7f89ec0715c0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.511+0000 7f89cbfff640 1 -- 192.168.123.103:0/609004576 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 4) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f89e404b020 con 0x7f89ec0715c0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.511+0000 7f89cbfff640 1 --2- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f89c403cdd0 0x7f89c403f290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.512+0000 7f89cbfff640 1 -- 192.168.123.103:0/609004576 --> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f89e4045aa0 con 0x7f89c403cdd0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.512+0000 7f89cbfff640 1 -- 192.168.123.103:0/609004576 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f89e4008040 con 0x7f89ec0715c0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.512+0000 7f89ea575640 1 -- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f89c403cdd0 msgr2=0x7f89c403f290 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/4159093290 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.512+0000 7f89ea575640 1 --2- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f89c403cdd0 0x7f89c403f290 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.712+0000 7f89ea575640 1 -- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f89c403cdd0 msgr2=0x7f89c403f290 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/4159093290 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:42.712+0000 7f89ea575640 1 --2- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f89c403cdd0 0x7f89c403f290 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:43.113+0000 7f89ea575640 1 -- 192.168.123.103:0/609004576 >> 
[v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f89c403cdd0 msgr2=0x7f89c403f290 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/4159093290 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:43.113+0000 7f89ea575640 1 --2- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f89c403cdd0 0x7f89c403f290 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:43.914+0000 7f89ea575640 1 -- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f89c403cdd0 msgr2=0x7f89c403f290 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/4159093290 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:43.914+0000 7f89ea575640 1 --2- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f89c403cdd0 0x7f89c403f290 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:45.515+0000 7f89ea575640 1 -- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f89c403cdd0 msgr2=0x7f89c403f290 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/4159093290 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:45.515+0000 7f89ea575640 1 --2- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f89c403cdd0 0x7f89c403f290 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:45.522+0000 7f89cbfff640 1 -- 192.168.123.103:0/609004576 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mgrmap(e 5) v1 ==== 49137+0+0 (secure 0 0 0) 0x7f89e4045620 con 0x7f89ec0715c0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:45.523+0000 7f89cbfff640 1 -- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f89c403cdd0 msgr2=0x7f89c403f290 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:45.523+0000 7f89cbfff640 1 --2- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f89c403cdd0 0x7f89c403f290 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.525+0000 7f89cbfff640 1 -- 192.168.123.103:0/609004576 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 6) v1 ==== 49264+0+0 (secure 0 0 0) 0x7f89e4062dc0 con 0x7f89ec0715c0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.525+0000 7f89cbfff640 1 --2- 192.168.123.103:0/609004576 >> 
[v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f89c4040640 0x7f89c4042a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.525+0000 7f89cbfff640 1 -- 192.168.123.103:0/609004576 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f89e4045aa0 con 0x7f89c4040640 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.525+0000 7f89ea575640 1 --2- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f89c4040640 0x7f89c4042a30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.526+0000 7f89ea575640 1 --2- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f89c4040640 0x7f89c4042a30 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f89dc003e00 tx=0x7f89dc007280 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.528+0000 7f89cbfff640 1 -- 192.168.123.103:0/609004576 <== mgr.14118 v2:192.168.123.103:6800/4285644309 1 ==== command_reply(tid 0: 0 ) v1 ==== 8+0+7759 (secure 0 0 0) 0x7f89e4045aa0 con 0x7f89c4040640 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.531+0000 7f89f1090640 1 -- 192.168.123.103:0/609004576 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7f89b8002670 con 0x7f89c4040640 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.532+0000 7f89cbfff640 1 -- 192.168.123.103:0/609004576 <== mgr.14118 v2:192.168.123.103:6800/4285644309 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+51 (secure 0 0 0) 0x7f89b8002670 con 0x7f89c4040640 2026-03-09T16:09:46.580 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.532+0000 7f89c9ffb640 1 -- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f89c4040640 msgr2=0x7f89c4042a30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:46.581 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.532+0000 7f89c9ffb640 1 --2- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f89c4040640 0x7f89c4042a30 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f89dc003e00 tx=0x7f89dc007280 comp rx=0 tx=0).stop 2026-03-09T16:09:46.581 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.532+0000 7f89c9ffb640 1 -- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89ec0715c0 msgr2=0x7f89ec089c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:46.581 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.532+0000 7f89c9ffb640 1 --2- 192.168.123.103:0/609004576 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89ec0715c0 0x7f89ec089c20 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f89e403c5c0 tx=0x7f89e4039b60 comp rx=0 tx=0).stop 2026-03-09T16:09:46.581 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.532+0000 7f89c9ffb640 1 -- 192.168.123.103:0/609004576 shutdown_connections 2026-03-09T16:09:46.581 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.532+0000 7f89c9ffb640 1 --2- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f89c4040640 0x7f89c4042a30 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:46.581 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.532+0000 7f89c9ffb640 1 --2- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:6800/4159093290,v1:192.168.123.103:6801/4159093290] conn(0x7f89c403cdd0 0x7f89c403f290 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:46.581 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.532+0000 7f89c9ffb640 1 --2- 192.168.123.103:0/609004576 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89ec0715c0 0x7f89ec089c20 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:46.581 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.532+0000 7f89c9ffb640 1 -- 192.168.123.103:0/609004576 >> 192.168.123.103:0/609004576 conn(0x7f89ec06d2a0 msgr2=0x7f89ec06eda0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:46.581 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.533+0000 7f89c9ffb640 1 -- 192.168.123.103:0/609004576 shutdown_connections 2026-03-09T16:09:46.581 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.533+0000 7f89c9ffb640 1 -- 192.168.123.103:0/609004576 wait complete. 2026-03-09T16:09:46.581 INFO:teuthology.orchestra.run.vm03.stdout:mgr epoch 4 is available 2026-03-09T16:09:46.581 INFO:teuthology.orchestra.run.vm03.stdout:Setting orchestrator backend to cephadm... 
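The entries above record the harness waiting for the active mgr to restart after "mgr module enable cephadm" (see the journalctl lines earlier): the client keeps retrying the old mgr endpoint with exponential backoff ("_fault waiting 0.200000" up to 3.2 s) until mgrmap e6 advertises the restarted instance, reports "mgr epoch 4 is available", and moves on to selecting the orchestrator backend. A sketch of the equivalent CLI sequence, with `ceph mgr stat` assumed as the readiness probe (the harness's exact probe may differ):

    # Sketch of the sequence recorded above (assumed commands, not quoted
    # from the harness): enable the cephadm mgr module, wait until the
    # restarted mgr reports itself available, then select cephadm.
    ceph mgr module enable cephadm
    until ceph mgr stat | grep -q '"available": true'; do sleep 2; done
    ceph orch set backend cephadm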
2026-03-09T16:09:46.852 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.692+0000 7f4886c7d640 1 Processor -- start 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.693+0000 7f4886c7d640 1 -- start start 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.693+0000 7f4886c7d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4880111430 0x7f4880111830 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.693+0000 7f4886c7d640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4880111e00 con 0x7f4880111430 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.693+0000 7f48849f2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4880111430 0x7f4880111830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.694+0000 7f48849f2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4880111430 0x7f4880111830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52152/0 (socket says 192.168.123.103:52152) 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.694+0000 7f48849f2640 1 -- 192.168.123.103:0/1951584627 learned_addr learned my addr 192.168.123.103:0/1951584627 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.694+0000 7f48849f2640 1 -- 192.168.123.103:0/1951584627 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4880111f40 con 0x7f4880111430 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.695+0000 7f48849f2640 1 --2- 192.168.123.103:0/1951584627 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4880111430 0x7f4880111830 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f4868009920 tx=0x7f486802ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=92941696a65fece6 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.695+0000 7f48777fe640 1 -- 192.168.123.103:0/1951584627 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f486802f9b0 con 0x7f4880111430 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.695+0000 7f48777fe640 1 -- 192.168.123.103:0/1951584627 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f4868037440 con 0x7f4880111430 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.695+0000 7f48777fe640 1 -- 192.168.123.103:0/1951584627 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f48680354e0 con 0x7f4880111430 
2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.695+0000 7f4886c7d640 1 -- 192.168.123.103:0/1951584627 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4880111430 msgr2=0x7f4880111830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.695+0000 7f4886c7d640 1 --2- 192.168.123.103:0/1951584627 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4880111430 0x7f4880111830 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f4868009920 tx=0x7f486802ef20 comp rx=0 tx=0).stop 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.696+0000 7f4886c7d640 1 -- 192.168.123.103:0/1951584627 shutdown_connections 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.696+0000 7f4886c7d640 1 --2- 192.168.123.103:0/1951584627 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4880111430 0x7f4880111830 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.696+0000 7f4886c7d640 1 -- 192.168.123.103:0/1951584627 >> 192.168.123.103:0/1951584627 conn(0x7f488006c3b0 msgr2=0x7f488006c7c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.696+0000 7f4886c7d640 1 -- 192.168.123.103:0/1951584627 shutdown_connections 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.696+0000 7f4886c7d640 1 -- 192.168.123.103:0/1951584627 wait complete. 
2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.696+0000 7f4886c7d640 1 Processor -- start 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.697+0000 7f4886c7d640 1 -- start start 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.697+0000 7f4886c7d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4880111430 0x7f48801a6b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.697+0000 7f4886c7d640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f48801a70a0 con 0x7f4880111430 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.697+0000 7f48849f2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4880111430 0x7f48801a6b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.697+0000 7f48849f2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4880111430 0x7f48801a6b60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52156/0 (socket says 192.168.123.103:52156) 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.697+0000 7f48849f2640 1 -- 192.168.123.103:0/3309549140 learned_addr learned my addr 192.168.123.103:0/3309549140 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.697+0000 7f48849f2640 1 -- 192.168.123.103:0/3309549140 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f48680095d0 con 0x7f4880111430 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.698+0000 7f48849f2640 1 --2- 192.168.123.103:0/3309549140 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4880111430 0x7f48801a6b60 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f48680098f0 tx=0x7f48680358e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.698+0000 7f4875ffb640 1 -- 192.168.123.103:0/3309549140 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4868037ac0 con 0x7f4880111430 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.698+0000 7f4875ffb640 1 -- 192.168.123.103:0/3309549140 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f486802fdc0 con 0x7f4880111430 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.698+0000 7f4875ffb640 1 -- 192.168.123.103:0/3309549140 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4868040da0 con 0x7f4880111430 2026-03-09T16:09:46.853 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.698+0000 7f4886c7d640 1 -- 192.168.123.103:0/3309549140 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f48801a72a0 con 0x7f4880111430 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.698+0000 7f4886c7d640 1 -- 192.168.123.103:0/3309549140 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f48801a7680 con 0x7f4880111430 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.699+0000 7f4886c7d640 1 -- 192.168.123.103:0/3309549140 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4848005350 con 0x7f4880111430 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.703+0000 7f4875ffb640 1 -- 192.168.123.103:0/3309549140 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 6) v1 ==== 49264+0+0 (secure 0 0 0) 0x7f4868040600 con 0x7f4880111430 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.703+0000 7f4875ffb640 1 --2- 192.168.123.103:0/3309549140 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f485003d110 0x7f485003f5d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.703+0000 7f4875ffb640 1 -- 192.168.123.103:0/3309549140 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f48680758a0 con 0x7f4880111430 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.703+0000 7f4875ffb640 1 -- 192.168.123.103:0/3309549140 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f48680a38c0 con 0x7f4880111430 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.703+0000 7f4877fff640 1 --2- 192.168.123.103:0/3309549140 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f485003d110 0x7f485003f5d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.704+0000 7f4877fff640 1 --2- 192.168.123.103:0/3309549140 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f485003d110 0x7f485003f5d0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f4870009a10 tx=0x7f4870006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.806+0000 7f4886c7d640 1 -- 192.168.123.103:0/3309549140 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- mgr_command(tid 0: {"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}) v1 -- 0x7f4848002bf0 con 0x7f485003d110 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.815+0000 7f4875ffb640 1 -- 
192.168.123.103:0/3309549140 <== mgr.14118 v2:192.168.123.103:6800/4285644309 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f4848002bf0 con 0x7f485003d110 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.819+0000 7f4886c7d640 1 -- 192.168.123.103:0/3309549140 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f485003d110 msgr2=0x7f485003f5d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:46.853 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.819+0000 7f4886c7d640 1 --2- 192.168.123.103:0/3309549140 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f485003d110 0x7f485003f5d0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f4870009a10 tx=0x7f4870006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:46.854 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.819+0000 7f4886c7d640 1 -- 192.168.123.103:0/3309549140 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4880111430 msgr2=0x7f48801a6b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:46.854 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.819+0000 7f4886c7d640 1 --2- 192.168.123.103:0/3309549140 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4880111430 0x7f48801a6b60 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f48680098f0 tx=0x7f48680358e0 comp rx=0 tx=0).stop 2026-03-09T16:09:46.854 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.819+0000 7f4886c7d640 1 -- 192.168.123.103:0/3309549140 shutdown_connections 2026-03-09T16:09:46.854 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.819+0000 7f4886c7d640 1 --2- 192.168.123.103:0/3309549140 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f485003d110 0x7f485003f5d0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:46.854 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.819+0000 7f4886c7d640 1 --2- 192.168.123.103:0/3309549140 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4880111430 0x7f48801a6b60 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:46.854 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.819+0000 7f4886c7d640 1 -- 192.168.123.103:0/3309549140 >> 192.168.123.103:0/3309549140 conn(0x7f488006c3b0 msgr2=0x7f488010f160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:46.854 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.820+0000 7f4886c7d640 1 -- 192.168.123.103:0/3309549140 shutdown_connections 2026-03-09T16:09:46.854 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.820+0000 7f4886c7d640 1 -- 192.168.123.103:0/3309549140 wait complete. 2026-03-09T16:09:47.094 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:46 vm03 ceph-mon[51019]: Found migration_current of "None". Setting to last migration. 
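The mgr_command above ({"prefix": "orch set backend", "module_name": "cephadm"}) returns success (mgr_command_reply tid 0: 0), and on its first activation the cephadm module initializes its migration marker ("Found migration_current of None. Setting to last migration."). Assuming the usual mgr module-option storage, the resulting state can be inspected with:

    # Assumed inspection commands (illustrative only): confirm the backend
    # selection and the cephadm module's migration marker.
    ceph orch status
    ceph config get mgr mgr/cephadm/migration_current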
2026-03-09T16:09:47.094 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:46 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:47.094 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:46 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:47.094 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:46 vm03 ceph-mon[51019]: mgrmap e6: vm03.gbgzmu(active, since 1.00967s) 2026-03-09T16:09:47.094 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:46 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:09:47.094 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:46 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:47.094 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:46 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:09:47.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout value unchanged 2026-03-09T16:09:47.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.980+0000 7f558dfc3640 1 Processor -- start 2026-03-09T16:09:47.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.981+0000 7f558dfc3640 1 -- start start 2026-03-09T16:09:47.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.981+0000 7f558dfc3640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5588111430 0x7f5588111850 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:47.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.981+0000 7f558dfc3640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5588111d90 con 0x7f5588111430 2026-03-09T16:09:47.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.981+0000 7f55877fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5588111430 0x7f5588111850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.981+0000 7f55877fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5588111430 0x7f5588111850 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52166/0 (socket says 192.168.123.103:52166) 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.981+0000 7f55877fe640 1 -- 192.168.123.103:0/593930047 learned_addr learned my addr 192.168.123.103:0/593930047 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.982+0000 7f55877fe640 1 -- 192.168.123.103:0/593930047 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5588111ed0 con 0x7f5588111430 2026-03-09T16:09:47.129 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.982+0000 7f55877fe640 1 --2- 192.168.123.103:0/593930047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5588111430 0x7f5588111850 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f5574009920 tx=0x7f557402ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=dd891528af778c95 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.983+0000 7f55867fc640 1 -- 192.168.123.103:0/593930047 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f557402f9b0 con 0x7f5588111430 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.983+0000 7f55867fc640 1 -- 192.168.123.103:0/593930047 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f5574037440 con 0x7f5588111430 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.983+0000 7f558dfc3640 1 -- 192.168.123.103:0/593930047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5588111430 msgr2=0x7f5588111850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.983+0000 7f558dfc3640 1 --2- 192.168.123.103:0/593930047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5588111430 0x7f5588111850 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f5574009920 tx=0x7f557402ef20 comp rx=0 tx=0).stop 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.983+0000 7f558dfc3640 1 -- 192.168.123.103:0/593930047 shutdown_connections 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.983+0000 7f558dfc3640 1 --2- 192.168.123.103:0/593930047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5588111430 0x7f5588111850 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.983+0000 7f558dfc3640 1 -- 192.168.123.103:0/593930047 >> 192.168.123.103:0/593930047 conn(0x7f558806c3b0 msgr2=0x7f558806c7c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.984+0000 7f558dfc3640 1 -- 192.168.123.103:0/593930047 shutdown_connections 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.984+0000 7f558dfc3640 1 -- 192.168.123.103:0/593930047 wait complete. 
2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.984+0000 7f558dfc3640 1 Processor -- start 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.984+0000 7f558dfc3640 1 -- start start 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.985+0000 7f558dfc3640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5588111430 0x7f55881a6d60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.985+0000 7f558dfc3640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5574035340 con 0x7f5588111430 2026-03-09T16:09:47.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.985+0000 7f55877fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5588111430 0x7f55881a6d60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.985+0000 7f55877fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5588111430 0x7f55881a6d60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52168/0 (socket says 192.168.123.103:52168) 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.985+0000 7f55877fe640 1 -- 192.168.123.103:0/4227736040 learned_addr learned my addr 192.168.123.103:0/4227736040 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.985+0000 7f55877fe640 1 -- 192.168.123.103:0/4227736040 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f55740095d0 con 0x7f5588111430 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.985+0000 7f55877fe640 1 --2- 192.168.123.103:0/4227736040 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5588111430 0x7f55881a6d60 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f55740098f0 tx=0x7f5574035ee0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.986+0000 7f5584ff9640 1 -- 192.168.123.103:0/4227736040 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5574037650 con 0x7f5588111430 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.986+0000 7f5584ff9640 1 -- 192.168.123.103:0/4227736040 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f5574037c30 con 0x7f5588111430 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.986+0000 7f5584ff9640 1 -- 192.168.123.103:0/4227736040 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f557403f3d0 con 0x7f5588111430 2026-03-09T16:09:47.130 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.986+0000 7f558dfc3640 1 -- 192.168.123.103:0/4227736040 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f55881a72a0 con 0x7f5588111430 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.986+0000 7f558dfc3640 1 -- 192.168.123.103:0/4227736040 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f55881a7740 con 0x7f5588111430 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.987+0000 7f5584ff9640 1 -- 192.168.123.103:0/4227736040 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 6) v1 ==== 49264+0+0 (secure 0 0 0) 0x7f557403e050 con 0x7f5588111430 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.987+0000 7f5584ff9640 1 --2- 192.168.123.103:0/4227736040 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f555c03d110 0x7f555c03f5d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.988+0000 7f5584ff9640 1 -- 192.168.123.103:0/4227736040 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f5574075760 con 0x7f5588111430 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.988+0000 7f5586ffd640 1 --2- 192.168.123.103:0/4227736040 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f555c03d110 0x7f555c03f5d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.988+0000 7f558dfc3640 1 -- 192.168.123.103:0/4227736040 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f554c005350 con 0x7f5588111430 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.991+0000 7f5584ff9640 1 -- 192.168.123.103:0/4227736040 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f557403c070 con 0x7f5588111430 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:46.992+0000 7f5586ffd640 1 --2- 192.168.123.103:0/4227736040 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f555c03d110 0x7f555c03f5d0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f556c0099c0 tx=0x7f556c006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.093+0000 7f558dfc3640 1 -- 192.168.123.103:0/4227736040 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- mgr_command(tid 0: {"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}) v1 -- 0x7f554c002bf0 con 0x7f555c03d110 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.094+0000 7f5584ff9640 1 -- 
192.168.123.103:0/4227736040 <== mgr.14118 v2:192.168.123.103:6800/4285644309 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+16 (secure 0 0 0) 0x7f554c002bf0 con 0x7f555c03d110 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.097+0000 7f558dfc3640 1 -- 192.168.123.103:0/4227736040 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f555c03d110 msgr2=0x7f555c03f5d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:47.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.097+0000 7f558dfc3640 1 --2- 192.168.123.103:0/4227736040 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f555c03d110 0x7f555c03f5d0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f556c0099c0 tx=0x7f556c006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:47.131 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.097+0000 7f558dfc3640 1 -- 192.168.123.103:0/4227736040 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5588111430 msgr2=0x7f55881a6d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:47.131 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.097+0000 7f558dfc3640 1 --2- 192.168.123.103:0/4227736040 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5588111430 0x7f55881a6d60 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f55740098f0 tx=0x7f5574035ee0 comp rx=0 tx=0).stop 2026-03-09T16:09:47.131 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.097+0000 7f558dfc3640 1 -- 192.168.123.103:0/4227736040 shutdown_connections 2026-03-09T16:09:47.131 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.097+0000 7f558dfc3640 1 --2- 192.168.123.103:0/4227736040 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f555c03d110 0x7f555c03f5d0 secure :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f556c0099c0 tx=0x7f556c006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:47.131 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.097+0000 7f558dfc3640 1 --2- 192.168.123.103:0/4227736040 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5588111430 0x7f55881a6d60 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:47.131 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.097+0000 7f558dfc3640 1 -- 192.168.123.103:0/4227736040 >> 192.168.123.103:0/4227736040 conn(0x7f558806c3b0 msgr2=0x7f558810f280 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:47.131 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.097+0000 7f558dfc3640 1 -- 192.168.123.103:0/4227736040 shutdown_connections 2026-03-09T16:09:47.131 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.097+0000 7f558dfc3640 1 -- 192.168.123.103:0/4227736040 wait complete. 2026-03-09T16:09:47.131 INFO:teuthology.orchestra.run.vm03.stdout:Generating ssh key... 
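With the backend selected, the harness configures the SSH user cephadm will connect as (the {"prefix": "cephadm set-user", "user": "root"} mgr_command above) and then generates the cluster SSH key ("Generating ssh key..."). Rough CLI equivalents of these steps, assumed rather than quoted from the harness, with the output path for the public key chosen only for illustration:

    # Rough CLI equivalents of the steps logged here: set the SSH user,
    # create the cluster SSH identity, and export the public half for
    # distribution to managed hosts.
    ceph cephadm set-user root
    ceph cephadm generate-key
    ceph cephadm get-pub-key > ceph.pub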
2026-03-09T16:09:47.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.241+0000 7f213a2f5640 1 Processor -- start 2026-03-09T16:09:47.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.241+0000 7f213a2f5640 1 -- start start 2026-03-09T16:09:47.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.241+0000 7f213a2f5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f213407be10 0x7f213407a310 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:47.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.242+0000 7f213a2f5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f213407a850 con 0x7f213407be10 2026-03-09T16:09:47.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.242+0000 7f2133fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f213407be10 0x7f213407a310 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:47.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.242+0000 7f2133fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f213407be10 0x7f213407a310 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52170/0 (socket says 192.168.123.103:52170) 2026-03-09T16:09:47.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.242+0000 7f2133fff640 1 -- 192.168.123.103:0/3158256635 learned_addr learned my addr 192.168.123.103:0/3158256635 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:47.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.242+0000 7f2133fff640 1 -- 192.168.123.103:0/3158256635 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f213407a990 con 0x7f213407be10 2026-03-09T16:09:47.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.243+0000 7f2133fff640 1 --2- 192.168.123.103:0/3158256635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f213407be10 0x7f213407a310 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f2118009920 tx=0x7f211802ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=b61a25df04f2d771 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:47.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.243+0000 7f2132ffd640 1 -- 192.168.123.103:0/3158256635 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f211802f9b0 con 0x7f213407be10 2026-03-09T16:09:47.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.243+0000 7f2132ffd640 1 -- 192.168.123.103:0/3158256635 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f2118037440 con 0x7f213407be10 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.243+0000 7f2132ffd640 1 -- 192.168.123.103:0/3158256635 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f211802f9b0 con 0x7f213407be10 
2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.244+0000 7f213a2f5640 1 -- 192.168.123.103:0/3158256635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f213407be10 msgr2=0x7f213407a310 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.244+0000 7f213a2f5640 1 --2- 192.168.123.103:0/3158256635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f213407be10 0x7f213407a310 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f2118009920 tx=0x7f211802ef20 comp rx=0 tx=0).stop 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.244+0000 7f213a2f5640 1 -- 192.168.123.103:0/3158256635 shutdown_connections 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.244+0000 7f213a2f5640 1 --2- 192.168.123.103:0/3158256635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f213407be10 0x7f213407a310 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.244+0000 7f213a2f5640 1 -- 192.168.123.103:0/3158256635 >> 192.168.123.103:0/3158256635 conn(0x7f21341018e0 msgr2=0x7f2134103d00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.244+0000 7f213a2f5640 1 -- 192.168.123.103:0/3158256635 shutdown_connections 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.244+0000 7f213a2f5640 1 -- 192.168.123.103:0/3158256635 wait complete. 
2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.245+0000 7f213a2f5640 1 Processor -- start 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.245+0000 7f213a2f5640 1 -- start start 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.245+0000 7f213a2f5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f213407be10 0x7f213419e200 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.245+0000 7f213a2f5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f213419e740 con 0x7f213407be10 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.245+0000 7f2133fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f213407be10 0x7f213419e200 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.245+0000 7f2133fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f213407be10 0x7f213419e200 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52186/0 (socket says 192.168.123.103:52186) 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.245+0000 7f2133fff640 1 -- 192.168.123.103:0/4232941701 learned_addr learned my addr 192.168.123.103:0/4232941701 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.246+0000 7f2133fff640 1 -- 192.168.123.103:0/4232941701 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f21180095d0 con 0x7f213407be10 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.246+0000 7f2133fff640 1 --2- 192.168.123.103:0/4232941701 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f213407be10 0x7f213419e200 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f2118037d70 tx=0x7f2118035ab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.246+0000 7f21317fa640 1 -- 192.168.123.103:0/4232941701 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f211802f9b0 con 0x7f213407be10 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.246+0000 7f21317fa640 1 -- 192.168.123.103:0/4232941701 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f211803f3f0 con 0x7f213407be10 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.247+0000 7f21317fa640 1 -- 192.168.123.103:0/4232941701 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f211802fb10 con 0x7f213407be10 2026-03-09T16:09:47.435 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.247+0000 7f213a2f5640 1 -- 192.168.123.103:0/4232941701 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f213419e940 con 0x7f213407be10 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.248+0000 7f213a2f5640 1 -- 192.168.123.103:0/4232941701 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f213419ede0 con 0x7f213407be10 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.249+0000 7f21317fa640 1 -- 192.168.123.103:0/4232941701 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 6) v1 ==== 49264+0+0 (secure 0 0 0) 0x7f211803e070 con 0x7f213407be10 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.249+0000 7f21317fa640 1 --2- 192.168.123.103:0/4232941701 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f210c03d070 0x7f210c03f530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.249+0000 7f21337fe640 1 --2- 192.168.123.103:0/4232941701 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f210c03d070 0x7f210c03f530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.249+0000 7f21317fa640 1 -- 192.168.123.103:0/4232941701 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f211807a580 con 0x7f213407be10 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.250+0000 7f21337fe640 1 --2- 192.168.123.103:0/4232941701 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f210c03d070 0x7f210c03f530 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f21200099c0 tx=0x7f2120006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.250+0000 7f213a2f5640 1 -- 192.168.123.103:0/4232941701 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f20f8005350 con 0x7f213407be10 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.254+0000 7f21317fa640 1 -- 192.168.123.103:0/4232941701 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f211807b050 con 0x7f213407be10 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.348+0000 7f213a2f5640 1 -- 192.168.123.103:0/4232941701 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- mgr_command(tid 0: {"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f20f8002bf0 con 0x7f210c03d070 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.377+0000 7f21317fa640 1 -- 192.168.123.103:0/4232941701 
<== mgr.14118 v2:192.168.123.103:6800/4285644309 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f20f8002bf0 con 0x7f210c03d070 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.379+0000 7f213a2f5640 1 -- 192.168.123.103:0/4232941701 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f210c03d070 msgr2=0x7f210c03f530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.379+0000 7f213a2f5640 1 --2- 192.168.123.103:0/4232941701 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f210c03d070 0x7f210c03f530 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f21200099c0 tx=0x7f2120006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.379+0000 7f213a2f5640 1 -- 192.168.123.103:0/4232941701 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f213407be10 msgr2=0x7f213419e200 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.379+0000 7f213a2f5640 1 --2- 192.168.123.103:0/4232941701 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f213407be10 0x7f213419e200 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f2118037d70 tx=0x7f2118035ab0 comp rx=0 tx=0).stop 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.380+0000 7f213a2f5640 1 -- 192.168.123.103:0/4232941701 shutdown_connections 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.380+0000 7f213a2f5640 1 --2- 192.168.123.103:0/4232941701 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f210c03d070 0x7f210c03f530 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.380+0000 7f213a2f5640 1 --2- 192.168.123.103:0/4232941701 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f213407be10 0x7f213419e200 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.380+0000 7f213a2f5640 1 -- 192.168.123.103:0/4232941701 >> 192.168.123.103:0/4232941701 conn(0x7f21341018e0 msgr2=0x7f2134102450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.380+0000 7f213a2f5640 1 -- 192.168.123.103:0/4232941701 shutdown_connections 2026-03-09T16:09:47.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.380+0000 7f213a2f5640 1 -- 192.168.123.103:0/4232941701 wait complete. 
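This invocation dispatched {"prefix": "cephadm generate-key"} to the mgr, asking the cephadm module to create the cluster SSH keypair that the earlier "Generating ssh key..." message refers to. As a standalone command that would roughly be:

    # rough CLI equivalent of the generate-key mgr command above (illustrative sketch)
    ceph cephadm generate-key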
2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKhDbemk21YojNt8ejP70oZxwYFsPVN4U87El3w1TeUZ ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.572+0000 7f0f628cb640 1 Processor -- start 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.572+0000 7f0f628cb640 1 -- start start 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.572+0000 7f0f628cb640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f5c108370 0x7f0f5c108770 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.572+0000 7f0f628cb640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0f5c108d40 con 0x7f0f5c108370 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.573+0000 7f0f5bfff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f5c108370 0x7f0f5c108770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.573+0000 7f0f5bfff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f5c108370 0x7f0f5c108770 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52194/0 (socket says 192.168.123.103:52194) 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.573+0000 7f0f5bfff640 1 -- 192.168.123.103:0/3039775624 learned_addr learned my addr 192.168.123.103:0/3039775624 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.573+0000 7f0f5bfff640 1 -- 192.168.123.103:0/3039775624 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0f5c1094c0 con 0x7f0f5c108370 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.574+0000 7f0f5bfff640 1 --2- 192.168.123.103:0/3039775624 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f5c108370 0x7f0f5c108770 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f0f4c009b80 tx=0x7f0f4c02f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=3586669a3f5b725d server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.574+0000 7f0f5affd640 1 -- 192.168.123.103:0/3039775624 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0f4c02fa10 con 0x7f0f5c108370 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.574+0000 7f0f5affd640 1 -- 192.168.123.103:0/3039775624 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f0f4c037440 con 0x7f0f5c108370 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-09T16:09:47.575+0000 7f0f628cb640 1 -- 192.168.123.103:0/3039775624 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f5c108370 msgr2=0x7f0f5c108770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.575+0000 7f0f628cb640 1 --2- 192.168.123.103:0/3039775624 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f5c108370 0x7f0f5c108770 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f0f4c009b80 tx=0x7f0f4c02f190 comp rx=0 tx=0).stop 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.575+0000 7f0f628cb640 1 -- 192.168.123.103:0/3039775624 shutdown_connections 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.575+0000 7f0f628cb640 1 --2- 192.168.123.103:0/3039775624 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f5c108370 0x7f0f5c108770 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.575+0000 7f0f628cb640 1 -- 192.168.123.103:0/3039775624 >> 192.168.123.103:0/3039775624 conn(0x7f0f5c07bc00 msgr2=0x7f0f5c1066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.576+0000 7f0f628cb640 1 -- 192.168.123.103:0/3039775624 shutdown_connections 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.576+0000 7f0f628cb640 1 -- 192.168.123.103:0/3039775624 wait complete. 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.576+0000 7f0f628cb640 1 Processor -- start 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.576+0000 7f0f628cb640 1 -- start start 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.576+0000 7f0f628cb640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f5c108370 0x7f0f5c19e380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.576+0000 7f0f628cb640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0f4c0353c0 con 0x7f0f5c108370 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.577+0000 7f0f5bfff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f5c108370 0x7f0f5c19e380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.577+0000 7f0f5bfff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f5c108370 0x7f0f5c19e380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52206/0 (socket says 192.168.123.103:52206) 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.577+0000 7f0f5bfff640 
1 -- 192.168.123.103:0/3186631505 learned_addr learned my addr 192.168.123.103:0/3186631505 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.577+0000 7f0f5bfff640 1 -- 192.168.123.103:0/3186631505 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0f4c0095d0 con 0x7f0f5c108370 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.577+0000 7f0f5bfff640 1 --2- 192.168.123.103:0/3186631505 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f5c108370 0x7f0f5c19e380 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f0f4c009b50 tx=0x7f0f4c035a30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.578+0000 7f0f597fa640 1 -- 192.168.123.103:0/3186631505 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0f4c035ad0 con 0x7f0f5c108370 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.578+0000 7f0f597fa640 1 -- 192.168.123.103:0/3186631505 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f0f4c035c30 con 0x7f0f5c108370 2026-03-09T16:09:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.579+0000 7f0f597fa640 1 -- 192.168.123.103:0/3186631505 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0f4c0377a0 con 0x7f0f5c108370 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.579+0000 7f0f628cb640 1 -- 192.168.123.103:0/3186631505 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0f5c19e8c0 con 0x7f0f5c108370 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.579+0000 7f0f628cb640 1 -- 192.168.123.103:0/3186631505 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0f5c19ed60 con 0x7f0f5c108370 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.580+0000 7f0f628cb640 1 -- 192.168.123.103:0/3186631505 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0f5c10cdf0 con 0x7f0f5c108370 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.583+0000 7f0f597fa640 1 -- 192.168.123.103:0/3186631505 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 6) v1 ==== 49264+0+0 (secure 0 0 0) 0x7f0f4c037900 con 0x7f0f5c108370 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.583+0000 7f0f597fa640 1 --2- 192.168.123.103:0/3186631505 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f0f2c03d0c0 0x7f0f2c03f580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.584+0000 7f0f5b7fe640 1 --2- 192.168.123.103:0/3186631505 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f0f2c03d0c0 
0x7f0f2c03f580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.584+0000 7f0f5b7fe640 1 --2- 192.168.123.103:0/3186631505 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f0f2c03d0c0 0x7f0f2c03f580 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f0f48009a10 tx=0x7f0f48006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.584+0000 7f0f597fa640 1 -- 192.168.123.103:0/3186631505 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f0f4c076150 con 0x7f0f5c108370 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.584+0000 7f0f597fa640 1 -- 192.168.123.103:0/3186631505 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f0f4c0765d0 con 0x7f0f5c108370 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.678+0000 7f0f628cb640 1 -- 192.168.123.103:0/3186631505 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- mgr_command(tid 0: {"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f0f5c106550 con 0x7f0f2c03d0c0 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.679+0000 7f0f597fa640 1 -- 192.168.123.103:0/3186631505 <== mgr.14118 v2:192.168.123.103:6800/4285644309 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+123 (secure 0 0 0) 0x7f0f5c106550 con 0x7f0f2c03d0c0 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.681+0000 7f0f628cb640 1 -- 192.168.123.103:0/3186631505 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f0f2c03d0c0 msgr2=0x7f0f2c03f580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.681+0000 7f0f628cb640 1 --2- 192.168.123.103:0/3186631505 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f0f2c03d0c0 0x7f0f2c03f580 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f0f48009a10 tx=0x7f0f48006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.682+0000 7f0f628cb640 1 -- 192.168.123.103:0/3186631505 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f5c108370 msgr2=0x7f0f5c19e380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.682+0000 7f0f628cb640 1 --2- 192.168.123.103:0/3186631505 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f5c108370 0x7f0f5c19e380 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f0f4c009b50 tx=0x7f0f4c035a30 comp rx=0 tx=0).stop 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.682+0000 7f0f628cb640 1 -- 192.168.123.103:0/3186631505 shutdown_connections 2026-03-09T16:09:47.715 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.682+0000 7f0f628cb640 1 --2- 192.168.123.103:0/3186631505 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f0f2c03d0c0 0x7f0f2c03f580 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.682+0000 7f0f628cb640 1 --2- 192.168.123.103:0/3186631505 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f5c108370 0x7f0f5c19e380 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.682+0000 7f0f628cb640 1 -- 192.168.123.103:0/3186631505 >> 192.168.123.103:0/3186631505 conn(0x7f0f5c07bc00 msgr2=0x7f0f5c105e40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.682+0000 7f0f628cb640 1 -- 192.168.123.103:0/3186631505 shutdown_connections 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.683+0000 7f0f628cb640 1 -- 192.168.123.103:0/3186631505 wait complete. 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:Wrote public SSH key to /home/ubuntu/cephtest/ceph.pub 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:Adding key to root@localhost authorized_keys... 2026-03-09T16:09:47.715 INFO:teuthology.orchestra.run.vm03.stdout:Adding host vm03... 2026-03-09T16:09:48.202 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:47 vm03 ceph-mon[51019]: [09/Mar/2026:16:09:46] ENGINE Bus STARTING 2026-03-09T16:09:48.202 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:47 vm03 ceph-mon[51019]: from='client.14122 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-09T16:09:48.202 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:47 vm03 ceph-mon[51019]: from='client.14122 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-09T16:09:48.202 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:47 vm03 ceph-mon[51019]: [09/Mar/2026:16:09:46] ENGINE Serving on https://192.168.123.103:7150 2026-03-09T16:09:48.203 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:47 vm03 ceph-mon[51019]: [09/Mar/2026:16:09:46] ENGINE Client ('192.168.123.103', 47966) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T16:09:48.203 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:47 vm03 ceph-mon[51019]: [09/Mar/2026:16:09:46] ENGINE Serving on http://192.168.123.103:8765 2026-03-09T16:09:48.203 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:47 vm03 ceph-mon[51019]: [09/Mar/2026:16:09:46] ENGINE Bus STARTED 2026-03-09T16:09:48.203 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:47 vm03 ceph-mon[51019]: from='client.14130 -' entity='client.admin' cmd=[{"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:09:48.203 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:47 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:48.203 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:47 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' 
entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:49.069 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:48 vm03 ceph-mon[51019]: from='client.14132 -' entity='client.admin' cmd=[{"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:09:49.069 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:48 vm03 ceph-mon[51019]: from='client.14134 -' entity='client.admin' cmd=[{"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:09:49.069 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:48 vm03 ceph-mon[51019]: Generating ssh key... 2026-03-09T16:09:49.069 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:48 vm03 ceph-mon[51019]: from='client.14136 -' entity='client.admin' cmd=[{"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:09:49.069 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:48 vm03 ceph-mon[51019]: from='client.14138 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm03", "addr": "192.168.123.103", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:09:49.069 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:48 vm03 ceph-mon[51019]: mgrmap e7: vm03.gbgzmu(active, since 2s) 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Added host 'vm03' with addr '192.168.123.103' 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.835+0000 7f6acf5f1640 1 Processor -- start 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.836+0000 7f6acf5f1640 1 -- start start 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.836+0000 7f6acf5f1640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8111430 0x7f6ac8111830 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.836+0000 7f6acf5f1640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ac8111e00 con 0x7f6ac8111430 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.836+0000 7f6acd366640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8111430 0x7f6ac8111830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.836+0000 7f6acd366640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8111430 0x7f6ac8111830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52220/0 (socket says 192.168.123.103:52220) 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.836+0000 7f6acd366640 1 -- 192.168.123.103:0/162720556 learned_addr learned my addr 192.168.123.103:0/162720556 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.837+0000 7f6acd366640 1 -- 192.168.123.103:0/162720556 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6ac8111f40 con 0x7f6ac8111430 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.837+0000 7f6acd366640 1 --2- 192.168.123.103:0/162720556 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8111430 0x7f6ac8111830 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f6ab8009920 tx=0x7f6ab802ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=1eea2bdac4246e9 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.838+0000 7f6ab7fff640 1 -- 192.168.123.103:0/162720556 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6ab802f9b0 con 0x7f6ac8111430 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.838+0000 7f6ab7fff640 1 -- 192.168.123.103:0/162720556 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f6ab8037440 con 0x7f6ac8111430 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.838+0000 7f6acf5f1640 1 -- 192.168.123.103:0/162720556 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8111430 msgr2=0x7f6ac8111830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.838+0000 7f6acf5f1640 1 --2- 192.168.123.103:0/162720556 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8111430 0x7f6ac8111830 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f6ab8009920 tx=0x7f6ab802ef20 comp rx=0 tx=0).stop 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.839+0000 7f6acf5f1640 1 -- 192.168.123.103:0/162720556 shutdown_connections 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.839+0000 7f6acf5f1640 1 --2- 192.168.123.103:0/162720556 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8111430 0x7f6ac8111830 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.839+0000 7f6acf5f1640 1 -- 192.168.123.103:0/162720556 >> 192.168.123.103:0/162720556 conn(0x7f6ac806c3b0 msgr2=0x7f6ac806c7c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.839+0000 7f6acf5f1640 1 -- 192.168.123.103:0/162720556 shutdown_connections 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.839+0000 7f6acf5f1640 1 -- 192.168.123.103:0/162720556 wait complete. 
2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.839+0000 7f6acf5f1640 1 Processor -- start 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.839+0000 7f6acf5f1640 1 -- start start 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.840+0000 7f6acf5f1640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8111430 0x7f6ac81a2870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.840+0000 7f6acf5f1640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ab8035340 con 0x7f6ac8111430 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.840+0000 7f6acd366640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8111430 0x7f6ac81a2870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.840+0000 7f6acd366640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8111430 0x7f6ac81a2870 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52234/0 (socket says 192.168.123.103:52234) 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.840+0000 7f6acd366640 1 -- 192.168.123.103:0/834361027 learned_addr learned my addr 192.168.123.103:0/834361027 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.840+0000 7f6acd366640 1 -- 192.168.123.103:0/834361027 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6ab80095d0 con 0x7f6ac8111430 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.840+0000 7f6acd366640 1 --2- 192.168.123.103:0/834361027 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8111430 0x7f6ac81a2870 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f6ab802f450 tx=0x7f6ab8035ed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.841+0000 7f6ab67fc640 1 -- 192.168.123.103:0/834361027 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6ab80376d0 con 0x7f6ac8111430 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.841+0000 7f6ab67fc640 1 -- 192.168.123.103:0/834361027 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f6ab8037cb0 con 0x7f6ac8111430 2026-03-09T16:09:49.651 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.841+0000 7f6ab67fc640 1 -- 192.168.123.103:0/834361027 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6ab803f3d0 con 0x7f6ac8111430 2026-03-09T16:09:49.651 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.841+0000 7f6acf5f1640 1 -- 192.168.123.103:0/834361027 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6ac81a2db0 con 0x7f6ac8111430 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.841+0000 7f6acf5f1640 1 -- 192.168.123.103:0/834361027 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6ac81a3250 con 0x7f6ac8111430 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.843+0000 7f6ab67fc640 1 -- 192.168.123.103:0/834361027 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 6) v1 ==== 49264+0+0 (secure 0 0 0) 0x7f6ab803e050 con 0x7f6ac8111430 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.843+0000 7f6ab67fc640 1 --2- 192.168.123.103:0/834361027 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f6aa003d110 0x7f6aa003f5d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.843+0000 7f6ab67fc640 1 -- 192.168.123.103:0/834361027 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f6ab8076730 con 0x7f6ac8111430 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.843+0000 7f6accb65640 1 --2- 192.168.123.103:0/834361027 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f6aa003d110 0x7f6aa003f5d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.844+0000 7f6accb65640 1 --2- 192.168.123.103:0/834361027 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f6aa003d110 0x7f6aa003f5d0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f6abc009a10 tx=0x7f6abc006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.844+0000 7f6acf5f1640 1 -- 192.168.123.103:0/834361027 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6a90005350 con 0x7f6ac8111430 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.847+0000 7f6ab67fc640 1 -- 192.168.123.103:0/834361027 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6ab803c070 con 0x7f6ac8111430 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:47.940+0000 7f6acf5f1640 1 -- 192.168.123.103:0/834361027 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm03", "addr": "192.168.123.103", "target": ["mon-mgr", ""]}) v1 -- 0x7f6a90002bf0 con 0x7f6aa003d110 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:48.380+0000 7f6ab67fc640 1 
-- 192.168.123.103:0/834361027 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 7) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f6ab8040470 con 0x7f6ac8111430 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.557+0000 7f6ab67fc640 1 -- 192.168.123.103:0/834361027 <== mgr.14118 v2:192.168.123.103:6800/4285644309 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7f6a90002bf0 con 0x7f6aa003d110 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.560+0000 7f6acf5f1640 1 -- 192.168.123.103:0/834361027 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f6aa003d110 msgr2=0x7f6aa003f5d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.560+0000 7f6acf5f1640 1 --2- 192.168.123.103:0/834361027 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f6aa003d110 0x7f6aa003f5d0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f6abc009a10 tx=0x7f6abc006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.560+0000 7f6acf5f1640 1 -- 192.168.123.103:0/834361027 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8111430 msgr2=0x7f6ac81a2870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.560+0000 7f6acf5f1640 1 --2- 192.168.123.103:0/834361027 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8111430 0x7f6ac81a2870 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f6ab802f450 tx=0x7f6ab8035ed0 comp rx=0 tx=0).stop 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.561+0000 7f6acf5f1640 1 -- 192.168.123.103:0/834361027 shutdown_connections 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.561+0000 7f6acf5f1640 1 --2- 192.168.123.103:0/834361027 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f6aa003d110 0x7f6aa003f5d0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.561+0000 7f6acf5f1640 1 --2- 192.168.123.103:0/834361027 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8111430 0x7f6ac81a2870 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.561+0000 7f6acf5f1640 1 -- 192.168.123.103:0/834361027 >> 192.168.123.103:0/834361027 conn(0x7f6ac806c3b0 msgr2=0x7f6ac810f190 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.561+0000 7f6acf5f1640 1 -- 192.168.123.103:0/834361027 shutdown_connections 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.561+0000 7f6acf5f1640 1 -- 192.168.123.103:0/834361027 wait complete. 2026-03-09T16:09:49.652 INFO:teuthology.orchestra.run.vm03.stdout:Deploying mon service with default placement... 
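Taken together, the invocations above fetch the freshly generated public key ({"prefix": "cephadm get-pub-key"}, written to /home/ubuntu/cephtest/ceph.pub and added to root@localhost authorized_keys) and register the node via {"prefix": "orch host add", "hostname": "vm03", "addr": "192.168.123.103"}; the "Deploying mon service with default placement..." step that follows dispatches {"prefix": "orch apply", "service_type": "mon"}, acknowledged below as "Scheduled mon update...". Using the host name and address shown in the log, the rough CLI equivalents are:

    # rough CLI equivalents of the mon/mgr commands visible in this bootstrap phase (sketch)
    ceph cephadm get-pub-key
    ceph orch host add vm03 192.168.123.103
    ceph orch apply mon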
2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled mon update... 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.808+0000 7f8e374b3640 1 Processor -- start 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.808+0000 7f8e374b3640 1 -- start start 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.809+0000 7f8e374b3640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e30072f50 0x7f8e30071400 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.809+0000 7f8e374b3640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e300719d0 con 0x7f8e30072f50 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.809+0000 7f8e35228640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e30072f50 0x7f8e30071400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.809+0000 7f8e35228640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e30072f50 0x7f8e30071400 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48278/0 (socket says 192.168.123.103:48278) 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.809+0000 7f8e35228640 1 -- 192.168.123.103:0/2085512910 learned_addr learned my addr 192.168.123.103:0/2085512910 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.809+0000 7f8e35228640 1 -- 192.168.123.103:0/2085512910 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8e30071b10 con 0x7f8e30072f50 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.810+0000 7f8e35228640 1 --2- 192.168.123.103:0/2085512910 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e30072f50 0x7f8e30071400 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f8e2c009b80 tx=0x7f8e2c02f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=58141806be1be7f8 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.811+0000 7f8e27fff640 1 -- 192.168.123.103:0/2085512910 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8e2c02fa10 con 0x7f8e30072f50 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.811+0000 7f8e27fff640 1 -- 192.168.123.103:0/2085512910 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f8e2c037440 con 0x7f8e30072f50 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.812+0000 7f8e374b3640 1 -- 192.168.123.103:0/2085512910 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e30072f50 msgr2=0x7f8e30071400 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.812+0000 7f8e374b3640 1 --2- 192.168.123.103:0/2085512910 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e30072f50 0x7f8e30071400 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f8e2c009b80 tx=0x7f8e2c02f190 comp rx=0 tx=0).stop 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.812+0000 7f8e374b3640 1 -- 192.168.123.103:0/2085512910 shutdown_connections 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.812+0000 7f8e374b3640 1 --2- 192.168.123.103:0/2085512910 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e30072f50 0x7f8e30071400 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.812+0000 7f8e374b3640 1 -- 192.168.123.103:0/2085512910 >> 192.168.123.103:0/2085512910 conn(0x7f8e3006d080 msgr2=0x7f8e3006f4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.812+0000 7f8e374b3640 1 -- 192.168.123.103:0/2085512910 shutdown_connections 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.812+0000 7f8e374b3640 1 -- 192.168.123.103:0/2085512910 wait complete. 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.812+0000 7f8e374b3640 1 Processor -- start 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.812+0000 7f8e374b3640 1 -- start start 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.813+0000 7f8e374b3640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e30086600 0x7f8e30089b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.813+0000 7f8e374b3640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e2c0353c0 con 0x7f8e30086600 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.813+0000 7f8e35228640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e30086600 0x7f8e30089b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.813+0000 7f8e35228640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e30086600 0x7f8e30089b00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48292/0 (socket says 192.168.123.103:48292) 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.813+0000 7f8e35228640 1 -- 192.168.123.103:0/1651027104 learned_addr learned my addr 
192.168.123.103:0/1651027104 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.813+0000 7f8e35228640 1 -- 192.168.123.103:0/1651027104 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8e2c0095d0 con 0x7f8e30086600 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.813+0000 7f8e35228640 1 --2- 192.168.123.103:0/1651027104 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e30086600 0x7f8e30089b00 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f8e2c02f6c0 tx=0x7f8e2c035d70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.814+0000 7f8e267fc640 1 -- 192.168.123.103:0/1651027104 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8e2c035e30 con 0x7f8e30086600 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.814+0000 7f8e374b3640 1 -- 192.168.123.103:0/1651027104 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8e3008a040 con 0x7f8e30086600 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.814+0000 7f8e374b3640 1 -- 192.168.123.103:0/1651027104 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8e30086c70 con 0x7f8e30086600 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.816+0000 7f8e267fc640 1 -- 192.168.123.103:0/1651027104 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f8e2c02fe90 con 0x7f8e30086600 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.816+0000 7f8e267fc640 1 -- 192.168.123.103:0/1651027104 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8e2c0375c0 con 0x7f8e30086600 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.816+0000 7f8e267fc640 1 -- 192.168.123.103:0/1651027104 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f8e2c03e070 con 0x7f8e30086600 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.818+0000 7f8e267fc640 1 --2- 192.168.123.103:0/1651027104 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f8e1803d280 0x7f8e1803f740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.818+0000 7f8e34a27640 1 --2- 192.168.123.103:0/1651027104 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f8e1803d280 0x7f8e1803f740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.818+0000 7f8e267fc640 1 -- 192.168.123.103:0/1651027104 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 
940+0+0 (secure 0 0 0) 0x7f8e2c075f30 con 0x7f8e30086600 2026-03-09T16:09:50.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.818+0000 7f8e34a27640 1 --2- 192.168.123.103:0/1651027104 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f8e1803d280 0x7f8e1803f740 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f8e2800ad30 tx=0x7f8e280093f0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.819+0000 7f8e374b3640 1 -- 192.168.123.103:0/1651027104 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8e00005350 con 0x7f8e30086600 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.824+0000 7f8e267fc640 1 -- 192.168.123.103:0/1651027104 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f8e2c045020 con 0x7f8e30086600 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.958+0000 7f8e374b3640 1 -- 192.168.123.103:0/1651027104 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}) v1 -- 0x7f8e00002bf0 con 0x7f8e1803d280 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.965+0000 7f8e267fc640 1 -- 192.168.123.103:0/1651027104 <== mgr.14118 v2:192.168.123.103:6800/4285644309 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f8e00002bf0 con 0x7f8e1803d280 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.967+0000 7f8e374b3640 1 -- 192.168.123.103:0/1651027104 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f8e1803d280 msgr2=0x7f8e1803f740 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.967+0000 7f8e374b3640 1 --2- 192.168.123.103:0/1651027104 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f8e1803d280 0x7f8e1803f740 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f8e2800ad30 tx=0x7f8e280093f0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.967+0000 7f8e374b3640 1 -- 192.168.123.103:0/1651027104 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e30086600 msgr2=0x7f8e30089b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.967+0000 7f8e374b3640 1 --2- 192.168.123.103:0/1651027104 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e30086600 0x7f8e30089b00 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f8e2c02f6c0 tx=0x7f8e2c035d70 comp rx=0 tx=0).stop 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.967+0000 7f8e374b3640 1 -- 192.168.123.103:0/1651027104 shutdown_connections 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-09T16:09:49.967+0000 7f8e374b3640 1 --2- 192.168.123.103:0/1651027104 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f8e1803d280 0x7f8e1803f740 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.967+0000 7f8e374b3640 1 --2- 192.168.123.103:0/1651027104 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e30086600 0x7f8e30089b00 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.968+0000 7f8e374b3640 1 -- 192.168.123.103:0/1651027104 >> 192.168.123.103:0/1651027104 conn(0x7f8e3006d080 msgr2=0x7f8e3006d8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.968+0000 7f8e374b3640 1 -- 192.168.123.103:0/1651027104 shutdown_connections 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:49.968+0000 7f8e374b3640 1 -- 192.168.123.103:0/1651027104 wait complete. 2026-03-09T16:09:50.014 INFO:teuthology.orchestra.run.vm03.stdout:Deploying mgr service with default placement... 2026-03-09T16:09:50.075 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:49 vm03 ceph-mon[51019]: Deploying cephadm binary to vm03 2026-03-09T16:09:50.075 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:49 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:50.075 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:49 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled mgr update... 
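The "Scheduled mon update..." / "Scheduled mgr update..." stdout lines above come from cephadm bootstrap applying its default service specs: each short-lived /usr/bin/ceph invocation connects to the mon, fetches the monmap/mgrmap, and sends an "orch apply" mgr_command (visible above as {"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}) to mgr.14118 before shutting its connections down. A minimal sketch of the equivalent manual CLI calls, assuming an admin keyring on the bootstrap host and leaving placement at cephadm's defaults (nothing beyond what this log shows):

    ceph orch apply mon    # same payload as the mgr_command with service_type "mon" above
    ceph orch apply mgr    # same call with service_type "mgr"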
2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.141+0000 7fd010bf2640 1 Processor -- start 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.141+0000 7fd010bf2640 1 -- start start 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.141+0000 7fd010bf2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd0040a48d0 0x7fd0040a4cd0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.141+0000 7fd010bf2640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd0040a52a0 con 0x7fd0040a48d0 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.142+0000 7fd00b7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd0040a48d0 0x7fd0040a4cd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.142+0000 7fd00b7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd0040a48d0 0x7fd0040a4cd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48306/0 (socket says 192.168.123.103:48306) 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.142+0000 7fd00b7fe640 1 -- 192.168.123.103:0/3460144644 learned_addr learned my addr 192.168.123.103:0/3460144644 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.142+0000 7fd00b7fe640 1 -- 192.168.123.103:0/3460144644 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd0040a5ad0 con 0x7fd0040a48d0 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.142+0000 7fd00b7fe640 1 --2- 192.168.123.103:0/3460144644 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd0040a48d0 0x7fd0040a4cd0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fd00c0695e0 tx=0x7fd00c09b9f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=d0a09bbe7e22e70e server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.143+0000 7fd00a7fc640 1 -- 192.168.123.103:0/3460144644 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd00c0a2500 con 0x7fd0040a48d0 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.143+0000 7fd00a7fc640 1 -- 192.168.123.103:0/3460144644 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd00c0a2b20 con 0x7fd0040a48d0 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.144+0000 7fd010bf2640 1 -- 192.168.123.103:0/3460144644 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd0040a48d0 msgr2=0x7fd0040a4cd0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.144+0000 7fd010bf2640 1 --2- 192.168.123.103:0/3460144644 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd0040a48d0 0x7fd0040a4cd0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fd00c0695e0 tx=0x7fd00c09b9f0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.144+0000 7fd010bf2640 1 -- 192.168.123.103:0/3460144644 shutdown_connections 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.144+0000 7fd010bf2640 1 --2- 192.168.123.103:0/3460144644 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd0040a48d0 0x7fd0040a4cd0 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.144+0000 7fd010bf2640 1 -- 192.168.123.103:0/3460144644 >> 192.168.123.103:0/3460144644 conn(0x7fd00409fbe0 msgr2=0x7fd0040a2040 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:50.310 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.145+0000 7fd010bf2640 1 -- 192.168.123.103:0/3460144644 shutdown_connections 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.145+0000 7fd010bf2640 1 -- 192.168.123.103:0/3460144644 wait complete. 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.145+0000 7fd010bf2640 1 Processor -- start 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.146+0000 7fd010bf2640 1 -- start start 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.146+0000 7fd010bf2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd0040a48d0 0x7fd00414b650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.146+0000 7fd010bf2640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd00c0a1050 con 0x7fd0040a48d0 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.146+0000 7fd00b7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd0040a48d0 0x7fd00414b650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.146+0000 7fd00b7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd0040a48d0 0x7fd00414b650 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48314/0 (socket says 192.168.123.103:48314) 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.146+0000 7fd00b7fe640 1 -- 192.168.123.103:0/4278998681 learned_addr learned my addr 192.168.123.103:0/4278998681 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:50.311 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.146+0000 7fd00b7fe640 1 -- 192.168.123.103:0/4278998681 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd00c04fa00 con 0x7fd0040a48d0 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.147+0000 7fd00b7fe640 1 --2- 192.168.123.103:0/4278998681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd0040a48d0 0x7fd00414b650 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fd00c073780 tx=0x7fd00c0a2730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.148+0000 7fd008ff9640 1 -- 192.168.123.103:0/4278998681 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd00c0a2820 con 0x7fd0040a48d0 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.148+0000 7fd010bf2640 1 -- 192.168.123.103:0/4278998681 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd00414bb90 con 0x7fd0040a48d0 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.148+0000 7fd010bf2640 1 -- 192.168.123.103:0/4278998681 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd00414c030 con 0x7fd0040a48d0 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.148+0000 7fd010bf2640 1 -- 192.168.123.103:0/4278998681 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcfd8005350 con 0x7fd0040a48d0 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.149+0000 7fd008ff9640 1 -- 192.168.123.103:0/4278998681 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd00c0a2980 con 0x7fd0040a48d0 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.149+0000 7fd008ff9640 1 -- 192.168.123.103:0/4278998681 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd00c0af650 con 0x7fd0040a48d0 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.149+0000 7fd008ff9640 1 -- 192.168.123.103:0/4278998681 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 49370+0+0 (secure 0 0 0) 0x7fd00c0af870 con 0x7fd0040a48d0 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.149+0000 7fd008ff9640 1 --2- 192.168.123.103:0/4278998681 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fcfe003d230 0x7fcfe003f6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.150+0000 7fd00affd640 1 --2- 192.168.123.103:0/4278998681 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fcfe003d230 0x7fcfe003f6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.150+0000 7fd00affd640 1 --2- 192.168.123.103:0/4278998681 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fcfe003d230 0x7fcfe003f6f0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7fcffc009a10 tx=0x7fcffc006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.152+0000 7fd008ff9640 1 -- 192.168.123.103:0/4278998681 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fd00c0a2070 con 0x7fd0040a48d0 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.152+0000 7fd008ff9640 1 -- 192.168.123.103:0/4278998681 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fd00c0a3d40 con 0x7fd0040a48d0 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.266+0000 7fd010bf2640 1 -- 192.168.123.103:0/4278998681 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}) v1 -- 0x7fcfd8002bf0 con 0x7fcfe003d230 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.272+0000 7fd008ff9640 1 -- 192.168.123.103:0/4278998681 <== mgr.14118 v2:192.168.123.103:6800/4285644309 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7fcfd8002bf0 con 0x7fcfe003d230 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.276+0000 7fd010bf2640 1 -- 192.168.123.103:0/4278998681 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fcfe003d230 msgr2=0x7fcfe003f6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.276+0000 7fd010bf2640 1 --2- 192.168.123.103:0/4278998681 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fcfe003d230 0x7fcfe003f6f0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7fcffc009a10 tx=0x7fcffc006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.276+0000 7fd010bf2640 1 -- 192.168.123.103:0/4278998681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd0040a48d0 msgr2=0x7fd00414b650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:50.311 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.276+0000 7fd010bf2640 1 --2- 192.168.123.103:0/4278998681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd0040a48d0 0x7fd00414b650 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fd00c073780 tx=0x7fd00c0a2730 comp rx=0 tx=0).stop 2026-03-09T16:09:50.312 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.276+0000 7fd010bf2640 1 -- 192.168.123.103:0/4278998681 shutdown_connections 2026-03-09T16:09:50.312 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.276+0000 7fd010bf2640 1 --2- 192.168.123.103:0/4278998681 >> 
[v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fcfe003d230 0x7fcfe003f6f0 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.312 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.276+0000 7fd010bf2640 1 --2- 192.168.123.103:0/4278998681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd0040a48d0 0x7fd00414b650 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.312 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.276+0000 7fd010bf2640 1 -- 192.168.123.103:0/4278998681 >> 192.168.123.103:0/4278998681 conn(0x7fd00409fbe0 msgr2=0x7fd0040a0570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:50.312 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.277+0000 7fd010bf2640 1 -- 192.168.123.103:0/4278998681 shutdown_connections 2026-03-09T16:09:50.312 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.277+0000 7fd010bf2640 1 -- 192.168.123.103:0/4278998681 wait complete. 2026-03-09T16:09:50.312 INFO:teuthology.orchestra.run.vm03.stdout:Deploying crash service with default placement... 2026-03-09T16:09:50.616 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled crash update... 2026-03-09T16:09:50.616 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.430+0000 7f81eadfc640 1 Processor -- start 2026-03-09T16:09:50.617 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.430+0000 7f81eadfc640 1 -- start start 2026-03-09T16:09:50.617 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.430+0000 7f81eadfc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81e40718f0 0x7f81e4071cf0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:50.617 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.430+0000 7f81eadfc640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f81e4070d10 con 0x7f81e40718f0 2026-03-09T16:09:50.617 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.431+0000 7f81e8b71640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81e40718f0 0x7f81e4071cf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:50.617 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.431+0000 7f81e8b71640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81e40718f0 0x7f81e4071cf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48328/0 (socket says 192.168.123.103:48328) 2026-03-09T16:09:50.617 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.431+0000 7f81e8b71640 1 -- 192.168.123.103:0/2475170490 learned_addr learned my addr 192.168.123.103:0/2475170490 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.431+0000 7f81e8b71640 1 -- 192.168.123.103:0/2475170490 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f81e4070e50 con 0x7f81e40718f0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.432+0000 7f81e8b71640 1 --2- 192.168.123.103:0/2475170490 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81e40718f0 0x7f81e4071cf0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f81dc00d180 tx=0x7f81dc0315f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=5d9c0455655267d5 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.432+0000 7f81e37fe640 1 -- 192.168.123.103:0/2475170490 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f81dc006ea0 con 0x7f81e40718f0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.432+0000 7f81e37fe640 1 -- 192.168.123.103:0/2475170490 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f81dc034050 con 0x7f81e40718f0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.432+0000 7f81e37fe640 1 -- 192.168.123.103:0/2475170490 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f81dc0387c0 con 0x7f81e40718f0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.433+0000 7f81eadfc640 1 -- 192.168.123.103:0/2475170490 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81e40718f0 msgr2=0x7f81e4071cf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.433+0000 7f81eadfc640 1 --2- 192.168.123.103:0/2475170490 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81e40718f0 0x7f81e4071cf0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f81dc00d180 tx=0x7f81dc0315f0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.435+0000 7f81eadfc640 1 -- 192.168.123.103:0/2475170490 shutdown_connections 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.435+0000 7f81eadfc640 1 --2- 192.168.123.103:0/2475170490 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81e40718f0 0x7f81e4071cf0 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.435+0000 7f81eadfc640 1 -- 192.168.123.103:0/2475170490 >> 192.168.123.103:0/2475170490 conn(0x7f81e406d080 msgr2=0x7f81e406f4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.435+0000 7f81eadfc640 1 -- 192.168.123.103:0/2475170490 shutdown_connections 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.435+0000 7f81eadfc640 1 -- 192.168.123.103:0/2475170490 wait complete. 
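The stderr runs above are the messenger-level trace of each of these short-lived clients: a connection walks NONE -> BANNER_CONNECTING -> HELLO_CONNECTING -> READY, the client learns its own address and subscribes to config and monmap, then marks the connection down (READY -> CLOSED, shutdown_connections, "wait complete.") and opens a fresh one for the mgrmap/osdmap and the actual command. A minimal way to reproduce this verbosity for a single ad-hoc command is sketched below; the chosen subcommand is arbitrary and the debug override flag is an assumption, not something invoked in this log:

    ceph --debug-ms=1 status    # print client-side msgr connection states on stderr for this one invocation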
2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.435+0000 7f81eadfc640 1 Processor -- start 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.435+0000 7f81eadfc640 1 -- start start 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.435+0000 7f81eadfc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81e40718f0 0x7f81e41a59d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.435+0000 7f81eadfc640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f81e41a5f10 con 0x7f81e40718f0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.436+0000 7f81e8b71640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81e40718f0 0x7f81e41a59d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.436+0000 7f81e8b71640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81e40718f0 0x7f81e41a59d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48336/0 (socket says 192.168.123.103:48336) 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.436+0000 7f81e8b71640 1 -- 192.168.123.103:0/585038389 learned_addr learned my addr 192.168.123.103:0/585038389 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.436+0000 7f81e8b71640 1 -- 192.168.123.103:0/585038389 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f81dc00bdc0 con 0x7f81e40718f0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.436+0000 7f81e8b71640 1 --2- 192.168.123.103:0/585038389 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81e40718f0 0x7f81e41a59d0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f81dc00d2b0 tx=0x7f81dc038910 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.437+0000 7f81e1ffb640 1 -- 192.168.123.103:0/585038389 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f81dc047070 con 0x7f81e40718f0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.437+0000 7f81eadfc640 1 -- 192.168.123.103:0/585038389 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f81e41a6110 con 0x7f81e40718f0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.437+0000 7f81eadfc640 1 -- 192.168.123.103:0/585038389 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f81e41a2510 con 0x7f81e40718f0 2026-03-09T16:09:50.618 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.438+0000 7f81e1ffb640 1 -- 192.168.123.103:0/585038389 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f81dc009bc0 con 0x7f81e40718f0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.438+0000 7f81e1ffb640 1 -- 192.168.123.103:0/585038389 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f81dc043ea0 con 0x7f81e40718f0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.438+0000 7f81e1ffb640 1 -- 192.168.123.103:0/585038389 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f81dc03f030 con 0x7f81e40718f0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.438+0000 7f81e1ffb640 1 --2- 192.168.123.103:0/585038389 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f81c003d230 0x7f81c003f6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.439+0000 7f81e1ffb640 1 -- 192.168.123.103:0/585038389 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f81dc0771f0 con 0x7f81e40718f0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.439+0000 7f81e3fff640 1 --2- 192.168.123.103:0/585038389 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f81c003d230 0x7f81c003f6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.439+0000 7f81e3fff640 1 --2- 192.168.123.103:0/585038389 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f81c003d230 0x7f81c003f6f0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f81cc0099c0 tx=0x7f81cc006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.439+0000 7f81eadfc640 1 -- 192.168.123.103:0/585038389 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f81ac005350 con 0x7f81e40718f0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.443+0000 7f81e1ffb640 1 -- 192.168.123.103:0/585038389 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f81dc03d070 con 0x7f81e40718f0 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.560+0000 7f81eadfc640 1 -- 192.168.123.103:0/585038389 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}) v1 -- 0x7f81ac002bf0 con 0x7f81c003d230 2026-03-09T16:09:50.618 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.567+0000 7f81e1ffb640 1 -- 192.168.123.103:0/585038389 
<== mgr.14118 v2:192.168.123.103:6800/4285644309 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+26 (secure 0 0 0) 0x7f81ac002bf0 con 0x7f81c003d230 2026-03-09T16:09:50.619 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.570+0000 7f81eadfc640 1 -- 192.168.123.103:0/585038389 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f81c003d230 msgr2=0x7f81c003f6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:50.619 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.570+0000 7f81eadfc640 1 --2- 192.168.123.103:0/585038389 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f81c003d230 0x7f81c003f6f0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f81cc0099c0 tx=0x7f81cc006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.619 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.570+0000 7f81eadfc640 1 -- 192.168.123.103:0/585038389 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81e40718f0 msgr2=0x7f81e41a59d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:50.619 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.570+0000 7f81eadfc640 1 --2- 192.168.123.103:0/585038389 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81e40718f0 0x7f81e41a59d0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f81dc00d2b0 tx=0x7f81dc038910 comp rx=0 tx=0).stop 2026-03-09T16:09:50.619 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.570+0000 7f81eadfc640 1 -- 192.168.123.103:0/585038389 shutdown_connections 2026-03-09T16:09:50.619 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.570+0000 7f81eadfc640 1 --2- 192.168.123.103:0/585038389 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f81c003d230 0x7f81c003f6f0 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.619 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.570+0000 7f81eadfc640 1 --2- 192.168.123.103:0/585038389 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81e40718f0 0x7f81e41a59d0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.619 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.570+0000 7f81eadfc640 1 -- 192.168.123.103:0/585038389 >> 192.168.123.103:0/585038389 conn(0x7f81e406d080 msgr2=0x7f81e41993e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:50.619 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.570+0000 7f81eadfc640 1 -- 192.168.123.103:0/585038389 shutdown_connections 2026-03-09T16:09:50.619 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.570+0000 7f81eadfc640 1 -- 192.168.123.103:0/585038389 wait complete. 2026-03-09T16:09:50.619 INFO:teuthology.orchestra.run.vm03.stdout:Deploying ceph-exporter service with default placement... 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled ceph-exporter update... 
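The same apply pattern repeats for the crash and ceph-exporter services: "Deploying <service> service with default placement...", then a client that sends mgr_command {"prefix": "orch apply", "service_type": "crash"} (and, further below, "ceph-exporter") to mgr.14118, receives mgr_command_reply(tid 0: 0), and tears its connections down. As a sketch, the equivalent CLI calls under the same assumptions as above:

    ceph orch apply crash            # crash-dump collector, default placement
    ceph orch apply ceph-exporter    # per-host metrics exporter, default placement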
2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.771+0000 7f1ae11f5640 1 Processor -- start 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.772+0000 7f1ae11f5640 1 -- start start 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.772+0000 7f1ae11f5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1adc105f00 0x7f1adc106300 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.772+0000 7f1ae11f5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1adc106840 con 0x7f1adc105f00 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.773+0000 7f1adad76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1adc105f00 0x7f1adc106300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.773+0000 7f1adad76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1adc105f00 0x7f1adc106300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48348/0 (socket says 192.168.123.103:48348) 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.773+0000 7f1adad76640 1 -- 192.168.123.103:0/544042301 learned_addr learned my addr 192.168.123.103:0/544042301 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.773+0000 7f1adad76640 1 -- 192.168.123.103:0/544042301 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1adc106980 con 0x7f1adc105f00 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.774+0000 7f1adad76640 1 --2- 192.168.123.103:0/544042301 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1adc105f00 0x7f1adc106300 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f1ad0009b80 tx=0x7f1ad002f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=77d5ee031ca41908 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.774+0000 7f1ad9d74640 1 -- 192.168.123.103:0/544042301 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1ad002fa10 con 0x7f1adc105f00 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.774+0000 7f1ad9d74640 1 -- 192.168.123.103:0/544042301 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f1ad002fb70 con 0x7f1adc105f00 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.774+0000 7f1ad9d74640 1 -- 192.168.123.103:0/544042301 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1ad00355b0 con 0x7f1adc105f00 
2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.775+0000 7f1ae11f5640 1 -- 192.168.123.103:0/544042301 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1adc105f00 msgr2=0x7f1adc106300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.775+0000 7f1ae11f5640 1 --2- 192.168.123.103:0/544042301 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1adc105f00 0x7f1adc106300 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f1ad0009b80 tx=0x7f1ad002f190 comp rx=0 tx=0).stop 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.776+0000 7f1ae11f5640 1 -- 192.168.123.103:0/544042301 shutdown_connections 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.776+0000 7f1ae11f5640 1 --2- 192.168.123.103:0/544042301 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1adc105f00 0x7f1adc106300 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.776+0000 7f1ae11f5640 1 -- 192.168.123.103:0/544042301 >> 192.168.123.103:0/544042301 conn(0x7f1adc101740 msgr2=0x7f1adc103b60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.776+0000 7f1ae11f5640 1 -- 192.168.123.103:0/544042301 shutdown_connections 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.776+0000 7f1ae11f5640 1 -- 192.168.123.103:0/544042301 wait complete. 
2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.777+0000 7f1ae11f5640 1 Processor -- start 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.777+0000 7f1ae11f5640 1 -- start start 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.777+0000 7f1ae11f5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1adc105f00 0x7f1adc1a2420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.777+0000 7f1ae11f5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1adc1a2960 con 0x7f1adc105f00 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.777+0000 7f1adad76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1adc105f00 0x7f1adc1a2420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.777+0000 7f1adad76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1adc105f00 0x7f1adc1a2420 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48352/0 (socket says 192.168.123.103:48352) 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.777+0000 7f1adad76640 1 -- 192.168.123.103:0/3342252804 learned_addr learned my addr 192.168.123.103:0/3342252804 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.778+0000 7f1adad76640 1 -- 192.168.123.103:0/3342252804 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1ad00095d0 con 0x7f1adc105f00 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.778+0000 7f1adad76640 1 --2- 192.168.123.103:0/3342252804 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1adc105f00 0x7f1adc1a2420 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f1ad00357a0 tx=0x7f1ad0035e80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.779+0000 7f1abbfff640 1 -- 192.168.123.103:0/3342252804 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1ad0037500 con 0x7f1adc105f00 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.779+0000 7f1abbfff640 1 -- 192.168.123.103:0/3342252804 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f1ad0037b20 con 0x7f1adc105f00 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.779+0000 7f1abbfff640 1 -- 192.168.123.103:0/3342252804 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1ad0041e30 con 0x7f1adc105f00 2026-03-09T16:09:50.956 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.779+0000 7f1ae11f5640 1 -- 192.168.123.103:0/3342252804 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1adc1a2b60 con 0x7f1adc105f00 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.779+0000 7f1ae11f5640 1 -- 192.168.123.103:0/3342252804 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1adc1a3060 con 0x7f1adc105f00 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.780+0000 7f1abbfff640 1 -- 192.168.123.103:0/3342252804 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f1ad003e030 con 0x7f1adc105f00 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.781+0000 7f1abbfff640 1 --2- 192.168.123.103:0/3342252804 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f1ab003d230 0x7f1ab003f6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.781+0000 7f1abbfff640 1 -- 192.168.123.103:0/3342252804 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f1ad00756d0 con 0x7f1adc105f00 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.781+0000 7f1ada575640 1 --2- 192.168.123.103:0/3342252804 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f1ab003d230 0x7f1ab003f6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:50.956 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.781+0000 7f1ada575640 1 --2- 192.168.123.103:0/3342252804 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f1ab003d230 0x7f1ab003f6f0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f1ac4009a10 tx=0x7f1ac4006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.782+0000 7f1ae11f5640 1 -- 192.168.123.103:0/3342252804 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1aa0005350 con 0x7f1adc105f00 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.785+0000 7f1abbfff640 1 -- 192.168.123.103:0/3342252804 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f1ad003c030 con 0x7f1adc105f00 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.912+0000 7f1ae11f5640 1 -- 192.168.123.103:0/3342252804 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7f1aa0002bf0 con 0x7f1ab003d230 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.916+0000 7f1abbfff640 1 -- 
192.168.123.103:0/3342252804 <== mgr.14118 v2:192.168.123.103:6800/4285644309 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7f1aa0002bf0 con 0x7f1ab003d230 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.919+0000 7f1ab9ffb640 1 -- 192.168.123.103:0/3342252804 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f1ab003d230 msgr2=0x7f1ab003f6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.919+0000 7f1ab9ffb640 1 --2- 192.168.123.103:0/3342252804 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f1ab003d230 0x7f1ab003f6f0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f1ac4009a10 tx=0x7f1ac4006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.920+0000 7f1ab9ffb640 1 -- 192.168.123.103:0/3342252804 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1adc105f00 msgr2=0x7f1adc1a2420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.920+0000 7f1ab9ffb640 1 --2- 192.168.123.103:0/3342252804 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1adc105f00 0x7f1adc1a2420 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f1ad00357a0 tx=0x7f1ad0035e80 comp rx=0 tx=0).stop 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.920+0000 7f1ab9ffb640 1 -- 192.168.123.103:0/3342252804 shutdown_connections 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.920+0000 7f1ab9ffb640 1 --2- 192.168.123.103:0/3342252804 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f1ab003d230 0x7f1ab003f6f0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.920+0000 7f1ab9ffb640 1 --2- 192.168.123.103:0/3342252804 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1adc105f00 0x7f1adc1a2420 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.920+0000 7f1ab9ffb640 1 -- 192.168.123.103:0/3342252804 >> 192.168.123.103:0/3342252804 conn(0x7f1adc101740 msgr2=0x7f1adc1028a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.920+0000 7f1ab9ffb640 1 -- 192.168.123.103:0/3342252804 shutdown_connections 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:50.921+0000 7f1ab9ffb640 1 -- 192.168.123.103:0/3342252804 wait complete. 2026-03-09T16:09:50.957 INFO:teuthology.orchestra.run.vm03.stdout:Deploying prometheus service with default placement... 
2026-03-09T16:09:51.247 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:50 vm03 ceph-mon[51019]: Added host vm03 2026-03-09T16:09:51.247 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:50 vm03 ceph-mon[51019]: from='client.14140 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:09:51.247 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:50 vm03 ceph-mon[51019]: Saving service mon spec with placement count:5 2026-03-09T16:09:51.247 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:50 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:51.247 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:50 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:51.247 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:50 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:51.247 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:50 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:51.247 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:50 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:51.247 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:50 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:51.247 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:50 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:51.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled prometheus update... 
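The journalctl lines show the mon side of the same flow: client.admin dispatching the "orch apply mon" command, cephadm saving the mon spec with placement count:5, and a run of audited requests from mgr.vm03.gbgzmu, while bootstrap moves on to the monitoring stack ("Scheduled prometheus update..."). A minimal sketch for applying the prometheus spec and then inspecting what was saved; the `ceph orch ls --export` call is an assumption for illustration and is not invoked anywhere in this log:

    ceph orch apply prometheus    # the apply behind "Scheduled prometheus update..." above
    ceph orch ls --export         # dump the saved service specs (the mon spec should show count: 5)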
2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.124+0000 7fb5cf3df640 1 Processor -- start 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.124+0000 7fb5cf3df640 1 -- start start 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.124+0000 7fb5cf3df640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb5c8105f30 0x7fb5c8106330 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.124+0000 7fb5cf3df640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb5c8106870 con 0x7fb5c8105f30 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.125+0000 7fb5cd154640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb5c8105f30 0x7fb5c8106330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.125+0000 7fb5cd154640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb5c8105f30 0x7fb5c8106330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48366/0 (socket says 192.168.123.103:48366) 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.125+0000 7fb5cd154640 1 -- 192.168.123.103:0/4008201244 learned_addr learned my addr 192.168.123.103:0/4008201244 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.125+0000 7fb5cd154640 1 -- 192.168.123.103:0/4008201244 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb5c81069b0 con 0x7fb5c8105f30 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.126+0000 7fb5cd154640 1 --2- 192.168.123.103:0/4008201244 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb5c8105f30 0x7fb5c8106330 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fb5b8009920 tx=0x7fb5b802ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=dce6f9110c3b24fe server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.126+0000 7fb5bffff640 1 -- 192.168.123.103:0/4008201244 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb5b802f9b0 con 0x7fb5c8105f30 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.126+0000 7fb5bffff640 1 -- 192.168.123.103:0/4008201244 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb5b8037440 con 0x7fb5c8105f30 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.127+0000 7fb5cf3df640 1 -- 192.168.123.103:0/4008201244 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb5c8105f30 msgr2=0x7fb5c8106330 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.127+0000 7fb5cf3df640 1 --2- 192.168.123.103:0/4008201244 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb5c8105f30 0x7fb5c8106330 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fb5b8009920 tx=0x7fb5b802ef20 comp rx=0 tx=0).stop 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.127+0000 7fb5cf3df640 1 -- 192.168.123.103:0/4008201244 shutdown_connections 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.127+0000 7fb5cf3df640 1 --2- 192.168.123.103:0/4008201244 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb5c8105f30 0x7fb5c8106330 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.127+0000 7fb5cf3df640 1 -- 192.168.123.103:0/4008201244 >> 192.168.123.103:0/4008201244 conn(0x7fb5c8101700 msgr2=0x7fb5c8103b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.127+0000 7fb5cf3df640 1 -- 192.168.123.103:0/4008201244 shutdown_connections 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.127+0000 7fb5cf3df640 1 -- 192.168.123.103:0/4008201244 wait complete. 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.127+0000 7fb5cf3df640 1 Processor -- start 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.127+0000 7fb5cf3df640 1 -- start start 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.128+0000 7fb5cf3df640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb5c8105f30 0x7fb5c807bbc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.128+0000 7fb5cf3df640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb5b80353c0 con 0x7fb5c8105f30 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.128+0000 7fb5cd154640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb5c8105f30 0x7fb5c807bbc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.128+0000 7fb5cd154640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb5c8105f30 0x7fb5c807bbc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48382/0 (socket says 192.168.123.103:48382) 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.128+0000 7fb5cd154640 1 -- 192.168.123.103:0/2649187824 learned_addr learned my addr 192.168.123.103:0/2649187824 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:51.295 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.128+0000 7fb5cd154640 1 -- 192.168.123.103:0/2649187824 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb5b80095d0 con 0x7fb5c8105f30 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.128+0000 7fb5cd154640 1 --2- 192.168.123.103:0/2649187824 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb5c8105f30 0x7fb5c807bbc0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fb5b8035dc0 tx=0x7fb5b8035df0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.128+0000 7fb5be7fc640 1 -- 192.168.123.103:0/2649187824 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb5b8037710 con 0x7fb5c8105f30 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.128+0000 7fb5be7fc640 1 -- 192.168.123.103:0/2649187824 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb5b8037d30 con 0x7fb5c8105f30 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.128+0000 7fb5cf3df640 1 -- 192.168.123.103:0/2649187824 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb5c807c100 con 0x7fb5c8105f30 2026-03-09T16:09:51.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.128+0000 7fb5be7fc640 1 -- 192.168.123.103:0/2649187824 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb5b803f570 con 0x7fb5c8105f30 2026-03-09T16:09:51.296 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.128+0000 7fb5cf3df640 1 -- 192.168.123.103:0/2649187824 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb5c807a540 con 0x7fb5c8105f30 2026-03-09T16:09:51.296 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.129+0000 7fb5cf3df640 1 -- 192.168.123.103:0/2649187824 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb590005350 con 0x7fb5c8105f30 2026-03-09T16:09:51.296 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.132+0000 7fb5be7fc640 1 -- 192.168.123.103:0/2649187824 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 49370+0+0 (secure 0 0 0) 0x7fb5b803e050 con 0x7fb5c8105f30 2026-03-09T16:09:51.296 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.132+0000 7fb5be7fc640 1 --2- 192.168.123.103:0/2649187824 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fb5a003d230 0x7fb5a003f6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.132+0000 7fb5be7fc640 1 -- 192.168.123.103:0/2649187824 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fb5b8076240 con 0x7fb5c8105f30 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.132+0000 7fb5be7fc640 1 
-- 192.168.123.103:0/2649187824 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fb5b8048d60 con 0x7fb5c8105f30 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.133+0000 7fb5cc953640 1 --2- 192.168.123.103:0/2649187824 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fb5a003d230 0x7fb5a003f6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.133+0000 7fb5cc953640 1 --2- 192.168.123.103:0/2649187824 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fb5a003d230 0x7fb5a003f6f0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fb5b00099c0 tx=0x7fb5b0006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.242+0000 7fb5cf3df640 1 -- 192.168.123.103:0/2649187824 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}) v1 -- 0x7fb590002bf0 con 0x7fb5a003d230 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.250+0000 7fb5be7fc640 1 -- 192.168.123.103:0/2649187824 <== mgr.14118 v2:192.168.123.103:6800/4285644309 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+31 (secure 0 0 0) 0x7fb590002bf0 con 0x7fb5a003d230 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.252+0000 7fb5cf3df640 1 -- 192.168.123.103:0/2649187824 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fb5a003d230 msgr2=0x7fb5a003f6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.252+0000 7fb5cf3df640 1 --2- 192.168.123.103:0/2649187824 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fb5a003d230 0x7fb5a003f6f0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fb5b00099c0 tx=0x7fb5b0006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.252+0000 7fb5cf3df640 1 -- 192.168.123.103:0/2649187824 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb5c8105f30 msgr2=0x7fb5c807bbc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.252+0000 7fb5cf3df640 1 --2- 192.168.123.103:0/2649187824 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb5c8105f30 0x7fb5c807bbc0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fb5b8035dc0 tx=0x7fb5b8035df0 comp rx=0 tx=0).stop 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.252+0000 7fb5cf3df640 1 -- 192.168.123.103:0/2649187824 shutdown_connections 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.252+0000 7fb5cf3df640 1 --2- 192.168.123.103:0/2649187824 >> 
[v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fb5a003d230 0x7fb5a003f6f0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.252+0000 7fb5cf3df640 1 --2- 192.168.123.103:0/2649187824 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb5c8105f30 0x7fb5c807bbc0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.252+0000 7fb5cf3df640 1 -- 192.168.123.103:0/2649187824 >> 192.168.123.103:0/2649187824 conn(0x7fb5c8101700 msgr2=0x7fb5c81022b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.253+0000 7fb5cf3df640 1 -- 192.168.123.103:0/2649187824 shutdown_connections 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.253+0000 7fb5cf3df640 1 -- 192.168.123.103:0/2649187824 wait complete. 2026-03-09T16:09:51.297 INFO:teuthology.orchestra.run.vm03.stdout:Deploying grafana service with default placement... 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled grafana update... 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.439+0000 7f00d3fff640 1 Processor -- start 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.439+0000 7f00d3fff640 1 -- start start 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.439+0000 7f00d3fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc0054e0 0x7f00cc0058e0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.439+0000 7f00d3fff640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f00cc005eb0 con 0x7f00cc0054e0 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.440+0000 7f00d2ffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc0054e0 0x7f00cc0058e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.440+0000 7f00d2ffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc0054e0 0x7f00cc0058e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48394/0 (socket says 192.168.123.103:48394) 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.440+0000 7f00d2ffd640 1 -- 192.168.123.103:0/913857997 learned_addr learned my addr 192.168.123.103:0/913857997 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.440+0000 7f00d2ffd640 1 -- 192.168.123.103:0/913857997 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f00cc006730 con 0x7f00cc0054e0 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.440+0000 7f00d2ffd640 1 --2- 192.168.123.103:0/913857997 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc0054e0 0x7f00cc0058e0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f00c8009b80 tx=0x7f00c802f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=720201737b544313 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.441+0000 7f00d1ffb640 1 -- 192.168.123.103:0/913857997 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f00c802fe70 con 0x7f00cc0054e0 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.441+0000 7f00d1ffb640 1 -- 192.168.123.103:0/913857997 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f00c8037900 con 0x7f00cc0054e0 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.441+0000 7f00d3fff640 1 -- 192.168.123.103:0/913857997 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc0054e0 msgr2=0x7f00cc0058e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.441+0000 7f00d3fff640 1 --2- 192.168.123.103:0/913857997 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc0054e0 0x7f00cc0058e0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f00c8009b80 tx=0x7f00c802f190 comp rx=0 tx=0).stop 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.441+0000 7f00d3fff640 1 -- 192.168.123.103:0/913857997 shutdown_connections 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.441+0000 7f00d3fff640 1 --2- 192.168.123.103:0/913857997 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc0054e0 0x7f00cc0058e0 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:51.613 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.441+0000 7f00d3fff640 1 -- 192.168.123.103:0/913857997 >> 192.168.123.103:0/913857997 conn(0x7f00cc09fbe0 msgr2=0x7f00cc0a2040 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.441+0000 7f00d3fff640 1 -- 192.168.123.103:0/913857997 shutdown_connections 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.442+0000 7f00d3fff640 1 -- 192.168.123.103:0/913857997 wait complete. 
2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.442+0000 7f00d3fff640 1 Processor -- start 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.442+0000 7f00d3fff640 1 -- start start 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.442+0000 7f00d3fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc015ee0 0x7f00cc014650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.442+0000 7f00d3fff640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f00c8035870 con 0x7f00cc015ee0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.442+0000 7f00d2ffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc015ee0 0x7f00cc014650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.442+0000 7f00d2ffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc015ee0 0x7f00cc014650 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48396/0 (socket says 192.168.123.103:48396) 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.442+0000 7f00d2ffd640 1 -- 192.168.123.103:0/946661306 learned_addr learned my addr 192.168.123.103:0/946661306 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.443+0000 7f00d2ffd640 1 -- 192.168.123.103:0/946661306 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f00c80095d0 con 0x7f00cc015ee0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.443+0000 7f00d2ffd640 1 --2- 192.168.123.103:0/946661306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc015ee0 0x7f00cc014650 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f00c802f6c0 tx=0x7f00c8004040 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.444+0000 7f00b3fff640 1 -- 192.168.123.103:0/946661306 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f00c80040b0 con 0x7f00cc015ee0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.444+0000 7f00d3fff640 1 -- 192.168.123.103:0/946661306 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f00cc014b90 con 0x7f00cc015ee0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.445+0000 7f00d3fff640 1 -- 192.168.123.103:0/946661306 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f00cc015030 con 0x7f00cc015ee0 2026-03-09T16:09:51.614 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.445+0000 7f00b3fff640 1 -- 192.168.123.103:0/946661306 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f00c8003710 con 0x7f00cc015ee0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.445+0000 7f00b3fff640 1 -- 192.168.123.103:0/946661306 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f00c80377e0 con 0x7f00cc015ee0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.446+0000 7f00b3fff640 1 -- 192.168.123.103:0/946661306 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f00c803e070 con 0x7f00cc015ee0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.446+0000 7f00b3fff640 1 --2- 192.168.123.103:0/946661306 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f00b403d1e0 0x7f00b403f6a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.446+0000 7f00b3fff640 1 -- 192.168.123.103:0/946661306 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f00c8079630 con 0x7f00cc015ee0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.446+0000 7f00d27fc640 1 --2- 192.168.123.103:0/946661306 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f00b403d1e0 0x7f00b403f6a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.447+0000 7f00d27fc640 1 --2- 192.168.123.103:0/946661306 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f00b403d1e0 0x7f00b403f6a0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f00c400ad30 tx=0x7f00c40093f0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.448+0000 7f00d3fff640 1 -- 192.168.123.103:0/946661306 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f00cc00a170 con 0x7f00cc015ee0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.451+0000 7f00b3fff640 1 -- 192.168.123.103:0/946661306 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f00c803c070 con 0x7f00cc015ee0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.571+0000 7f00d3fff640 1 -- 192.168.123.103:0/946661306 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}) v1 -- 0x7f00cc009c00 con 0x7f00b403d1e0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.581+0000 7f00b3fff640 1 -- 
192.168.123.103:0/946661306 <== mgr.14118 v2:192.168.123.103:6800/4285644309 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+28 (secure 0 0 0) 0x7f00cc009c00 con 0x7f00b403d1e0 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.583+0000 7f00d3fff640 1 -- 192.168.123.103:0/946661306 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f00b403d1e0 msgr2=0x7f00b403f6a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.583+0000 7f00d3fff640 1 --2- 192.168.123.103:0/946661306 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f00b403d1e0 0x7f00b403f6a0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f00c400ad30 tx=0x7f00c40093f0 comp rx=0 tx=0).stop 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.583+0000 7f00d3fff640 1 -- 192.168.123.103:0/946661306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc015ee0 msgr2=0x7f00cc014650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.583+0000 7f00d3fff640 1 --2- 192.168.123.103:0/946661306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc015ee0 0x7f00cc014650 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f00c802f6c0 tx=0x7f00c8004040 comp rx=0 tx=0).stop 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.584+0000 7f00d3fff640 1 -- 192.168.123.103:0/946661306 shutdown_connections 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.584+0000 7f00d3fff640 1 --2- 192.168.123.103:0/946661306 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f00b403d1e0 0x7f00b403f6a0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.584+0000 7f00d3fff640 1 --2- 192.168.123.103:0/946661306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc015ee0 0x7f00cc014650 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.584+0000 7f00d3fff640 1 -- 192.168.123.103:0/946661306 >> 192.168.123.103:0/946661306 conn(0x7f00cc09fbe0 msgr2=0x7f00cc0a03d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.584+0000 7f00d3fff640 1 -- 192.168.123.103:0/946661306 shutdown_connections 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.584+0000 7f00d3fff640 1 -- 192.168.123.103:0/946661306 wait complete. 2026-03-09T16:09:51.614 INFO:teuthology.orchestra.run.vm03.stdout:Deploying node-exporter service with default placement... 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled node-exporter update... 
2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.777+0000 7f051cd66640 1 Processor -- start 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.778+0000 7f051cd66640 1 -- start start 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.778+0000 7f051cd66640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0518071c80 0x7f0518072080 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.778+0000 7f051cd66640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f05180725c0 con 0x7f0518071c80 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.778+0000 7f0516575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0518071c80 0x7f0518072080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.778+0000 7f0516575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0518071c80 0x7f0518072080 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48404/0 (socket says 192.168.123.103:48404) 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.778+0000 7f0516575640 1 -- 192.168.123.103:0/368159891 learned_addr learned my addr 192.168.123.103:0/368159891 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.780+0000 7f0516575640 1 -- 192.168.123.103:0/368159891 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0518072700 con 0x7f0518071c80 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.780+0000 7f0516575640 1 --2- 192.168.123.103:0/368159891 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0518071c80 0x7f0518072080 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f050c0073f0 tx=0x7f050c031120 comp rx=0 tx=0).ready entity=mon.0 client_cookie=dee5cb6f058a51dc server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.781+0000 7f0515573640 1 -- 192.168.123.103:0/368159891 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f050c008d10 con 0x7f0518071c80 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.781+0000 7f0515573640 1 -- 192.168.123.103:0/368159891 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f050c008e70 con 0x7f0518071c80 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.781+0000 7f0515573640 1 -- 192.168.123.103:0/368159891 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f050c0383a0 con 0x7f0518071c80 
2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.781+0000 7f051cd66640 1 -- 192.168.123.103:0/368159891 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0518071c80 msgr2=0x7f0518072080 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.781+0000 7f051cd66640 1 --2- 192.168.123.103:0/368159891 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0518071c80 0x7f0518072080 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f050c0073f0 tx=0x7f050c031120 comp rx=0 tx=0).stop 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.782+0000 7f051cd66640 1 -- 192.168.123.103:0/368159891 shutdown_connections 2026-03-09T16:09:52.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.782+0000 7f051cd66640 1 --2- 192.168.123.103:0/368159891 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0518071c80 0x7f0518072080 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.782+0000 7f051cd66640 1 -- 192.168.123.103:0/368159891 >> 192.168.123.103:0/368159891 conn(0x7f051806d2a0 msgr2=0x7f051806f6e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.782+0000 7f051cd66640 1 -- 192.168.123.103:0/368159891 shutdown_connections 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.782+0000 7f051cd66640 1 -- 192.168.123.103:0/368159891 wait complete. 
2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.782+0000 7f051cd66640 1 Processor -- start 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.782+0000 7f051cd66640 1 -- start start 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.783+0000 7f051cd66640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0518071c80 0x7f05181aafb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.783+0000 7f051cd66640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f05181ab4f0 con 0x7f0518071c80 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.783+0000 7f0516575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0518071c80 0x7f05181aafb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.783+0000 7f0516575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0518071c80 0x7f05181aafb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48416/0 (socket says 192.168.123.103:48416) 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.783+0000 7f0516575640 1 -- 192.168.123.103:0/2403114569 learned_addr learned my addr 192.168.123.103:0/2403114569 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.783+0000 7f0516575640 1 -- 192.168.123.103:0/2403114569 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f050c0070a0 con 0x7f0518071c80 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.783+0000 7f0516575640 1 --2- 192.168.123.103:0/2403114569 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0518071c80 0x7f05181aafb0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f050c00a820 tx=0x7f050c038900 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.784+0000 7f05077fe640 1 -- 192.168.123.103:0/2403114569 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f050c00f150 con 0x7f0518071c80 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.784+0000 7f051cd66640 1 -- 192.168.123.103:0/2403114569 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f05181ab6f0 con 0x7f0518071c80 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.784+0000 7f051cd66640 1 -- 192.168.123.103:0/2403114569 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f05181abb90 con 0x7f0518071c80 2026-03-09T16:09:52.005 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.784+0000 7f05077fe640 1 -- 192.168.123.103:0/2403114569 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f050c031de0 con 0x7f0518071c80 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.784+0000 7f05077fe640 1 -- 192.168.123.103:0/2403114569 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f050c042b70 con 0x7f0518071c80 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.785+0000 7f051cd66640 1 -- 192.168.123.103:0/2403114569 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0518072100 con 0x7f0518071c80 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.786+0000 7f05077fe640 1 -- 192.168.123.103:0/2403114569 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f050c031920 con 0x7f0518071c80 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.789+0000 7f05077fe640 1 --2- 192.168.123.103:0/2403114569 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f04e803d190 0x7f04e803f650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.793+0000 7f05077fe640 1 -- 192.168.123.103:0/2403114569 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f050c07e340 con 0x7f0518071c80 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.793+0000 7f05077fe640 1 -- 192.168.123.103:0/2403114569 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f050c07b050 con 0x7f0518071c80 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.793+0000 7f0515d74640 1 --2- 192.168.123.103:0/2403114569 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f04e803d190 0x7f04e803f650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.794+0000 7f0515d74640 1 --2- 192.168.123.103:0/2403114569 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f04e803d190 0x7f04e803f650 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f05000099c0 tx=0x7f0500006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.917+0000 7f051cd66640 1 -- 192.168.123.103:0/2403114569 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7f051806f490 con 0x7f04e803d190 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.952+0000 7f05077fe640 1 -- 
192.168.123.103:0/2403114569 <== mgr.14118 v2:192.168.123.103:6800/4285644309 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7f051806f490 con 0x7f04e803d190 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.955+0000 7f051cd66640 1 -- 192.168.123.103:0/2403114569 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f04e803d190 msgr2=0x7f04e803f650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.955+0000 7f051cd66640 1 --2- 192.168.123.103:0/2403114569 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f04e803d190 0x7f04e803f650 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f05000099c0 tx=0x7f0500006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.955+0000 7f051cd66640 1 -- 192.168.123.103:0/2403114569 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0518071c80 msgr2=0x7f05181aafb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.955+0000 7f051cd66640 1 --2- 192.168.123.103:0/2403114569 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0518071c80 0x7f05181aafb0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f050c00a820 tx=0x7f050c038900 comp rx=0 tx=0).stop 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.956+0000 7f051cd66640 1 -- 192.168.123.103:0/2403114569 shutdown_connections 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.956+0000 7f051cd66640 1 --2- 192.168.123.103:0/2403114569 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f04e803d190 0x7f04e803f650 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.956+0000 7f051cd66640 1 --2- 192.168.123.103:0/2403114569 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0518071c80 0x7f05181aafb0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.956+0000 7f051cd66640 1 -- 192.168.123.103:0/2403114569 >> 192.168.123.103:0/2403114569 conn(0x7f051806d2a0 msgr2=0x7f051806ed80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.956+0000 7f051cd66640 1 -- 192.168.123.103:0/2403114569 shutdown_connections 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:51.956+0000 7f051cd66640 1 -- 192.168.123.103:0/2403114569 wait complete. 2026-03-09T16:09:52.005 INFO:teuthology.orchestra.run.vm03.stdout:Deploying alertmanager service with default placement... 
2026-03-09T16:09:52.120 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:51 vm03 ceph-mon[51019]: from='client.14142 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:09:52.120 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:51 vm03 ceph-mon[51019]: Saving service mgr spec with placement count:2 2026-03-09T16:09:52.120 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:51 vm03 ceph-mon[51019]: from='client.14144 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:09:52.120 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:51 vm03 ceph-mon[51019]: Saving service crash spec with placement * 2026-03-09T16:09:52.120 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:51 vm03 ceph-mon[51019]: from='client.14146 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:09:52.120 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:51 vm03 ceph-mon[51019]: Saving service ceph-exporter spec with placement * 2026-03-09T16:09:52.120 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:51 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:52.121 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:51 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:52.121 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:51 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:52.121 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:51 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:52.290 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled alertmanager update... 
2026-03-09T16:09:52.290 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.123+0000 7fd745d42640 1 Processor -- start 2026-03-09T16:09:52.290 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.124+0000 7fd745d42640 1 -- start start 2026-03-09T16:09:52.290 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.124+0000 7fd745d42640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7401082f0 0x7fd7401086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:52.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.124+0000 7fd745d42640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd740108cc0 con 0x7fd7401082f0 2026-03-09T16:09:52.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.124+0000 7fd73f7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7401082f0 0x7fd7401086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:52.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.124+0000 7fd73f7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7401082f0 0x7fd7401086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48424/0 (socket says 192.168.123.103:48424) 2026-03-09T16:09:52.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.124+0000 7fd73f7fe640 1 -- 192.168.123.103:0/2593158168 learned_addr learned my addr 192.168.123.103:0/2593158168 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:52.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.125+0000 7fd73f7fe640 1 -- 192.168.123.103:0/2593158168 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd740109490 con 0x7fd7401082f0 2026-03-09T16:09:52.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.125+0000 7fd73f7fe640 1 --2- 192.168.123.103:0/2593158168 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7401082f0 0x7fd7401086f0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fd72c009920 tx=0x7fd72c02ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=668722d3c9e3fd0c server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:52.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.126+0000 7fd73e7fc640 1 -- 192.168.123.103:0/2593158168 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd72c02f9b0 con 0x7fd7401082f0 2026-03-09T16:09:52.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.126+0000 7fd73e7fc640 1 -- 192.168.123.103:0/2593158168 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd72c037440 con 0x7fd7401082f0 2026-03-09T16:09:52.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.126+0000 7fd745d42640 1 -- 192.168.123.103:0/2593158168 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7401082f0 msgr2=0x7fd7401086f0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:52.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.126+0000 7fd745d42640 1 --2- 192.168.123.103:0/2593158168 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7401082f0 0x7fd7401086f0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fd72c009920 tx=0x7fd72c02ef20 comp rx=0 tx=0).stop 2026-03-09T16:09:52.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.126+0000 7fd745d42640 1 -- 192.168.123.103:0/2593158168 shutdown_connections 2026-03-09T16:09:52.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.126+0000 7fd745d42640 1 --2- 192.168.123.103:0/2593158168 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7401082f0 0x7fd7401086f0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.126+0000 7fd745d42640 1 -- 192.168.123.103:0/2593158168 >> 192.168.123.103:0/2593158168 conn(0x7fd74007b8f0 msgr2=0x7fd7401066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:52.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.127+0000 7fd745d42640 1 -- 192.168.123.103:0/2593158168 shutdown_connections 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.127+0000 7fd745d42640 1 -- 192.168.123.103:0/2593158168 wait complete. 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.127+0000 7fd745d42640 1 Processor -- start 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.127+0000 7fd745d42640 1 -- start start 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.128+0000 7fd745d42640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7401082f0 0x7fd74019e3d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.128+0000 7fd745d42640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd72c0353c0 con 0x7fd7401082f0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.128+0000 7fd73f7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7401082f0 0x7fd74019e3d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.128+0000 7fd73f7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7401082f0 0x7fd74019e3d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48430/0 (socket says 192.168.123.103:48430) 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.128+0000 7fd73f7fe640 1 -- 192.168.123.103:0/2732946377 learned_addr learned my addr 192.168.123.103:0/2732946377 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:52.292 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.128+0000 7fd73f7fe640 1 -- 192.168.123.103:0/2732946377 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd72c0095d0 con 0x7fd7401082f0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.128+0000 7fd73f7fe640 1 --2- 192.168.123.103:0/2732946377 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7401082f0 0x7fd74019e3d0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fd72c02f4d0 tx=0x7fd72c035dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.129+0000 7fd73cff9640 1 -- 192.168.123.103:0/2732946377 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd72c037710 con 0x7fd7401082f0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.129+0000 7fd73cff9640 1 -- 192.168.123.103:0/2732946377 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd72c037d30 con 0x7fd7401082f0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.129+0000 7fd73cff9640 1 -- 192.168.123.103:0/2732946377 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd72c03f3d0 con 0x7fd7401082f0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.129+0000 7fd745d42640 1 -- 192.168.123.103:0/2732946377 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd74019e910 con 0x7fd7401082f0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.129+0000 7fd745d42640 1 -- 192.168.123.103:0/2732946377 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd74019edb0 con 0x7fd7401082f0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.130+0000 7fd73cff9640 1 -- 192.168.123.103:0/2732946377 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 49370+0+0 (secure 0 0 0) 0x7fd72c03e050 con 0x7fd7401082f0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.131+0000 7fd745d42640 1 -- 192.168.123.103:0/2732946377 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd74010cde0 con 0x7fd7401082f0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.131+0000 7fd73cff9640 1 --2- 192.168.123.103:0/2732946377 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fd71803d230 0x7fd71803f6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.134+0000 7fd73cff9640 1 -- 192.168.123.103:0/2732946377 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fd72c0768d0 con 0x7fd7401082f0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.134+0000 7fd73effd640 1 
--2- 192.168.123.103:0/2732946377 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fd71803d230 0x7fd71803f6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.134+0000 7fd73effd640 1 --2- 192.168.123.103:0/2732946377 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fd71803d230 0x7fd71803f6f0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fd7300099c0 tx=0x7fd730006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.135+0000 7fd73cff9640 1 -- 192.168.123.103:0/2732946377 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fd72c04a460 con 0x7fd7401082f0 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.235+0000 7fd745d42640 1 -- 192.168.123.103:0/2732946377 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}) v1 -- 0x7fd740106560 con 0x7fd71803d230 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.240+0000 7fd73cff9640 1 -- 192.168.123.103:0/2732946377 <== mgr.14118 v2:192.168.123.103:6800/4285644309 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+33 (secure 0 0 0) 0x7fd740106560 con 0x7fd71803d230 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.243+0000 7fd745d42640 1 -- 192.168.123.103:0/2732946377 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fd71803d230 msgr2=0x7fd71803f6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.243+0000 7fd745d42640 1 --2- 192.168.123.103:0/2732946377 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fd71803d230 0x7fd71803f6f0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fd7300099c0 tx=0x7fd730006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.243+0000 7fd745d42640 1 -- 192.168.123.103:0/2732946377 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7401082f0 msgr2=0x7fd74019e3d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.243+0000 7fd745d42640 1 --2- 192.168.123.103:0/2732946377 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7401082f0 0x7fd74019e3d0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fd72c02f4d0 tx=0x7fd72c035dc0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.244+0000 7fd745d42640 1 -- 192.168.123.103:0/2732946377 shutdown_connections 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.244+0000 7fd745d42640 1 --2- 192.168.123.103:0/2732946377 >> 
[v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fd71803d230 0x7fd71803f6f0 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.244+0000 7fd745d42640 1 --2- 192.168.123.103:0/2732946377 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7401082f0 0x7fd74019e3d0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.244+0000 7fd745d42640 1 -- 192.168.123.103:0/2732946377 >> 192.168.123.103:0/2732946377 conn(0x7fd74007b8f0 msgr2=0x7fd740105e50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:52.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.244+0000 7fd745d42640 1 -- 192.168.123.103:0/2732946377 shutdown_connections 2026-03-09T16:09:52.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.244+0000 7fd745d42640 1 -- 192.168.123.103:0/2732946377 wait complete. 2026-03-09T16:09:52.582 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.416+0000 7f214d21f640 1 Processor -- start 2026-03-09T16:09:52.582 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.417+0000 7f214d21f640 1 -- start start 2026-03-09T16:09:52.582 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.417+0000 7f214d21f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21481082f0 0x7f21481086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:52.582 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.417+0000 7f214d21f640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2148108cc0 con 0x7f21481082f0 2026-03-09T16:09:52.582 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.417+0000 7f2146d76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21481082f0 0x7f21481086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.417+0000 7f2146d76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21481082f0 0x7f21481086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48436/0 (socket says 192.168.123.103:48436) 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.417+0000 7f2146d76640 1 -- 192.168.123.103:0/2881541984 learned_addr learned my addr 192.168.123.103:0/2881541984 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.418+0000 7f2146d76640 1 -- 192.168.123.103:0/2881541984 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f21481094a0 con 0x7f21481082f0 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.418+0000 7f2146d76640 1 --2- 
192.168.123.103:0/2881541984 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21481082f0 0x7f21481086f0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f2130009920 tx=0x7f213002ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=2c89a777d07d81ac server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.418+0000 7f2145d74640 1 -- 192.168.123.103:0/2881541984 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f213002f9b0 con 0x7f21481082f0 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.418+0000 7f2145d74640 1 -- 192.168.123.103:0/2881541984 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f2130037440 con 0x7f21481082f0 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.419+0000 7f214d21f640 1 -- 192.168.123.103:0/2881541984 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21481082f0 msgr2=0x7f21481086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.419+0000 7f214d21f640 1 --2- 192.168.123.103:0/2881541984 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21481082f0 0x7f21481086f0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f2130009920 tx=0x7f213002ef20 comp rx=0 tx=0).stop 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.419+0000 7f214d21f640 1 -- 192.168.123.103:0/2881541984 shutdown_connections 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.419+0000 7f214d21f640 1 --2- 192.168.123.103:0/2881541984 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21481082f0 0x7f21481086f0 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.419+0000 7f214d21f640 1 -- 192.168.123.103:0/2881541984 >> 192.168.123.103:0/2881541984 conn(0x7f214807ba00 msgr2=0x7f21481066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.419+0000 7f214d21f640 1 -- 192.168.123.103:0/2881541984 shutdown_connections 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.419+0000 7f214d21f640 1 -- 192.168.123.103:0/2881541984 wait complete. 
2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.420+0000 7f214d21f640 1 Processor -- start 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.420+0000 7f214d21f640 1 -- start start 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.420+0000 7f214d21f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21481082f0 0x7f214819e390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.420+0000 7f214d21f640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f21300353c0 con 0x7f21481082f0 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.420+0000 7f2146d76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21481082f0 0x7f214819e390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.420+0000 7f2146d76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21481082f0 0x7f214819e390 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48446/0 (socket says 192.168.123.103:48446) 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.420+0000 7f2146d76640 1 -- 192.168.123.103:0/1418113968 learned_addr learned my addr 192.168.123.103:0/1418113968 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.421+0000 7f2146d76640 1 -- 192.168.123.103:0/1418113968 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f21300095d0 con 0x7f21481082f0 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.421+0000 7f2146d76640 1 --2- 192.168.123.103:0/1418113968 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21481082f0 0x7f214819e390 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f213002f4d0 tx=0x7f2130035dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:52.583 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.421+0000 7f2127fff640 1 -- 192.168.123.103:0/1418113968 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2130037710 con 0x7f21481082f0 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.421+0000 7f2127fff640 1 -- 192.168.123.103:0/1418113968 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f2130037d30 con 0x7f21481082f0 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.421+0000 7f214d21f640 1 -- 192.168.123.103:0/1418113968 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f214819e8d0 con 0x7f21481082f0 2026-03-09T16:09:52.584 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.421+0000 7f2127fff640 1 -- 192.168.123.103:0/1418113968 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f213003f3d0 con 0x7f21481082f0 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.422+0000 7f214d21f640 1 -- 192.168.123.103:0/1418113968 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f214819ed70 con 0x7f21481082f0 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.422+0000 7f2127fff640 1 -- 192.168.123.103:0/1418113968 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f213003e050 con 0x7f21481082f0 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.422+0000 7f2127fff640 1 --2- 192.168.123.103:0/1418113968 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f211c03d230 0x7f211c03f6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.423+0000 7f2127fff640 1 -- 192.168.123.103:0/1418113968 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f21300768a0 con 0x7f21481082f0 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.423+0000 7f214d21f640 1 -- 192.168.123.103:0/1418113968 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f210c005350 con 0x7f21481082f0 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.423+0000 7f2146575640 1 --2- 192.168.123.103:0/1418113968 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f211c03d230 0x7f211c03f6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.423+0000 7f2146575640 1 --2- 192.168.123.103:0/1418113968 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f211c03d230 0x7f211c03f6f0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f213c009a10 tx=0x7f213c006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.427+0000 7f2127fff640 1 -- 192.168.123.103:0/1418113968 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f213003c070 con 0x7f21481082f0 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.522+0000 7f214d21f640 1 -- 192.168.123.103:0/1418113968 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/container_init}] v 0) v1 -- 0x7f210c0051c0 con 0x7f21481082f0 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.527+0000 7f2127fff640 1 -- 192.168.123.103:0/1418113968 <== mon.0 
v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/container_init}]=0 v7) v1 ==== 142+0+0 (secure 0 0 0) 0x7f2130036b80 con 0x7f21481082f0 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.532+0000 7f214d21f640 1 -- 192.168.123.103:0/1418113968 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f211c03d230 msgr2=0x7f211c03f6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.533+0000 7f214d21f640 1 --2- 192.168.123.103:0/1418113968 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f211c03d230 0x7f211c03f6f0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f213c009a10 tx=0x7f213c006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.533+0000 7f214d21f640 1 -- 192.168.123.103:0/1418113968 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21481082f0 msgr2=0x7f214819e390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.533+0000 7f214d21f640 1 --2- 192.168.123.103:0/1418113968 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21481082f0 0x7f214819e390 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f213002f4d0 tx=0x7f2130035dc0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.533+0000 7f214d21f640 1 -- 192.168.123.103:0/1418113968 shutdown_connections 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.533+0000 7f214d21f640 1 --2- 192.168.123.103:0/1418113968 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f211c03d230 0x7f211c03f6f0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.533+0000 7f214d21f640 1 --2- 192.168.123.103:0/1418113968 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21481082f0 0x7f214819e390 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.533+0000 7f214d21f640 1 -- 192.168.123.103:0/1418113968 >> 192.168.123.103:0/1418113968 conn(0x7f214807ba00 msgr2=0x7f2148105e10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.533+0000 7f214d21f640 1 -- 192.168.123.103:0/1418113968 shutdown_connections 2026-03-09T16:09:52.584 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.534+0000 7f214d21f640 1 -- 192.168.123.103:0/1418113968 wait complete. 
2026-03-09T16:09:52.897 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.713+0000 7fd91b374640 1 Processor -- start 2026-03-09T16:09:52.897 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.714+0000 7fd91b374640 1 -- start start 2026-03-09T16:09:52.897 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.714+0000 7fd91b374640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd9141060c0 0x7fd9141064c0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:52.897 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.714+0000 7fd91b374640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd914106a90 con 0x7fd9141060c0 2026-03-09T16:09:52.897 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.715+0000 7fd9190e9640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd9141060c0 0x7fd9141064c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:52.897 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.715+0000 7fd9190e9640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd9141060c0 0x7fd9141064c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48460/0 (socket says 192.168.123.103:48460) 2026-03-09T16:09:52.897 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.715+0000 7fd9190e9640 1 -- 192.168.123.103:0/1904308080 learned_addr learned my addr 192.168.123.103:0/1904308080 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:52.897 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.715+0000 7fd9190e9640 1 -- 192.168.123.103:0/1904308080 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd914107220 con 0x7fd9141060c0 2026-03-09T16:09:52.897 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.716+0000 7fd9190e9640 1 --2- 192.168.123.103:0/1904308080 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd9141060c0 0x7fd9141064c0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fd908009920 tx=0x7fd90802ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=db0fc3136364cf54 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:52.897 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.716+0000 7fd903fff640 1 -- 192.168.123.103:0/1904308080 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd90802f9b0 con 0x7fd9141060c0 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.716+0000 7fd903fff640 1 -- 192.168.123.103:0/1904308080 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd908037440 con 0x7fd9141060c0 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.716+0000 7fd903fff640 1 -- 192.168.123.103:0/1904308080 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd908035560 con 0x7fd9141060c0 
2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.717+0000 7fd91b374640 1 -- 192.168.123.103:0/1904308080 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd9141060c0 msgr2=0x7fd9141064c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.717+0000 7fd91b374640 1 --2- 192.168.123.103:0/1904308080 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd9141060c0 0x7fd9141064c0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fd908009920 tx=0x7fd90802ef20 comp rx=0 tx=0).stop 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.717+0000 7fd91b374640 1 -- 192.168.123.103:0/1904308080 shutdown_connections 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.717+0000 7fd91b374640 1 --2- 192.168.123.103:0/1904308080 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd9141060c0 0x7fd9141064c0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.717+0000 7fd91b374640 1 -- 192.168.123.103:0/1904308080 >> 192.168.123.103:0/1904308080 conn(0x7fd9141018b0 msgr2=0x7fd914103cd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.717+0000 7fd91b374640 1 -- 192.168.123.103:0/1904308080 shutdown_connections 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.717+0000 7fd91b374640 1 -- 192.168.123.103:0/1904308080 wait complete. 
2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.717+0000 7fd91b374640 1 Processor -- start 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.718+0000 7fd91b374640 1 -- start start 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.718+0000 7fd91b374640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd9141060c0 0x7fd914199bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.718+0000 7fd91b374640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd91419a0f0 con 0x7fd9141060c0 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.719+0000 7fd9190e9640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd9141060c0 0x7fd914199bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.719+0000 7fd9190e9640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd9141060c0 0x7fd914199bb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48472/0 (socket says 192.168.123.103:48472) 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.719+0000 7fd9190e9640 1 -- 192.168.123.103:0/1746705603 learned_addr learned my addr 192.168.123.103:0/1746705603 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.719+0000 7fd9190e9640 1 -- 192.168.123.103:0/1746705603 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd9080095d0 con 0x7fd9141060c0 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.719+0000 7fd9190e9640 1 --2- 192.168.123.103:0/1746705603 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd9141060c0 0x7fd914199bb0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fd908009a50 tx=0x7fd90802fbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.720+0000 7fd9027fc640 1 -- 192.168.123.103:0/1746705603 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd908035820 con 0x7fd9141060c0 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.720+0000 7fd9027fc640 1 -- 192.168.123.103:0/1746705603 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd908035e40 con 0x7fd9141060c0 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.720+0000 7fd91b374640 1 -- 192.168.123.103:0/1746705603 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd91419a2f0 con 0x7fd9141060c0 2026-03-09T16:09:52.898 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.720+0000 7fd9027fc640 1 -- 192.168.123.103:0/1746705603 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd9080363a0 con 0x7fd9141060c0 2026-03-09T16:09:52.898 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.720+0000 7fd91b374640 1 -- 192.168.123.103:0/1746705603 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd91419a790 con 0x7fd9141060c0 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.722+0000 7fd9027fc640 1 -- 192.168.123.103:0/1746705603 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 49370+0+0 (secure 0 0 0) 0x7fd90803e070 con 0x7fd9141060c0 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.722+0000 7fd9027fc640 1 --2- 192.168.123.103:0/1746705603 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fd8e803d190 0x7fd8e803f650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.723+0000 7fd9188e8640 1 --2- 192.168.123.103:0/1746705603 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fd8e803d190 0x7fd8e803f650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.723+0000 7fd9027fc640 1 -- 192.168.123.103:0/1746705603 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fd908076790 con 0x7fd9141060c0 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.723+0000 7fd91b374640 1 -- 192.168.123.103:0/1746705603 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd914106540 con 0x7fd9141060c0 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.725+0000 7fd9188e8640 1 --2- 192.168.123.103:0/1746705603 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fd8e803d190 0x7fd8e803f650 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fd9040099c0 tx=0x7fd904006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.727+0000 7fd9027fc640 1 -- 192.168.123.103:0/1746705603 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fd908049460 con 0x7fd9141060c0 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.826+0000 7fd91b374640 1 -- 192.168.123.103:0/1746705603 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mgr/dashboard/ssl_server_port}] v 0) v1 -- 0x7fd91410a450 con 0x7fd9141060c0 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.835+0000 7fd9027fc640 1 -- 192.168.123.103:0/1746705603 <== mon.0 
v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/dashboard/ssl_server_port}]=0 v8) v1 ==== 130+0+0 (secure 0 0 0) 0x7fd90803e350 con 0x7fd9141060c0 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.838+0000 7fd91b374640 1 -- 192.168.123.103:0/1746705603 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fd8e803d190 msgr2=0x7fd8e803f650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.838+0000 7fd91b374640 1 --2- 192.168.123.103:0/1746705603 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fd8e803d190 0x7fd8e803f650 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fd9040099c0 tx=0x7fd904006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.839+0000 7fd91b374640 1 -- 192.168.123.103:0/1746705603 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd9141060c0 msgr2=0x7fd914199bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.839+0000 7fd91b374640 1 --2- 192.168.123.103:0/1746705603 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd9141060c0 0x7fd914199bb0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fd908009a50 tx=0x7fd90802fbe0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.839+0000 7fd91b374640 1 -- 192.168.123.103:0/1746705603 shutdown_connections 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.839+0000 7fd91b374640 1 --2- 192.168.123.103:0/1746705603 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fd8e803d190 0x7fd8e803f650 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.839+0000 7fd91b374640 1 --2- 192.168.123.103:0/1746705603 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd9141060c0 0x7fd914199bb0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.839+0000 7fd91b374640 1 -- 192.168.123.103:0/1746705603 >> 192.168.123.103:0/1746705603 conn(0x7fd9141018b0 msgr2=0x7fd914102220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.840+0000 7fd91b374640 1 -- 192.168.123.103:0/1746705603 shutdown_connections 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:52.840+0000 7fd91b374640 1 -- 192.168.123.103:0/1746705603 wait complete. 2026-03-09T16:09:52.899 INFO:teuthology.orchestra.run.vm03.stdout:Enabling the dashboard module... 
2026-03-09T16:09:53.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:52 vm03 ceph-mon[51019]: from='client.14148 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:09:53.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:52 vm03 ceph-mon[51019]: Saving service prometheus spec with placement count:1 2026-03-09T16:09:53.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:52 vm03 ceph-mon[51019]: from='client.14150 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:09:53.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:52 vm03 ceph-mon[51019]: Saving service grafana spec with placement count:1 2026-03-09T16:09:53.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:52 vm03 ceph-mon[51019]: from='client.14152 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:09:53.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:52 vm03 ceph-mon[51019]: Saving service node-exporter spec with placement * 2026-03-09T16:09:53.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:52 vm03 ceph-mon[51019]: from='mgr.14118 192.168.123.103:0/1832953116' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:53.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:52 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/1418113968' entity='client.admin' 2026-03-09T16:09:53.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:52 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/1746705603' entity='client.admin' 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.028+0000 7fc9eca8c640 1 Processor -- start 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.029+0000 7fc9eca8c640 1 -- start start 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.029+0000 7fc9eca8c640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9e8106060 0x7fc9e8106460 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.029+0000 7fc9eca8c640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc9e81069a0 con 0x7fc9e8106060 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.029+0000 7fc9e6575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9e8106060 0x7fc9e8106460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.029+0000 7fc9e6575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9e8106060 0x7fc9e8106460 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48474/0 (socket says 192.168.123.103:48474) 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.029+0000 7fc9e6575640 1 -- 192.168.123.103:0/3674367791 
learned_addr learned my addr 192.168.123.103:0/3674367791 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.030+0000 7fc9e6575640 1 -- 192.168.123.103:0/3674367791 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc9e8106ae0 con 0x7fc9e8106060 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.030+0000 7fc9e6575640 1 --2- 192.168.123.103:0/3674367791 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9e8106060 0x7fc9e8106460 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fc9dc009b80 tx=0x7fc9dc02f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=6cfb4760962abb6d server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.031+0000 7fc9e5573640 1 -- 192.168.123.103:0/3674367791 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc9dc02fa10 con 0x7fc9e8106060 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.031+0000 7fc9e5573640 1 -- 192.168.123.103:0/3674367791 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc9dc02fb70 con 0x7fc9e8106060 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.031+0000 7fc9eca8c640 1 -- 192.168.123.103:0/3674367791 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9e8106060 msgr2=0x7fc9e8106460 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.031+0000 7fc9eca8c640 1 --2- 192.168.123.103:0/3674367791 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9e8106060 0x7fc9e8106460 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fc9dc009b80 tx=0x7fc9dc02f190 comp rx=0 tx=0).stop 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.031+0000 7fc9eca8c640 1 -- 192.168.123.103:0/3674367791 shutdown_connections 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.031+0000 7fc9eca8c640 1 --2- 192.168.123.103:0/3674367791 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9e8106060 0x7fc9e8106460 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.031+0000 7fc9eca8c640 1 -- 192.168.123.103:0/3674367791 >> 192.168.123.103:0/3674367791 conn(0x7fc9e8101870 msgr2=0x7fc9e8103cd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:54.031 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.032+0000 7fc9eca8c640 1 -- 192.168.123.103:0/3674367791 shutdown_connections 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.032+0000 7fc9eca8c640 1 -- 192.168.123.103:0/3674367791 wait complete. 
2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.032+0000 7fc9eca8c640 1 Processor -- start 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.032+0000 7fc9eca8c640 1 -- start start 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.033+0000 7fc9eca8c640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9e8191d10 0x7fc9e8192130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.033+0000 7fc9eca8c640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc9dc035410 con 0x7fc9e8191d10 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.033+0000 7fc9e6575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9e8191d10 0x7fc9e8192130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.033+0000 7fc9e6575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9e8191d10 0x7fc9e8192130 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48486/0 (socket says 192.168.123.103:48486) 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.033+0000 7fc9e6575640 1 -- 192.168.123.103:0/4263156364 learned_addr learned my addr 192.168.123.103:0/4263156364 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.033+0000 7fc9e6575640 1 -- 192.168.123.103:0/4263156364 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc9dc0095d0 con 0x7fc9e8191d10 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.033+0000 7fc9e6575640 1 --2- 192.168.123.103:0/4263156364 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9e8191d10 0x7fc9e8192130 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fc9dc02f6c0 tx=0x7fc9dc035a00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.034+0000 7fc9d77fe640 1 -- 192.168.123.103:0/4263156364 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc9dc035ac0 con 0x7fc9e8191d10 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.034+0000 7fc9d77fe640 1 -- 192.168.123.103:0/4263156364 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc9dc035c20 con 0x7fc9e8191d10 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.034+0000 7fc9eca8c640 1 -- 192.168.123.103:0/4263156364 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc9e8192670 con 0x7fc9e8191d10 2026-03-09T16:09:54.032 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.034+0000 7fc9d77fe640 1 -- 192.168.123.103:0/4263156364 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc9dc0378c0 con 0x7fc9e8191d10 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.034+0000 7fc9eca8c640 1 -- 192.168.123.103:0/4263156364 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc9e8195210 con 0x7fc9e8191d10 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.035+0000 7fc9d77fe640 1 -- 192.168.123.103:0/4263156364 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 49370+0+0 (secure 0 0 0) 0x7fc9dc03e070 con 0x7fc9e8191d10 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.036+0000 7fc9d77fe640 1 --2- 192.168.123.103:0/4263156364 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fc9c003d1e0 0x7fc9c003f6a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.036+0000 7fc9d77fe640 1 -- 192.168.123.103:0/4263156364 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fc9dc076460 con 0x7fc9e8191d10 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.036+0000 7fc9e5d74640 1 --2- 192.168.123.103:0/4263156364 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fc9c003d1e0 0x7fc9c003f6a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.036+0000 7fc9eca8c640 1 -- 192.168.123.103:0/4263156364 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc9e810ab20 con 0x7fc9e8191d10 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.037+0000 7fc9e5d74640 1 --2- 192.168.123.103:0/4263156364 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fc9c003d1e0 0x7fc9c003f6a0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fc9d00099c0 tx=0x7fc9d0006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.041+0000 7fc9d77fe640 1 -- 192.168.123.103:0/4263156364 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fc9dc03c080 con 0x7fc9e8191d10 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.162+0000 7fc9eca8c640 1 -- 192.168.123.103:0/4263156364 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "dashboard"} v 0) v1 -- 0x7fc9e81954c0 con 0x7fc9e8191d10 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.978+0000 7fc9d77fe640 1 -- 192.168.123.103:0/4263156364 <== mon.0 
v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "dashboard"}]=0 v8) v1 ==== 88+0+0 (secure 0 0 0) 0x7fc9dc049460 con 0x7fc9e8191d10 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.982+0000 7fc9eca8c640 1 -- 192.168.123.103:0/4263156364 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fc9c003d1e0 msgr2=0x7fc9c003f6a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:54.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.982+0000 7fc9eca8c640 1 --2- 192.168.123.103:0/4263156364 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fc9c003d1e0 0x7fc9c003f6a0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fc9d00099c0 tx=0x7fc9d0006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:54.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.982+0000 7fc9eca8c640 1 -- 192.168.123.103:0/4263156364 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9e8191d10 msgr2=0x7fc9e8192130 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:54.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.982+0000 7fc9eca8c640 1 --2- 192.168.123.103:0/4263156364 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9e8191d10 0x7fc9e8192130 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fc9dc02f6c0 tx=0x7fc9dc035a00 comp rx=0 tx=0).stop 2026-03-09T16:09:54.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.984+0000 7fc9eca8c640 1 -- 192.168.123.103:0/4263156364 shutdown_connections 2026-03-09T16:09:54.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.984+0000 7fc9eca8c640 1 --2- 192.168.123.103:0/4263156364 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7fc9c003d1e0 0x7fc9c003f6a0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:54.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.984+0000 7fc9eca8c640 1 --2- 192.168.123.103:0/4263156364 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9e8191d10 0x7fc9e8192130 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:54.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.984+0000 7fc9eca8c640 1 -- 192.168.123.103:0/4263156364 >> 192.168.123.103:0/4263156364 conn(0x7fc9e8101870 msgr2=0x7fc9e807a200 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:54.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.984+0000 7fc9eca8c640 1 -- 192.168.123.103:0/4263156364 shutdown_connections 2026-03-09T16:09:54.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:53.984+0000 7fc9eca8c640 1 -- 192.168.123.103:0/4263156364 wait complete. 
2026-03-09T16:09:54.230 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:53 vm03 ceph-mon[51019]: from='client.14154 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:09:54.230 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:53 vm03 ceph-mon[51019]: Saving service alertmanager spec with placement count:1 2026-03-09T16:09:54.230 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:53 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/4263156364' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch 2026-03-09T16:09:54.374 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-09T16:09:54.374 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 8, 2026-03-09T16:09:54.374 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-09T16:09:54.374 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "active_name": "vm03.gbgzmu", 2026-03-09T16:09:54.374 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-09T16:09:54.374 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-09T16:09:54.374 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.196+0000 7f64e2263640 1 Processor -- start 2026-03-09T16:09:54.374 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.197+0000 7f64e2263640 1 -- start start 2026-03-09T16:09:54.374 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.197+0000 7f64e2263640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f64dc1081b0 0x7f64dc1085b0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:54.374 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.197+0000 7f64e2263640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f64dc108b80 con 0x7f64dc1081b0 2026-03-09T16:09:54.374 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.198+0000 7f64db7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f64dc1081b0 0x7f64dc1085b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:54.374 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.198+0000 7f64db7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f64dc1081b0 0x7f64dc1085b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48504/0 (socket says 192.168.123.103:48504) 2026-03-09T16:09:54.374 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.198+0000 7f64db7fe640 1 -- 192.168.123.103:0/554564886 learned_addr learned my addr 192.168.123.103:0/554564886 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:54.374 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.198+0000 7f64db7fe640 1 -- 192.168.123.103:0/554564886 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f64dc109360 con 0x7f64dc1081b0 2026-03-09T16:09:54.374 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.199+0000 7f64db7fe640 1 --2- 192.168.123.103:0/554564886 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f64dc1081b0 0x7f64dc1085b0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f64c8009920 tx=0x7f64c802ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=3f5ebbf4cb91781f server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.199+0000 7f64da7fc640 1 -- 192.168.123.103:0/554564886 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f64c802f9b0 con 0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.200+0000 7f64da7fc640 1 -- 192.168.123.103:0/554564886 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f64c8037440 con 0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.200+0000 7f64da7fc640 1 -- 192.168.123.103:0/554564886 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f64c8035560 con 0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.201+0000 7f64e2263640 1 -- 192.168.123.103:0/554564886 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f64dc1081b0 msgr2=0x7f64dc1085b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.201+0000 7f64e2263640 1 --2- 192.168.123.103:0/554564886 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f64dc1081b0 0x7f64dc1085b0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f64c8009920 tx=0x7f64c802ef20 comp rx=0 tx=0).stop 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.204+0000 7f64e2263640 1 -- 192.168.123.103:0/554564886 shutdown_connections 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.204+0000 7f64e2263640 1 --2- 192.168.123.103:0/554564886 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f64dc1081b0 0x7f64dc1085b0 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.204+0000 7f64e2263640 1 -- 192.168.123.103:0/554564886 >> 192.168.123.103:0/554564886 conn(0x7f64dc07b860 msgr2=0x7f64dc07bc90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.205+0000 7f64e2263640 1 -- 192.168.123.103:0/554564886 shutdown_connections 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.205+0000 7f64e2263640 1 -- 192.168.123.103:0/554564886 wait complete. 
2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.206+0000 7f64e2263640 1 Processor -- start 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.206+0000 7f64e2263640 1 -- start start 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.206+0000 7f64e2263640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f64dc1081b0 0x7f64dc19e060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.207+0000 7f64db7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f64dc1081b0 0x7f64dc19e060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.207+0000 7f64db7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f64dc1081b0 0x7f64dc19e060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48512/0 (socket says 192.168.123.103:48512) 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.207+0000 7f64db7fe640 1 -- 192.168.123.103:0/112578244 learned_addr learned my addr 192.168.123.103:0/112578244 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.207+0000 7f64e2263640 1 -- 192.168.123.103:0/112578244 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f64dc19e5a0 con 0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.207+0000 7f64db7fe640 1 -- 192.168.123.103:0/112578244 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f64c80095d0 con 0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.208+0000 7f64db7fe640 1 --2- 192.168.123.103:0/112578244 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f64dc1081b0 0x7f64dc19e060 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f64c8009a50 tx=0x7f64c802fbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.208+0000 7f64d8ff9640 1 -- 192.168.123.103:0/112578244 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f64c8035820 con 0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.208+0000 7f64d8ff9640 1 -- 192.168.123.103:0/112578244 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f64c8035e40 con 0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.208+0000 7f64d8ff9640 1 -- 192.168.123.103:0/112578244 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f64c803fdc0 con 0x7f64dc1081b0 
2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.208+0000 7f64e2263640 1 -- 192.168.123.103:0/112578244 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f64dc19e7a0 con 0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.208+0000 7f64e2263640 1 -- 192.168.123.103:0/112578244 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f64dc19ec40 con 0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.209+0000 7f64e2263640 1 -- 192.168.123.103:0/112578244 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f64a8005350 con 0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.210+0000 7f64d8ff9640 1 -- 192.168.123.103:0/112578244 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 49383+0+0 (secure 0 0 0) 0x7f64c803e070 con 0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.210+0000 7f64d8ff9640 1 --2- 192.168.123.103:0/112578244 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f64b403d1e0 0x7f64b403f6a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.210+0000 7f64d8ff9640 1 -- 192.168.123.103:0/112578244 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f64c80758f0 con 0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.213+0000 7f64daffd640 1 -- 192.168.123.103:0/112578244 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f64b403d1e0 msgr2=0x7f64b403f6a0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/4285644309 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.213+0000 7f64daffd640 1 --2- 192.168.123.103:0/112578244 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f64b403d1e0 0x7f64b403f6a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.213+0000 7f64d8ff9640 1 -- 192.168.123.103:0/112578244 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f64c8036b90 con 0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.328+0000 7f64e2263640 1 -- 192.168.123.103:0/112578244 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7f64a80058d0 con 0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.329+0000 7f64d8ff9640 1 -- 192.168.123.103:0/112578244 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v8) v1 ==== 56+0+98 (secure 0 0 0) 0x7f64c803c070 con 
0x7f64dc1081b0 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.332+0000 7f64be7fc640 1 -- 192.168.123.103:0/112578244 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f64b403d1e0 msgr2=0x7f64b403f6a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.332+0000 7f64be7fc640 1 --2- 192.168.123.103:0/112578244 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f64b403d1e0 0x7f64b403f6a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.332+0000 7f64be7fc640 1 -- 192.168.123.103:0/112578244 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f64dc1081b0 msgr2=0x7f64dc19e060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.332+0000 7f64be7fc640 1 --2- 192.168.123.103:0/112578244 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f64dc1081b0 0x7f64dc19e060 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f64c8009a50 tx=0x7f64c802fbe0 comp rx=0 tx=0).stop 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.332+0000 7f64be7fc640 1 -- 192.168.123.103:0/112578244 shutdown_connections 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.332+0000 7f64be7fc640 1 --2- 192.168.123.103:0/112578244 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f64b403d1e0 0x7f64b403f6a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.332+0000 7f64be7fc640 1 --2- 192.168.123.103:0/112578244 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f64dc1081b0 0x7f64dc19e060 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.332+0000 7f64be7fc640 1 -- 192.168.123.103:0/112578244 >> 192.168.123.103:0/112578244 conn(0x7f64dc07b860 msgr2=0x7f64dc1056a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.333+0000 7f64be7fc640 1 -- 192.168.123.103:0/112578244 shutdown_connections 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.333+0000 7f64be7fc640 1 -- 192.168.123.103:0/112578244 wait complete. 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for the mgr to restart... 2026-03-09T16:09:54.375 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mgr epoch 8... 2026-03-09T16:09:55.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:54 vm03 ceph-mon[51019]: from='client.? 
192.168.123.103:0/4263156364' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
2026-03-09T16:09:55.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:54 vm03 ceph-mon[51019]: mgrmap e8: vm03.gbgzmu(active, since 8s)
2026-03-09T16:09:55.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:54 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/112578244' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
2026-03-09T16:09:57.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:57 vm03 ceph-mon[51019]: Active manager daemon vm03.gbgzmu restarted
2026-03-09T16:09:57.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:57 vm03 ceph-mon[51019]: Activating manager daemon vm03.gbgzmu
2026-03-09T16:09:57.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:57 vm03 ceph-mon[51019]: osdmap e3: 0 total, 0 up, 0 in
2026-03-09T16:09:57.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:57 vm03 ceph-mon[51019]: mgrmap e9: vm03.gbgzmu(active, starting, since 0.00810527s)
2026-03-09T16:09:57.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:57 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch
2026-03-09T16:09:57.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:57 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr metadata", "who": "vm03.gbgzmu", "id": "vm03.gbgzmu"}]: dispatch
2026-03-09T16:09:57.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:57 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-09T16:09:57.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:57 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-09T16:09:57.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:57 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-09T16:09:57.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:57 vm03 ceph-mon[51019]: Manager daemon vm03.gbgzmu is now available
2026-03-09T16:09:57.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:57 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/mirror_snapshot_schedule"}]: dispatch
2026-03-09T16:09:57.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:57 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T16:09:58.546 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout {
2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 10,
2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "initialized": true
2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }
2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.502+0000 7f5a2c88a640 1 Processor -- start
2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.502+0000
7f5a2c88a640 1 -- start start 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.502+0000 7f5a2c88a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a24071640 0x7f5a24071a40 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.502+0000 7f5a2c88a640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a24072e60 con 0x7f5a24071640 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.503+0000 7f5a2a5ff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a24071640 0x7f5a24071a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.503+0000 7f5a2a5ff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a24071640 0x7f5a24071a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48518/0 (socket says 192.168.123.103:48518) 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.503+0000 7f5a2a5ff640 1 -- 192.168.123.103:0/3275462521 learned_addr learned my addr 192.168.123.103:0/3275462521 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.503+0000 7f5a2a5ff640 1 -- 192.168.123.103:0/3275462521 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5a24071f80 con 0x7f5a24071640 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.504+0000 7f5a2a5ff640 1 --2- 192.168.123.103:0/3275462521 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a24071640 0x7f5a24071a40 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f5a1c009920 tx=0x7f5a1c02ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a5b2b60ed8d4c2d8 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.504+0000 7f5a295fd640 1 -- 192.168.123.103:0/3275462521 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5a1c02f9b0 con 0x7f5a24071640 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.504+0000 7f5a295fd640 1 -- 192.168.123.103:0/3275462521 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5a1c037440 con 0x7f5a24071640 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.505+0000 7f5a2c88a640 1 -- 192.168.123.103:0/3275462521 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a24071640 msgr2=0x7f5a24071a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.505+0000 7f5a2c88a640 1 --2- 192.168.123.103:0/3275462521 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a24071640 
0x7f5a24071a40 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f5a1c009920 tx=0x7f5a1c02ef20 comp rx=0 tx=0).stop 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.505+0000 7f5a2c88a640 1 -- 192.168.123.103:0/3275462521 shutdown_connections 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.505+0000 7f5a2c88a640 1 --2- 192.168.123.103:0/3275462521 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a24071640 0x7f5a24071a40 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.505+0000 7f5a2c88a640 1 -- 192.168.123.103:0/3275462521 >> 192.168.123.103:0/3275462521 conn(0x7f5a2406d2a0 msgr2=0x7f5a2406f6e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.506+0000 7f5a2c88a640 1 -- 192.168.123.103:0/3275462521 shutdown_connections 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.506+0000 7f5a2c88a640 1 -- 192.168.123.103:0/3275462521 wait complete. 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.506+0000 7f5a2c88a640 1 Processor -- start 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.506+0000 7f5a2c88a640 1 -- start start 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.506+0000 7f5a2c88a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a24071640 0x7f5a241aaf70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.506+0000 7f5a2c88a640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a1c0353c0 con 0x7f5a24071640 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.507+0000 7f5a2a5ff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a24071640 0x7f5a241aaf70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.507+0000 7f5a2a5ff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a24071640 0x7f5a241aaf70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48522/0 (socket says 192.168.123.103:48522) 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.507+0000 7f5a2a5ff640 1 -- 192.168.123.103:0/2542364998 learned_addr learned my addr 192.168.123.103:0/2542364998 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.507+0000 7f5a2a5ff640 1 -- 192.168.123.103:0/2542364998 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5a1c0095d0 con 0x7f5a24071640 2026-03-09T16:09:58.547 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.507+0000 7f5a2a5ff640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a24071640 0x7f5a241aaf70 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f5a1c02f450 tx=0x7f5a1c037c90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.508+0000 7f5a177fe640 1 -- 192.168.123.103:0/2542364998 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5a1c035a30 con 0x7f5a24071640 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.508+0000 7f5a2c88a640 1 -- 192.168.123.103:0/2542364998 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5a241ab4b0 con 0x7f5a24071640 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.508+0000 7f5a2c88a640 1 -- 192.168.123.103:0/2542364998 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5a241ab9b0 con 0x7f5a24071640 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.508+0000 7f5a177fe640 1 -- 192.168.123.103:0/2542364998 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5a1c035b90 con 0x7f5a24071640 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.508+0000 7f5a177fe640 1 -- 192.168.123.103:0/2542364998 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5a1c036540 con 0x7f5a24071640 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.509+0000 7f5a177fe640 1 -- 192.168.123.103:0/2542364998 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 49383+0+0 (secure 0 0 0) 0x7f5a1c03e070 con 0x7f5a24071640 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.510+0000 7f5a177fe640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f5a0003d280 0x7f5a0003f740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.510+0000 7f5a177fe640 1 -- 192.168.123.103:0/2542364998 --> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f5a0003fe50 con 0x7f5a0003d280 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.510+0000 7f5a177fe640 1 -- 192.168.123.103:0/2542364998 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f5a1c075790 con 0x7f5a24071640 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.510+0000 7f5a29dfe640 1 -- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f5a0003d280 msgr2=0x7f5a0003f740 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/4285644309 2026-03-09T16:09:58.547 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.510+0000 7f5a29dfe640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f5a0003d280 0x7f5a0003f740 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.710+0000 7f5a29dfe640 1 -- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f5a0003d280 msgr2=0x7f5a0003f740 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/4285644309 2026-03-09T16:09:58.547 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:54.710+0000 7f5a29dfe640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f5a0003d280 0x7f5a0003f740 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:55.111+0000 7f5a29dfe640 1 -- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f5a0003d280 msgr2=0x7f5a0003f740 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/4285644309 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:55.111+0000 7f5a29dfe640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f5a0003d280 0x7f5a0003f740 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:55.912+0000 7f5a29dfe640 1 -- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f5a0003d280 msgr2=0x7f5a0003f740 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/4285644309 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:55.912+0000 7f5a29dfe640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f5a0003d280 0x7f5a0003f740 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:57.475+0000 7f5a177fe640 1 -- 192.168.123.103:0/2542364998 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mgrmap(e 9) v1 ==== 49150+0+0 (secure 0 0 0) 0x7f5a1c035d00 con 0x7f5a24071640 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:57.475+0000 7f5a177fe640 1 -- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f5a0003d280 msgr2=0x7f5a0003f740 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:57.475+0000 7f5a177fe640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f5a0003d280 0x7f5a0003f740 unknown :-1 
s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.480+0000 7f5a177fe640 1 -- 192.168.123.103:0/2542364998 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 10) v1 ==== 49277+0+0 (secure 0 0 0) 0x7f5a1c046c80 con 0x7f5a24071640 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.480+0000 7f5a177fe640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f5a00040d30 0x7f5a00043120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.480+0000 7f5a177fe640 1 -- 192.168.123.103:0/2542364998 --> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f5a0003fe50 con 0x7f5a00040d30 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.482+0000 7f5a29dfe640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f5a00040d30 0x7f5a00043120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.482+0000 7f5a29dfe640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f5a00040d30 0x7f5a00043120 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f5a10003a80 tx=0x7f5a100092b0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.483+0000 7f5a177fe640 1 -- 192.168.123.103:0/2542364998 <== mgr.14162 v2:192.168.123.103:6800/3405276359 1 ==== command_reply(tid 0: 0 ) v1 ==== 8+0+7759 (secure 0 0 0) 0x7f5a0003fe50 con 0x7f5a00040d30 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.486+0000 7f5a2c88a640 1 -- 192.168.123.103:0/2542364998 --> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7f59ec002670 con 0x7f5a00040d30 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.488+0000 7f5a177fe640 1 -- 192.168.123.103:0/2542364998 <== mgr.14162 v2:192.168.123.103:6800/3405276359 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+52 (secure 0 0 0) 0x7f59ec002670 con 0x7f5a00040d30 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.488+0000 7f5a2c88a640 1 -- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f5a00040d30 msgr2=0x7f5a00043120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.488+0000 7f5a2c88a640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f5a00040d30 0x7f5a00043120 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f5a10003a80 
tx=0x7f5a100092b0 comp rx=0 tx=0).stop 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.488+0000 7f5a2c88a640 1 -- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a24071640 msgr2=0x7f5a241aaf70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.488+0000 7f5a2c88a640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a24071640 0x7f5a241aaf70 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f5a1c02f450 tx=0x7f5a1c037c90 comp rx=0 tx=0).stop 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.488+0000 7f5a2c88a640 1 -- 192.168.123.103:0/2542364998 shutdown_connections 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.488+0000 7f5a2c88a640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f5a00040d30 0x7f5a00043120 secure :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f5a10003a80 tx=0x7f5a100092b0 comp rx=0 tx=0).stop 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.488+0000 7f5a2c88a640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:6800/4285644309,v1:192.168.123.103:6801/4285644309] conn(0x7f5a0003d280 0x7f5a0003f740 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.488+0000 7f5a2c88a640 1 --2- 192.168.123.103:0/2542364998 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a24071640 0x7f5a241aaf70 secure :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f5a1c02f450 tx=0x7f5a1c037c90 comp rx=0 tx=0).stop 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.488+0000 7f5a2c88a640 1 -- 192.168.123.103:0/2542364998 >> 192.168.123.103:0/2542364998 conn(0x7f5a2406d2a0 msgr2=0x7f5a24112af0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.489+0000 7f5a2c88a640 1 -- 192.168.123.103:0/2542364998 shutdown_connections 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.489+0000 7f5a2c88a640 1 -- 192.168.123.103:0/2542364998 wait complete. 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:mgr epoch 8 is available 2026-03-09T16:09:58.548 INFO:teuthology.orchestra.run.vm03.stdout:Generating a dashboard self-signed certificate... 
2026-03-09T16:09:58.796 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:58 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/trash_purge_schedule"}]: dispatch
2026-03-09T16:09:58.796 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:58 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu'
2026-03-09T16:09:58.796 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:58 vm03 ceph-mon[51019]: mgrmap e10: vm03.gbgzmu(active, since 1.01164s)
2026-03-09T16:09:58.904 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Self-signed certificate created
2026-03-09T16:09:58.904 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.678+0000 7faf4804f640 1 Processor -- start
2026-03-09T16:09:58.904 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.678+0000 7faf4804f640 1 -- start start
2026-03-09T16:09:58.904 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.678+0000 7faf4804f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf40072f50 0x7faf40071400 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T16:09:58.904 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.678+0000 7faf4804f640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faf400719d0 con 0x7faf40072f50
2026-03-09T16:09:58.904 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.679+0000 7faf45dc4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf40072f50 0x7faf40071400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T16:09:58.904 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.679+0000 7faf45dc4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf40072f50 0x7faf40071400 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39232/0 (socket says 192.168.123.103:39232)
2026-03-09T16:09:58.904 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.679+0000 7faf45dc4640 1 -- 192.168.123.103:0/295350932 learned_addr learned my addr 192.168.123.103:0/295350932 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T16:09:58.904 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.680+0000 7faf45dc4640 1 -- 192.168.123.103:0/295350932 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faf40071b10 con 0x7faf40072f50
2026-03-09T16:09:58.904 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.681+0000 7faf45dc4640 1 --2- 192.168.123.103:0/295350932 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf40072f50 0x7faf40071400 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7faf3c009de0 tx=0x7faf3c031240 comp rx=0 tx=0).ready entity=mon.0 client_cookie=eb643ad05515755c server_cookie=0 in_seq=0 out_seq=0
2026-03-09T16:09:58.904 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.681+0000 7faf44dc2640 1 --
192.168.123.103:0/295350932 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7faf3c031c90 con 0x7faf40072f50 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.681+0000 7faf44dc2640 1 -- 192.168.123.103:0/295350932 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7faf3c031df0 con 0x7faf40072f50 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.681+0000 7faf4804f640 1 -- 192.168.123.103:0/295350932 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf40072f50 msgr2=0x7faf40071400 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.681+0000 7faf4804f640 1 --2- 192.168.123.103:0/295350932 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf40072f50 0x7faf40071400 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7faf3c009de0 tx=0x7faf3c031240 comp rx=0 tx=0).stop 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.682+0000 7faf4804f640 1 -- 192.168.123.103:0/295350932 shutdown_connections 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.682+0000 7faf4804f640 1 --2- 192.168.123.103:0/295350932 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf40072f50 0x7faf40071400 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.682+0000 7faf4804f640 1 -- 192.168.123.103:0/295350932 >> 192.168.123.103:0/295350932 conn(0x7faf4006d080 msgr2=0x7faf4006f4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.682+0000 7faf4804f640 1 -- 192.168.123.103:0/295350932 shutdown_connections 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.682+0000 7faf4804f640 1 -- 192.168.123.103:0/295350932 wait complete. 
2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.682+0000 7faf4804f640 1 Processor -- start 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.683+0000 7faf4804f640 1 -- start start 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.683+0000 7faf4804f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf401bc280 0x7faf401bc6a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.683+0000 7faf4804f640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faf3c0376b0 con 0x7faf401bc280 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.683+0000 7faf45dc4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf401bc280 0x7faf401bc6a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.683+0000 7faf45dc4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf401bc280 0x7faf401bc6a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39236/0 (socket says 192.168.123.103:39236) 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.683+0000 7faf45dc4640 1 -- 192.168.123.103:0/3485908654 learned_addr learned my addr 192.168.123.103:0/3485908654 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.683+0000 7faf45dc4640 1 -- 192.168.123.103:0/3485908654 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faf3c0095d0 con 0x7faf401bc280 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.684+0000 7faf45dc4640 1 --2- 192.168.123.103:0/3485908654 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf401bc280 0x7faf401bc6a0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7faf3c031770 tx=0x7faf3c0385d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.684+0000 7faf36ffd640 1 -- 192.168.123.103:0/3485908654 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7faf3c0388a0 con 0x7faf401bc280 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.684+0000 7faf4804f640 1 -- 192.168.123.103:0/3485908654 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faf401bcbe0 con 0x7faf401bc280 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.684+0000 7faf4804f640 1 -- 192.168.123.103:0/3485908654 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faf401bf780 con 0x7faf401bc280 2026-03-09T16:09:58.905 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.685+0000 7faf36ffd640 1 -- 192.168.123.103:0/3485908654 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7faf3c038ec0 con 0x7faf401bc280 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.685+0000 7faf36ffd640 1 -- 192.168.123.103:0/3485908654 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7faf3c039e30 con 0x7faf401bc280 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.686+0000 7faf36ffd640 1 -- 192.168.123.103:0/3485908654 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 10) v1 ==== 49277+0+0 (secure 0 0 0) 0x7faf3c040030 con 0x7faf401bc280 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.686+0000 7faf36ffd640 1 --2- 192.168.123.103:0/3485908654 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7faf2c03d190 0x7faf2c03f650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.687+0000 7faf455c3640 1 --2- 192.168.123.103:0/3485908654 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7faf2c03d190 0x7faf2c03f650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.687+0000 7faf455c3640 1 --2- 192.168.123.103:0/3485908654 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7faf2c03d190 0x7faf2c03f650 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7faf3800ad30 tx=0x7faf380093f0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.686+0000 7faf36ffd640 1 -- 192.168.123.103:0/3485908654 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7faf3c076990 con 0x7faf401bc280 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.688+0000 7faf4804f640 1 -- 192.168.123.103:0/3485908654 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faf0c005350 con 0x7faf401bc280 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.691+0000 7faf36ffd640 1 -- 192.168.123.103:0/3485908654 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7faf3c03e070 con 0x7faf401bc280 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.794+0000 7faf4804f640 1 -- 192.168.123.103:0/3485908654 --> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] -- mgr_command(tid 0: {"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}) v1 -- 0x7faf0c002bf0 con 0x7faf2c03d190 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.847+0000 7faf36ffd640 1 -- 
192.168.123.103:0/3485908654 <== mgr.14162 v2:192.168.123.103:6800/3405276359 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7faf0c002bf0 con 0x7faf2c03d190 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.850+0000 7faf4804f640 1 -- 192.168.123.103:0/3485908654 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7faf2c03d190 msgr2=0x7faf2c03f650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.850+0000 7faf4804f640 1 --2- 192.168.123.103:0/3485908654 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7faf2c03d190 0x7faf2c03f650 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7faf3800ad30 tx=0x7faf380093f0 comp rx=0 tx=0).stop 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.850+0000 7faf4804f640 1 -- 192.168.123.103:0/3485908654 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf401bc280 msgr2=0x7faf401bc6a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.850+0000 7faf4804f640 1 --2- 192.168.123.103:0/3485908654 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf401bc280 0x7faf401bc6a0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7faf3c031770 tx=0x7faf3c0385d0 comp rx=0 tx=0).stop 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.851+0000 7faf4804f640 1 -- 192.168.123.103:0/3485908654 shutdown_connections 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.851+0000 7faf4804f640 1 --2- 192.168.123.103:0/3485908654 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7faf2c03d190 0x7faf2c03f650 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.851+0000 7faf4804f640 1 --2- 192.168.123.103:0/3485908654 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf401bc280 0x7faf401bc6a0 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.851+0000 7faf4804f640 1 -- 192.168.123.103:0/3485908654 >> 192.168.123.103:0/3485908654 conn(0x7faf4006d080 msgr2=0x7faf4006d8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.851+0000 7faf4804f640 1 -- 192.168.123.103:0/3485908654 shutdown_connections 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:58.851+0000 7faf4804f640 1 -- 192.168.123.103:0/3485908654 wait complete. 2026-03-09T16:09:58.905 INFO:teuthology.orchestra.run.vm03.stdout:Creating initial admin user... 
2026-03-09T16:09:59.396 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout {"username": "admin", "password": "$2b$12$3th3JpoqvluOwXW9bQE.2uwqKKuz.5dnARNqo5aeHDC/HP3j7LgPi", "roles": ["administrator"], "name": null, "email": null, "lastUpdate": 1773072599, "enabled": true, "pwdExpirationDate": null, "pwdUpdateRequired": true}
2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.086+0000 7f3378ed8640 1 Processor -- start
2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.086+0000 7f3378ed8640 1 -- start start
2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.087+0000 7f3378ed8640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33741060c0 0x7f33741064c0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.087+0000 7f3378ed8640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3374106a90 con 0x7f33741060c0
2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.087+0000 7f3372575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33741060c0 0x7f33741064c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.087+0000 7f3372575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33741060c0 0x7f33741064c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39248/0 (socket says 192.168.123.103:39248)
2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.087+0000 7f3372575640 1 -- 192.168.123.103:0/1805742197 learned_addr learned my addr 192.168.123.103:0/1805742197 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.088+0000 7f3372575640 1 -- 192.168.123.103:0/1805742197 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3374107220 con 0x7f33741060c0
2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.089+0000 7f3372575640 1 --2- 192.168.123.103:0/1805742197 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33741060c0 0x7f33741064c0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f3360009b80 tx=0x7f336002f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=2194b3ad4cee1cc0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.089+0000 7f3371573640 1 -- 192.168.123.103:0/1805742197 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f336002fa10 con 0x7f33741060c0
2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.089+0000 7f3371573640 1 -- 192.168.123.103:0/1805742197 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0
0) 0x7f336002fb70 con 0x7f33741060c0 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.089+0000 7f3371573640 1 -- 192.168.123.103:0/1805742197 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f33600355b0 con 0x7f33741060c0 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.090+0000 7f3378ed8640 1 -- 192.168.123.103:0/1805742197 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33741060c0 msgr2=0x7f33741064c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.090+0000 7f3378ed8640 1 --2- 192.168.123.103:0/1805742197 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33741060c0 0x7f33741064c0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f3360009b80 tx=0x7f336002f190 comp rx=0 tx=0).stop 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.091+0000 7f3378ed8640 1 -- 192.168.123.103:0/1805742197 shutdown_connections 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.091+0000 7f3378ed8640 1 --2- 192.168.123.103:0/1805742197 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33741060c0 0x7f33741064c0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.091+0000 7f3378ed8640 1 -- 192.168.123.103:0/1805742197 >> 192.168.123.103:0/1805742197 conn(0x7f33741018b0 msgr2=0x7f3374103cd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.091+0000 7f3378ed8640 1 -- 192.168.123.103:0/1805742197 shutdown_connections 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.091+0000 7f3378ed8640 1 -- 192.168.123.103:0/1805742197 wait complete. 
2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.091+0000 7f3378ed8640 1 Processor -- start 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.092+0000 7f3378ed8640 1 -- start start 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.092+0000 7f3378ed8640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33741060c0 0x7f337419e140 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.092+0000 7f3378ed8640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f337419e680 con 0x7f33741060c0 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.092+0000 7f3372575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33741060c0 0x7f337419e140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.092+0000 7f3372575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33741060c0 0x7f337419e140 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39256/0 (socket says 192.168.123.103:39256) 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.092+0000 7f3372575640 1 -- 192.168.123.103:0/96422844 learned_addr learned my addr 192.168.123.103:0/96422844 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.093+0000 7f3372575640 1 -- 192.168.123.103:0/96422844 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f33600095d0 con 0x7f33741060c0 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.093+0000 7f3372575640 1 --2- 192.168.123.103:0/96422844 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33741060c0 0x7f337419e140 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f3360037670 tx=0x7f33600376a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.094+0000 7f335f7fe640 1 -- 192.168.123.103:0/96422844 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f336002fe20 con 0x7f33741060c0 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.094+0000 7f335f7fe640 1 -- 192.168.123.103:0/96422844 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3360041400 con 0x7f33741060c0 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.094+0000 7f335f7fe640 1 -- 192.168.123.103:0/96422844 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f336003f530 con 0x7f33741060c0 2026-03-09T16:09:59.397 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.094+0000 7f3378ed8640 1 -- 192.168.123.103:0/96422844 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f337419e880 con 0x7f33741060c0 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.094+0000 7f3378ed8640 1 -- 192.168.123.103:0/96422844 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f337419ec60 con 0x7f33741060c0 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.095+0000 7f335f7fe640 1 -- 192.168.123.103:0/96422844 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 10) v1 ==== 49277+0+0 (secure 0 0 0) 0x7f336003e030 con 0x7f33741060c0 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.095+0000 7f3378ed8640 1 -- 192.168.123.103:0/96422844 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3374106540 con 0x7f33741060c0 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.095+0000 7f335f7fe640 1 --2- 192.168.123.103:0/96422844 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f334c03d160 0x7f334c03f620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:59.397 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.095+0000 7f335f7fe640 1 -- 192.168.123.103:0/96422844 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f3360076b00 con 0x7f33741060c0 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.096+0000 7f3371d74640 1 --2- 192.168.123.103:0/96422844 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f334c03d160 0x7f334c03f620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.097+0000 7f3371d74640 1 --2- 192.168.123.103:0/96422844 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f334c03d160 0x7f334c03f620 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f33680099c0 tx=0x7f3368006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.099+0000 7f335f7fe640 1 -- 192.168.123.103:0/96422844 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f336003ed50 con 0x7f33741060c0 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.201+0000 7f3378ed8640 1 -- 192.168.123.103:0/96422844 --> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] -- mgr_command(tid 0: {"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}) v1 -- 0x7f33740621d0 con 0x7f334c03d160 2026-03-09T16:09:59.398 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.361+0000 7f335f7fe640 1 -- 192.168.123.103:0/96422844 <== mgr.14162 v2:192.168.123.103:6800/3405276359 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+252 (secure 0 0 0) 0x7f33740621d0 con 0x7f334c03d160 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.364+0000 7f3378ed8640 1 -- 192.168.123.103:0/96422844 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f334c03d160 msgr2=0x7f334c03f620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.364+0000 7f3378ed8640 1 --2- 192.168.123.103:0/96422844 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f334c03d160 0x7f334c03f620 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f33680099c0 tx=0x7f3368006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.364+0000 7f3378ed8640 1 -- 192.168.123.103:0/96422844 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33741060c0 msgr2=0x7f337419e140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.364+0000 7f3378ed8640 1 --2- 192.168.123.103:0/96422844 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33741060c0 0x7f337419e140 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f3360037670 tx=0x7f33600376a0 comp rx=0 tx=0).stop 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.365+0000 7f3378ed8640 1 -- 192.168.123.103:0/96422844 shutdown_connections 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.365+0000 7f3378ed8640 1 --2- 192.168.123.103:0/96422844 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f334c03d160 0x7f334c03f620 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.365+0000 7f3378ed8640 1 --2- 192.168.123.103:0/96422844 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33741060c0 0x7f337419e140 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.365+0000 7f3378ed8640 1 -- 192.168.123.103:0/96422844 >> 192.168.123.103:0/96422844 conn(0x7f33741018b0 msgr2=0x7f3374102260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.365+0000 7f3378ed8640 1 -- 192.168.123.103:0/96422844 shutdown_connections 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.365+0000 7f3378ed8640 1 -- 192.168.123.103:0/96422844 wait complete. 2026-03-09T16:09:59.398 INFO:teuthology.orchestra.run.vm03.stdout:Fetching dashboard port number... 
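[editorial note] The mgr_command above is cephadm seeding the initial dashboard account ("dashboard ac-user-create" for user admin). For reference, the same account can be recreated by hand against a running cluster; this is only a sketch, the password file path is illustrative, and flag names should be checked against the installed release:
    # write the desired password to a file; recent releases read it via -i rather than from the command line
    echo -n 'ia58k5lsnx' > /tmp/dashboard_password
    # recreate the admin account with the administrator role, forcing the password and requiring a change on first login
    ceph dashboard ac-user-create --force-password --pwd-update-required admin -i /tmp/dashboard_password administrator
    rm /tmp/dashboard_password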
2026-03-09T16:09:59.622 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:59 vm03 ceph-mon[51019]: [09/Mar/2026:16:09:58] ENGINE Bus STARTING 2026-03-09T16:09:59.623 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:59 vm03 ceph-mon[51019]: [09/Mar/2026:16:09:58] ENGINE Serving on https://192.168.123.103:7150 2026-03-09T16:09:59.623 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:59 vm03 ceph-mon[51019]: [09/Mar/2026:16:09:58] ENGINE Client ('192.168.123.103', 46394) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T16:09:59.623 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:59 vm03 ceph-mon[51019]: [09/Mar/2026:16:09:58] ENGINE Serving on http://192.168.123.103:8765 2026-03-09T16:09:59.623 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:59 vm03 ceph-mon[51019]: [09/Mar/2026:16:09:58] ENGINE Bus STARTED 2026-03-09T16:09:59.623 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:59 vm03 ceph-mon[51019]: from='client.14166 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-09T16:09:59.623 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:59 vm03 ceph-mon[51019]: from='client.14166 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-09T16:09:59.623 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:59 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:59.623 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:59 vm03 ceph-mon[51019]: from='client.14174 -' entity='client.admin' cmd=[{"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:09:59.623 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:59 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:59.623 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:59 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:59.623 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:09:59 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:09:59.663 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 8443 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.505+0000 7f21ca14d640 1 Processor -- start 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.505+0000 7f21ca14d640 1 -- start start 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.505+0000 7f21ca14d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21c41082f0 0x7f21c41086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.505+0000 7f21ca14d640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f21c4108cc0 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.505+0000 7f21c37fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21c41082f0 0x7f21c41086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.505+0000 7f21c37fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21c41082f0 0x7f21c41086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39260/0 (socket says 192.168.123.103:39260) 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.505+0000 7f21c37fe640 1 -- 192.168.123.103:0/1496337294 learned_addr learned my addr 192.168.123.103:0/1496337294 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.506+0000 7f21c37fe640 1 -- 192.168.123.103:0/1496337294 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f21c4109490 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.506+0000 7f21c37fe640 1 --2- 192.168.123.103:0/1496337294 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21c41082f0 0x7f21c41086f0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f21b4009b30 tx=0x7f21b402f140 comp rx=0 tx=0).ready entity=mon.0 client_cookie=6ba064c9aff1f563 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.507+0000 7f21c27fc640 1 -- 192.168.123.103:0/1496337294 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f21b402fbd0 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.507+0000 7f21c27fc640 1 -- 192.168.123.103:0/1496337294 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f21b402fd30 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.507+0000 7f21ca14d640 1 -- 192.168.123.103:0/1496337294 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21c41082f0 msgr2=0x7f21c41086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.507+0000 7f21ca14d640 1 --2- 192.168.123.103:0/1496337294 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21c41082f0 0x7f21c41086f0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f21b4009b30 tx=0x7f21b402f140 comp rx=0 tx=0).stop 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.507+0000 7f21ca14d640 1 -- 192.168.123.103:0/1496337294 shutdown_connections 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.507+0000 7f21ca14d640 1 --2- 192.168.123.103:0/1496337294 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21c41082f0 0x7f21c41086f0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.507+0000 7f21ca14d640 1 -- 192.168.123.103:0/1496337294 >> 192.168.123.103:0/1496337294 conn(0x7f21c407b8c0 msgr2=0x7f21c41066a0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.507+0000 7f21ca14d640 1 -- 192.168.123.103:0/1496337294 shutdown_connections 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.507+0000 7f21ca14d640 1 -- 192.168.123.103:0/1496337294 wait complete. 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.508+0000 7f21ca14d640 1 Processor -- start 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.508+0000 7f21ca14d640 1 -- start start 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.508+0000 7f21ca14d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21c41082f0 0x7f21c419e3b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.508+0000 7f21ca14d640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f21b40355e0 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.508+0000 7f21c37fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21c41082f0 0x7f21c419e3b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.508+0000 7f21c37fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21c41082f0 0x7f21c419e3b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39274/0 (socket says 192.168.123.103:39274) 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.508+0000 7f21c37fe640 1 -- 192.168.123.103:0/2219442571 learned_addr learned my addr 192.168.123.103:0/2219442571 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.509+0000 7f21c37fe640 1 -- 192.168.123.103:0/2219442571 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f21b40095d0 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.509+0000 7f21c37fe640 1 --2- 192.168.123.103:0/2219442571 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21c41082f0 0x7f21c419e3b0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f21b402f6f0 tx=0x7f21b4037670 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.509+0000 7f21c0ff9640 1 -- 192.168.123.103:0/2219442571 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f21b4037970 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.509+0000 7f21c0ff9640 1 -- 192.168.123.103:0/2219442571 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 
==== 1108+0+0 (secure 0 0 0) 0x7f21b4040450 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.509+0000 7f21c0ff9640 1 -- 192.168.123.103:0/2219442571 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f21b403f510 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.509+0000 7f21ca14d640 1 -- 192.168.123.103:0/2219442571 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f21c419e8f0 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.509+0000 7f21ca14d640 1 -- 192.168.123.103:0/2219442571 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f21c419ed90 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.510+0000 7f21c0ff9640 1 -- 192.168.123.103:0/2219442571 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 10) v1 ==== 49277+0+0 (secure 0 0 0) 0x7f21b403e030 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.510+0000 7f21ca14d640 1 -- 192.168.123.103:0/2219442571 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2190005350 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.510+0000 7f21c0ff9640 1 --2- 192.168.123.103:0/2219442571 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f219803d110 0x7f219803f5d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.510+0000 7f21c0ff9640 1 -- 192.168.123.103:0/2219442571 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f21b4075890 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.513+0000 7f21c2ffd640 1 --2- 192.168.123.103:0/2219442571 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f219803d110 0x7f219803f5d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.513+0000 7f21c2ffd640 1 --2- 192.168.123.103:0/2219442571 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f219803d110 0x7f219803f5d0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f21b00099c0 tx=0x7f21b0006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.513+0000 7f21c0ff9640 1 -- 192.168.123.103:0/2219442571 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f21b403eaf0 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.621+0000 7f21ca14d640 1 -- 
192.168.123.103:0/2219442571 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"} v 0) v1 -- 0x7f2190005b80 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.623+0000 7f21c0ff9640 1 -- 192.168.123.103:0/2219442571 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]=0 v8) v1 ==== 112+0+5 (secure 0 0 0) 0x7f21b403c070 con 0x7f21c41082f0 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.626+0000 7f21ca14d640 1 -- 192.168.123.103:0/2219442571 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f219803d110 msgr2=0x7f219803f5d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.626+0000 7f21ca14d640 1 --2- 192.168.123.103:0/2219442571 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f219803d110 0x7f219803f5d0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f21b00099c0 tx=0x7f21b0006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.626+0000 7f21ca14d640 1 -- 192.168.123.103:0/2219442571 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21c41082f0 msgr2=0x7f21c419e3b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.626+0000 7f21ca14d640 1 --2- 192.168.123.103:0/2219442571 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21c41082f0 0x7f21c419e3b0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f21b402f6f0 tx=0x7f21b4037670 comp rx=0 tx=0).stop 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.626+0000 7f21ca14d640 1 -- 192.168.123.103:0/2219442571 shutdown_connections 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.626+0000 7f21ca14d640 1 --2- 192.168.123.103:0/2219442571 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f219803d110 0x7f219803f5d0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.626+0000 7f21ca14d640 1 --2- 192.168.123.103:0/2219442571 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21c41082f0 0x7f21c419e3b0 secure :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f21b402f6f0 tx=0x7f21b4037670 comp rx=0 tx=0).stop 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.626+0000 7f21ca14d640 1 -- 192.168.123.103:0/2219442571 >> 192.168.123.103:0/2219442571 conn(0x7f21c407b8c0 msgr2=0x7f21c4105e40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.626+0000 7f21ca14d640 1 -- 192.168.123.103:0/2219442571 shutdown_connections 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.626+0000 7f21ca14d640 1 -- 192.168.123.103:0/2219442571 wait complete. 
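[editorial note] The exchange above is the "Fetching dashboard port number..." step: cephadm asks the mon for mgr/dashboard/ssl_server_port and gets 8443 back. A minimal sketch of repeating that lookup by hand from the bootstrap host, reusing the fsid and key paths from this run:
    # open a shell inside the cluster container (same invocation cephadm prints at the end of bootstrap)
    sudo /home/ubuntu/cephtest/cephadm shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc \
        -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring
    # inside the shell: query the dashboard SSL port (8443 in this run)
    ceph config get mgr mgr/dashboard/ssl_server_port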
2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:firewalld does not appear to be present 2026-03-09T16:09:59.664 INFO:teuthology.orchestra.run.vm03.stdout:Not possible to open ports <[8443]>. firewalld.service is not available 2026-03-09T16:09:59.665 INFO:teuthology.orchestra.run.vm03.stdout:Ceph Dashboard is now available at: 2026-03-09T16:09:59.665 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:09:59.665 INFO:teuthology.orchestra.run.vm03.stdout: URL: https://vm03.local:8443/ 2026-03-09T16:09:59.665 INFO:teuthology.orchestra.run.vm03.stdout: User: admin 2026-03-09T16:09:59.665 INFO:teuthology.orchestra.run.vm03.stdout: Password: ia58k5lsnx 2026-03-09T16:09:59.665 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:09:59.665 INFO:teuthology.orchestra.run.vm03.stdout:Saving cluster configuration to /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config directory 2026-03-09T16:09:59.665 INFO:teuthology.orchestra.run.vm03.stdout:Enabling autotune for osd_memory_target 2026-03-09T16:09:59.967 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.796+0000 7f527c101640 1 Processor -- start 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.797+0000 7f527c101640 1 -- start start 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.797+0000 7f527c101640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5274106130 0x7f5274106530 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.797+0000 7f527c101640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5274106b00 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.797+0000 7f5279e76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5274106130 0x7f5274106530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.797+0000 7f5279e76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5274106130 0x7f5274106530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39282/0 (socket says 192.168.123.103:39282) 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.797+0000 7f5279e76640 1 -- 192.168.123.103:0/4165382660 learned_addr learned my addr 192.168.123.103:0/4165382660 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.797+0000 7f5279e76640 1 -- 192.168.123.103:0/4165382660 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5274107290 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.798+0000 7f5279e76640 1 --2- 192.168.123.103:0/4165382660 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5274106130 0x7f5274106530 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto 
rx=0x7f5268013420 tx=0x7f52680378b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=7136c4ca408a3e69 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.798+0000 7f5278e74640 1 -- 192.168.123.103:0/4165382660 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f526800fd20 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.798+0000 7f5278e74640 1 -- 192.168.123.103:0/4165382660 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f526803a050 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.799+0000 7f527c101640 1 -- 192.168.123.103:0/4165382660 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5274106130 msgr2=0x7f5274106530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.799+0000 7f527c101640 1 --2- 192.168.123.103:0/4165382660 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5274106130 0x7f5274106530 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f5268013420 tx=0x7f52680378b0 comp rx=0 tx=0).stop 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.799+0000 7f527c101640 1 -- 192.168.123.103:0/4165382660 shutdown_connections 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.799+0000 7f527c101640 1 --2- 192.168.123.103:0/4165382660 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5274106130 0x7f5274106530 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.799+0000 7f527c101640 1 -- 192.168.123.103:0/4165382660 >> 192.168.123.103:0/4165382660 conn(0x7f52741018e0 msgr2=0x7f5274103d00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.799+0000 7f527c101640 1 -- 192.168.123.103:0/4165382660 shutdown_connections 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.799+0000 7f527c101640 1 -- 192.168.123.103:0/4165382660 wait complete. 
2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.801+0000 7f527c101640 1 Processor -- start 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.801+0000 7f527c101640 1 -- start start 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.801+0000 7f527c101640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5274106130 0x7f5274199cd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.801+0000 7f527c101640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f527419a210 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.802+0000 7f5279e76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5274106130 0x7f5274199cd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.802+0000 7f5279e76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5274106130 0x7f5274199cd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39296/0 (socket says 192.168.123.103:39296) 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.802+0000 7f5279e76640 1 -- 192.168.123.103:0/1547455839 learned_addr learned my addr 192.168.123.103:0/1547455839 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.802+0000 7f5279e76640 1 -- 192.168.123.103:0/1547455839 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f52680130d0 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.802+0000 7f5279e76640 1 --2- 192.168.123.103:0/1547455839 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5274106130 0x7f5274199cd0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f5268013550 tx=0x7f526800ff50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.802+0000 7f5262ffd640 1 -- 192.168.123.103:0/1547455839 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f526803ea70 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.802+0000 7f527c101640 1 -- 192.168.123.103:0/1547455839 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f527419a410 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.802+0000 7f527c101640 1 -- 192.168.123.103:0/1547455839 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f527419a8b0 con 0x7f5274106130 2026-03-09T16:09:59.968 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.803+0000 7f5262ffd640 1 -- 192.168.123.103:0/1547455839 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5268003bd0 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.803+0000 7f5262ffd640 1 -- 192.168.123.103:0/1547455839 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5268011590 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.803+0000 7f5262ffd640 1 -- 192.168.123.103:0/1547455839 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 10) v1 ==== 49277+0+0 (secure 0 0 0) 0x7f52680116f0 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.804+0000 7f5262ffd640 1 --2- 192.168.123.103:0/1547455839 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f525003d110 0x7f525003f5d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.804+0000 7f5279675640 1 --2- 192.168.123.103:0/1547455839 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f525003d110 0x7f525003f5d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.804+0000 7f5262ffd640 1 -- 192.168.123.103:0/1547455839 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f526807fbb0 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.805+0000 7f5279675640 1 --2- 192.168.123.103:0/1547455839 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f525003d110 0x7f525003f5d0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f52640099c0 tx=0x7f5264006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.806+0000 7f527c101640 1 -- 192.168.123.103:0/1547455839 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f523c005350 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.809+0000 7f5262ffd640 1 -- 192.168.123.103:0/1547455839 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f5268047b40 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.911+0000 7f527c101640 1 -- 192.168.123.103:0/1547455839 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=osd_memory_target_autotune}] v 0) v1 -- 0x7f523c005b80 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.912+0000 7f5262ffd640 1 -- 192.168.123.103:0/1547455839 <== mon.0 
v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=osd_memory_target_autotune}]=0 v8) v1 ==== 127+0+0 (secure 0 0 0) 0x7f5268047360 con 0x7f5274106130 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.914+0000 7f527c101640 1 -- 192.168.123.103:0/1547455839 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f525003d110 msgr2=0x7f525003f5d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.914+0000 7f527c101640 1 --2- 192.168.123.103:0/1547455839 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f525003d110 0x7f525003f5d0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f52640099c0 tx=0x7f5264006eb0 comp rx=0 tx=0).stop 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.914+0000 7f527c101640 1 -- 192.168.123.103:0/1547455839 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5274106130 msgr2=0x7f5274199cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.914+0000 7f527c101640 1 --2- 192.168.123.103:0/1547455839 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5274106130 0x7f5274199cd0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f5268013550 tx=0x7f526800ff50 comp rx=0 tx=0).stop 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.914+0000 7f527c101640 1 -- 192.168.123.103:0/1547455839 shutdown_connections 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.914+0000 7f527c101640 1 --2- 192.168.123.103:0/1547455839 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f525003d110 0x7f525003f5d0 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.915+0000 7f527c101640 1 --2- 192.168.123.103:0/1547455839 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5274106130 0x7f5274199cd0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.915+0000 7f527c101640 1 -- 192.168.123.103:0/1547455839 >> 192.168.123.103:0/1547455839 conn(0x7f52741018e0 msgr2=0x7f5274102e20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.915+0000 7f527c101640 1 -- 192.168.123.103:0/1547455839 shutdown_connections 2026-03-09T16:09:59.968 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:09:59.915+0000 7f527c101640 1 -- 192.168.123.103:0/1547455839 wait complete. 
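[editorial note] The mon_command acknowledged above backs the "Enabling autotune for osd_memory_target" message: bootstrap switches on osd_memory_target_autotune so cephadm sizes OSD memory from host RAM instead of a fixed value. The target section of the config set is not visible in this trace; cephadm normally applies it to the osd section, so the equivalent CLI is roughly:
    # sketch; the 'osd' section is assumed, not shown in the trace above
    ceph config set osd osd_memory_target_autotune true
    # the fraction of host memory handed to OSDs is a separate cephadm option (name assumed; verify on your release)
    ceph config get mgr mgr/cephadm/autotune_memory_target_ratio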
2026-03-09T16:10:00.306 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.096+0000 7efd2e0bb640 1 Processor -- start 2026-03-09T16:10:00.306 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.096+0000 7efd2e0bb640 1 -- start start 2026-03-09T16:10:00.306 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.096+0000 7efd2e0bb640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd28103cf0 0x7efd281040f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:00.306 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.096+0000 7efd2e0bb640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd28104630 con 0x7efd28103cf0 2026-03-09T16:10:00.306 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.097+0000 7efd277fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd28103cf0 0x7efd281040f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:00.307 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.097+0000 7efd277fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd28103cf0 0x7efd281040f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39310/0 (socket says 192.168.123.103:39310) 2026-03-09T16:10:00.307 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.097+0000 7efd277fe640 1 -- 192.168.123.103:0/4092911193 learned_addr learned my addr 192.168.123.103:0/4092911193 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.097+0000 7efd277fe640 1 -- 192.168.123.103:0/4092911193 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efd28104770 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.098+0000 7efd277fe640 1 --2- 192.168.123.103:0/4092911193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd28103cf0 0x7efd281040f0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7efd18009920 tx=0x7efd1802ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=4335cb02f4708368 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.098+0000 7efd267fc640 1 -- 192.168.123.103:0/4092911193 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efd1802f9b0 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.098+0000 7efd267fc640 1 -- 192.168.123.103:0/4092911193 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7efd18037440 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.098+0000 7efd267fc640 1 -- 192.168.123.103:0/4092911193 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efd18035560 con 0x7efd28103cf0 
2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.099+0000 7efd2e0bb640 1 -- 192.168.123.103:0/4092911193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd28103cf0 msgr2=0x7efd281040f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.099+0000 7efd2e0bb640 1 --2- 192.168.123.103:0/4092911193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd28103cf0 0x7efd281040f0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7efd18009920 tx=0x7efd1802ef20 comp rx=0 tx=0).stop 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.099+0000 7efd2e0bb640 1 -- 192.168.123.103:0/4092911193 shutdown_connections 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.099+0000 7efd2e0bb640 1 --2- 192.168.123.103:0/4092911193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd28103cf0 0x7efd281040f0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.099+0000 7efd2e0bb640 1 -- 192.168.123.103:0/4092911193 >> 192.168.123.103:0/4092911193 conn(0x7efd280ff960 msgr2=0x7efd28101dc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.099+0000 7efd2e0bb640 1 -- 192.168.123.103:0/4092911193 shutdown_connections 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.099+0000 7efd2e0bb640 1 -- 192.168.123.103:0/4092911193 wait complete. 
2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.100+0000 7efd2e0bb640 1 Processor -- start 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.100+0000 7efd2e0bb640 1 -- start start 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.100+0000 7efd2e0bb640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd28103cf0 0x7efd2819bed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.100+0000 7efd2e0bb640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd2819c410 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.100+0000 7efd277fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd28103cf0 0x7efd2819bed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.100+0000 7efd277fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd28103cf0 0x7efd2819bed0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39312/0 (socket says 192.168.123.103:39312) 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.100+0000 7efd277fe640 1 -- 192.168.123.103:0/2938494201 learned_addr learned my addr 192.168.123.103:0/2938494201 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.101+0000 7efd277fe640 1 -- 192.168.123.103:0/2938494201 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efd180095d0 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.101+0000 7efd277fe640 1 --2- 192.168.123.103:0/2938494201 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd28103cf0 0x7efd2819bed0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7efd18037d90 tx=0x7efd18037990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.101+0000 7efd24ff9640 1 -- 192.168.123.103:0/2938494201 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efd1802fe40 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.101+0000 7efd24ff9640 1 -- 192.168.123.103:0/2938494201 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7efd18035ce0 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.101+0000 7efd24ff9640 1 -- 192.168.123.103:0/2938494201 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efd1803fd80 con 0x7efd28103cf0 2026-03-09T16:10:00.308 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.101+0000 7efd2e0bb640 1 -- 192.168.123.103:0/2938494201 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efd2819c610 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.102+0000 7efd2e0bb640 1 -- 192.168.123.103:0/2938494201 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efd2819cab0 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.103+0000 7efd24ff9640 1 -- 192.168.123.103:0/2938494201 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 10) v1 ==== 49277+0+0 (secure 0 0 0) 0x7efd1803e070 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.103+0000 7efd24ff9640 1 --2- 192.168.123.103:0/2938494201 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7efd00041620 0x7efd00043ae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.103+0000 7efd24ff9640 1 -- 192.168.123.103:0/2938494201 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7efd18076470 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.103+0000 7efd26ffd640 1 --2- 192.168.123.103:0/2938494201 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7efd00041620 0x7efd00043ae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.103+0000 7efd2e0bb640 1 -- 192.168.123.103:0/2938494201 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efcf4005350 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.105+0000 7efd26ffd640 1 --2- 192.168.123.103:0/2938494201 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7efd00041620 0x7efd00043ae0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7efd14009a10 tx=0x7efd14006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.106+0000 7efd24ff9640 1 -- 192.168.123.103:0/2938494201 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7efd1803c080 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.240+0000 7efd2e0bb640 1 -- 192.168.123.103:0/2938494201 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config-key set, key=mgr/dashboard/cluster/status}] v 0) v1 -- 0x7efcf4005b80 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.246+0000 7efd24ff9640 1 -- 192.168.123.103:0/2938494201 <== mon.0 
v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config-key set, key=mgr/dashboard/cluster/status}]=0 set mgr/dashboard/cluster/status v34) v1 ==== 153+0+0 (secure 0 0 0) 0x7efd180730c0 con 0x7efd28103cf0 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr set mgr/dashboard/cluster/status 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.249+0000 7efd2e0bb640 1 -- 192.168.123.103:0/2938494201 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7efd00041620 msgr2=0x7efd00043ae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.249+0000 7efd2e0bb640 1 --2- 192.168.123.103:0/2938494201 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7efd00041620 0x7efd00043ae0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7efd14009a10 tx=0x7efd14006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.249+0000 7efd2e0bb640 1 -- 192.168.123.103:0/2938494201 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd28103cf0 msgr2=0x7efd2819bed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.249+0000 7efd2e0bb640 1 --2- 192.168.123.103:0/2938494201 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd28103cf0 0x7efd2819bed0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7efd18037d90 tx=0x7efd18037990 comp rx=0 tx=0).stop 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.250+0000 7efd2e0bb640 1 -- 192.168.123.103:0/2938494201 shutdown_connections 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.250+0000 7efd2e0bb640 1 --2- 192.168.123.103:0/2938494201 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7efd00041620 0x7efd00043ae0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.250+0000 7efd2e0bb640 1 --2- 192.168.123.103:0/2938494201 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd28103cf0 0x7efd2819bed0 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.250+0000 7efd2e0bb640 1 -- 192.168.123.103:0/2938494201 >> 192.168.123.103:0/2938494201 conn(0x7efd280ff960 msgr2=0x7efd281003d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.250+0000 7efd2e0bb640 1 -- 192.168.123.103:0/2938494201 shutdown_connections 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-09T16:10:00.250+0000 7efd2e0bb640 1 -- 192.168.123.103:0/2938494201 wait complete. 
2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:You can access the Ceph CLI as following in case of multi-cluster or non-default config: 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout: sudo /home/ubuntu/cephtest/cephadm shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout:Or, if you are only running a single cluster on this host: 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout: sudo /home/ubuntu/cephtest/cephadm shell 2026-03-09T16:10:00.308 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:10:00.309 INFO:teuthology.orchestra.run.vm03.stdout:Please consider enabling telemetry to help improve Ceph: 2026-03-09T16:10:00.309 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:10:00.309 INFO:teuthology.orchestra.run.vm03.stdout: ceph telemetry on 2026-03-09T16:10:00.309 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:10:00.309 INFO:teuthology.orchestra.run.vm03.stdout:For more information see: 2026-03-09T16:10:00.309 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:10:00.309 INFO:teuthology.orchestra.run.vm03.stdout: https://docs.ceph.com/en/latest/mgr/telemetry/ 2026-03-09T16:10:00.309 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:10:00.309 INFO:teuthology.orchestra.run.vm03.stdout:Bootstrap complete. 2026-03-09T16:10:00.340 INFO:tasks.cephadm:Fetching config... 2026-03-09T16:10:00.341 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:10:00.341 DEBUG:teuthology.orchestra.run.vm03:> dd if=/etc/ceph/ceph.conf of=/dev/stdout 2026-03-09T16:10:00.402 INFO:tasks.cephadm:Fetching client.admin keyring... 2026-03-09T16:10:00.402 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:10:00.402 DEBUG:teuthology.orchestra.run.vm03:> dd if=/etc/ceph/ceph.client.admin.keyring of=/dev/stdout 2026-03-09T16:10:00.459 INFO:tasks.cephadm:Fetching mon keyring... 2026-03-09T16:10:00.474 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:10:00.474 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/keyring of=/dev/stdout 2026-03-09T16:10:00.533 INFO:tasks.cephadm:Fetching pub ssh key... 2026-03-09T16:10:00.533 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:10:00.533 DEBUG:teuthology.orchestra.run.vm03:> dd if=/home/ubuntu/cephtest/ceph.pub of=/dev/stdout 2026-03-09T16:10:00.602 INFO:tasks.cephadm:Installing pub ssh key for root users... 
2026-03-09T16:10:00.602 DEBUG:teuthology.orchestra.run.vm03:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKhDbemk21YojNt8ejP70oZxwYFsPVN4U87El3w1TeUZ ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-09T16:10:00.700 INFO:teuthology.orchestra.run.vm03.stdout:ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKhDbemk21YojNt8ejP70oZxwYFsPVN4U87El3w1TeUZ ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:10:00.713 DEBUG:teuthology.orchestra.run.vm05:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKhDbemk21YojNt8ejP70oZxwYFsPVN4U87El3w1TeUZ ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-09T16:10:00.757 INFO:teuthology.orchestra.run.vm05.stdout:ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKhDbemk21YojNt8ejP70oZxwYFsPVN4U87El3w1TeUZ ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:10:00.770 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph config set mgr mgr/cephadm/allow_ptrace true 2026-03-09T16:10:00.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:00 vm03 ceph-mon[51019]: from='client.14176 -' entity='client.admin' cmd=[{"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:10:00.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:00 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/2219442571' entity='client.admin' cmd=[{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]: dispatch 2026-03-09T16:10:00.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:00 vm03 ceph-mon[51019]: from='client.? 
192.168.123.103:0/2938494201' entity='client.admin' 2026-03-09T16:10:00.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:00 vm03 ceph-mon[51019]: mgrmap e11: vm03.gbgzmu(active, since 3s) 2026-03-09T16:10:00.977 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:10:01.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.290+0000 7f52ff7c6640 1 -- 192.168.123.103:0/4231516098 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52f80ff690 msgr2=0x7f52f80ffa90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:01.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.290+0000 7f52ff7c6640 1 --2- 192.168.123.103:0/4231516098 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52f80ff690 0x7f52f80ffa90 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f52f400b0a0 tx=0x7f52f402f530 comp rx=0 tx=0).stop 2026-03-09T16:10:01.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.292+0000 7f52ff7c6640 1 -- 192.168.123.103:0/4231516098 shutdown_connections 2026-03-09T16:10:01.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.292+0000 7f52ff7c6640 1 --2- 192.168.123.103:0/4231516098 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52f80ff690 0x7f52f80ffa90 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:01.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.292+0000 7f52ff7c6640 1 -- 192.168.123.103:0/4231516098 >> 192.168.123.103:0/4231516098 conn(0x7f52f80f9fb0 msgr2=0x7f52f80fc3f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:01.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.292+0000 7f52ff7c6640 1 -- 192.168.123.103:0/4231516098 shutdown_connections 2026-03-09T16:10:01.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.292+0000 7f52ff7c6640 1 -- 192.168.123.103:0/4231516098 wait complete. 
2026-03-09T16:10:01.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.293+0000 7f52ff7c6640 1 Processor -- start 2026-03-09T16:10:01.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.293+0000 7f52ff7c6640 1 -- start start 2026-03-09T16:10:01.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.293+0000 7f52ff7c6640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52f80ff690 0x7f52f8111b50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:01.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.293+0000 7f52ff7c6640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52f8112090 con 0x7f52f80ff690 2026-03-09T16:10:01.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.293+0000 7f52fd53b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52f80ff690 0x7f52f8111b50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:01.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.293+0000 7f52fd53b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52f80ff690 0x7f52f8111b50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39336/0 (socket says 192.168.123.103:39336) 2026-03-09T16:10:01.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.293+0000 7f52fd53b640 1 -- 192.168.123.103:0/582600435 learned_addr learned my addr 192.168.123.103:0/582600435 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:10:01.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.293+0000 7f52fd53b640 1 -- 192.168.123.103:0/582600435 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f52f4009d00 con 0x7f52f80ff690 2026-03-09T16:10:01.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.294+0000 7f52fd53b640 1 --2- 192.168.123.103:0/582600435 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52f80ff690 0x7f52f8111b50 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f52f4004710 tx=0x7f52f4009510 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:01.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.294+0000 7f52e67fc640 1 -- 192.168.123.103:0/582600435 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f52f4004910 con 0x7f52f80ff690 2026-03-09T16:10:01.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.294+0000 7f52ff7c6640 1 -- 192.168.123.103:0/582600435 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f52f8112290 con 0x7f52f80ff690 2026-03-09T16:10:01.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.294+0000 7f52ff7c6640 1 -- 192.168.123.103:0/582600435 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f52f810e690 con 0x7f52f80ff690 2026-03-09T16:10:01.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.295+0000 7f52e67fc640 1 -- 192.168.123.103:0/582600435 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f52f4007d40 con 0x7f52f80ff690 2026-03-09T16:10:01.296 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.295+0000 7f52e67fc640 1 -- 192.168.123.103:0/582600435 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f52f4042a30 con 0x7f52f80ff690 2026-03-09T16:10:01.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.295+0000 7f52e67fc640 1 -- 192.168.123.103:0/582600435 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 11) v1 ==== 49383+0+0 (secure 0 0 0) 0x7f52f404c430 con 0x7f52f80ff690 2026-03-09T16:10:01.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.295+0000 7f52e67fc640 1 --2- 192.168.123.103:0/582600435 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f52cc03d2d0 0x7f52cc03f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:01.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.296+0000 7f52e67fc640 1 -- 192.168.123.103:0/582600435 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f52f403e070 con 0x7f52f80ff690 2026-03-09T16:10:01.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.296+0000 7f52fcd3a640 1 --2- 192.168.123.103:0/582600435 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f52cc03d2d0 0x7f52cc03f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:01.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.296+0000 7f52ff7c6640 1 -- 192.168.123.103:0/582600435 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f52f80ffa90 con 0x7f52f80ff690 2026-03-09T16:10:01.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.296+0000 7f52fcd3a640 1 --2- 192.168.123.103:0/582600435 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f52cc03d2d0 0x7f52cc03f790 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f52e80099c0 tx=0x7f52e8006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:01.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.299+0000 7f52e67fc640 1 -- 192.168.123.103:0/582600435 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f52f4047070 con 0x7f52f80ff690 2026-03-09T16:10:01.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.404+0000 7f52ff7c6640 1 -- 192.168.123.103:0/582600435 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/allow_ptrace}] v 0) v1 -- 0x7f52f8103f10 con 0x7f52f80ff690 2026-03-09T16:10:01.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.414+0000 7f52e67fc640 1 -- 192.168.123.103:0/582600435 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/allow_ptrace}]=0 v9)=0 v9) v1 ==== 125+0+0 (secure 0 0 0) 0x7f52f8103f10 con 0x7f52f80ff690 2026-03-09T16:10:01.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.417+0000 7f52ff7c6640 1 -- 192.168.123.103:0/582600435 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f52cc03d2d0 msgr2=0x7f52cc03f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:01.418 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.417+0000 7f52ff7c6640 1 --2- 192.168.123.103:0/582600435 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f52cc03d2d0 0x7f52cc03f790 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f52e80099c0 tx=0x7f52e8006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:01.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.417+0000 7f52ff7c6640 1 -- 192.168.123.103:0/582600435 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52f80ff690 msgr2=0x7f52f8111b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:01.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.417+0000 7f52ff7c6640 1 --2- 192.168.123.103:0/582600435 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52f80ff690 0x7f52f8111b50 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f52f4004710 tx=0x7f52f4009510 comp rx=0 tx=0).stop 2026-03-09T16:10:01.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.417+0000 7f52ff7c6640 1 -- 192.168.123.103:0/582600435 shutdown_connections 2026-03-09T16:10:01.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.417+0000 7f52ff7c6640 1 --2- 192.168.123.103:0/582600435 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f52cc03d2d0 0x7f52cc03f790 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:01.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.417+0000 7f52ff7c6640 1 --2- 192.168.123.103:0/582600435 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52f80ff690 0x7f52f8111b50 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:01.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.417+0000 7f52ff7c6640 1 -- 192.168.123.103:0/582600435 >> 192.168.123.103:0/582600435 conn(0x7f52f80f9fb0 msgr2=0x7f52f80fac00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:01.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.418+0000 7f52ff7c6640 1 -- 192.168.123.103:0/582600435 shutdown_connections 2026-03-09T16:10:01.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:01.418+0000 7f52ff7c6640 1 -- 192.168.123.103:0/582600435 wait complete. 
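Note: the config change issued above, ceph config set mgr mgr/cephadm/allow_ptrace true, was acknowledged by mon.0 with return code 0 (the mon_command_ack a few entries back). The setting lets ptrace-based debugging tools attach to processes inside the cephadm-managed containers, which test environments typically enable for post-mortem debugging. A sketch of the pattern, with the --image, -c and -k flags from the full invocation omitted for brevity (the config get readback is an assumption and is not part of this run):
    sudo /home/ubuntu/cephtest/cephadm shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- \
        ceph config set mgr mgr/cephadm/allow_ptrace true
    sudo /home/ubuntu/cephtest/cephadm shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- \
        ceph config get mgr mgr/cephadm/allow_ptrace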
2026-03-09T16:10:01.488 INFO:tasks.cephadm:Distributing conf and client.admin keyring to all hosts + 0755 2026-03-09T16:10:01.488 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph orch client-keyring set client.admin '*' --mode 0755 2026-03-09T16:10:01.759 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:10:02.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.062+0000 7f93584db640 1 -- 192.168.123.103:0/3249556198 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9350102480 msgr2=0x7f9350102880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:02.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.062+0000 7f93584db640 1 --2- 192.168.123.103:0/3249556198 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9350102480 0x7f9350102880 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f93400099b0 tx=0x7f934002f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:02.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.063+0000 7f93584db640 1 -- 192.168.123.103:0/3249556198 shutdown_connections 2026-03-09T16:10:02.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.063+0000 7f93584db640 1 --2- 192.168.123.103:0/3249556198 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9350102480 0x7f9350102880 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:02.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.063+0000 7f93584db640 1 -- 192.168.123.103:0/3249556198 >> 192.168.123.103:0/3249556198 conn(0x7f93500fdca0 msgr2=0x7f9350100090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:02.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.063+0000 7f93584db640 1 -- 192.168.123.103:0/3249556198 shutdown_connections 2026-03-09T16:10:02.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.063+0000 7f93584db640 1 -- 192.168.123.103:0/3249556198 wait complete. 
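Note: the orchestrator call above, ceph orch client-keyring set client.admin '*' --mode 0755, registers the admin keyring as a cephadm-managed client keyring; the placement '*' asks for it to be kept in sync on every managed host, with the 0755 mode requested by the task. A sketch of the pattern (the client-keyring ls readback is an assumption and does not appear in this run):
    sudo /home/ubuntu/cephtest/cephadm shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- \
        ceph orch client-keyring set client.admin '*' --mode 0755
    sudo /home/ubuntu/cephtest/cephadm shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- \
        ceph orch client-keyring ls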
2026-03-09T16:10:02.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.064+0000 7f93584db640 1 Processor -- start 2026-03-09T16:10:02.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.064+0000 7f93584db640 1 -- start start 2026-03-09T16:10:02.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.064+0000 7f93584db640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9350102480 0x7f93501997f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:02.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.064+0000 7f93584db640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9350199d30 con 0x7f9350102480 2026-03-09T16:10:02.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.064+0000 7f9356250640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9350102480 0x7f93501997f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:02.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.064+0000 7f9356250640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9350102480 0x7f93501997f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39358/0 (socket says 192.168.123.103:39358) 2026-03-09T16:10:02.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.064+0000 7f9356250640 1 -- 192.168.123.103:0/2057795926 learned_addr learned my addr 192.168.123.103:0/2057795926 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:10:02.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.064+0000 7f9356250640 1 -- 192.168.123.103:0/2057795926 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9340009660 con 0x7f9350102480 2026-03-09T16:10:02.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.065+0000 7f9356250640 1 --2- 192.168.123.103:0/2057795926 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9350102480 0x7f93501997f0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f934002f860 tx=0x7f9340004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:02.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.065+0000 7f933f7fe640 1 -- 192.168.123.103:0/2057795926 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f93400043b0 con 0x7f9350102480 2026-03-09T16:10:02.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.065+0000 7f933f7fe640 1 -- 192.168.123.103:0/2057795926 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9340038b40 con 0x7f9350102480 2026-03-09T16:10:02.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.065+0000 7f93584db640 1 -- 192.168.123.103:0/2057795926 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9350199f30 con 0x7f9350102480 2026-03-09T16:10:02.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.065+0000 7f933f7fe640 1 -- 192.168.123.103:0/2057795926 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f93400418f0 con 0x7f9350102480 
2026-03-09T16:10:02.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.065+0000 7f93584db640 1 -- 192.168.123.103:0/2057795926 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f935019a3d0 con 0x7f9350102480 2026-03-09T16:10:02.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.066+0000 7f933f7fe640 1 -- 192.168.123.103:0/2057795926 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 11) v1 ==== 49383+0+0 (secure 0 0 0) 0x7f9340038cb0 con 0x7f9350102480 2026-03-09T16:10:02.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.066+0000 7f93584db640 1 -- 192.168.123.103:0/2057795926 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9350102900 con 0x7f9350102480 2026-03-09T16:10:02.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.066+0000 7f933f7fe640 1 --2- 192.168.123.103:0/2057795926 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f932403d1e0 0x7f932403f6a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:02.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.067+0000 7f9355a4f640 1 --2- 192.168.123.103:0/2057795926 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f932403d1e0 0x7f932403f6a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:02.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.067+0000 7f9355a4f640 1 --2- 192.168.123.103:0/2057795926 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f932403d1e0 0x7f932403f6a0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f93440099c0 tx=0x7f9344006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:02.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.067+0000 7f933f7fe640 1 -- 192.168.123.103:0/2057795926 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f93400760e0 con 0x7f9350102480 2026-03-09T16:10:02.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.069+0000 7f933f7fe640 1 -- 192.168.123.103:0/2057795926 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f93400aa820 con 0x7f9350102480 2026-03-09T16:10:02.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.174+0000 7f93584db640 1 -- 192.168.123.103:0/2057795926 --> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] -- mgr_command(tid 0: {"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}) v1 -- 0x7f93500fffc0 con 0x7f932403d1e0 2026-03-09T16:10:02.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.178+0000 7f933f7fe640 1 -- 192.168.123.103:0/2057795926 <== mgr.14162 v2:192.168.123.103:6800/3405276359 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f93500fffc0 con 0x7f932403d1e0 2026-03-09T16:10:02.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.180+0000 7f93584db640 1 -- 192.168.123.103:0/2057795926 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f932403d1e0 msgr2=0x7f932403f6a0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:02.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.180+0000 7f93584db640 1 --2- 192.168.123.103:0/2057795926 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f932403d1e0 0x7f932403f6a0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f93440099c0 tx=0x7f9344006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:02.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.181+0000 7f93584db640 1 -- 192.168.123.103:0/2057795926 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9350102480 msgr2=0x7f93501997f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:02.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.181+0000 7f93584db640 1 --2- 192.168.123.103:0/2057795926 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9350102480 0x7f93501997f0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f934002f860 tx=0x7f9340004270 comp rx=0 tx=0).stop 2026-03-09T16:10:02.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.181+0000 7f93584db640 1 -- 192.168.123.103:0/2057795926 shutdown_connections 2026-03-09T16:10:02.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.181+0000 7f93584db640 1 --2- 192.168.123.103:0/2057795926 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f932403d1e0 0x7f932403f6a0 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:02.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.181+0000 7f93584db640 1 --2- 192.168.123.103:0/2057795926 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9350102480 0x7f93501997f0 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:02.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.181+0000 7f93584db640 1 -- 192.168.123.103:0/2057795926 >> 192.168.123.103:0/2057795926 conn(0x7f93500fdca0 msgr2=0x7f93500fe8a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:02.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.181+0000 7f93584db640 1 -- 192.168.123.103:0/2057795926 shutdown_connections 2026-03-09T16:10:02.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.181+0000 7f93584db640 1 -- 192.168.123.103:0/2057795926 wait complete. 2026-03-09T16:10:02.232 INFO:tasks.cephadm:Writing (initial) conf and keyring to vm05 2026-03-09T16:10:02.232 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-09T16:10:02.232 DEBUG:teuthology.orchestra.run.vm05:> dd of=/etc/ceph/ceph.conf 2026-03-09T16:10:02.251 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-09T16:10:02.252 DEBUG:teuthology.orchestra.run.vm05:> dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:10:02.313 INFO:tasks.cephadm:Adding host vm05 to orchestrator... 2026-03-09T16:10:02.314 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph orch host add vm05 2026-03-09T16:10:02.483 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:10:02.526 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:02 vm03 ceph-mon[51019]: from='client.? 
192.168.123.103:0/582600435' entity='client.admin' 2026-03-09T16:10:02.526 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:02 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:02.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.886+0000 7fe9fbfff640 1 -- 192.168.123.103:0/4234321209 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9fc0fe070 msgr2=0x7fe9fc0fe470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:02.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.886+0000 7fe9fbfff640 1 --2- 192.168.123.103:0/4234321209 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9fc0fe070 0x7fe9fc0fe470 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7fe9ec00b0a0 tx=0x7fe9ec02f530 comp rx=0 tx=0).stop 2026-03-09T16:10:02.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.891+0000 7fe9fbfff640 1 -- 192.168.123.103:0/4234321209 shutdown_connections 2026-03-09T16:10:02.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.891+0000 7fe9fbfff640 1 --2- 192.168.123.103:0/4234321209 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9fc0fe070 0x7fe9fc0fe470 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:02.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.891+0000 7fe9fbfff640 1 -- 192.168.123.103:0/4234321209 >> 192.168.123.103:0/4234321209 conn(0x7fe9fc0f9b60 msgr2=0x7fe9fc0fbf80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:02.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.892+0000 7fe9fbfff640 1 -- 192.168.123.103:0/4234321209 shutdown_connections 2026-03-09T16:10:02.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.892+0000 7fe9fbfff640 1 -- 192.168.123.103:0/4234321209 wait complete. 
2026-03-09T16:10:02.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.892+0000 7fe9fbfff640 1 Processor -- start 2026-03-09T16:10:02.896 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.895+0000 7fe9fbfff640 1 -- start start 2026-03-09T16:10:02.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.896+0000 7fe9fbfff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9fc0fe070 0x7fe9fc105cd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:02.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.896+0000 7fe9fbfff640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe9fc106210 con 0x7fe9fc0fe070 2026-03-09T16:10:02.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.896+0000 7fe9faffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9fc0fe070 0x7fe9fc105cd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:02.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.896+0000 7fe9faffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9fc0fe070 0x7fe9fc105cd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39376/0 (socket says 192.168.123.103:39376) 2026-03-09T16:10:02.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.896+0000 7fe9faffd640 1 -- 192.168.123.103:0/197673908 learned_addr learned my addr 192.168.123.103:0/197673908 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:10:02.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.899+0000 7fe9faffd640 1 -- 192.168.123.103:0/197673908 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe9ec009d00 con 0x7fe9fc0fe070 2026-03-09T16:10:02.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.899+0000 7fe9faffd640 1 --2- 192.168.123.103:0/197673908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9fc0fe070 0x7fe9fc105cd0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7fe9ec002790 tx=0x7fe9ec0049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:02.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.899+0000 7fea00a21640 1 -- 192.168.123.103:0/197673908 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe9ec004050 con 0x7fe9fc0fe070 2026-03-09T16:10:02.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.899+0000 7fe9fbfff640 1 -- 192.168.123.103:0/197673908 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe9fc106410 con 0x7fe9fc0fe070 2026-03-09T16:10:02.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.899+0000 7fea00a21640 1 -- 192.168.123.103:0/197673908 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe9ec007d40 con 0x7fe9fc0fe070 2026-03-09T16:10:02.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.899+0000 7fe9fbfff640 1 -- 192.168.123.103:0/197673908 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe9fc1068b0 con 0x7fe9fc0fe070 2026-03-09T16:10:02.901 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.900+0000 7fea00a21640 1 -- 192.168.123.103:0/197673908 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe9ec004050 con 0x7fe9fc0fe070 2026-03-09T16:10:02.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.900+0000 7fe9fbfff640 1 -- 192.168.123.103:0/197673908 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe9c0005350 con 0x7fe9fc0fe070 2026-03-09T16:10:02.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.901+0000 7fea00a21640 1 -- 192.168.123.103:0/197673908 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 11) v1 ==== 49383+0+0 (secure 0 0 0) 0x7fe9ec0041b0 con 0x7fe9fc0fe070 2026-03-09T16:10:02.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.902+0000 7fea00a21640 1 --2- 192.168.123.103:0/197673908 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe9d003d230 0x7fe9d003f6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:02.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.902+0000 7fea00a21640 1 -- 192.168.123.103:0/197673908 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fe9ec03e070 con 0x7fe9fc0fe070 2026-03-09T16:10:02.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.902+0000 7fe9fa7fc640 1 --2- 192.168.123.103:0/197673908 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe9d003d230 0x7fe9d003f6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:02.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.902+0000 7fe9fa7fc640 1 --2- 192.168.123.103:0/197673908 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe9d003d230 0x7fe9d003f6f0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7fe9f00099c0 tx=0x7fe9f0006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:02.906 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:02.905+0000 7fea00a21640 1 -- 192.168.123.103:0/197673908 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fe9ec037c90 con 0x7fe9fc0fe070 2026-03-09T16:10:03.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:03.017+0000 7fe9fbfff640 1 -- 192.168.123.103:0/197673908 --> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm05", "target": ["mon-mgr", ""]}) v1 -- 0x7fe9c0002bf0 con 0x7fe9d003d230 2026-03-09T16:10:03.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:03.902+0000 7fea00a21640 1 -- 192.168.123.103:0/197673908 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7fe9ec0041b0 con 0x7fe9fc0fe070 2026-03-09T16:10:04.334 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:03 vm03 ceph-mon[51019]: from='client.14186 -' entity='client.admin' cmd=[{"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:10:04.334 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:03 vm03 
ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:04.334 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:03 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:04.334 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:03 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:10:04.334 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:03 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:04.334 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:03 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:10:04.334 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:03 vm03 ceph-mon[51019]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T16:10:04.334 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:03 vm03 ceph-mon[51019]: from='client.14188 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm05", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:10:04.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:04.513+0000 7fea00a21640 1 -- 192.168.123.103:0/197673908 <== mgr.14162 v2:192.168.123.103:6800/3405276359 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7fe9c0002bf0 con 0x7fe9d003d230 2026-03-09T16:10:04.517 INFO:teuthology.orchestra.run.vm03.stdout:Added host 'vm05' with addr '192.168.123.105' 2026-03-09T16:10:04.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:04.516+0000 7fe9fbfff640 1 -- 192.168.123.103:0/197673908 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe9d003d230 msgr2=0x7fe9d003f6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:04.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:04.516+0000 7fe9fbfff640 1 --2- 192.168.123.103:0/197673908 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe9d003d230 0x7fe9d003f6f0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7fe9f00099c0 tx=0x7fe9f0006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:04.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:04.516+0000 7fe9fbfff640 1 -- 192.168.123.103:0/197673908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9fc0fe070 msgr2=0x7fe9fc105cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:04.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:04.516+0000 7fe9fbfff640 1 --2- 192.168.123.103:0/197673908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9fc0fe070 0x7fe9fc105cd0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7fe9ec002790 tx=0x7fe9ec0049e0 comp rx=0 tx=0).stop 2026-03-09T16:10:04.518 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:04.516+0000 7fe9fbfff640 1 -- 192.168.123.103:0/197673908 shutdown_connections 2026-03-09T16:10:04.518 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:04.516+0000 7fe9fbfff640 1 --2- 192.168.123.103:0/197673908 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe9d003d230 0x7fe9d003f6f0 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:04.518 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:04.516+0000 7fe9fbfff640 1 --2- 192.168.123.103:0/197673908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9fc0fe070 0x7fe9fc105cd0 secure :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7fe9ec002790 tx=0x7fe9ec0049e0 comp rx=0 tx=0).stop 2026-03-09T16:10:04.518 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:04.516+0000 7fe9fbfff640 1 -- 192.168.123.103:0/197673908 >> 192.168.123.103:0/197673908 conn(0x7fe9fc0f9b60 msgr2=0x7fe9fc0fa760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:04.518 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:04.517+0000 7fe9fbfff640 1 -- 192.168.123.103:0/197673908 shutdown_connections 2026-03-09T16:10:04.518 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:04.517+0000 7fe9fbfff640 1 -- 192.168.123.103:0/197673908 wait complete. 2026-03-09T16:10:04.611 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph orch host ls --format=json 2026-03-09T16:10:04.953 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:10:05.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:04 vm03 ceph-mon[51019]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:10:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:04 vm03 ceph-mon[51019]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:10:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:04 vm03 ceph-mon[51019]: Deploying cephadm binary to vm05 2026-03-09T16:10:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:04 vm03 ceph-mon[51019]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:10:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:04 vm03 ceph-mon[51019]: mgrmap e12: vm03.gbgzmu(active, since 6s) 2026-03-09T16:10:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:04 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:04 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:04 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:04 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:10:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:04 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T16:10:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:04 vm03 
ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:04 vm03 ceph-mon[51019]: Deploying daemon ceph-exporter.vm03 on vm03 2026-03-09T16:10:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:04 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:05.261 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.260+0000 7fc677fff640 1 -- 192.168.123.103:0/4019349106 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc678071990 msgr2=0x7fc678071d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:05.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.260+0000 7fc6767fc640 1 -- 192.168.123.103:0/4019349106 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc66002fb30 con 0x7fc678071990 2026-03-09T16:10:05.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.260+0000 7fc677fff640 1 --2- 192.168.123.103:0/4019349106 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc678071990 0x7fc678071d70 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fc6600099b0 tx=0x7fc66002f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:05.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.260+0000 7fc677fff640 1 -- 192.168.123.103:0/4019349106 shutdown_connections 2026-03-09T16:10:05.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.260+0000 7fc677fff640 1 --2- 192.168.123.103:0/4019349106 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc678071990 0x7fc678071d70 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:05.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.260+0000 7fc677fff640 1 -- 192.168.123.103:0/4019349106 >> 192.168.123.103:0/4019349106 conn(0x7fc67806b190 msgr2=0x7fc67806b5a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:05.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.260+0000 7fc677fff640 1 -- 192.168.123.103:0/4019349106 shutdown_connections 2026-03-09T16:10:05.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.260+0000 7fc677fff640 1 -- 192.168.123.103:0/4019349106 wait complete. 
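Note: by this point the orchestrator has reported Added host 'vm05' with addr '192.168.123.105', so the second node is in the inventory; the cluster ssh key installed into root's authorized_keys on vm05 earlier is what lets the active mgr reach it. A sketch of the host-add pattern (the explicit address and label arguments in the second form are assumptions; this run passed only the hostname):
    sudo /home/ubuntu/cephtest/cephadm shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- \
        ceph orch host add vm05
    sudo /home/ubuntu/cephtest/cephadm shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- \
        ceph orch host add vm05 192.168.123.105 _admin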
2026-03-09T16:10:05.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.261+0000 7fc677fff640 1 Processor -- start 2026-03-09T16:10:05.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.261+0000 7fc677fff640 1 -- start start 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.261+0000 7fc677fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc678071990 0x7fc6781a69e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.261+0000 7fc677fff640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6781a6f20 con 0x7fc678071990 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.261+0000 7fc676ffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc678071990 0x7fc6781a69e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.261+0000 7fc676ffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc678071990 0x7fc6781a69e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39414/0 (socket says 192.168.123.103:39414) 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.261+0000 7fc676ffd640 1 -- 192.168.123.103:0/2643611366 learned_addr learned my addr 192.168.123.103:0/2643611366 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.262+0000 7fc676ffd640 1 -- 192.168.123.103:0/2643611366 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc660009660 con 0x7fc678071990 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.262+0000 7fc676ffd640 1 --2- 192.168.123.103:0/2643611366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc678071990 0x7fc6781a69e0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7fc660009980 tx=0x7fc660031d80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.262+0000 7fc674ff9640 1 -- 192.168.123.103:0/2643611366 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc66003c050 con 0x7fc678071990 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.262+0000 7fc677fff640 1 -- 192.168.123.103:0/2643611366 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc6781a7120 con 0x7fc678071990 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.262+0000 7fc677fff640 1 -- 192.168.123.103:0/2643611366 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc6781a7560 con 0x7fc678071990 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.263+0000 7fc674ff9640 1 -- 192.168.123.103:0/2643611366 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc66003d040 con 0x7fc678071990 
2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.263+0000 7fc674ff9640 1 -- 192.168.123.103:0/2643611366 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc660038470 con 0x7fc678071990 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.263+0000 7fc674ff9640 1 -- 192.168.123.103:0/2643611366 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7fc660038690 con 0x7fc678071990 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.263+0000 7fc674ff9640 1 --2- 192.168.123.103:0/2643611366 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fc64403d000 0x7fc64403f4c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.263+0000 7fc674ff9640 1 -- 192.168.123.103:0/2643611366 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fc660077250 con 0x7fc678071990 2026-03-09T16:10:05.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.263+0000 7fc66e5ff640 1 --2- 192.168.123.103:0/2643611366 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fc64403d000 0x7fc64403f4c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:05.267 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.264+0000 7fc66e5ff640 1 --2- 192.168.123.103:0/2643611366 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fc64403d000 0x7fc64403f4c0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7fc6580099c0 tx=0x7fc658006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:05.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.264+0000 7fc677fff640 1 -- 192.168.123.103:0/2643611366 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc63c005350 con 0x7fc678071990 2026-03-09T16:10:05.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.267+0000 7fc674ff9640 1 -- 192.168.123.103:0/2643611366 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fc660049960 con 0x7fc678071990 2026-03-09T16:10:05.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.409+0000 7fc677fff640 1 -- 192.168.123.103:0/2643611366 --> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7fc63c002bf0 con 0x7fc64403d000 2026-03-09T16:10:05.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.411+0000 7fc674ff9640 1 -- 192.168.123.103:0/2643611366 <== mgr.14162 v2:192.168.123.103:6800/3405276359 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+155 (secure 0 0 0) 0x7fc63c002bf0 con 0x7fc64403d000 2026-03-09T16:10:05.413 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:10:05.413 INFO:teuthology.orchestra.run.vm03.stdout:[{"addr": "192.168.123.103", "hostname": "vm03", "labels": [], "status": ""}, {"addr": "192.168.123.105", "hostname": "vm05", "labels": [], "status": ""}] 2026-03-09T16:10:05.416 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.415+0000 7fc677fff640 1 -- 192.168.123.103:0/2643611366 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fc64403d000 msgr2=0x7fc64403f4c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:05.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.415+0000 7fc677fff640 1 --2- 192.168.123.103:0/2643611366 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fc64403d000 0x7fc64403f4c0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7fc6580099c0 tx=0x7fc658006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:05.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.415+0000 7fc677fff640 1 -- 192.168.123.103:0/2643611366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc678071990 msgr2=0x7fc6781a69e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:05.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.415+0000 7fc677fff640 1 --2- 192.168.123.103:0/2643611366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc678071990 0x7fc6781a69e0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7fc660009980 tx=0x7fc660031d80 comp rx=0 tx=0).stop 2026-03-09T16:10:05.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.415+0000 7fc677fff640 1 -- 192.168.123.103:0/2643611366 shutdown_connections 2026-03-09T16:10:05.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.415+0000 7fc677fff640 1 --2- 192.168.123.103:0/2643611366 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fc64403d000 0x7fc64403f4c0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:05.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.415+0000 7fc677fff640 1 --2- 192.168.123.103:0/2643611366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc678071990 0x7fc6781a69e0 secure :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7fc660009980 tx=0x7fc660031d80 comp rx=0 tx=0).stop 2026-03-09T16:10:05.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.415+0000 7fc677fff640 1 -- 192.168.123.103:0/2643611366 >> 192.168.123.103:0/2643611366 conn(0x7fc67806b190 msgr2=0x7fc67806eb50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:05.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.417+0000 7fc677fff640 1 -- 192.168.123.103:0/2643611366 shutdown_connections 2026-03-09T16:10:05.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:05.418+0000 7fc677fff640 1 -- 192.168.123.103:0/2643611366 wait complete. 
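Note: the host listing above returns one JSON object per host with addr, hostname, labels and status fields; both vm03 and vm05 are present, confirming the add. A sketch of pulling just the hostnames out of that output (the python3 one-liner is an illustration, not something this run executes):
    sudo /home/ubuntu/cephtest/cephadm shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- \
        ceph orch host ls --format=json \
        | python3 -c 'import json,sys; print(" ".join(h["hostname"] for h in json.load(sys.stdin)))'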
2026-03-09T16:10:05.616 INFO:tasks.cephadm:Setting crush tunables to default 2026-03-09T16:10:05.616 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd crush tunables default 2026-03-09T16:10:05.889 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:10:06.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.221+0000 7f7afa872640 1 -- 192.168.123.103:0/1919987982 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7aec0a48b0 msgr2=0x7f7aec0a4c90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:06.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.221+0000 7f7afa872640 1 --2- 192.168.123.103:0/1919987982 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7aec0a48b0 0x7f7aec0a4c90 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f7af0009a00 tx=0x7f7af002f310 comp rx=0 tx=0).stop 2026-03-09T16:10:06.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.221+0000 7f7afa872640 1 -- 192.168.123.103:0/1919987982 shutdown_connections 2026-03-09T16:10:06.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.221+0000 7f7afa872640 1 --2- 192.168.123.103:0/1919987982 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7aec0a48b0 0x7f7aec0a4c90 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:06.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.221+0000 7f7afa872640 1 -- 192.168.123.103:0/1919987982 >> 192.168.123.103:0/1919987982 conn(0x7f7aec01a150 msgr2=0x7f7aec01a560 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:06.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.221+0000 7f7afa872640 1 -- 192.168.123.103:0/1919987982 shutdown_connections 2026-03-09T16:10:06.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.221+0000 7f7afa872640 1 -- 192.168.123.103:0/1919987982 wait complete. 
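Note: the call above, ceph osd crush tunables default, applies the default CRUSH tunables profile before any workload runs; the monitor's acknowledgement further down reports adjusted tunables profile to default. A sketch, including an optional inspection step (the show-tunables readback is an assumption and is not part of this run):
    sudo /home/ubuntu/cephtest/cephadm shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- \
        ceph osd crush tunables default
    sudo /home/ubuntu/cephtest/cephadm shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- \
        ceph osd crush show-tunables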
2026-03-09T16:10:06.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.221+0000 7f7afa872640 1 Processor -- start 2026-03-09T16:10:06.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.221+0000 7f7afa872640 1 -- start start 2026-03-09T16:10:06.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.221+0000 7f7afa872640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7aec15cac0 0x7f7aec159ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:06.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.221+0000 7f7afa872640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7af0002e30 con 0x7f7aec15cac0 2026-03-09T16:10:06.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.222+0000 7f7af9870640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7aec15cac0 0x7f7aec159ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:06.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.222+0000 7f7af9870640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7aec15cac0 0x7f7aec159ba0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39438/0 (socket says 192.168.123.103:39438) 2026-03-09T16:10:06.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.222+0000 7f7af9870640 1 -- 192.168.123.103:0/4185990574 learned_addr learned my addr 192.168.123.103:0/4185990574 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:10:06.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.222+0000 7f7af9870640 1 -- 192.168.123.103:0/4185990574 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7af0009660 con 0x7f7aec15cac0 2026-03-09T16:10:06.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.222+0000 7f7af9870640 1 --2- 192.168.123.103:0/4185990574 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7aec15cac0 0x7f7aec159ba0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f7af0004400 tx=0x7f7af0004430 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:06.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.223+0000 7f7aeaffd640 1 -- 192.168.123.103:0/4185990574 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7af00044d0 con 0x7f7aec15cac0 2026-03-09T16:10:06.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.223+0000 7f7afa872640 1 -- 192.168.123.103:0/4185990574 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7aec15a0e0 con 0x7f7aec15cac0 2026-03-09T16:10:06.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.223+0000 7f7afa872640 1 -- 192.168.123.103:0/4185990574 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7aec15a560 con 0x7f7aec15cac0 2026-03-09T16:10:06.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.223+0000 7f7aeaffd640 1 -- 192.168.123.103:0/4185990574 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f7af0038b50 con 0x7f7aec15cac0 
2026-03-09T16:10:06.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.223+0000 7f7aeaffd640 1 -- 192.168.123.103:0/4185990574 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7af0040a70 con 0x7f7aec15cac0 2026-03-09T16:10:06.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.225+0000 7f7aeaffd640 1 -- 192.168.123.103:0/4185990574 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f7af003f070 con 0x7f7aec15cac0 2026-03-09T16:10:06.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.225+0000 7f7aeaffd640 1 --2- 192.168.123.103:0/4185990574 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f7ad403d320 0x7f7ad403f7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:06.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.225+0000 7f7aeaffd640 1 -- 192.168.123.103:0/4185990574 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f7af0075b20 con 0x7f7aec15cac0 2026-03-09T16:10:06.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.225+0000 7f7af906f640 1 --2- 192.168.123.103:0/4185990574 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f7ad403d320 0x7f7ad403f7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:06.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.225+0000 7f7af906f640 1 --2- 192.168.123.103:0/4185990574 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f7ad403d320 0x7f7ad403f7e0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f7af4066a90 tx=0x7f7af4067070 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:06.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.226+0000 7f7afa872640 1 -- 192.168.123.103:0/4185990574 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7ab8005350 con 0x7f7aec15cac0 2026-03-09T16:10:06.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.229+0000 7f7aeaffd640 1 -- 192.168.123.103:0/4185990574 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f7af003d070 con 0x7f7aec15cac0 2026-03-09T16:10:06.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:06 vm03 ceph-mon[51019]: Added host vm05 2026-03-09T16:10:06.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:06 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:06.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:06 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:06.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:06 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:06.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:06 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:06.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:06 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' 
entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:10:06.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:06 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-09T16:10:06.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:06 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:06.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:06 vm03 ceph-mon[51019]: Deploying daemon crash.vm03 on vm03 2026-03-09T16:10:06.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:06 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:06.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:06 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:06.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:06 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:06.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:06 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:06.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:06.334+0000 7f7afa872640 1 -- 192.168.123.103:0/4185990574 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd crush tunables", "profile": "default"} v 0) v1 -- 0x7f7ab80051c0 con 0x7f7aec15cac0 2026-03-09T16:10:07.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:07.010+0000 7f7aeaffd640 1 -- 192.168.123.103:0/4185990574 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd crush tunables", "profile": "default"}]=0 adjusted tunables profile to default v4) v1 ==== 124+0+0 (secure 0 0 0) 0x7f7af0038690 con 0x7f7aec15cac0 2026-03-09T16:10:07.014 INFO:teuthology.orchestra.run.vm03.stderr:adjusted tunables profile to default 2026-03-09T16:10:07.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:07.014+0000 7f7afa872640 1 -- 192.168.123.103:0/4185990574 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f7ad403d320 msgr2=0x7f7ad403f7e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:07.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:07.014+0000 7f7afa872640 1 --2- 192.168.123.103:0/4185990574 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f7ad403d320 0x7f7ad403f7e0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f7af4066a90 tx=0x7f7af4067070 comp rx=0 tx=0).stop 2026-03-09T16:10:07.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:07.015+0000 7f7afa872640 1 -- 192.168.123.103:0/4185990574 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7aec15cac0 msgr2=0x7f7aec159ba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:07.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:07.015+0000 7f7afa872640 1 --2- 192.168.123.103:0/4185990574 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7aec15cac0 0x7f7aec159ba0 secure :-1 
s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f7af0004400 tx=0x7f7af0004430 comp rx=0 tx=0).stop 2026-03-09T16:10:07.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:07.015+0000 7f7afa872640 1 -- 192.168.123.103:0/4185990574 shutdown_connections 2026-03-09T16:10:07.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:07.015+0000 7f7afa872640 1 --2- 192.168.123.103:0/4185990574 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f7ad403d320 0x7f7ad403f7e0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:07.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:07.015+0000 7f7afa872640 1 --2- 192.168.123.103:0/4185990574 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7aec15cac0 0x7f7aec159ba0 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:07.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:07.015+0000 7f7afa872640 1 -- 192.168.123.103:0/4185990574 >> 192.168.123.103:0/4185990574 conn(0x7f7aec01a150 msgr2=0x7f7aec0a3640 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:07.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:07.015+0000 7f7afa872640 1 -- 192.168.123.103:0/4185990574 shutdown_connections 2026-03-09T16:10:07.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:07.015+0000 7f7afa872640 1 -- 192.168.123.103:0/4185990574 wait complete. 2026-03-09T16:10:07.095 INFO:tasks.cephadm:Adding mon.vm03 on vm03 2026-03-09T16:10:07.096 INFO:tasks.cephadm:Adding mon.vm05 on vm05 2026-03-09T16:10:07.096 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph orch apply mon '2;vm03:192.168.123.103=vm03;vm05:192.168.123.105=vm05' 2026-03-09T16:10:07.249 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:07.294 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:07.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:07 vm03 ceph-mon[51019]: from='client.14191 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T16:10:07.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:07 vm03 ceph-mon[51019]: Deploying daemon node-exporter.vm03 on vm03 2026-03-09T16:10:07.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:07 vm03 ceph-mon[51019]: from='client.? 
192.168.123.103:0/4185990574' entity='client.admin' cmd=[{"prefix": "osd crush tunables", "profile": "default"}]: dispatch 2026-03-09T16:10:08.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.380+0000 7f0035213640 1 -- 192.168.123.105:0/3249512681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0030102470 msgr2=0x7f0030102870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:08.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.380+0000 7f0035213640 1 --2- 192.168.123.105:0/3249512681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0030102470 0x7f0030102870 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f00200099b0 tx=0x7f002002f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:08.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.380+0000 7f0035213640 1 -- 192.168.123.105:0/3249512681 shutdown_connections 2026-03-09T16:10:08.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.380+0000 7f0035213640 1 --2- 192.168.123.105:0/3249512681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0030102470 0x7f0030102870 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:08.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.380+0000 7f0035213640 1 -- 192.168.123.105:0/3249512681 >> 192.168.123.105:0/3249512681 conn(0x7f00300fdca0 msgr2=0x7f0030100090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:08.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.381+0000 7f0035213640 1 -- 192.168.123.105:0/3249512681 shutdown_connections 2026-03-09T16:10:08.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.381+0000 7f0035213640 1 -- 192.168.123.105:0/3249512681 wait complete. 
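[editor's note] The "ceph orch apply mon" call above passes an explicit placement string of the form "<count>;host:addr=name;...". A hedged sketch of assembling that string from a host map, assuming the illustrative cephadm_shell() helper from the earlier sketch:

    # Sketch only: rebuild the placement argument seen above from a host map.
    mon_hosts = {"vm03": "192.168.123.103", "vm05": "192.168.123.105"}
    placement = "%d;%s" % (
        len(mon_hosts),
        ";".join("%s:%s=%s" % (host, addr, host) for host, addr in mon_hosts.items()),
    )
    assert placement == "2;vm03:192.168.123.103=vm03;vm05:192.168.123.105=vm05"
    # cephadm_shell("ceph", "orch", "apply", "mon", placement)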
2026-03-09T16:10:08.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.381+0000 7f0035213640 1 Processor -- start 2026-03-09T16:10:08.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.381+0000 7f0035213640 1 -- start start 2026-03-09T16:10:08.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.382+0000 7f0035213640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0030102470 0x7f00301997c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:08.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.382+0000 7f0035213640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0030199d00 con 0x7f0030102470 2026-03-09T16:10:08.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.382+0000 7f002ffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0030102470 0x7f00301997c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:08.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.382+0000 7f002ffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0030102470 0x7f00301997c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:58772/0 (socket says 192.168.123.105:58772) 2026-03-09T16:10:08.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.382+0000 7f002ffff640 1 -- 192.168.123.105:0/1286288296 learned_addr learned my addr 192.168.123.105:0/1286288296 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:08.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.382+0000 7f002ffff640 1 -- 192.168.123.105:0/1286288296 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0020009660 con 0x7f0030102470 2026-03-09T16:10:08.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.382+0000 7f002ffff640 1 --2- 192.168.123.105:0/1286288296 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0030102470 0x7f00301997c0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f002002f860 tx=0x7f0020004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:08.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.383+0000 7f002d7fa640 1 -- 192.168.123.105:0/1286288296 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f00200043b0 con 0x7f0030102470 2026-03-09T16:10:08.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.383+0000 7f002d7fa640 1 -- 192.168.123.105:0/1286288296 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0020038b40 con 0x7f0030102470 2026-03-09T16:10:08.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.383+0000 7f002d7fa640 1 -- 192.168.123.105:0/1286288296 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f00200418f0 con 0x7f0030102470 2026-03-09T16:10:08.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.383+0000 7f0035213640 1 -- 192.168.123.105:0/1286288296 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0030199f00 con 0x7f0030102470 
2026-03-09T16:10:08.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.383+0000 7f0035213640 1 -- 192.168.123.105:0/1286288296 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f003019a3a0 con 0x7f0030102470 2026-03-09T16:10:08.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.384+0000 7f002d7fa640 1 -- 192.168.123.105:0/1286288296 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f0020038cb0 con 0x7f0030102470 2026-03-09T16:10:08.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.384+0000 7f0035213640 1 -- 192.168.123.105:0/1286288296 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efffc005350 con 0x7f0030102470 2026-03-09T16:10:08.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.384+0000 7f002d7fa640 1 --2- 192.168.123.105:0/1286288296 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f001003d280 0x7f001003f740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:08.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.384+0000 7f002d7fa640 1 -- 192.168.123.105:0/1286288296 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f0020076150 con 0x7f0030102470 2026-03-09T16:10:08.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.384+0000 7f002f7fe640 1 --2- 192.168.123.105:0/1286288296 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f001003d280 0x7f001003f740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:08.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.384+0000 7f002f7fe640 1 --2- 192.168.123.105:0/1286288296 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f001003d280 0x7f001003f740 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f001c0099c0 tx=0x7f001c006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:08.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.388+0000 7f002d7fa640 1 -- 192.168.123.105:0/1286288296 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f0020041400 con 0x7f0030102470 2026-03-09T16:10:08.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:08 vm03 ceph-mon[51019]: from='client.? 
192.168.123.103:0/4185990574' entity='client.admin' cmd='[{"prefix": "osd crush tunables", "profile": "default"}]': finished 2026-03-09T16:10:08.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:08 vm03 ceph-mon[51019]: osdmap e4: 0 total, 0 up, 0 in 2026-03-09T16:10:08.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:08 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:08.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.484+0000 7f0035213640 1 -- 192.168.123.105:0/1286288296 --> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "placement": "2;vm03:192.168.123.103=vm03;vm05:192.168.123.105=vm05", "target": ["mon-mgr", ""]}) v1 -- 0x7efffc002bf0 con 0x7f001003d280 2026-03-09T16:10:08.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.490+0000 7f002d7fa640 1 -- 192.168.123.105:0/1286288296 <== mgr.14162 v2:192.168.123.103:6800/3405276359 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7efffc002bf0 con 0x7f001003d280 2026-03-09T16:10:08.491 INFO:teuthology.orchestra.run.vm05.stdout:Scheduled mon update... 2026-03-09T16:10:08.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.492+0000 7f0035213640 1 -- 192.168.123.105:0/1286288296 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f001003d280 msgr2=0x7f001003f740 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:08.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.492+0000 7f0035213640 1 --2- 192.168.123.105:0/1286288296 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f001003d280 0x7f001003f740 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f001c0099c0 tx=0x7f001c006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:08.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.492+0000 7f0035213640 1 -- 192.168.123.105:0/1286288296 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0030102470 msgr2=0x7f00301997c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:08.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.492+0000 7f0035213640 1 --2- 192.168.123.105:0/1286288296 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0030102470 0x7f00301997c0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f002002f860 tx=0x7f0020004270 comp rx=0 tx=0).stop 2026-03-09T16:10:08.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.492+0000 7f0035213640 1 -- 192.168.123.105:0/1286288296 shutdown_connections 2026-03-09T16:10:08.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.492+0000 7f0035213640 1 --2- 192.168.123.105:0/1286288296 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f001003d280 0x7f001003f740 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:08.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.492+0000 7f0035213640 1 --2- 192.168.123.105:0/1286288296 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0030102470 0x7f00301997c0 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:08.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.492+0000 7f0035213640 1 -- 192.168.123.105:0/1286288296 >> 192.168.123.105:0/1286288296 
conn(0x7f00300fdca0 msgr2=0x7f00300fe8a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:08.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.493+0000 7f0035213640 1 -- 192.168.123.105:0/1286288296 shutdown_connections 2026-03-09T16:10:08.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:08.493+0000 7f0035213640 1 -- 192.168.123.105:0/1286288296 wait complete. 2026-03-09T16:10:08.555 DEBUG:teuthology.orchestra.run.vm05:mon.vm05> sudo journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm05.service 2026-03-09T16:10:08.557 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:08.557 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:08.762 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:08.807 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:09.097 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.094+0000 7fe9065e9640 1 -- 192.168.123.105:0/2342551666 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9000ff260 msgr2=0x7fe9000ff660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:09.097 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.094+0000 7fe9065e9640 1 --2- 192.168.123.105:0/2342551666 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9000ff260 0x7fe9000ff660 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7fe8f00099b0 tx=0x7fe8f002f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:09.097 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.096+0000 7fe9065e9640 1 -- 192.168.123.105:0/2342551666 shutdown_connections 2026-03-09T16:10:09.097 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.096+0000 7fe9065e9640 1 --2- 192.168.123.105:0/2342551666 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9000ff260 0x7fe9000ff660 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:09.097 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.096+0000 7fe9065e9640 1 -- 192.168.123.105:0/2342551666 >> 192.168.123.105:0/2342551666 conn(0x7fe9000faa10 msgr2=0x7fe9000fce30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:09.097 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.097+0000 7fe9065e9640 1 -- 192.168.123.105:0/2342551666 shutdown_connections 2026-03-09T16:10:09.098 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.097+0000 7fe9065e9640 1 -- 192.168.123.105:0/2342551666 wait complete. 
2026-03-09T16:10:09.098 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.097+0000 7fe9065e9640 1 Processor -- start 2026-03-09T16:10:09.098 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.098+0000 7fe9065e9640 1 -- start start 2026-03-09T16:10:09.099 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.098+0000 7fe9065e9640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9000ff260 0x7fe9001999f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:09.099 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.098+0000 7fe9065e9640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe900199f30 con 0x7fe9000ff260 2026-03-09T16:10:09.099 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.098+0000 7fe8fffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9000ff260 0x7fe9001999f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:09.099 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.098+0000 7fe8fffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9000ff260 0x7fe9001999f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:58788/0 (socket says 192.168.123.105:58788) 2026-03-09T16:10:09.099 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.098+0000 7fe8fffff640 1 -- 192.168.123.105:0/1649197338 learned_addr learned my addr 192.168.123.105:0/1649197338 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:09.099 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.099+0000 7fe8fffff640 1 -- 192.168.123.105:0/1649197338 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe8f0009660 con 0x7fe9000ff260 2026-03-09T16:10:09.100 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.099+0000 7fe8fffff640 1 --2- 192.168.123.105:0/1649197338 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9000ff260 0x7fe9001999f0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fe8f002f860 tx=0x7fe8f0004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:09.101 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.099+0000 7fe8fd7fa640 1 -- 192.168.123.105:0/1649197338 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe8f00043d0 con 0x7fe9000ff260 2026-03-09T16:10:09.101 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.099+0000 7fe8fd7fa640 1 -- 192.168.123.105:0/1649197338 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe8f0038b40 con 0x7fe9000ff260 2026-03-09T16:10:09.101 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.099+0000 7fe8fd7fa640 1 -- 192.168.123.105:0/1649197338 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe8f0041860 con 0x7fe9000ff260 2026-03-09T16:10:09.101 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.099+0000 7fe9065e9640 1 -- 192.168.123.105:0/1649197338 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe90019a130 con 0x7fe9000ff260 
2026-03-09T16:10:09.101 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.099+0000 7fe9065e9640 1 -- 192.168.123.105:0/1649197338 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe90019a5d0 con 0x7fe9000ff260 2026-03-09T16:10:09.101 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.101+0000 7fe9065e9640 1 -- 192.168.123.105:0/1649197338 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe8c4005350 con 0x7fe9000ff260 2026-03-09T16:10:09.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.101+0000 7fe8fd7fa640 1 -- 192.168.123.105:0/1649197338 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7fe8f0038cb0 con 0x7fe9000ff260 2026-03-09T16:10:09.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.102+0000 7fe8fd7fa640 1 --2- 192.168.123.105:0/1649197338 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe8d403d320 0x7fe8d403f7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:09.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.102+0000 7fe8fd7fa640 1 -- 192.168.123.105:0/1649197338 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fe8f0076470 con 0x7fe9000ff260 2026-03-09T16:10:09.105 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.105+0000 7fe8ff7fe640 1 --2- 192.168.123.105:0/1649197338 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe8d403d320 0x7fe8d403f7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:09.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.106+0000 7fe8ff7fe640 1 --2- 192.168.123.105:0/1649197338 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe8d403d320 0x7fe8d403f7e0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fe8ec0099c0 tx=0x7fe8ec006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:09.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.109+0000 7fe8fd7fa640 1 -- 192.168.123.105:0/1649197338 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fe8f0048310 con 0x7fe9000ff260 2026-03-09T16:10:09.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.236+0000 7fe9065e9640 1 -- 192.168.123.105:0/1649197338 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fe8c40051c0 con 0x7fe9000ff260 2026-03-09T16:10:09.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.237+0000 7fe8fd7fa640 1 -- 192.168.123.105:0/1649197338 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7fe8f0051310 con 0x7fe9000ff260 2026-03-09T16:10:09.238 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:09.238 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:09.238 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:09.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.240+0000 7fe9065e9640 1 -- 192.168.123.105:0/1649197338 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe8d403d320 msgr2=0x7fe8d403f7e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:09.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.240+0000 7fe9065e9640 1 --2- 192.168.123.105:0/1649197338 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe8d403d320 0x7fe8d403f7e0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fe8ec0099c0 tx=0x7fe8ec006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:09.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.240+0000 7fe9065e9640 1 -- 192.168.123.105:0/1649197338 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9000ff260 msgr2=0x7fe9001999f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:09.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.240+0000 7fe9065e9640 1 --2- 192.168.123.105:0/1649197338 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9000ff260 0x7fe9001999f0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fe8f002f860 tx=0x7fe8f0004290 comp rx=0 tx=0).stop 2026-03-09T16:10:09.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.240+0000 7fe9065e9640 1 -- 192.168.123.105:0/1649197338 shutdown_connections 2026-03-09T16:10:09.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.240+0000 7fe9065e9640 1 --2- 192.168.123.105:0/1649197338 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe8d403d320 0x7fe8d403f7e0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:09.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.240+0000 7fe9065e9640 1 --2- 192.168.123.105:0/1649197338 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9000ff260 0x7fe9001999f0 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:09.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.240+0000 7fe9065e9640 1 -- 192.168.123.105:0/1649197338 >> 192.168.123.105:0/1649197338 conn(0x7fe9000faa10 msgr2=0x7fe9000fb230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:09.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.241+0000 7fe9065e9640 1 -- 192.168.123.105:0/1649197338 shutdown_connections 2026-03-09T16:10:09.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:09.241+0000 7fe9065e9640 1 -- 
192.168.123.105:0/1649197338 wait complete. 2026-03-09T16:10:09.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:09 vm03 ceph-mon[51019]: from='client.14195 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "placement": "2;vm03:192.168.123.103=vm03;vm05:192.168.123.105=vm05", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:10:09.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:09 vm03 ceph-mon[51019]: Saving service mon spec with placement vm03:192.168.123.103=vm03;vm05:192.168.123.105=vm05;count:2 2026-03-09T16:10:09.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:09 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:09.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:09 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:09.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:09 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:09.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:09 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:09.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:09 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:09.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:09 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/1649197338' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:10.297 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:10.297 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:10.453 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:10.496 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:10.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.755+0000 7fdf2168e640 1 -- 192.168.123.105:0/2343073709 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf1c102620 msgr2=0x7fdf1c102a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:10.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.755+0000 7fdf2168e640 1 --2- 192.168.123.105:0/2343073709 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf1c102620 0x7fdf1c102a20 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fdf040099b0 tx=0x7fdf0402f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:10.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.756+0000 7fdf2168e640 1 -- 192.168.123.105:0/2343073709 shutdown_connections 2026-03-09T16:10:10.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.756+0000 7fdf2168e640 1 --2- 192.168.123.105:0/2343073709 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf1c102620 0x7fdf1c102a20 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:10.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.756+0000 7fdf2168e640 1 -- 192.168.123.105:0/2343073709 >> 192.168.123.105:0/2343073709 conn(0x7fdf1c0fde70 msgr2=0x7fdf1c100260 
unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:10.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.756+0000 7fdf2168e640 1 -- 192.168.123.105:0/2343073709 shutdown_connections 2026-03-09T16:10:10.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.757+0000 7fdf2168e640 1 -- 192.168.123.105:0/2343073709 wait complete. 2026-03-09T16:10:10.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.757+0000 7fdf2168e640 1 Processor -- start 2026-03-09T16:10:10.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.757+0000 7fdf2168e640 1 -- start start 2026-03-09T16:10:10.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.757+0000 7fdf2168e640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf1c102620 0x7fdf1c199990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:10.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.757+0000 7fdf2168e640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdf1c199ed0 con 0x7fdf1c102620 2026-03-09T16:10:10.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.758+0000 7fdf1affd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf1c102620 0x7fdf1c199990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:10.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.758+0000 7fdf1affd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf1c102620 0x7fdf1c199990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:58806/0 (socket says 192.168.123.105:58806) 2026-03-09T16:10:10.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.758+0000 7fdf1affd640 1 -- 192.168.123.105:0/3326109782 learned_addr learned my addr 192.168.123.105:0/3326109782 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:10.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.758+0000 7fdf1affd640 1 -- 192.168.123.105:0/3326109782 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdf04009660 con 0x7fdf1c102620 2026-03-09T16:10:10.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.758+0000 7fdf1affd640 1 --2- 192.168.123.105:0/3326109782 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf1c102620 0x7fdf1c199990 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7fdf0402f860 tx=0x7fdf04004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:10.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.758+0000 7fdefbfff640 1 -- 192.168.123.105:0/3326109782 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdf040043b0 con 0x7fdf1c102620 2026-03-09T16:10:10.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.758+0000 7fdefbfff640 1 -- 192.168.123.105:0/3326109782 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fdf04038b40 con 0x7fdf1c102620 2026-03-09T16:10:10.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.759+0000 7fdf2168e640 1 -- 192.168.123.105:0/3326109782 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdf1c19a0d0 con 0x7fdf1c102620 2026-03-09T16:10:10.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.759+0000 7fdf2168e640 1 -- 192.168.123.105:0/3326109782 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdf1c19a570 con 0x7fdf1c102620 2026-03-09T16:10:10.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.760+0000 7fdefbfff640 1 -- 192.168.123.105:0/3326109782 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdf040418f0 con 0x7fdf1c102620 2026-03-09T16:10:10.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.760+0000 7fdf2168e640 1 -- 192.168.123.105:0/3326109782 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdf1c102aa0 con 0x7fdf1c102620 2026-03-09T16:10:10.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.760+0000 7fdefbfff640 1 -- 192.168.123.105:0/3326109782 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7fdf04041b10 con 0x7fdf1c102620 2026-03-09T16:10:10.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.760+0000 7fdefbfff640 1 --2- 192.168.123.105:0/3326109782 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fdef003d000 0x7fdef003f4c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:10.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.761+0000 7fdefbfff640 1 -- 192.168.123.105:0/3326109782 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fdf04077240 con 0x7fdf1c102620 2026-03-09T16:10:10.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.762+0000 7fdf1a7fc640 1 --2- 192.168.123.105:0/3326109782 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fdef003d000 0x7fdef003f4c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:10.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.762+0000 7fdf1a7fc640 1 --2- 192.168.123.105:0/3326109782 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fdef003d000 0x7fdef003f4c0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fdf100099c0 tx=0x7fdf10006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:10.764 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.763+0000 7fdefbfff640 1 -- 192.168.123.105:0/3326109782 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fdf04037bb0 con 0x7fdf1c102620 2026-03-09T16:10:10.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:10 vm03 ceph-mon[51019]: Deploying daemon alertmanager.vm03 on vm03 2026-03-09T16:10:10.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.894+0000 7fdf2168e640 1 -- 192.168.123.105:0/3326109782 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fdf1c108530 con 0x7fdf1c102620 2026-03-09T16:10:10.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.894+0000 7fdefbfff640 1 -- 
192.168.123.105:0/3326109782 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7fdf040373d0 con 0x7fdf1c102620 2026-03-09T16:10:10.896 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:10.896 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:10.896 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:10.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.898+0000 7fdf2168e640 1 -- 192.168.123.105:0/3326109782 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fdef003d000 msgr2=0x7fdef003f4c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:10.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.898+0000 7fdf2168e640 1 --2- 192.168.123.105:0/3326109782 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fdef003d000 0x7fdef003f4c0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fdf100099c0 tx=0x7fdf10006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:10.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.898+0000 7fdf2168e640 1 -- 192.168.123.105:0/3326109782 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf1c102620 msgr2=0x7fdf1c199990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:10.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.898+0000 7fdf2168e640 1 --2- 192.168.123.105:0/3326109782 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf1c102620 0x7fdf1c199990 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7fdf0402f860 tx=0x7fdf04004270 comp rx=0 tx=0).stop 2026-03-09T16:10:10.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.898+0000 7fdf2168e640 1 -- 192.168.123.105:0/3326109782 shutdown_connections 2026-03-09T16:10:10.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.898+0000 7fdf2168e640 1 --2- 192.168.123.105:0/3326109782 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fdef003d000 0x7fdef003f4c0 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:10.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.898+0000 7fdf2168e640 1 --2- 192.168.123.105:0/3326109782 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf1c102620 0x7fdf1c199990 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:10.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.898+0000 7fdf2168e640 1 -- 192.168.123.105:0/3326109782 >> 192.168.123.105:0/3326109782 conn(0x7fdf1c0fde70 msgr2=0x7fdf1c0fea70 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-09T16:10:10.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.899+0000 7fdf2168e640 1 -- 192.168.123.105:0/3326109782 shutdown_connections 2026-03-09T16:10:10.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:10.899+0000 7fdf2168e640 1 -- 192.168.123.105:0/3326109782 wait complete. 2026-03-09T16:10:11.745 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:11 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/3326109782' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:11.964 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:11.964 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:12.128 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:12.161 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:12.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.431+0000 7f5d7b2a2640 1 -- 192.168.123.105:0/542255443 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d74071680 msgr2=0x7f5d74071a80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:12.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.431+0000 7f5d7b2a2640 1 --2- 192.168.123.105:0/542255443 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d74071680 0x7f5d74071a80 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f5d6c00b0a0 tx=0x7f5d6c02f530 comp rx=0 tx=0).stop 2026-03-09T16:10:12.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.431+0000 7f5d7b2a2640 1 -- 192.168.123.105:0/542255443 shutdown_connections 2026-03-09T16:10:12.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.431+0000 7f5d7b2a2640 1 --2- 192.168.123.105:0/542255443 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d74071680 0x7f5d74071a80 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:12.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.431+0000 7f5d7b2a2640 1 -- 192.168.123.105:0/542255443 >> 192.168.123.105:0/542255443 conn(0x7f5d7406d1c0 msgr2=0x7f5d7406f600 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:12.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.432+0000 7f5d7b2a2640 1 -- 192.168.123.105:0/542255443 shutdown_connections 2026-03-09T16:10:12.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.432+0000 7f5d7b2a2640 1 -- 192.168.123.105:0/542255443 wait complete. 
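[editor's note] The repeated "Waiting for 2 mons in monmap..." blocks poll "ceph mon dump -f json" and re-check the returned monmap (which, so far, still lists only vm03 at epoch 1). A minimal sketch of such a wait loop, again assuming the illustrative cephadm_shell() helper sketched earlier; this is not the tasks.cephadm code:

    import json
    import time

    def wait_for_mons(expected=2, interval=2.0, timeout=300):
        """Poll 'ceph mon dump -f json' until the monmap lists `expected` mons (sketch)."""
        deadline = time.time() + timeout
        while time.time() < deadline:
            monmap = json.loads(cephadm_shell("ceph", "mon", "dump", "-f", "json"))
            if len(monmap.get("mons", [])) >= expected:
                return monmap
            time.sleep(interval)
        raise RuntimeError("timed out waiting for %d mons in monmap" % expected)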
2026-03-09T16:10:12.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.432+0000 7f5d7b2a2640 1 Processor -- start 2026-03-09T16:10:12.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.432+0000 7f5d7b2a2640 1 -- start start 2026-03-09T16:10:12.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.432+0000 7f5d7b2a2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d74071680 0x7f5d741a6850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:12.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.432+0000 7f5d7b2a2640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d741a6d90 con 0x7f5d74071680 2026-03-09T16:10:12.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.433+0000 7f5d79017640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d74071680 0x7f5d741a6850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:12.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.433+0000 7f5d79017640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d74071680 0x7f5d741a6850 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:58828/0 (socket says 192.168.123.105:58828) 2026-03-09T16:10:12.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.433+0000 7f5d79017640 1 -- 192.168.123.105:0/973260015 learned_addr learned my addr 192.168.123.105:0/973260015 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:12.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.433+0000 7f5d79017640 1 -- 192.168.123.105:0/973260015 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5d6c009d00 con 0x7f5d74071680 2026-03-09T16:10:12.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.433+0000 7f5d79017640 1 --2- 192.168.123.105:0/973260015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d74071680 0x7f5d741a6850 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f5d6c00c090 tx=0x7f5d6c0049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:12.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.434+0000 7f5d627fc640 1 -- 192.168.123.105:0/973260015 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5d6c002c70 con 0x7f5d74071680 2026-03-09T16:10:12.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.434+0000 7f5d627fc640 1 -- 192.168.123.105:0/973260015 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5d6c002dd0 con 0x7f5d74071680 2026-03-09T16:10:12.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.434+0000 7f5d7b2a2640 1 -- 192.168.123.105:0/973260015 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5d741a6f90 con 0x7f5d74071680 2026-03-09T16:10:12.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.434+0000 7f5d7b2a2640 1 -- 192.168.123.105:0/973260015 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5d741a73d0 con 0x7f5d74071680 2026-03-09T16:10:12.435 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.434+0000 7f5d627fc640 1 -- 192.168.123.105:0/973260015 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5d6c037c30 con 0x7f5d74071680 2026-03-09T16:10:12.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.435+0000 7f5d627fc640 1 -- 192.168.123.105:0/973260015 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f5d6c007880 con 0x7f5d74071680 2026-03-09T16:10:12.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.435+0000 7f5d627fc640 1 --2- 192.168.123.105:0/973260015 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f5d5003cf60 0x7f5d5003f420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:12.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.435+0000 7f5d627fc640 1 -- 192.168.123.105:0/973260015 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f5d6c03e070 con 0x7f5d74071680 2026-03-09T16:10:12.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.436+0000 7f5d78816640 1 --2- 192.168.123.105:0/973260015 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f5d5003cf60 0x7f5d5003f420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:12.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.436+0000 7f5d7b2a2640 1 -- 192.168.123.105:0/973260015 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5d3c005350 con 0x7f5d74071680 2026-03-09T16:10:12.440 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.439+0000 7f5d78816640 1 --2- 192.168.123.105:0/973260015 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f5d5003cf60 0x7f5d5003f420 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f5d640099c0 tx=0x7f5d64006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:12.440 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.439+0000 7f5d627fc640 1 -- 192.168.123.105:0/973260015 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f5d6c037d90 con 0x7f5d74071680 2026-03-09T16:10:12.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.565+0000 7f5d7b2a2640 1 -- 192.168.123.105:0/973260015 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f5d3c005600 con 0x7f5d74071680 2026-03-09T16:10:12.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.566+0000 7f5d627fc640 1 -- 192.168.123.105:0/973260015 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f5d6c049340 con 0x7f5d74071680 2026-03-09T16:10:12.567 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:12.567 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:12.567 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:12.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.568+0000 7f5d7b2a2640 1 -- 192.168.123.105:0/973260015 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f5d5003cf60 msgr2=0x7f5d5003f420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:12.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.568+0000 7f5d7b2a2640 1 --2- 192.168.123.105:0/973260015 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f5d5003cf60 0x7f5d5003f420 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f5d640099c0 tx=0x7f5d64006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:12.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.568+0000 7f5d7b2a2640 1 -- 192.168.123.105:0/973260015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d74071680 msgr2=0x7f5d741a6850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:12.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.568+0000 7f5d7b2a2640 1 --2- 192.168.123.105:0/973260015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d74071680 0x7f5d741a6850 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f5d6c00c090 tx=0x7f5d6c0049e0 comp rx=0 tx=0).stop 2026-03-09T16:10:12.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.568+0000 7f5d7b2a2640 1 -- 192.168.123.105:0/973260015 shutdown_connections 2026-03-09T16:10:12.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.568+0000 7f5d7b2a2640 1 --2- 192.168.123.105:0/973260015 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f5d5003cf60 0x7f5d5003f420 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:12.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.568+0000 7f5d7b2a2640 1 --2- 192.168.123.105:0/973260015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d74071680 0x7f5d741a6850 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:12.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.568+0000 7f5d7b2a2640 1 -- 192.168.123.105:0/973260015 >> 192.168.123.105:0/973260015 conn(0x7f5d7406d1c0 msgr2=0x7f5d7406fe20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:12.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.569+0000 7f5d7b2a2640 1 -- 192.168.123.105:0/973260015 shutdown_connections 2026-03-09T16:10:12.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:12.569+0000 7f5d7b2a2640 1 -- 
192.168.123.105:0/973260015 wait complete. 2026-03-09T16:10:13.631 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:13.631 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:13.763 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:13.793 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:13.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: Regenerating cephadm self-signed grafana TLS certificates 2026-03-09T16:10:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-09T16:10:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-09T16:10:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: Deploying daemon grafana.vm03 on vm03 2026-03-09T16:10:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:13 vm03 ceph-mon[51019]: from='client.? 
192.168.123.105:0/973260015' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:14.010 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.008+0000 7feef0fed640 1 -- 192.168.123.105:0/1299068942 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeec102620 msgr2=0x7feeec102a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:14.010 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.008+0000 7feef0fed640 1 --2- 192.168.123.105:0/1299068942 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeec102620 0x7feeec102a20 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7feee00099b0 tx=0x7feee002f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:14.010 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.009+0000 7feef0fed640 1 -- 192.168.123.105:0/1299068942 shutdown_connections 2026-03-09T16:10:14.010 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.009+0000 7feef0fed640 1 --2- 192.168.123.105:0/1299068942 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeec102620 0x7feeec102a20 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:14.010 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.009+0000 7feef0fed640 1 -- 192.168.123.105:0/1299068942 >> 192.168.123.105:0/1299068942 conn(0x7feeec0fde70 msgr2=0x7feeec100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:14.010 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.009+0000 7feef0fed640 1 -- 192.168.123.105:0/1299068942 shutdown_connections 2026-03-09T16:10:14.010 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.009+0000 7feef0fed640 1 -- 192.168.123.105:0/1299068942 wait complete. 
2026-03-09T16:10:14.011 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.010+0000 7feef0fed640 1 Processor -- start 2026-03-09T16:10:14.011 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.010+0000 7feef0fed640 1 -- start start 2026-03-09T16:10:14.011 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.010+0000 7feef0fed640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeec102620 0x7feeec199990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:14.011 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.010+0000 7feef0fed640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feeec199ed0 con 0x7feeec102620 2026-03-09T16:10:14.011 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.011+0000 7feeea575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeec102620 0x7feeec199990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:14.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.011+0000 7feeea575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeec102620 0x7feeec199990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:58836/0 (socket says 192.168.123.105:58836) 2026-03-09T16:10:14.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.011+0000 7feeea575640 1 -- 192.168.123.105:0/3544640266 learned_addr learned my addr 192.168.123.105:0/3544640266 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:14.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.011+0000 7feeea575640 1 -- 192.168.123.105:0/3544640266 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feee0009660 con 0x7feeec102620 2026-03-09T16:10:14.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.011+0000 7feeea575640 1 --2- 192.168.123.105:0/3544640266 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeec102620 0x7feeec199990 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7feee0031d80 tx=0x7feee0031db0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:14.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.012+0000 7feecf7fe640 1 -- 192.168.123.105:0/3544640266 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7feee003c050 con 0x7feeec102620 2026-03-09T16:10:14.013 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.012+0000 7feef0fed640 1 -- 192.168.123.105:0/3544640266 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feeec19a0d0 con 0x7feeec102620 2026-03-09T16:10:14.013 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.012+0000 7feef0fed640 1 -- 192.168.123.105:0/3544640266 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feeec19a510 con 0x7feeec102620 2026-03-09T16:10:14.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.013+0000 7feecf7fe640 1 -- 192.168.123.105:0/3544640266 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7feee003d040 con 0x7feeec102620 
2026-03-09T16:10:14.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.013+0000 7feecf7fe640 1 -- 192.168.123.105:0/3544640266 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7feee0038470 con 0x7feeec102620 2026-03-09T16:10:14.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.013+0000 7feecf7fe640 1 -- 192.168.123.105:0/3544640266 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7feee00386e0 con 0x7feeec102620 2026-03-09T16:10:14.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.013+0000 7feef0fed640 1 -- 192.168.123.105:0/3544640266 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feeec102aa0 con 0x7feeec102620 2026-03-09T16:10:14.015 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.013+0000 7feecf7fe640 1 --2- 192.168.123.105:0/3544640266 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7feec403d280 0x7feec403f740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:14.015 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.013+0000 7feecf7fe640 1 -- 192.168.123.105:0/3544640266 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7feee00764a0 con 0x7feeec102620 2026-03-09T16:10:14.018 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.018+0000 7feee9d74640 1 --2- 192.168.123.105:0/3544640266 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7feec403d280 0x7feec403f740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:14.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.018+0000 7feecf7fe640 1 -- 192.168.123.105:0/3544640266 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7feee0049540 con 0x7feeec102620 2026-03-09T16:10:14.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.018+0000 7feee9d74640 1 --2- 192.168.123.105:0/3544640266 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7feec403d280 0x7feec403f740 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7feed80099c0 tx=0x7feed8006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:14.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.147+0000 7feef0fed640 1 -- 192.168.123.105:0/3544640266 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7feeec108530 con 0x7feeec102620 2026-03-09T16:10:14.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.148+0000 7feecf7fe640 1 -- 192.168.123.105:0/3544640266 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7feee0037630 con 0x7feeec102620 2026-03-09T16:10:14.149 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:14.149 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:14.149 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:14.152 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.151+0000 7feef0fed640 1 -- 192.168.123.105:0/3544640266 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7feec403d280 msgr2=0x7feec403f740 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:14.152 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.151+0000 7feef0fed640 1 --2- 192.168.123.105:0/3544640266 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7feec403d280 0x7feec403f740 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7feed80099c0 tx=0x7feed8006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:14.152 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.151+0000 7feef0fed640 1 -- 192.168.123.105:0/3544640266 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeec102620 msgr2=0x7feeec199990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:14.152 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.151+0000 7feef0fed640 1 --2- 192.168.123.105:0/3544640266 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeec102620 0x7feeec199990 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7feee0031d80 tx=0x7feee0031db0 comp rx=0 tx=0).stop 2026-03-09T16:10:14.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.152+0000 7feef0fed640 1 -- 192.168.123.105:0/3544640266 shutdown_connections 2026-03-09T16:10:14.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.152+0000 7feef0fed640 1 --2- 192.168.123.105:0/3544640266 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7feec403d280 0x7feec403f740 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:14.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.152+0000 7feef0fed640 1 --2- 192.168.123.105:0/3544640266 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeec102620 0x7feeec199990 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:14.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.152+0000 7feef0fed640 1 -- 192.168.123.105:0/3544640266 >> 192.168.123.105:0/3544640266 conn(0x7feeec0fde70 msgr2=0x7feeec0fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:14.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.152+0000 7feef0fed640 1 -- 192.168.123.105:0/3544640266 shutdown_connections 2026-03-09T16:10:14.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:14.152+0000 7feef0fed640 1 -- 
192.168.123.105:0/3544640266 wait complete. 2026-03-09T16:10:14.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:14 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/3544640266' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:15.215 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:15.215 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:15.348 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:15.381 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:15.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.610+0000 7f48e51d3640 1 -- 192.168.123.105:0/1791120742 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48e0102640 msgr2=0x7f48e0102a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:15.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.610+0000 7f48e51d3640 1 --2- 192.168.123.105:0/1791120742 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48e0102640 0x7f48e0102a40 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f48d40099b0 tx=0x7f48d402f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:15.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.611+0000 7f48e51d3640 1 -- 192.168.123.105:0/1791120742 shutdown_connections 2026-03-09T16:10:15.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.611+0000 7f48e51d3640 1 --2- 192.168.123.105:0/1791120742 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48e0102640 0x7f48e0102a40 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:15.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.611+0000 7f48e51d3640 1 -- 192.168.123.105:0/1791120742 >> 192.168.123.105:0/1791120742 conn(0x7f48e00fde70 msgr2=0x7f48e0100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:15.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.611+0000 7f48e51d3640 1 -- 192.168.123.105:0/1791120742 shutdown_connections 2026-03-09T16:10:15.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.611+0000 7f48e51d3640 1 -- 192.168.123.105:0/1791120742 wait complete. 
2026-03-09T16:10:15.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.612+0000 7f48e51d3640 1 Processor -- start 2026-03-09T16:10:15.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.612+0000 7f48e51d3640 1 -- start start 2026-03-09T16:10:15.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.612+0000 7f48e51d3640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48e0102640 0x7f48e0199950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:15.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.612+0000 7f48e51d3640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f48e0199e90 con 0x7f48e0102640 2026-03-09T16:10:15.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.612+0000 7f48ded76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48e0102640 0x7f48e0199950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:15.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.612+0000 7f48ded76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48e0102640 0x7f48e0199950 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:58860/0 (socket says 192.168.123.105:58860) 2026-03-09T16:10:15.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.612+0000 7f48ded76640 1 -- 192.168.123.105:0/1751240814 learned_addr learned my addr 192.168.123.105:0/1751240814 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:15.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.613+0000 7f48ded76640 1 -- 192.168.123.105:0/1751240814 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f48d4009660 con 0x7f48e0102640 2026-03-09T16:10:15.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.613+0000 7f48ded76640 1 --2- 192.168.123.105:0/1751240814 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48e0102640 0x7f48e0199950 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f48d402f860 tx=0x7f48d4004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:15.615 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.613+0000 7f48bffff640 1 -- 192.168.123.105:0/1751240814 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f48d40043b0 con 0x7f48e0102640 2026-03-09T16:10:15.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.613+0000 7f48bffff640 1 -- 192.168.123.105:0/1751240814 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f48d4038b40 con 0x7f48e0102640 2026-03-09T16:10:15.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.613+0000 7f48bffff640 1 -- 192.168.123.105:0/1751240814 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f48d40418f0 con 0x7f48e0102640 2026-03-09T16:10:15.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.613+0000 7f48e51d3640 1 -- 192.168.123.105:0/1751240814 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f48e019a090 con 0x7f48e0102640 
2026-03-09T16:10:15.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.613+0000 7f48e51d3640 1 -- 192.168.123.105:0/1751240814 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f48e019a470 con 0x7f48e0102640 2026-03-09T16:10:15.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.614+0000 7f48bffff640 1 -- 192.168.123.105:0/1751240814 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f48d4038cb0 con 0x7f48e0102640 2026-03-09T16:10:15.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.614+0000 7f48e51d3640 1 -- 192.168.123.105:0/1751240814 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f48a4005350 con 0x7f48e0102640 2026-03-09T16:10:15.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.614+0000 7f48bffff640 1 --2- 192.168.123.105:0/1751240814 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f48b403d2d0 0x7f48b403f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:15.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.615+0000 7f48bffff640 1 -- 192.168.123.105:0/1751240814 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f48d4076250 con 0x7f48e0102640 2026-03-09T16:10:15.617 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.617+0000 7f48de575640 1 --2- 192.168.123.105:0/1751240814 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f48b403d2d0 0x7f48b403f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:15.618 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.617+0000 7f48de575640 1 --2- 192.168.123.105:0/1751240814 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f48b403d2d0 0x7f48b403f790 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f48c80099c0 tx=0x7f48c8006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:15.618 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.617+0000 7f48bffff640 1 -- 192.168.123.105:0/1751240814 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f48d4035320 con 0x7f48e0102640 2026-03-09T16:10:15.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.752+0000 7f48e51d3640 1 -- 192.168.123.105:0/1751240814 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f48a40051c0 con 0x7f48e0102640 2026-03-09T16:10:15.754 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.753+0000 7f48bffff640 1 -- 192.168.123.105:0/1751240814 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f48d40373d0 con 0x7f48e0102640 2026-03-09T16:10:15.755 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:15.755 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:15.755 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:15.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.757+0000 7f48e51d3640 1 -- 192.168.123.105:0/1751240814 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f48b403d2d0 msgr2=0x7f48b403f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:15.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.757+0000 7f48e51d3640 1 --2- 192.168.123.105:0/1751240814 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f48b403d2d0 0x7f48b403f790 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f48c80099c0 tx=0x7f48c8006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:15.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.757+0000 7f48e51d3640 1 -- 192.168.123.105:0/1751240814 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48e0102640 msgr2=0x7f48e0199950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:15.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.757+0000 7f48e51d3640 1 --2- 192.168.123.105:0/1751240814 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48e0102640 0x7f48e0199950 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f48d402f860 tx=0x7f48d4004270 comp rx=0 tx=0).stop 2026-03-09T16:10:15.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.757+0000 7f48e51d3640 1 -- 192.168.123.105:0/1751240814 shutdown_connections 2026-03-09T16:10:15.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.758+0000 7f48e51d3640 1 --2- 192.168.123.105:0/1751240814 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f48b403d2d0 0x7f48b403f790 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:15.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.758+0000 7f48e51d3640 1 --2- 192.168.123.105:0/1751240814 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48e0102640 0x7f48e0199950 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:15.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.758+0000 7f48e51d3640 1 -- 192.168.123.105:0/1751240814 >> 192.168.123.105:0/1751240814 conn(0x7f48e00fde70 msgr2=0x7f48e00fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:15.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.758+0000 7f48e51d3640 1 -- 192.168.123.105:0/1751240814 shutdown_connections 2026-03-09T16:10:15.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:15.758+0000 7f48e51d3640 1 -- 
192.168.123.105:0/1751240814 wait complete. 2026-03-09T16:10:16.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:15 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/1751240814' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:16.834 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:16.835 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:16.986 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:17.031 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:17.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.296+0000 7fdd6d323640 1 -- 192.168.123.105:0/1588348917 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdd680ff260 msgr2=0x7fdd680ff660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:17.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.296+0000 7fdd6d323640 1 --2- 192.168.123.105:0/1588348917 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdd680ff260 0x7fdd680ff660 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fdd500099b0 tx=0x7fdd5002f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:17.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.297+0000 7fdd6d323640 1 -- 192.168.123.105:0/1588348917 shutdown_connections 2026-03-09T16:10:17.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.297+0000 7fdd6d323640 1 --2- 192.168.123.105:0/1588348917 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdd680ff260 0x7fdd680ff660 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:17.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.297+0000 7fdd6d323640 1 -- 192.168.123.105:0/1588348917 >> 192.168.123.105:0/1588348917 conn(0x7fdd680faa10 msgr2=0x7fdd680fce30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:17.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.297+0000 7fdd6d323640 1 -- 192.168.123.105:0/1588348917 shutdown_connections 2026-03-09T16:10:17.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.297+0000 7fdd6d323640 1 -- 192.168.123.105:0/1588348917 wait complete. 
2026-03-09T16:10:17.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.298+0000 7fdd6d323640 1 Processor -- start 2026-03-09T16:10:17.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.298+0000 7fdd6d323640 1 -- start start 2026-03-09T16:10:17.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.298+0000 7fdd6d323640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdd680ff260 0x7fdd681954e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:17.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.298+0000 7fdd6d323640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdd68195a20 con 0x7fdd680ff260 2026-03-09T16:10:17.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.298+0000 7fdd66ffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdd680ff260 0x7fdd681954e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:17.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.298+0000 7fdd66ffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdd680ff260 0x7fdd681954e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:58878/0 (socket says 192.168.123.105:58878) 2026-03-09T16:10:17.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.298+0000 7fdd66ffd640 1 -- 192.168.123.105:0/491834950 learned_addr learned my addr 192.168.123.105:0/491834950 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:17.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.299+0000 7fdd66ffd640 1 -- 192.168.123.105:0/491834950 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdd50009660 con 0x7fdd680ff260 2026-03-09T16:10:17.300 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.299+0000 7fdd66ffd640 1 --2- 192.168.123.105:0/491834950 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdd680ff260 0x7fdd681954e0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fdd5002f860 tx=0x7fdd50004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:17.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.299+0000 7fdd47fff640 1 -- 192.168.123.105:0/491834950 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdd500043d0 con 0x7fdd680ff260 2026-03-09T16:10:17.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.299+0000 7fdd47fff640 1 -- 192.168.123.105:0/491834950 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fdd50038b40 con 0x7fdd680ff260 2026-03-09T16:10:17.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.299+0000 7fdd47fff640 1 -- 192.168.123.105:0/491834950 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdd50041860 con 0x7fdd680ff260 2026-03-09T16:10:17.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.299+0000 7fdd6d323640 1 -- 192.168.123.105:0/491834950 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdd68195c20 con 0x7fdd680ff260 2026-03-09T16:10:17.301 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.299+0000 7fdd6d323640 1 -- 192.168.123.105:0/491834950 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdd68196000 con 0x7fdd680ff260 2026-03-09T16:10:17.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.300+0000 7fdd47fff640 1 -- 192.168.123.105:0/491834950 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7fdd50038cb0 con 0x7fdd680ff260 2026-03-09T16:10:17.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.301+0000 7fdd6d323640 1 -- 192.168.123.105:0/491834950 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdd2c005350 con 0x7fdd680ff260 2026-03-09T16:10:17.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.301+0000 7fdd47fff640 1 --2- 192.168.123.105:0/491834950 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fdd3c03d280 0x7fdd3c03f740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:17.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.301+0000 7fdd47fff640 1 -- 192.168.123.105:0/491834950 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fdd500761d0 con 0x7fdd680ff260 2026-03-09T16:10:17.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.301+0000 7fdd667fc640 1 --2- 192.168.123.105:0/491834950 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fdd3c03d280 0x7fdd3c03f740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:17.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.301+0000 7fdd667fc640 1 --2- 192.168.123.105:0/491834950 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fdd3c03d280 0x7fdd3c03f740 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fdd5c0099c0 tx=0x7fdd5c006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:17.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.304+0000 7fdd47fff640 1 -- 192.168.123.105:0/491834950 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fdd50035530 con 0x7fdd680ff260 2026-03-09T16:10:17.431 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.429+0000 7fdd6d323640 1 -- 192.168.123.105:0/491834950 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fdd2c0051c0 con 0x7fdd680ff260 2026-03-09T16:10:17.431 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.430+0000 7fdd47fff640 1 -- 192.168.123.105:0/491834950 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7fdd5004ae40 con 0x7fdd680ff260 2026-03-09T16:10:17.432 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:17.432 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:17.432 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:17.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.434+0000 7fdd6d323640 1 -- 192.168.123.105:0/491834950 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fdd3c03d280 msgr2=0x7fdd3c03f740 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:17.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.434+0000 7fdd6d323640 1 --2- 192.168.123.105:0/491834950 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fdd3c03d280 0x7fdd3c03f740 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fdd5c0099c0 tx=0x7fdd5c006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:17.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.434+0000 7fdd6d323640 1 -- 192.168.123.105:0/491834950 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdd680ff260 msgr2=0x7fdd681954e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:17.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.434+0000 7fdd6d323640 1 --2- 192.168.123.105:0/491834950 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdd680ff260 0x7fdd681954e0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fdd5002f860 tx=0x7fdd50004290 comp rx=0 tx=0).stop 2026-03-09T16:10:17.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.435+0000 7fdd6d323640 1 -- 192.168.123.105:0/491834950 shutdown_connections 2026-03-09T16:10:17.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.435+0000 7fdd6d323640 1 --2- 192.168.123.105:0/491834950 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fdd3c03d280 0x7fdd3c03f740 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:17.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.435+0000 7fdd6d323640 1 --2- 192.168.123.105:0/491834950 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdd680ff260 0x7fdd681954e0 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:17.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.435+0000 7fdd6d323640 1 -- 192.168.123.105:0/491834950 >> 192.168.123.105:0/491834950 conn(0x7fdd680faa10 msgr2=0x7fdd680fb230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:17.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.435+0000 7fdd6d323640 1 -- 192.168.123.105:0/491834950 shutdown_connections 2026-03-09T16:10:17.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:17.435+0000 7fdd6d323640 1 -- 
192.168.123.105:0/491834950 wait complete. 2026-03-09T16:10:17.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:17 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/491834950' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:18.505 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:18.505 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:18.651 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:18.690 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:18.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:18 vm03 ceph-mon[51019]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:10:18.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.968+0000 7f3a1e774640 1 -- 192.168.123.105:0/3575152637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a180fe9b0 msgr2=0x7f3a180fedb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:18.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.968+0000 7f3a1e774640 1 --2- 192.168.123.105:0/3575152637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a180fe9b0 0x7f3a180fedb0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f3a080099b0 tx=0x7f3a0802f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:18.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.969+0000 7f3a1e774640 1 -- 192.168.123.105:0/3575152637 shutdown_connections 2026-03-09T16:10:18.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.969+0000 7f3a1e774640 1 --2- 192.168.123.105:0/3575152637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a180fe9b0 0x7f3a180fedb0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:18.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.969+0000 7f3a1e774640 1 -- 192.168.123.105:0/3575152637 >> 192.168.123.105:0/3575152637 conn(0x7f3a180fa160 msgr2=0x7f3a180fc580 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:18.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.969+0000 7f3a1e774640 1 -- 192.168.123.105:0/3575152637 shutdown_connections 2026-03-09T16:10:18.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.969+0000 7f3a1e774640 1 -- 192.168.123.105:0/3575152637 wait complete. 
2026-03-09T16:10:18.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.970+0000 7f3a1e774640 1 Processor -- start 2026-03-09T16:10:18.971 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.970+0000 7f3a1e774640 1 -- start start 2026-03-09T16:10:18.971 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.970+0000 7f3a1e774640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a180fe9b0 0x7f3a18199980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:18.971 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.970+0000 7f3a1e774640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a18199ec0 con 0x7f3a180fe9b0 2026-03-09T16:10:18.971 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.970+0000 7f3a17fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a180fe9b0 0x7f3a18199980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:18.971 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.970+0000 7f3a17fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a180fe9b0 0x7f3a18199980 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:49078/0 (socket says 192.168.123.105:49078) 2026-03-09T16:10:18.971 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.970+0000 7f3a17fff640 1 -- 192.168.123.105:0/1857893612 learned_addr learned my addr 192.168.123.105:0/1857893612 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:18.972 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.971+0000 7f3a17fff640 1 -- 192.168.123.105:0/1857893612 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3a08009660 con 0x7f3a180fe9b0 2026-03-09T16:10:18.972 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.971+0000 7f3a17fff640 1 --2- 192.168.123.105:0/1857893612 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a180fe9b0 0x7f3a18199980 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f3a08031d80 tx=0x7f3a08031db0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:18.972 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.971+0000 7f3a157fa640 1 -- 192.168.123.105:0/1857893612 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3a0803c050 con 0x7f3a180fe9b0 2026-03-09T16:10:18.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.971+0000 7f3a1e774640 1 -- 192.168.123.105:0/1857893612 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3a1819a0c0 con 0x7f3a180fe9b0 2026-03-09T16:10:18.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.971+0000 7f3a1e774640 1 -- 192.168.123.105:0/1857893612 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3a1819a500 con 0x7f3a180fe9b0 2026-03-09T16:10:18.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.972+0000 7f3a1e774640 1 -- 192.168.123.105:0/1857893612 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f39dc005350 con 
0x7f3a180fe9b0 2026-03-09T16:10:18.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.972+0000 7f3a157fa640 1 -- 192.168.123.105:0/1857893612 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3a0803d040 con 0x7f3a180fe9b0 2026-03-09T16:10:18.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.972+0000 7f3a157fa640 1 -- 192.168.123.105:0/1857893612 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3a08038470 con 0x7f3a180fe9b0 2026-03-09T16:10:18.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.973+0000 7f3a157fa640 1 -- 192.168.123.105:0/1857893612 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f3a080386e0 con 0x7f3a180fe9b0 2026-03-09T16:10:18.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.973+0000 7f3a157fa640 1 --2- 192.168.123.105:0/1857893612 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f39ec03d2d0 0x7f39ec03f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:18.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.973+0000 7f3a157fa640 1 -- 192.168.123.105:0/1857893612 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f3a08076ea0 con 0x7f3a180fe9b0 2026-03-09T16:10:18.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.975+0000 7f3a177fe640 1 --2- 192.168.123.105:0/1857893612 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f39ec03d2d0 0x7f39ec03f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:18.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.976+0000 7f3a157fa640 1 -- 192.168.123.105:0/1857893612 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f3a08036b60 con 0x7f3a180fe9b0 2026-03-09T16:10:18.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:18.976+0000 7f3a177fe640 1 --2- 192.168.123.105:0/1857893612 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f39ec03d2d0 0x7f39ec03f790 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f3a040099c0 tx=0x7f3a04006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:19.104 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:19.101+0000 7f3a1e774640 1 -- 192.168.123.105:0/1857893612 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f39dc0051c0 con 0x7f3a180fe9b0 2026-03-09T16:10:19.104 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:19.103+0000 7f3a157fa640 1 -- 192.168.123.105:0/1857893612 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f3a08036b60 con 0x7f3a180fe9b0 2026-03-09T16:10:19.104 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:19.104 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:19.104 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:19.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:19.105+0000 7f3a1e774640 1 -- 192.168.123.105:0/1857893612 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f39ec03d2d0 msgr2=0x7f39ec03f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:19.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:19.105+0000 7f3a1e774640 1 --2- 192.168.123.105:0/1857893612 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f39ec03d2d0 0x7f39ec03f790 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f3a040099c0 tx=0x7f3a04006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:19.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:19.105+0000 7f3a1e774640 1 -- 192.168.123.105:0/1857893612 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a180fe9b0 msgr2=0x7f3a18199980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:19.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:19.105+0000 7f3a1e774640 1 --2- 192.168.123.105:0/1857893612 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a180fe9b0 0x7f3a18199980 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f3a08031d80 tx=0x7f3a08031db0 comp rx=0 tx=0).stop 2026-03-09T16:10:19.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:19.106+0000 7f3a1e774640 1 -- 192.168.123.105:0/1857893612 shutdown_connections 2026-03-09T16:10:19.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:19.106+0000 7f3a1e774640 1 --2- 192.168.123.105:0/1857893612 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f39ec03d2d0 0x7f39ec03f790 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:19.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:19.106+0000 7f3a1e774640 1 --2- 192.168.123.105:0/1857893612 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a180fe9b0 0x7f3a18199980 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:19.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:19.106+0000 7f3a1e774640 1 -- 192.168.123.105:0/1857893612 >> 192.168.123.105:0/1857893612 conn(0x7f3a180fa160 msgr2=0x7f3a180fad60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:19.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:19.106+0000 7f3a1e774640 1 -- 192.168.123.105:0/1857893612 shutdown_connections 2026-03-09T16:10:19.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:19.106+0000 7f3a1e774640 1 -- 
192.168.123.105:0/1857893612 wait complete. 2026-03-09T16:10:19.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:19 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/1857893612' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:20.177 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:20.177 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:20.344 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:20.385 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:20.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.645+0000 7f8a36670640 1 -- 192.168.123.105:0/2790342850 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a300751a0 msgr2=0x7f8a30073600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:20.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.645+0000 7f8a36670640 1 --2- 192.168.123.105:0/2790342850 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a300751a0 0x7f8a30073600 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f8a1c0099b0 tx=0x7f8a1c02f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:20.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.646+0000 7f8a36670640 1 -- 192.168.123.105:0/2790342850 shutdown_connections 2026-03-09T16:10:20.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.646+0000 7f8a36670640 1 --2- 192.168.123.105:0/2790342850 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a300751a0 0x7f8a30073600 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:20.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.646+0000 7f8a36670640 1 -- 192.168.123.105:0/2790342850 >> 192.168.123.105:0/2790342850 conn(0x7f8a300fbb30 msgr2=0x7f8a300fdf70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:20.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.646+0000 7f8a36670640 1 -- 192.168.123.105:0/2790342850 shutdown_connections 2026-03-09T16:10:20.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.646+0000 7f8a36670640 1 -- 192.168.123.105:0/2790342850 wait complete. 
2026-03-09T16:10:20.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.647+0000 7f8a36670640 1 Processor -- start 2026-03-09T16:10:20.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.647+0000 7f8a36670640 1 -- start start 2026-03-09T16:10:20.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.647+0000 7f8a36670640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a300751a0 0x7f8a30199900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:20.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.647+0000 7f8a36670640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8a30199e40 con 0x7f8a300751a0 2026-03-09T16:10:20.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.647+0000 7f8a2ffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a300751a0 0x7f8a30199900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:20.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.647+0000 7f8a2ffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a300751a0 0x7f8a30199900 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:49098/0 (socket says 192.168.123.105:49098) 2026-03-09T16:10:20.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.647+0000 7f8a2ffff640 1 -- 192.168.123.105:0/400368236 learned_addr learned my addr 192.168.123.105:0/400368236 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:20.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.648+0000 7f8a2ffff640 1 -- 192.168.123.105:0/400368236 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8a1c009660 con 0x7f8a300751a0 2026-03-09T16:10:20.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.648+0000 7f8a2ffff640 1 --2- 192.168.123.105:0/400368236 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a300751a0 0x7f8a30199900 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f8a1c02f860 tx=0x7f8a1c004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:20.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.648+0000 7f8a2d7fa640 1 -- 192.168.123.105:0/400368236 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8a1c0043b0 con 0x7f8a300751a0 2026-03-09T16:10:20.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.648+0000 7f8a36670640 1 -- 192.168.123.105:0/400368236 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8a3019a040 con 0x7f8a300751a0 2026-03-09T16:10:20.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.648+0000 7f8a36670640 1 -- 192.168.123.105:0/400368236 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8a3019a4e0 con 0x7f8a300751a0 2026-03-09T16:10:20.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.649+0000 7f8a2d7fa640 1 -- 192.168.123.105:0/400368236 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8a1c038b40 con 0x7f8a300751a0 2026-03-09T16:10:20.650 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.649+0000 7f8a2d7fa640 1 -- 192.168.123.105:0/400368236 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8a1c041810 con 0x7f8a300751a0 2026-03-09T16:10:20.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.650+0000 7f8a22ffd640 1 -- 192.168.123.105:0/400368236 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8a30073680 con 0x7f8a300751a0 2026-03-09T16:10:20.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.650+0000 7f8a2d7fa640 1 -- 192.168.123.105:0/400368236 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f8a1c038cb0 con 0x7f8a300751a0 2026-03-09T16:10:20.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.650+0000 7f8a2d7fa640 1 --2- 192.168.123.105:0/400368236 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f8a0403d280 0x7f8a0403f740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:20.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.651+0000 7f8a2d7fa640 1 -- 192.168.123.105:0/400368236 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f8a1c0769d0 con 0x7f8a300751a0 2026-03-09T16:10:20.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.651+0000 7f8a2f7fe640 1 --2- 192.168.123.105:0/400368236 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f8a0403d280 0x7f8a0403f740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:20.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.651+0000 7f8a2f7fe640 1 --2- 192.168.123.105:0/400368236 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f8a0403d280 0x7f8a0403f740 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f8a140099c0 tx=0x7f8a14006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:20.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.653+0000 7f8a2d7fa640 1 -- 192.168.123.105:0/400368236 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f8a1c054770 con 0x7f8a300751a0 2026-03-09T16:10:20.777 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:20.777 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:20.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.774+0000 7f8a22ffd640 1 -- 
192.168.123.105:0/400368236 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f8a30108510 con 0x7f8a300751a0 2026-03-09T16:10:20.777 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:20.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.775+0000 7f8a2d7fa640 1 -- 192.168.123.105:0/400368236 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f8a1c0497d0 con 0x7f8a300751a0 2026-03-09T16:10:20.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.777+0000 7f8a22ffd640 1 -- 192.168.123.105:0/400368236 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f8a0403d280 msgr2=0x7f8a0403f740 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:20.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.777+0000 7f8a22ffd640 1 --2- 192.168.123.105:0/400368236 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f8a0403d280 0x7f8a0403f740 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f8a140099c0 tx=0x7f8a14006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:20.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.778+0000 7f8a22ffd640 1 -- 192.168.123.105:0/400368236 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a300751a0 msgr2=0x7f8a30199900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:20.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.778+0000 7f8a22ffd640 1 --2- 192.168.123.105:0/400368236 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a300751a0 0x7f8a30199900 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f8a1c02f860 tx=0x7f8a1c004270 comp rx=0 tx=0).stop 2026-03-09T16:10:20.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.778+0000 7f8a22ffd640 1 -- 192.168.123.105:0/400368236 shutdown_connections 2026-03-09T16:10:20.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.778+0000 7f8a22ffd640 1 --2- 192.168.123.105:0/400368236 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f8a0403d280 0x7f8a0403f740 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:20.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.778+0000 7f8a22ffd640 1 --2- 192.168.123.105:0/400368236 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a300751a0 0x7f8a30199900 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:20.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.778+0000 7f8a22ffd640 1 -- 192.168.123.105:0/400368236 >> 192.168.123.105:0/400368236 conn(0x7f8a300fbb30 msgr2=0x7f8a300fc780 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:20.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.778+0000 7f8a22ffd640 1 -- 192.168.123.105:0/400368236 shutdown_connections 2026-03-09T16:10:20.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:20.778+0000 7f8a22ffd640 1 -- 192.168.123.105:0/400368236 wait complete. 2026-03-09T16:10:20.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:20 vm03 ceph-mon[51019]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:10:21.839 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T16:10:21.840 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:21.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:21 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/400368236' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:21.992 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:22.029 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:22.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.273+0000 7efdc58df640 1 -- 192.168.123.105:0/467635433 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdc0102620 msgr2=0x7efdc0102a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:22.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.273+0000 7efdc58df640 1 --2- 192.168.123.105:0/467635433 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdc0102620 0x7efdc0102a20 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7efdb40099b0 tx=0x7efdb402f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:22.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.274+0000 7efdc58df640 1 -- 192.168.123.105:0/467635433 shutdown_connections 2026-03-09T16:10:22.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.274+0000 7efdc58df640 1 --2- 192.168.123.105:0/467635433 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdc0102620 0x7efdc0102a20 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:22.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.274+0000 7efdc58df640 1 -- 192.168.123.105:0/467635433 >> 192.168.123.105:0/467635433 conn(0x7efdc00fde70 msgr2=0x7efdc0100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:22.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.274+0000 7efdc58df640 1 -- 192.168.123.105:0/467635433 shutdown_connections 2026-03-09T16:10:22.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.274+0000 7efdc58df640 1 -- 192.168.123.105:0/467635433 wait complete. 
2026-03-09T16:10:22.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.275+0000 7efdc58df640 1 Processor -- start 2026-03-09T16:10:22.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.275+0000 7efdc58df640 1 -- start start 2026-03-09T16:10:22.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.275+0000 7efdc58df640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdc0102620 0x7efdc0199990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:22.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.275+0000 7efdc58df640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efdc0199ed0 con 0x7efdc0102620 2026-03-09T16:10:22.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.276+0000 7efdbeffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdc0102620 0x7efdc0199990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:22.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.276+0000 7efdbeffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdc0102620 0x7efdc0199990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:49116/0 (socket says 192.168.123.105:49116) 2026-03-09T16:10:22.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.276+0000 7efdbeffd640 1 -- 192.168.123.105:0/2784668311 learned_addr learned my addr 192.168.123.105:0/2784668311 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:22.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.276+0000 7efdbeffd640 1 -- 192.168.123.105:0/2784668311 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efdb4009660 con 0x7efdc0102620 2026-03-09T16:10:22.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.276+0000 7efdbeffd640 1 --2- 192.168.123.105:0/2784668311 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdc0102620 0x7efdc0199990 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7efdb40042c0 tx=0x7efdb40042f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:22.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.276+0000 7efdc48dd640 1 -- 192.168.123.105:0/2784668311 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efdb4038680 con 0x7efdc0102620 2026-03-09T16:10:22.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.277+0000 7efdc58df640 1 -- 192.168.123.105:0/2784668311 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efdc019a0d0 con 0x7efdc0102620 2026-03-09T16:10:22.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.277+0000 7efdc58df640 1 -- 192.168.123.105:0/2784668311 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efdc019a570 con 0x7efdc0102620 2026-03-09T16:10:22.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.277+0000 7efdc48dd640 1 -- 192.168.123.105:0/2784668311 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7efdb4038ca0 con 0x7efdc0102620 
2026-03-09T16:10:22.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.277+0000 7efdc48dd640 1 -- 192.168.123.105:0/2784668311 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efdb40418f0 con 0x7efdc0102620 2026-03-09T16:10:22.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.278+0000 7efdc48dd640 1 -- 192.168.123.105:0/2784668311 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7efdb40387e0 con 0x7efdc0102620 2026-03-09T16:10:22.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.278+0000 7efdc58df640 1 -- 192.168.123.105:0/2784668311 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efdc0102aa0 con 0x7efdc0102620 2026-03-09T16:10:22.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.278+0000 7efdc48dd640 1 --2- 192.168.123.105:0/2784668311 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7efd9403d2d0 0x7efd9403f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:22.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.278+0000 7efdc48dd640 1 -- 192.168.123.105:0/2784668311 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7efdb4076350 con 0x7efdc0102620 2026-03-09T16:10:22.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.278+0000 7efdbe7fc640 1 --2- 192.168.123.105:0/2784668311 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7efd9403d2d0 0x7efd9403f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:22.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.279+0000 7efdbe7fc640 1 --2- 192.168.123.105:0/2784668311 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7efd9403d2d0 0x7efd9403f790 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7efdac0099c0 tx=0x7efdac006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:22.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.281+0000 7efdc48dd640 1 -- 192.168.123.105:0/2784668311 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7efdb4041400 con 0x7efdc0102620 2026-03-09T16:10:22.419 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.417+0000 7efdc58df640 1 -- 192.168.123.105:0/2784668311 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7efdc0108530 con 0x7efdc0102620 2026-03-09T16:10:22.419 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.418+0000 7efdc48dd640 1 -- 192.168.123.105:0/2784668311 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7efdb404ae40 con 0x7efdc0102620 2026-03-09T16:10:22.420 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:22.420 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:22.420 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:22.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.422+0000 7efdc58df640 1 -- 192.168.123.105:0/2784668311 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7efd9403d2d0 msgr2=0x7efd9403f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:22.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.422+0000 7efdc58df640 1 --2- 192.168.123.105:0/2784668311 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7efd9403d2d0 0x7efd9403f790 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7efdac0099c0 tx=0x7efdac006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:22.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.422+0000 7efdc58df640 1 -- 192.168.123.105:0/2784668311 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdc0102620 msgr2=0x7efdc0199990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:22.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.422+0000 7efdc58df640 1 --2- 192.168.123.105:0/2784668311 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdc0102620 0x7efdc0199990 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7efdb40042c0 tx=0x7efdb40042f0 comp rx=0 tx=0).stop 2026-03-09T16:10:22.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.422+0000 7efdc58df640 1 -- 192.168.123.105:0/2784668311 shutdown_connections 2026-03-09T16:10:22.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.422+0000 7efdc58df640 1 --2- 192.168.123.105:0/2784668311 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7efd9403d2d0 0x7efd9403f790 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:22.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.422+0000 7efdc58df640 1 --2- 192.168.123.105:0/2784668311 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdc0102620 0x7efdc0199990 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:22.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.423+0000 7efdc58df640 1 -- 192.168.123.105:0/2784668311 >> 192.168.123.105:0/2784668311 conn(0x7efdc00fde70 msgr2=0x7efdc00fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:22.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.423+0000 7efdc58df640 1 -- 192.168.123.105:0/2784668311 shutdown_connections 2026-03-09T16:10:22.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:22.423+0000 7efdc58df640 1 -- 
192.168.123.105:0/2784668311 wait complete. 2026-03-09T16:10:23.185 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:22 vm03 ceph-mon[51019]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:10:23.185 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:22 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/2784668311' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:23.495 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:23.495 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:23.636 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:23.667 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:23.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.911+0000 7f84fd0d5640 1 -- 192.168.123.105:0/1693740133 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f81025b0 msgr2=0x7f84f81029b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:23.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.911+0000 7f84fd0d5640 1 --2- 192.168.123.105:0/1693740133 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f81025b0 0x7f84f81029b0 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f84e40099b0 tx=0x7f84e402f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:23.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.912+0000 7f84fd0d5640 1 -- 192.168.123.105:0/1693740133 shutdown_connections 2026-03-09T16:10:23.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.912+0000 7f84fd0d5640 1 --2- 192.168.123.105:0/1693740133 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f81025b0 0x7f84f81029b0 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:23.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.912+0000 7f84fd0d5640 1 -- 192.168.123.105:0/1693740133 >> 192.168.123.105:0/1693740133 conn(0x7f84f80fde70 msgr2=0x7f84f8100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:23.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.912+0000 7f84fd0d5640 1 -- 192.168.123.105:0/1693740133 shutdown_connections 2026-03-09T16:10:23.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.912+0000 7f84fd0d5640 1 -- 192.168.123.105:0/1693740133 wait complete. 
2026-03-09T16:10:23.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.913+0000 7f84fd0d5640 1 Processor -- start 2026-03-09T16:10:23.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.913+0000 7f84fd0d5640 1 -- start start 2026-03-09T16:10:23.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.913+0000 7f84fd0d5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f81025b0 0x7f84f8078e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:23.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.913+0000 7f84fd0d5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f84f80793b0 con 0x7f84f81025b0 2026-03-09T16:10:23.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.913+0000 7f84f6d76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f81025b0 0x7f84f8078e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:23.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.914+0000 7f84f6d76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f81025b0 0x7f84f8078e70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:49134/0 (socket says 192.168.123.105:49134) 2026-03-09T16:10:23.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.914+0000 7f84f6d76640 1 -- 192.168.123.105:0/1285193566 learned_addr learned my addr 192.168.123.105:0/1285193566 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:23.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.914+0000 7f84f6d76640 1 -- 192.168.123.105:0/1285193566 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f84e4009660 con 0x7f84f81025b0 2026-03-09T16:10:23.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.915+0000 7f84f6d76640 1 --2- 192.168.123.105:0/1285193566 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f81025b0 0x7f84f8078e70 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f84e402f860 tx=0x7f84e4004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:23.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.915+0000 7f84d3fff640 1 -- 192.168.123.105:0/1285193566 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f84e40043b0 con 0x7f84f81025b0 2026-03-09T16:10:23.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.915+0000 7f84fd0d5640 1 -- 192.168.123.105:0/1285193566 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f84f80795b0 con 0x7f84f81025b0 2026-03-09T16:10:23.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.915+0000 7f84fd0d5640 1 -- 192.168.123.105:0/1285193566 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f84f8075a00 con 0x7f84f81025b0 2026-03-09T16:10:23.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.915+0000 7f84d3fff640 1 -- 192.168.123.105:0/1285193566 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f84e4038b40 con 0x7f84f81025b0 
2026-03-09T16:10:23.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.915+0000 7f84d3fff640 1 -- 192.168.123.105:0/1285193566 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f84e4041810 con 0x7f84f81025b0 2026-03-09T16:10:23.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.917+0000 7f84d3fff640 1 -- 192.168.123.105:0/1285193566 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f84e4041a80 con 0x7f84f81025b0 2026-03-09T16:10:23.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.917+0000 7f84d3fff640 1 --2- 192.168.123.105:0/1285193566 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f84cc03d2d0 0x7f84cc03f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:23.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.917+0000 7f84d3fff640 1 -- 192.168.123.105:0/1285193566 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f84e40771a0 con 0x7f84f81025b0 2026-03-09T16:10:23.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.917+0000 7f84f6575640 1 --2- 192.168.123.105:0/1285193566 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f84cc03d2d0 0x7f84cc03f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:23.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.917+0000 7f84fd0d5640 1 -- 192.168.123.105:0/1285193566 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f84f8102a30 con 0x7f84f81025b0 2026-03-09T16:10:23.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.921+0000 7f84f6575640 1 --2- 192.168.123.105:0/1285193566 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f84cc03d2d0 0x7f84cc03f790 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f84ec0099c0 tx=0x7f84ec006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:23.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:23.921+0000 7f84d3fff640 1 -- 192.168.123.105:0/1285193566 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f84e404a3c0 con 0x7f84f81025b0 2026-03-09T16:10:24.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:24.053+0000 7f84fd0d5640 1 -- 192.168.123.105:0/1285193566 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f84f8075d50 con 0x7f84f81025b0 2026-03-09T16:10:24.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:24.065+0000 7f84d3fff640 1 -- 192.168.123.105:0/1285193566 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f84e40373d0 con 0x7f84f81025b0 2026-03-09T16:10:24.068 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:24.068 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:24.069 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:24.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:24.068+0000 7f84fd0d5640 1 -- 192.168.123.105:0/1285193566 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f84cc03d2d0 msgr2=0x7f84cc03f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:24.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:24.068+0000 7f84fd0d5640 1 --2- 192.168.123.105:0/1285193566 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f84cc03d2d0 0x7f84cc03f790 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f84ec0099c0 tx=0x7f84ec006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:24.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:24.068+0000 7f84fd0d5640 1 -- 192.168.123.105:0/1285193566 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f81025b0 msgr2=0x7f84f8078e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:24.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:24.068+0000 7f84fd0d5640 1 --2- 192.168.123.105:0/1285193566 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f81025b0 0x7f84f8078e70 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f84e402f860 tx=0x7f84e4004270 comp rx=0 tx=0).stop 2026-03-09T16:10:24.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:24.068+0000 7f84fd0d5640 1 -- 192.168.123.105:0/1285193566 shutdown_connections 2026-03-09T16:10:24.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:24.068+0000 7f84fd0d5640 1 --2- 192.168.123.105:0/1285193566 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f84cc03d2d0 0x7f84cc03f790 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:24.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:24.068+0000 7f84fd0d5640 1 --2- 192.168.123.105:0/1285193566 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f81025b0 0x7f84f8078e70 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:24.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:24.068+0000 7f84fd0d5640 1 -- 192.168.123.105:0/1285193566 >> 192.168.123.105:0/1285193566 conn(0x7f84f80fde70 msgr2=0x7f84f80ff2b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:24.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:24.068+0000 7f84fd0d5640 1 -- 192.168.123.105:0/1285193566 shutdown_connections 2026-03-09T16:10:24.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:24.069+0000 7f84fd0d5640 1 -- 
192.168.123.105:0/1285193566 wait complete. 2026-03-09T16:10:24.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:24 vm03 ceph-mon[51019]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:10:24.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:24 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:24.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:24 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:24.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:24 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:24.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:24 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:24.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:24 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:24.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:24 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:24.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:24 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:24.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:24 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:24.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:24 vm03 ceph-mon[51019]: Deploying daemon prometheus.vm03 on vm03 2026-03-09T16:10:24.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:24 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/1285193566' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:25.133 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T16:10:25.133 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:25.275 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:25.316 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:25.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.564+0000 7fe343e80640 1 -- 192.168.123.105:0/3569863873 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe33c102640 msgr2=0x7fe33c102a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:25.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.564+0000 7fe343e80640 1 --2- 192.168.123.105:0/3569863873 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe33c102640 0x7fe33c102a40 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7fe32c0099b0 tx=0x7fe32c02f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:25.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.564+0000 7fe343e80640 1 -- 192.168.123.105:0/3569863873 shutdown_connections 2026-03-09T16:10:25.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.564+0000 7fe343e80640 1 --2- 192.168.123.105:0/3569863873 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe33c102640 0x7fe33c102a40 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:25.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.564+0000 7fe343e80640 1 -- 192.168.123.105:0/3569863873 >> 192.168.123.105:0/3569863873 conn(0x7fe33c0fde70 msgr2=0x7fe33c100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:25.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.564+0000 7fe343e80640 1 -- 192.168.123.105:0/3569863873 shutdown_connections 2026-03-09T16:10:25.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.565+0000 7fe343e80640 1 -- 192.168.123.105:0/3569863873 wait complete. 
2026-03-09T16:10:25.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.565+0000 7fe343e80640 1 Processor -- start 2026-03-09T16:10:25.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.565+0000 7fe343e80640 1 -- start start 2026-03-09T16:10:25.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.565+0000 7fe343e80640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe33c102640 0x7fe33c078ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:25.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.565+0000 7fe343e80640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe33c079400 con 0x7fe33c102640 2026-03-09T16:10:25.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.566+0000 7fe341bf5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe33c102640 0x7fe33c078ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:25.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.566+0000 7fe341bf5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe33c102640 0x7fe33c078ec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:49148/0 (socket says 192.168.123.105:49148) 2026-03-09T16:10:25.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.566+0000 7fe341bf5640 1 -- 192.168.123.105:0/1945043803 learned_addr learned my addr 192.168.123.105:0/1945043803 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:25.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.566+0000 7fe341bf5640 1 -- 192.168.123.105:0/1945043803 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe32c009660 con 0x7fe33c102640 2026-03-09T16:10:25.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.566+0000 7fe341bf5640 1 --2- 192.168.123.105:0/1945043803 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe33c102640 0x7fe33c078ec0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fe32c02f860 tx=0x7fe32c004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:25.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.567+0000 7fe32affd640 1 -- 192.168.123.105:0/1945043803 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe32c0043b0 con 0x7fe33c102640 2026-03-09T16:10:25.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.567+0000 7fe32affd640 1 -- 192.168.123.105:0/1945043803 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe32c038b40 con 0x7fe33c102640 2026-03-09T16:10:25.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.567+0000 7fe32affd640 1 -- 192.168.123.105:0/1945043803 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe32c0418f0 con 0x7fe33c102640 2026-03-09T16:10:25.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.567+0000 7fe343e80640 1 -- 192.168.123.105:0/1945043803 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe33c079600 con 0x7fe33c102640 
2026-03-09T16:10:25.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.567+0000 7fe343e80640 1 -- 192.168.123.105:0/1945043803 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe33c075a00 con 0x7fe33c102640 2026-03-09T16:10:25.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.568+0000 7fe343e80640 1 -- 192.168.123.105:0/1945043803 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe304005350 con 0x7fe33c102640 2026-03-09T16:10:25.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.569+0000 7fe32affd640 1 -- 192.168.123.105:0/1945043803 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7fe32c038cb0 con 0x7fe33c102640 2026-03-09T16:10:25.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.569+0000 7fe32affd640 1 --2- 192.168.123.105:0/1945043803 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe31403d320 0x7fe31403f7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:25.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.569+0000 7fe32affd640 1 -- 192.168.123.105:0/1945043803 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fe32c0764f0 con 0x7fe33c102640 2026-03-09T16:10:25.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.569+0000 7fe3413f4640 1 --2- 192.168.123.105:0/1945043803 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe31403d320 0x7fe31403f7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:25.571 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.570+0000 7fe3413f4640 1 --2- 192.168.123.105:0/1945043803 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe31403d320 0x7fe31403f7e0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fe3300099c0 tx=0x7fe330006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:25.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.571+0000 7fe32affd640 1 -- 192.168.123.105:0/1945043803 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fe32c048310 con 0x7fe33c102640 2026-03-09T16:10:25.704 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.701+0000 7fe343e80640 1 -- 192.168.123.105:0/1945043803 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fe3040051c0 con 0x7fe33c102640 2026-03-09T16:10:25.704 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.703+0000 7fe32affd640 1 -- 192.168.123.105:0/1945043803 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7fe32c051310 con 0x7fe33c102640 2026-03-09T16:10:25.705 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:25.705 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:25.705 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:25.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.706+0000 7fe343e80640 1 -- 192.168.123.105:0/1945043803 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe31403d320 msgr2=0x7fe31403f7e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:25.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.706+0000 7fe343e80640 1 --2- 192.168.123.105:0/1945043803 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe31403d320 0x7fe31403f7e0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fe3300099c0 tx=0x7fe330006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:25.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.706+0000 7fe343e80640 1 -- 192.168.123.105:0/1945043803 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe33c102640 msgr2=0x7fe33c078ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:25.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.706+0000 7fe343e80640 1 --2- 192.168.123.105:0/1945043803 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe33c102640 0x7fe33c078ec0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fe32c02f860 tx=0x7fe32c004270 comp rx=0 tx=0).stop 2026-03-09T16:10:25.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.707+0000 7fe343e80640 1 -- 192.168.123.105:0/1945043803 shutdown_connections 2026-03-09T16:10:25.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.707+0000 7fe343e80640 1 --2- 192.168.123.105:0/1945043803 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fe31403d320 0x7fe31403f7e0 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:25.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.707+0000 7fe343e80640 1 --2- 192.168.123.105:0/1945043803 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe33c102640 0x7fe33c078ec0 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:25.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.707+0000 7fe343e80640 1 -- 192.168.123.105:0/1945043803 >> 192.168.123.105:0/1945043803 conn(0x7fe33c0fde70 msgr2=0x7fe33c0fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:25.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.707+0000 7fe343e80640 1 -- 192.168.123.105:0/1945043803 shutdown_connections 2026-03-09T16:10:25.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:25.707+0000 7fe343e80640 1 -- 
192.168.123.105:0/1945043803 wait complete. 2026-03-09T16:10:26.781 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:26.781 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:26.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:26 vm03 ceph-mon[51019]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:10:26.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:26 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/1945043803' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:26.923 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:26.960 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:27.200 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.199+0000 7f0fc4acf640 1 -- 192.168.123.105:0/1751462696 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0fc0102620 msgr2=0x7f0fc0102a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:27.200 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.199+0000 7f0fc4acf640 1 --2- 192.168.123.105:0/1751462696 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0fc0102620 0x7f0fc0102a20 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f0fac0099b0 tx=0x7f0fac02f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:27.200 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.199+0000 7f0fc4acf640 1 -- 192.168.123.105:0/1751462696 shutdown_connections 2026-03-09T16:10:27.201 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.199+0000 7f0fc4acf640 1 --2- 192.168.123.105:0/1751462696 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0fc0102620 0x7f0fc0102a20 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:27.201 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.199+0000 7f0fc4acf640 1 -- 192.168.123.105:0/1751462696 >> 192.168.123.105:0/1751462696 conn(0x7f0fc00fde70 msgr2=0x7f0fc0100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:27.201 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.199+0000 7f0fc4acf640 1 -- 192.168.123.105:0/1751462696 shutdown_connections 2026-03-09T16:10:27.201 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.200+0000 7f0fc4acf640 1 -- 192.168.123.105:0/1751462696 wait complete. 
2026-03-09T16:10:27.201 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.200+0000 7f0fc4acf640 1 Processor -- start 2026-03-09T16:10:27.201 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.200+0000 7f0fc4acf640 1 -- start start 2026-03-09T16:10:27.201 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.201+0000 7f0fc4acf640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0fc0102620 0x7f0fc0199990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:27.201 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.201+0000 7f0fc4acf640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0fc0199ed0 con 0x7f0fc0102620 2026-03-09T16:10:27.202 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.201+0000 7f0fbe575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0fc0102620 0x7f0fc0199990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:27.202 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.201+0000 7f0fbe575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0fc0102620 0x7f0fc0199990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:49166/0 (socket says 192.168.123.105:49166) 2026-03-09T16:10:27.202 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.201+0000 7f0fbe575640 1 -- 192.168.123.105:0/2555569122 learned_addr learned my addr 192.168.123.105:0/2555569122 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:27.202 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.201+0000 7f0fbe575640 1 -- 192.168.123.105:0/2555569122 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0fac009660 con 0x7f0fc0102620 2026-03-09T16:10:27.202 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.202+0000 7f0fbe575640 1 --2- 192.168.123.105:0/2555569122 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0fc0102620 0x7f0fc0199990 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f0fac02f860 tx=0x7f0fac004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:27.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.202+0000 7f0fab7fe640 1 -- 192.168.123.105:0/2555569122 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0fac0043b0 con 0x7f0fc0102620 2026-03-09T16:10:27.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.202+0000 7f0fc4acf640 1 -- 192.168.123.105:0/2555569122 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0fc019a0d0 con 0x7f0fc0102620 2026-03-09T16:10:27.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.202+0000 7f0fc4acf640 1 -- 192.168.123.105:0/2555569122 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0fc019a570 con 0x7f0fc0102620 2026-03-09T16:10:27.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.203+0000 7f0fab7fe640 1 -- 192.168.123.105:0/2555569122 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0fac038b40 con 0x7f0fc0102620 
2026-03-09T16:10:27.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.203+0000 7f0fab7fe640 1 -- 192.168.123.105:0/2555569122 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0fac041810 con 0x7f0fc0102620 2026-03-09T16:10:27.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.204+0000 7f0fab7fe640 1 -- 192.168.123.105:0/2555569122 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f0fac041ac0 con 0x7f0fc0102620 2026-03-09T16:10:27.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.204+0000 7f0fc4acf640 1 -- 192.168.123.105:0/2555569122 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0fc0102aa0 con 0x7f0fc0102620 2026-03-09T16:10:27.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.204+0000 7f0fab7fe640 1 --2- 192.168.123.105:0/2555569122 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f0f9003d2d0 0x7f0f9003f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:27.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.204+0000 7f0fab7fe640 1 -- 192.168.123.105:0/2555569122 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f0fac076250 con 0x7f0fc0102620 2026-03-09T16:10:27.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.204+0000 7f0fbdd74640 1 --2- 192.168.123.105:0/2555569122 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f0f9003d2d0 0x7f0f9003f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:27.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.205+0000 7f0fbdd74640 1 --2- 192.168.123.105:0/2555569122 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f0f9003d2d0 0x7f0f9003f790 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f0fb40099c0 tx=0x7f0fb4006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:27.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.207+0000 7f0fab7fe640 1 -- 192.168.123.105:0/2555569122 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f0fac049b50 con 0x7f0fc0102620 2026-03-09T16:10:27.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.331+0000 7f0fc4acf640 1 -- 192.168.123.105:0/2555569122 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f0fc0108530 con 0x7f0fc0102620 2026-03-09T16:10:27.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.332+0000 7f0fab7fe640 1 -- 192.168.123.105:0/2555569122 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f0fac05a090 con 0x7f0fc0102620 2026-03-09T16:10:27.333 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:27.333 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:27.333 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:27.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.334+0000 7f0fc4acf640 1 -- 192.168.123.105:0/2555569122 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f0f9003d2d0 msgr2=0x7f0f9003f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:27.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.334+0000 7f0fc4acf640 1 --2- 192.168.123.105:0/2555569122 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f0f9003d2d0 0x7f0f9003f790 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f0fb40099c0 tx=0x7f0fb4006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:27.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.334+0000 7f0fc4acf640 1 -- 192.168.123.105:0/2555569122 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0fc0102620 msgr2=0x7f0fc0199990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:27.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.335+0000 7f0fc4acf640 1 --2- 192.168.123.105:0/2555569122 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0fc0102620 0x7f0fc0199990 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f0fac02f860 tx=0x7f0fac004270 comp rx=0 tx=0).stop 2026-03-09T16:10:27.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.335+0000 7f0fc4acf640 1 -- 192.168.123.105:0/2555569122 shutdown_connections 2026-03-09T16:10:27.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.335+0000 7f0fc4acf640 1 --2- 192.168.123.105:0/2555569122 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f0f9003d2d0 0x7f0f9003f790 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:27.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.335+0000 7f0fc4acf640 1 --2- 192.168.123.105:0/2555569122 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0fc0102620 0x7f0fc0199990 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:27.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.335+0000 7f0fc4acf640 1 -- 192.168.123.105:0/2555569122 >> 192.168.123.105:0/2555569122 conn(0x7f0fc00fde70 msgr2=0x7f0fc00fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:27.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.335+0000 7f0fc4acf640 1 -- 192.168.123.105:0/2555569122 shutdown_connections 2026-03-09T16:10:27.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:27.336+0000 7f0fc4acf640 1 -- 
192.168.123.105:0/2555569122 wait complete. 2026-03-09T16:10:27.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:27 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/2555569122' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:27.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:27 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:28.412 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:28.413 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:28.560 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:28.600 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:28.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.854+0000 7f9d859d5640 1 -- 192.168.123.105:0/1275111300 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d80075ba0 msgr2=0x7f9d80075fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:28.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.854+0000 7f9d859d5640 1 --2- 192.168.123.105:0/1275111300 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d80075ba0 0x7f9d80075fa0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f9d680099b0 tx=0x7f9d6802f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:28.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.855+0000 7f9d859d5640 1 -- 192.168.123.105:0/1275111300 shutdown_connections 2026-03-09T16:10:28.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.855+0000 7f9d859d5640 1 --2- 192.168.123.105:0/1275111300 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d80075ba0 0x7f9d80075fa0 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:28.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.855+0000 7f9d859d5640 1 -- 192.168.123.105:0/1275111300 >> 192.168.123.105:0/1275111300 conn(0x7f9d800fdca0 msgr2=0x7f9d80100090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:28.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.855+0000 7f9d859d5640 1 -- 192.168.123.105:0/1275111300 shutdown_connections 2026-03-09T16:10:28.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.855+0000 7f9d859d5640 1 -- 192.168.123.105:0/1275111300 wait complete. 
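The task keeps logging "Waiting for 2 mons in monmap..." because every `ceph mon dump -f json` call above still returns monmap epoch 1 with a single mon (vm03). A minimal sketch of that check, assuming only the JSON shape visible in the log output; the helper name `mons_in_monmap` is hypothetical and is not teuthology's or cephadm's actual API:

```python
# Illustrative sketch (not the actual teuthology task code): given the JSON
# emitted by `ceph mon dump -f json` above, count the mons currently present
# in the monmap so a caller can decide whether to keep polling.
import json

def mons_in_monmap(mon_dump_json: str) -> int:
    """Return the number of mons listed in a `ceph mon dump -f json` blob."""
    monmap = json.loads(mon_dump_json)
    return len(monmap.get("mons", []))

# Using the epoch-1 monmap printed in the log above: only mon "vm03" is
# present, so a check for 2 mons would keep the task waiting.
sample = '{"epoch":1,"mons":[{"rank":0,"name":"vm03"}],"quorum":[0]}'
assert mons_in_monmap(sample) == 1
```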
2026-03-09T16:10:28.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.855+0000 7f9d859d5640 1 Processor -- start 2026-03-09T16:10:28.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.855+0000 7f9d859d5640 1 -- start start 2026-03-09T16:10:28.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.856+0000 7f9d859d5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d80075ba0 0x7f9d8019dcc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:28.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.856+0000 7f9d859d5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d8019e200 con 0x7f9d80075ba0 2026-03-09T16:10:28.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.856+0000 7f9d849d3640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d80075ba0 0x7f9d8019dcc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:28.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.856+0000 7f9d849d3640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d80075ba0 0x7f9d8019dcc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:35646/0 (socket says 192.168.123.105:35646) 2026-03-09T16:10:28.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.856+0000 7f9d849d3640 1 -- 192.168.123.105:0/2490688106 learned_addr learned my addr 192.168.123.105:0/2490688106 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:28.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.856+0000 7f9d849d3640 1 -- 192.168.123.105:0/2490688106 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9d68009660 con 0x7f9d80075ba0 2026-03-09T16:10:28.858 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.857+0000 7f9d849d3640 1 --2- 192.168.123.105:0/2490688106 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d80075ba0 0x7f9d8019dcc0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f9d6802f860 tx=0x7f9d68004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:28.858 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.857+0000 7f9d75ffb640 1 -- 192.168.123.105:0/2490688106 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9d680043b0 con 0x7f9d80075ba0 2026-03-09T16:10:28.858 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.857+0000 7f9d859d5640 1 -- 192.168.123.105:0/2490688106 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9d8019e400 con 0x7f9d80075ba0 2026-03-09T16:10:28.858 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.857+0000 7f9d859d5640 1 -- 192.168.123.105:0/2490688106 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9d8019e8a0 con 0x7f9d80075ba0 2026-03-09T16:10:28.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.858+0000 7f9d75ffb640 1 -- 192.168.123.105:0/2490688106 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9d68038b40 con 0x7f9d80075ba0 
2026-03-09T16:10:28.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.858+0000 7f9d75ffb640 1 -- 192.168.123.105:0/2490688106 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9d68041810 con 0x7f9d80075ba0 2026-03-09T16:10:28.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.858+0000 7f9d859d5640 1 -- 192.168.123.105:0/2490688106 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9d80076020 con 0x7f9d80075ba0 2026-03-09T16:10:28.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.859+0000 7f9d75ffb640 1 -- 192.168.123.105:0/2490688106 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f9d68038cb0 con 0x7f9d80075ba0 2026-03-09T16:10:28.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.859+0000 7f9d75ffb640 1 --2- 192.168.123.105:0/2490688106 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f9d5003d280 0x7f9d5003f740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:28.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.859+0000 7f9d75ffb640 1 -- 192.168.123.105:0/2490688106 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f9d68076be0 con 0x7f9d80075ba0 2026-03-09T16:10:28.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.859+0000 7f9d77fff640 1 --2- 192.168.123.105:0/2490688106 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f9d5003d280 0x7f9d5003f740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:28.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.860+0000 7f9d77fff640 1 --2- 192.168.123.105:0/2490688106 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f9d5003d280 0x7f9d5003f740 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f9d700099c0 tx=0x7f9d70006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:28.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.863+0000 7f9d75ffb640 1 -- 192.168.123.105:0/2490688106 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f9d68036b60 con 0x7f9d80075ba0 2026-03-09T16:10:28.901 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:28 vm03 ceph-mon[51019]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:10:28.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.995+0000 7f9d859d5640 1 -- 192.168.123.105:0/2490688106 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f9d8010c8d0 con 0x7f9d80075ba0 2026-03-09T16:10:28.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.997+0000 7f9d75ffb640 1 -- 192.168.123.105:0/2490688106 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f9d68036b60 con 0x7f9d80075ba0 2026-03-09T16:10:28.998 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:28.998 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:28.998 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:29.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.999+0000 7f9d859d5640 1 -- 192.168.123.105:0/2490688106 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f9d5003d280 msgr2=0x7f9d5003f740 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:29.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.999+0000 7f9d859d5640 1 --2- 192.168.123.105:0/2490688106 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f9d5003d280 0x7f9d5003f740 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f9d700099c0 tx=0x7f9d70006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:29.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.999+0000 7f9d859d5640 1 -- 192.168.123.105:0/2490688106 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d80075ba0 msgr2=0x7f9d8019dcc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:29.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.999+0000 7f9d859d5640 1 --2- 192.168.123.105:0/2490688106 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d80075ba0 0x7f9d8019dcc0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f9d6802f860 tx=0x7f9d68004270 comp rx=0 tx=0).stop 2026-03-09T16:10:29.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.999+0000 7f9d859d5640 1 -- 192.168.123.105:0/2490688106 shutdown_connections 2026-03-09T16:10:29.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.999+0000 7f9d859d5640 1 --2- 192.168.123.105:0/2490688106 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7f9d5003d280 0x7f9d5003f740 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:29.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.999+0000 7f9d859d5640 1 --2- 192.168.123.105:0/2490688106 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d80075ba0 0x7f9d8019dcc0 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:29.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:28.999+0000 7f9d859d5640 1 -- 192.168.123.105:0/2490688106 >> 192.168.123.105:0/2490688106 conn(0x7f9d800fdca0 msgr2=0x7f9d800fe8a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:29.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:29.000+0000 7f9d859d5640 1 -- 192.168.123.105:0/2490688106 shutdown_connections 2026-03-09T16:10:29.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:29.000+0000 7f9d859d5640 1 -- 
192.168.123.105:0/2490688106 wait complete. 2026-03-09T16:10:29.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:29 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/2490688106' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:29.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:29 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:29.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:29 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:29.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:29 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:29.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:29 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch 2026-03-09T16:10:30.061 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:30.061 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:30.193 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:30.233 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:30.480 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.478+0000 7fd1cdcef640 1 -- 192.168.123.105:0/1336004719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1c8102620 msgr2=0x7fd1c8102a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:30.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.478+0000 7fd1cdcef640 1 --2- 192.168.123.105:0/1336004719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1c8102620 0x7fd1c8102a20 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7fd1ac0099b0 tx=0x7fd1ac02f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:30.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.479+0000 7fd1cdcef640 1 -- 192.168.123.105:0/1336004719 shutdown_connections 2026-03-09T16:10:30.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.479+0000 7fd1cdcef640 1 --2- 192.168.123.105:0/1336004719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1c8102620 0x7fd1c8102a20 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:30.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.479+0000 7fd1cdcef640 1 -- 192.168.123.105:0/1336004719 >> 192.168.123.105:0/1336004719 conn(0x7fd1c80fde70 msgr2=0x7fd1c8100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:30.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.480+0000 7fd1cdcef640 1 -- 192.168.123.105:0/1336004719 shutdown_connections 2026-03-09T16:10:30.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.480+0000 7fd1cdcef640 1 -- 192.168.123.105:0/1336004719 wait complete. 
2026-03-09T16:10:30.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.480+0000 7fd1cdcef640 1 Processor -- start 2026-03-09T16:10:30.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.480+0000 7fd1cdcef640 1 -- start start 2026-03-09T16:10:30.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.481+0000 7fd1cdcef640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1c8102620 0x7fd1c8078ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:30.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.481+0000 7fd1cdcef640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd1c8079400 con 0x7fd1c8102620 2026-03-09T16:10:30.482 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.481+0000 7fd1c77fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1c8102620 0x7fd1c8078ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:30.482 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.481+0000 7fd1c77fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1c8102620 0x7fd1c8078ec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:35652/0 (socket says 192.168.123.105:35652) 2026-03-09T16:10:30.482 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.481+0000 7fd1c77fe640 1 -- 192.168.123.105:0/855717499 learned_addr learned my addr 192.168.123.105:0/855717499 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:30.482 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.481+0000 7fd1c77fe640 1 -- 192.168.123.105:0/855717499 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd1ac009660 con 0x7fd1c8102620 2026-03-09T16:10:30.482 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.482+0000 7fd1c77fe640 1 --2- 192.168.123.105:0/855717499 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1c8102620 0x7fd1c8078ec0 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7fd1ac02f860 tx=0x7fd1ac004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:30.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.482+0000 7fd1c4ff9640 1 -- 192.168.123.105:0/855717499 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd1ac0043b0 con 0x7fd1c8102620 2026-03-09T16:10:30.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.482+0000 7fd1c4ff9640 1 -- 192.168.123.105:0/855717499 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd1ac038b40 con 0x7fd1c8102620 2026-03-09T16:10:30.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.482+0000 7fd1cdcef640 1 -- 192.168.123.105:0/855717499 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd1c8079600 con 0x7fd1c8102620 2026-03-09T16:10:30.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.482+0000 7fd1cdcef640 1 -- 192.168.123.105:0/855717499 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd1c8075a00 con 0x7fd1c8102620 2026-03-09T16:10:30.483 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.482+0000 7fd1c4ff9640 1 -- 192.168.123.105:0/855717499 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd1ac0418f0 con 0x7fd1c8102620 2026-03-09T16:10:30.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.483+0000 7fd1c4ff9640 1 -- 192.168.123.105:0/855717499 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 49429+0+0 (secure 0 0 0) 0x7fd1ac038cb0 con 0x7fd1c8102620 2026-03-09T16:10:30.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.483+0000 7fd1cdcef640 1 -- 192.168.123.105:0/855717499 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd1c8102aa0 con 0x7fd1c8102620 2026-03-09T16:10:30.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.483+0000 7fd1c4ff9640 1 --2- 192.168.123.105:0/855717499 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fd1a003cf60 0x7fd1a003f420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:30.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.483+0000 7fd1c4ff9640 1 -- 192.168.123.105:0/855717499 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fd1ac075e80 con 0x7fd1c8102620 2026-03-09T16:10:30.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.483+0000 7fd1c6ffd640 1 --2- 192.168.123.105:0/855717499 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fd1a003cf60 0x7fd1a003f420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:30.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.484+0000 7fd1c6ffd640 1 --2- 192.168.123.105:0/855717499 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fd1a003cf60 0x7fd1a003f420 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fd1b40099c0 tx=0x7fd1b4006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:30.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.486+0000 7fd1c4ff9640 1 -- 192.168.123.105:0/855717499 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fd1ac079030 con 0x7fd1c8102620 2026-03-09T16:10:30.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.552+0000 7fd1c4ff9640 1 -- 192.168.123.105:0/855717499 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 13) v1 ==== 49443+0+0 (secure 0 0 0) 0x7fd1ac04a3c0 con 0x7fd1c8102620 2026-03-09T16:10:30.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.583+0000 7fd1c6ffd640 1 -- 192.168.123.105:0/855717499 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fd1a003cf60 msgr2=0x7fd1a003f420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 16 2026-03-09T16:10:30.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.584+0000 7fd1c6ffd640 1 -- 192.168.123.105:0/855717499 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fd1a003cf60 msgr2=0x7fd1a003f420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-09T16:10:30.597 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.584+0000 7fd1c6ffd640 1 --2- 192.168.123.105:0/855717499 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fd1a003cf60 0x7fd1a003f420 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fd1b40099c0 tx=0x7fd1b4006eb0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-09T16:10:30.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.584+0000 7fd1c6ffd640 1 --2- 192.168.123.105:0/855717499 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fd1a003cf60 0x7fd1a003f420 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fd1b40099c0 tx=0x7fd1b4006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:30.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.584+0000 7fd1c4ff9640 1 -- 192.168.123.105:0/855717499 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fd1a003cf60 msgr2=0x7fd1a003f420 unknown :-1 s=STATE_CLOSED l=1).mark_down 2026-03-09T16:10:30.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.584+0000 7fd1c4ff9640 1 --2- 192.168.123.105:0/855717499 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fd1a003cf60 0x7fd1a003f420 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:30.637 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.633+0000 7fd1cdcef640 1 -- 192.168.123.105:0/855717499 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fd1c8108530 con 0x7fd1c8102620 2026-03-09T16:10:30.637 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.636+0000 7fd1c4ff9640 1 -- 192.168.123.105:0/855717499 <== mon.0 v2:192.168.123.103:3300/0 8 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7fd1ac035cd0 con 0x7fd1c8102620 2026-03-09T16:10:30.637 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:30.637 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:30.637 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:30.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.638+0000 7fd1cdcef640 1 -- 192.168.123.105:0/855717499 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1c8102620 msgr2=0x7fd1c8078ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:30.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.638+0000 7fd1cdcef640 1 --2- 192.168.123.105:0/855717499 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1c8102620 0x7fd1c8078ec0 secure :-1 s=READY pgs=130 cs=0 
l=1 rev1=1 crypto rx=0x7fd1ac02f860 tx=0x7fd1ac004270 comp rx=0 tx=0).stop 2026-03-09T16:10:30.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.638+0000 7fd1cdcef640 1 -- 192.168.123.105:0/855717499 shutdown_connections 2026-03-09T16:10:30.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.638+0000 7fd1cdcef640 1 --2- 192.168.123.105:0/855717499 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fd1a003cf60 0x7fd1a003f420 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:30.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.638+0000 7fd1cdcef640 1 --2- 192.168.123.105:0/855717499 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1c8102620 0x7fd1c8078ec0 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:30.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.638+0000 7fd1cdcef640 1 -- 192.168.123.105:0/855717499 >> 192.168.123.105:0/855717499 conn(0x7fd1c80fde70 msgr2=0x7fd1c80fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:30.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.639+0000 7fd1cdcef640 1 -- 192.168.123.105:0/855717499 shutdown_connections 2026-03-09T16:10:30.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:30.639+0000 7fd1cdcef640 1 -- 192.168.123.105:0/855717499 wait complete. 2026-03-09T16:10:30.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:30 vm03 ceph-mon[51019]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:10:30.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:30 vm03 ceph-mon[51019]: from='mgr.14162 192.168.123.103:0/2596380218' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished 2026-03-09T16:10:30.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:30 vm03 ceph-mon[51019]: mgrmap e13: vm03.gbgzmu(active, since 33s) 2026-03-09T16:10:30.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:30 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/855717499' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:31.679 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T16:10:31.679 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:31.826 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:31.863 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:32.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.144+0000 7fefb6395640 1 -- 192.168.123.105:0/3607319807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefb0100ca0 msgr2=0x7fefb0103090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:32.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.144+0000 7fefb6395640 1 --2- 192.168.123.105:0/3607319807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefb0100ca0 0x7fefb0103090 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7fef980099b0 tx=0x7fef9802f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:32.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.145+0000 7fefb6395640 1 -- 192.168.123.105:0/3607319807 shutdown_connections 2026-03-09T16:10:32.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.145+0000 7fefb6395640 1 --2- 192.168.123.105:0/3607319807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefb0100ca0 0x7fefb0103090 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:32.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.145+0000 7fefb6395640 1 -- 192.168.123.105:0/3607319807 >> 192.168.123.105:0/3607319807 conn(0x7fefb00fa840 msgr2=0x7fefb00fcc60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:32.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.145+0000 7fefb6395640 1 -- 192.168.123.105:0/3607319807 shutdown_connections 2026-03-09T16:10:32.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.145+0000 7fefb6395640 1 -- 192.168.123.105:0/3607319807 wait complete. 
2026-03-09T16:10:32.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.145+0000 7fefb6395640 1 Processor -- start 2026-03-09T16:10:32.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.145+0000 7fefb6395640 1 -- start start 2026-03-09T16:10:32.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.146+0000 7fefb6395640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefb0100ca0 0x7fefb0195250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:32.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.146+0000 7fefb6395640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fefb0195790 con 0x7fefb0100ca0 2026-03-09T16:10:32.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.146+0000 7fefb5393640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefb0100ca0 0x7fefb0195250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:32.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.146+0000 7fefb5393640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefb0100ca0 0x7fefb0195250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:35672/0 (socket says 192.168.123.105:35672) 2026-03-09T16:10:32.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.146+0000 7fefb5393640 1 -- 192.168.123.105:0/1491814480 learned_addr learned my addr 192.168.123.105:0/1491814480 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:32.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.146+0000 7fefb5393640 1 -- 192.168.123.105:0/1491814480 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fef98009660 con 0x7fefb0100ca0 2026-03-09T16:10:32.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.146+0000 7fefb5393640 1 --2- 192.168.123.105:0/1491814480 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefb0100ca0 0x7fefb0195250 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fef9802f860 tx=0x7fef98004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:32.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.147+0000 7fefa67fc640 1 -- 192.168.123.105:0/1491814480 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fef980043b0 con 0x7fefb0100ca0 2026-03-09T16:10:32.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.147+0000 7fefa67fc640 1 -- 192.168.123.105:0/1491814480 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fef98038b40 con 0x7fefb0100ca0 2026-03-09T16:10:32.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.147+0000 7fefa67fc640 1 -- 192.168.123.105:0/1491814480 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fef980418f0 con 0x7fefb0100ca0 2026-03-09T16:10:32.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.147+0000 7fefb6395640 1 -- 192.168.123.105:0/1491814480 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fefb0195990 con 0x7fefb0100ca0 
2026-03-09T16:10:32.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.147+0000 7fefb6395640 1 -- 192.168.123.105:0/1491814480 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fefb0195e30 con 0x7fefb0100ca0 2026-03-09T16:10:32.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.148+0000 7fefa67fc640 1 -- 192.168.123.105:0/1491814480 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 49443+0+0 (secure 0 0 0) 0x7fef98038cb0 con 0x7fefb0100ca0 2026-03-09T16:10:32.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.148+0000 7fefb6395640 1 -- 192.168.123.105:0/1491814480 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fef74005350 con 0x7fefb0100ca0 2026-03-09T16:10:32.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.148+0000 7fefa67fc640 1 --2- 192.168.123.105:0/1491814480 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fef8003cfb0 0x7fef8003f470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:32.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.149+0000 7fefb4b92640 1 -- 192.168.123.105:0/1491814480 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fef8003cfb0 msgr2=0x7fef8003f470 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/3405276359 2026-03-09T16:10:32.150 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.149+0000 7fefb4b92640 1 --2- 192.168.123.105:0/1491814480 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fef8003cfb0 0x7fef8003f470 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T16:10:32.150 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.149+0000 7fefa67fc640 1 -- 192.168.123.105:0/1491814480 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fef98075c20 con 0x7fefb0100ca0 2026-03-09T16:10:32.152 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.151+0000 7fefa67fc640 1 -- 192.168.123.105:0/1491814480 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fef98035320 con 0x7fefb0100ca0 2026-03-09T16:10:32.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.288+0000 7fefb6395640 1 -- 192.168.123.105:0/1491814480 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fef74005600 con 0x7fefb0100ca0 2026-03-09T16:10:32.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.289+0000 7fefa67fc640 1 -- 192.168.123.105:0/1491814480 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7fef98048b00 con 0x7fefb0100ca0 2026-03-09T16:10:32.290 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:32.290 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:32.290 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:32.293 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.292+0000 7fefb6395640 1 -- 192.168.123.105:0/1491814480 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fef8003cfb0 msgr2=0x7fef8003f470 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:10:32.293 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.292+0000 7fefb6395640 1 --2- 192.168.123.105:0/1491814480 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fef8003cfb0 0x7fef8003f470 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:32.293 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.292+0000 7fefb6395640 1 -- 192.168.123.105:0/1491814480 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefb0100ca0 msgr2=0x7fefb0195250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:32.293 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.292+0000 7fefb6395640 1 --2- 192.168.123.105:0/1491814480 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefb0100ca0 0x7fefb0195250 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fef9802f860 tx=0x7fef98004270 comp rx=0 tx=0).stop 2026-03-09T16:10:32.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.293+0000 7fefb6395640 1 -- 192.168.123.105:0/1491814480 shutdown_connections 2026-03-09T16:10:32.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.293+0000 7fefb6395640 1 --2- 192.168.123.105:0/1491814480 >> [v2:192.168.123.103:6800/3405276359,v1:192.168.123.103:6801/3405276359] conn(0x7fef8003cfb0 0x7fef8003f470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:32.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.293+0000 7fefb6395640 1 --2- 192.168.123.105:0/1491814480 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefb0100ca0 0x7fefb0195250 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:32.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.293+0000 7fefb6395640 1 -- 192.168.123.105:0/1491814480 >> 192.168.123.105:0/1491814480 conn(0x7fefb00fa840 msgr2=0x7fefb00fb080 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:32.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.293+0000 7fefb6395640 1 -- 192.168.123.105:0/1491814480 shutdown_connections 2026-03-09T16:10:32.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:32.293+0000 7fefb6395640 1 -- 192.168.123.105:0/1491814480 wait complete. 
2026-03-09T16:10:32.433 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:32 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/1491814480' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:33.367 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:33.367 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:33.535 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:33.568 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:33.862 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.860+0000 7f605ebee640 1 -- 192.168.123.105:0/2813874496 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6058102640 msgr2=0x7f6058102a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:33.862 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.860+0000 7f605ebee640 1 --2- 192.168.123.105:0/2813874496 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6058102640 0x7f6058102a40 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f60400099b0 tx=0x7f604002f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:33.862 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.861+0000 7f605ebee640 1 -- 192.168.123.105:0/2813874496 shutdown_connections 2026-03-09T16:10:33.863 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.861+0000 7f605ebee640 1 --2- 192.168.123.105:0/2813874496 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6058102640 0x7f6058102a40 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:33.863 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.861+0000 7f605ebee640 1 -- 192.168.123.105:0/2813874496 >> 192.168.123.105:0/2813874496 conn(0x7f60580fde70 msgr2=0x7f6058100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:33.863 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.862+0000 7f605ebee640 1 -- 192.168.123.105:0/2813874496 shutdown_connections 2026-03-09T16:10:33.863 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.862+0000 7f605ebee640 1 -- 192.168.123.105:0/2813874496 wait complete. 
2026-03-09T16:10:33.863 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.862+0000 7f605ebee640 1 Processor -- start 2026-03-09T16:10:33.863 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.862+0000 7f605ebee640 1 -- start start 2026-03-09T16:10:33.863 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.863+0000 7f605ebee640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6058199b70 0x7f6058199f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:33.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.863+0000 7f605ebee640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f605819a4d0 con 0x7f6058199b70 2026-03-09T16:10:33.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.863+0000 7f605c963640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6058199b70 0x7f6058199f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:33.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.863+0000 7f605c963640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6058199b70 0x7f6058199f90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:35686/0 (socket says 192.168.123.105:35686) 2026-03-09T16:10:33.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.863+0000 7f605c963640 1 -- 192.168.123.105:0/1931985318 learned_addr learned my addr 192.168.123.105:0/1931985318 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:33.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.863+0000 7f605c963640 1 -- 192.168.123.105:0/1931985318 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6040009660 con 0x7f6058199b70 2026-03-09T16:10:33.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.864+0000 7f605c963640 1 --2- 192.168.123.105:0/1931985318 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6058199b70 0x7f6058199f90 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f60581036a0 tx=0x7f6040004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:33.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.867+0000 7f604dffb640 1 -- 192.168.123.105:0/1931985318 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6040038470 con 0x7f6058199b70 2026-03-09T16:10:33.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.867+0000 7f604dffb640 1 -- 192.168.123.105:0/1931985318 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6040038a90 con 0x7f6058199b70 2026-03-09T16:10:33.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.867+0000 7f604dffb640 1 -- 192.168.123.105:0/1931985318 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6040041920 con 0x7f6058199b70 2026-03-09T16:10:33.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.867+0000 7f605ebee640 1 -- 192.168.123.105:0/1931985318 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f605819a6d0 con 0x7f6058199b70 
2026-03-09T16:10:33.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.867+0000 7f605ebee640 1 -- 192.168.123.105:0/1931985318 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f605819d240 con 0x7f6058199b70 2026-03-09T16:10:33.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.868+0000 7f604dffb640 1 -- 192.168.123.105:0/1931985318 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 14) v1 ==== 49164+0+0 (secure 0 0 0) 0x7f60400385d0 con 0x7f6058199b70 2026-03-09T16:10:33.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.868+0000 7f605ebee640 1 -- 192.168.123.105:0/1931985318 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6024005350 con 0x7f6058199b70 2026-03-09T16:10:33.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.868+0000 7f604dffb640 1 -- 192.168.123.105:0/1931985318 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7f6040075da0 con 0x7f6058199b70 2026-03-09T16:10:33.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:33.871+0000 7f604dffb640 1 -- 192.168.123.105:0/1931985318 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6040037bc0 con 0x7f6058199b70 2026-03-09T16:10:34.018 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:34.016+0000 7f605ebee640 1 -- 192.168.123.105:0/1931985318 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f60240051c0 con 0x7f6058199b70 2026-03-09T16:10:34.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:34.017+0000 7f604dffb640 1 -- 192.168.123.105:0/1931985318 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f60400501f0 con 0x7f6058199b70 2026-03-09T16:10:34.019 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:34.019 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:34.019 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:34.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:34.019+0000 7f605ebee640 1 -- 192.168.123.105:0/1931985318 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6058199b70 msgr2=0x7f6058199f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:34.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:34.019+0000 7f605ebee640 1 --2- 192.168.123.105:0/1931985318 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6058199b70 
0x7f6058199f90 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f60581036a0 tx=0x7f6040004290 comp rx=0 tx=0).stop 2026-03-09T16:10:34.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:34.020+0000 7f605ebee640 1 -- 192.168.123.105:0/1931985318 shutdown_connections 2026-03-09T16:10:34.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:34.020+0000 7f605ebee640 1 --2- 192.168.123.105:0/1931985318 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6058199b70 0x7f6058199f90 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:34.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:34.020+0000 7f605ebee640 1 -- 192.168.123.105:0/1931985318 >> 192.168.123.105:0/1931985318 conn(0x7f60580fde70 msgr2=0x7f60580fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:34.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:34.020+0000 7f605ebee640 1 -- 192.168.123.105:0/1931985318 shutdown_connections 2026-03-09T16:10:34.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:34.020+0000 7f605ebee640 1 -- 192.168.123.105:0/1931985318 wait complete. 2026-03-09T16:10:34.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:33 vm03 ceph-mon[51019]: Active manager daemon vm03.gbgzmu restarted 2026-03-09T16:10:34.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:33 vm03 ceph-mon[51019]: Activating manager daemon vm03.gbgzmu 2026-03-09T16:10:34.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:33 vm03 ceph-mon[51019]: osdmap e5: 0 total, 0 up, 0 in 2026-03-09T16:10:34.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:33 vm03 ceph-mon[51019]: mgrmap e14: vm03.gbgzmu(active, starting, since 0.0055223s) 2026-03-09T16:10:34.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:33 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:10:34.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:33 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr metadata", "who": "vm03.gbgzmu", "id": "vm03.gbgzmu"}]: dispatch 2026-03-09T16:10:34.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:33 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T16:10:34.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:33 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T16:10:34.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:33 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T16:10:34.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:33 vm03 ceph-mon[51019]: Manager daemon vm03.gbgzmu is now available 2026-03-09T16:10:34.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:33 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:34.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:33 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:10:34.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:33 
vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:10:34.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:33 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/mirror_snapshot_schedule"}]: dispatch 2026-03-09T16:10:35.023 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:34 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/trash_purge_schedule"}]: dispatch 2026-03-09T16:10:35.023 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:34 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/1931985318' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:35.023 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:34 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:35.023 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:34 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:35.023 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:34 vm03 ceph-mon[51019]: mgrmap e15: vm03.gbgzmu(active, since 1.00935s) 2026-03-09T16:10:35.024 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:34 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:35.101 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:35.101 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:35.303 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:35.356 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T16:10:35.642 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.640+0000 7f7ca950b640 1 -- 192.168.123.105:0/3149294313 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ca4071610 msgr2=0x7f7ca4071a10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.640+0000 7f7ca950b640 1 --2- 192.168.123.105:0/3149294313 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ca4071610 0x7f7ca4071a10 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f7c94007920 tx=0x7f7c94030050 comp rx=0 tx=0).stop 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.640+0000 7f7ca950b640 1 -- 192.168.123.105:0/3149294313 shutdown_connections 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.640+0000 7f7ca950b640 1 --2- 192.168.123.105:0/3149294313 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ca4071610 0x7f7ca4071a10 secure :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f7c94007920 tx=0x7f7c94030050 comp rx=0 tx=0).stop 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.640+0000 7f7ca950b640 1 -- 192.168.123.105:0/3149294313 >> 192.168.123.105:0/3149294313 conn(0x7f7ca406d010 msgr2=0x7f7ca406f450 
unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.640+0000 7f7ca950b640 1 -- 192.168.123.105:0/3149294313 shutdown_connections 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.640+0000 7f7ca950b640 1 -- 192.168.123.105:0/3149294313 wait complete. 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.641+0000 7f7ca950b640 1 Processor -- start 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.642+0000 7f7ca950b640 1 -- start start 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.642+0000 7f7ca950b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ca40796f0 0x7f7ca4079b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.642+0000 7f7ca950b640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7ca407a050 con 0x7f7ca40796f0 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.642+0000 7f7ca2ffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ca40796f0 0x7f7ca4079b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.642+0000 7f7ca2ffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ca40796f0 0x7f7ca4079b10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:35708/0 (socket says 192.168.123.105:35708) 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.642+0000 7f7ca2ffd640 1 -- 192.168.123.105:0/1866873874 learned_addr learned my addr 192.168.123.105:0/1866873874 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.642+0000 7f7ca2ffd640 1 -- 192.168.123.105:0/1866873874 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7c940075d0 con 0x7f7ca40796f0 2026-03-09T16:10:35.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.642+0000 7f7ca2ffd640 1 --2- 192.168.123.105:0/1866873874 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ca40796f0 0x7f7ca4079b10 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f7c94002bf0 tx=0x7f7c94030b00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:35.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.643+0000 7f7c83fff640 1 -- 192.168.123.105:0/1866873874 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7c94039470 con 0x7f7ca40796f0 2026-03-09T16:10:35.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.643+0000 7f7c83fff640 1 -- 192.168.123.105:0/1866873874 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f7c94039a90 con 0x7f7ca40796f0 2026-03-09T16:10:35.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.643+0000 7f7c83fff640 1 -- 192.168.123.105:0/1866873874 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7c94044b20 con 0x7f7ca40796f0 2026-03-09T16:10:35.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.643+0000 7f7ca950b640 1 -- 192.168.123.105:0/1866873874 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7ca407a250 con 0x7f7ca40796f0 2026-03-09T16:10:35.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.643+0000 7f7ca950b640 1 -- 192.168.123.105:0/1866873874 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7ca407ce40 con 0x7f7ca40796f0 2026-03-09T16:10:35.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.644+0000 7f7c83fff640 1 -- 192.168.123.105:0/1866873874 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 15) v1 ==== 49291+0+0 (secure 0 0 0) 0x7f7c94042070 con 0x7f7ca40796f0 2026-03-09T16:10:35.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.644+0000 7f7c83fff640 1 --2- 192.168.123.105:0/1866873874 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7c9003d1b0 0x7f7c9003f670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:35.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.644+0000 7f7c83fff640 1 -- 192.168.123.105:0/1866873874 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7f7c94075a10 con 0x7f7ca40796f0 2026-03-09T16:10:35.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.644+0000 7f7ca950b640 1 -- 192.168.123.105:0/1866873874 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7c70005350 con 0x7f7ca40796f0 2026-03-09T16:10:35.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.644+0000 7f7ca27fc640 1 --2- 192.168.123.105:0/1866873874 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7c9003d1b0 0x7f7c9003f670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:35.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.647+0000 7f7c83fff640 1 -- 192.168.123.105:0/1866873874 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f7c94037370 con 0x7f7ca40796f0 2026-03-09T16:10:35.658 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.648+0000 7f7ca27fc640 1 --2- 192.168.123.105:0/1866873874 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7c9003d1b0 0x7f7c9003f670 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f7c9c00ad30 tx=0x7f7c9c0093f0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:35.806 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.803+0000 7f7ca950b640 1 -- 192.168.123.105:0/1866873874 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f7c700051c0 con 0x7f7ca40796f0 2026-03-09T16:10:35.806 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.804+0000 7f7c83fff640 1 -- 192.168.123.105:0/1866873874 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f7c94066030 con 
0x7f7ca40796f0 2026-03-09T16:10:35.806 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:35.806 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:35.806 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:35.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.810+0000 7f7c81ffb640 1 -- 192.168.123.105:0/1866873874 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7c9003d1b0 msgr2=0x7f7c9003f670 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:35.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.810+0000 7f7c81ffb640 1 --2- 192.168.123.105:0/1866873874 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7c9003d1b0 0x7f7c9003f670 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f7c9c00ad30 tx=0x7f7c9c0093f0 comp rx=0 tx=0).stop 2026-03-09T16:10:35.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.810+0000 7f7c81ffb640 1 -- 192.168.123.105:0/1866873874 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ca40796f0 msgr2=0x7f7ca4079b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:35.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.810+0000 7f7c81ffb640 1 --2- 192.168.123.105:0/1866873874 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ca40796f0 0x7f7ca4079b10 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f7c94002bf0 tx=0x7f7c94030b00 comp rx=0 tx=0).stop 2026-03-09T16:10:35.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.810+0000 7f7c81ffb640 1 -- 192.168.123.105:0/1866873874 shutdown_connections 2026-03-09T16:10:35.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.810+0000 7f7c81ffb640 1 --2- 192.168.123.105:0/1866873874 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7c9003d1b0 0x7f7c9003f670 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:35.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.810+0000 7f7c81ffb640 1 --2- 192.168.123.105:0/1866873874 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ca40796f0 0x7f7ca4079b10 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:35.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.811+0000 7f7c81ffb640 1 -- 192.168.123.105:0/1866873874 >> 192.168.123.105:0/1866873874 conn(0x7f7ca406d010 msgr2=0x7f7ca406db40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:35.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.811+0000 7f7c81ffb640 1 -- 192.168.123.105:0/1866873874 shutdown_connections 
2026-03-09T16:10:35.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:35.811+0000 7f7c81ffb640 1 -- 192.168.123.105:0/1866873874 wait complete. 2026-03-09T16:10:36.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:36 vm03 ceph-mon[51019]: [09/Mar/2026:16:10:34] ENGINE Bus STARTING 2026-03-09T16:10:36.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:36 vm03 ceph-mon[51019]: [09/Mar/2026:16:10:35] ENGINE Serving on http://192.168.123.103:8765 2026-03-09T16:10:36.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:36 vm03 ceph-mon[51019]: [09/Mar/2026:16:10:35] ENGINE Serving on https://192.168.123.103:7150 2026-03-09T16:10:36.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:36 vm03 ceph-mon[51019]: [09/Mar/2026:16:10:35] ENGINE Bus STARTED 2026-03-09T16:10:36.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:36 vm03 ceph-mon[51019]: [09/Mar/2026:16:10:35] ENGINE Client ('192.168.123.103', 58016) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T16:10:36.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:36 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:36.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:36 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:36.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:36 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:36.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:36 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:36.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:36 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:10:36.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:36 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/1866873874' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:36.881 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
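The monmap JSON that keeps being printed to stdout above (epoch 1, a single mon "vm03", quorum [0]) is the payload inspected while the harness reports "Waiting for 2 mons in monmap...". Below is a minimal sketch of reading the interesting fields out of one of those dumps; the JSON is a trimmed copy of the stdout captured in the log, and the variable names are illustrative, not part of teuthology.

    import json

    # Trimmed copy of the "ceph mon dump -f json" stdout captured in the log above.
    monmap_json = '''{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc",
     "min_mon_release_name":"reef",
     "mons":[{"rank":0,"name":"vm03",
              "public_addrs":{"addrvec":[
                  {"type":"v2","addr":"192.168.123.103:3300","nonce":0},
                  {"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},
              "addr":"192.168.123.103:6789/0"}],
     "quorum":[0]}'''

    monmap = json.loads(monmap_json)
    for mon in monmap["mons"]:
        # Each mon advertises a msgr2 endpoint (v2, port 3300) and a legacy endpoint (v1, port 6789).
        addrs = {a["type"]: a["addr"] for a in mon["public_addrs"]["addrvec"]}
        print(mon["rank"], mon["name"], addrs.get("v2"), addrs.get("v1"))
    print("mons in map:", len(monmap["mons"]), "quorum:", monmap["quorum"])

With only vm03 present the mon count stays at 1, which is why the poll keeps repeating below until a second mon joins the map.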
2026-03-09T16:10:36.882 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:37.053 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:10:37.326 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.325+0000 7f3bb8a4f640 1 -- 192.168.123.105:0/2132424434 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb41051a0 msgr2=0x7f3bb4105580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:37.326 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.325+0000 7f3bb8a4f640 1 --2- 192.168.123.105:0/2132424434 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb41051a0 0x7f3bb4105580 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f3b9c0099b0 tx=0x7f3b9c02f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:37.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.327+0000 7f3bb8a4f640 1 -- 192.168.123.105:0/2132424434 shutdown_connections 2026-03-09T16:10:37.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.327+0000 7f3bb8a4f640 1 --2- 192.168.123.105:0/2132424434 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb41051a0 0x7f3bb4105580 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:37.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.327+0000 7f3bb8a4f640 1 -- 192.168.123.105:0/2132424434 >> 192.168.123.105:0/2132424434 conn(0x7f3bb40fa430 msgr2=0x7f3bb40fc850 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:37.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.327+0000 7f3bb8a4f640 1 -- 192.168.123.105:0/2132424434 shutdown_connections 2026-03-09T16:10:37.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.328+0000 7f3bb8a4f640 1 -- 192.168.123.105:0/2132424434 wait complete. 
2026-03-09T16:10:37.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.328+0000 7f3bb8a4f640 1 Processor -- start 2026-03-09T16:10:37.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.328+0000 7f3bb8a4f640 1 -- start start 2026-03-09T16:10:37.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.328+0000 7f3bb8a4f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb41051a0 0x7f3bb4072d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:37.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.328+0000 7f3bb8a4f640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3bb4073260 con 0x7f3bb41051a0 2026-03-09T16:10:37.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.329+0000 7f3bb2d76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb41051a0 0x7f3bb4072d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:37.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.329+0000 7f3bb2d76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb41051a0 0x7f3bb4072d20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:35730/0 (socket says 192.168.123.105:35730) 2026-03-09T16:10:37.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.329+0000 7f3bb2d76640 1 -- 192.168.123.105:0/2619702659 learned_addr learned my addr 192.168.123.105:0/2619702659 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:37.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.329+0000 7f3bb2d76640 1 -- 192.168.123.105:0/2619702659 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3b9c009660 con 0x7f3bb41051a0 2026-03-09T16:10:37.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.330+0000 7f3bb2d76640 1 --2- 192.168.123.105:0/2619702659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb41051a0 0x7f3bb4072d20 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f3b9c02f860 tx=0x7f3b9c004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:37.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.330+0000 7f3b93fff640 1 -- 192.168.123.105:0/2619702659 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3b9c002a50 con 0x7f3bb41051a0 2026-03-09T16:10:37.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.330+0000 7f3bb8a4f640 1 -- 192.168.123.105:0/2619702659 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3bb406fdb0 con 0x7f3bb41051a0 2026-03-09T16:10:37.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.330+0000 7f3bb8a4f640 1 -- 192.168.123.105:0/2619702659 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3bb4070250 con 0x7f3bb41051a0 2026-03-09T16:10:37.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.331+0000 7f3b93fff640 1 -- 192.168.123.105:0/2619702659 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3b9c038930 con 0x7f3bb41051a0 
2026-03-09T16:10:37.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.331+0000 7f3b93fff640 1 -- 192.168.123.105:0/2619702659 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3b9c041810 con 0x7f3bb41051a0 2026-03-09T16:10:37.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.331+0000 7f3bb8a4f640 1 -- 192.168.123.105:0/2619702659 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3bb4101a40 con 0x7f3bb41051a0 2026-03-09T16:10:37.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.332+0000 7f3b93fff640 1 -- 192.168.123.105:0/2619702659 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 16) v1 ==== 49489+0+0 (secure 0 0 0) 0x7f3b9c038aa0 con 0x7f3bb41051a0 2026-03-09T16:10:37.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.333+0000 7f3b93fff640 1 --2- 192.168.123.105:0/2619702659 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3b8403d410 0x7f3b8403f8d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:37.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.333+0000 7f3b93fff640 1 -- 192.168.123.105:0/2619702659 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7f3b9c0763d0 con 0x7f3bb41051a0 2026-03-09T16:10:37.334 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.333+0000 7f3bb2575640 1 --2- 192.168.123.105:0/2619702659 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3b8403d410 0x7f3b8403f8d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:37.334 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.333+0000 7f3bb2575640 1 --2- 192.168.123.105:0/2619702659 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3b8403d410 0x7f3b8403f8d0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f3ba80099c0 tx=0x7f3ba8006eb0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:37.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.335+0000 7f3b93fff640 1 -- 192.168.123.105:0/2619702659 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f3b9c038470 con 0x7f3bb41051a0 2026-03-09T16:10:37.469 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.468+0000 7f3bb8a4f640 1 -- 192.168.123.105:0/2619702659 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f3bb4100e00 con 0x7f3bb41051a0 2026-03-09T16:10:37.470 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:37.470 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:37.470 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:37.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.469+0000 7f3b93fff640 1 -- 192.168.123.105:0/2619702659 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f3b9c037bc0 con 0x7f3bb41051a0 2026-03-09T16:10:37.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.472+0000 7f3b91ffb640 1 -- 192.168.123.105:0/2619702659 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3b8403d410 msgr2=0x7f3b8403f8d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:37.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.472+0000 7f3b91ffb640 1 --2- 192.168.123.105:0/2619702659 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3b8403d410 0x7f3b8403f8d0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f3ba80099c0 tx=0x7f3ba8006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:37.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.472+0000 7f3b91ffb640 1 -- 192.168.123.105:0/2619702659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb41051a0 msgr2=0x7f3bb4072d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:37.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.472+0000 7f3b91ffb640 1 --2- 192.168.123.105:0/2619702659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb41051a0 0x7f3bb4072d20 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f3b9c02f860 tx=0x7f3b9c004290 comp rx=0 tx=0).stop 2026-03-09T16:10:37.474 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.473+0000 7f3b91ffb640 1 -- 192.168.123.105:0/2619702659 shutdown_connections 2026-03-09T16:10:37.474 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.473+0000 7f3b91ffb640 1 --2- 192.168.123.105:0/2619702659 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3b8403d410 0x7f3b8403f8d0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:37.474 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.473+0000 7f3b91ffb640 1 --2- 192.168.123.105:0/2619702659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb41051a0 0x7f3bb4072d20 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:37.474 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.473+0000 7f3b91ffb640 1 -- 192.168.123.105:0/2619702659 >> 192.168.123.105:0/2619702659 conn(0x7f3bb40fa430 
msgr2=0x7f3bb4107c40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:37.474 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.474+0000 7f3b91ffb640 1 -- 192.168.123.105:0/2619702659 shutdown_connections 2026-03-09T16:10:37.474 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:37.474+0000 7f3b91ffb640 1 -- 192.168.123.105:0/2619702659 wait complete. 2026-03-09T16:10:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:10:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:10:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:37 vm03 ceph-mon[51019]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T16:10:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:37 vm03 ceph-mon[51019]: Updating vm05:/etc/ceph/ceph.conf 2026-03-09T16:10:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:37 vm03 ceph-mon[51019]: mgrmap e16: vm03.gbgzmu(active, since 2s) 2026-03-09T16:10:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:38.525 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
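Each "Waiting for 2 mons in monmap..." line is followed by another run of the exact cephadm shell command shown in the DEBUG lines (sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell ... -- ceph mon dump -f json), spaced roughly two seconds apart. The following is a hedged sketch of that kind of poll loop, not the actual tasks.cephadm implementation; the timeout and sleep interval are invented for illustration, and the command string is the one from the log.

    import json
    import subprocess
    import time

    FSID = "2b05df78-1bd2-11f1-83c0-c950214d6edc"
    CMD = (
        "sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef "
        "shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "
        "--fsid " + FSID + " -- ceph mon dump -f json"
    )

    def wait_for_mons(want=2, timeout=300):
        # Poll "ceph mon dump -f json" until the monmap lists `want` monitors.
        deadline = time.time() + timeout
        while time.time() < deadline:
            out = subprocess.run(CMD, shell=True, capture_output=True, text=True)
            monmap = json.loads(out.stdout)
            if len(monmap["mons"]) >= want:
                return monmap
            print("Waiting for %d mons in monmap..." % want)
            time.sleep(2)
        raise TimeoutError("monmap never reached %d mons" % want)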
2026-03-09T16:10:38.525 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:38.733 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:10:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:38 vm03 ceph-mon[51019]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:10:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:38 vm03 ceph-mon[51019]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:10:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:38 vm03 ceph-mon[51019]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:10:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:38 vm03 ceph-mon[51019]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:10:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:38 vm03 ceph-mon[51019]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:10:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:38 vm03 ceph-mon[51019]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:10:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:38 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:38 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:38 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:38 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:10:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:38 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T16:10:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:38 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:38 vm03 ceph-mon[51019]: Deploying daemon ceph-exporter.vm05 on vm05 2026-03-09T16:10:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:38 vm03 ceph-mon[51019]: from='client.? 
192.168.123.105:0/2619702659' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:39.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.002+0000 7f6d7cf82640 1 -- 192.168.123.105:0/4081954600 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d78071990 msgr2=0x7f6d78071d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:39.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.002+0000 7f6d7cf82640 1 --2- 192.168.123.105:0/4081954600 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d78071990 0x7f6d78071d70 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f6d6c0099b0 tx=0x7f6d6c02f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:39.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.002+0000 7f6d7cf82640 1 -- 192.168.123.105:0/4081954600 shutdown_connections 2026-03-09T16:10:39.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.002+0000 7f6d7cf82640 1 --2- 192.168.123.105:0/4081954600 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d78071990 0x7f6d78071d70 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:39.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.002+0000 7f6d7cf82640 1 -- 192.168.123.105:0/4081954600 >> 192.168.123.105:0/4081954600 conn(0x7f6d7806b190 msgr2=0x7f6d7806b5a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:39.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.003+0000 7f6d7cf82640 1 -- 192.168.123.105:0/4081954600 shutdown_connections 2026-03-09T16:10:39.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.003+0000 7f6d7cf82640 1 -- 192.168.123.105:0/4081954600 wait complete. 
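Between polls the mgr performs the bootstrap housekeeping visible in the journalctl records above: "config generate-minimal-conf" and "auth get" for client.admin, followed by the "Updating vm03:/etc/ceph/ceph.conf", "Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring" lines and the deployment of ceph-exporter.vm05. A small, hedged check that the distributed files actually landed on a host, using only the paths named in those log lines:

    import os

    FSID = "2b05df78-1bd2-11f1-83c0-c950214d6edc"
    expected = [
        "/etc/ceph/ceph.conf",
        "/etc/ceph/ceph.client.admin.keyring",
        "/var/lib/ceph/" + FSID + "/config/ceph.conf",
        "/var/lib/ceph/" + FSID + "/config/ceph.client.admin.keyring",
    ]
    for path in expected:
        # Report which of the cephadm-managed config/keyring copies exist on this node.
        print(("ok     " if os.path.exists(path) else "MISSING"), path)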
2026-03-09T16:10:39.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.003+0000 7f6d7cf82640 1 Processor -- start 2026-03-09T16:10:39.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.003+0000 7f6d7cf82640 1 -- start start 2026-03-09T16:10:39.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.003+0000 7f6d7cf82640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d78071990 0x7f6d78117180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:39.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.003+0000 7f6d7cf82640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6d78114210 con 0x7f6d78071990 2026-03-09T16:10:39.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.003+0000 7f6d777fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d78071990 0x7f6d78117180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:39.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.003+0000 7f6d777fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d78071990 0x7f6d78117180 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:37120/0 (socket says 192.168.123.105:37120) 2026-03-09T16:10:39.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.003+0000 7f6d777fe640 1 -- 192.168.123.105:0/921830394 learned_addr learned my addr 192.168.123.105:0/921830394 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:39.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.004+0000 7f6d777fe640 1 -- 192.168.123.105:0/921830394 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6d6c009660 con 0x7f6d78071990 2026-03-09T16:10:39.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.004+0000 7f6d777fe640 1 --2- 192.168.123.105:0/921830394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d78071990 0x7f6d78117180 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f6d6c02f860 tx=0x7f6d6c004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:39.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.004+0000 7f6d74ff9640 1 -- 192.168.123.105:0/921830394 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6d6c002a50 con 0x7f6d78071990 2026-03-09T16:10:39.006 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.004+0000 7f6d74ff9640 1 -- 192.168.123.105:0/921830394 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6d6c038930 con 0x7f6d78071990 2026-03-09T16:10:39.006 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.004+0000 7f6d74ff9640 1 -- 192.168.123.105:0/921830394 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6d6c041810 con 0x7f6d78071990 2026-03-09T16:10:39.006 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.004+0000 7f6d7cf82640 1 -- 192.168.123.105:0/921830394 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6d781143b0 con 0x7f6d78071990 2026-03-09T16:10:39.006 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.004+0000 7f6d7cf82640 1 -- 192.168.123.105:0/921830394 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6d781147d0 con 0x7f6d78071990 2026-03-09T16:10:39.006 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.005+0000 7f6d7cf82640 1 -- 192.168.123.105:0/921830394 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6d3c005350 con 0x7f6d78071990 2026-03-09T16:10:39.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.005+0000 7f6d74ff9640 1 -- 192.168.123.105:0/921830394 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 16) v1 ==== 49489+0+0 (secure 0 0 0) 0x7f6d6c038aa0 con 0x7f6d78071990 2026-03-09T16:10:39.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.006+0000 7f6d74ff9640 1 --2- 192.168.123.105:0/921830394 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6d4c03d410 0x7f6d4c03f8d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:39.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.006+0000 7f6d74ff9640 1 -- 192.168.123.105:0/921830394 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7f6d6c076490 con 0x7f6d78071990 2026-03-09T16:10:39.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.006+0000 7f6d76ffd640 1 --2- 192.168.123.105:0/921830394 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6d4c03d410 0x7f6d4c03f8d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:39.008 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.007+0000 7f6d76ffd640 1 --2- 192.168.123.105:0/921830394 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6d4c03d410 0x7f6d4c03f8d0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f6d600099c0 tx=0x7f6d60006eb0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:39.009 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.008+0000 7f6d74ff9640 1 -- 192.168.123.105:0/921830394 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6d6c038470 con 0x7f6d78071990 2026-03-09T16:10:39.165 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.162+0000 7f6d7cf82640 1 -- 192.168.123.105:0/921830394 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f6d3c0051c0 con 0x7f6d78071990 2026-03-09T16:10:39.166 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.165+0000 7f6d74ff9640 1 -- 192.168.123.105:0/921830394 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f6d6c051920 con 0x7f6d78071990 2026-03-09T16:10:39.166 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:39.166 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:39.166 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:39.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.168+0000 7f6d567fc640 1 -- 192.168.123.105:0/921830394 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6d4c03d410 msgr2=0x7f6d4c03f8d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:39.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.168+0000 7f6d567fc640 1 --2- 192.168.123.105:0/921830394 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6d4c03d410 0x7f6d4c03f8d0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f6d600099c0 tx=0x7f6d60006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:39.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.168+0000 7f6d567fc640 1 -- 192.168.123.105:0/921830394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d78071990 msgr2=0x7f6d78117180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:39.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.168+0000 7f6d567fc640 1 --2- 192.168.123.105:0/921830394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d78071990 0x7f6d78117180 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f6d6c02f860 tx=0x7f6d6c004290 comp rx=0 tx=0).stop 2026-03-09T16:10:39.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.169+0000 7f6d567fc640 1 -- 192.168.123.105:0/921830394 shutdown_connections 2026-03-09T16:10:39.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.169+0000 7f6d567fc640 1 --2- 192.168.123.105:0/921830394 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6d4c03d410 0x7f6d4c03f8d0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:39.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.169+0000 7f6d567fc640 1 --2- 192.168.123.105:0/921830394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d78071990 0x7f6d78117180 secure :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f6d6c02f860 tx=0x7f6d6c004290 comp rx=0 tx=0).stop 2026-03-09T16:10:39.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.169+0000 7f6d567fc640 1 -- 192.168.123.105:0/921830394 >> 192.168.123.105:0/921830394 conn(0x7f6d7806b190 msgr2=0x7f6d7806ec50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:39.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.169+0000 7f6d567fc640 1 -- 192.168.123.105:0/921830394 shutdown_connections 2026-03-09T16:10:39.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:39.169+0000 7f6d567fc640 1 -- 
192.168.123.105:0/921830394 wait complete. 2026-03-09T16:10:40.029 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:40.030 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:40.030 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:40.030 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:40.030 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:10:40.030 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-09T16:10:40.030 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:40.030 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:39 vm03 ceph-mon[51019]: Deploying daemon crash.vm05 on vm05 2026-03-09T16:10:40.030 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:40.030 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:39 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/921830394' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:40.030 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:40.030 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:40.030 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:40.030 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:40.362 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T16:10:40.362 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:40.502 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:10:40.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.753+0000 7f8df427d640 1 -- 192.168.123.105:0/1506000485 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8dec108660 msgr2=0x7f8dec108a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:40.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.753+0000 7f8df427d640 1 --2- 192.168.123.105:0/1506000485 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8dec108660 0x7f8dec108a40 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f8de00099b0 tx=0x7f8de002f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:40.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.754+0000 7f8df427d640 1 -- 192.168.123.105:0/1506000485 shutdown_connections 2026-03-09T16:10:40.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.754+0000 7f8df427d640 1 --2- 192.168.123.105:0/1506000485 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8dec108660 0x7f8dec108a40 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:40.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.754+0000 7f8df427d640 1 -- 192.168.123.105:0/1506000485 >> 192.168.123.105:0/1506000485 conn(0x7f8dec0fe2b0 msgr2=0x7f8dec1006d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:40.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.754+0000 7f8df427d640 1 -- 192.168.123.105:0/1506000485 shutdown_connections 2026-03-09T16:10:40.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.754+0000 7f8df427d640 1 -- 192.168.123.105:0/1506000485 wait complete. 
2026-03-09T16:10:40.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.755+0000 7f8df427d640 1 Processor -- start 2026-03-09T16:10:40.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.755+0000 7f8df427d640 1 -- start start 2026-03-09T16:10:40.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.755+0000 7f8df427d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8dec108660 0x7f8dec19f960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:40.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.755+0000 7f8df427d640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8dec19fea0 con 0x7f8dec108660 2026-03-09T16:10:40.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.755+0000 7f8df1ff2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8dec108660 0x7f8dec19f960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:40.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.755+0000 7f8df1ff2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8dec108660 0x7f8dec19f960 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:37144/0 (socket says 192.168.123.105:37144) 2026-03-09T16:10:40.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.755+0000 7f8df1ff2640 1 -- 192.168.123.105:0/2921053374 learned_addr learned my addr 192.168.123.105:0/2921053374 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:40.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.755+0000 7f8df1ff2640 1 -- 192.168.123.105:0/2921053374 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8de0009660 con 0x7f8dec108660 2026-03-09T16:10:40.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.756+0000 7f8df1ff2640 1 --2- 192.168.123.105:0/2921053374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8dec108660 0x7f8dec19f960 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f8de0002410 tx=0x7f8de0004060 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:40.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.756+0000 7f8ddaffd640 1 -- 192.168.123.105:0/2921053374 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8de00043e0 con 0x7f8dec108660 2026-03-09T16:10:40.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.756+0000 7f8df427d640 1 -- 192.168.123.105:0/2921053374 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8dec1a0100 con 0x7f8dec108660 2026-03-09T16:10:40.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.756+0000 7f8df427d640 1 -- 192.168.123.105:0/2921053374 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8dec199a50 con 0x7f8dec108660 2026-03-09T16:10:40.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.756+0000 7f8ddaffd640 1 -- 192.168.123.105:0/2921053374 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8de0038930 con 0x7f8dec108660 
2026-03-09T16:10:40.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.756+0000 7f8ddaffd640 1 -- 192.168.123.105:0/2921053374 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8de0041840 con 0x7f8dec108660 2026-03-09T16:10:40.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.757+0000 7f8ddaffd640 1 -- 192.168.123.105:0/2921053374 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 16) v1 ==== 49489+0+0 (secure 0 0 0) 0x7f8de0038aa0 con 0x7f8dec108660 2026-03-09T16:10:40.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.757+0000 7f8ddaffd640 1 --2- 192.168.123.105:0/2921053374 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8db403d3c0 0x7f8db403f880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:40.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.757+0000 7f8ddaffd640 1 -- 192.168.123.105:0/2921053374 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7f8de0076330 con 0x7f8dec108660 2026-03-09T16:10:40.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.757+0000 7f8df17f1640 1 --2- 192.168.123.105:0/2921053374 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8db403d3c0 0x7f8db403f880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:40.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.758+0000 7f8df17f1640 1 --2- 192.168.123.105:0/2921053374 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8db403d3c0 0x7f8db403f880 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f8ddc0099c0 tx=0x7f8ddc006eb0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:40.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.758+0000 7f8df427d640 1 -- 192.168.123.105:0/2921053374 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8dec103cd0 con 0x7f8dec108660 2026-03-09T16:10:40.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.761+0000 7f8ddaffd640 1 -- 192.168.123.105:0/2921053374 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f8de0035530 con 0x7f8dec108660 2026-03-09T16:10:40.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.889+0000 7f8df427d640 1 -- 192.168.123.105:0/2921053374 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f8dec19af60 con 0x7f8dec108660 2026-03-09T16:10:40.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.890+0000 7f8ddaffd640 1 -- 192.168.123.105:0/2921053374 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f8de005a090 con 0x7f8dec108660 2026-03-09T16:10:40.891 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:40.891 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:40.891 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:40.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.892+0000 7f8df427d640 1 -- 192.168.123.105:0/2921053374 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8db403d3c0 msgr2=0x7f8db403f880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:40.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.892+0000 7f8df427d640 1 --2- 192.168.123.105:0/2921053374 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8db403d3c0 0x7f8db403f880 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f8ddc0099c0 tx=0x7f8ddc006eb0 comp rx=0 tx=0).stop 2026-03-09T16:10:40.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.893+0000 7f8df427d640 1 -- 192.168.123.105:0/2921053374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8dec108660 msgr2=0x7f8dec19f960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:40.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.893+0000 7f8df427d640 1 --2- 192.168.123.105:0/2921053374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8dec108660 0x7f8dec19f960 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f8de0002410 tx=0x7f8de0004060 comp rx=0 tx=0).stop 2026-03-09T16:10:40.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.893+0000 7f8df427d640 1 -- 192.168.123.105:0/2921053374 shutdown_connections 2026-03-09T16:10:40.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.893+0000 7f8df427d640 1 --2- 192.168.123.105:0/2921053374 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8db403d3c0 0x7f8db403f880 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:40.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.893+0000 7f8df427d640 1 --2- 192.168.123.105:0/2921053374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8dec108660 0x7f8dec19f960 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:40.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.893+0000 7f8df427d640 1 -- 192.168.123.105:0/2921053374 >> 192.168.123.105:0/2921053374 conn(0x7f8dec0fe2b0 msgr2=0x7f8dec0fefb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:40.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.893+0000 7f8df427d640 1 -- 192.168.123.105:0/2921053374 shutdown_connections 2026-03-09T16:10:40.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:40.893+0000 7f8df427d640 1 -- 
192.168.123.105:0/2921053374 wait complete. 2026-03-09T16:10:41.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:40 vm03 ceph-mon[51019]: Deploying daemon node-exporter.vm05 on vm05 2026-03-09T16:10:41.952 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:41.952 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:42.115 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:10:42.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:41 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/2921053374' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:42.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.400+0000 7f63a49d5640 1 -- 192.168.123.105:0/165193945 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f639c071b60 msgr2=0x7f639c071f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:42.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.400+0000 7f63a49d5640 1 --2- 192.168.123.105:0/165193945 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f639c071b60 0x7f639c071f40 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f6390007920 tx=0x7f6390030050 comp rx=0 tx=0).stop 2026-03-09T16:10:42.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.400+0000 7f63a49d5640 1 -- 192.168.123.105:0/165193945 shutdown_connections 2026-03-09T16:10:42.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.400+0000 7f63a49d5640 1 --2- 192.168.123.105:0/165193945 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f639c071b60 0x7f639c071f40 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:42.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.400+0000 7f63a49d5640 1 -- 192.168.123.105:0/165193945 >> 192.168.123.105:0/165193945 conn(0x7f639c06b1d0 msgr2=0x7f639c06b5e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:42.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.400+0000 7f63a49d5640 1 -- 192.168.123.105:0/165193945 shutdown_connections 2026-03-09T16:10:42.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.400+0000 7f63a49d5640 1 -- 192.168.123.105:0/165193945 wait complete. 
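[editor's note] The stdout blob a few lines up is the monmap itself: epoch 1 still lists only mon vm03 at rank 0 with quorum [0], which is why the task keeps waiting for a second monitor. A small illustrative parse of that JSON (the string below is abbreviated to the fields used, not the full dump):

import json

# Abbreviated copy of the monmap printed above (epoch 1, one mon, quorum [0]).
raw = '{"epoch": 1, "mons": [{"rank": 0, "name": "vm03"}], "quorum": [0]}'
monmap = json.loads(raw)
num_mons = len(monmap["mons"])        # 1 -- only vm03 so far
quorum_ranks = monmap["quorum"]       # [0]
print(f"epoch {monmap['epoch']}: {num_mons} mon(s) in monmap, quorum {quorum_ranks}")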
2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.401+0000 7f63a49d5640 1 Processor -- start 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.401+0000 7f63a49d5640 1 -- start start 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.401+0000 7f63a49d5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f639c0839f0 0x7f639c083dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.401+0000 7f63a49d5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6390002dc0 con 0x7f639c0839f0 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.401+0000 7f63a274a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f639c0839f0 0x7f639c083dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.401+0000 7f63a274a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f639c0839f0 0x7f639c083dd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:37160/0 (socket says 192.168.123.105:37160) 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.401+0000 7f63a274a640 1 -- 192.168.123.105:0/3275305463 learned_addr learned my addr 192.168.123.105:0/3275305463 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.401+0000 7f63a274a640 1 -- 192.168.123.105:0/3275305463 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f63900075d0 con 0x7f639c0839f0 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.401+0000 7f63a274a640 1 --2- 192.168.123.105:0/3275305463 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f639c0839f0 0x7f639c083dd0 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f6390030600 tx=0x7f6390031b10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.402+0000 7f638f7fe640 1 -- 192.168.123.105:0/3275305463 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6390002a50 con 0x7f639c0839f0 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.402+0000 7f63a49d5640 1 -- 192.168.123.105:0/3275305463 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f639c084370 con 0x7f639c0839f0 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.402+0000 7f63a49d5640 1 -- 192.168.123.105:0/3275305463 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f639c07cce0 con 0x7f639c0839f0 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.402+0000 7f638f7fe640 1 -- 192.168.123.105:0/3275305463 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6390039d60 con 0x7f639c0839f0 
2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.402+0000 7f638f7fe640 1 -- 192.168.123.105:0/3275305463 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6390041d40 con 0x7f639c0839f0 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.403+0000 7f638f7fe640 1 -- 192.168.123.105:0/3275305463 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 16) v1 ==== 49489+0+0 (secure 0 0 0) 0x7f6390049050 con 0x7f639c0839f0 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.403+0000 7f638f7fe640 1 --2- 192.168.123.105:0/3275305463 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f637803d440 0x7f637803f900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.403+0000 7f638f7fe640 1 -- 192.168.123.105:0/3275305463 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7f6390075cb0 con 0x7f639c0839f0 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.403+0000 7f63a1f49640 1 --2- 192.168.123.105:0/3275305463 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f637803d440 0x7f637803f900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.403+0000 7f63a49d5640 1 -- 192.168.123.105:0/3275305463 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f639c072de0 con 0x7f639c0839f0 2026-03-09T16:10:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.404+0000 7f63a1f49640 1 --2- 192.168.123.105:0/3275305463 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f637803d440 0x7f637803f900 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f639800ad30 tx=0x7f63980093f0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:42.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.409+0000 7f638f7fe640 1 -- 192.168.123.105:0/3275305463 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6390041570 con 0x7f639c0839f0 2026-03-09T16:10:42.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.541+0000 7f63a49d5640 1 -- 192.168.123.105:0/3275305463 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f639c071b60 con 0x7f639c0839f0 2026-03-09T16:10:42.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.543+0000 7f638f7fe640 1 -- 192.168.123.105:0/3275305463 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f639004ae40 con 0x7f639c0839f0 2026-03-09T16:10:42.544 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:42.544 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:42.544 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:42.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.548+0000 7f638d7fa640 1 -- 192.168.123.105:0/3275305463 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f637803d440 msgr2=0x7f637803f900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:42.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.548+0000 7f638d7fa640 1 --2- 192.168.123.105:0/3275305463 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f637803d440 0x7f637803f900 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f639800ad30 tx=0x7f63980093f0 comp rx=0 tx=0).stop 2026-03-09T16:10:42.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.548+0000 7f638d7fa640 1 -- 192.168.123.105:0/3275305463 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f639c0839f0 msgr2=0x7f639c083dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:42.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.548+0000 7f638d7fa640 1 --2- 192.168.123.105:0/3275305463 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f639c0839f0 0x7f639c083dd0 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f6390030600 tx=0x7f6390031b10 comp rx=0 tx=0).stop 2026-03-09T16:10:42.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.548+0000 7f638d7fa640 1 -- 192.168.123.105:0/3275305463 shutdown_connections 2026-03-09T16:10:42.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.548+0000 7f638d7fa640 1 --2- 192.168.123.105:0/3275305463 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f637803d440 0x7f637803f900 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:42.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.548+0000 7f638d7fa640 1 --2- 192.168.123.105:0/3275305463 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f639c0839f0 0x7f639c083dd0 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:42.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.548+0000 7f638d7fa640 1 -- 192.168.123.105:0/3275305463 >> 192.168.123.105:0/3275305463 conn(0x7f639c06b1d0 msgr2=0x7f639c074230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:42.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.548+0000 7f638d7fa640 1 -- 192.168.123.105:0/3275305463 shutdown_connections 2026-03-09T16:10:42.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:42.548+0000 7f638d7fa640 1 -- 
192.168.123.105:0/3275305463 wait complete. 2026-03-09T16:10:43.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:42 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/3275305463' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:43.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:42 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:43.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:42 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:43.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:42 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:43.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:42 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:43.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:42 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.dygxfv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:10:43.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:42 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.vm05.dygxfv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished 2026-03-09T16:10:43.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:42 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:10:43.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:42 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:43.618 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
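[editor's note] Each "Waiting for 2 mons in monmap..." line marks another pass of the cephadm task's retry loop: it re-runs the `ceph mon dump -f json` call shown above and checks the mon count until the monmap grows. A hedged sketch of that pattern, reusing the hypothetical mon_dump() helper from the earlier sketch (the timeout and interval values are made up, not taken from the task):

import time

def wait_for_mons(expected, timeout=300.0, interval=1.5):
    """Poll the monmap via mon_dump() until it lists `expected` monitors."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        monmap = mon_dump()                      # runs `ceph mon dump -f json`
        if len(monmap["mons"]) >= expected:
            return monmap
        print(f"Waiting for {expected} mons in monmap...")
        time.sleep(interval)
    raise TimeoutError(f"monmap never reached {expected} mons")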
2026-03-09T16:10:43.618 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:43.943 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:10:44.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.603+0000 7fdc0b577640 1 -- 192.168.123.105:0/281036805 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc0c071990 msgr2=0x7fdc0c071d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:44.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.603+0000 7fdc0b577640 1 --2- 192.168.123.105:0/281036805 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc0c071990 0x7fdc0c071d70 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7fdc000099b0 tx=0x7fdc0002f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:44.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.604+0000 7fdc0b577640 1 -- 192.168.123.105:0/281036805 shutdown_connections 2026-03-09T16:10:44.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.604+0000 7fdc0b577640 1 --2- 192.168.123.105:0/281036805 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc0c071990 0x7fdc0c071d70 unknown :-1 s=CLOSED pgs=157 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:44.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.604+0000 7fdc0b577640 1 -- 192.168.123.105:0/281036805 >> 192.168.123.105:0/281036805 conn(0x7fdc0c06b190 msgr2=0x7fdc0c06b5a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:44.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.604+0000 7fdc0b577640 1 -- 192.168.123.105:0/281036805 shutdown_connections 2026-03-09T16:10:44.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.604+0000 7fdc0b577640 1 -- 192.168.123.105:0/281036805 wait complete. 
2026-03-09T16:10:44.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.604+0000 7fdc0b577640 1 Processor -- start 2026-03-09T16:10:44.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.604+0000 7fdc0b577640 1 -- start start 2026-03-09T16:10:44.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.604+0000 7fdc0b577640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc0c071990 0x7fdc0c1170a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:44.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.604+0000 7fdc0b577640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdc0c1175e0 con 0x7fdc0c071990 2026-03-09T16:10:44.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.605+0000 7fdc0a575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc0c071990 0x7fdc0c1170a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:44.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.605+0000 7fdc0a575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc0c071990 0x7fdc0c1170a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:37190/0 (socket says 192.168.123.105:37190) 2026-03-09T16:10:44.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.605+0000 7fdc0a575640 1 -- 192.168.123.105:0/2270010029 learned_addr learned my addr 192.168.123.105:0/2270010029 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:44.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.605+0000 7fdc0a575640 1 -- 192.168.123.105:0/2270010029 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdc00009660 con 0x7fdc0c071990 2026-03-09T16:10:44.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.605+0000 7fdc0a575640 1 --2- 192.168.123.105:0/2270010029 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc0c071990 0x7fdc0c1170a0 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7fdc00002410 tx=0x7fdc00004060 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:44.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.605+0000 7fdbf37fe640 1 -- 192.168.123.105:0/2270010029 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdc000043e0 con 0x7fdc0c071990 2026-03-09T16:10:44.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.605+0000 7fdc0b577640 1 -- 192.168.123.105:0/2270010029 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdc0c1141d0 con 0x7fdc0c071990 2026-03-09T16:10:44.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.605+0000 7fdc0b577640 1 -- 192.168.123.105:0/2270010029 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdc0c114670 con 0x7fdc0c071990 2026-03-09T16:10:44.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.605+0000 7fdbf37fe640 1 -- 192.168.123.105:0/2270010029 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fdc00038930 con 0x7fdc0c071990 
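[editor's note] The journalctl lines further down show the mgr deploying mon.vm05 on vm05; the new monitor first starts in the "synchronizing" state and only later enters quorum, so the monmap poll above can also be complemented by `ceph quorum_status -f json`, whose quorum_names field lists the monitors actually in quorum. A hypothetical helper in the same style as the earlier sketch (not part of the test code):

import json
import subprocess

def quorum_names(fsid="2b05df78-1bd2-11f1-83c0-c950214d6edc",
                 image="quay.ceph.io/ceph-ci/ceph:reef",
                 cephadm="/home/ubuntu/cephtest/cephadm"):
    """Return the names of the monitors currently in quorum."""
    cmd = ["sudo", cephadm, "--image", image, "shell",
           "-c", "/etc/ceph/ceph.conf",
           "-k", "/etc/ceph/ceph.client.admin.keyring",
           "--fsid", fsid,
           "--", "ceph", "quorum_status", "-f", "json"]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    return json.loads(out)["quorum_names"]       # e.g. ["vm03"] until vm05 joins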
2026-03-09T16:10:44.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.605+0000 7fdbf37fe640 1 -- 192.168.123.105:0/2270010029 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdc00041840 con 0x7fdc0c071990 2026-03-09T16:10:44.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.606+0000 7fdbf37fe640 1 -- 192.168.123.105:0/2270010029 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 16) v1 ==== 49489+0+0 (secure 0 0 0) 0x7fdc000419a0 con 0x7fdc0c071990 2026-03-09T16:10:44.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.606+0000 7fdbf37fe640 1 --2- 192.168.123.105:0/2270010029 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fdbe803d3c0 0x7fdbe803f880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:44.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.606+0000 7fdbf37fe640 1 -- 192.168.123.105:0/2270010029 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7fdc00077280 con 0x7fdc0c071990 2026-03-09T16:10:44.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.606+0000 7fdc09d74640 1 --2- 192.168.123.105:0/2270010029 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fdbe803d3c0 0x7fdbe803f880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:44.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.607+0000 7fdc0b577640 1 -- 192.168.123.105:0/2270010029 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdbd8005350 con 0x7fdc0c071990 2026-03-09T16:10:44.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.607+0000 7fdc09d74640 1 --2- 192.168.123.105:0/2270010029 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fdbe803d3c0 0x7fdbe803f880 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fdbfc00ad30 tx=0x7fdbfc0093f0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:44.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.610+0000 7fdbf37fe640 1 -- 192.168.123.105:0/2270010029 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fdc00037bb0 con 0x7fdc0c071990 2026-03-09T16:10:44.729 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compression: NoCompression 2026-03-09T16:10:44.775 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.773+0000 7fdc0b577640 1 -- 192.168.123.105:0/2270010029 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fdbd80051c0 con 0x7fdc0c071990 2026-03-09T16:10:44.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.775+0000 7fdbf37fe640 1 -- 192.168.123.105:0/2270010029 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7fdc000373d0 con 0x7fdc0c071990 2026-03-09T16:10:44.776 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:44.776 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":1,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:09:32.695561Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T16:10:44.776 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 1 2026-03-09T16:10:44.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.782+0000 7fdc0b577640 1 -- 192.168.123.105:0/2270010029 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fdbe803d3c0 msgr2=0x7fdbe803f880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:44.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.782+0000 7fdc0b577640 1 --2- 192.168.123.105:0/2270010029 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fdbe803d3c0 0x7fdbe803f880 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fdbfc00ad30 tx=0x7fdbfc0093f0 comp rx=0 tx=0).stop 2026-03-09T16:10:44.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.782+0000 7fdc0b577640 1 -- 192.168.123.105:0/2270010029 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc0c071990 msgr2=0x7fdc0c1170a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:44.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.782+0000 7fdc0b577640 1 --2- 192.168.123.105:0/2270010029 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc0c071990 0x7fdc0c1170a0 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7fdc00002410 tx=0x7fdc00004060 comp rx=0 tx=0).stop 2026-03-09T16:10:44.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.782+0000 7fdc0b577640 1 -- 192.168.123.105:0/2270010029 shutdown_connections 2026-03-09T16:10:44.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.782+0000 7fdc0b577640 1 --2- 192.168.123.105:0/2270010029 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fdbe803d3c0 0x7fdbe803f880 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:44.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.782+0000 7fdc0b577640 1 --2- 192.168.123.105:0/2270010029 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc0c071990 0x7fdc0c1170a0 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:44.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.782+0000 7fdc0b577640 1 -- 192.168.123.105:0/2270010029 >> 192.168.123.105:0/2270010029 conn(0x7fdc0c06b190 msgr2=0x7fdc0c06ebf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:44.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.782+0000 7fdc0b577640 1 -- 192.168.123.105:0/2270010029 shutdown_connections 2026-03-09T16:10:44.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:44.782+0000 7fdc0b577640 1 -- 
192.168.123.105:0/2270010029 wait complete. 2026-03-09T16:10:44.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:44 vm03 ceph-mon[51019]: Deploying daemon mgr.vm05.dygxfv on vm05 2026-03-09T16:10:44.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:44 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:44.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:44 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:44.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:44 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:44.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:44 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:44.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:44 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T16:10:44.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:44 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:44.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:44 vm03 ceph-mon[51019]: Deploying daemon mon.vm05 on vm05 2026-03-09T16:10:44.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:44 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.bottommost_compression: Disabled 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.prefix_extractor: nullptr 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.num_levels: 7 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: 
Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compression_opts.level: 32767 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compression_opts.strategy: 0 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compression_opts.enabled: false 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.target_file_size_base: 67108864 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-09T16:10:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.arena_block_size: 1048576 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.disable_auto_compactions: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 
2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.inplace_update_support: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.bloom_locality: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.max_successive_merges: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.paranoid_file_checks: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.force_consistency_checks: 1 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.report_bg_io_stats: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.ttl: 2592000 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.enable_blob_files: false 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.min_blob_size: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.blob_file_size: 268435456 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: 
Options.enable_blob_garbage_collection: false 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.blob_file_starting_level: 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm05/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: c060cd5f-9607-4af3-8dde-0dc24b7e724a 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773072644739017, "job": 1, "event": "recovery_started", "wal_files": [4]} 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773072644739686, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1643, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773072644, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c060cd5f-9607-4af3-8dde-0dc24b7e724a", "db_session_id": "V2F2Z3ORIPFKIEGNPE9C", "orig_file_number": 
8, "seqno_to_time_mapping": "N/A"}} 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773072644739757, "job": 1, "event": "recovery_finished"} 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm05/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x557cda02ae00 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: DB pointer 0x557cda138000 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05 does not exist in monmap, will attempt to join an existing cluster 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: using public_addr v2:192.168.123.105:0/0 -> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-09T16:10:45.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** DB Stats ** 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** Compaction Stats [default] ** 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: L0 1/0 1.60 KB 0.2 0.0 0.0 0.0 0.0 0.0 
0.0 1.0 0.0 2.5 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Sum 1/0 1.60 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 2.5 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 2.5 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** Compaction Stats [default] ** 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 2.5 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Flush(GB): cumulative 0.000, interval 0.000 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative compaction: 0.00 GB write, 0.09 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval compaction: 0.00 GB write, 0.09 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Block cache BinnedLRUCache@0x557cda029350#2 capacity: 512.00 MB usage: 0.86 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 6e-06 secs_since: 0 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Block cache entry stats(count,size,portion): DataBlock(1,0.64 KB,0.00012219%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%) 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 
ceph-mon[58702]: starting mon.vm05 rank -1 at public addrs [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] at bind addrs [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon_data /var/lib/ceph/mon/ceph-vm05 fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(???) e0 preinit fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(synchronizing).mds e1 new map 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(synchronizing).mds e1 print_map 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: e1 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: legacy client fscid: -1 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout: No filesystems configured 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(synchronizing).osd e4 e4: 0 total, 0 up, 0 in 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(synchronizing).osd e5 e5: 0 total, 0 up, 0 in 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(synchronizing).osd e5 crush map has features 3314932999778484224, adjusting msgr requires 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(synchronizing).osd e5 crush map has features 288514050185494528, adjusting msgr requires 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(synchronizing).osd e5 crush map has features 288514050185494528, adjusting msgr requires 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(synchronizing).osd e5 crush map has features 288514050185494528, adjusting 
msgr requires 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:10:45.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: Deploying daemon ceph-exporter.vm05 on vm05 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='client.? 
192.168.123.105:0/2619702659' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: Deploying daemon crash.vm05 on vm05 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='client.? 192.168.123.105:0/921830394' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: Deploying daemon node-exporter.vm05 on vm05 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='client.? 192.168.123.105:0/2921053374' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='client.? 
192.168.123.105:0/3275305463' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.dygxfv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.vm05.dygxfv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: Deploying daemon mgr.vm05.dygxfv on vm05 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:45.030 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: Deploying daemon mon.vm05 on vm05 2026-03-09T16:10:45.031 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:45.031 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:44 vm05 ceph-mon[58702]: mon.vm05@-1(synchronizing).paxosservice(auth 1..8) refresh upgraded, format 0 -> 3 2026-03-09T16:10:45.862 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T16:10:45.862 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mon dump -f json 2026-03-09T16:10:46.018 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm05/config 2026-03-09T16:10:49.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.885+0000 7fb126268640 1 -- 192.168.123.105:0/3327886063 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb120108690 msgr2=0x7fb100005680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:49.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.885+0000 7fb126268640 1 --2- 192.168.123.105:0/3327886063 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb120108690 0x7fb100005680 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7fb104002a00 tx=0x7fb104030bd0 comp rx=0 tx=0).stop 2026-03-09T16:10:49.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.886+0000 7fb126268640 1 -- 192.168.123.105:0/3327886063 shutdown_connections 2026-03-09T16:10:49.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.886+0000 7fb126268640 1 --2- 192.168.123.105:0/3327886063 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb120108690 0x7fb100005680 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:49.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.886+0000 7fb126268640 1 -- 192.168.123.105:0/3327886063 >> 192.168.123.105:0/3327886063 conn(0x7fb1200fe3b0 msgr2=0x7fb1201007d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:49.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.886+0000 7fb126268640 1 -- 192.168.123.105:0/3327886063 shutdown_connections 2026-03-09T16:10:49.888 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.886+0000 7fb126268640 1 -- 192.168.123.105:0/3327886063 wait complete. 
2026-03-09T16:10:49.888 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.887+0000 7fb126268640 1 Processor -- start 2026-03-09T16:10:49.888 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.887+0000 7fb126268640 1 -- start start 2026-03-09T16:10:49.888 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.887+0000 7fb126268640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb120108690 0x7fb12006f800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:49.888 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.887+0000 7fb126268640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb12006fd40 0x7fb1201a4040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:49.888 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.887+0000 7fb126268640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb1200702f0 con 0x7fb120108690 2026-03-09T16:10:49.888 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.887+0000 7fb126268640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb120070460 con 0x7fb12006fd40 2026-03-09T16:10:49.888 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.888+0000 7fb11f7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb120108690 0x7fb12006f800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:49.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.888+0000 7fb11f7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb120108690 0x7fb12006f800 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:35688/0 (socket says 192.168.123.105:35688) 2026-03-09T16:10:49.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.888+0000 7fb11f7fe640 1 -- 192.168.123.105:0/3328206894 learned_addr learned my addr 192.168.123.105:0/3328206894 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:10:49.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.888+0000 7fb11f7fe640 1 -- 192.168.123.105:0/3328206894 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb12006fd40 msgr2=0x7fb1201a4040 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:49.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.888+0000 7fb11f7fe640 1 --2- 192.168.123.105:0/3328206894 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb12006fd40 0x7fb1201a4040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:49.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.888+0000 7fb11f7fe640 1 -- 192.168.123.105:0/3328206894 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb1040026e0 con 0x7fb120108690 2026-03-09T16:10:49.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.889+0000 7fb11f7fe640 1 --2- 192.168.123.105:0/3328206894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb120108690 0x7fb12006f800 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7fb104012680 tx=0x7fb1040126b0 comp rx=0 tx=0).ready 
entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:49.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.889+0000 7fb11cff9640 1 -- 192.168.123.105:0/3328206894 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb104034700 con 0x7fb120108690 2026-03-09T16:10:49.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.889+0000 7fb11cff9640 1 -- 192.168.123.105:0/3328206894 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb104034d20 con 0x7fb120108690 2026-03-09T16:10:49.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.889+0000 7fb11cff9640 1 -- 192.168.123.105:0/3328206894 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb10403dc60 con 0x7fb120108690 2026-03-09T16:10:49.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.890+0000 7fb126268640 1 -- 192.168.123.105:0/3328206894 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb1201a4580 con 0x7fb120108690 2026-03-09T16:10:49.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.890+0000 7fb126268640 1 -- 192.168.123.105:0/3328206894 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb1201a4a80 con 0x7fb120108690 2026-03-09T16:10:49.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.891+0000 7fb126268640 1 -- 192.168.123.105:0/3328206894 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb120103dd0 con 0x7fb120108690 2026-03-09T16:10:49.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.891+0000 7fb11cff9640 1 -- 192.168.123.105:0/3328206894 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 16) v1 ==== 49489+0+0 (secure 0 0 0) 0x7fb104034860 con 0x7fb120108690 2026-03-09T16:10:49.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.891+0000 7fb11cff9640 1 --2- 192.168.123.105:0/3328206894 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fb0f803d320 0x7fb0f803f7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:49.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.891+0000 7fb11cff9640 1 -- 192.168.123.105:0/3328206894 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7fb104078600 con 0x7fb120108690 2026-03-09T16:10:49.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.893+0000 7fb11effd640 1 --2- 192.168.123.105:0/3328206894 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fb0f803d320 0x7fb0f803f7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:49.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.894+0000 7fb11effd640 1 --2- 192.168.123.105:0/3328206894 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fb0f803d320 0x7fb0f803f7e0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fb10c006fd0 tx=0x7fb10c006e40 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:49.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.894+0000 7fb11cff9640 1 -- 192.168.123.105:0/3328206894 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fb10403ddc0 con 0x7fb120108690 2026-03-09T16:10:49.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:49.936+0000 7fb11cff9640 1 -- 192.168.123.105:0/3328206894 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 17) v1 ==== 98424+0+0 (secure 0 0 0) 0x7fb104061d10 con 0x7fb120108690 2026-03-09T16:10:50.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:50.042+0000 7fb126268640 1 -- 192.168.123.105:0/3328206894 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fb120073200 con 0x7fb120108690 2026-03-09T16:10:50.044 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:10:50.044 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":2,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","modified":"2026-03-09T16:10:44.821492Z","created":"2026-03-09T16:09:32.695561Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"},{"rank":1,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0,1]} 2026-03-09T16:10:50.044 INFO:teuthology.orchestra.run.vm05.stderr:dumped monmap epoch 2 2026-03-09T16:10:50.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:50.043+0000 7fb11cff9640 1 -- 192.168.123.105:0/3328206894 <== mon.0 v2:192.168.123.103:3300/0 8 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 2 v2) v1 ==== 95+0+1028 (secure 0 0 0) 0x7fb120073200 con 0x7fb120108690 2026-03-09T16:10:50.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:50.046+0000 7fb126268640 1 -- 192.168.123.105:0/3328206894 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fb0f803d320 msgr2=0x7fb0f803f7e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:50.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:50.046+0000 7fb126268640 1 --2- 192.168.123.105:0/3328206894 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fb0f803d320 0x7fb0f803f7e0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fb10c006fd0 tx=0x7fb10c006e40 comp rx=0 tx=0).stop 2026-03-09T16:10:50.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:50.046+0000 7fb126268640 1 -- 192.168.123.105:0/3328206894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb120108690 msgr2=0x7fb12006f800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:50.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:50.046+0000 7fb126268640 1 --2- 192.168.123.105:0/3328206894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb120108690 0x7fb12006f800 secure :-1 s=READY pgs=162 
cs=0 l=1 rev1=1 crypto rx=0x7fb104012680 tx=0x7fb1040126b0 comp rx=0 tx=0).stop 2026-03-09T16:10:50.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:50.046+0000 7fb126268640 1 -- 192.168.123.105:0/3328206894 shutdown_connections 2026-03-09T16:10:50.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:50.046+0000 7fb126268640 1 --2- 192.168.123.105:0/3328206894 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fb0f803d320 0x7fb0f803f7e0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:50.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:50.046+0000 7fb126268640 1 --2- 192.168.123.105:0/3328206894 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb12006fd40 0x7fb1201a4040 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:50.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:50.046+0000 7fb126268640 1 --2- 192.168.123.105:0/3328206894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb120108690 0x7fb12006f800 unknown :-1 s=CLOSED pgs=162 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:50.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:50.046+0000 7fb126268640 1 -- 192.168.123.105:0/3328206894 >> 192.168.123.105:0/3328206894 conn(0x7fb1200fe3b0 msgr2=0x7fb1201005d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:50.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:50.046+0000 7fb126268640 1 -- 192.168.123.105:0/3328206894 shutdown_connections 2026-03-09T16:10:50.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:10:50.046+0000 7fb126268640 1 -- 192.168.123.105:0/3328206894 wait complete. 2026-03-09T16:10:50.121 INFO:tasks.cephadm:Generating final ceph.conf file... 
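The "Waiting for 2 mons in monmap..." step above polls `ceph mon dump -f json` through `cephadm shell` until the monmap reports the expected number of monitors; the JSON dumped above shows vm03 and vm05 both present with quorum [0,1]. Below is a minimal sketch of that kind of check, assuming a plain `ceph` CLI on the host (rather than the cephadm wrapper used in this job) and the same JSON shape ("epoch" plus a "mons" list); the helper name is illustrative and is not the teuthology code:

    import json
    import subprocess
    import time

    def wait_for_mons(expected, timeout=300, interval=5):
        """Poll `ceph mon dump -f json` until the monmap lists `expected` mons."""
        deadline = time.time() + timeout
        while time.time() < deadline:
            out = subprocess.run(
                ["ceph", "mon", "dump", "-f", "json"],
                capture_output=True, text=True, check=True,
            ).stdout
            monmap = json.loads(out)
            names = [m["name"] for m in monmap.get("mons", [])]
            if len(names) >= expected:
                return monmap["epoch"], names
            time.sleep(interval)
        raise RuntimeError(f"timed out waiting for {expected} mons in monmap")

Against the monmap printed above, wait_for_mons(2) would return (2, ["vm03", "vm05"]).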
2026-03-09T16:10:50.121 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph config generate-minimal-conf 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: mon.vm03 calling monitor election 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: mon.vm05 calling monitor election 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: from='mgr.? 
192.168.123.105:0/884301469' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/crt"}]: dispatch 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: mon.vm03 is new leader, mons vm03,vm05 in quorum (ranks 0,1) 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: monmap e2: 2 mons at {vm03=[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0],vm05=[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]} removed_ranks: {} disallowed_leaders: {} 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: fsmap 2026-03-09T16:10:50.218 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: osdmap e5: 0 total, 0 up, 0 in 2026-03-09T16:10:50.219 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: mgrmap e16: vm03.gbgzmu(active, since 16s) 2026-03-09T16:10:50.219 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: overall HEALTH_OK 2026-03-09T16:10:50.219 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: Standby manager daemon vm05.dygxfv started 2026-03-09T16:10:50.219 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: from='mgr.? 192.168.123.105:0/884301469' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T16:10:50.219 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: from='mgr.? 192.168.123.105:0/884301469' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/key"}]: dispatch 2026-03-09T16:10:50.219 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: from='mgr.? 
192.168.123.105:0/884301469' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T16:10:50.219 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:49 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:50.256 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:10:50.280 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:10:50.280 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:10:50.280 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: mon.vm03 calling monitor election 2026-03-09T16:10:50.280 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:10:50.280 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:10:50.280 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:10:50.280 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: mon.vm05 calling monitor election 2026-03-09T16:10:50.280 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: from='mgr.? 
192.168.123.105:0/884301469' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/crt"}]: dispatch 2026-03-09T16:10:50.280 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:10:50.280 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:10:50.280 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:10:50.280 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:10:50.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: mon.vm03 is new leader, mons vm03,vm05 in quorum (ranks 0,1) 2026-03-09T16:10:50.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: monmap e2: 2 mons at {vm03=[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0],vm05=[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]} removed_ranks: {} disallowed_leaders: {} 2026-03-09T16:10:50.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: fsmap 2026-03-09T16:10:50.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: osdmap e5: 0 total, 0 up, 0 in 2026-03-09T16:10:50.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: mgrmap e16: vm03.gbgzmu(active, since 16s) 2026-03-09T16:10:50.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: overall HEALTH_OK 2026-03-09T16:10:50.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: Standby manager daemon vm05.dygxfv started 2026-03-09T16:10:50.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: from='mgr.? 192.168.123.105:0/884301469' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T16:10:50.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: from='mgr.? 192.168.123.105:0/884301469' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/key"}]: dispatch 2026-03-09T16:10:50.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: from='mgr.? 
192.168.123.105:0/884301469' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T16:10:50.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:49 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:50.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.493+0000 7f3eca0a1640 1 -- 192.168.123.103:0/2253871258 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3ec4108660 msgr2=0x7f3ec4108a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:50.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.493+0000 7f3eca0a1640 1 --2- 192.168.123.103:0/2253871258 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3ec4108660 0x7f3ec4108a40 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f3ea80099b0 tx=0x7f3ea802f2b0 comp rx=0 tx=0).stop 2026-03-09T16:10:50.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.494+0000 7f3eca0a1640 1 -- 192.168.123.103:0/2253871258 shutdown_connections 2026-03-09T16:10:50.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.494+0000 7f3eca0a1640 1 --2- 192.168.123.103:0/2253871258 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3ec4108660 0x7f3ec4108a40 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:50.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.494+0000 7f3eca0a1640 1 -- 192.168.123.103:0/2253871258 >> 192.168.123.103:0/2253871258 conn(0x7f3ec40fe2b0 msgr2=0x7f3ec41006d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:50.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.494+0000 7f3eca0a1640 1 -- 192.168.123.103:0/2253871258 shutdown_connections 2026-03-09T16:10:50.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.494+0000 7f3eca0a1640 1 -- 192.168.123.103:0/2253871258 wait complete. 
2026-03-09T16:10:50.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.495+0000 7f3eca0a1640 1 Processor -- start 2026-03-09T16:10:50.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.495+0000 7f3eca0a1640 1 -- start start 2026-03-09T16:10:50.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.495+0000 7f3eca0a1640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ec4108660 0x7f3ec4075700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:50.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.495+0000 7f3eca0a1640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3ec4079680 0x7f3ec4075c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:50.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.495+0000 7f3ec2ffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3ec4079680 0x7f3ec4075c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:50.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.495+0000 7f3eca0a1640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3ec4076210 con 0x7f3ec4079680 2026-03-09T16:10:50.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.496+0000 7f3ec2ffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3ec4079680 0x7f3ec4075c40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57330/0 (socket says 192.168.123.103:57330) 2026-03-09T16:10:50.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.496+0000 7f3ec2ffd640 1 -- 192.168.123.103:0/3288144585 learned_addr learned my addr 192.168.123.103:0/3288144585 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:10:50.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.496+0000 7f3ec37fe640 1 --2- 192.168.123.103:0/3288144585 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ec4108660 0x7f3ec4075700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:50.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.496+0000 7f3eca0a1640 1 -- 192.168.123.103:0/3288144585 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3ec4076380 con 0x7f3ec4108660 2026-03-09T16:10:50.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.496+0000 7f3ec2ffd640 1 -- 192.168.123.103:0/3288144585 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ec4108660 msgr2=0x7f3ec4075700 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:50.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.496+0000 7f3ec2ffd640 1 --2- 192.168.123.103:0/3288144585 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ec4108660 0x7f3ec4075700 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:50.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.496+0000 7f3ec2ffd640 1 -- 192.168.123.103:0/3288144585 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3ea8009660 con 0x7f3ec4079680 2026-03-09T16:10:50.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.496+0000 7f3ec2ffd640 1 --2- 192.168.123.103:0/3288144585 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3ec4079680 0x7f3ec4075c40 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f3eb000b840 tx=0x7f3eb000bd10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:50.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.497+0000 7f3ec0ff9640 1 -- 192.168.123.103:0/3288144585 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3eb000c850 con 0x7f3ec4079680 2026-03-09T16:10:50.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.497+0000 7f3ec0ff9640 1 -- 192.168.123.103:0/3288144585 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3eb000ce70 con 0x7f3ec4079680 2026-03-09T16:10:50.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.497+0000 7f3ec37fe640 1 --2- 192.168.123.103:0/3288144585 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ec4108660 0x7f3ec4075700 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:10:50.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.497+0000 7f3ec0ff9640 1 -- 192.168.123.103:0/3288144585 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3eb0012550 con 0x7f3ec4079680 2026-03-09T16:10:50.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.497+0000 7f3eca0a1640 1 -- 192.168.123.103:0/3288144585 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3ec41a8690 con 0x7f3ec4079680 2026-03-09T16:10:50.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.497+0000 7f3eca0a1640 1 -- 192.168.123.103:0/3288144585 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3ec41a8b80 con 0x7f3ec4079680 2026-03-09T16:10:50.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.498+0000 7f3eca0a1640 1 -- 192.168.123.103:0/3288144585 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3e88005350 con 0x7f3ec4079680 2026-03-09T16:10:50.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.501+0000 7f3ec0ff9640 1 -- 192.168.123.103:0/3288144585 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 17) v1 ==== 98424+0+0 (secure 0 0 0) 0x7f3eb001b020 con 0x7f3ec4079680 2026-03-09T16:10:50.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.501+0000 7f3ec0ff9640 1 --2- 192.168.123.103:0/3288144585 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3e98076200 0x7f3e980786c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:50.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.501+0000 7f3ec0ff9640 1 -- 192.168.123.103:0/3288144585 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7f3eb0095aa0 con 0x7f3ec4079680 2026-03-09T16:10:50.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.501+0000 7f3ec0ff9640 1 -- 192.168.123.103:0/3288144585 <== mon.0 v2:192.168.123.103:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f3eb00c38c0 con 0x7f3ec4079680 2026-03-09T16:10:50.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.502+0000 7f3ec37fe640 1 --2- 192.168.123.103:0/3288144585 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3e98076200 0x7f3e980786c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:50.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.502+0000 7f3ec37fe640 1 --2- 192.168.123.103:0/3288144585 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3e98076200 0x7f3e980786c0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f3ea80040c0 tx=0x7f3ea80023d0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:50.620 INFO:teuthology.orchestra.run.vm03.stdout:# minimal ceph.conf for 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:10:50.620 INFO:teuthology.orchestra.run.vm03.stdout:[global] 2026-03-09T16:10:50.620 INFO:teuthology.orchestra.run.vm03.stdout: fsid = 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:10:50.620 INFO:teuthology.orchestra.run.vm03.stdout: mon_host = [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 2026-03-09T16:10:50.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.614+0000 7f3eca0a1640 1 -- 192.168.123.103:0/3288144585 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7f3e880051c0 con 0x7f3ec4079680 2026-03-09T16:10:50.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.614+0000 7f3ec0ff9640 1 -- 192.168.123.103:0/3288144585 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v10) v1 ==== 76+0+235 (secure 0 0 0) 0x7f3eb0019100 con 0x7f3ec4079680 2026-03-09T16:10:50.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.621+0000 7f3eca0a1640 1 -- 192.168.123.103:0/3288144585 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3e98076200 msgr2=0x7f3e980786c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:50.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.621+0000 7f3eca0a1640 1 --2- 192.168.123.103:0/3288144585 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3e98076200 0x7f3e980786c0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f3ea80040c0 tx=0x7f3ea80023d0 comp rx=0 tx=0).stop 2026-03-09T16:10:50.624 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.621+0000 7f3eca0a1640 1 -- 192.168.123.103:0/3288144585 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3ec4079680 msgr2=0x7f3ec4075c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:50.624 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.621+0000 7f3eca0a1640 1 --2- 192.168.123.103:0/3288144585 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3ec4079680 0x7f3ec4075c40 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f3eb000b840 tx=0x7f3eb000bd10 comp rx=0 tx=0).stop 2026-03-09T16:10:50.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.622+0000 7f3eca0a1640 1 -- 192.168.123.103:0/3288144585 shutdown_connections 
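The `ceph config generate-minimal-conf` output shown above is a small INI-style file carrying only the fsid and the mon_host list (msgr v2 and v1 addresses for each monitor); it is what gets written to /etc/ceph/ceph.conf on each host in the distribution step that follows. As an aside, it parses with Python's stock configparser, for example (a sketch using the exact values from this run, not part of the test itself):

    import configparser

    MINIMAL_CONF = """\
    # minimal ceph.conf for 2b05df78-1bd2-11f1-83c0-c950214d6edc
    [global]
            fsid = 2b05df78-1bd2-11f1-83c0-c950214d6edc
            mon_host = [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]
    """

    parser = configparser.ConfigParser()
    parser.read_string(MINIMAL_CONF)
    assert parser["global"]["fsid"] == "2b05df78-1bd2-11f1-83c0-c950214d6edc"
    # Two monitors, each listed with both msgr v2 (3300) and v1 (6789) addresses.
    print(parser["global"]["mon_host"].split())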
2026-03-09T16:10:50.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.622+0000 7f3eca0a1640 1 --2- 192.168.123.103:0/3288144585 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3e98076200 0x7f3e980786c0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:50.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.622+0000 7f3eca0a1640 1 --2- 192.168.123.103:0/3288144585 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3ec4079680 0x7f3ec4075c40 unknown :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:50.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.622+0000 7f3eca0a1640 1 --2- 192.168.123.103:0/3288144585 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ec4108660 0x7f3ec4075700 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:50.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.622+0000 7f3eca0a1640 1 -- 192.168.123.103:0/3288144585 >> 192.168.123.103:0/3288144585 conn(0x7f3ec40fe2b0 msgr2=0x7f3ec40feeb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:50.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.623+0000 7f3eca0a1640 1 -- 192.168.123.103:0/3288144585 shutdown_connections 2026-03-09T16:10:50.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:50.624+0000 7f3eca0a1640 1 -- 192.168.123.103:0/3288144585 wait complete. 2026-03-09T16:10:50.688 INFO:tasks.cephadm:Distributing (final) config and client.admin keyring... 2026-03-09T16:10:50.688 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:10:50.688 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/ceph/ceph.conf 2026-03-09T16:10:50.724 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:10:50.724 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:10:50.804 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-09T16:10:50.804 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/ceph/ceph.conf 2026-03-09T16:10:50.836 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-09T16:10:50.836 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:10:50.910 INFO:tasks.cephadm:Deploying OSDs... 2026-03-09T16:10:50.910 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:10:50.910 DEBUG:teuthology.orchestra.run.vm03:> dd if=/scratch_devs of=/dev/stdout 2026-03-09T16:10:50.941 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T16:10:50.941 DEBUG:teuthology.orchestra.run.vm03:> ls /dev/[sv]d? 
2026-03-09T16:10:51.004 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vda 2026-03-09T16:10:51.004 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vdb 2026-03-09T16:10:51.004 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vdc 2026-03-09T16:10:51.004 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vdd 2026-03-09T16:10:51.004 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vde 2026-03-09T16:10:51.004 WARNING:teuthology.misc:Removing root device: /dev/vda from device list 2026-03-09T16:10:51.004 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde'] 2026-03-09T16:10:51.004 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vdb 2026-03-09T16:10:51.062 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vdb 2026-03-09T16:10:51.062 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T16:10:51.062 INFO:teuthology.orchestra.run.vm03.stdout:Device: 6h/6d Inode: 254 Links: 1 Device type: fc,10 2026-03-09T16:10:51.062 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T16:10:51.062 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T16:10:51.063 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-09 16:10:02.333880591 +0000 2026-03-09T16:10:51.063 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-09 16:03:59.159000000 +0000 2026-03-09T16:10:51.063 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-09 16:03:59.159000000 +0000 2026-03-09T16:10:51.063 INFO:teuthology.orchestra.run.vm03.stdout: Birth: 2026-03-09 16:03:57.261000000 +0000 2026-03-09T16:10:51.063 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vdb of=/dev/null count=1 2026-03-09T16:10:51.129 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in 2026-03-09T16:10:51.129 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out 2026-03-09T16:10:51.129 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000118813 s, 4.3 MB/s 2026-03-09T16:10:51.130 DEBUG:teuthology.orchestra.run.vm03:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdb 2026-03-09T16:10:51.192 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vdc 2026-03-09T16:10:51.250 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vdc 2026-03-09T16:10:51.250 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T16:10:51.250 INFO:teuthology.orchestra.run.vm03.stdout:Device: 6h/6d Inode: 255 Links: 1 Device type: fc,20 2026-03-09T16:10:51.250 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T16:10:51.250 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T16:10:51.250 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-09 16:10:02.426880728 +0000 2026-03-09T16:10:51.250 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-09 16:03:59.158000000 +0000 2026-03-09T16:10:51.250 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-09 16:03:59.158000000 +0000 2026-03-09T16:10:51.250 INFO:teuthology.orchestra.run.vm03.stdout: Birth: 2026-03-09 16:03:57.266000000 +0000 2026-03-09T16:10:51.251 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vdc of=/dev/null count=1 2026-03-09T16:10:51.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:50 vm05 ceph-mon[58702]: mgrmap e17: vm03.gbgzmu(active, since 16s), standbys: vm05.dygxfv 2026-03-09T16:10:51.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:50 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr metadata", "who": "vm05.dygxfv", "id": "vm05.dygxfv"}]: dispatch 2026-03-09T16:10:51.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:50 vm05 ceph-mon[58702]: from='client.? 192.168.123.105:0/3328206894' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:51.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:50 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:51.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:50 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:51.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:50 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:51.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:50 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:10:51.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:50 vm05 ceph-mon[58702]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T16:10:51.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:50 vm05 ceph-mon[58702]: Updating vm05:/etc/ceph/ceph.conf 2026-03-09T16:10:51.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:50 vm05 ceph-mon[58702]: from='client.? 
192.168.123.103:0/3288144585' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:51.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:50 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:10:51.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:50 vm03 ceph-mon[51019]: mgrmap e17: vm03.gbgzmu(active, since 16s), standbys: vm05.dygxfv 2026-03-09T16:10:51.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:50 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr metadata", "who": "vm05.dygxfv", "id": "vm05.dygxfv"}]: dispatch 2026-03-09T16:10:51.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:50 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/3328206894' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T16:10:51.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:50 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:51.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:50 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:51.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:50 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:51.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:50 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:10:51.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:50 vm03 ceph-mon[51019]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T16:10:51.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:50 vm03 ceph-mon[51019]: Updating vm05:/etc/ceph/ceph.conf 2026-03-09T16:10:51.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:50 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/3288144585' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:51.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:50 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:10:51.317 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in 2026-03-09T16:10:51.317 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out 2026-03-09T16:10:51.317 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000140664 s, 3.6 MB/s 2026-03-09T16:10:51.319 DEBUG:teuthology.orchestra.run.vm03:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdc 2026-03-09T16:10:51.398 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vdd 2026-03-09T16:10:51.467 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vdd 2026-03-09T16:10:51.468 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T16:10:51.468 INFO:teuthology.orchestra.run.vm03.stdout:Device: 6h/6d Inode: 256 Links: 1 Device type: fc,30 2026-03-09T16:10:51.468 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T16:10:51.468 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T16:10:51.468 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-09 16:10:02.493880826 +0000 2026-03-09T16:10:51.468 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-09 16:03:59.172000000 +0000 2026-03-09T16:10:51.468 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-09 16:03:59.172000000 +0000 2026-03-09T16:10:51.468 INFO:teuthology.orchestra.run.vm03.stdout: Birth: 2026-03-09 16:03:57.273000000 +0000 2026-03-09T16:10:51.468 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vdd of=/dev/null count=1 2026-03-09T16:10:51.539 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in 2026-03-09T16:10:51.539 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out 2026-03-09T16:10:51.539 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000150512 s, 3.4 MB/s 2026-03-09T16:10:51.540 DEBUG:teuthology.orchestra.run.vm03:> ! mount | grep -v devtmpfs | grep -q /dev/vdd 2026-03-09T16:10:51.605 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vde 2026-03-09T16:10:51.678 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vde 2026-03-09T16:10:51.678 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T16:10:51.678 INFO:teuthology.orchestra.run.vm03.stdout:Device: 6h/6d Inode: 257 Links: 1 Device type: fc,40 2026-03-09T16:10:51.678 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T16:10:51.678 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T16:10:51.678 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-09 16:10:02.563880929 +0000 2026-03-09T16:10:51.678 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-09 16:03:59.180000000 +0000 2026-03-09T16:10:51.678 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-09 16:03:59.180000000 +0000 2026-03-09T16:10:51.678 INFO:teuthology.orchestra.run.vm03.stdout: Birth: 2026-03-09 16:03:57.283000000 +0000 2026-03-09T16:10:51.679 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vde of=/dev/null count=1 2026-03-09T16:10:51.780 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in 2026-03-09T16:10:51.780 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out 2026-03-09T16:10:51.780 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000221335 s, 2.3 MB/s 2026-03-09T16:10:51.782 DEBUG:teuthology.orchestra.run.vm03:> ! mount | grep -v devtmpfs | grep -q /dev/vde 2026-03-09T16:10:51.806 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-09T16:10:51.806 DEBUG:teuthology.orchestra.run.vm05:> dd if=/scratch_devs of=/dev/stdout 2026-03-09T16:10:51.825 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T16:10:51.825 DEBUG:teuthology.orchestra.run.vm05:> ls /dev/[sv]d? 
2026-03-09T16:10:51.883 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vda 2026-03-09T16:10:51.883 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vdb 2026-03-09T16:10:51.883 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vdc 2026-03-09T16:10:51.883 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vdd 2026-03-09T16:10:51.883 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vde 2026-03-09T16:10:51.883 WARNING:teuthology.misc:Removing root device: /dev/vda from device list 2026-03-09T16:10:51.883 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde'] 2026-03-09T16:10:51.883 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vdb 2026-03-09T16:10:51.943 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vdb 2026-03-09T16:10:51.943 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T16:10:51.943 INFO:teuthology.orchestra.run.vm05.stdout:Device: 6h/6d Inode: 221 Links: 1 Device type: fc,10 2026-03-09T16:10:51.943 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T16:10:51.943 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T16:10:51.943 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-09 16:10:35.970959507 +0000 2026-03-09T16:10:51.943 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-09 16:03:28.136000000 +0000 2026-03-09T16:10:51.943 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-09 16:03:28.136000000 +0000 2026-03-09T16:10:51.943 INFO:teuthology.orchestra.run.vm05.stdout: Birth: 2026-03-09 16:03:26.260000000 +0000 2026-03-09T16:10:51.943 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vdb of=/dev/null count=1 2026-03-09T16:10:52.012 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in 2026-03-09T16:10:52.012 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out 2026-03-09T16:10:52.012 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000160922 s, 3.2 MB/s 2026-03-09T16:10:52.015 DEBUG:teuthology.orchestra.run.vm05:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdb 2026-03-09T16:10:52.073 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vdc 2026-03-09T16:10:52.132 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vdc 2026-03-09T16:10:52.132 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T16:10:52.133 INFO:teuthology.orchestra.run.vm05.stdout:Device: 6h/6d Inode: 224 Links: 1 Device type: fc,20 2026-03-09T16:10:52.133 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T16:10:52.133 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T16:10:52.133 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-09 16:10:36.035959579 +0000 2026-03-09T16:10:52.133 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-09 16:03:28.120000000 +0000 2026-03-09T16:10:52.133 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-09 16:03:28.120000000 +0000 2026-03-09T16:10:52.133 INFO:teuthology.orchestra.run.vm05.stdout: Birth: 2026-03-09 16:03:26.280000000 +0000 2026-03-09T16:10:52.133 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vdc of=/dev/null count=1 2026-03-09T16:10:52.199 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in 2026-03-09T16:10:52.199 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out 2026-03-09T16:10:52.199 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000181169 s, 2.8 MB/s 2026-03-09T16:10:52.200 DEBUG:teuthology.orchestra.run.vm05:> ! mount | grep -v devtmpfs | grep -q /dev/vdc 2026-03-09T16:10:52.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:10:52.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:10:52.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: Reconfiguring mon.vm03 (unknown last config time)... 
2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: Reconfiguring daemon mon.vm03 on vm03 2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: Reconfiguring mgr.vm03.gbgzmu (unknown last config time)... 2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.gbgzmu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: Reconfiguring daemon mgr.vm03.gbgzmu on vm03 2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:10:52.261 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:52.261 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vdd 2026-03-09T16:10:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:10:52.276 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:10:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: Reconfiguring mon.vm03 (unknown last config time)... 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: Reconfiguring daemon mon.vm03 on vm03 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: Reconfiguring mgr.vm03.gbgzmu (unknown last config time)... 
2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.gbgzmu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: Reconfiguring daemon mgr.vm03.gbgzmu on vm03 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:10:52.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:52.297 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vdd 2026-03-09T16:10:52.297 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T16:10:52.297 INFO:teuthology.orchestra.run.vm05.stdout:Device: 6h/6d Inode: 225 Links: 1 Device type: fc,30 2026-03-09T16:10:52.297 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T16:10:52.297 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T16:10:52.297 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-09 16:10:36.106959659 +0000 2026-03-09T16:10:52.297 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-09 16:03:28.111000000 +0000 2026-03-09T16:10:52.297 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-09 16:03:28.111000000 +0000 2026-03-09T16:10:52.297 INFO:teuthology.orchestra.run.vm05.stdout: Birth: 2026-03-09 16:03:26.289000000 +0000 2026-03-09T16:10:52.297 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vdd of=/dev/null count=1 2026-03-09T16:10:52.365 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in 2026-03-09T16:10:52.366 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out 2026-03-09T16:10:52.366 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000179665 s, 2.8 MB/s 2026-03-09T16:10:52.367 DEBUG:teuthology.orchestra.run.vm05:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdd 2026-03-09T16:10:52.429 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vde 2026-03-09T16:10:52.491 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vde 2026-03-09T16:10:52.491 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T16:10:52.491 INFO:teuthology.orchestra.run.vm05.stdout:Device: 6h/6d Inode: 226 Links: 1 Device type: fc,40 2026-03-09T16:10:52.491 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T16:10:52.491 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T16:10:52.491 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-09 16:10:36.180959742 +0000 2026-03-09T16:10:52.491 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-09 16:03:28.118000000 +0000 2026-03-09T16:10:52.491 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-09 16:03:28.118000000 +0000 2026-03-09T16:10:52.491 INFO:teuthology.orchestra.run.vm05.stdout: Birth: 2026-03-09 16:03:26.294000000 +0000 2026-03-09T16:10:52.492 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vde of=/dev/null count=1 2026-03-09T16:10:52.559 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in 2026-03-09T16:10:52.559 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out 2026-03-09T16:10:52.559 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000196367 s, 2.6 MB/s 2026-03-09T16:10:52.560 DEBUG:teuthology.orchestra.run.vm05:> ! mount | grep -v devtmpfs | grep -q /dev/vde 2026-03-09T16:10:52.621 INFO:tasks.cephadm:Deploying osd.0 on vm03 with /dev/vde... 2026-03-09T16:10:52.621 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- lvm zap /dev/vde 2026-03-09T16:10:52.824 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:10:53.380 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:53 vm03 ceph-mon[51019]: Reconfiguring ceph-exporter.vm03 (monmap changed)... 2026-03-09T16:10:53.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:53 vm03 ceph-mon[51019]: Reconfiguring daemon ceph-exporter.vm03 on vm03 2026-03-09T16:10:53.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:53 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:53.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:53 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:53.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:53 vm03 ceph-mon[51019]: Reconfiguring crash.vm03 (monmap changed)... 
2026-03-09T16:10:53.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:53 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:10:53.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:53 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:53.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:53 vm03 ceph-mon[51019]: Reconfiguring daemon crash.vm03 on vm03 2026-03-09T16:10:53.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:53 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:53.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:53 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:53.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:53 vm03 ceph-mon[51019]: Reconfiguring alertmanager.vm03 (dependencies changed)... 2026-03-09T16:10:53.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:53 vm03 ceph-mon[51019]: Reconfiguring daemon alertmanager.vm03 on vm03 2026-03-09T16:10:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:53 vm05 ceph-mon[58702]: Reconfiguring ceph-exporter.vm03 (monmap changed)... 2026-03-09T16:10:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:53 vm05 ceph-mon[58702]: Reconfiguring daemon ceph-exporter.vm03 on vm03 2026-03-09T16:10:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:53 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:53 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:53 vm05 ceph-mon[58702]: Reconfiguring crash.vm03 (monmap changed)... 2026-03-09T16:10:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:53 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:10:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:53 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:53 vm05 ceph-mon[58702]: Reconfiguring daemon crash.vm03 on vm03 2026-03-09T16:10:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:53 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:53 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:53 vm05 ceph-mon[58702]: Reconfiguring alertmanager.vm03 (dependencies changed)... 
2026-03-09T16:10:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:53 vm05 ceph-mon[58702]: Reconfiguring daemon alertmanager.vm03 on vm03 2026-03-09T16:10:53.603 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:10:53.615 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph orch daemon add osd vm03:/dev/vde 2026-03-09T16:10:53.940 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:10:54.249 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.247+0000 7f56f8a22640 1 -- 192.168.123.103:0/2571496080 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56ec0a4370 msgr2=0x7f56ec0a47f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:54.249 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.247+0000 7f56f8a22640 1 --2- 192.168.123.103:0/2571496080 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56ec0a4370 0x7f56ec0a47f0 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f56f4066a00 tx=0x7f56f4092a10 comp rx=0 tx=0).stop 2026-03-09T16:10:54.252 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.251+0000 7f56f8a22640 1 -- 192.168.123.103:0/2571496080 shutdown_connections 2026-03-09T16:10:54.252 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.251+0000 7f56f8a22640 1 --2- 192.168.123.103:0/2571496080 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56ec0a4370 0x7f56ec0a47f0 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:54.252 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.251+0000 7f56f8a22640 1 --2- 192.168.123.103:0/2571496080 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56ec0a5d30 0x7f56ec0a6130 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:54.252 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.251+0000 7f56f8a22640 1 -- 192.168.123.103:0/2571496080 >> 192.168.123.103:0/2571496080 conn(0x7f56ec09fec0 msgr2=0x7f56ec0a2320 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:10:54.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.251+0000 7f56f8a22640 1 -- 192.168.123.103:0/2571496080 shutdown_connections 2026-03-09T16:10:54.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.251+0000 7f56f8a22640 1 -- 192.168.123.103:0/2571496080 wait complete. 
2026-03-09T16:10:54.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.251+0000 7f56f8a22640 1 Processor -- start 2026-03-09T16:10:54.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.252+0000 7f56f8a22640 1 -- start start 2026-03-09T16:10:54.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.252+0000 7f56f8a22640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56ec0a4370 0x7f56ec0cffc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:54.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.252+0000 7f56f8a22640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56ec0a5d30 0x7f56ec0d0500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:54.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.252+0000 7f56f8a22640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f56ec0d1a00 con 0x7f56ec0a4370 2026-03-09T16:10:54.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.252+0000 7f56f8a22640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f56ec0d1b70 con 0x7f56ec0a5d30 2026-03-09T16:10:54.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.252+0000 7f56f2575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56ec0a4370 0x7f56ec0cffc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:54.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.252+0000 7f56f2575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56ec0a4370 0x7f56ec0cffc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57354/0 (socket says 192.168.123.103:57354) 2026-03-09T16:10:54.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.252+0000 7f56f2575640 1 -- 192.168.123.103:0/72040983 learned_addr learned my addr 192.168.123.103:0/72040983 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:10:54.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.252+0000 7f56f1d74640 1 --2- 192.168.123.103:0/72040983 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56ec0a5d30 0x7f56ec0d0500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:54.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.252+0000 7f56f2575640 1 -- 192.168.123.103:0/72040983 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56ec0a5d30 msgr2=0x7f56ec0d0500 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:10:54.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.252+0000 7f56f2575640 1 --2- 192.168.123.103:0/72040983 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56ec0a5d30 0x7f56ec0d0500 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:10:54.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.252+0000 7f56f2575640 1 -- 192.168.123.103:0/72040983 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f56f404f090 con 
0x7f56ec0a4370 2026-03-09T16:10:54.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.252+0000 7f56f2575640 1 --2- 192.168.123.103:0/72040983 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56ec0a4370 0x7f56ec0cffc0 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7f56e400b4f0 tx=0x7f56e400b9c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:54.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.255+0000 7f56e37fe640 1 -- 192.168.123.103:0/72040983 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f56e4004280 con 0x7f56ec0a4370 2026-03-09T16:10:54.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.255+0000 7f56e37fe640 1 -- 192.168.123.103:0/72040983 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f56e40043e0 con 0x7f56ec0a4370 2026-03-09T16:10:54.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.255+0000 7f56e37fe640 1 -- 192.168.123.103:0/72040983 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f56e4010bf0 con 0x7f56ec0a4370 2026-03-09T16:10:54.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.255+0000 7f56f8a22640 1 -- 192.168.123.103:0/72040983 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f56ec0d0b00 con 0x7f56ec0a4370 2026-03-09T16:10:54.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.255+0000 7f56f8a22640 1 -- 192.168.123.103:0/72040983 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f56ec0111f0 con 0x7f56ec0a4370 2026-03-09T16:10:54.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.255+0000 7f56f8a22640 1 -- 192.168.123.103:0/72040983 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f56c0005350 con 0x7f56ec0a4370 2026-03-09T16:10:54.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.257+0000 7f56e37fe640 1 -- 192.168.123.103:0/72040983 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 17) v1 ==== 98424+0+0 (secure 0 0 0) 0x7f56e40026e0 con 0x7f56ec0a4370 2026-03-09T16:10:54.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.257+0000 7f56e37fe640 1 --2- 192.168.123.103:0/72040983 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f56c4076130 0x7f56c40785f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:10:54.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.257+0000 7f56e37fe640 1 -- 192.168.123.103:0/72040983 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7f56e4096a90 con 0x7f56ec0a4370 2026-03-09T16:10:54.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.258+0000 7f56f1d74640 1 --2- 192.168.123.103:0/72040983 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f56c4076130 0x7f56c40785f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:10:54.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.260+0000 7f56f1d74640 1 --2- 192.168.123.103:0/72040983 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f56c4076130 
0x7f56c40785f0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f56f4050410 tx=0x7f56f40674b0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:10:54.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.260+0000 7f56e37fe640 1 -- 192.168.123.103:0/72040983 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f56e4061170 con 0x7f56ec0a4370 2026-03-09T16:10:54.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:10:54.368+0000 7f56f8a22640 1 -- 192.168.123.103:0/72040983 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7f56c0002bf0 con 0x7f56c4076130 2026-03-09T16:10:54.410 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:54 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:54.410 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:54 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:54.410 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:54 vm03 ceph-mon[51019]: Reconfiguring grafana.vm03 (dependencies changed)... 2026-03-09T16:10:54.410 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:54 vm03 ceph-mon[51019]: Reconfiguring daemon grafana.vm03 on vm03 2026-03-09T16:10:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:54 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:54 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:54 vm05 ceph-mon[58702]: Reconfiguring grafana.vm03 (dependencies changed)... 
2026-03-09T16:10:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:54 vm05 ceph-mon[58702]: Reconfiguring daemon grafana.vm03 on vm03 2026-03-09T16:10:55.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:55 vm03 ceph-mon[51019]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:10:55.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:55 vm03 ceph-mon[51019]: from='client.14262 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:10:55.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T16:10:55.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T16:10:55.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:55.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:55.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:55.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:55 vm03 ceph-mon[51019]: Reconfiguring prometheus.vm03 (dependencies changed)... 2026-03-09T16:10:55.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:55 vm05 ceph-mon[58702]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:10:55.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:55 vm05 ceph-mon[58702]: from='client.14262 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:10:55.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T16:10:55.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T16:10:55.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:10:55.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:55.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:10:55.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:55 vm05 ceph-mon[58702]: Reconfiguring prometheus.vm03 (dependencies changed)... 
2026-03-09T16:10:56.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:56 vm05 ceph-mon[58702]: Reconfiguring daemon prometheus.vm03 on vm03 2026-03-09T16:10:56.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:56 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/937446830' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "d36e00ca-e7bc-4475-866a-be22243d455f"}]: dispatch 2026-03-09T16:10:56.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:56 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/937446830' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "d36e00ca-e7bc-4475-866a-be22243d455f"}]': finished 2026-03-09T16:10:56.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:56 vm05 ceph-mon[58702]: osdmap e6: 1 total, 0 up, 1 in 2026-03-09T16:10:56.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:56 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:10:56.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:56 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/2039746932' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T16:10:56.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:56 vm03 ceph-mon[51019]: Reconfiguring daemon prometheus.vm03 on vm03 2026-03-09T16:10:56.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:56 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/937446830' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "d36e00ca-e7bc-4475-866a-be22243d455f"}]: dispatch 2026-03-09T16:10:56.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:56 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/937446830' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "d36e00ca-e7bc-4475-866a-be22243d455f"}]': finished 2026-03-09T16:10:56.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:56 vm03 ceph-mon[51019]: osdmap e6: 1 total, 0 up, 1 in 2026-03-09T16:10:56.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:56 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:10:56.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:56 vm03 ceph-mon[51019]: from='client.? 
192.168.123.103:0/2039746932' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T16:10:57.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:57 vm03 ceph-mon[51019]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:10:57.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:57 vm05 ceph-mon[58702]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:10:59.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:10:59 vm03 ceph-mon[51019]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:10:59.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:10:59 vm05 ceph-mon[58702]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: Reconfiguring ceph-exporter.vm05 (monmap changed)... 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: Reconfiguring daemon ceph-exporter.vm05 on vm05 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: Reconfiguring crash.vm05 (monmap changed)... 
2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: Reconfiguring daemon crash.vm05 on vm05 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: Deploying daemon osd.0 on vm03 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: Reconfiguring mgr.vm05.dygxfv (monmap changed)... 
2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.dygxfv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: Reconfiguring daemon mgr.vm05.dygxfv on vm05 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T16:11:00.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:00 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: Reconfiguring ceph-exporter.vm05 (monmap changed)... 
2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: Reconfiguring daemon ceph-exporter.vm05 on vm05 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: Reconfiguring crash.vm05 (monmap changed)... 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: Reconfiguring daemon crash.vm05 on vm05 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: Deploying daemon osd.0 on vm03 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: Reconfiguring mgr.vm05.dygxfv (monmap changed)... 
2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.dygxfv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: Reconfiguring daemon mgr.vm05.dygxfv on vm05 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T16:11:00.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:00 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:02.313 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: Reconfiguring mon.vm05 (monmap changed)... 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: Reconfiguring daemon mon.vm05 on vm05 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm03.local:9093"}]: dispatch 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm03.local:9093"}]: dispatch 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm03.local:3000"}]: dispatch 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm03.local:3000"}]: dispatch 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm03.local:9095"}]: dispatch 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm03.local:9095"}]: dispatch 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:02.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:02 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: Reconfiguring mon.vm05 (monmap changed)... 
2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: Reconfiguring daemon mon.vm05 on vm05 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm03.local:9093"}]: dispatch 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm03.local:9093"}]: dispatch 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm03.local:3000"}]: dispatch 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm03.local:3000"}]: dispatch 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm03.local:9095"}]: dispatch 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm03.local:9095"}]: dispatch 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:02.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:02 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:03.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:03.402+0000 7f56e37fe640 1 -- 192.168.123.103:0/72040983 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f56c0002bf0 con 0x7f56c4076130 2026-03-09T16:11:03.405 INFO:teuthology.orchestra.run.vm03.stdout:Created osd(s) 0 on host 'vm03' 2026-03-09T16:11:03.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:03.404+0000 7f56f8a22640 1 -- 192.168.123.103:0/72040983 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f56c4076130 msgr2=0x7f56c40785f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:03.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:03.404+0000 7f56f8a22640 1 --2- 192.168.123.103:0/72040983 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f56c4076130 0x7f56c40785f0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f56f4050410 tx=0x7f56f40674b0 comp rx=0 tx=0).stop 2026-03-09T16:11:03.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:03.404+0000 7f56f8a22640 1 -- 192.168.123.103:0/72040983 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56ec0a4370 msgr2=0x7f56ec0cffc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:03.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:03.404+0000 7f56f8a22640 1 --2- 192.168.123.103:0/72040983 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56ec0a4370 0x7f56ec0cffc0 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7f56e400b4f0 tx=0x7f56e400b9c0 comp rx=0 tx=0).stop 2026-03-09T16:11:03.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:03.404+0000 7f56f8a22640 1 -- 192.168.123.103:0/72040983 shutdown_connections 2026-03-09T16:11:03.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:03.404+0000 7f56f8a22640 1 --2- 192.168.123.103:0/72040983 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f56c4076130 0x7f56c40785f0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:03.406 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:03.404+0000 7f56f8a22640 1 --2- 192.168.123.103:0/72040983 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56ec0a5d30 0x7f56ec0d0500 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:03.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:03.404+0000 7f56f8a22640 1 --2- 192.168.123.103:0/72040983 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56ec0a4370 0x7f56ec0cffc0 unknown :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:03.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:03.404+0000 7f56f8a22640 1 -- 192.168.123.103:0/72040983 >> 192.168.123.103:0/72040983 conn(0x7f56ec09fec0 msgr2=0x7f56ec0049b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:03.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:03.405+0000 7f56f8a22640 1 -- 192.168.123.103:0/72040983 shutdown_connections 2026-03-09T16:11:03.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:03.405+0000 7f56f8a22640 1 -- 192.168.123.103:0/72040983 wait complete. 2026-03-09T16:11:03.499 DEBUG:teuthology.orchestra.run.vm03:osd.0> sudo journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.0.service 2026-03-09T16:11:03.512 INFO:tasks.cephadm:Deploying osd.1 on vm03 with /dev/vdd... 2026-03-09T16:11:03.512 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- lvm zap /dev/vdd 2026-03-09T16:11:03.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:03 vm03 ceph-mon[51019]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:11:03.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:03 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:03.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:03 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:03.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:03 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:03.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:03 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:03.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:03 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:03.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:03 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:03.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:03 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:03.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:03 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:03.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:03 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 
2026-03-09T16:11:03.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:03 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:03.752 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:11:03.805 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:03 vm05 ceph-mon[58702]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:11:03.805 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:03 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:03.805 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:03 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:03.805 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:03 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:03.805 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:03 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:03.805 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:03 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:03.805 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:03 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:03.805 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:03 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:03.805 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:03 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:03.805 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:03 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:03.805 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:03 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:04.292 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:11:04.312 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph orch daemon add osd vm03:/dev/vdd 2026-03-09T16:11:04.405 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:11:04 vm03 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0[70412]: 2026-03-09T16:11:04.149+0000 7f7f88894740 -1 osd.0 0 log_to_monitors true 2026-03-09T16:11:04.514 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:11:04.789 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:04 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:11:04.789 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:04 vm03 ceph-mon[51019]: 
from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:04.789 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:04 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:04.789 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:04 vm03 ceph-mon[51019]: from='osd.0 [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-09T16:11:04.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.938+0000 7f8bd3fff640 1 -- 192.168.123.103:0/1717088127 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8bd4072370 msgr2=0x7f8bd410c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:04.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.938+0000 7f8bd3fff640 1 --2- 192.168.123.103:0/1717088127 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8bd4072370 0x7f8bd410c590 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f8bcc01c7f0 tx=0x7f8bcc040af0 comp rx=0 tx=0).stop 2026-03-09T16:11:04.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.939+0000 7f8bd3fff640 1 -- 192.168.123.103:0/1717088127 shutdown_connections 2026-03-09T16:11:04.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.939+0000 7f8bd3fff640 1 --2- 192.168.123.103:0/1717088127 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8bd4072370 0x7f8bd410c590 unknown :-1 s=CLOSED pgs=174 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:04.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.939+0000 7f8bd3fff640 1 --2- 192.168.123.103:0/1717088127 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8bd40719a0 0x7f8bd4071da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:04.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.939+0000 7f8bd3fff640 1 -- 192.168.123.103:0/1717088127 >> 192.168.123.103:0/1717088127 conn(0x7f8bd406d4f0 msgr2=0x7f8bd406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:04.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.939+0000 7f8bd3fff640 1 -- 192.168.123.103:0/1717088127 shutdown_connections 2026-03-09T16:11:04.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.939+0000 7f8bd3fff640 1 -- 192.168.123.103:0/1717088127 wait complete. 
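The OSD deployment pattern driven by tasks.cephadm repeats for every data device in this run (it appears above for /dev/vdd and again below for /dev/vdc): the device is first zapped with ceph-volume, then handed to the orchestrator as a managed OSD. The two commands, reproduced from this log (image, fsid and key paths are specific to this run):

  # wipe any previous LVM/bluestore metadata on the device
  sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef ceph-volume \
      -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
      --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- lvm zap /dev/vdd
  # ask the orchestrator to create and deploy an OSD on that device
  sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell \
      -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
      --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph orch daemon add osd vm03:/dev/vdd

On the mon side the same operation shows up further down in this log as "osd new <uuid>" from client.bootstrap-osd, then "osd crush set-device-class" and "osd crush create-or-move" from the new osd, and finally the boot message once the OSD enters the osdmap ("osdmap e9: 2 total, 1 up, 2 in").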
2026-03-09T16:11:04.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.940+0000 7f8bd3fff640 1 Processor -- start 2026-03-09T16:11:04.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.940+0000 7f8bd3fff640 1 -- start start 2026-03-09T16:11:04.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.940+0000 7f8bd3fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8bd40719a0 0x7f8bd419e730 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:04.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.940+0000 7f8bd3fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8bd419ec70 0x7f8bd41a3ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:04.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.940+0000 7f8bd3fff640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8bd419f0f0 con 0x7f8bd419ec70 2026-03-09T16:11:04.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.940+0000 7f8bd3fff640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8bd419f260 con 0x7f8bd40719a0 2026-03-09T16:11:04.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.940+0000 7f8bd27fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8bd419ec70 0x7f8bd41a3ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.940+0000 7f8bd27fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8bd419ec70 0x7f8bd41a3ce0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59824/0 (socket says 192.168.123.103:59824) 2026-03-09T16:11:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.940+0000 7f8bd27fc640 1 -- 192.168.123.103:0/2373015186 learned_addr learned my addr 192.168.123.103:0/2373015186 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:11:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.940+0000 7f8bd27fc640 1 -- 192.168.123.103:0/2373015186 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8bd40719a0 msgr2=0x7f8bd419e730 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:11:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.940+0000 7f8bd27fc640 1 --2- 192.168.123.103:0/2373015186 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8bd40719a0 0x7f8bd419e730 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.940+0000 7f8bd27fc640 1 -- 192.168.123.103:0/2373015186 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8bcc009d00 con 0x7f8bd419ec70 2026-03-09T16:11:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.941+0000 7f8bd27fc640 1 --2- 192.168.123.103:0/2373015186 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8bd419ec70 0x7f8bd41a3ce0 secure :-1 s=READY pgs=175 cs=0 l=1 rev1=1 crypto rx=0x7f8bcc004a70 tx=0x7f8bcc0079f0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.941+0000 7f8bb3fff640 1 -- 192.168.123.103:0/2373015186 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8bcc007e00 con 0x7f8bd419ec70 2026-03-09T16:11:04.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.941+0000 7f8bd3fff640 1 -- 192.168.123.103:0/2373015186 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8bd41a4220 con 0x7f8bd419ec70 2026-03-09T16:11:04.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.941+0000 7f8bd3fff640 1 -- 192.168.123.103:0/2373015186 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8bd41a4720 con 0x7f8bd419ec70 2026-03-09T16:11:04.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.942+0000 7f8bb3fff640 1 -- 192.168.123.103:0/2373015186 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8bcc004510 con 0x7f8bd419ec70 2026-03-09T16:11:04.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.942+0000 7f8bb3fff640 1 -- 192.168.123.103:0/2373015186 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8bcc045400 con 0x7f8bd419ec70 2026-03-09T16:11:04.945 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.943+0000 7f8bb3fff640 1 -- 192.168.123.103:0/2373015186 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 17) v1 ==== 98424+0+0 (secure 0 0 0) 0x7f8bcc04c070 con 0x7f8bd419ec70 2026-03-09T16:11:04.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.945+0000 7f8bb3fff640 1 --2- 192.168.123.103:0/2373015186 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8ba8076200 0x7f8ba80786c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:04.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.945+0000 7f8bd2ffd640 1 --2- 192.168.123.103:0/2373015186 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8ba8076200 0x7f8ba80786c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:04.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.946+0000 7f8bd2ffd640 1 --2- 192.168.123.103:0/2373015186 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8ba8076200 0x7f8ba80786c0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f8bd4071800 tx=0x7f8bbc006cd0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:04.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.946+0000 7f8bb3fff640 1 -- 192.168.123.103:0/2373015186 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(6..6 src has 1..6) v4 ==== 1485+0+0 (secure 0 0 0) 0x7f8bcc0ce1e0 con 0x7f8bd419ec70 2026-03-09T16:11:04.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.946+0000 7f8bd3fff640 1 -- 192.168.123.103:0/2373015186 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8ba0005350 con 0x7f8bd419ec70 2026-03-09T16:11:04.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:04.953+0000 7f8bb3fff640 1 -- 192.168.123.103:0/2373015186 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f8bcc0987c0 con 0x7f8bd419ec70 2026-03-09T16:11:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:04 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:11:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:04 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:04 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:04 vm05 ceph-mon[58702]: from='osd.0 [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-09T16:11:05.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:05.090+0000 7f8bd3fff640 1 -- 192.168.123.103:0/2373015186 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7f8ba0002bf0 con 0x7f8ba8076200 2026-03-09T16:11:05.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:05 vm03 ceph-mon[51019]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:11:05.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:05 vm03 ceph-mon[51019]: from='osd.0 [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-09T16:11:05.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:05 vm03 ceph-mon[51019]: osdmap e7: 1 total, 0 up, 1 in 2026-03-09T16:11:05.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:05 vm03 ceph-mon[51019]: from='osd.0 [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T16:11:05.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:05 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:11:05.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:05 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T16:11:05.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:05 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T16:11:05.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:05 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:05.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:05 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:05.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:05 vm03 
ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:06.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:05 vm05 ceph-mon[58702]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:11:06.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:05 vm05 ceph-mon[58702]: from='osd.0 [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-09T16:11:06.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:05 vm05 ceph-mon[58702]: osdmap e7: 1 total, 0 up, 1 in 2026-03-09T16:11:06.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:05 vm05 ceph-mon[58702]: from='osd.0 [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T16:11:06.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:05 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:11:06.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:05 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T16:11:06.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:05 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T16:11:06.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:05 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:06.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:05 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:06.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:05 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:06.056 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:11:05 vm03 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0[70412]: 2026-03-09T16:11:05.979+0000 7f7f83ff0640 -1 osd.0 0 waiting for initial osdmap 2026-03-09T16:11:06.056 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:11:05 vm03 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0[70412]: 2026-03-09T16:11:05.984+0000 7f7f8062c640 -1 osd.0 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T16:11:06.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:06 vm03 ceph-mon[51019]: from='client.14280 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:11:06.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:06 vm03 ceph-mon[51019]: from='client.? 
192.168.123.103:0/472175681' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "77efea00-570c-4571-a7a6-968cc4097343"}]: dispatch 2026-03-09T16:11:06.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:06 vm03 ceph-mon[51019]: from='osd.0 [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished 2026-03-09T16:11:06.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:06 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/472175681' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "77efea00-570c-4571-a7a6-968cc4097343"}]': finished 2026-03-09T16:11:06.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:06 vm03 ceph-mon[51019]: osdmap e8: 2 total, 0 up, 2 in 2026-03-09T16:11:06.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:06 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:11:06.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:06 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:06.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:06 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:11:06.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:06 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/1908936801' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T16:11:07.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:06 vm05 ceph-mon[58702]: from='client.14280 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:11:07.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:06 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/472175681' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "77efea00-570c-4571-a7a6-968cc4097343"}]: dispatch 2026-03-09T16:11:07.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:06 vm05 ceph-mon[58702]: from='osd.0 [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished 2026-03-09T16:11:07.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:06 vm05 ceph-mon[58702]: from='client.? 
192.168.123.103:0/472175681' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "77efea00-570c-4571-a7a6-968cc4097343"}]': finished 2026-03-09T16:11:07.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:06 vm05 ceph-mon[58702]: osdmap e8: 2 total, 0 up, 2 in 2026-03-09T16:11:07.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:06 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:11:07.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:06 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:07.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:06 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:11:07.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:06 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/1908936801' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T16:11:07.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:07 vm03 ceph-mon[51019]: purged_snaps scrub starts 2026-03-09T16:11:07.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:07 vm03 ceph-mon[51019]: purged_snaps scrub ok 2026-03-09T16:11:07.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:07 vm03 ceph-mon[51019]: pgmap v11: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:11:07.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:07 vm03 ceph-mon[51019]: osd.0 [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527] boot 2026-03-09T16:11:07.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:07 vm03 ceph-mon[51019]: osdmap e9: 2 total, 1 up, 2 in 2026-03-09T16:11:07.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:07 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:11:07.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:07 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:08.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:07 vm05 ceph-mon[58702]: purged_snaps scrub starts 2026-03-09T16:11:08.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:07 vm05 ceph-mon[58702]: purged_snaps scrub ok 2026-03-09T16:11:08.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:07 vm05 ceph-mon[58702]: pgmap v11: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:11:08.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:07 vm05 ceph-mon[58702]: osd.0 [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527] boot 2026-03-09T16:11:08.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:07 vm05 ceph-mon[58702]: osdmap e9: 2 total, 1 up, 2 in 2026-03-09T16:11:08.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:07 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:11:08.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:07 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 
1}]: dispatch 2026-03-09T16:11:09.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:08 vm05 ceph-mon[58702]: pgmap v14: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:11:09.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:08 vm05 ceph-mon[58702]: osdmap e10: 2 total, 1 up, 2 in 2026-03-09T16:11:09.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:08 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:09.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:08 vm03 ceph-mon[51019]: pgmap v14: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T16:11:09.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:08 vm03 ceph-mon[51019]: osdmap e10: 2 total, 1 up, 2 in 2026-03-09T16:11:09.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:08 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:11.595 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:11 vm03 ceph-mon[51019]: pgmap v16: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T16:11:11.595 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:11 vm03 ceph-mon[51019]: Detected new or changed devices on vm03 2026-03-09T16:11:11.595 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:11 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:11.595 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:11 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:11.595 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:11 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:11:11.595 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:11 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:11.595 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:11 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:11.595 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:11 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:11.595 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:11 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-09T16:11:11.595 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:11 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:11 vm05 ceph-mon[58702]: pgmap v16: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T16:11:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:11 vm05 ceph-mon[58702]: Detected new or changed devices on vm03 2026-03-09T16:11:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:11 vm05 ceph-mon[58702]: from='mgr.14225 
192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:11 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:11 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:11:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:11 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:11 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:11 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:11 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-09T16:11:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:11 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:12.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:12 vm03 ceph-mon[51019]: Deploying daemon osd.1 on vm03 2026-03-09T16:11:12.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:12 vm05 ceph-mon[58702]: Deploying daemon osd.1 on vm03 2026-03-09T16:11:13.412 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:13 vm03 ceph-mon[51019]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T16:11:13.412 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:13 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:13.412 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:13 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:13.412 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:13 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:13.412 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:13 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:13.412 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:13 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:13 vm05 ceph-mon[58702]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T16:11:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:13 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:13 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' 
entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:13 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:13 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:13 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:13.613 INFO:teuthology.orchestra.run.vm03.stdout:Created osd(s) 1 on host 'vm03' 2026-03-09T16:11:13.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:13.609+0000 7f8bb3fff640 1 -- 192.168.123.103:0/2373015186 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f8ba0002bf0 con 0x7f8ba8076200 2026-03-09T16:11:13.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:13.611+0000 7f8bb1ffb640 1 -- 192.168.123.103:0/2373015186 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8ba8076200 msgr2=0x7f8ba80786c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:13.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:13.611+0000 7f8bb1ffb640 1 --2- 192.168.123.103:0/2373015186 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8ba8076200 0x7f8ba80786c0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f8bd4071800 tx=0x7f8bbc006cd0 comp rx=0 tx=0).stop 2026-03-09T16:11:13.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:13.611+0000 7f8bb1ffb640 1 -- 192.168.123.103:0/2373015186 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8bd419ec70 msgr2=0x7f8bd41a3ce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:13.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:13.611+0000 7f8bb1ffb640 1 --2- 192.168.123.103:0/2373015186 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8bd419ec70 0x7f8bd41a3ce0 secure :-1 s=READY pgs=175 cs=0 l=1 rev1=1 crypto rx=0x7f8bcc004a70 tx=0x7f8bcc0079f0 comp rx=0 tx=0).stop 2026-03-09T16:11:13.614 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:13.612+0000 7f8bb1ffb640 1 -- 192.168.123.103:0/2373015186 shutdown_connections 2026-03-09T16:11:13.614 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:13.612+0000 7f8bb1ffb640 1 --2- 192.168.123.103:0/2373015186 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8ba8076200 0x7f8ba80786c0 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:13.614 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:13.612+0000 7f8bb1ffb640 1 --2- 192.168.123.103:0/2373015186 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8bd419ec70 0x7f8bd41a3ce0 unknown :-1 s=CLOSED pgs=175 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:13.614 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:13.612+0000 7f8bb1ffb640 1 --2- 192.168.123.103:0/2373015186 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8bd40719a0 0x7f8bd419e730 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:13.614 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:13.612+0000 7f8bb1ffb640 1 -- 192.168.123.103:0/2373015186 >> 
192.168.123.103:0/2373015186 conn(0x7f8bd406d4f0 msgr2=0x7f8bd40703a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:13.614 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:13.612+0000 7f8bb1ffb640 1 -- 192.168.123.103:0/2373015186 shutdown_connections 2026-03-09T16:11:13.614 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:13.612+0000 7f8bb1ffb640 1 -- 192.168.123.103:0/2373015186 wait complete. 2026-03-09T16:11:13.690 DEBUG:teuthology.orchestra.run.vm03:osd.1> sudo journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.1.service 2026-03-09T16:11:13.692 INFO:tasks.cephadm:Deploying osd.2 on vm03 with /dev/vdc... 2026-03-09T16:11:13.692 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- lvm zap /dev/vdc 2026-03-09T16:11:13.918 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:11:14.521 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:11:14.533 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:11:14 vm03 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1[77484]: 2026-03-09T16:11:14.271+0000 7f9daeb1a740 -1 osd.1 0 log_to_monitors true 2026-03-09T16:11:14.542 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph orch daemon add osd vm03:/dev/vdc 2026-03-09T16:11:14.772 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:11:14.798 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:14 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:14.798 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:14 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:14.798 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:14 vm03 ceph-mon[51019]: from='osd.1 [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-09T16:11:14.798 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:14 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:14.798 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:14 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:15.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:14 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:15.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:14 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:15.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:14 vm05 ceph-mon[58702]: from='osd.1 [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-09T16:11:15.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:14 vm05 ceph-mon[58702]: 
from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:15.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:14 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:15.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.053+0000 7f936ddfd640 1 -- 192.168.123.103:0/3146860953 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9368072710 msgr2=0x7f936810c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:15.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.053+0000 7f936ddfd640 1 --2- 192.168.123.103:0/3146860953 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9368072710 0x7f936810c590 secure :-1 s=READY pgs=182 cs=0 l=1 rev1=1 crypto rx=0x7f9354009a00 tx=0x7f935402f290 comp rx=0 tx=0).stop 2026-03-09T16:11:15.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.055+0000 7f936ddfd640 1 -- 192.168.123.103:0/3146860953 shutdown_connections 2026-03-09T16:11:15.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.055+0000 7f936ddfd640 1 --2- 192.168.123.103:0/3146860953 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9368072710 0x7f936810c590 unknown :-1 s=CLOSED pgs=182 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:15.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.055+0000 7f936ddfd640 1 --2- 192.168.123.103:0/3146860953 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9368071d40 0x7f9368072140 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:15.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.055+0000 7f936ddfd640 1 -- 192.168.123.103:0/3146860953 >> 192.168.123.103:0/3146860953 conn(0x7f936806d660 msgr2=0x7f936806faa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:15.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.056+0000 7f936ddfd640 1 -- 192.168.123.103:0/3146860953 shutdown_connections 2026-03-09T16:11:15.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.056+0000 7f936ddfd640 1 -- 192.168.123.103:0/3146860953 wait complete. 
2026-03-09T16:11:15.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.057+0000 7f936ddfd640 1 Processor -- start 2026-03-09T16:11:15.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.057+0000 7f936ddfd640 1 -- start start 2026-03-09T16:11:15.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.057+0000 7f936ddfd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9368071d40 0x7f93681a73a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:15.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.057+0000 7f936ddfd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9368072710 0x7f93681a78e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:15.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.057+0000 7f936ddfd640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f93681a7eb0 con 0x7f9368071d40 2026-03-09T16:11:15.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.057+0000 7f936ddfd640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f93681a8020 con 0x7f9368072710 2026-03-09T16:11:15.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.058+0000 7f93677fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9368071d40 0x7f93681a73a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:15.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.058+0000 7f93677fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9368071d40 0x7f93681a73a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33998/0 (socket says 192.168.123.103:33998) 2026-03-09T16:11:15.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.058+0000 7f93677fe640 1 -- 192.168.123.103:0/2062560296 learned_addr learned my addr 192.168.123.103:0/2062560296 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:11:15.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.058+0000 7f93677fe640 1 -- 192.168.123.103:0/2062560296 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9368072710 msgr2=0x7f93681a78e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:11:15.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.058+0000 7f93677fe640 1 --2- 192.168.123.103:0/2062560296 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9368072710 0x7f93681a78e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:15.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.058+0000 7f93677fe640 1 -- 192.168.123.103:0/2062560296 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9354009660 con 0x7f9368071d40 2026-03-09T16:11:15.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.058+0000 7f93677fe640 1 --2- 192.168.123.103:0/2062560296 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9368071d40 0x7f93681a73a0 secure :-1 s=READY pgs=183 cs=0 l=1 rev1=1 crypto rx=0x7f935c00ba50 tx=0x7f935c00bf20 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:15.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.058+0000 7f9364ff9640 1 -- 192.168.123.103:0/2062560296 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f935c002c70 con 0x7f9368071d40 2026-03-09T16:11:15.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.058+0000 7f9364ff9640 1 -- 192.168.123.103:0/2062560296 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f935c002dd0 con 0x7f9368071d40 2026-03-09T16:11:15.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.059+0000 7f9364ff9640 1 -- 192.168.123.103:0/2062560296 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f935c004920 con 0x7f9368071d40 2026-03-09T16:11:15.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.059+0000 7f936ddfd640 1 -- 192.168.123.103:0/2062560296 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9368110000 con 0x7f9368071d40 2026-03-09T16:11:15.061 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.059+0000 7f936ddfd640 1 -- 192.168.123.103:0/2062560296 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9368110550 con 0x7f9368071d40 2026-03-09T16:11:15.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.060+0000 7f9364ff9640 1 -- 192.168.123.103:0/2062560296 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 17) v1 ==== 98424+0+0 (secure 0 0 0) 0x7f935c00c780 con 0x7f9368071d40 2026-03-09T16:11:15.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.061+0000 7f9364ff9640 1 --2- 192.168.123.103:0/2062560296 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f933c075ff0 0x7f933c0784b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:15.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.061+0000 7f9364ff9640 1 -- 192.168.123.103:0/2062560296 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(11..11 src has 1..11) v4 ==== 2108+0+0 (secure 0 0 0) 0x7f935c01a020 con 0x7f9368071d40 2026-03-09T16:11:15.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.062+0000 7f9366ffd640 1 --2- 192.168.123.103:0/2062560296 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f933c075ff0 0x7f933c0784b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:15.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.061+0000 7f936ddfd640 1 -- 192.168.123.103:0/2062560296 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9368118f60 con 0x7f9368071d40 2026-03-09T16:11:15.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.065+0000 7f9366ffd640 1 --2- 192.168.123.103:0/2062560296 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f933c075ff0 0x7f933c0784b0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f935402f7a0 tx=0x7f93540023d0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:15.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.065+0000 7f9364ff9640 1 -- 192.168.123.103:0/2062560296 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f935c060690 con 0x7f9368071d40 2026-03-09T16:11:15.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:15.194+0000 7f936ddfd640 1 -- 192.168.123.103:0/2062560296 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7f936810bf80 con 0x7f933c075ff0 2026-03-09T16:11:15.717 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:15 vm03 ceph-mon[51019]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T16:11:15.717 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:15 vm03 ceph-mon[51019]: from='osd.1 [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-09T16:11:15.717 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:15 vm03 ceph-mon[51019]: osdmap e11: 2 total, 1 up, 2 in 2026-03-09T16:11:15.717 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:15 vm03 ceph-mon[51019]: from='osd.1 [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T16:11:15.717 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:15 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:15.717 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:15 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T16:11:15.717 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:15 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T16:11:15.717 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:15 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:16.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:15 vm05 ceph-mon[58702]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T16:11:16.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:15 vm05 ceph-mon[58702]: from='osd.1 [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-09T16:11:16.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:15 vm05 ceph-mon[58702]: osdmap e11: 2 total, 1 up, 2 in 2026-03-09T16:11:16.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:15 vm05 ceph-mon[58702]: from='osd.1 [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T16:11:16.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:15 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", 
"id": 1}]: dispatch 2026-03-09T16:11:16.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:15 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T16:11:16.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:15 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T16:11:16.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:15 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:16.622 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='client.14298 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:11:16.622 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='osd.1 [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: osdmap e12: 2 total, 1 up, 2 in 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: pgmap v21: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": 
"config dump", "format": "json"}]: dispatch 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/2975442672' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "5f4a9aed-e670-4b8f-b945-c157bdccafca"}]: dispatch 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/2975442672' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "5f4a9aed-e670-4b8f-b945-c157bdccafca"}]': finished 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: osdmap e13: 3 total, 1 up, 3 in 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:16.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='client.14298 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='osd.1 [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: osdmap e12: 2 total, 1 up, 2 in 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: pgmap v21: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/2975442672' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "5f4a9aed-e670-4b8f-b945-c157bdccafca"}]: dispatch 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/2975442672' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "5f4a9aed-e670-4b8f-b945-c157bdccafca"}]': finished 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: osdmap e13: 3 total, 1 up, 3 in 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:17.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:18.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:17 vm03 ceph-mon[51019]: purged_snaps scrub starts 2026-03-09T16:11:18.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:17 vm03 ceph-mon[51019]: purged_snaps scrub ok 2026-03-09T16:11:18.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:17 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:18.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:17 vm03 ceph-mon[51019]: from='client.? 
192.168.123.103:0/2176450792' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T16:11:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:18 vm05 ceph-mon[58702]: purged_snaps scrub starts 2026-03-09T16:11:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:18 vm05 ceph-mon[58702]: purged_snaps scrub ok 2026-03-09T16:11:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:18 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:18 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/2176450792' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T16:11:19.390 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:11:19 vm03 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1[77484]: 2026-03-09T16:11:19.022+0000 7f9dabab0640 -1 osd.1 0 waiting for initial osdmap 2026-03-09T16:11:19.390 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:11:19 vm03 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1[77484]: 2026-03-09T16:11:19.148+0000 7f9da60b1640 -1 osd.1 13 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T16:11:19.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:19 vm03 ceph-mon[51019]: pgmap v23: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T16:11:19.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:19 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:19.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:19 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:19.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:19 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:11:19.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:19 vm05 ceph-mon[58702]: pgmap v23: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T16:11:19.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:19 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:19.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:19 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:19.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:19 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:11:20.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:20 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:20.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:20 vm03 ceph-mon[51019]: from='osd.1 [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072]' entity='osd.1' 2026-03-09T16:11:20.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:20 vm03 ceph-mon[51019]: from='mgr.14225 
192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:20.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:20 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:20.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:20 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:20.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:20 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:20.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:20 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:20.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:20 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:20.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:20 vm05 ceph-mon[58702]: from='osd.1 [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072]' entity='osd.1' 2026-03-09T16:11:20.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:20 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:20.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:20 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:20.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:20 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:20.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:20 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:20.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:20 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:21.375 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:21 vm03 ceph-mon[51019]: pgmap v24: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T16:11:21.375 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:21 vm03 ceph-mon[51019]: osd.1 [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072] boot 2026-03-09T16:11:21.375 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:21 vm03 ceph-mon[51019]: osdmap e14: 3 total, 2 up, 3 in 2026-03-09T16:11:21.375 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:21 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:21.375 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:21 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:21.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:21 vm05 ceph-mon[58702]: pgmap v24: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T16:11:21.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 
16:11:21 vm05 ceph-mon[58702]: osd.1 [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072] boot 2026-03-09T16:11:21.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:21 vm05 ceph-mon[58702]: osdmap e14: 3 total, 2 up, 3 in 2026-03-09T16:11:21.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:21 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:11:21.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:21 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:22.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:22 vm03 ceph-mon[51019]: osdmap e15: 3 total, 2 up, 3 in 2026-03-09T16:11:22.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:22 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:22 vm05 ceph-mon[58702]: osdmap e15: 3 total, 2 up, 3 in 2026-03-09T16:11:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:22 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:23.059 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:23 vm03 ceph-mon[51019]: pgmap v27: 0 pgs: ; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail 2026-03-09T16:11:23.059 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:23 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-09T16:11:23.059 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:23 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:23.059 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:23 vm03 ceph-mon[51019]: Deploying daemon osd.2 on vm03 2026-03-09T16:11:23.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:23 vm05 ceph-mon[58702]: pgmap v27: 0 pgs: ; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail 2026-03-09T16:11:23.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:23 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-09T16:11:23.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:23 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:23.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:23 vm05 ceph-mon[58702]: Deploying daemon osd.2 on vm03 2026-03-09T16:11:24.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:24 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:24.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:24 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:24.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:24 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' 
entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:24 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:24 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:24 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:24.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:24.787+0000 7f9364ff9640 1 -- 192.168.123.103:0/2062560296 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f936810bf80 con 0x7f933c075ff0 2026-03-09T16:11:24.791 INFO:teuthology.orchestra.run.vm03.stdout:Created osd(s) 2 on host 'vm03' 2026-03-09T16:11:24.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:24.789+0000 7f936ddfd640 1 -- 192.168.123.103:0/2062560296 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f933c075ff0 msgr2=0x7f933c0784b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:24.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:24.789+0000 7f936ddfd640 1 --2- 192.168.123.103:0/2062560296 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f933c075ff0 0x7f933c0784b0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f935402f7a0 tx=0x7f93540023d0 comp rx=0 tx=0).stop 2026-03-09T16:11:24.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:24.789+0000 7f936ddfd640 1 -- 192.168.123.103:0/2062560296 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9368071d40 msgr2=0x7f93681a73a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:24.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:24.789+0000 7f936ddfd640 1 --2- 192.168.123.103:0/2062560296 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9368071d40 0x7f93681a73a0 secure :-1 s=READY pgs=183 cs=0 l=1 rev1=1 crypto rx=0x7f935c00ba50 tx=0x7f935c00bf20 comp rx=0 tx=0).stop 2026-03-09T16:11:24.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:24.789+0000 7f936ddfd640 1 -- 192.168.123.103:0/2062560296 shutdown_connections 2026-03-09T16:11:24.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:24.789+0000 7f936ddfd640 1 --2- 192.168.123.103:0/2062560296 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f933c075ff0 0x7f933c0784b0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:24.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:24.790+0000 7f936ddfd640 1 --2- 192.168.123.103:0/2062560296 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9368072710 0x7f93681a78e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:24.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:24.790+0000 7f936ddfd640 1 --2- 192.168.123.103:0/2062560296 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9368071d40 0x7f93681a73a0 unknown :-1 s=CLOSED pgs=183 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:24.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:24.790+0000 
7f936ddfd640 1 -- 192.168.123.103:0/2062560296 >> 192.168.123.103:0/2062560296 conn(0x7f936806d660 msgr2=0x7f936810a860 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:24.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:24.790+0000 7f936ddfd640 1 -- 192.168.123.103:0/2062560296 shutdown_connections 2026-03-09T16:11:24.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:24.790+0000 7f936ddfd640 1 -- 192.168.123.103:0/2062560296 wait complete. 2026-03-09T16:11:24.852 DEBUG:teuthology.orchestra.run.vm03:osd.2> sudo journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.2.service 2026-03-09T16:11:24.854 INFO:tasks.cephadm:Deploying osd.3 on vm05 with /dev/vde... 2026-03-09T16:11:24.854 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- lvm zap /dev/vde 2026-03-09T16:11:25.010 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm05/config 2026-03-09T16:11:25.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:25 vm03 ceph-mon[51019]: pgmap v28: 0 pgs: ; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail 2026-03-09T16:11:25.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:25 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:25.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:25 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:25.156 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:25 vm05 ceph-mon[58702]: pgmap v28: 0 pgs: ; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail 2026-03-09T16:11:25.156 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:25 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:25.156 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:25 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:25.484 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:11:25 vm03 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2[85204]: 2026-03-09T16:11:25.481+0000 7fa3ecfa0740 -1 osd.2 0 log_to_monitors true 2026-03-09T16:11:25.511 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:11:25.525 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph orch daemon add osd vm05:/dev/vde 2026-03-09T16:11:25.682 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm05/config 2026-03-09T16:11:25.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.941+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/804399301 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8b4101a10 msgr2=0x7ff8b4101e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:25.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.941+0000 7ff8bc6fb640 1 --2- 192.168.123.105:0/804399301 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8b4101a10 0x7ff8b4101e90 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7ff8a4009a00 tx=0x7ff8a402f280 comp rx=0 tx=0).stop 
2026-03-09T16:11:25.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.942+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/804399301 shutdown_connections 2026-03-09T16:11:25.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.942+0000 7ff8bc6fb640 1 --2- 192.168.123.105:0/804399301 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8b4101a10 0x7ff8b4101e90 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:25.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.942+0000 7ff8bc6fb640 1 --2- 192.168.123.105:0/804399301 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff8b4100810 0x7ff8b4100c10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:25.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.942+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/804399301 >> 192.168.123.105:0/804399301 conn(0x7ff8b40fbf80 msgr2=0x7ff8b40fe3e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:25.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.943+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/804399301 shutdown_connections 2026-03-09T16:11:25.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.943+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/804399301 wait complete. 2026-03-09T16:11:25.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.943+0000 7ff8bc6fb640 1 Processor -- start 2026-03-09T16:11:25.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.944+0000 7ff8bc6fb640 1 -- start start 2026-03-09T16:11:25.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.944+0000 7ff8bc6fb640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8b4100810 0x7ff8b4198220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:25.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.944+0000 7ff8bc6fb640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff8b4101a10 0x7ff8b4198760 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:25.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.944+0000 7ff8bc6fb640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff8b4198d30 con 0x7ff8b4101a10 2026-03-09T16:11:25.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.944+0000 7ff8bc6fb640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff8b4198ea0 con 0x7ff8b4100810 2026-03-09T16:11:25.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.944+0000 7ff8ba470640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8b4100810 0x7ff8b4198220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:25.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.944+0000 7ff8ba470640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8b4100810 0x7ff8b4198220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:42980/0 (socket says 192.168.123.105:42980) 2026-03-09T16:11:25.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.944+0000 7ff8ba470640 1 -- 
192.168.123.105:0/2539818877 learned_addr learned my addr 192.168.123.105:0/2539818877 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:11:25.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.945+0000 7ff8ba470640 1 -- 192.168.123.105:0/2539818877 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff8b4101a10 msgr2=0x7ff8b4198760 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:11:25.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.945+0000 7ff8b9c6f640 1 --2- 192.168.123.105:0/2539818877 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff8b4101a10 0x7ff8b4198760 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:25.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.945+0000 7ff8ba470640 1 --2- 192.168.123.105:0/2539818877 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff8b4101a10 0x7ff8b4198760 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:25.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.945+0000 7ff8ba470640 1 -- 192.168.123.105:0/2539818877 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff8a4009660 con 0x7ff8b4100810 2026-03-09T16:11:25.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.945+0000 7ff8ba470640 1 --2- 192.168.123.105:0/2539818877 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8b4100810 0x7ff8b4198220 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7ff89c009e30 tx=0x7ff89c00b690 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:25.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.946+0000 7ff8ab7fe640 1 -- 192.168.123.105:0/2539818877 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff89c015070 con 0x7ff8b4100810 2026-03-09T16:11:25.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.946+0000 7ff8ab7fe640 1 -- 192.168.123.105:0/2539818877 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff89c0026e0 con 0x7ff8b4100810 2026-03-09T16:11:25.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.946+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/2539818877 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff8b40733a0 con 0x7ff8b4100810 2026-03-09T16:11:25.947 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.947+0000 7ff8ab7fe640 1 -- 192.168.123.105:0/2539818877 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff89c0197b0 con 0x7ff8b4100810 2026-03-09T16:11:25.948 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.948+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/2539818877 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff8b40738f0 con 0x7ff8b4100810 2026-03-09T16:11:25.948 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.949+0000 7ff8ab7fe640 1 -- 192.168.123.105:0/2539818877 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 17) v1 ==== 98424+0+0 (secure 0 0 0) 0x7ff89c0199b0 con 0x7ff8b4100810 2026-03-09T16:11:25.949 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.949+0000 7ff8ab7fe640 1 --2- 192.168.123.105:0/2539818877 >> 
[v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff888076200 0x7ff8880786c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:25.949 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.949+0000 7ff8ab7fe640 1 -- 192.168.123.105:0/2539818877 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(15..15 src has 1..15) v4 ==== 2519+0+0 (secure 0 0 0) 0x7ff89c0a0bb0 con 0x7ff8b4100810 2026-03-09T16:11:25.949 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.949+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/2539818877 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff8b4109440 con 0x7ff8b4100810 2026-03-09T16:11:25.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.951+0000 7ff8b9c6f640 1 --2- 192.168.123.105:0/2539818877 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff888076200 0x7ff8880786c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:25.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.951+0000 7ff8b9c6f640 1 --2- 192.168.123.105:0/2539818877 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff888076200 0x7ff8880786c0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7ff8a4002c80 tx=0x7ff8a4005b00 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:25.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:25.953+0000 7ff8ab7fe640 1 -- 192.168.123.105:0/2539818877 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7ff89c0a1050 con 0x7ff8b4100810 2026-03-09T16:11:26.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:26.053+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/2539818877 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7ff8b4105c50 con 0x7ff888076200 2026-03-09T16:11:26.591 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:26 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:26.591 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:26 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:26.591 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:26 vm05 ceph-mon[58702]: from='osd.2 [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-09T16:11:26.591 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:26 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T16:11:26.591 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:26 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T16:11:26.591 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:26 vm05 ceph-mon[58702]: from='mgr.14225 
192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:26.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:26 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:26.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:26 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:26.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:26 vm03 ceph-mon[51019]: from='osd.2 [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-09T16:11:26.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:26 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T16:11:26.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:26 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T16:11:26.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:26 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:27.640 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:11:27 vm03 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2[85204]: 2026-03-09T16:11:27.368+0000 7fa3e86fc640 -1 osd.2 0 waiting for initial osdmap 2026-03-09T16:11:27.641 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:11:27 vm03 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2[85204]: 2026-03-09T16:11:27.420+0000 7fa3e4d38640 -1 osd.2 17 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: pgmap v29: 0 pgs: ; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='client.24125 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='osd.2 [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: osdmap e16: 3 total, 2 up, 3 in 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='osd.2 [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='mgr.14225 
192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/2292165198' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "aa64c4f2-8110-40fd-928c-4df2efafc82e"}]: dispatch 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "aa64c4f2-8110-40fd-928c-4df2efafc82e"}]: dispatch 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='osd.2 [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "aa64c4f2-8110-40fd-928c-4df2efafc82e"}]': finished 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: osdmap e17: 4 total, 2 up, 4 in 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:28.184 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:27 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: pgmap v29: 0 pgs: ; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='client.24125 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='osd.2 [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: osdmap e16: 3 total, 2 up, 3 in 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='osd.2 [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='mgr.14225 
192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='client.? 192.168.123.105:0/2292165198' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "aa64c4f2-8110-40fd-928c-4df2efafc82e"}]: dispatch 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "aa64c4f2-8110-40fd-928c-4df2efafc82e"}]: dispatch 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='osd.2 [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "aa64c4f2-8110-40fd-928c-4df2efafc82e"}]': finished 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: osdmap e17: 4 total, 2 up, 4 in 2026-03-09T16:11:28.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:28.211 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:28.211 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:27 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:29.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:29 vm03 ceph-mon[51019]: purged_snaps scrub starts 2026-03-09T16:11:29.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:29 vm03 ceph-mon[51019]: purged_snaps scrub ok 2026-03-09T16:11:29.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:29 vm03 ceph-mon[51019]: Detected new or changed devices on vm03 2026-03-09T16:11:29.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:29 vm03 ceph-mon[51019]: pgmap v32: 0 pgs: ; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail 2026-03-09T16:11:29.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:29 vm03 ceph-mon[51019]: from='client.? 
192.168.123.105:0/2993626156' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T16:11:29.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:29 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:29.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:29 vm03 ceph-mon[51019]: osd.2 [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007] boot 2026-03-09T16:11:29.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:29 vm03 ceph-mon[51019]: osdmap e18: 4 total, 3 up, 4 in 2026-03-09T16:11:29.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:29 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:29.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:29 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:29 vm05 ceph-mon[58702]: purged_snaps scrub starts 2026-03-09T16:11:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:29 vm05 ceph-mon[58702]: purged_snaps scrub ok 2026-03-09T16:11:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:29 vm05 ceph-mon[58702]: Detected new or changed devices on vm03 2026-03-09T16:11:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:29 vm05 ceph-mon[58702]: pgmap v32: 0 pgs: ; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail 2026-03-09T16:11:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:29 vm05 ceph-mon[58702]: from='client.? 
192.168.123.105:0/2993626156' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T16:11:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:29 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:29 vm05 ceph-mon[58702]: osd.2 [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007] boot 2026-03-09T16:11:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:29 vm05 ceph-mon[58702]: osdmap e18: 4 total, 3 up, 4 in 2026-03-09T16:11:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:29 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:11:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:29 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:30 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:30 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:30 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:30 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:30 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:30.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:30 vm03 ceph-mon[51019]: pgmap v34: 0 pgs: ; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-09T16:11:30.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:30 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch 2026-03-09T16:11:30.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:30 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished 2026-03-09T16:11:30.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:30 vm03 ceph-mon[51019]: osdmap e19: 4 total, 3 up, 4 in 2026-03-09T16:11:30.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:30 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:30.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:30 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": 
".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch 2026-03-09T16:11:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:30 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:30 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:30 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:30 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:30 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:30 vm05 ceph-mon[58702]: pgmap v34: 0 pgs: ; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-09T16:11:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:30 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch 2026-03-09T16:11:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:30 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished 2026-03-09T16:11:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:30 vm05 ceph-mon[58702]: osdmap e19: 4 total, 3 up, 4 in 2026-03-09T16:11:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:30 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:30 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch 2026-03-09T16:11:32.140 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90531]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vdd 2026-03-09T16:11:32.140 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90531]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T16:11:32.140 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=167) 2026-03-09T16:11:32.140 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90531]: pam_unix(sudo:session): session closed for user root 2026-03-09T16:11:32.141 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90527]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vde 2026-03-09T16:11:32.141 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90527]: pam_systemd(sudo:session): Failed to 
connect to system bus: No such file or directory 2026-03-09T16:11:32.141 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=167) 2026-03-09T16:11:32.141 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90527]: pam_unix(sudo:session): session closed for user root 2026-03-09T16:11:32.141 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90535]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vdc 2026-03-09T16:11:32.141 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90535]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T16:11:32.141 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=167) 2026-03-09T16:11:32.141 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90535]: pam_unix(sudo:session): session closed for user root 2026-03-09T16:11:32.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90539]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vda 2026-03-09T16:11:32.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90539]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T16:11:32.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=167) 2026-03-09T16:11:32.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:31 vm03 sudo[90539]: pam_unix(sudo:session): session closed for user root 2026-03-09T16:11:32.238 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:31 vm05 sudo[63745]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vda 2026-03-09T16:11:32.238 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:31 vm05 sudo[63745]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T16:11:32.238 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:31 vm05 sudo[63745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=167) 2026-03-09T16:11:32.238 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:31 vm05 sudo[63745]: pam_unix(sudo:session): session closed for user root 2026-03-09T16:11:32.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:32.422+0000 7ff8ab7fe640 1 -- 192.168.123.105:0/2539818877 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7ff89c061d40 con 0x7ff8b4100810 2026-03-09T16:11:32.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:32 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-09T16:11:32.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:32 vm05 ceph-mon[58702]: osdmap e20: 4 total, 3 up, 4 in 2026-03-09T16:11:32.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:32 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:32.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:32 vm05 ceph-mon[58702]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 
2026-03-09T16:11:32.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:32 vm05 ceph-mon[58702]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T16:11:32.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:32 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:11:32.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:32 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:11:32.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:32 vm05 ceph-mon[58702]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T16:11:32.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:32 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:11:32.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:32 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:11:32.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:32 vm05 ceph-mon[58702]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T16:11:32.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:32 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-09T16:11:32.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:32 vm03 ceph-mon[51019]: osdmap e20: 4 total, 3 up, 4 in 2026-03-09T16:11:32.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:32 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:32.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:32 vm03 ceph-mon[51019]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T16:11:32.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:32 vm03 ceph-mon[51019]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T16:11:32.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:32 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:11:32.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:32 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:11:32.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:32 vm03 ceph-mon[51019]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T16:11:32.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:32 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:11:32.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:32 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 
cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:11:32.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:32 vm03 ceph-mon[51019]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T16:11:33.699 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:33 vm05 ceph-mon[58702]: pgmap v37: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-09T16:11:33.699 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:33 vm05 ceph-mon[58702]: mgrmap e18: vm03.gbgzmu(active, since 58s), standbys: vm05.dygxfv 2026-03-09T16:11:33.699 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:33 vm05 ceph-mon[58702]: osdmap e21: 4 total, 3 up, 4 in 2026-03-09T16:11:33.699 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:33 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:33.699 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:33 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T16:11:33.699 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:33 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:33.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:33 vm03 ceph-mon[51019]: pgmap v37: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-09T16:11:33.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:33 vm03 ceph-mon[51019]: mgrmap e18: vm03.gbgzmu(active, since 58s), standbys: vm05.dygxfv 2026-03-09T16:11:33.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:33 vm03 ceph-mon[51019]: osdmap e21: 4 total, 3 up, 4 in 2026-03-09T16:11:33.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:33 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:33.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:33 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T16:11:33.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:33 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:34.487 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:34 vm05 ceph-mon[58702]: Deploying daemon osd.3 on vm05 2026-03-09T16:11:34.487 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:34 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:11:34.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:34 vm03 ceph-mon[51019]: Deploying daemon osd.3 on vm05 2026-03-09T16:11:34.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:34 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:11:35.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:35.423+0000 7ff8ab7fe640 1 -- 192.168.123.105:0/2539818877 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== 
mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7ff8b4105c50 con 0x7ff888076200 2026-03-09T16:11:35.425 INFO:teuthology.orchestra.run.vm05.stdout:Created osd(s) 3 on host 'vm05' 2026-03-09T16:11:35.427 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:35.426+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/2539818877 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff888076200 msgr2=0x7ff8880786c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:35.427 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:35.426+0000 7ff8bc6fb640 1 --2- 192.168.123.105:0/2539818877 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff888076200 0x7ff8880786c0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7ff8a4002c80 tx=0x7ff8a4005b00 comp rx=0 tx=0).stop 2026-03-09T16:11:35.427 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:35.426+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/2539818877 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8b4100810 msgr2=0x7ff8b4198220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:35.427 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:35.426+0000 7ff8bc6fb640 1 --2- 192.168.123.105:0/2539818877 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8b4100810 0x7ff8b4198220 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7ff89c009e30 tx=0x7ff89c00b690 comp rx=0 tx=0).stop 2026-03-09T16:11:35.427 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:35.426+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/2539818877 shutdown_connections 2026-03-09T16:11:35.428 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:35.426+0000 7ff8bc6fb640 1 --2- 192.168.123.105:0/2539818877 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff888076200 0x7ff8880786c0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:35.428 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:35.426+0000 7ff8bc6fb640 1 --2- 192.168.123.105:0/2539818877 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff8b4101a10 0x7ff8b4198760 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:35.428 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:35.426+0000 7ff8bc6fb640 1 --2- 192.168.123.105:0/2539818877 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8b4100810 0x7ff8b4198220 secure :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7ff89c009e30 tx=0x7ff89c00b690 comp rx=0 tx=0).stop 2026-03-09T16:11:35.428 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:35.426+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/2539818877 >> 192.168.123.105:0/2539818877 conn(0x7ff8b40fbf80 msgr2=0x7ff8b40fdb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:35.428 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:35.427+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/2539818877 shutdown_connections 2026-03-09T16:11:35.428 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:35.428+0000 7ff8bc6fb640 1 -- 192.168.123.105:0/2539818877 wait complete. 2026-03-09T16:11:35.509 DEBUG:teuthology.orchestra.run.vm05:osd.3> sudo journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.3.service 2026-03-09T16:11:35.511 INFO:tasks.cephadm:Deploying osd.4 on vm05 with /dev/vdd... 
2026-03-09T16:11:35.511 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- lvm zap /dev/vdd 2026-03-09T16:11:35.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:35 vm03 ceph-mon[51019]: pgmap v39: 1 pgs: 1 unknown; 0 B data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-09T16:11:35.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:35 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:35.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:35 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:35.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:35 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:35.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:35 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:35.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:35 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:35.732 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm05/config 2026-03-09T16:11:35.764 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:35 vm05 ceph-mon[58702]: pgmap v39: 1 pgs: 1 unknown; 0 B data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-09T16:11:35.764 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:35 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:35.764 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:35 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:35.764 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:35 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:35.764 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:35 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:35.764 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:35 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:36.026 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:11:35 vm05 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3[64527]: 2026-03-09T16:11:35.903+0000 7f8c9f752740 -1 osd.3 0 log_to_monitors true 2026-03-09T16:11:36.300 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:11:36.314 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph orch daemon add osd vm05:/dev/vdd 2026-03-09T16:11:36.513 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm05/config 2026-03-09T16:11:36.708 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:36 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 
2026-03-09T16:11:36.708 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:36 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:36.708 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:36 vm05 ceph-mon[58702]: from='osd.3 [v2:192.168.123.105:6800/143716735,v1:192.168.123.105:6801/143716735]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T16:11:36.708 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:36 vm05 ceph-mon[58702]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T16:11:36.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.792+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3960182169 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd78072710 msgr2=0x7fcd7810c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.792+0000 7fcd7d18c640 1 --2- 192.168.123.105:0/3960182169 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd78072710 0x7fcd7810c590 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7fcd60009a00 tx=0x7fcd6002f290 comp rx=0 tx=0).stop 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.792+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3960182169 shutdown_connections 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.792+0000 7fcd7d18c640 1 --2- 192.168.123.105:0/3960182169 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd78072710 0x7fcd7810c590 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.792+0000 7fcd7d18c640 1 --2- 192.168.123.105:0/3960182169 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd78071d40 0x7fcd78072140 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.792+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3960182169 >> 192.168.123.105:0/3960182169 conn(0x7fcd7806d660 msgr2=0x7fcd7806faa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.792+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3960182169 shutdown_connections 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.792+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3960182169 wait complete. 
2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.792+0000 7fcd7d18c640 1 Processor -- start 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.792+0000 7fcd7d18c640 1 -- start start 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.792+0000 7fcd7d18c640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd78071d40 0x7fcd78116aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.792+0000 7fcd7d18c640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd78072710 0x7fcd78116fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.792+0000 7fcd7d18c640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd781184e0 con 0x7fcd78072710 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.792+0000 7fcd7d18c640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd78118650 con 0x7fcd78071d40 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.793+0000 7fcd76d76640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd78071d40 0x7fcd78116aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.793+0000 7fcd76d76640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd78071d40 0x7fcd78116aa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:52958/0 (socket says 192.168.123.105:52958) 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.793+0000 7fcd76d76640 1 -- 192.168.123.105:0/3631409164 learned_addr learned my addr 192.168.123.105:0/3631409164 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.793+0000 7fcd76d76640 1 -- 192.168.123.105:0/3631409164 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd78072710 msgr2=0x7fcd78116fe0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.793+0000 7fcd76d76640 1 --2- 192.168.123.105:0/3631409164 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd78072710 0x7fcd78116fe0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.793+0000 7fcd76d76640 1 -- 192.168.123.105:0/3631409164 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcd60009660 con 0x7fcd78071d40 2026-03-09T16:11:36.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.793+0000 7fcd76d76640 1 --2- 192.168.123.105:0/3631409164 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd78071d40 0x7fcd78116aa0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fcd6c00b750 tx=0x7fcd6c00bc20 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:36.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.794+0000 7fcd57fff640 1 -- 192.168.123.105:0/3631409164 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd6c004070 con 0x7fcd78071d40 2026-03-09T16:11:36.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.794+0000 7fcd57fff640 1 -- 192.168.123.105:0/3631409164 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fcd6c002780 con 0x7fcd78071d40 2026-03-09T16:11:36.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.794+0000 7fcd57fff640 1 -- 192.168.123.105:0/3631409164 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd6c00ca90 con 0x7fcd78071d40 2026-03-09T16:11:36.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.794+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3631409164 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcd781175e0 con 0x7fcd78071d40 2026-03-09T16:11:36.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.794+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3631409164 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcd781b5b30 con 0x7fcd78071d40 2026-03-09T16:11:36.796 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.796+0000 7fcd57fff640 1 -- 192.168.123.105:0/3631409164 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fcd6c00cbf0 con 0x7fcd78071d40 2026-03-09T16:11:36.796 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.796+0000 7fcd57fff640 1 --2- 192.168.123.105:0/3631409164 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcd4c076290 0x7fcd4c078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:36.796 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.797+0000 7fcd76575640 1 --2- 192.168.123.105:0/3631409164 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcd4c076290 0x7fcd4c078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:36.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.797+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3631409164 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcd44005350 con 0x7fcd78071d40 2026-03-09T16:11:36.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.797+0000 7fcd57fff640 1 -- 192.168.123.105:0/3631409164 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(22..22 src has 1..22) v4 ==== 3358+0+0 (secure 0 0 0) 0x7fcd6c0964a0 con 0x7fcd78071d40 2026-03-09T16:11:36.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.798+0000 7fcd76575640 1 --2- 192.168.123.105:0/3631409164 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcd4c076290 0x7fcd4c078750 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fcd6002f7a0 tx=0x7fcd600023d0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:36.801 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.801+0000 7fcd57fff640 1 -- 192.168.123.105:0/3631409164 <== mon.1 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fcd6c0603f0 con 0x7fcd78071d40 2026-03-09T16:11:36.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:36 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:36.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:36 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:36.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:36 vm03 ceph-mon[51019]: from='osd.3 [v2:192.168.123.105:6800/143716735,v1:192.168.123.105:6801/143716735]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T16:11:36.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:36 vm03 ceph-mon[51019]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T16:11:36.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:36.918+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3631409164 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7fcd44002bf0 con 0x7fcd4c076290 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: pgmap v40: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: osdmap e22: 4 total, 3 up, 4 in 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='osd.3 [v2:192.168.123.105:6800/143716735,v1:192.168.123.105:6801/143716735]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 
16:11:37 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:37.845 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:37 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: pgmap v40: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail 2026-03-09T16:11:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T16:11:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: osdmap e22: 4 total, 3 up, 4 in 2026-03-09T16:11:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='osd.3 [v2:192.168.123.105:6800/143716735,v1:192.168.123.105:6801/143716735]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:11:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:11:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='mgr.14225 
192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T16:11:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T16:11:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:11:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:37.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:37 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:38.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:38 vm03 ceph-mon[51019]: from='client.24149 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:11:38.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:38 vm03 ceph-mon[51019]: Detected new or changed devices on vm05 2026-03-09T16:11:38.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:38 vm03 ceph-mon[51019]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-09T16:11:38.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:38 vm03 ceph-mon[51019]: osdmap e23: 4 total, 3 up, 4 
in 2026-03-09T16:11:38.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:38 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:38.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:38 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:38.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:38 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/833309098' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "1567921f-08ce-4412-84d0-a4474c4e6ac0"}]: dispatch 2026-03-09T16:11:38.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:38 vm03 ceph-mon[51019]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "1567921f-08ce-4412-84d0-a4474c4e6ac0"}]: dispatch 2026-03-09T16:11:38.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:38 vm03 ceph-mon[51019]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "1567921f-08ce-4412-84d0-a4474c4e6ac0"}]': finished 2026-03-09T16:11:38.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:38 vm03 ceph-mon[51019]: osdmap e24: 5 total, 3 up, 5 in 2026-03-09T16:11:38.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:38 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:38.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:38 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:11:38.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:38 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/1302590287' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T16:11:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:38 vm05 ceph-mon[58702]: from='client.24149 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:11:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:38 vm05 ceph-mon[58702]: Detected new or changed devices on vm05 2026-03-09T16:11:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:38 vm05 ceph-mon[58702]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-09T16:11:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:38 vm05 ceph-mon[58702]: osdmap e23: 4 total, 3 up, 4 in 2026-03-09T16:11:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:38 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:38 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:38 vm05 ceph-mon[58702]: from='client.? 
192.168.123.105:0/833309098' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "1567921f-08ce-4412-84d0-a4474c4e6ac0"}]: dispatch 2026-03-09T16:11:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:38 vm05 ceph-mon[58702]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "1567921f-08ce-4412-84d0-a4474c4e6ac0"}]: dispatch 2026-03-09T16:11:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:38 vm05 ceph-mon[58702]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "1567921f-08ce-4412-84d0-a4474c4e6ac0"}]': finished 2026-03-09T16:11:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:38 vm05 ceph-mon[58702]: osdmap e24: 5 total, 3 up, 5 in 2026-03-09T16:11:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:38 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:38 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:11:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:38 vm05 ceph-mon[58702]: from='client.? 192.168.123.105:0/1302590287' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T16:11:39.026 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:11:38 vm05 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3[64527]: 2026-03-09T16:11:38.718+0000 7f8c9aeae640 -1 osd.3 0 waiting for initial osdmap 2026-03-09T16:11:39.026 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:11:38 vm05 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3[64527]: 2026-03-09T16:11:38.727+0000 7f8c96ce9640 -1 osd.3 24 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T16:11:39.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:39 vm03 ceph-mon[51019]: purged_snaps scrub starts 2026-03-09T16:11:39.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:39 vm03 ceph-mon[51019]: purged_snaps scrub ok 2026-03-09T16:11:39.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:39 vm03 ceph-mon[51019]: pgmap v44: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail 2026-03-09T16:11:39.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:39.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:39 vm03 ceph-mon[51019]: from='osd.3 ' entity='osd.3' 2026-03-09T16:11:39.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:39 vm03 ceph-mon[51019]: osd.3 [v2:192.168.123.105:6800/143716735,v1:192.168.123.105:6801/143716735] boot 2026-03-09T16:11:39.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:39 vm03 ceph-mon[51019]: osdmap e25: 5 total, 4 up, 5 in 2026-03-09T16:11:39.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:39.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:39 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:11:40.026 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:39 vm05 ceph-mon[58702]: purged_snaps scrub starts 2026-03-09T16:11:40.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:39 vm05 ceph-mon[58702]: purged_snaps scrub ok 2026-03-09T16:11:40.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:39 vm05 ceph-mon[58702]: pgmap v44: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail 2026-03-09T16:11:40.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:39 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:40.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:39 vm05 ceph-mon[58702]: from='osd.3 ' entity='osd.3' 2026-03-09T16:11:40.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:39 vm05 ceph-mon[58702]: osd.3 [v2:192.168.123.105:6800/143716735,v1:192.168.123.105:6801/143716735] boot 2026-03-09T16:11:40.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:39 vm05 ceph-mon[58702]: osdmap e25: 5 total, 4 up, 5 in 2026-03-09T16:11:40.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:39 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:11:40.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:39 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:11:41.263 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:40 vm05 ceph-mon[58702]: pgmap v46: 1 pgs: 1 active+clean; 449 KiB data, 507 MiB used, 79 GiB / 80 GiB avail 2026-03-09T16:11:41.263 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:40 vm05 ceph-mon[58702]: osdmap e26: 5 total, 4 up, 5 in 2026-03-09T16:11:41.263 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:40 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:11:41.376 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:40 vm03 ceph-mon[51019]: pgmap v46: 1 pgs: 1 active+clean; 449 KiB data, 507 MiB used, 79 GiB / 80 GiB avail 2026-03-09T16:11:41.376 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:40 vm03 ceph-mon[51019]: osdmap e26: 5 total, 4 up, 5 in 2026-03-09T16:11:41.376 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:40 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:11:42.194 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:41 vm05 ceph-mon[58702]: osdmap e27: 5 total, 4 up, 5 in 2026-03-09T16:11:42.194 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:41 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:11:42.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:41 vm03 ceph-mon[51019]: osdmap e27: 5 total, 4 up, 5 in 2026-03-09T16:11:42.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:41 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:11:43.001 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:42 vm05 ceph-mon[58702]: pgmap v49: 1 pgs: 1 peering; 449 KiB data, 507 MiB used, 79 GiB / 80 
GiB avail 2026-03-09T16:11:43.001 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:42 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T16:11:43.001 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:42 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:43.001 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:42 vm05 ceph-mon[58702]: Deploying daemon osd.4 on vm05 2026-03-09T16:11:43.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:42 vm03 ceph-mon[51019]: pgmap v49: 1 pgs: 1 peering; 449 KiB data, 507 MiB used, 79 GiB / 80 GiB avail 2026-03-09T16:11:43.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:42 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T16:11:43.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:42 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:43.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:42 vm03 ceph-mon[51019]: Deploying daemon osd.4 on vm05 2026-03-09T16:11:44.038 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:43 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:44.039 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:43 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:44.039 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:43 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:44.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:43 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:44.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:43 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:44.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:43 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:44.506 INFO:teuthology.orchestra.run.vm05.stdout:Created osd(s) 4 on host 'vm05' 2026-03-09T16:11:44.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:44.502+0000 7fcd57fff640 1 -- 192.168.123.105:0/3631409164 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fcd44002bf0 con 0x7fcd4c076290 2026-03-09T16:11:44.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:44.506+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3631409164 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcd4c076290 msgr2=0x7fcd4c078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:44.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:44.506+0000 7fcd7d18c640 1 --2- 192.168.123.105:0/3631409164 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcd4c076290 0x7fcd4c078750 secure :-1 s=READY pgs=51 
cs=0 l=1 rev1=1 crypto rx=0x7fcd6002f7a0 tx=0x7fcd600023d0 comp rx=0 tx=0).stop 2026-03-09T16:11:44.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:44.506+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3631409164 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd78071d40 msgr2=0x7fcd78116aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:44.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:44.506+0000 7fcd7d18c640 1 --2- 192.168.123.105:0/3631409164 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd78071d40 0x7fcd78116aa0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fcd6c00b750 tx=0x7fcd6c00bc20 comp rx=0 tx=0).stop 2026-03-09T16:11:44.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:44.506+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3631409164 shutdown_connections 2026-03-09T16:11:44.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:44.506+0000 7fcd7d18c640 1 --2- 192.168.123.105:0/3631409164 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcd4c076290 0x7fcd4c078750 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:44.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:44.506+0000 7fcd7d18c640 1 --2- 192.168.123.105:0/3631409164 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd78072710 0x7fcd78116fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:44.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:44.506+0000 7fcd7d18c640 1 --2- 192.168.123.105:0/3631409164 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd78071d40 0x7fcd78116aa0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:44.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:44.506+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3631409164 >> 192.168.123.105:0/3631409164 conn(0x7fcd7806d660 msgr2=0x7fcd7810a7c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:44.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:44.506+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3631409164 shutdown_connections 2026-03-09T16:11:44.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:44.507+0000 7fcd7d18c640 1 -- 192.168.123.105:0/3631409164 wait complete. 2026-03-09T16:11:44.566 DEBUG:teuthology.orchestra.run.vm05:osd.4> sudo journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.4.service 2026-03-09T16:11:44.569 INFO:tasks.cephadm:Deploying osd.5 on vm05 with /dev/vdc... 
2026-03-09T16:11:44.569 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- lvm zap /dev/vdc 2026-03-09T16:11:44.780 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm05/config 2026-03-09T16:11:45.335 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:11:45.349 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph orch daemon add osd vm05:/dev/vdc 2026-03-09T16:11:45.445 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:11:45 vm05 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4[70826]: 2026-03-09T16:11:45.127+0000 7f8fe8db1740 -1 osd.4 0 log_to_monitors true 2026-03-09T16:11:45.544 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm05/config 2026-03-09T16:11:45.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:45 vm03 ceph-mon[51019]: pgmap v50: 1 pgs: 1 peering; 449 KiB data, 507 MiB used, 79 GiB / 80 GiB avail 2026-03-09T16:11:45.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:45 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:45.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:45 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:45.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:45 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:45.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:45 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:45.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:45 vm03 ceph-mon[51019]: from='osd.4 [v2:192.168.123.105:6808/1555294449,v1:192.168.123.105:6809/1555294449]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T16:11:45.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:45 vm03 ceph-mon[51019]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T16:11:45.709 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:45 vm05 ceph-mon[58702]: pgmap v50: 1 pgs: 1 peering; 449 KiB data, 507 MiB used, 79 GiB / 80 GiB avail 2026-03-09T16:11:45.709 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:45 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:45.709 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:45 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:45.709 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:45 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:45.709 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:45 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:45.709 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:45 vm05 ceph-mon[58702]: from='osd.4 
[v2:192.168.123.105:6808/1555294449,v1:192.168.123.105:6809/1555294449]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T16:11:45.709 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:45 vm05 ceph-mon[58702]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T16:11:45.826 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.825+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/1369611611 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2d0072710 msgr2=0x7fe2d010c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:45.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.825+0000 7fe2d6d3f640 1 --2- 192.168.123.105:0/1369611611 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2d0072710 0x7fe2d010c590 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fe2b80099b0 tx=0x7fe2b802f240 comp rx=0 tx=0).stop 2026-03-09T16:11:45.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.828+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/1369611611 shutdown_connections 2026-03-09T16:11:45.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.828+0000 7fe2d6d3f640 1 --2- 192.168.123.105:0/1369611611 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2d0072710 0x7fe2d010c590 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:45.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.828+0000 7fe2d6d3f640 1 --2- 192.168.123.105:0/1369611611 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2d0071d40 0x7fe2d0072140 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:45.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.828+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/1369611611 >> 192.168.123.105:0/1369611611 conn(0x7fe2d006d660 msgr2=0x7fe2d006faa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:45.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.829+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/1369611611 shutdown_connections 2026-03-09T16:11:45.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.829+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/1369611611 wait complete. 
2026-03-09T16:11:45.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.829+0000 7fe2d6d3f640 1 Processor -- start 2026-03-09T16:11:45.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.829+0000 7fe2d6d3f640 1 -- start start 2026-03-09T16:11:45.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.830+0000 7fe2d6d3f640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2d0071d40 0x7fe2d0116ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:45.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.830+0000 7fe2d6d3f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2d0072710 0x7fe2d0117000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:45.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.830+0000 7fe2d6d3f640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe2d0118500 con 0x7fe2d0072710 2026-03-09T16:11:45.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.830+0000 7fe2d6d3f640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe2d0118670 con 0x7fe2d0071d40 2026-03-09T16:11:45.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.830+0000 7fe2d4ab4640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2d0071d40 0x7fe2d0116ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:45.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.830+0000 7fe2d4ab4640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2d0071d40 0x7fe2d0116ac0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:58106/0 (socket says 192.168.123.105:58106) 2026-03-09T16:11:45.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.830+0000 7fe2d4ab4640 1 -- 192.168.123.105:0/2743697440 learned_addr learned my addr 192.168.123.105:0/2743697440 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:11:45.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.830+0000 7fe2d4ab4640 1 -- 192.168.123.105:0/2743697440 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2d0072710 msgr2=0x7fe2d0117000 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:45.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.830+0000 7fe2d4ab4640 1 --2- 192.168.123.105:0/2743697440 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2d0072710 0x7fe2d0117000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:45.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.830+0000 7fe2d4ab4640 1 -- 192.168.123.105:0/2743697440 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe2b8009660 con 0x7fe2d0071d40 2026-03-09T16:11:45.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.830+0000 7fe2d4ab4640 1 --2- 192.168.123.105:0/2743697440 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2d0071d40 0x7fe2d0116ac0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fe2c400b700 tx=0x7fe2c400bbd0 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:45.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.831+0000 7fe2cdffb640 1 -- 192.168.123.105:0/2743697440 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe2c400be90 con 0x7fe2d0071d40 2026-03-09T16:11:45.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.831+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/2743697440 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe2d0117660 con 0x7fe2d0071d40 2026-03-09T16:11:45.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.831+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/2743697440 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe2d01b5b10 con 0x7fe2d0071d40 2026-03-09T16:11:45.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.831+0000 7fe2cdffb640 1 -- 192.168.123.105:0/2743697440 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe2c4002ba0 con 0x7fe2d0071d40 2026-03-09T16:11:45.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.831+0000 7fe2cdffb640 1 -- 192.168.123.105:0/2743697440 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe2c400ca50 con 0x7fe2d0071d40 2026-03-09T16:11:45.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.832+0000 7fe2cdffb640 1 -- 192.168.123.105:0/2743697440 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fe2c400cbb0 con 0x7fe2d0071d40 2026-03-09T16:11:45.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.833+0000 7fe2cdffb640 1 --2- 192.168.123.105:0/2743697440 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe2ac0761c0 0x7fe2ac078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:45.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.833+0000 7fe2cdffb640 1 -- 192.168.123.105:0/2743697440 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(28..28 src has 1..28) v4 ==== 3890+0+0 (secure 0 0 0) 0x7fe2c4097780 con 0x7fe2d0071d40 2026-03-09T16:11:45.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.834+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/2743697440 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe298005350 con 0x7fe2d0071d40 2026-03-09T16:11:45.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.838+0000 7fe2cffff640 1 --2- 192.168.123.105:0/2743697440 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe2ac0761c0 0x7fe2ac078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:45.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.838+0000 7fe2cffff640 1 --2- 192.168.123.105:0/2743697440 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe2ac0761c0 0x7fe2ac078680 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7fe2b802f750 tx=0x7fe2b8005b20 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:45.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.840+0000 7fe2cdffb640 1 -- 192.168.123.105:0/2743697440 <== 
mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fe2c4061430 con 0x7fe2d0071d40 2026-03-09T16:11:45.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:45.938+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/2743697440 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7fe298002bf0 con 0x7fe2ac0761c0 2026-03-09T16:11:46.849 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:11:46 vm05 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4[70826]: 2026-03-09T16:11:46.647+0000 7f8fe5d47640 -1 osd.4 0 waiting for initial osdmap 2026-03-09T16:11:46.849 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:11:46 vm05 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4[70826]: 2026-03-09T16:11:46.668+0000 7f8fe0b49640 -1 osd.4 29 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T16:11:46.849 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: osdmap e28: 5 total, 4 up, 5 in 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='osd.4 [v2:192.168.123.105:6808/1555294449,v1:192.168.123.105:6809/1555294449]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: pgmap v52: 1 pgs: 1 active+clean; 449 KiB data, 507 MiB used, 79 GiB / 80 GiB avail; 75 KiB/s, 0 objects/s recovering 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='client.24167 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: Detected new or 
changed devices on vm05 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:46.850 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:46 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T16:11:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: osdmap e28: 5 total, 4 up, 5 in 2026-03-09T16:11:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:11:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='osd.4 [v2:192.168.123.105:6808/1555294449,v1:192.168.123.105:6809/1555294449]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:11:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:11:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: pgmap v52: 1 pgs: 1 active+clean; 449 KiB data, 507 MiB used, 79 GiB / 80 GiB avail; 75 KiB/s, 0 objects/s 
recovering 2026-03-09T16:11:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='client.24167 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:11:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T16:11:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T16:11:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: Detected new or changed devices on vm05 2026-03-09T16:11:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:47.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:47.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:11:47.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:47.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:47.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:47.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:47.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:47.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:47.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:46 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:48.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:47 vm05 ceph-mon[58702]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 
2026-03-09T16:11:48.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:47 vm05 ceph-mon[58702]: osdmap e29: 5 total, 4 up, 5 in 2026-03-09T16:11:48.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:47 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:11:48.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:47 vm05 ceph-mon[58702]: from='client.? 192.168.123.105:0/1240126598' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "c322dd19-66a4-4f40-abd7-54565e63f71b"}]: dispatch 2026-03-09T16:11:48.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:47 vm05 ceph-mon[58702]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "c322dd19-66a4-4f40-abd7-54565e63f71b"}]: dispatch 2026-03-09T16:11:48.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:47 vm05 ceph-mon[58702]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "c322dd19-66a4-4f40-abd7-54565e63f71b"}]': finished 2026-03-09T16:11:48.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:47 vm05 ceph-mon[58702]: osd.4 [v2:192.168.123.105:6808/1555294449,v1:192.168.123.105:6809/1555294449] boot 2026-03-09T16:11:48.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:47 vm05 ceph-mon[58702]: osdmap e30: 6 total, 5 up, 6 in 2026-03-09T16:11:48.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:47 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:11:48.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:47 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:11:48.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:47 vm05 ceph-mon[58702]: from='client.? 192.168.123.105:0/2101739670' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T16:11:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:47 vm03 ceph-mon[51019]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-09T16:11:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:47 vm03 ceph-mon[51019]: osdmap e29: 5 total, 4 up, 5 in 2026-03-09T16:11:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:47 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:11:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:47 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/1240126598' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "c322dd19-66a4-4f40-abd7-54565e63f71b"}]: dispatch 2026-03-09T16:11:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:47 vm03 ceph-mon[51019]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "c322dd19-66a4-4f40-abd7-54565e63f71b"}]: dispatch 2026-03-09T16:11:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:47 vm03 ceph-mon[51019]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "c322dd19-66a4-4f40-abd7-54565e63f71b"}]': finished 2026-03-09T16:11:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:47 vm03 ceph-mon[51019]: osd.4 [v2:192.168.123.105:6808/1555294449,v1:192.168.123.105:6809/1555294449] boot 2026-03-09T16:11:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:47 vm03 ceph-mon[51019]: osdmap e30: 6 total, 5 up, 6 in 2026-03-09T16:11:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:47 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:11:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:47 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:11:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:47 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/2101739670' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T16:11:49.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:48 vm03 ceph-mon[51019]: purged_snaps scrub starts 2026-03-09T16:11:49.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:48 vm03 ceph-mon[51019]: purged_snaps scrub ok 2026-03-09T16:11:49.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:48 vm03 ceph-mon[51019]: osdmap e31: 6 total, 5 up, 6 in 2026-03-09T16:11:49.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:48 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:11:49.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:48 vm03 ceph-mon[51019]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 533 MiB used, 99 GiB / 100 GiB avail; 112 KiB/s, 0 objects/s recovering 2026-03-09T16:11:49.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:48 vm05 ceph-mon[58702]: purged_snaps scrub starts 2026-03-09T16:11:49.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:48 vm05 ceph-mon[58702]: purged_snaps scrub ok 2026-03-09T16:11:49.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:48 vm05 ceph-mon[58702]: osdmap e31: 6 total, 5 up, 6 in 2026-03-09T16:11:49.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:48 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:11:49.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:48 vm05 ceph-mon[58702]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 533 MiB used, 99 GiB / 100 GiB avail; 112 KiB/s, 0 objects/s recovering 2026-03-09T16:11:50.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:49 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:11:50.181 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:49 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:11:50.843 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:50 vm05 ceph-mon[58702]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 134 MiB used, 100 GiB / 100 GiB avail 2026-03-09T16:11:51.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:50 
vm03 ceph-mon[51019]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 134 MiB used, 100 GiB / 100 GiB avail 2026-03-09T16:11:52.054 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:51 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T16:11:52.054 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:51 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:52.054 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:51 vm05 ceph-mon[58702]: Deploying daemon osd.5 on vm05 2026-03-09T16:11:52.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:51 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T16:11:52.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:51 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:52.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:51 vm03 ceph-mon[51019]: Deploying daemon osd.5 on vm05 2026-03-09T16:11:52.929 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:52 vm05 ceph-mon[58702]: pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 134 MiB used, 100 GiB / 100 GiB avail 2026-03-09T16:11:52.929 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:52.929 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:52.929 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:52 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:53.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:52 vm03 ceph-mon[51019]: pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 134 MiB used, 100 GiB / 100 GiB avail 2026-03-09T16:11:53.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:53.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:53.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:52 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:53.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:53.582+0000 7fe2cdffb640 1 -- 192.168.123.105:0/2743697440 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fe298002bf0 con 0x7fe2ac0761c0 2026-03-09T16:11:53.586 INFO:teuthology.orchestra.run.vm05.stdout:Created osd(s) 5 on host 'vm05' 2026-03-09T16:11:53.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:53.585+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/2743697440 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe2ac0761c0 msgr2=0x7fe2ac078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-09T16:11:53.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:53.585+0000 7fe2d6d3f640 1 --2- 192.168.123.105:0/2743697440 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe2ac0761c0 0x7fe2ac078680 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7fe2b802f750 tx=0x7fe2b8005b20 comp rx=0 tx=0).stop 2026-03-09T16:11:53.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:53.585+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/2743697440 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2d0071d40 msgr2=0x7fe2d0116ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:53.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:53.585+0000 7fe2d6d3f640 1 --2- 192.168.123.105:0/2743697440 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2d0071d40 0x7fe2d0116ac0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fe2c400b700 tx=0x7fe2c400bbd0 comp rx=0 tx=0).stop 2026-03-09T16:11:53.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:53.585+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/2743697440 shutdown_connections 2026-03-09T16:11:53.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:53.586+0000 7fe2d6d3f640 1 --2- 192.168.123.105:0/2743697440 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe2ac0761c0 0x7fe2ac078680 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:53.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:53.586+0000 7fe2d6d3f640 1 --2- 192.168.123.105:0/2743697440 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2d0072710 0x7fe2d0117000 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:53.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:53.586+0000 7fe2d6d3f640 1 --2- 192.168.123.105:0/2743697440 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2d0071d40 0x7fe2d0116ac0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:53.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:53.586+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/2743697440 >> 192.168.123.105:0/2743697440 conn(0x7fe2d006d660 msgr2=0x7fe2d00709d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:53.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:53.586+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/2743697440 shutdown_connections 2026-03-09T16:11:53.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:53.586+0000 7fe2d6d3f640 1 -- 192.168.123.105:0/2743697440 wait complete. 2026-03-09T16:11:53.646 DEBUG:teuthology.orchestra.run.vm05:osd.5> sudo journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.5.service 2026-03-09T16:11:53.647 INFO:tasks.cephadm:Waiting for 6 OSDs to come up... 
2026-03-09T16:11:53.647 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd stat -f json 2026-03-09T16:11:53.822 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:11:54.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.076+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/1254721607 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103c60 msgr2=0x7fbbbc1040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:54.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.076+0000 7fbbc2ee4640 1 --2- 192.168.123.103:0/1254721607 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103c60 0x7fbbbc1040e0 secure :-1 s=READY pgs=194 cs=0 l=1 rev1=1 crypto rx=0x7fbbac0099b0 tx=0x7fbbac02f240 comp rx=0 tx=0).stop 2026-03-09T16:11:54.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.077+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/1254721607 shutdown_connections 2026-03-09T16:11:54.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.077+0000 7fbbc2ee4640 1 --2- 192.168.123.103:0/1254721607 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103c60 0x7fbbbc1040e0 unknown :-1 s=CLOSED pgs=194 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:54.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.077+0000 7fbbc2ee4640 1 --2- 192.168.123.103:0/1254721607 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbbc102a60 0x7fbbbc102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:54.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.077+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/1254721607 >> 192.168.123.103:0/1254721607 conn(0x7fbbbc0fe250 msgr2=0x7fbbbc100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:54.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.078+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/1254721607 shutdown_connections 2026-03-09T16:11:54.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.078+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/1254721607 wait complete. 
2026-03-09T16:11:54.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.078+0000 7fbbc2ee4640 1 Processor -- start 2026-03-09T16:11:54.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.078+0000 7fbbc2ee4640 1 -- start start 2026-03-09T16:11:54.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.079+0000 7fbbc2ee4640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbbc102a60 0x7fbbbc19a4d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:54.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.079+0000 7fbbc2ee4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103c60 0x7fbbbc19aa10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:54.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.079+0000 7fbbc2ee4640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbbbc19afe0 con 0x7fbbbc103c60 2026-03-09T16:11:54.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.079+0000 7fbbb3fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103c60 0x7fbbbc19aa10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:54.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.079+0000 7fbbb3fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103c60 0x7fbbbc19aa10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39616/0 (socket says 192.168.123.103:39616) 2026-03-09T16:11:54.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.079+0000 7fbbb3fff640 1 -- 192.168.123.103:0/3888333366 learned_addr learned my addr 192.168.123.103:0/3888333366 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:11:54.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.079+0000 7fbbc2ee4640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbbbc19b150 con 0x7fbbbc102a60 2026-03-09T16:11:54.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.079+0000 7fbbb3fff640 1 -- 192.168.123.103:0/3888333366 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbbc102a60 msgr2=0x7fbbbc19a4d0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T16:11:54.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.079+0000 7fbbb3fff640 1 --2- 192.168.123.103:0/3888333366 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbbc102a60 0x7fbbbc19a4d0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:54.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.079+0000 7fbbb3fff640 1 -- 192.168.123.103:0/3888333366 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbbac009660 con 0x7fbbbc103c60 2026-03-09T16:11:54.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.080+0000 7fbbb3fff640 1 --2- 192.168.123.103:0/3888333366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103c60 0x7fbbbc19aa10 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7fbbac009ae0 tx=0x7fbbac002980 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:54.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.080+0000 7fbbb1ffb640 1 -- 192.168.123.103:0/3888333366 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbbac03d070 con 0x7fbbbc103c60 2026-03-09T16:11:54.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.080+0000 7fbbb1ffb640 1 -- 192.168.123.103:0/3888333366 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbbac031e10 con 0x7fbbbc103c60 2026-03-09T16:11:54.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.080+0000 7fbbb1ffb640 1 -- 192.168.123.103:0/3888333366 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbbac031280 con 0x7fbbbc103c60 2026-03-09T16:11:54.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.080+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/3888333366 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbbbc19fb90 con 0x7fbbbc103c60 2026-03-09T16:11:54.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.080+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/3888333366 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbbbc1a0100 con 0x7fbbbc103c60 2026-03-09T16:11:54.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.081+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/3888333366 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbb88005350 con 0x7fbbbc103c60 2026-03-09T16:11:54.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.084+0000 7fbbb1ffb640 1 -- 192.168.123.103:0/3888333366 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fbbac02faa0 con 0x7fbbbc103c60 2026-03-09T16:11:54.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.084+0000 7fbbb1ffb640 1 --2- 192.168.123.103:0/3888333366 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fbb8c076290 0x7fbb8c078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:54.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.084+0000 7fbbb1ffb640 1 -- 192.168.123.103:0/3888333366 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(31..31 src has 1..31) v4 ==== 4301+0+0 (secure 0 0 0) 0x7fbbac0bc680 con 0x7fbbbc103c60 2026-03-09T16:11:54.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.084+0000 7fbbc0c59640 1 --2- 192.168.123.103:0/3888333366 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fbb8c076290 0x7fbb8c078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:54.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.085+0000 7fbbc0c59640 1 --2- 192.168.123.103:0/3888333366 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fbb8c076290 0x7fbb8c078750 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fbbbc103ac0 tx=0x7fbba4006cd0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:54.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.085+0000 7fbbb1ffb640 1 -- 192.168.123.103:0/3888333366 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fbbac086110 con 0x7fbbbc103c60 2026-03-09T16:11:54.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.177+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/3888333366 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7fbb880058d0 con 0x7fbbbc103c60 2026-03-09T16:11:54.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.178+0000 7fbbb1ffb640 1 -- 192.168.123.103:0/3888333366 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v31) v1 ==== 74+0+130 (secure 0 0 0) 0x7fbbac0bc020 con 0x7fbbbc103c60 2026-03-09T16:11:54.178 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:11:54.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.180+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/3888333366 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fbb8c076290 msgr2=0x7fbb8c078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:54.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.180+0000 7fbbc2ee4640 1 --2- 192.168.123.103:0/3888333366 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fbb8c076290 0x7fbb8c078750 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fbbbc103ac0 tx=0x7fbba4006cd0 comp rx=0 tx=0).stop 2026-03-09T16:11:54.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.180+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/3888333366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103c60 msgr2=0x7fbbbc19aa10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:54.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.180+0000 7fbbc2ee4640 1 --2- 192.168.123.103:0/3888333366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103c60 0x7fbbbc19aa10 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7fbbac009ae0 tx=0x7fbbac002980 comp rx=0 tx=0).stop 2026-03-09T16:11:54.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.181+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/3888333366 shutdown_connections 2026-03-09T16:11:54.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.181+0000 7fbbc2ee4640 1 --2- 192.168.123.103:0/3888333366 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fbb8c076290 0x7fbb8c078750 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:54.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.181+0000 7fbbc2ee4640 1 --2- 192.168.123.103:0/3888333366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103c60 0x7fbbbc19aa10 unknown :-1 s=CLOSED pgs=195 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:54.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.181+0000 7fbbc2ee4640 1 --2- 192.168.123.103:0/3888333366 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbbc102a60 0x7fbbbc19a4d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:54.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.181+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/3888333366 >> 192.168.123.103:0/3888333366 conn(0x7fbbbc0fe250 msgr2=0x7fbbbc0ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T16:11:54.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.181+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/3888333366 shutdown_connections 2026-03-09T16:11:54.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:54.181+0000 7fbbc2ee4640 1 -- 192.168.123.103:0/3888333366 wait complete. 2026-03-09T16:11:54.225 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":31,"num_osds":6,"num_up_osds":5,"osd_up_since":1773072706,"num_in_osds":6,"osd_in_since":1773072706,"num_remapped_pgs":0} 2026-03-09T16:11:54.581 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:54 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:54.581 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:54 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:54.581 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:54 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:54.581 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:54 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:54.581 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:54 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/3888333366' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T16:11:54.581 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:54 vm03 ceph-mon[51019]: from='osd.5 [v2:192.168.123.105:6816/262747247,v1:192.168.123.105:6817/262747247]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T16:11:54.581 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:54 vm03 ceph-mon[51019]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T16:11:54.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:54 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:54.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:54 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:54.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:54 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:54.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:54 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:54.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:54 vm05 ceph-mon[58702]: from='client.? 
192.168.123.103:0/3888333366' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T16:11:54.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:54 vm05 ceph-mon[58702]: from='osd.5 [v2:192.168.123.105:6816/262747247,v1:192.168.123.105:6817/262747247]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T16:11:54.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:54 vm05 ceph-mon[58702]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T16:11:54.776 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:11:54 vm05 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[77364]: 2026-03-09T16:11:54.473+0000 7fb97aff9740 -1 osd.5 0 log_to_monitors true 2026-03-09T16:11:55.225 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd stat -f json 2026-03-09T16:11:55.380 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:11:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.652+0000 7ff9cf577640 1 -- 192.168.123.103:0/3536422191 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9d0072370 msgr2=0x7ff9d010c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.652+0000 7ff9cf577640 1 --2- 192.168.123.103:0/3536422191 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9d0072370 0x7ff9d010c590 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7ff9c40099b0 tx=0x7ff9c402f240 comp rx=0 tx=0).stop 2026-03-09T16:11:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.653+0000 7ff9cf577640 1 -- 192.168.123.103:0/3536422191 shutdown_connections 2026-03-09T16:11:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.653+0000 7ff9cf577640 1 --2- 192.168.123.103:0/3536422191 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9d0072370 0x7ff9d010c590 secure :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7ff9c40099b0 tx=0x7ff9c402f240 comp rx=0 tx=0).stop 2026-03-09T16:11:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.653+0000 7ff9cf577640 1 --2- 192.168.123.103:0/3536422191 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff9d00719a0 0x7ff9d0071da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.653+0000 7ff9cf577640 1 -- 192.168.123.103:0/3536422191 >> 192.168.123.103:0/3536422191 conn(0x7ff9d006d4f0 msgr2=0x7ff9d006f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.653+0000 7ff9cf577640 1 -- 192.168.123.103:0/3536422191 shutdown_connections 2026-03-09T16:11:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.653+0000 7ff9cf577640 1 -- 192.168.123.103:0/3536422191 wait complete. 
2026-03-09T16:11:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.654+0000 7ff9cf577640 1 Processor -- start 2026-03-09T16:11:55.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.654+0000 7ff9cf577640 1 -- start start 2026-03-09T16:11:55.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.654+0000 7ff9cf577640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff9d00719a0 0x7ff9d0115870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:55.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.654+0000 7ff9cf577640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9d0117220 0x7ff9d0115db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:55.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.654+0000 7ff9cf577640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff9d01162f0 con 0x7ff9d00719a0 2026-03-09T16:11:55.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.654+0000 7ff9cf577640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff9d0116430 con 0x7ff9d0117220 2026-03-09T16:11:55.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.654+0000 7ff9ce575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff9d00719a0 0x7ff9d0115870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:55.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.655+0000 7ff9ce575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff9d00719a0 0x7ff9d0115870 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39636/0 (socket says 192.168.123.103:39636) 2026-03-09T16:11:55.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.655+0000 7ff9ce575640 1 -- 192.168.123.103:0/4205922285 learned_addr learned my addr 192.168.123.103:0/4205922285 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:11:55.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.655+0000 7ff9cdd74640 1 --2- 192.168.123.103:0/4205922285 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9d0117220 0x7ff9d0115db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:55.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.655+0000 7ff9ce575640 1 -- 192.168.123.103:0/4205922285 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9d0117220 msgr2=0x7ff9d0115db0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:55.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.655+0000 7ff9ce575640 1 --2- 192.168.123.103:0/4205922285 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9d0117220 0x7ff9d0115db0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:55.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.655+0000 7ff9ce575640 1 -- 192.168.123.103:0/4205922285 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7ff9c4009660 con 0x7ff9d00719a0 2026-03-09T16:11:55.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.655+0000 7ff9ce575640 1 --2- 192.168.123.103:0/4205922285 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff9d00719a0 0x7ff9d0115870 secure :-1 s=READY pgs=196 cs=0 l=1 rev1=1 crypto rx=0x7ff9c000bd50 tx=0x7ff9c0009f00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:55.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.655+0000 7ff9bf7fe640 1 -- 192.168.123.103:0/4205922285 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff9c0002c70 con 0x7ff9d00719a0 2026-03-09T16:11:55.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.656+0000 7ff9cf577640 1 -- 192.168.123.103:0/4205922285 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff9d01166b0 con 0x7ff9d00719a0 2026-03-09T16:11:55.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.656+0000 7ff9cf577640 1 -- 192.168.123.103:0/4205922285 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff9d01b5b50 con 0x7ff9d00719a0 2026-03-09T16:11:55.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.656+0000 7ff9bf7fe640 1 -- 192.168.123.103:0/4205922285 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff9c0002dd0 con 0x7ff9d00719a0 2026-03-09T16:11:55.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.656+0000 7ff9bf7fe640 1 -- 192.168.123.103:0/4205922285 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff9c0015cc0 con 0x7ff9d00719a0 2026-03-09T16:11:55.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.657+0000 7ff9bf7fe640 1 -- 192.168.123.103:0/4205922285 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7ff9c001f430 con 0x7ff9d00719a0 2026-03-09T16:11:55.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.658+0000 7ff9bf7fe640 1 --2- 192.168.123.103:0/4205922285 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff98c0761c0 0x7ff98c078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:55.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.658+0000 7ff9bf7fe640 1 -- 192.168.123.103:0/4205922285 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(32..32 src has 1..32) v4 ==== 4322+0+0 (secure 0 0 0) 0x7ff9c0098900 con 0x7ff9d00719a0 2026-03-09T16:11:55.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.658+0000 7ff9cdd74640 1 --2- 192.168.123.103:0/4205922285 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff98c0761c0 0x7ff98c078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:55.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.658+0000 7ff9cf577640 1 -- 192.168.123.103:0/4205922285 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff9a8005350 con 0x7ff9d00719a0 2026-03-09T16:11:55.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.659+0000 7ff9cdd74640 1 --2- 192.168.123.103:0/4205922285 >> 
[v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff98c0761c0 0x7ff98c078680 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7ff9c4002410 tx=0x7ff9c403a040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:55.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.661+0000 7ff9bf7fe640 1 -- 192.168.123.103:0/4205922285 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7ff9c00622f0 con 0x7ff9d00719a0 2026-03-09T16:11:55.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.768+0000 7ff9cf577640 1 -- 192.168.123.103:0/4205922285 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7ff9a80051c0 con 0x7ff9d00719a0 2026-03-09T16:11:55.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.769+0000 7ff9bf7fe640 1 -- 192.168.123.103:0/4205922285 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v33) v1 ==== 74+0+130 (secure 0 0 0) 0x7ff9c0061c90 con 0x7ff9d00719a0 2026-03-09T16:11:55.770 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:11:55.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.771+0000 7ff9cf577640 1 -- 192.168.123.103:0/4205922285 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff98c0761c0 msgr2=0x7ff98c078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:55.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.771+0000 7ff9cf577640 1 --2- 192.168.123.103:0/4205922285 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff98c0761c0 0x7ff98c078680 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7ff9c4002410 tx=0x7ff9c403a040 comp rx=0 tx=0).stop 2026-03-09T16:11:55.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.771+0000 7ff9cf577640 1 -- 192.168.123.103:0/4205922285 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff9d00719a0 msgr2=0x7ff9d0115870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:55.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.771+0000 7ff9cf577640 1 --2- 192.168.123.103:0/4205922285 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff9d00719a0 0x7ff9d0115870 secure :-1 s=READY pgs=196 cs=0 l=1 rev1=1 crypto rx=0x7ff9c000bd50 tx=0x7ff9c0009f00 comp rx=0 tx=0).stop 2026-03-09T16:11:55.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.772+0000 7ff9cf577640 1 -- 192.168.123.103:0/4205922285 shutdown_connections 2026-03-09T16:11:55.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.772+0000 7ff9cf577640 1 --2- 192.168.123.103:0/4205922285 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff98c0761c0 0x7ff98c078680 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:55.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.772+0000 7ff9cf577640 1 --2- 192.168.123.103:0/4205922285 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9d0117220 0x7ff9d0115db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:55.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.772+0000 7ff9cf577640 1 --2- 
192.168.123.103:0/4205922285 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff9d00719a0 0x7ff9d0115870 unknown :-1 s=CLOSED pgs=196 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:55.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.772+0000 7ff9cf577640 1 -- 192.168.123.103:0/4205922285 >> 192.168.123.103:0/4205922285 conn(0x7ff9d006d4f0 msgr2=0x7ff9d00702c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:55.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.772+0000 7ff9cf577640 1 -- 192.168.123.103:0/4205922285 shutdown_connections 2026-03-09T16:11:55.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:55.772+0000 7ff9cf577640 1 -- 192.168.123.103:0/4205922285 wait complete. 2026-03-09T16:11:55.800 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 134 MiB used, 100 GiB / 100 GiB avail 2026-03-09T16:11:55.800 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-09T16:11:55.800 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: osdmap e32: 6 total, 5 up, 6 in 2026-03-09T16:11:55.800 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:11:55.800 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: from='osd.5 [v2:192.168.123.105:6816/262747247,v1:192.168.123.105:6817/262747247]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:11:55.800 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:11:55.800 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:55.800 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:55.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:11:55.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:55.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:55.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:55.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' 
entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:55.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:55.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:55.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:55 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:55.823 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":33,"num_osds":6,"num_up_osds":5,"osd_up_since":1773072706,"num_in_osds":6,"osd_in_since":1773072706,"num_remapped_pgs":0} 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 134 MiB used, 100 GiB / 100 GiB avail 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: osdmap e32: 6 total, 5 up, 6 in 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: from='osd.5 [v2:192.168.123.105:6816/262747247,v1:192.168.123.105:6817/262747247]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 
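The `ceph osd stat -f json` replies above ({"epoch":33,"num_osds":6,"num_up_osds":5,...}) are what the task keeps re-querying until every OSD reports up. A minimal sketch of that kind of polling check is shown below; the `run_cephadm_ceph` helper is hypothetical and simply wraps the `sudo .../cephadm --image ... shell --fsid ... -- ceph ...` invocation visible in this log, so this is an illustration of the pattern, not the test framework's actual code.

import json
import subprocess
import time

# Values taken from the commands recorded in this log.
FSID = "2b05df78-1bd2-11f1-83c0-c950214d6edc"
IMAGE = "quay.ceph.io/ceph-ci/ceph:reef"
CEPHADM = "/home/ubuntu/cephtest/cephadm"

def run_cephadm_ceph(args):
    # Hypothetical helper: run a ceph CLI command inside a cephadm shell,
    # mirroring the invocations above, and return its stdout as text.
    cmd = ["sudo", CEPHADM, "--image", IMAGE, "shell", "--fsid", FSID, "--"] + args
    return subprocess.check_output(cmd, text=True)

def wait_for_all_osds_up(timeout=300, interval=1):
    # Poll `ceph osd stat -f json` until num_up_osds == num_osds,
    # the same fields printed in the JSON one-liners in this log.
    deadline = time.time() + timeout
    while time.time() < deadline:
        stat = json.loads(run_cephadm_ceph(["ceph", "osd", "stat", "-f", "json"]))
        if stat["num_up_osds"] == stat["num_osds"]:
            return stat
        time.sleep(interval)
    raise TimeoutError("not all OSDs came up within %s seconds" % timeout)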
2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:11:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:55 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:11:56.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:11:55 vm05 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[77364]: 2026-03-09T16:11:55.671+0000 7fb976755640 -1 osd.5 0 waiting for initial osdmap 2026-03-09T16:11:56.027 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:11:55 vm05 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[77364]: 2026-03-09T16:11:55.695+0000 7fb972d91640 -1 osd.5 33 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T16:11:56.824 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd stat -f json 2026-03-09T16:11:56.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:56 vm03 ceph-mon[51019]: Detected new or changed devices on vm05 2026-03-09T16:11:56.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:56 vm03 ceph-mon[51019]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-09T16:11:56.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:56 vm03 ceph-mon[51019]: osdmap e33: 6 total, 5 up, 6 in 2026-03-09T16:11:56.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:56 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:11:56.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:56 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:11:56.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:56 vm03 ceph-mon[51019]: from='client.? 
192.168.123.103:0/4205922285' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T16:11:56.992 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:11:57.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:56 vm05 ceph-mon[58702]: Detected new or changed devices on vm05 2026-03-09T16:11:57.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:56 vm05 ceph-mon[58702]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-09T16:11:57.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:56 vm05 ceph-mon[58702]: osdmap e33: 6 total, 5 up, 6 in 2026-03-09T16:11:57.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:56 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:11:57.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:56 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:11:57.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:56 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/4205922285' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T16:11:57.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.278+0000 7ff185663640 1 -- 192.168.123.103:0/2802930660 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff180075ba0 msgr2=0x7ff180075fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:57.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.278+0000 7ff185663640 1 --2- 192.168.123.103:0/2802930660 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff180075ba0 0x7ff180075fa0 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7ff1680099b0 tx=0x7ff16802f220 comp rx=0 tx=0).stop 2026-03-09T16:11:57.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.280+0000 7ff185663640 1 -- 192.168.123.103:0/2802930660 shutdown_connections 2026-03-09T16:11:57.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.280+0000 7ff185663640 1 --2- 192.168.123.103:0/2802930660 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff180076df0 0x7ff180077250 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:57.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.280+0000 7ff185663640 1 --2- 192.168.123.103:0/2802930660 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff180075ba0 0x7ff180075fa0 unknown :-1 s=CLOSED pgs=197 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:57.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.280+0000 7ff185663640 1 -- 192.168.123.103:0/2802930660 >> 192.168.123.103:0/2802930660 conn(0x7ff1800fe250 msgr2=0x7ff180100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:57.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.281+0000 7ff185663640 1 -- 192.168.123.103:0/2802930660 shutdown_connections 2026-03-09T16:11:57.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.281+0000 7ff185663640 1 -- 192.168.123.103:0/2802930660 wait complete. 
2026-03-09T16:11:57.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.281+0000 7ff185663640 1 Processor -- start 2026-03-09T16:11:57.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.282+0000 7ff185663640 1 -- start start 2026-03-09T16:11:57.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.282+0000 7ff185663640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff180075ba0 0x7ff18019e900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:57.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.282+0000 7ff185663640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff180076df0 0x7ff18019ee40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:57.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.282+0000 7ff185663640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff18019f380 con 0x7ff180076df0 2026-03-09T16:11:57.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.282+0000 7ff185663640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff18019f4f0 con 0x7ff180075ba0 2026-03-09T16:11:57.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.282+0000 7ff17e7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff180076df0 0x7ff18019ee40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:57.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.282+0000 7ff17e7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff180076df0 0x7ff18019ee40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39648/0 (socket says 192.168.123.103:39648) 2026-03-09T16:11:57.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.282+0000 7ff17e7fc640 1 -- 192.168.123.103:0/2470464929 learned_addr learned my addr 192.168.123.103:0/2470464929 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:11:57.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.283+0000 7ff17effd640 1 --2- 192.168.123.103:0/2470464929 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff180075ba0 0x7ff18019e900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:57.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.283+0000 7ff17e7fc640 1 -- 192.168.123.103:0/2470464929 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff180075ba0 msgr2=0x7ff18019e900 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:57.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.283+0000 7ff17e7fc640 1 --2- 192.168.123.103:0/2470464929 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff180075ba0 0x7ff18019e900 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:57.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.283+0000 7ff17e7fc640 1 -- 192.168.123.103:0/2470464929 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7ff168009660 con 0x7ff180076df0 2026-03-09T16:11:57.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.283+0000 7ff17e7fc640 1 --2- 192.168.123.103:0/2470464929 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff180076df0 0x7ff18019ee40 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7ff17400cc60 tx=0x7ff174007590 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:57.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.284+0000 7ff15ffff640 1 -- 192.168.123.103:0/2470464929 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff174007e00 con 0x7ff180076df0 2026-03-09T16:11:57.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.284+0000 7ff15ffff640 1 -- 192.168.123.103:0/2470464929 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff17400ce80 con 0x7ff180076df0 2026-03-09T16:11:57.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.284+0000 7ff15ffff640 1 -- 192.168.123.103:0/2470464929 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff17400f660 con 0x7ff180076df0 2026-03-09T16:11:57.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.284+0000 7ff185663640 1 -- 192.168.123.103:0/2470464929 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff1801a3fd0 con 0x7ff180076df0 2026-03-09T16:11:57.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.284+0000 7ff185663640 1 -- 192.168.123.103:0/2470464929 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff1801a44d0 con 0x7ff180076df0 2026-03-09T16:11:57.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.284+0000 7ff17effd640 1 --2- 192.168.123.103:0/2470464929 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff180075ba0 0x7ff18019e900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-09T16:11:57.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.286+0000 7ff185663640 1 -- 192.168.123.103:0/2470464929 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff14c005350 con 0x7ff180076df0 2026-03-09T16:11:57.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.286+0000 7ff15ffff640 1 -- 192.168.123.103:0/2470464929 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7ff1740040a0 con 0x7ff180076df0 2026-03-09T16:11:57.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.289+0000 7ff15ffff640 1 --2- 192.168.123.103:0/2470464929 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff1580761c0 0x7ff158078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:57.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.289+0000 7ff15ffff640 1 -- 192.168.123.103:0/2470464929 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4618+0+0 (secure 0 0 0) 0x7ff17401d030 con 0x7ff180076df0 2026-03-09T16:11:57.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.289+0000 7ff17effd640 1 --2- 192.168.123.103:0/2470464929 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff1580761c0 0x7ff158078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:57.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.290+0000 7ff17effd640 1 --2- 192.168.123.103:0/2470464929 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff1580761c0 0x7ff158078680 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7ff168002c20 tx=0x7ff16803a040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:57.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.290+0000 7ff15ffff640 1 -- 192.168.123.103:0/2470464929 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7ff174060f00 con 0x7ff180076df0 2026-03-09T16:11:57.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.390+0000 7ff185663640 1 -- 192.168.123.103:0/2470464929 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7ff14c0051c0 con 0x7ff180076df0 2026-03-09T16:11:57.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.391+0000 7ff15ffff640 1 -- 192.168.123.103:0/2470464929 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v34) v1 ==== 74+0+130 (secure 0 0 0) 0x7ff1740608a0 con 0x7ff180076df0 2026-03-09T16:11:57.392 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:11:57.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.393+0000 7ff185663640 1 -- 192.168.123.103:0/2470464929 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff1580761c0 msgr2=0x7ff158078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:57.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.393+0000 7ff185663640 1 --2- 192.168.123.103:0/2470464929 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] 
conn(0x7ff1580761c0 0x7ff158078680 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7ff168002c20 tx=0x7ff16803a040 comp rx=0 tx=0).stop 2026-03-09T16:11:57.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.394+0000 7ff185663640 1 -- 192.168.123.103:0/2470464929 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff180076df0 msgr2=0x7ff18019ee40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:57.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.394+0000 7ff185663640 1 --2- 192.168.123.103:0/2470464929 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff180076df0 0x7ff18019ee40 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7ff17400cc60 tx=0x7ff174007590 comp rx=0 tx=0).stop 2026-03-09T16:11:57.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.394+0000 7ff185663640 1 -- 192.168.123.103:0/2470464929 shutdown_connections 2026-03-09T16:11:57.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.394+0000 7ff185663640 1 --2- 192.168.123.103:0/2470464929 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ff1580761c0 0x7ff158078680 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:57.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.394+0000 7ff185663640 1 --2- 192.168.123.103:0/2470464929 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff180076df0 0x7ff18019ee40 unknown :-1 s=CLOSED pgs=198 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:57.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.394+0000 7ff185663640 1 --2- 192.168.123.103:0/2470464929 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff180075ba0 0x7ff18019e900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:57.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.394+0000 7ff185663640 1 -- 192.168.123.103:0/2470464929 >> 192.168.123.103:0/2470464929 conn(0x7ff1800fe250 msgr2=0x7ff1800ffd90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:57.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.394+0000 7ff185663640 1 -- 192.168.123.103:0/2470464929 shutdown_connections 2026-03-09T16:11:57.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.394+0000 7ff185663640 1 -- 192.168.123.103:0/2470464929 wait complete. 
2026-03-09T16:11:57.444 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":34,"num_osds":6,"num_up_osds":6,"osd_up_since":1773072716,"num_in_osds":6,"osd_in_since":1773072706,"num_remapped_pgs":0} 2026-03-09T16:11:57.444 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd dump --format=json 2026-03-09T16:11:57.597 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:11:57.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.829+0000 7f020e45f640 1 -- 192.168.123.103:0/3442070311 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0208075720 msgr2=0x7f0208075b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:57.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.829+0000 7f020e45f640 1 --2- 192.168.123.103:0/3442070311 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0208075720 0x7f0208075b00 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7f01f80099b0 tx=0x7f01f802f220 comp rx=0 tx=0).stop 2026-03-09T16:11:57.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.830+0000 7f020e45f640 1 -- 192.168.123.103:0/3442070311 shutdown_connections 2026-03-09T16:11:57.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.830+0000 7f020e45f640 1 --2- 192.168.123.103:0/3442070311 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0208076040 0x7f0208111330 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:57.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.830+0000 7f020e45f640 1 --2- 192.168.123.103:0/3442070311 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0208075720 0x7f0208075b00 secure :-1 s=CLOSED pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7f01f80099b0 tx=0x7f01f802f220 comp rx=0 tx=0).stop 2026-03-09T16:11:57.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.830+0000 7f020e45f640 1 -- 192.168.123.103:0/3442070311 >> 192.168.123.103:0/3442070311 conn(0x7f02080fe710 msgr2=0x7f0208100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:57.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.831+0000 7f020e45f640 1 -- 192.168.123.103:0/3442070311 shutdown_connections 2026-03-09T16:11:57.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.831+0000 7f020e45f640 1 -- 192.168.123.103:0/3442070311 wait complete. 
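Once all six OSDs report up, the task switches to `ceph osd dump --format=json`, whose output (printed further below) carries an `osds` array with per-OSD integer `up`/`in` flags and address vectors. A small illustrative sketch of reading that per-OSD state out of the dump, using only field names that appear in the JSON below (this parsing is an assumption for illustration, not part of the test itself):

import json

def osd_states(dump_json):
    # dump_json is the string printed by `ceph osd dump --format=json`.
    # Each entry in "osds" has an "osd" id plus integer "up" and "in" flags.
    dump = json.loads(dump_json)
    return {o["osd"]: {"up": bool(o["up"]), "in": bool(o["in"])} for o in dump["osds"]}

# For the epoch-35 dump below, this would yield six entries, all up and in,
# e.g. {0: {"up": True, "in": True}, ..., 5: {"up": True, "in": True}}.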
2026-03-09T16:11:57.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.831+0000 7f020e45f640 1 Processor -- start 2026-03-09T16:11:57.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.831+0000 7f020e45f640 1 -- start start 2026-03-09T16:11:57.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.832+0000 7f020e45f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0208076040 0x7f020819f1c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:57.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.832+0000 7f0207fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0208076040 0x7f020819f1c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:57.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.832+0000 7f0207fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0208076040 0x7f020819f1c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39674/0 (socket says 192.168.123.103:39674) 2026-03-09T16:11:57.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.833+0000 7f020e45f640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f020819f700 0x7f02081a3ab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:57.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.833+0000 7f020e45f640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f020819fd00 con 0x7f0208076040 2026-03-09T16:11:57.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.833+0000 7f020e45f640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f020819fe70 con 0x7f020819f700 2026-03-09T16:11:57.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.833+0000 7f0207fff640 1 -- 192.168.123.103:0/3389604476 learned_addr learned my addr 192.168.123.103:0/3389604476 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:11:57.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.834+0000 7f02077fe640 1 --2- 192.168.123.103:0/3389604476 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f020819f700 0x7f02081a3ab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:57.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.834+0000 7f02077fe640 1 -- 192.168.123.103:0/3389604476 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0208076040 msgr2=0x7f020819f1c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:57.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.834+0000 7f02077fe640 1 --2- 192.168.123.103:0/3389604476 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0208076040 0x7f020819f1c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:57.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.834+0000 7f02077fe640 1 -- 192.168.123.103:0/3389604476 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f01f8009660 con 0x7f020819f700 2026-03-09T16:11:57.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.834+0000 7f02077fe640 1 --2- 192.168.123.103:0/3389604476 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f020819f700 0x7f02081a3ab0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f01f400b4d0 tx=0x7f01f400b9a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:57.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.834+0000 7f02057fa640 1 -- 192.168.123.103:0/3389604476 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f01f4004280 con 0x7f020819f700 2026-03-09T16:11:57.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.834+0000 7f02057fa640 1 -- 192.168.123.103:0/3389604476 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f01f40043e0 con 0x7f020819f700 2026-03-09T16:11:57.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.834+0000 7f02057fa640 1 -- 192.168.123.103:0/3389604476 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f01f4010b80 con 0x7f020819f700 2026-03-09T16:11:57.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.835+0000 7f020e45f640 1 -- 192.168.123.103:0/3389604476 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f02081a40b0 con 0x7f020819f700 2026-03-09T16:11:57.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.835+0000 7f020e45f640 1 -- 192.168.123.103:0/3389604476 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f02081a4600 con 0x7f020819f700 2026-03-09T16:11:57.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.835+0000 7f020e45f640 1 -- 192.168.123.103:0/3389604476 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f01cc005350 con 0x7f020819f700 2026-03-09T16:11:57.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.837+0000 7f02057fa640 1 -- 192.168.123.103:0/3389604476 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f01f40026e0 con 0x7f020819f700 2026-03-09T16:11:57.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.837+0000 7f02057fa640 1 --2- 192.168.123.103:0/3389604476 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f01dc0761c0 0x7f01dc078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:57.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.837+0000 7f02057fa640 1 -- 192.168.123.103:0/3389604476 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f01f40974e0 con 0x7f020819f700 2026-03-09T16:11:57.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.837+0000 7f0207fff640 1 --2- 192.168.123.103:0/3389604476 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f01dc0761c0 0x7f01dc078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:57.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.838+0000 7f0207fff640 1 --2- 192.168.123.103:0/3389604476 >> 
[v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f01dc0761c0 0x7f01dc078680 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f01f8002410 tx=0x7f01f803a040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:57.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.839+0000 7f02057fa640 1 -- 192.168.123.103:0/3389604476 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f01f4060e30 con 0x7f020819f700 2026-03-09T16:11:57.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:57 vm03 ceph-mon[51019]: purged_snaps scrub starts 2026-03-09T16:11:57.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:57 vm03 ceph-mon[51019]: purged_snaps scrub ok 2026-03-09T16:11:57.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:57 vm03 ceph-mon[51019]: pgmap v62: 1 pgs: 1 active+clean; 449 KiB data, 134 MiB used, 100 GiB / 100 GiB avail 2026-03-09T16:11:57.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:57 vm03 ceph-mon[51019]: osd.5 [v2:192.168.123.105:6816/262747247,v1:192.168.123.105:6817/262747247] boot 2026-03-09T16:11:57.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:57 vm03 ceph-mon[51019]: osdmap e34: 6 total, 6 up, 6 in 2026-03-09T16:11:57.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:57 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:11:57.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:57 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/2470464929' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T16:11:57.934 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.933+0000 7f020e45f640 1 -- 192.168.123.103:0/3389604476 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f01cc0051c0 con 0x7f020819f700 2026-03-09T16:11:57.935 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.934+0000 7f02057fa640 1 -- 192.168.123.103:0/3389604476 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v35) v1 ==== 74+0+11574 (secure 0 0 0) 0x7f01f40607d0 con 0x7f020819f700 2026-03-09T16:11:57.935 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:11:57.936 
INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":35,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","created":"2026-03-09T16:09:34.709272+0000","modified":"2026-03-09T16:11:57.672982+0000","last_up_change":"2026-03-09T16:11:56.667048+0000","last_in_change":"2026-03-09T16:11:46.772209+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-09T16:11:29.876648+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"21","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"d36e00ca-e7bc-4475-866a-be22243d455f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":2444576527},{"type":"v1","addr":"192.168.123.103:6803","nonce":2444576527}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":2444576527},{"type":"v1","addr":"192.168.123.103:6805","nonce":2444576527}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":2444576527},{"type":"v1","addr":"192.168.123.103:6809","nonce":2444576527}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":2444576527},{"type":"v1","addr":"192.168.123.103:6807","nonce":2444576527}]},"public_addr":"192.168.123.103:6803/2444576527","cluster_addr":"192.168.123.103:6805/2444576527","heartbeat_back_addr":"192.168.123.103:6809/2444576527","heartbeat_front
_addr":"192.168.123.103:6807/2444576527","state":["exists","up"]},{"osd":1,"uuid":"77efea00-570c-4571-a7a6-968cc4097343","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":14,"up_thru":26,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":3417259072},{"type":"v1","addr":"192.168.123.103:6811","nonce":3417259072}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6812","nonce":3417259072},{"type":"v1","addr":"192.168.123.103:6813","nonce":3417259072}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6816","nonce":3417259072},{"type":"v1","addr":"192.168.123.103:6817","nonce":3417259072}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6814","nonce":3417259072},{"type":"v1","addr":"192.168.123.103:6815","nonce":3417259072}]},"public_addr":"192.168.123.103:6811/3417259072","cluster_addr":"192.168.123.103:6813/3417259072","heartbeat_back_addr":"192.168.123.103:6817/3417259072","heartbeat_front_addr":"192.168.123.103:6815/3417259072","state":["exists","up"]},{"osd":2,"uuid":"5f4a9aed-e670-4b8f-b945-c157bdccafca","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":18,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6818","nonce":2017087007},{"type":"v1","addr":"192.168.123.103:6819","nonce":2017087007}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6820","nonce":2017087007},{"type":"v1","addr":"192.168.123.103:6821","nonce":2017087007}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6824","nonce":2017087007},{"type":"v1","addr":"192.168.123.103:6825","nonce":2017087007}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6822","nonce":2017087007},{"type":"v1","addr":"192.168.123.103:6823","nonce":2017087007}]},"public_addr":"192.168.123.103:6819/2017087007","cluster_addr":"192.168.123.103:6821/2017087007","heartbeat_back_addr":"192.168.123.103:6825/2017087007","heartbeat_front_addr":"192.168.123.103:6823/2017087007","state":["exists","up"]},{"osd":3,"uuid":"aa64c4f2-8110-40fd-928c-4df2efafc82e","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":25,"up_thru":29,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6800","nonce":143716735},{"type":"v1","addr":"192.168.123.105:6801","nonce":143716735}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6802","nonce":143716735},{"type":"v1","addr":"192.168.123.105:6803","nonce":143716735}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6806","nonce":143716735},{"type":"v1","addr":"192.168.123.105:6807","nonce":143716735}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6804","nonce":143716735},{"type":"v1","addr":"192.168.123.105:6805","nonce":143716735}]},"public_addr":"192.168.123.105:6801/143716735","cluster_addr":"192.168.123.105:6803/143716735","heartbeat_back_addr":"192.168.123.105:6807/143716735","heartbeat_front_addr":"192.168.123.105:6805/143716735","state":["exists","up"]},{"osd":4,"uuid":"1567921f-08ce-4412-84d0-a4474c4e6ac0","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":30,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6808","nonce":1555294449},{"type":"v1","addr":"192.168.1
23.105:6809","nonce":1555294449}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6810","nonce":1555294449},{"type":"v1","addr":"192.168.123.105:6811","nonce":1555294449}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6814","nonce":1555294449},{"type":"v1","addr":"192.168.123.105:6815","nonce":1555294449}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6812","nonce":1555294449},{"type":"v1","addr":"192.168.123.105:6813","nonce":1555294449}]},"public_addr":"192.168.123.105:6809/1555294449","cluster_addr":"192.168.123.105:6811/1555294449","heartbeat_back_addr":"192.168.123.105:6815/1555294449","heartbeat_front_addr":"192.168.123.105:6813/1555294449","state":["exists","up"]},{"osd":5,"uuid":"c322dd19-66a4-4f40-abd7-54565e63f71b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":34,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6816","nonce":262747247},{"type":"v1","addr":"192.168.123.105:6817","nonce":262747247}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6818","nonce":262747247},{"type":"v1","addr":"192.168.123.105:6819","nonce":262747247}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6822","nonce":262747247},{"type":"v1","addr":"192.168.123.105:6823","nonce":262747247}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6820","nonce":262747247},{"type":"v1","addr":"192.168.123.105:6821","nonce":262747247}]},"public_addr":"192.168.123.105:6817/262747247","cluster_addr":"192.168.123.105:6819/262747247","heartbeat_back_addr":"192.168.123.105:6823/262747247","heartbeat_front_addr":"192.168.123.105:6821/262747247","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:05.178350+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:15.264774+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:26.435748+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:36.903804+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:46.168754+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:55.471363+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.103:6800/3405276359":"2026-03-10T16:10:33.810609+0000","192.168.123.103:6800/4159093290":"2026-03-10T16:09:45.518986+0000","192.168.123.103:6801/4159093290":"2026-03-10T16:09:45.518986+0000","192.168.123.103:6801/3405276359":"2026-03-10T16:10:33.810609+0000","192.168.123.103:0/3979296636":"2026-03-10T16:10:33.810609+0000","192.168.123.103:0/2298651818":"2026-03-10T16:09:45.518986+0000","192.168.123.103:6800/4285644309":"2026-03-10T16:09:57.4
70573+0000","192.168.123.103:0/2831546175":"2026-03-10T16:09:57.470573+0000","192.168.123.103:0/2646707583":"2026-03-10T16:09:45.518986+0000","192.168.123.103:6801/4285644309":"2026-03-10T16:09:57.470573+0000","192.168.123.103:0/1871615672":"2026-03-10T16:09:57.470573+0000","192.168.123.103:0/1922548321":"2026-03-10T16:10:33.810609+0000","192.168.123.103:0/384698677":"2026-03-10T16:09:45.518986+0000","192.168.123.103:0/553055862":"2026-03-10T16:09:57.470573+0000","192.168.123.103:0/1811194494":"2026-03-10T16:10:33.810609+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-09T16:11:57.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.937+0000 7f020e45f640 1 -- 192.168.123.103:0/3389604476 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f01dc0761c0 msgr2=0x7f01dc078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:57.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.937+0000 7f020e45f640 1 --2- 192.168.123.103:0/3389604476 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f01dc0761c0 0x7f01dc078680 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f01f8002410 tx=0x7f01f803a040 comp rx=0 tx=0).stop 2026-03-09T16:11:57.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.937+0000 7f020e45f640 1 -- 192.168.123.103:0/3389604476 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f020819f700 msgr2=0x7f02081a3ab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:57.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.937+0000 7f020e45f640 1 --2- 192.168.123.103:0/3389604476 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f020819f700 0x7f02081a3ab0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f01f400b4d0 tx=0x7f01f400b9a0 comp rx=0 tx=0).stop 2026-03-09T16:11:57.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.937+0000 7f020e45f640 1 -- 192.168.123.103:0/3389604476 shutdown_connections 2026-03-09T16:11:57.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.938+0000 7f020e45f640 1 --2- 192.168.123.103:0/3389604476 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f01dc0761c0 0x7f01dc078680 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:57.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.938+0000 7f020e45f640 1 --2- 192.168.123.103:0/3389604476 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f020819f700 0x7f02081a3ab0 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:57.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.938+0000 7f020e45f640 1 --2- 192.168.123.103:0/3389604476 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0208076040 0x7f020819f1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:57.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.938+0000 7f020e45f640 1 -- 192.168.123.103:0/3389604476 >> 
192.168.123.103:0/3389604476 conn(0x7f02080fe710 msgr2=0x7f0208100250 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:57.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.938+0000 7f020e45f640 1 -- 192.168.123.103:0/3389604476 shutdown_connections 2026-03-09T16:11:57.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:57.938+0000 7f020e45f640 1 -- 192.168.123.103:0/3389604476 wait complete. 2026-03-09T16:11:58.001 INFO:tasks.cephadm.ceph_manager.ceph:[{'pool': 1, 'pool_name': '.mgr', 'create_time': '2026-03-09T16:11:29.876648+0000', 'flags': 1, 'flags_names': 'hashpspool', 'type': 1, 'size': 3, 'min_size': 2, 'crush_rule': 0, 'peering_crush_bucket_count': 0, 'peering_crush_bucket_target': 0, 'peering_crush_bucket_barrier': 0, 'peering_crush_bucket_mandatory_member': 2147483647, 'is_stretch_pool': False, 'object_hash': 2, 'pg_autoscale_mode': 'off', 'pg_num': 1, 'pg_placement_num': 1, 'pg_placement_num_target': 1, 'pg_num_target': 1, 'pg_num_pending': 1, 'last_pg_merge_meta': {'source_pgid': '0.0', 'ready_epoch': 0, 'last_epoch_started': 0, 'last_epoch_clean': 0, 'source_version': "0'0", 'target_version': "0'0"}, 'last_change': '21', 'last_force_op_resend': '0', 'last_force_op_resend_prenautilus': '0', 'last_force_op_resend_preluminous': '0', 'auid': 0, 'snap_mode': 'selfmanaged', 'snap_seq': 0, 'snap_epoch': 0, 'pool_snaps': [], 'removed_snaps': '[]', 'quota_max_bytes': 0, 'quota_max_objects': 0, 'tiers': [], 'tier_of': -1, 'read_tier': -1, 'write_tier': -1, 'cache_mode': 'none', 'target_max_bytes': 0, 'target_max_objects': 0, 'cache_target_dirty_ratio_micro': 400000, 'cache_target_dirty_high_ratio_micro': 600000, 'cache_target_full_ratio_micro': 800000, 'cache_min_flush_age': 0, 'cache_min_evict_age': 0, 'erasure_code_profile': '', 'hit_set_params': {'type': 'none'}, 'hit_set_period': 0, 'hit_set_count': 0, 'use_gmt_hitset': True, 'min_read_recency_for_promote': 0, 'min_write_recency_for_promote': 0, 'hit_set_grade_decay_rate': 0, 'hit_set_search_last_n': 0, 'grade_table': [], 'stripe_width': 0, 'expected_num_objects': 0, 'fast_read': False, 'options': {'pg_num_max': 32, 'pg_num_min': 1}, 'application_metadata': {'mgr': {}}, 'read_balance': {'score_acting': 6, 'score_stable': 6, 'optimal_score': 0.5, 'raw_score_acting': 3, 'raw_score_stable': 3, 'primary_affinity_weighted': 1, 'average_primary_affinity': 1, 'average_primary_affinity_weighted': 1}}] 2026-03-09T16:11:58.001 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd pool get .mgr pg_num 2026-03-09T16:11:58.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:57 vm05 ceph-mon[58702]: purged_snaps scrub starts 2026-03-09T16:11:58.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:57 vm05 ceph-mon[58702]: purged_snaps scrub ok 2026-03-09T16:11:58.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:57 vm05 ceph-mon[58702]: pgmap v62: 1 pgs: 1 active+clean; 449 KiB data, 134 MiB used, 100 GiB / 100 GiB avail 2026-03-09T16:11:58.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:57 vm05 ceph-mon[58702]: osd.5 [v2:192.168.123.105:6816/262747247,v1:192.168.123.105:6817/262747247] boot 2026-03-09T16:11:58.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:57 vm05 ceph-mon[58702]: osdmap e34: 6 total, 6 up, 6 in 2026-03-09T16:11:58.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:57 vm05 ceph-mon[58702]: from='mgr.14225 
192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:11:58.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:57 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/2470464929' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T16:11:58.154 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:11:58.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.403+0000 7f8e2251b640 1 -- 192.168.123.103:0/3294636985 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e1c100780 msgr2=0x7f8e1c100be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:58.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.403+0000 7f8e2251b640 1 --2- 192.168.123.103:0/3294636985 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e1c100780 0x7f8e1c100be0 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7f8e08009a00 tx=0x7f8e0802f280 comp rx=0 tx=0).stop 2026-03-09T16:11:58.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.405+0000 7f8e2251b640 1 -- 192.168.123.103:0/3294636985 shutdown_connections 2026-03-09T16:11:58.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.405+0000 7f8e2251b640 1 --2- 192.168.123.103:0/3294636985 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e1c100780 0x7f8e1c100be0 unknown :-1 s=CLOSED pgs=200 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:58.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.405+0000 7f8e2251b640 1 --2- 192.168.123.103:0/3294636985 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e1c106780 0x7f8e1c106b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:58.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.405+0000 7f8e2251b640 1 -- 192.168.123.103:0/3294636985 >> 192.168.123.103:0/3294636985 conn(0x7f8e1c0fc460 msgr2=0x7f8e1c0fe880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:58.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.405+0000 7f8e2251b640 1 -- 192.168.123.103:0/3294636985 shutdown_connections 2026-03-09T16:11:58.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.405+0000 7f8e2251b640 1 -- 192.168.123.103:0/3294636985 wait complete. 
2026-03-09T16:11:58.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.406+0000 7f8e2251b640 1 Processor -- start 2026-03-09T16:11:58.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.406+0000 7f8e2251b640 1 -- start start 2026-03-09T16:11:58.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.406+0000 7f8e2251b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e1c100780 0x7f8e1c073470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:58.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.407+0000 7f8e2251b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e1c106780 0x7f8e1c0739b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:58.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.407+0000 7f8e1bfff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e1c100780 0x7f8e1c073470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:58.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.407+0000 7f8e2251b640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e1c0754d0 con 0x7f8e1c106780 2026-03-09T16:11:58.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.407+0000 7f8e1b7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e1c106780 0x7f8e1c0739b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:58.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.407+0000 7f8e1b7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e1c106780 0x7f8e1c0739b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39692/0 (socket says 192.168.123.103:39692) 2026-03-09T16:11:58.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.407+0000 7f8e1b7fe640 1 -- 192.168.123.103:0/3171645872 learned_addr learned my addr 192.168.123.103:0/3171645872 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:11:58.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.407+0000 7f8e2251b640 1 -- 192.168.123.103:0/3171645872 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e1c073f20 con 0x7f8e1c100780 2026-03-09T16:11:58.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.407+0000 7f8e1bfff640 1 -- 192.168.123.103:0/3171645872 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e1c106780 msgr2=0x7f8e1c0739b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:58.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.407+0000 7f8e1bfff640 1 --2- 192.168.123.103:0/3171645872 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e1c106780 0x7f8e1c0739b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:58.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.408+0000 7f8e1bfff640 1 -- 192.168.123.103:0/3171645872 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f8e08009660 con 0x7f8e1c100780 2026-03-09T16:11:58.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.408+0000 7f8e1b7fe640 1 --2- 192.168.123.103:0/3171645872 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e1c106780 0x7f8e1c0739b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T16:11:58.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.408+0000 7f8e1bfff640 1 --2- 192.168.123.103:0/3171645872 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e1c100780 0x7f8e1c073470 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f8e0c00e970 tx=0x7f8e0c00ee40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:58.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.408+0000 7f8e197fa640 1 -- 192.168.123.103:0/3171645872 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8e0c00ccb0 con 0x7f8e1c100780 2026-03-09T16:11:58.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.408+0000 7f8e2251b640 1 -- 192.168.123.103:0/3171645872 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8e1c074200 con 0x7f8e1c100780 2026-03-09T16:11:58.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.408+0000 7f8e2251b640 1 -- 192.168.123.103:0/3171645872 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8e1c1a4ce0 con 0x7f8e1c100780 2026-03-09T16:11:58.410 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.409+0000 7f8e197fa640 1 -- 192.168.123.103:0/3171645872 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8e0c004590 con 0x7f8e1c100780 2026-03-09T16:11:58.410 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.409+0000 7f8e197fa640 1 -- 192.168.123.103:0/3171645872 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8e0c010640 con 0x7f8e1c100780 2026-03-09T16:11:58.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.410+0000 7f8e197fa640 1 -- 192.168.123.103:0/3171645872 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f8e0c0107a0 con 0x7f8e1c100780 2026-03-09T16:11:58.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.410+0000 7f8e2251b640 1 -- 192.168.123.103:0/3171645872 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8e1c101ec0 con 0x7f8e1c100780 2026-03-09T16:11:58.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.410+0000 7f8e197fa640 1 --2- 192.168.123.103:0/3171645872 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8df0076170 0x7f8df0078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:58.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.411+0000 7f8e197fa640 1 -- 192.168.123.103:0/3171645872 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f8e0c014070 con 0x7f8e1c100780 2026-03-09T16:11:58.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.411+0000 7f8e1b7fe640 1 --2- 192.168.123.103:0/3171645872 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8df0076170 
0x7f8df0078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:58.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.412+0000 7f8e1b7fe640 1 --2- 192.168.123.103:0/3171645872 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8df0076170 0x7f8df0078630 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f8e1c074bf0 tx=0x7f8e08005e50 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:58.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.413+0000 7f8e197fa640 1 -- 192.168.123.103:0/3171645872 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f8e0c05ddc0 con 0x7f8e1c100780 2026-03-09T16:11:58.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.507+0000 7f8e2251b640 1 -- 192.168.123.103:0/3171645872 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"} v 0) v1 -- 0x7f8e1c100be0 con 0x7f8e1c100780 2026-03-09T16:11:58.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.510+0000 7f8e197fa640 1 -- 192.168.123.103:0/3171645872 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]=0 v35) v1 ==== 93+0+10 (secure 0 0 0) 0x7f8e0c060e40 con 0x7f8e1c100780 2026-03-09T16:11:58.511 INFO:teuthology.orchestra.run.vm03.stdout:pg_num: 1 2026-03-09T16:11:58.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.513+0000 7f8e2251b640 1 -- 192.168.123.103:0/3171645872 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8df0076170 msgr2=0x7f8df0078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:58.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.513+0000 7f8e2251b640 1 --2- 192.168.123.103:0/3171645872 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8df0076170 0x7f8df0078630 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f8e1c074bf0 tx=0x7f8e08005e50 comp rx=0 tx=0).stop 2026-03-09T16:11:58.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.513+0000 7f8e2251b640 1 -- 192.168.123.103:0/3171645872 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e1c100780 msgr2=0x7f8e1c073470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:58.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.513+0000 7f8e2251b640 1 --2- 192.168.123.103:0/3171645872 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e1c100780 0x7f8e1c073470 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f8e0c00e970 tx=0x7f8e0c00ee40 comp rx=0 tx=0).stop 2026-03-09T16:11:58.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.513+0000 7f8e2251b640 1 -- 192.168.123.103:0/3171645872 shutdown_connections 2026-03-09T16:11:58.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.513+0000 7f8e2251b640 1 --2- 192.168.123.103:0/3171645872 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8df0076170 0x7f8df0078630 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:58.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.513+0000 7f8e2251b640 1 
--2- 192.168.123.103:0/3171645872 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e1c106780 0x7f8e1c0739b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:58.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.513+0000 7f8e2251b640 1 --2- 192.168.123.103:0/3171645872 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e1c100780 0x7f8e1c073470 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:58.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.513+0000 7f8e2251b640 1 -- 192.168.123.103:0/3171645872 >> 192.168.123.103:0/3171645872 conn(0x7f8e1c0fc460 msgr2=0x7f8e1c10a720 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:58.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.514+0000 7f8e2251b640 1 -- 192.168.123.103:0/3171645872 shutdown_connections 2026-03-09T16:11:58.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:58.514+0000 7f8e2251b640 1 -- 192.168.123.103:0/3171645872 wait complete. 2026-03-09T16:11:58.561 INFO:tasks.cephadm:Setting up client nodes... 2026-03-09T16:11:58.561 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph auth get-or-create client.0 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-09T16:11:58.727 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:11:58.761 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:58 vm03 ceph-mon[51019]: osdmap e35: 6 total, 6 up, 6 in 2026-03-09T16:11:58.761 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:58 vm03 ceph-mon[51019]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:11:58.761 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:58 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/3389604476' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T16:11:58.761 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:58 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/3171645872' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-09T16:11:58.955 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:58 vm05 ceph-mon[58702]: osdmap e35: 6 total, 6 up, 6 in 2026-03-09T16:11:58.955 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:58 vm05 ceph-mon[58702]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:11:58.955 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:58 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/3389604476' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T16:11:58.955 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:58 vm05 ceph-mon[58702]: from='client.? 
192.168.123.103:0/3171645872' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-09T16:11:59.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.065+0000 7fda18b79640 1 -- 192.168.123.103:0/2250873836 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda14075810 msgr2=0x7fda14075c70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:59.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.065+0000 7fda18b79640 1 --2- 192.168.123.103:0/2250873836 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda14075810 0x7fda14075c70 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7fda08009a00 tx=0x7fda0802f290 comp rx=0 tx=0).stop 2026-03-09T16:11:59.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.066+0000 7fda18b79640 1 -- 192.168.123.103:0/2250873836 shutdown_connections 2026-03-09T16:11:59.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.066+0000 7fda18b79640 1 --2- 192.168.123.103:0/2250873836 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda14075810 0x7fda14075c70 unknown :-1 s=CLOSED pgs=201 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:59.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.066+0000 7fda18b79640 1 --2- 192.168.123.103:0/2250873836 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fda14077280 0x7fda14077680 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:59.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.066+0000 7fda18b79640 1 -- 192.168.123.103:0/2250873836 >> 192.168.123.103:0/2250873836 conn(0x7fda140fe220 msgr2=0x7fda14100640 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:59.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.067+0000 7fda18b79640 1 -- 192.168.123.103:0/2250873836 shutdown_connections 2026-03-09T16:11:59.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.067+0000 7fda18b79640 1 -- 192.168.123.103:0/2250873836 wait complete. 
2026-03-09T16:11:59.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.067+0000 7fda18b79640 1 Processor -- start 2026-03-09T16:11:59.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.067+0000 7fda18b79640 1 -- start start 2026-03-09T16:11:59.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.067+0000 7fda18b79640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda14075810 0x7fda1419eb00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:59.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.067+0000 7fda18b79640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fda14077280 0x7fda1419f040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:59.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.067+0000 7fda18b79640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda1419f610 con 0x7fda14075810 2026-03-09T16:11:59.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.067+0000 7fda18b79640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda1419f780 con 0x7fda14077280 2026-03-09T16:11:59.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.068+0000 7fda11d74640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fda14077280 0x7fda1419f040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:59.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.068+0000 7fda11d74640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fda14077280 0x7fda1419f040 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47444/0 (socket says 192.168.123.103:47444) 2026-03-09T16:11:59.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.068+0000 7fda11d74640 1 -- 192.168.123.103:0/1058013379 learned_addr learned my addr 192.168.123.103:0/1058013379 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:11:59.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.068+0000 7fda11d74640 1 -- 192.168.123.103:0/1058013379 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda14075810 msgr2=0x7fda1419eb00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:59.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.068+0000 7fda12575640 1 --2- 192.168.123.103:0/1058013379 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda14075810 0x7fda1419eb00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:59.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.068+0000 7fda11d74640 1 --2- 192.168.123.103:0/1058013379 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda14075810 0x7fda1419eb00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:59.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.068+0000 7fda11d74640 1 -- 192.168.123.103:0/1058013379 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fda08009660 con 0x7fda14077280 2026-03-09T16:11:59.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.068+0000 7fda11d74640 1 --2- 192.168.123.103:0/1058013379 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fda14077280 0x7fda1419f040 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fda0802f7a0 tx=0x7fda080043d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:59.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.068+0000 7fda12575640 1 --2- 192.168.123.103:0/1058013379 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda14075810 0x7fda1419eb00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:11:59.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.069+0000 7fd9fb7fe640 1 -- 192.168.123.103:0/1058013379 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda0802fd00 con 0x7fda14077280 2026-03-09T16:11:59.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.069+0000 7fd9fb7fe640 1 -- 192.168.123.103:0/1058013379 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fda0802fe60 con 0x7fda14077280 2026-03-09T16:11:59.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.069+0000 7fd9fb7fe640 1 -- 192.168.123.103:0/1058013379 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda08041bf0 con 0x7fda14077280 2026-03-09T16:11:59.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.069+0000 7fda18b79640 1 -- 192.168.123.103:0/1058013379 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fda141a41c0 con 0x7fda14077280 2026-03-09T16:11:59.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.069+0000 7fda18b79640 1 -- 192.168.123.103:0/1058013379 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fda141a4730 con 0x7fda14077280 2026-03-09T16:11:59.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.070+0000 7fd9fb7fe640 1 -- 192.168.123.103:0/1058013379 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fda0803f070 con 0x7fda14077280 2026-03-09T16:11:59.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.070+0000 7fda18b79640 1 -- 192.168.123.103:0/1058013379 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd9d8005350 con 0x7fda14077280 2026-03-09T16:11:59.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.071+0000 7fd9fb7fe640 1 --2- 192.168.123.103:0/1058013379 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd9ec076170 0x7fd9ec078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:59.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.071+0000 7fd9fb7fe640 1 -- 192.168.123.103:0/1058013379 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7fda080bc350 con 0x7fda14077280 2026-03-09T16:11:59.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.071+0000 7fda12575640 1 --2- 192.168.123.103:0/1058013379 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd9ec076170 
0x7fd9ec078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:59.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.072+0000 7fda12575640 1 --2- 192.168.123.103:0/1058013379 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd9ec076170 0x7fd9ec078630 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fd9fc007920 tx=0x7fd9fc008040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:59.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.073+0000 7fd9fb7fe640 1 -- 192.168.123.103:0/1058013379 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fda08085c40 con 0x7fda14077280 2026-03-09T16:11:59.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.200+0000 7fda18b79640 1 -- 192.168.123.103:0/1058013379 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7fd9d80051c0 con 0x7fda14077280 2026-03-09T16:11:59.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.205+0000 7fd9fb7fe640 1 -- 192.168.123.103:0/1058013379 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v16) v1 ==== 170+0+59 (secure 0 0 0) 0x7fda08085600 con 0x7fda14077280 2026-03-09T16:11:59.206 INFO:teuthology.orchestra.run.vm03.stdout:[client.0] 2026-03-09T16:11:59.206 INFO:teuthology.orchestra.run.vm03.stdout: key = AQBP8a5p7LwLDBAAXYwobigSixZ13ct5nWzf2w== 2026-03-09T16:11:59.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.207+0000 7fda18b79640 1 -- 192.168.123.103:0/1058013379 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd9ec076170 msgr2=0x7fd9ec078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:59.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.207+0000 7fda18b79640 1 --2- 192.168.123.103:0/1058013379 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd9ec076170 0x7fd9ec078630 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fd9fc007920 tx=0x7fd9fc008040 comp rx=0 tx=0).stop 2026-03-09T16:11:59.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.207+0000 7fda18b79640 1 -- 192.168.123.103:0/1058013379 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fda14077280 msgr2=0x7fda1419f040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:59.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.207+0000 7fda18b79640 1 --2- 192.168.123.103:0/1058013379 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fda14077280 0x7fda1419f040 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fda0802f7a0 tx=0x7fda080043d0 comp rx=0 tx=0).stop 2026-03-09T16:11:59.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.208+0000 7fda18b79640 1 -- 192.168.123.103:0/1058013379 shutdown_connections 2026-03-09T16:11:59.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.208+0000 7fda18b79640 1 --2- 192.168.123.103:0/1058013379 >> 
[v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd9ec076170 0x7fd9ec078630 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:59.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.208+0000 7fda18b79640 1 --2- 192.168.123.103:0/1058013379 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fda14077280 0x7fda1419f040 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:59.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.208+0000 7fda18b79640 1 --2- 192.168.123.103:0/1058013379 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda14075810 0x7fda1419eb00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:59.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.208+0000 7fda18b79640 1 -- 192.168.123.103:0/1058013379 >> 192.168.123.103:0/1058013379 conn(0x7fda140fe220 msgr2=0x7fda140ffb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:59.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.208+0000 7fda18b79640 1 -- 192.168.123.103:0/1058013379 shutdown_connections 2026-03-09T16:11:59.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:11:59.208+0000 7fda18b79640 1 -- 192.168.123.103:0/1058013379 wait complete. 2026-03-09T16:11:59.269 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:11:59.269 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/ceph/ceph.client.0.keyring 2026-03-09T16:11:59.269 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 0644 /etc/ceph/ceph.client.0.keyring 2026-03-09T16:11:59.304 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph auth get-or-create client.1 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-09T16:11:59.452 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm05/config 2026-03-09T16:11:59.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.703+0000 7fa80058b640 1 -- 192.168.123.105:0/1553898375 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7f81019f0 msgr2=0x7fa7f8101e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:59.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.703+0000 7fa80058b640 1 --2- 192.168.123.105:0/1553898375 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7f81019f0 0x7fa7f8101e70 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fa7ec009a00 tx=0x7fa7ec02f280 comp rx=0 tx=0).stop 2026-03-09T16:11:59.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.704+0000 7fa80058b640 1 -- 192.168.123.105:0/1553898375 shutdown_connections 2026-03-09T16:11:59.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.704+0000 7fa80058b640 1 --2- 192.168.123.105:0/1553898375 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7f81019f0 0x7fa7f8101e70 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:59.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.704+0000 7fa80058b640 1 --2- 192.168.123.105:0/1553898375 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7f81007f0 0x7fa7f8100bf0 unknown :-1 s=CLOSED 
pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:59.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.704+0000 7fa80058b640 1 -- 192.168.123.105:0/1553898375 >> 192.168.123.105:0/1553898375 conn(0x7fa7f80fbf80 msgr2=0x7fa7f80fe3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:59.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.705+0000 7fa80058b640 1 -- 192.168.123.105:0/1553898375 shutdown_connections 2026-03-09T16:11:59.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.705+0000 7fa80058b640 1 -- 192.168.123.105:0/1553898375 wait complete. 2026-03-09T16:11:59.706 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.705+0000 7fa80058b640 1 Processor -- start 2026-03-09T16:11:59.706 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.705+0000 7fa80058b640 1 -- start start 2026-03-09T16:11:59.706 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.706+0000 7fa80058b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7f81007f0 0x7fa7f81961c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:59.706 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.706+0000 7fa80058b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7f8196700 0x7fa7f819b770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:59.706 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.706+0000 7fa80058b640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa7f8196b80 con 0x7fa7f81007f0 2026-03-09T16:11:59.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.706+0000 7fa7fe300640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7f81007f0 0x7fa7f81961c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:59.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.706+0000 7fa7fe300640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7f81007f0 0x7fa7f81961c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.105:43728/0 (socket says 192.168.123.105:43728) 2026-03-09T16:11:59.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.706+0000 7fa7fe300640 1 -- 192.168.123.105:0/3697406467 learned_addr learned my addr 192.168.123.105:0/3697406467 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-09T16:11:59.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.706+0000 7fa7fdaff640 1 --2- 192.168.123.105:0/3697406467 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7f8196700 0x7fa7f819b770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:59.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.706+0000 7fa80058b640 1 -- 192.168.123.105:0/3697406467 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa7f8196cf0 con 0x7fa7f8196700 2026-03-09T16:11:59.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.706+0000 7fa7fdaff640 1 -- 192.168.123.105:0/3697406467 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7f81007f0 
msgr2=0x7fa7f81961c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:59.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.706+0000 7fa7fdaff640 1 --2- 192.168.123.105:0/3697406467 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7f81007f0 0x7fa7f81961c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:59.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.707+0000 7fa7fdaff640 1 -- 192.168.123.105:0/3697406467 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa7ec009660 con 0x7fa7f8196700 2026-03-09T16:11:59.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.707+0000 7fa7fdaff640 1 --2- 192.168.123.105:0/3697406467 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7f8196700 0x7fa7f819b770 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fa7ec008000 tx=0x7fa7ec031d30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:59.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.707+0000 7fa7e77fe640 1 -- 192.168.123.105:0/3697406467 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa7ec031ea0 con 0x7fa7f8196700 2026-03-09T16:11:59.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.707+0000 7fa7e77fe640 1 -- 192.168.123.105:0/3697406467 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa7ec031110 con 0x7fa7f8196700 2026-03-09T16:11:59.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.707+0000 7fa7e77fe640 1 -- 192.168.123.105:0/3697406467 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa7ec038680 con 0x7fa7f8196700 2026-03-09T16:11:59.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.707+0000 7fa80058b640 1 -- 192.168.123.105:0/3697406467 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa7f819bcb0 con 0x7fa7f8196700 2026-03-09T16:11:59.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.707+0000 7fa80058b640 1 -- 192.168.123.105:0/3697406467 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa7f819c0c0 con 0x7fa7f8196700 2026-03-09T16:11:59.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.708+0000 7fa80058b640 1 -- 192.168.123.105:0/3697406467 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa7f81071e0 con 0x7fa7f8196700 2026-03-09T16:11:59.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.711+0000 7fa7e77fe640 1 -- 192.168.123.105:0/3697406467 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fa7ec03f030 con 0x7fa7f8196700 2026-03-09T16:11:59.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.712+0000 7fa7e77fe640 1 --2- 192.168.123.105:0/3697406467 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa7d80761c0 0x7fa7d8078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:11:59.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.712+0000 7fa7e77fe640 1 -- 192.168.123.105:0/3697406467 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 
4618+0+0 (secure 0 0 0) 0x7fa7ec0bbae0 con 0x7fa7f8196700 2026-03-09T16:11:59.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.712+0000 7fa7e77fe640 1 -- 192.168.123.105:0/3697406467 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fa7ec0e9880 con 0x7fa7f8196700 2026-03-09T16:11:59.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.712+0000 7fa7fe300640 1 --2- 192.168.123.105:0/3697406467 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa7d80761c0 0x7fa7d8078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:11:59.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.712+0000 7fa7fe300640 1 --2- 192.168.123.105:0/3697406467 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa7d80761c0 0x7fa7d8078680 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fa7e80059c0 tx=0x7fa7e8009290 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:11:59.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.833+0000 7fa80058b640 1 -- 192.168.123.105:0/3697406467 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7fa7f81073f0 con 0x7fa7f8196700 2026-03-09T16:11:59.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.838+0000 7fa7e77fe640 1 -- 192.168.123.105:0/3697406467 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v17) v1 ==== 170+0+59 (secure 0 0 0) 0x7fa7ec085070 con 0x7fa7f8196700 2026-03-09T16:11:59.839 INFO:teuthology.orchestra.run.vm05.stdout:[client.1] 2026-03-09T16:11:59.840 INFO:teuthology.orchestra.run.vm05.stdout: key = AQBP8a5pSjrUMRAApEKPSg6EfMHRzFUWuEWsQQ== 2026-03-09T16:11:59.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.841+0000 7fa80058b640 1 -- 192.168.123.105:0/3697406467 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa7d80761c0 msgr2=0x7fa7d8078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:59.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.841+0000 7fa80058b640 1 --2- 192.168.123.105:0/3697406467 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa7d80761c0 0x7fa7d8078680 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fa7e80059c0 tx=0x7fa7e8009290 comp rx=0 tx=0).stop 2026-03-09T16:11:59.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.841+0000 7fa80058b640 1 -- 192.168.123.105:0/3697406467 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7f8196700 msgr2=0x7fa7f819b770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:11:59.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.841+0000 7fa80058b640 1 --2- 192.168.123.105:0/3697406467 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7f8196700 0x7fa7f819b770 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fa7ec008000 tx=0x7fa7ec031d30 comp rx=0 tx=0).stop 2026-03-09T16:11:59.842 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.841+0000 7fa80058b640 1 -- 192.168.123.105:0/3697406467 shutdown_connections 2026-03-09T16:11:59.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.841+0000 7fa80058b640 1 --2- 192.168.123.105:0/3697406467 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa7d80761c0 0x7fa7d8078680 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:59.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.841+0000 7fa80058b640 1 --2- 192.168.123.105:0/3697406467 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7f8196700 0x7fa7f819b770 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:59.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.842+0000 7fa80058b640 1 --2- 192.168.123.105:0/3697406467 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7f81007f0 0x7fa7f81961c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:11:59.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.842+0000 7fa80058b640 1 -- 192.168.123.105:0/3697406467 >> 192.168.123.105:0/3697406467 conn(0x7fa7f80fbf80 msgr2=0x7fa7f80fdc10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:11:59.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.842+0000 7fa80058b640 1 -- 192.168.123.105:0/3697406467 shutdown_connections 2026-03-09T16:11:59.843 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:11:59.842+0000 7fa80058b640 1 -- 192.168.123.105:0/3697406467 wait complete. 2026-03-09T16:11:59.853 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:59 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/1058013379' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T16:11:59.853 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:59 vm05 ceph-mon[58702]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T16:11:59.853 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:11:59 vm05 ceph-mon[58702]: from='client.? ' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T16:11:59.886 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-09T16:11:59.886 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/ceph/ceph.client.1.keyring 2026-03-09T16:11:59.886 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod 0644 /etc/ceph/ceph.client.1.keyring 2026-03-09T16:11:59.927 INFO:tasks.ceph:Waiting until ceph daemons up and pgs clean... 2026-03-09T16:11:59.927 INFO:tasks.cephadm.ceph_manager.ceph:waiting for mgr available 2026-03-09T16:11:59.927 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mgr dump --format=json 2026-03-09T16:11:59.951 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:59 vm03 ceph-mon[51019]: from='client.? 
192.168.123.103:0/1058013379' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T16:11:59.951 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:59 vm03 ceph-mon[51019]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T16:11:59.951 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:11:59 vm03 ceph-mon[51019]: from='client.? ' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T16:12:00.086 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:00.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.335+0000 7f3975054640 1 -- 192.168.123.103:0/2432003120 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3970102310 msgr2=0x7f397010a800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:00.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.335+0000 7f3975054640 1 --2- 192.168.123.103:0/2432003120 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3970102310 0x7f397010a800 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f39640098e0 tx=0x7f396402f1e0 comp rx=0 tx=0).stop 2026-03-09T16:12:00.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.335+0000 7f3975054640 1 -- 192.168.123.103:0/2432003120 shutdown_connections 2026-03-09T16:12:00.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.335+0000 7f3975054640 1 --2- 192.168.123.103:0/2432003120 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3970102310 0x7f397010a800 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:00.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.335+0000 7f3975054640 1 --2- 192.168.123.103:0/2432003120 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f39701019f0 0x7f3970101dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:00.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.335+0000 7f3975054640 1 -- 192.168.123.103:0/2432003120 >> 192.168.123.103:0/2432003120 conn(0x7f39700fb3d0 msgr2=0x7f39700fd7f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:00.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.335+0000 7f3975054640 1 -- 192.168.123.103:0/2432003120 shutdown_connections 2026-03-09T16:12:00.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.335+0000 7f3975054640 1 -- 192.168.123.103:0/2432003120 wait complete. 
2026-03-09T16:12:00.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.336+0000 7f3975054640 1 Processor -- start 2026-03-09T16:12:00.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.336+0000 7f3975054640 1 -- start start 2026-03-09T16:12:00.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.337+0000 7f3975054640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f39701019f0 0x7f39700ff500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:00.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.337+0000 7f3975054640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3970102310 0x7f39700ffa40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:00.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.337+0000 7f3975054640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3970101620 con 0x7f39701019f0 2026-03-09T16:12:00.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.337+0000 7f3975054640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f39700fff80 con 0x7f3970102310 2026-03-09T16:12:00.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.337+0000 7f396ed76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f39701019f0 0x7f39700ff500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:00.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.337+0000 7f396ed76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f39701019f0 0x7f39700ff500 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55752/0 (socket says 192.168.123.103:55752) 2026-03-09T16:12:00.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.337+0000 7f396ed76640 1 -- 192.168.123.103:0/3865706994 learned_addr learned my addr 192.168.123.103:0/3865706994 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:00.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.337+0000 7f396e575640 1 --2- 192.168.123.103:0/3865706994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3970102310 0x7f39700ffa40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:00.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.337+0000 7f396ed76640 1 -- 192.168.123.103:0/3865706994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3970102310 msgr2=0x7f39700ffa40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:00.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.338+0000 7f396ed76640 1 --2- 192.168.123.103:0/3865706994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3970102310 0x7f39700ffa40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:00.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.338+0000 7f396ed76640 1 -- 192.168.123.103:0/3865706994 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f3958009660 con 0x7f39701019f0 2026-03-09T16:12:00.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.338+0000 7f396ed76640 1 --2- 192.168.123.103:0/3865706994 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f39701019f0 0x7f39700ff500 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7f3958002a00 tx=0x7f3958002ed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:00.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.338+0000 7f394ffff640 1 -- 192.168.123.103:0/3865706994 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f395800ec70 con 0x7f39701019f0 2026-03-09T16:12:00.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.338+0000 7f3975054640 1 -- 192.168.123.103:0/3865706994 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3964009590 con 0x7f39701019f0 2026-03-09T16:12:00.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.338+0000 7f3975054640 1 -- 192.168.123.103:0/3865706994 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f39700718f0 con 0x7f39701019f0 2026-03-09T16:12:00.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.338+0000 7f394ffff640 1 -- 192.168.123.103:0/3865706994 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f395800edd0 con 0x7f39701019f0 2026-03-09T16:12:00.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.339+0000 7f394ffff640 1 -- 192.168.123.103:0/3865706994 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f395800f760 con 0x7f39701019f0 2026-03-09T16:12:00.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.340+0000 7f394ffff640 1 -- 192.168.123.103:0/3865706994 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f395801f050 con 0x7f39701019f0 2026-03-09T16:12:00.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.341+0000 7f3975054640 1 -- 192.168.123.103:0/3865706994 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3970069700 con 0x7f39701019f0 2026-03-09T16:12:00.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.341+0000 7f394ffff640 1 --2- 192.168.123.103:0/3865706994 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3944076290 0x7f3944078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:00.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.341+0000 7f396e575640 1 --2- 192.168.123.103:0/3865706994 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3944076290 0x7f3944078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:00.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.342+0000 7f396e575640 1 --2- 192.168.123.103:0/3865706994 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3944076290 0x7f3944078750 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f3964004850 tx=0x7f396404d040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:00.344 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.342+0000 7f394ffff640 1 -- 192.168.123.103:0/3865706994 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f39580979d0 con 0x7f39701019f0 2026-03-09T16:12:00.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.344+0000 7f394ffff640 1 -- 192.168.123.103:0/3865706994 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f39580617f0 con 0x7f39701019f0 2026-03-09T16:12:00.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.468+0000 7f3975054640 1 -- 192.168.123.103:0/3865706994 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr dump", "format": "json"} v 0) v1 -- 0x7f3970101dd0 con 0x7f39701019f0 2026-03-09T16:12:00.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.471+0000 7f394ffff640 1 -- 192.168.123.103:0/3865706994 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mgr dump", "format": "json"}]=0 v18) v1 ==== 74+0+189855 (secure 0 0 0) 0x7f3958061190 con 0x7f39701019f0 2026-03-09T16:12:00.472 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:12:00.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.479+0000 7f3975054640 1 -- 192.168.123.103:0/3865706994 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3944076290 msgr2=0x7f3944078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:00.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.479+0000 7f3975054640 1 --2- 192.168.123.103:0/3865706994 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3944076290 0x7f3944078750 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f3964004850 tx=0x7f396404d040 comp rx=0 tx=0).stop 2026-03-09T16:12:00.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.479+0000 7f3975054640 1 -- 192.168.123.103:0/3865706994 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f39701019f0 msgr2=0x7f39700ff500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:00.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.479+0000 7f3975054640 1 --2- 192.168.123.103:0/3865706994 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f39701019f0 0x7f39700ff500 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7f3958002a00 tx=0x7f3958002ed0 comp rx=0 tx=0).stop 2026-03-09T16:12:00.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.479+0000 7f3975054640 1 -- 192.168.123.103:0/3865706994 shutdown_connections 2026-03-09T16:12:00.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.479+0000 7f3975054640 1 --2- 192.168.123.103:0/3865706994 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3944076290 0x7f3944078750 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:00.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.479+0000 7f3975054640 1 --2- 192.168.123.103:0/3865706994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3970102310 0x7f39700ffa40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:00.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.479+0000 7f3975054640 1 --2- 192.168.123.103:0/3865706994 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f39701019f0 0x7f39700ff500 secure :-1 s=CLOSED pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7f3958002a00 tx=0x7f3958002ed0 comp rx=0 tx=0).stop 2026-03-09T16:12:00.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.479+0000 7f3975054640 1 -- 192.168.123.103:0/3865706994 >> 192.168.123.103:0/3865706994 conn(0x7f39700fb3d0 msgr2=0x7f3970109df0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:00.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.479+0000 7f3975054640 1 -- 192.168.123.103:0/3865706994 shutdown_connections 2026-03-09T16:12:00.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.480+0000 7f3975054640 1 -- 192.168.123.103:0/3865706994 wait complete. 2026-03-09T16:12:00.559 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":18,"flags":0,"active_gid":14225,"active_name":"vm03.gbgzmu","active_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6800","nonce":3168090362},{"type":"v1","addr":"192.168.123.103:6801","nonce":3168090362}]},"active_addr":"192.168.123.103:6801/3168090362","active_change":"2026-03-09T16:10:33.810907+0000","active_mgr_features":4540138322906710015,"available":true,"standbys":[{"gid":14248,"name":"vm05.dygxfv","mgr_features":4540138322906710015,"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP 
port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if 
blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/loki:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:1.0.0","min":"","max":"","enum_allowed":[],"desc":"Nvme-of container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/promtail:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"int","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"quay.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. 
This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host check","long_desc":"","tags":[],"see_also":[]},"hw_monitoring":{"name":"hw_monitoring","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Deploy hw monitoring daemon on every host.","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. 
Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"oob_default_addr":{"name":"oob_default_addr","type":"str","level":"advanced","flags":0,"default_value":"169.254.1.1","min":"","max":"","enum_allowed":[],"desc":"Default address for RedFish API (oob management).","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. 
Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. 
Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0
,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"def
ault_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this 
long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","flags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local 
pool","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":
"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator backend","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[
]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. 
Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to sleep","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. 
This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP 
requests","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"st
r","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_requests":{"name":"max_requests","type":"int","level":"advanced","flags":0,"default_value":"500","min":"","max":"","enum_allowed":[],"desc":"Maximum number of requests to keep in memory. When new request comes in, the oldest request will be removed if the number of requests exceeds the max request number.if un-finished request is removed, error message will be logged in the ceph-mgr log.","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"secondary_zone_period_retry_limit":{"name":"secondary_zone_period_retry_limit","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"RGW module period update retry limit for secondary 
site","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":""
,"long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_a
llowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"adv
anced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_cloning":{"name":"pause_cloning","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_purging":{"name":"pause_purging","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous subvolume purge threads","long_desc":"","tags":[],"see_also":[]},"periodic_async_work":{"name":"periodic_async_work","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Periodically check for async work","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay seconds","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_no_wait":{"name":"snapshot_clone_no_wait","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Reject subvolume clone request when cloner threads are 
busy","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"default_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}]}],"modules":["cephadm","dashboard","iostat","nfs","prometheus","restful"],"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health 
status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = 
Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt 
optimization","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. 
Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/loki:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:1.0.0","min":"","max":"","enum_allowed":[],"desc":"Nvme-of container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/promtail:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"int","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"quay.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. 
This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host check","long_desc":"","tags":[],"see_also":[]},"hw_monitoring":{"name":"hw_monitoring","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Deploy hw monitoring daemon on every host.","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. 
Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"oob_default_addr":{"name":"oob_default_addr","type":"str","level":"advanced","flags":0,"default_value":"169.254.1.1","min":"","max":"","enum_allowed":[],"desc":"Default address for RedFish API (oob management).","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. 
Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. 
Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0
,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"def
ault_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this 
long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","flags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local 
pool","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":
"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator backend","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[
]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. 
Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to sleep","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. 
This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP 
requests","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"st
r","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_requests":{"name":"max_requests","type":"int","level":"advanced","flags":0,"default_value":"500","min":"","max":"","enum_allowed":[],"desc":"Maximum number of requests to keep in memory. When new request comes in, the oldest request will be removed if the number of requests exceeds the max request number.if un-finished request is removed, error message will be logged in the ceph-mgr log.","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"secondary_zone_period_retry_limit":{"name":"secondary_zone_period_retry_limit","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"RGW module period update retry limit for secondary 
site","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":""
,"long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_a
llowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"adv
anced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_cloning":{"name":"pause_cloning","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_purging":{"name":"pause_purging","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous subvolume purge threads","long_desc":"","tags":[],"see_also":[]},"periodic_async_work":{"name":"periodic_async_work","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Periodically check for async work","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay seconds","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_no_wait":{"name":"snapshot_clone_no_wait","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Reject subvolume clone request when cloner threads are 
busy","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"default_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}],"services":{"dashboard":"https://192.168.123.103:8443/","prometheus":"http://192.168.123.103:9283/"},"always_on_modules":{"octopus":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"pacific":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"quincy":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"reef":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"]},"force_disabled_modules":{},"last_failure_osd_epoch":5,"active_clients":[{"name":"devicehealth","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":3754062357}]},{"name":"libcephsqlite","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":1273452242}]},{"name":"rbd_support","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":906769150}]},{"name":"volumes","addrvec":[{"type":"v2"
,"addr":"192.168.123.103:0","nonce":1314893658}]}]} 2026-03-09T16:12:00.561 INFO:tasks.cephadm.ceph_manager.ceph:mgr available! 2026-03-09T16:12:00.561 INFO:tasks.cephadm.ceph_manager.ceph:waiting for all up 2026-03-09T16:12:00.561 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd dump --format=json 2026-03-09T16:12:00.762 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:00.788 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:00 vm03 ceph-mon[51019]: from='client.? 192.168.123.105:0/3697406467' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T16:12:00.788 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:00 vm03 ceph-mon[51019]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:00.788 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:00 vm03 ceph-mon[51019]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T16:12:00.788 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:00 vm03 ceph-mon[51019]: from='client.? ' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T16:12:00.788 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:00 vm03 ceph-mon[51019]: from='client.? 
192.168.123.103:0/3865706994' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-09T16:12:00.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.998+0000 7f1e23fdd640 1 -- 192.168.123.103:0/2733145052 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1e1c073510 msgr2=0x7f1e1c0738f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:00.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.998+0000 7f1e23fdd640 1 --2- 192.168.123.103:0/2733145052 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1e1c073510 0x7f1e1c0738f0 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7f1e10009a00 tx=0x7f1e1002f320 comp rx=0 tx=0).stop 2026-03-09T16:12:00.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.998+0000 7f1e23fdd640 1 -- 192.168.123.103:0/2733145052 shutdown_connections 2026-03-09T16:12:00.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.998+0000 7f1e23fdd640 1 --2- 192.168.123.103:0/2733145052 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e1c073e30 0x7f1e1c10cb80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:00.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.998+0000 7f1e23fdd640 1 --2- 192.168.123.103:0/2733145052 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1e1c073510 0x7f1e1c0738f0 unknown :-1 s=CLOSED pgs=203 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:00.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.998+0000 7f1e23fdd640 1 -- 192.168.123.103:0/2733145052 >> 192.168.123.103:0/2733145052 conn(0x7f1e1c0fc460 msgr2=0x7f1e1c0fe880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:00.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.998+0000 7f1e23fdd640 1 -- 192.168.123.103:0/2733145052 shutdown_connections 2026-03-09T16:12:00.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.998+0000 7f1e23fdd640 1 -- 192.168.123.103:0/2733145052 wait complete. 
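The JSON blob above is the `ceph mgr dump --format=json` payload the task reads while checking for an available mgr. As a rough illustration only (not teuthology's actual code; the top-level key name `available_modules` is an assumption, while `services` and `always_on_modules` appear verbatim in the dump above), a Python sketch that pulls the service URLs and per-module option defaults out of such a payload might look like:

    # Illustrative sketch, not teuthology code. Assumes the per-module list
    # lives under "available_modules"; "services" and "always_on_modules"
    # are visible in the dump logged above.
    import json

    def summarize_mgr_dump(raw):
        """Return service URLs plus each module's advanced option defaults."""
        dump = json.loads(raw)
        summary = {
            "services": dump.get("services", {}),            # dashboard/prometheus URLs
            "always_on": dump.get("always_on_modules", {}),   # per-release always-on lists
            "modules": {},
        }
        for mod in dump.get("available_modules", []):
            opts = {
                name: opt.get("default_value")
                for name, opt in mod.get("module_options", {}).items()
                if opt.get("level") == "advanced"
            }
            summary["modules"][mod["name"]] = {
                "can_run": mod.get("can_run"),
                "advanced_option_defaults": opts,
            }
        return summary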
2026-03-09T16:12:00.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.999+0000 7f1e23fdd640 1 Processor -- start 2026-03-09T16:12:01.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.999+0000 7f1e23fdd640 1 -- start start 2026-03-09T16:12:01.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.999+0000 7f1e23fdd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e1c073510 0x7f1e1c1006e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:01.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.999+0000 7f1e23fdd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1e1c073e30 0x7f1e1c100c20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:01.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.999+0000 7f1e23fdd640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1e1c104760 con 0x7f1e1c073e30 2026-03-09T16:12:01.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.999+0000 7f1e23fdd640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1e1c1048d0 con 0x7f1e1c073510 2026-03-09T16:12:01.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.999+0000 7f1e21551640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1e1c073e30 0x7f1e1c100c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:01.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.999+0000 7f1e21551640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1e1c073e30 0x7f1e1c100c20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55776/0 (socket says 192.168.123.103:55776) 2026-03-09T16:12:01.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:00.999+0000 7f1e21551640 1 -- 192.168.123.103:0/142093896 learned_addr learned my addr 192.168.123.103:0/142093896 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:01.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.000+0000 7f1e21551640 1 -- 192.168.123.103:0/142093896 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e1c073510 msgr2=0x7f1e1c1006e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:12:01.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.000+0000 7f1e21551640 1 --2- 192.168.123.103:0/142093896 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e1c073510 0x7f1e1c1006e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:01.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.000+0000 7f1e21551640 1 -- 192.168.123.103:0/142093896 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1e10009660 con 0x7f1e1c073e30 2026-03-09T16:12:01.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.000+0000 7f1e21551640 1 --2- 192.168.123.103:0/142093896 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1e1c073e30 0x7f1e1c100c20 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7f1e0c00ece0 tx=0x7f1e0c00c6a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:01.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.000+0000 7f1e0affd640 1 -- 192.168.123.103:0/142093896 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1e0c00eea0 con 0x7f1e1c073e30 2026-03-09T16:12:01.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.000+0000 7f1e23fdd640 1 -- 192.168.123.103:0/142093896 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1e1c101220 con 0x7f1e1c073e30 2026-03-09T16:12:01.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.000+0000 7f1e23fdd640 1 -- 192.168.123.103:0/142093896 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1e1c071930 con 0x7f1e1c073e30 2026-03-09T16:12:01.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.000+0000 7f1e0affd640 1 -- 192.168.123.103:0/142093896 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f1e0c004590 con 0x7f1e1c073e30 2026-03-09T16:12:01.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.000+0000 7f1e0affd640 1 -- 192.168.123.103:0/142093896 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1e0c010640 con 0x7f1e1c073e30 2026-03-09T16:12:01.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.001+0000 7f1e08ff9640 1 -- 192.168.123.103:0/142093896 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1e1c074c50 con 0x7f1e1c073e30 2026-03-09T16:12:01.006 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.002+0000 7f1e0affd640 1 -- 192.168.123.103:0/142093896 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f1e0c0040d0 con 0x7f1e1c073e30 2026-03-09T16:12:01.006 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.003+0000 7f1e0affd640 1 --2- 192.168.123.103:0/142093896 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f1df0076290 0x7f1df0078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:01.006 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.003+0000 7f1e21d52640 1 --2- 192.168.123.103:0/142093896 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f1df0076290 0x7f1df0078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:01.006 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.003+0000 7f1e21d52640 1 --2- 192.168.123.103:0/142093896 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f1df0076290 0x7f1df0078750 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f1e10009a00 tx=0x7f1e100023d0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:01.006 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.003+0000 7f1e0affd640 1 -- 192.168.123.103:0/142093896 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f1e0c014070 con 0x7f1e1c073e30 2026-03-09T16:12:01.006 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.006+0000 7f1e0affd640 1 -- 192.168.123.103:0/142093896 <== mon.0 v2:192.168.123.103:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f1e0c0612c0 con 0x7f1e1c073e30 2026-03-09T16:12:01.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:00 vm05 ceph-mon[58702]: from='client.? 192.168.123.105:0/3697406467' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T16:12:01.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:00 vm05 ceph-mon[58702]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:01.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:00 vm05 ceph-mon[58702]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T16:12:01.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:00 vm05 ceph-mon[58702]: from='client.? ' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T16:12:01.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:00 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/3865706994' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-09T16:12:01.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.102+0000 7f1e08ff9640 1 -- 192.168.123.103:0/142093896 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f1e1c101c40 con 0x7f1e1c073e30 2026-03-09T16:12:01.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.104+0000 7f1e0affd640 1 -- 192.168.123.103:0/142093896 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v35) v1 ==== 74+0+11574 (secure 0 0 0) 0x7f1e0c060c60 con 0x7f1e1c073e30 2026-03-09T16:12:01.105 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:12:01.105 
INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":35,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","created":"2026-03-09T16:09:34.709272+0000","modified":"2026-03-09T16:11:57.672982+0000","last_up_change":"2026-03-09T16:11:56.667048+0000","last_in_change":"2026-03-09T16:11:46.772209+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-09T16:11:29.876648+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"21","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"d36e00ca-e7bc-4475-866a-be22243d455f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":2444576527},{"type":"v1","addr":"192.168.123.103:6803","nonce":2444576527}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":2444576527},{"type":"v1","addr":"192.168.123.103:6805","nonce":2444576527}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":2444576527},{"type":"v1","addr":"192.168.123.103:6809","nonce":2444576527}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":2444576527},{"type":"v1","addr":"192.168.123.103:6807","nonce":2444576527}]},"public_addr":"192.168.123.103:6803/2444576527","cluster_addr":"192.168.123.103:6805/2444576527","heartbeat_back_addr":"192.168.123.103:6809/2444576527","heartbeat_front
_addr":"192.168.123.103:6807/2444576527","state":["exists","up"]},{"osd":1,"uuid":"77efea00-570c-4571-a7a6-968cc4097343","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":14,"up_thru":26,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":3417259072},{"type":"v1","addr":"192.168.123.103:6811","nonce":3417259072}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6812","nonce":3417259072},{"type":"v1","addr":"192.168.123.103:6813","nonce":3417259072}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6816","nonce":3417259072},{"type":"v1","addr":"192.168.123.103:6817","nonce":3417259072}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6814","nonce":3417259072},{"type":"v1","addr":"192.168.123.103:6815","nonce":3417259072}]},"public_addr":"192.168.123.103:6811/3417259072","cluster_addr":"192.168.123.103:6813/3417259072","heartbeat_back_addr":"192.168.123.103:6817/3417259072","heartbeat_front_addr":"192.168.123.103:6815/3417259072","state":["exists","up"]},{"osd":2,"uuid":"5f4a9aed-e670-4b8f-b945-c157bdccafca","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":18,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6818","nonce":2017087007},{"type":"v1","addr":"192.168.123.103:6819","nonce":2017087007}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6820","nonce":2017087007},{"type":"v1","addr":"192.168.123.103:6821","nonce":2017087007}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6824","nonce":2017087007},{"type":"v1","addr":"192.168.123.103:6825","nonce":2017087007}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6822","nonce":2017087007},{"type":"v1","addr":"192.168.123.103:6823","nonce":2017087007}]},"public_addr":"192.168.123.103:6819/2017087007","cluster_addr":"192.168.123.103:6821/2017087007","heartbeat_back_addr":"192.168.123.103:6825/2017087007","heartbeat_front_addr":"192.168.123.103:6823/2017087007","state":["exists","up"]},{"osd":3,"uuid":"aa64c4f2-8110-40fd-928c-4df2efafc82e","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":25,"up_thru":29,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6800","nonce":143716735},{"type":"v1","addr":"192.168.123.105:6801","nonce":143716735}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6802","nonce":143716735},{"type":"v1","addr":"192.168.123.105:6803","nonce":143716735}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6806","nonce":143716735},{"type":"v1","addr":"192.168.123.105:6807","nonce":143716735}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6804","nonce":143716735},{"type":"v1","addr":"192.168.123.105:6805","nonce":143716735}]},"public_addr":"192.168.123.105:6801/143716735","cluster_addr":"192.168.123.105:6803/143716735","heartbeat_back_addr":"192.168.123.105:6807/143716735","heartbeat_front_addr":"192.168.123.105:6805/143716735","state":["exists","up"]},{"osd":4,"uuid":"1567921f-08ce-4412-84d0-a4474c4e6ac0","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":30,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6808","nonce":1555294449},{"type":"v1","addr":"192.168.1
23.105:6809","nonce":1555294449}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6810","nonce":1555294449},{"type":"v1","addr":"192.168.123.105:6811","nonce":1555294449}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6814","nonce":1555294449},{"type":"v1","addr":"192.168.123.105:6815","nonce":1555294449}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6812","nonce":1555294449},{"type":"v1","addr":"192.168.123.105:6813","nonce":1555294449}]},"public_addr":"192.168.123.105:6809/1555294449","cluster_addr":"192.168.123.105:6811/1555294449","heartbeat_back_addr":"192.168.123.105:6815/1555294449","heartbeat_front_addr":"192.168.123.105:6813/1555294449","state":["exists","up"]},{"osd":5,"uuid":"c322dd19-66a4-4f40-abd7-54565e63f71b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":34,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6816","nonce":262747247},{"type":"v1","addr":"192.168.123.105:6817","nonce":262747247}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6818","nonce":262747247},{"type":"v1","addr":"192.168.123.105:6819","nonce":262747247}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6822","nonce":262747247},{"type":"v1","addr":"192.168.123.105:6823","nonce":262747247}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6820","nonce":262747247},{"type":"v1","addr":"192.168.123.105:6821","nonce":262747247}]},"public_addr":"192.168.123.105:6817/262747247","cluster_addr":"192.168.123.105:6819/262747247","heartbeat_back_addr":"192.168.123.105:6823/262747247","heartbeat_front_addr":"192.168.123.105:6821/262747247","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:05.178350+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:15.264774+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:26.435748+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:36.903804+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:46.168754+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:55.471363+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.103:6800/3405276359":"2026-03-10T16:10:33.810609+0000","192.168.123.103:6800/4159093290":"2026-03-10T16:09:45.518986+0000","192.168.123.103:6801/4159093290":"2026-03-10T16:09:45.518986+0000","192.168.123.103:6801/3405276359":"2026-03-10T16:10:33.810609+0000","192.168.123.103:0/3979296636":"2026-03-10T16:10:33.810609+0000","192.168.123.103:0/2298651818":"2026-03-10T16:09:45.518986+0000","192.168.123.103:6800/4285644309":"2026-03-10T16:09:57.4
70573+0000","192.168.123.103:0/2831546175":"2026-03-10T16:09:57.470573+0000","192.168.123.103:0/2646707583":"2026-03-10T16:09:45.518986+0000","192.168.123.103:6801/4285644309":"2026-03-10T16:09:57.470573+0000","192.168.123.103:0/1871615672":"2026-03-10T16:09:57.470573+0000","192.168.123.103:0/1922548321":"2026-03-10T16:10:33.810609+0000","192.168.123.103:0/384698677":"2026-03-10T16:09:45.518986+0000","192.168.123.103:0/553055862":"2026-03-10T16:09:57.470573+0000","192.168.123.103:0/1811194494":"2026-03-10T16:10:33.810609+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-09T16:12:01.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.107+0000 7f1e08ff9640 1 -- 192.168.123.103:0/142093896 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f1df0076290 msgr2=0x7f1df0078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:01.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.107+0000 7f1e08ff9640 1 --2- 192.168.123.103:0/142093896 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f1df0076290 0x7f1df0078750 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f1e10009a00 tx=0x7f1e100023d0 comp rx=0 tx=0).stop 2026-03-09T16:12:01.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.107+0000 7f1e08ff9640 1 -- 192.168.123.103:0/142093896 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1e1c073e30 msgr2=0x7f1e1c100c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:01.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.107+0000 7f1e08ff9640 1 --2- 192.168.123.103:0/142093896 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1e1c073e30 0x7f1e1c100c20 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7f1e0c00ece0 tx=0x7f1e0c00c6a0 comp rx=0 tx=0).stop 2026-03-09T16:12:01.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.108+0000 7f1e08ff9640 1 -- 192.168.123.103:0/142093896 shutdown_connections 2026-03-09T16:12:01.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.108+0000 7f1e08ff9640 1 --2- 192.168.123.103:0/142093896 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f1df0076290 0x7f1df0078750 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:01.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.108+0000 7f1e08ff9640 1 --2- 192.168.123.103:0/142093896 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1e1c073e30 0x7f1e1c100c20 unknown :-1 s=CLOSED pgs=204 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:01.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.108+0000 7f1e08ff9640 1 --2- 192.168.123.103:0/142093896 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e1c073510 0x7f1e1c1006e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:01.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.108+0000 7f1e08ff9640 1 -- 192.168.123.103:0/142093896 >> 
192.168.123.103:0/142093896 conn(0x7f1e1c0fc460 msgr2=0x7f1e1c10c270 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:01.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.108+0000 7f1e08ff9640 1 -- 192.168.123.103:0/142093896 shutdown_connections 2026-03-09T16:12:01.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.108+0000 7f1e08ff9640 1 -- 192.168.123.103:0/142093896 wait complete. 2026-03-09T16:12:01.175 INFO:tasks.cephadm.ceph_manager.ceph:all up! 2026-03-09T16:12:01.175 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd dump --format=json 2026-03-09T16:12:01.332 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:01.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.582+0000 7f87af0f8640 1 -- 192.168.123.103:0/1324421089 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87a81029d0 msgr2=0x7f87a8102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:01.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.582+0000 7f87af0f8640 1 --2- 192.168.123.103:0/1324421089 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87a81029d0 0x7f87a8102e30 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7f87900099b0 tx=0x7f879002f220 comp rx=0 tx=0).stop 2026-03-09T16:12:01.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.583+0000 7f87af0f8640 1 -- 192.168.123.103:0/1324421089 shutdown_connections 2026-03-09T16:12:01.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.583+0000 7f87af0f8640 1 --2- 192.168.123.103:0/1324421089 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87a81029d0 0x7f87a8102e30 unknown :-1 s=CLOSED pgs=205 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:01.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.583+0000 7f87af0f8640 1 --2- 192.168.123.103:0/1324421089 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87a81089d0 0x7f87a8108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:01.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.583+0000 7f87af0f8640 1 -- 192.168.123.103:0/1324421089 >> 192.168.123.103:0/1324421089 conn(0x7f87a80fe710 msgr2=0x7f87a8100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:01.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.583+0000 7f87af0f8640 1 -- 192.168.123.103:0/1324421089 shutdown_connections 2026-03-09T16:12:01.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.583+0000 7f87af0f8640 1 -- 192.168.123.103:0/1324421089 wait complete. 
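After the mgr is reported available, the "waiting for all up" step repeatedly shells into the cluster with cephadm and runs `ceph osd dump --format=json` (the commands logged above) until every OSD reports itself up and in, then logs "all up!". A minimal sketch of such a poll, assuming the dump structure shown above and reusing the cephadm path, image, and fsid from this run (the helper names are hypothetical), could be:

    # Illustrative sketch, not the teuthology implementation: poll
    # `ceph osd dump --format=json` via cephadm shell until all OSDs are up/in.
    import json
    import subprocess
    import time

    FSID = "2b05df78-1bd2-11f1-83c0-c950214d6edc"   # fsid from this run
    CEPHADM = "/home/ubuntu/cephtest/cephadm"        # path used in this run
    IMAGE = "quay.ceph.io/ceph-ci/ceph:reef"

    def osd_dump():
        """Run `ceph osd dump --format=json` through cephadm shell and parse it."""
        out = subprocess.check_output([
            "sudo", CEPHADM, "--image", IMAGE, "shell", "--fsid", FSID,
            "--", "ceph", "osd", "dump", "--format=json",
        ])
        return json.loads(out)

    def all_osds_up_and_in(dump):
        """True when every OSD in the dump reports up == 1 and in == 1."""
        return all(o["up"] == 1 and o["in"] == 1 for o in dump["osds"])

    def wait_for_all_up(timeout=300, interval=5):
        """Poll the osd dump until all OSDs are up/in or the timeout expires."""
        deadline = time.time() + timeout
        while time.time() < deadline:
            if all_osds_up_and_in(osd_dump()):
                return True
            time.sleep(interval)
        raise TimeoutError("not all OSDs came up within %ss" % timeout)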
2026-03-09T16:12:01.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.584+0000 7f87af0f8640 1 Processor -- start 2026-03-09T16:12:01.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.584+0000 7f87af0f8640 1 -- start start 2026-03-09T16:12:01.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.584+0000 7f87af0f8640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87a81029d0 0x7f87a81a0650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:01.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.584+0000 7f87af0f8640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87a81089d0 0x7f87a81a0b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:01.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.584+0000 7f87af0f8640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f87a81a11e0 con 0x7f87a81029d0 2026-03-09T16:12:01.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.584+0000 7f87af0f8640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f87a819a790 con 0x7f87a81089d0 2026-03-09T16:12:01.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.584+0000 7f879ffff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87a81089d0 0x7f87a81a0b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:01.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.585+0000 7f87ace6d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87a81029d0 0x7f87a81a0650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:01.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.585+0000 7f87ace6d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87a81029d0 0x7f87a81a0650 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55808/0 (socket says 192.168.123.103:55808) 2026-03-09T16:12:01.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.585+0000 7f87ace6d640 1 -- 192.168.123.103:0/2881197099 learned_addr learned my addr 192.168.123.103:0/2881197099 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:01.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.585+0000 7f879ffff640 1 -- 192.168.123.103:0/2881197099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87a81029d0 msgr2=0x7f87a81a0650 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:01.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.585+0000 7f879ffff640 1 --2- 192.168.123.103:0/2881197099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87a81029d0 0x7f87a81a0650 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:01.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.585+0000 7f879ffff640 1 -- 192.168.123.103:0/2881197099 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8790009660 con 0x7f87a81089d0 
2026-03-09T16:12:01.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.585+0000 7f87ace6d640 1 --2- 192.168.123.103:0/2881197099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87a81029d0 0x7f87a81a0650 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T16:12:01.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.585+0000 7f879ffff640 1 --2- 192.168.123.103:0/2881197099 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87a81089d0 0x7f87a81a0b90 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f87900099b0 tx=0x7f8790004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:01.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.586+0000 7f879dffb640 1 -- 192.168.123.103:0/2881197099 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f879003d070 con 0x7f87a81089d0 2026-03-09T16:12:01.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.586+0000 7f87af0f8640 1 -- 192.168.123.103:0/2881197099 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f87a819aa10 con 0x7f87a81089d0 2026-03-09T16:12:01.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.586+0000 7f87af0f8640 1 -- 192.168.123.103:0/2881197099 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f87a819af00 con 0x7f87a81089d0 2026-03-09T16:12:01.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.586+0000 7f879dffb640 1 -- 192.168.123.103:0/2881197099 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8790004440 con 0x7f87a81089d0 2026-03-09T16:12:01.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.587+0000 7f879dffb640 1 -- 192.168.123.103:0/2881197099 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f87900388e0 con 0x7f87a81089d0 2026-03-09T16:12:01.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.587+0000 7f87af0f8640 1 -- 192.168.123.103:0/2881197099 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8770005350 con 0x7f87a81089d0 2026-03-09T16:12:01.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.588+0000 7f879dffb640 1 -- 192.168.123.103:0/2881197099 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f8790038a40 con 0x7f87a81089d0 2026-03-09T16:12:01.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.589+0000 7f879dffb640 1 --2- 192.168.123.103:0/2881197099 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f87840761c0 0x7f8784078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:01.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.589+0000 7f879dffb640 1 -- 192.168.123.103:0/2881197099 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f87900bcb20 con 0x7f87a81089d0 2026-03-09T16:12:01.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.589+0000 7f87ace6d640 1 --2- 192.168.123.103:0/2881197099 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f87840761c0 0x7f8784078680 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:01.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.590+0000 7f87ace6d640 1 --2- 192.168.123.103:0/2881197099 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f87840761c0 0x7f8784078680 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f8798004290 tx=0x7f879800a480 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:01.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.591+0000 7f879dffb640 1 -- 192.168.123.103:0/2881197099 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f87900864a0 con 0x7f87a81089d0 2026-03-09T16:12:01.690 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.690+0000 7f87af0f8640 1 -- 192.168.123.103:0/2881197099 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f87700051c0 con 0x7f87a81089d0 2026-03-09T16:12:01.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.690+0000 7f879dffb640 1 -- 192.168.123.103:0/2881197099 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v35) v1 ==== 74+0+11574 (secure 0 0 0) 0x7f8790085e40 con 0x7f87a81089d0 2026-03-09T16:12:01.691 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:12:01.691 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":35,"fsid":"2b05df78-1bd2-11f1-83c0-c950214d6edc","created":"2026-03-09T16:09:34.709272+0000","modified":"2026-03-09T16:11:57.672982+0000","last_up_change":"2026-03-09T16:11:56.667048+0000","last_in_change":"2026-03-09T16:11:46.772209+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-09T16:11:29.876648+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"21","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","
hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"d36e00ca-e7bc-4475-866a-be22243d455f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":2444576527},{"type":"v1","addr":"192.168.123.103:6803","nonce":2444576527}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":2444576527},{"type":"v1","addr":"192.168.123.103:6805","nonce":2444576527}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":2444576527},{"type":"v1","addr":"192.168.123.103:6809","nonce":2444576527}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":2444576527},{"type":"v1","addr":"192.168.123.103:6807","nonce":2444576527}]},"public_addr":"192.168.123.103:6803/2444576527","cluster_addr":"192.168.123.103:6805/2444576527","heartbeat_back_addr":"192.168.123.103:6809/2444576527","heartbeat_front_addr":"192.168.123.103:6807/2444576527","state":["exists","up"]},{"osd":1,"uuid":"77efea00-570c-4571-a7a6-968cc4097343","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":14,"up_thru":26,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":3417259072},{"type":"v1","addr":"192.168.123.103:6811","nonce":3417259072}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6812","nonce":3417259072},{"type":"v1","addr":"192.168.123.103:6813","nonce":3417259072}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6816","nonce":3417259072},{"type":"v1","addr":"192.168.123.103:6817","nonce":3417259072}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6814","nonce":3417259072},{"type":"v1","addr":"192.168.123.103:6815","nonce":3417259072}]},"public_addr":"192.168.123.103:6811/3417259072","cluster_addr":"192.168.123.103:6813/3417259072","heartbeat_back_addr":"192.168.123.103:6817/3417259072","heartbeat_front_addr":"192.168.123.103:6815/3417259072","state":["exists","up"]},{"osd":2,"uuid":"5f4a9aed-e670-4b8f-b945-c157bdccafca","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":18,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6818","nonce":2017087007},{"type":"v1","addr":"192.168.123.103:6819","nonce":2017087007}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6820","nonce":2017087007},{"type":"v1","addr":"192.168.123.103:6821","nonce":2017087007}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6824","nonce":2017087007},{"type":"v1","addr":"192.168.123.103:6825","nonce":2017087007}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6822","nonce":2017087007},{"type":"v1","addr":"192.168.123.103:6823","nonce":2017087007}]},"public_addr":
"192.168.123.103:6819/2017087007","cluster_addr":"192.168.123.103:6821/2017087007","heartbeat_back_addr":"192.168.123.103:6825/2017087007","heartbeat_front_addr":"192.168.123.103:6823/2017087007","state":["exists","up"]},{"osd":3,"uuid":"aa64c4f2-8110-40fd-928c-4df2efafc82e","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":25,"up_thru":29,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6800","nonce":143716735},{"type":"v1","addr":"192.168.123.105:6801","nonce":143716735}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6802","nonce":143716735},{"type":"v1","addr":"192.168.123.105:6803","nonce":143716735}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6806","nonce":143716735},{"type":"v1","addr":"192.168.123.105:6807","nonce":143716735}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6804","nonce":143716735},{"type":"v1","addr":"192.168.123.105:6805","nonce":143716735}]},"public_addr":"192.168.123.105:6801/143716735","cluster_addr":"192.168.123.105:6803/143716735","heartbeat_back_addr":"192.168.123.105:6807/143716735","heartbeat_front_addr":"192.168.123.105:6805/143716735","state":["exists","up"]},{"osd":4,"uuid":"1567921f-08ce-4412-84d0-a4474c4e6ac0","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":30,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6808","nonce":1555294449},{"type":"v1","addr":"192.168.123.105:6809","nonce":1555294449}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6810","nonce":1555294449},{"type":"v1","addr":"192.168.123.105:6811","nonce":1555294449}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6814","nonce":1555294449},{"type":"v1","addr":"192.168.123.105:6815","nonce":1555294449}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6812","nonce":1555294449},{"type":"v1","addr":"192.168.123.105:6813","nonce":1555294449}]},"public_addr":"192.168.123.105:6809/1555294449","cluster_addr":"192.168.123.105:6811/1555294449","heartbeat_back_addr":"192.168.123.105:6815/1555294449","heartbeat_front_addr":"192.168.123.105:6813/1555294449","state":["exists","up"]},{"osd":5,"uuid":"c322dd19-66a4-4f40-abd7-54565e63f71b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":34,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6816","nonce":262747247},{"type":"v1","addr":"192.168.123.105:6817","nonce":262747247}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6818","nonce":262747247},{"type":"v1","addr":"192.168.123.105:6819","nonce":262747247}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6822","nonce":262747247},{"type":"v1","addr":"192.168.123.105:6823","nonce":262747247}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6820","nonce":262747247},{"type":"v1","addr":"192.168.123.105:6821","nonce":262747247}]},"public_addr":"192.168.123.105:6817/262747247","cluster_addr":"192.168.123.105:6819/262747247","heartbeat_back_addr":"192.168.123.105:6823/262747247","heartbeat_front_addr":"192.168.123.105:6821/262747247","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03
-09T16:11:05.178350+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:15.264774+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:26.435748+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:36.903804+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:46.168754+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T16:11:55.471363+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.103:6800/3405276359":"2026-03-10T16:10:33.810609+0000","192.168.123.103:6800/4159093290":"2026-03-10T16:09:45.518986+0000","192.168.123.103:6801/4159093290":"2026-03-10T16:09:45.518986+0000","192.168.123.103:6801/3405276359":"2026-03-10T16:10:33.810609+0000","192.168.123.103:0/3979296636":"2026-03-10T16:10:33.810609+0000","192.168.123.103:0/2298651818":"2026-03-10T16:09:45.518986+0000","192.168.123.103:6800/4285644309":"2026-03-10T16:09:57.470573+0000","192.168.123.103:0/2831546175":"2026-03-10T16:09:57.470573+0000","192.168.123.103:0/2646707583":"2026-03-10T16:09:45.518986+0000","192.168.123.103:6801/4285644309":"2026-03-10T16:09:57.470573+0000","192.168.123.103:0/1871615672":"2026-03-10T16:09:57.470573+0000","192.168.123.103:0/1922548321":"2026-03-10T16:10:33.810609+0000","192.168.123.103:0/384698677":"2026-03-10T16:09:45.518986+0000","192.168.123.103:0/553055862":"2026-03-10T16:09:57.470573+0000","192.168.123.103:0/1811194494":"2026-03-10T16:10:33.810609+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-09T16:12:01.697 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.696+0000 7f87af0f8640 1 -- 192.168.123.103:0/2881197099 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f87840761c0 msgr2=0x7f8784078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:01.697 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.696+0000 7f87af0f8640 1 --2- 192.168.123.103:0/2881197099 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f87840761c0 0x7f8784078680 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f8798004290 tx=0x7f879800a480 comp rx=0 tx=0).stop 2026-03-09T16:12:01.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.697+0000 7f87af0f8640 1 -- 192.168.123.103:0/2881197099 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87a81089d0 msgr2=0x7f87a81a0b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:01.698 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.697+0000 7f87af0f8640 1 --2- 192.168.123.103:0/2881197099 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87a81089d0 0x7f87a81a0b90 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f87900099b0 tx=0x7f8790004290 comp rx=0 tx=0).stop 2026-03-09T16:12:01.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.697+0000 7f87af0f8640 1 -- 192.168.123.103:0/2881197099 shutdown_connections 2026-03-09T16:12:01.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.697+0000 7f87af0f8640 1 --2- 192.168.123.103:0/2881197099 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f87840761c0 0x7f8784078680 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:01.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.697+0000 7f87af0f8640 1 --2- 192.168.123.103:0/2881197099 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87a81089d0 0x7f87a81a0b90 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:01.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.697+0000 7f87af0f8640 1 --2- 192.168.123.103:0/2881197099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87a81029d0 0x7f87a81a0650 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:01.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.697+0000 7f87af0f8640 1 -- 192.168.123.103:0/2881197099 >> 192.168.123.103:0/2881197099 conn(0x7f87a80fe710 msgr2=0x7f87a810c990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:01.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.697+0000 7f87af0f8640 1 -- 192.168.123.103:0/2881197099 shutdown_connections 2026-03-09T16:12:01.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:01.697+0000 7f87af0f8640 1 -- 192.168.123.103:0/2881197099 wait complete. 2026-03-09T16:12:01.765 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph tell osd.0 flush_pg_stats 2026-03-09T16:12:01.765 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph tell osd.1 flush_pg_stats 2026-03-09T16:12:01.765 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph tell osd.2 flush_pg_stats 2026-03-09T16:12:01.765 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph tell osd.3 flush_pg_stats 2026-03-09T16:12:01.765 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph tell osd.4 flush_pg_stats 2026-03-09T16:12:01.765 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph tell osd.5 flush_pg_stats 2026-03-09T16:12:02.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:01 vm05 ceph-mon[58702]: from='client.? 
192.168.123.103:0/142093896' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T16:12:02.116 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:01 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/142093896' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T16:12:02.131 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:02.349 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:02.355 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:02.356 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:02.432 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:02.440 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:02.697 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:02 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/2881197099' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T16:12:02.697 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:02 vm03 ceph-mon[51019]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:02.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.837+0000 7f38b9225640 1 -- 192.168.123.103:0/1899443195 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b4072120 msgr2=0x7f38b4072500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:02.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.837+0000 7f38b9225640 1 --2- 192.168.123.103:0/1899443195 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b4072120 0x7f38b4072500 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7f389c0099b0 tx=0x7f389c02f240 comp rx=0 tx=0).stop 2026-03-09T16:12:02.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.838+0000 7f38b9225640 1 -- 192.168.123.103:0/1899443195 shutdown_connections 2026-03-09T16:12:02.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.838+0000 7f38b9225640 1 --2- 192.168.123.103:0/1899443195 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38b4072a40 0x7f38b410ca90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:02.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.838+0000 7f38b9225640 1 --2- 192.168.123.103:0/1899443195 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b4072120 0x7f38b4072500 unknown :-1 s=CLOSED pgs=206 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:02.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.838+0000 7f38b9225640 1 -- 192.168.123.103:0/1899443195 >> 192.168.123.103:0/1899443195 conn(0x7f38b406c7d0 msgr2=0x7f38b406cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:02.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.838+0000 7f38b9225640 1 -- 192.168.123.103:0/1899443195 shutdown_connections 2026-03-09T16:12:02.840 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.838+0000 7f38b9225640 1 -- 192.168.123.103:0/1899443195 wait complete. 2026-03-09T16:12:02.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.839+0000 7f38b9225640 1 Processor -- start 2026-03-09T16:12:02.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.839+0000 7f38b9225640 1 -- start start 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.839+0000 7f38b9225640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38b4072120 0x7f38b4116510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.839+0000 7f38b9225640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b4072a40 0x7f38b4116a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.839+0000 7f38b9225640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38b411a540 con 0x7f38b4072a40 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.839+0000 7f38b9225640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38b411a6b0 con 0x7f38b4072120 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.839+0000 7f38b37fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b4072a40 0x7f38b4116a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.839+0000 7f38b37fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b4072a40 0x7f38b4116a50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55820/0 (socket says 192.168.123.103:55820) 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.839+0000 7f38b37fe640 1 -- 192.168.123.103:0/349509228 learned_addr learned my addr 192.168.123.103:0/349509228 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.839+0000 7f38b3fff640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38b4072120 0x7f38b4116510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.839+0000 7f38b37fe640 1 -- 192.168.123.103:0/349509228 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38b4072120 msgr2=0x7f38b4116510 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.839+0000 7f38b37fe640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38b4072120 0x7f38b4116510 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.839+0000 7f38b37fe640 1 -- 
192.168.123.103:0/349509228 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f389c009660 con 0x7f38b4072a40 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.839+0000 7f38b37fe640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b4072a40 0x7f38b4116a50 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7f38a800b730 tx=0x7f38a800bc00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.841+0000 7f38b17fa640 1 -- 192.168.123.103:0/349509228 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38a8004280 con 0x7f38b4072a40 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.841+0000 7f38b17fa640 1 -- 192.168.123.103:0/349509228 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f38a80043e0 con 0x7f38b4072a40 2026-03-09T16:12:02.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.841+0000 7f38b17fa640 1 -- 192.168.123.103:0/349509228 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38a800ca90 con 0x7f38b4072a40 2026-03-09T16:12:02.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.841+0000 7f38b9225640 1 -- 192.168.123.103:0/349509228 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f38b41170b0 con 0x7f38b4072a40 2026-03-09T16:12:02.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.841+0000 7f38b9225640 1 -- 192.168.123.103:0/349509228 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f38b41ba3d0 con 0x7f38b4072a40 2026-03-09T16:12:02.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.843+0000 7f38b9225640 1 -- 192.168.123.103:0/349509228 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f38b4110a40 con 0x7f38b4072a40 2026-03-09T16:12:02.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.844+0000 7f38b17fa640 1 -- 192.168.123.103:0/349509228 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f38a800cbf0 con 0x7f38b4072a40 2026-03-09T16:12:02.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.844+0000 7f38b17fa640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3884076290 0x7f3884078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:02.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.844+0000 7f38b17fa640 1 -- 192.168.123.103:0/349509228 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f38a80978f0 con 0x7f38b4072a40 2026-03-09T16:12:02.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.844+0000 7f38b17fa640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527] conn(0x7f388407bd40 0x7f388407e160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:02.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.844+0000 7f38b17fa640 1 -- 192.168.123.103:0/349509228 --> 
[v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f388407e810 con 0x7f388407bd40 2026-03-09T16:12:02.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.845+0000 7f38b3fff640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3884076290 0x7f3884078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:02.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.845+0000 7f38b8a24640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527] conn(0x7f388407bd40 0x7f388407e160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:02.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.846+0000 7f38b8a24640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527] conn(0x7f388407bd40 0x7f388407e160 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:02.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.846+0000 7f38b3fff640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3884076290 0x7f3884078750 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f389c002410 tx=0x7f389c03a040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:02.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.846+0000 7f38b17fa640 1 -- 192.168.123.103:0/349509228 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_get_version_reply(handle=1 version=35) v2 ==== 24+0+0 (secure 0 0 0) 0x7f38a8061350 con 0x7f38b4072a40 2026-03-09T16:12:02.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.846+0000 7f38b17fa640 1 -- 192.168.123.103:0/349509228 <== osd.0 v2:192.168.123.103:6802/2444576527 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+26930 (crc 0 0 0) 0x7f388407e810 con 0x7f388407bd40 2026-03-09T16:12:02.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.859+0000 7f3892ffd640 1 -- 192.168.123.103:0/349509228 --> [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f3874000f80 con 0x7f388407bd40 2026-03-09T16:12:02.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.865+0000 7f38b17fa640 1 -- 192.168.123.103:0/349509228 <== osd.0 v2:192.168.123.103:6802/2444576527 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f3874000f80 con 0x7f388407bd40 2026-03-09T16:12:02.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.865+0000 7f3892ffd640 1 -- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527] conn(0x7f388407bd40 msgr2=0x7f388407e160 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:02.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.866+0000 7f3892ffd640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527] conn(0x7f388407bd40 0x7f388407e160 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-09T16:12:02.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.866+0000 7f3892ffd640 1 -- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3884076290 msgr2=0x7f3884078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:02.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.866+0000 7f3892ffd640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3884076290 0x7f3884078750 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f389c002410 tx=0x7f389c03a040 comp rx=0 tx=0).stop 2026-03-09T16:12:02.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.866+0000 7f3892ffd640 1 -- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b4072a40 msgr2=0x7f38b4116a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:02.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.866+0000 7f3892ffd640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b4072a40 0x7f38b4116a50 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7f38a800b730 tx=0x7f38a800bc00 comp rx=0 tx=0).stop 2026-03-09T16:12:02.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.866+0000 7f3892ffd640 1 -- 192.168.123.103:0/349509228 shutdown_connections 2026-03-09T16:12:02.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.866+0000 7f3892ffd640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f3884076290 0x7f3884078750 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:02.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.866+0000 7f3892ffd640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:6802/2444576527,v1:192.168.123.103:6803/2444576527] conn(0x7f388407bd40 0x7f388407e160 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:02.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.866+0000 7f3892ffd640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b4072a40 0x7f38b4116a50 unknown :-1 s=CLOSED pgs=207 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:02.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.866+0000 7f3892ffd640 1 --2- 192.168.123.103:0/349509228 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38b4072120 0x7f38b4116510 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:02.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.866+0000 7f3892ffd640 1 -- 192.168.123.103:0/349509228 >> 192.168.123.103:0/349509228 conn(0x7f38b406c7d0 msgr2=0x7f38b406f6c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:02.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.866+0000 7f3892ffd640 1 -- 192.168.123.103:0/349509228 shutdown_connections 2026-03-09T16:12:02.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:02.866+0000 7f3892ffd640 1 -- 192.168.123.103:0/349509228 wait complete. 
2026-03-09T16:12:03.001 INFO:teuthology.orchestra.run.vm03.stdout:38654705676 2026-03-09T16:12:03.001 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd last-stat-seq osd.0 2026-03-09T16:12:03.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:02 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/2881197099' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T16:12:03.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:02 vm05 ceph-mon[58702]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:03.150 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.147+0000 7f5413315640 1 -- 192.168.123.103:0/3982112050 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f540c102890 msgr2=0x7f540c102cf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.147+0000 7f5413315640 1 --2- 192.168.123.103:0/3982112050 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f540c102890 0x7f540c102cf0 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7f53f4009a00 tx=0x7f53f402f290 comp rx=0 tx=0).stop 2026-03-09T16:12:03.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.155+0000 7f5413315640 1 -- 192.168.123.103:0/3982112050 shutdown_connections 2026-03-09T16:12:03.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.155+0000 7f5413315640 1 --2- 192.168.123.103:0/3982112050 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f540c102890 0x7f540c102cf0 unknown :-1 s=CLOSED pgs=208 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.155+0000 7f5413315640 1 --2- 192.168.123.103:0/3982112050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f540c108890 0x7f540c108c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.155+0000 7f5413315640 1 -- 192.168.123.103:0/3982112050 >> 192.168.123.103:0/3982112050 conn(0x7f540c0fe5d0 msgr2=0x7f540c1009f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:03.166 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.164+0000 7f5413315640 1 -- 192.168.123.103:0/3982112050 shutdown_connections 2026-03-09T16:12:03.166 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.164+0000 7f5413315640 1 -- 192.168.123.103:0/3982112050 wait complete. 
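The records above show the usual flush-and-wait pattern: `ceph tell osd.N flush_pg_stats` is issued for each of osd.0 through osd.5, each reply carries that OSD's current pg-stat sequence number (the `38654705676` printed on stdout for osd.0), and the test then runs `ceph osd last-stat-seq osd.N` until the cluster-side value has caught up to the flushed sequence. A rough sketch of that loop, reusing the hypothetical ceph() helper from the earlier example; this illustrates the pattern visible in the log, not the ceph_manager implementation.

    import time

    def flush_pg_stats(osds, timeout=300):
        # Ask each OSD to flush its PG stats; the reply body is the stat sequence number.
        want = {osd: int(ceph("tell", f"osd.{osd}", "flush_pg_stats")) for osd in osds}
        deadline = time.time() + timeout
        for osd, need in want.items():
            # Poll until the reported last-stat-seq for this OSD reaches the flushed seq.
            while int(ceph("osd", "last-stat-seq", f"osd.{osd}")) < need:
                if time.time() > deadline:
                    raise TimeoutError(f"osd.{osd} pg stats not flushed within {timeout}s")
                time.sleep(1)

    flush_pg_stats(range(6))   # osd.0 .. osd.5, as in the DEBUG commands above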
2026-03-09T16:12:03.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.164+0000 7f5413315640 1 Processor -- start 2026-03-09T16:12:03.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.175+0000 7f5413315640 1 -- start start 2026-03-09T16:12:03.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.179+0000 7f5413315640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f540c102890 0x7f540c1a0620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.181+0000 7f5413315640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f540c108890 0x7f540c1a0b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.181+0000 7f5413315640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f540c1a1180 con 0x7f540c102890 2026-03-09T16:12:03.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.181+0000 7f5413315640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f540c19a710 con 0x7f540c108890 2026-03-09T16:12:03.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.181+0000 7f5412313640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f540c102890 0x7f540c1a0620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.181+0000 7f5412313640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f540c102890 0x7f540c1a0620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55848/0 (socket says 192.168.123.103:55848) 2026-03-09T16:12:03.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.181+0000 7f5412313640 1 -- 192.168.123.103:0/526811512 learned_addr learned my addr 192.168.123.103:0/526811512 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:03.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.181+0000 7f5411b12640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f540c108890 0x7f540c1a0b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.181+0000 7f5411b12640 1 -- 192.168.123.103:0/526811512 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f540c102890 msgr2=0x7f540c1a0620 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.184+0000 7f5411b12640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f540c102890 0x7f540c1a0620 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.184+0000 7f5411b12640 1 -- 192.168.123.103:0/526811512 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f53f4009660 con 
0x7f540c108890 2026-03-09T16:12:03.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.185+0000 7f5412313640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f540c102890 0x7f540c1a0620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T16:12:03.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.185+0000 7f5411b12640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f540c108890 0x7f540c1a0b60 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f53f402f7a0 tx=0x7f53f4004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.190 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.188+0000 7f54037fe640 1 -- 192.168.123.103:0/526811512 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f53f4004400 con 0x7f540c108890 2026-03-09T16:12:03.190 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.188+0000 7f5413315640 1 -- 192.168.123.103:0/526811512 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f540c19a990 con 0x7f540c108890 2026-03-09T16:12:03.190 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.188+0000 7f5413315640 1 -- 192.168.123.103:0/526811512 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f540c19ae80 con 0x7f540c108890 2026-03-09T16:12:03.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.193+0000 7f54037fe640 1 -- 192.168.123.103:0/526811512 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f53f402fd00 con 0x7f540c108890 2026-03-09T16:12:03.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.193+0000 7f54037fe640 1 -- 192.168.123.103:0/526811512 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f53f4041a40 con 0x7f540c108890 2026-03-09T16:12:03.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.193+0000 7f54037fe640 1 -- 192.168.123.103:0/526811512 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f53f403f070 con 0x7f540c108890 2026-03-09T16:12:03.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.199+0000 7f60009c9640 1 -- 192.168.123.103:0/2857769519 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ffc072a40 msgr2=0x7f5ffc10ca90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.199+0000 7f60009c9640 1 --2- 192.168.123.103:0/2857769519 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ffc072a40 0x7f5ffc10ca90 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7f5fe40099b0 tx=0x7f5fe402f240 comp rx=0 tx=0).stop 2026-03-09T16:12:03.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.202+0000 7f60009c9640 1 -- 192.168.123.103:0/2857769519 shutdown_connections 2026-03-09T16:12:03.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.202+0000 7f60009c9640 1 --2- 192.168.123.103:0/2857769519 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ffc072a40 0x7f5ffc10ca90 unknown :-1 s=CLOSED pgs=209 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.204 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.202+0000 7f60009c9640 1 --2- 192.168.123.103:0/2857769519 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ffc072120 0x7f5ffc072500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.202+0000 7f60009c9640 1 -- 192.168.123.103:0/2857769519 >> 192.168.123.103:0/2857769519 conn(0x7f5ffc06c7d0 msgr2=0x7f5ffc06cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:03.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.203+0000 7f60009c9640 1 -- 192.168.123.103:0/2857769519 shutdown_connections 2026-03-09T16:12:03.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.200+0000 7f54037fe640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f53dc076360 0x7f53dc078820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.200+0000 7f54037fe640 1 -- 192.168.123.103:0/526811512 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f53f40bd260 con 0x7f540c108890 2026-03-09T16:12:03.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.200+0000 7f5413315640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.105:6800/143716735,v1:192.168.123.105:6801/143716735] conn(0x7f53d40015e0 0x7f53d4003aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.200+0000 7f5413315640 1 -- 192.168.123.103:0/526811512 --> [v2:192.168.123.105:6800/143716735,v1:192.168.123.105:6801/143716735] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f53d4006c40 con 0x7f53d40015e0 2026-03-09T16:12:03.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.203+0000 7f5412b14640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.105:6800/143716735,v1:192.168.123.105:6801/143716735] conn(0x7f53d40015e0 0x7f53d4003aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.203+0000 7f5412b14640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.105:6800/143716735,v1:192.168.123.105:6801/143716735] conn(0x7f53d40015e0 0x7f53d4003aa0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.3 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.206+0000 7f60009c9640 1 -- 192.168.123.103:0/2857769519 wait complete. 
2026-03-09T16:12:03.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.213+0000 7f5412313640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f53dc076360 0x7f53dc078820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.216+0000 7f54037fe640 1 -- 192.168.123.103:0/526811512 <== osd.3 v2:192.168.123.105:6800/143716735 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+26930 (crc 0 0 0) 0x7f53d4006c40 con 0x7f53d40015e0 2026-03-09T16:12:03.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.206+0000 7f60009c9640 1 Processor -- start 2026-03-09T16:12:03.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.216+0000 7f60009c9640 1 -- start start 2026-03-09T16:12:03.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.216+0000 7f60009c9640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ffc072120 0x7f5ffc112c60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.216+0000 7f60009c9640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ffc1131a0 0x7f5ffc1b9ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.216+0000 7f60009c9640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ffc113750 con 0x7f5ffc072120 2026-03-09T16:12:03.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.216+0000 7f60009c9640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ffc1138c0 con 0x7f5ffc1131a0 2026-03-09T16:12:03.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.218+0000 7f5ffad76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ffc072120 0x7f5ffc112c60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.218+0000 7f5ffad76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ffc072120 0x7f5ffc112c60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55856/0 (socket says 192.168.123.103:55856) 2026-03-09T16:12:03.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.218+0000 7f5412313640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f53dc076360 0x7f53dc078820 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f53fc004640 tx=0x7f53fc015040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.218+0000 7f5ffad76640 1 -- 192.168.123.103:0/4072325133 learned_addr learned my addr 192.168.123.103:0/4072325133 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:03.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.222+0000 7f5ffa575640 1 --2- 192.168.123.103:0/4072325133 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ffc1131a0 0x7f5ffc1b9ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.222+0000 7f5ffad76640 1 -- 192.168.123.103:0/4072325133 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ffc1131a0 msgr2=0x7f5ffc1b9ad0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.222+0000 7f5ffad76640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ffc1131a0 0x7f5ffc1b9ad0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.222+0000 7f5ffad76640 1 -- 192.168.123.103:0/4072325133 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5fe4009660 con 0x7f5ffc072120 2026-03-09T16:12:03.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.222+0000 7f5ffad76640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ffc072120 0x7f5ffc112c60 secure :-1 s=READY pgs=210 cs=0 l=1 rev1=1 crypto rx=0x7f5ff000e990 tx=0x7f5ff000ee60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.222+0000 7f5fdbfff640 1 -- 192.168.123.103:0/4072325133 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ff000cd30 con 0x7f5ffc072120 2026-03-09T16:12:03.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.225+0000 7f60009c9640 1 -- 192.168.123.103:0/4072325133 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5ffc1ba070 con 0x7f5ffc072120 2026-03-09T16:12:03.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.225+0000 7f60009c9640 1 -- 192.168.123.103:0/4072325133 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5ffc1ba5c0 con 0x7f5ffc072120 2026-03-09T16:12:03.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.226+0000 7f60009c9640 1 -- 192.168.123.103:0/4072325133 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f5ffc11cc40 con 0x7f5ffc072120 2026-03-09T16:12:03.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.228+0000 7f5fdbfff640 1 -- 192.168.123.103:0/4072325133 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5ff000ce90 con 0x7f5ffc072120 2026-03-09T16:12:03.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.228+0000 7f5fdbfff640 1 -- 192.168.123.103:0/4072325133 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ff0010640 con 0x7f5ffc072120 2026-03-09T16:12:03.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.231+0000 7f5fdbfff640 1 -- 192.168.123.103:0/4072325133 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f5ff00107a0 con 0x7f5ffc072120 2026-03-09T16:12:03.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.237+0000 7f5413315640 1 -- 
192.168.123.103:0/526811512 --> [v2:192.168.123.105:6800/143716735,v1:192.168.123.105:6801/143716735] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f53d4005d20 con 0x7f53d40015e0 2026-03-09T16:12:03.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.239+0000 7f54037fe640 1 -- 192.168.123.103:0/526811512 <== osd.3 v2:192.168.123.105:6800/143716735 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f53d4005d20 con 0x7f53d40015e0 2026-03-09T16:12:03.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.244+0000 7f54017fa640 1 -- 192.168.123.103:0/526811512 >> [v2:192.168.123.105:6800/143716735,v1:192.168.123.105:6801/143716735] conn(0x7f53d40015e0 msgr2=0x7f53d4003aa0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.244+0000 7f54017fa640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.105:6800/143716735,v1:192.168.123.105:6801/143716735] conn(0x7f53d40015e0 0x7f53d4003aa0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.244+0000 7f54017fa640 1 -- 192.168.123.103:0/526811512 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f53dc076360 msgr2=0x7f53dc078820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.244+0000 7f54017fa640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f53dc076360 0x7f53dc078820 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f53fc004640 tx=0x7f53fc015040 comp rx=0 tx=0).stop 2026-03-09T16:12:03.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.244+0000 7f54017fa640 1 -- 192.168.123.103:0/526811512 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f540c108890 msgr2=0x7f540c1a0b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.244+0000 7f54017fa640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f540c108890 0x7f540c1a0b60 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f53f402f7a0 tx=0x7f53f4004290 comp rx=0 tx=0).stop 2026-03-09T16:12:03.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.245+0000 7f5fdbfff640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5fc40761c0 0x7f5fc4078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.248+0000 7f54017fa640 1 -- 192.168.123.103:0/526811512 shutdown_connections 2026-03-09T16:12:03.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.248+0000 7f54017fa640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.105:6800/143716735,v1:192.168.123.105:6801/143716735] conn(0x7f53d40015e0 0x7f53d4003aa0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.248+0000 7f54017fa640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f53dc076360 0x7f53dc078820 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.248+0000 7f54017fa640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f540c108890 0x7f540c1a0b60 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.248+0000 7f54017fa640 1 --2- 192.168.123.103:0/526811512 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f540c102890 0x7f540c1a0620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.248+0000 7f54017fa640 1 -- 192.168.123.103:0/526811512 >> 192.168.123.103:0/526811512 conn(0x7f540c0fe5d0 msgr2=0x7f540c10c850 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:03.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.248+0000 7f54017fa640 1 -- 192.168.123.103:0/526811512 shutdown_connections 2026-03-09T16:12:03.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.248+0000 7f54017fa640 1 -- 192.168.123.103:0/526811512 wait complete. 2026-03-09T16:12:03.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.249+0000 7f5ffa575640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5fc40761c0 0x7f5fc4078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.252+0000 7fd60c942640 1 -- 192.168.123.103:0/3444132304 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd608072a40 msgr2=0x7fd60810ca90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.252+0000 7fd60c942640 1 --2- 192.168.123.103:0/3444132304 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd608072a40 0x7fd60810ca90 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7fd5f00099b0 tx=0x7fd5f002f240 comp rx=0 tx=0).stop 2026-03-09T16:12:03.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.253+0000 7fd60c942640 1 -- 192.168.123.103:0/3444132304 shutdown_connections 2026-03-09T16:12:03.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.253+0000 7fd60c942640 1 --2- 192.168.123.103:0/3444132304 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd608072a40 0x7fd60810ca90 unknown :-1 s=CLOSED pgs=211 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.253+0000 7fd60c942640 1 --2- 192.168.123.103:0/3444132304 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd608072120 0x7fd608072500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.249+0000 7f5ffa575640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5fc40761c0 0x7f5fc4078680 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f5fe4002410 tx=0x7f5fe4005c50 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.257 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.253+0000 7f5fdbfff640 1 -- 192.168.123.103:0/4072325133 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f5ff0014070 con 0x7f5ffc072120 2026-03-09T16:12:03.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.253+0000 7f5fdbfff640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.105:6808/1555294449,v1:192.168.123.105:6809/1555294449] conn(0x7f5fc407bc70 0x7f5fc407e090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.253+0000 7fd60c942640 1 -- 192.168.123.103:0/3444132304 >> 192.168.123.103:0/3444132304 conn(0x7fd60806c7d0 msgr2=0x7fd60806cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:03.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.255+0000 7fd60c942640 1 -- 192.168.123.103:0/3444132304 shutdown_connections 2026-03-09T16:12:03.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.253+0000 7f5fdbfff640 1 -- 192.168.123.103:0/4072325133 --> [v2:192.168.123.105:6808/1555294449,v1:192.168.123.105:6809/1555294449] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f5fc407e740 con 0x7f5fc407bc70 2026-03-09T16:12:03.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.255+0000 7f5ffb577640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.105:6808/1555294449,v1:192.168.123.105:6809/1555294449] conn(0x7f5fc407bc70 0x7f5fc407e090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.255+0000 7fd60c942640 1 -- 192.168.123.103:0/3444132304 wait complete. 
2026-03-09T16:12:03.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.255+0000 7fd60c942640 1 Processor -- start 2026-03-09T16:12:03.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.257+0000 7f5fdbfff640 1 -- 192.168.123.103:0/4072325133 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_get_version_reply(handle=1 version=35) v2 ==== 24+0+0 (secure 0 0 0) 0x7f5ff00995b0 con 0x7f5ffc072120 2026-03-09T16:12:03.261 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.257+0000 7fd60c942640 1 -- start start 2026-03-09T16:12:03.261 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.257+0000 7fd60c942640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd608072120 0x7fd608112c00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.261 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.257+0000 7fd60c942640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd608113140 0x7fd6081b9ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.261 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.257+0000 7fd60c942640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6081136f0 con 0x7fd608113140 2026-03-09T16:12:03.261 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.257+0000 7fd60c942640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd608113860 con 0x7fd608072120 2026-03-09T16:12:03.261 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.259+0000 7fd6077fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd608072120 0x7fd608112c00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.261 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.259+0000 7f5ffb577640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.105:6808/1555294449,v1:192.168.123.105:6809/1555294449] conn(0x7f5fc407bc70 0x7f5fc407e090 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.4 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.259+0000 7f5fdbfff640 1 -- 192.168.123.103:0/4072325133 <== osd.4 v2:192.168.123.105:6808/1555294449 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+26930 (crc 0 0 0) 0x7f5fc407e740 con 0x7f5fc407bc70 2026-03-09T16:12:03.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.259+0000 7fd6077fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd608072120 0x7fd608112c00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47546/0 (socket says 192.168.123.103:47546) 2026-03-09T16:12:03.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.259+0000 7fd6077fe640 1 -- 192.168.123.103:0/336203536 learned_addr learned my addr 192.168.123.103:0/336203536 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:03.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.261+0000 7fd606ffd640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd608113140 0x7fd6081b9ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.261+0000 7fd6077fe640 1 -- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd608113140 msgr2=0x7fd6081b9ad0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.261+0000 7fd6077fe640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd608113140 0x7fd6081b9ad0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.261+0000 7fd6077fe640 1 -- 192.168.123.103:0/336203536 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd5f0009660 con 0x7fd608072120 2026-03-09T16:12:03.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.263+0000 7fd6077fe640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd608072120 0x7fd608112c00 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fd5fc00e990 tx=0x7fd5fc00ee60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.266+0000 7fd604ff9640 1 -- 192.168.123.103:0/336203536 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd5fc00cd30 con 0x7fd608072120 2026-03-09T16:12:03.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.266+0000 7fd60c942640 1 -- 192.168.123.103:0/336203536 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd6081ba070 con 0x7fd608072120 2026-03-09T16:12:03.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.266+0000 7fd60c942640 1 -- 192.168.123.103:0/336203536 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd6081ba5c0 con 0x7fd608072120 2026-03-09T16:12:03.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.267+0000 7fd604ff9640 1 -- 192.168.123.103:0/336203536 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd5fc00ce90 con 0x7fd608072120 2026-03-09T16:12:03.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.267+0000 7fd604ff9640 1 -- 192.168.123.103:0/336203536 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd5fc010640 con 0x7fd608072120 2026-03-09T16:12:03.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.268+0000 7fd604ff9640 1 -- 192.168.123.103:0/336203536 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fd5fc0107a0 con 0x7fd608072120 2026-03-09T16:12:03.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.275+0000 7fd60c942640 1 -- 192.168.123.103:0/336203536 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7fd60811cc40 con 0x7fd608072120 2026-03-09T16:12:03.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.281+0000 7f6550310640 1 -- 192.168.123.103:0/3773947651 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6548072af0 msgr2=0x7f654810ba70 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-09T16:12:03.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.281+0000 7f6550310640 1 --2- 192.168.123.103:0/3773947651 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6548072af0 0x7f654810ba70 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7f654000b3e0 tx=0x7f654002f730 comp rx=0 tx=0).stop 2026-03-09T16:12:03.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.281+0000 7f6550310640 1 -- 192.168.123.103:0/3773947651 shutdown_connections 2026-03-09T16:12:03.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.281+0000 7f6550310640 1 --2- 192.168.123.103:0/3773947651 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6548072af0 0x7f654810ba70 unknown :-1 s=CLOSED pgs=212 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.281+0000 7f6550310640 1 --2- 192.168.123.103:0/3773947651 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6548072140 0x7f6548072520 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.281+0000 7f6550310640 1 -- 192.168.123.103:0/3773947651 >> 192.168.123.103:0/3773947651 conn(0x7f654806c7e0 msgr2=0x7f654806cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:03.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.281+0000 7f6550310640 1 -- 192.168.123.103:0/3773947651 shutdown_connections 2026-03-09T16:12:03.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.281+0000 7f6550310640 1 -- 192.168.123.103:0/3773947651 wait complete. 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.282+0000 7f6550310640 1 Processor -- start 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.282+0000 7f6550310640 1 -- start start 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.282+0000 7f6550310640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6548072140 0x7f654807d570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.282+0000 7f6550310640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f654807dab0 0x7f654807df10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.282+0000 7f6550310640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6548084600 con 0x7f654807dab0 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.282+0000 7f6550310640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f65480820a0 con 0x7f6548072140 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.282+0000 7f654e085640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6548072140 0x7f654807d570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.282+0000 7f654e085640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f6548072140 0x7f654807d570 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47564/0 (socket says 192.168.123.103:47564) 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.282+0000 7f654e085640 1 -- 192.168.123.103:0/3107331493 learned_addr learned my addr 192.168.123.103:0/3107331493 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.282+0000 7f654d884640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f654807dab0 0x7f654807df10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.285+0000 7f654e085640 1 -- 192.168.123.103:0/3107331493 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f654807dab0 msgr2=0x7f654807df10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.285+0000 7f654e085640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f654807dab0 0x7f654807df10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.285+0000 7f654e085640 1 -- 192.168.123.103:0/3107331493 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6540009d00 con 0x7f6548072140 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.285+0000 7f654d884640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f654807dab0 0x7f654807df10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.285+0000 7f654e085640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6548072140 0x7f654807d570 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f654400b4f0 tx=0x7f654400b9c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.285+0000 7f653f7fe640 1 -- 192.168.123.103:0/3107331493 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6544004280 con 0x7f6548072140 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.286+0000 7f6550310640 1 -- 192.168.123.103:0/3107331493 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6548082380 con 0x7f6548072140 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.286+0000 7f6550310640 1 -- 192.168.123.103:0/3107331493 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f65480828d0 con 0x7f6548072140 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.286+0000 7f653f7fe640 1 -- 192.168.123.103:0/3107331493 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f65440043e0 con 0x7f6548072140 2026-03-09T16:12:03.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.286+0000 7f653f7fe640 1 -- 192.168.123.103:0/3107331493 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6544010af0 con 0x7f6548072140 2026-03-09T16:12:03.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.291+0000 7fd604ff9640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd5d8076290 0x7fd5d8078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.291+0000 7fd606ffd640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd5d8076290 0x7fd5d8078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.290+0000 7f653f7fe640 1 -- 192.168.123.103:0/3107331493 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f654401a460 con 0x7f6548072140 2026-03-09T16:12:03.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.290+0000 7f653f7fe640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6528076290 0x7f6528078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.290+0000 7f654d884640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6528076290 0x7f6528078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.294 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.291+0000 7f654d884640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6528076290 0x7f6528078750 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f654002fc40 tx=0x7f6540002750 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.291+0000 7f653f7fe640 1 -- 192.168.123.103:0/3107331493 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f6544097ab0 con 0x7f6548072140 2026-03-09T16:12:03.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.291+0000 7f6550310640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.105:6816/262747247,v1:192.168.123.105:6817/262747247] conn(0x7f65140015e0 0x7f6514003aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.291+0000 7f6550310640 1 -- 192.168.123.103:0/3107331493 --> [v2:192.168.123.105:6816/262747247,v1:192.168.123.105:6817/262747247] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f6514006c40 con 0x7f65140015e0 2026-03-09T16:12:03.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.291+0000 7f654e886640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.105:6816/262747247,v1:192.168.123.105:6817/262747247] conn(0x7f65140015e0 0x7f6514003aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.292+0000 7fd606ffd640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd5d8076290 0x7fd5d8078750 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fd5f0002410 tx=0x7fd5f003a040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.293+0000 7fd604ff9640 1 -- 192.168.123.103:0/336203536 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7fd5fc014070 con 0x7fd608072120 2026-03-09T16:12:03.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.295+0000 7fd604ff9640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007] conn(0x7fd5d807bd40 0x7fd5d807e160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.295+0000 7fd604ff9640 1 -- 192.168.123.103:0/336203536 --> [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7fd5d807e810 con 0x7fd5d807bd40 2026-03-09T16:12:03.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.295+0000 7fd607fff640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007] conn(0x7fd5d807bd40 0x7fd5d807e160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.299 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.295+0000 7fd604ff9640 1 -- 192.168.123.103:0/336203536 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_get_version_reply(handle=1 version=35) v2 ==== 24+0+0 (secure 0 0 0) 0x7fd5fc09d240 con 0x7fd608072120 2026-03-09T16:12:03.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.295+0000 7f654e886640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.105:6816/262747247,v1:192.168.123.105:6817/262747247] conn(0x7f65140015e0 0x7f6514003aa0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.5 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.298+0000 7f653f7fe640 1 -- 192.168.123.103:0/3107331493 <== osd.5 v2:192.168.123.105:6816/262747247 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+26930 (crc 0 0 0) 0x7f6514006c40 con 0x7f65140015e0 2026-03-09T16:12:03.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.299+0000 7f7fccea4640 1 -- 192.168.123.103:0/1083995525 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc8072120 msgr2=0x7f7fc8072500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.299+0000 7fd607fff640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007] conn(0x7fd5d807bd40 0x7fd5d807e160 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.2 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.299+0000 7f7fccea4640 1 --2- 192.168.123.103:0/1083995525 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc8072120 0x7f7fc8072500 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7f7fb00099b0 tx=0x7f7fb002f240 comp rx=0 tx=0).stop 2026-03-09T16:12:03.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.301+0000 7f7fccea4640 1 -- 192.168.123.103:0/1083995525 shutdown_connections 2026-03-09T16:12:03.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.301+0000 7f7fccea4640 1 --2- 192.168.123.103:0/1083995525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7fc8072a40 0x7f7fc810ca90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.301+0000 7f7fccea4640 1 --2- 192.168.123.103:0/1083995525 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc8072120 0x7f7fc8072500 unknown :-1 s=CLOSED pgs=213 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.301+0000 7f7fccea4640 1 -- 192.168.123.103:0/1083995525 >> 192.168.123.103:0/1083995525 conn(0x7f7fc806c7d0 msgr2=0x7f7fc806cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:03.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.299+0000 7fd604ff9640 1 -- 192.168.123.103:0/336203536 <== osd.2 v2:192.168.123.103:6818/2017087007 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+26930 (crc 0 0 0) 0x7fd5d807e810 con 0x7fd5d807bd40 2026-03-09T16:12:03.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.304+0000 7f7fccea4640 1 -- 192.168.123.103:0/1083995525 shutdown_connections 2026-03-09T16:12:03.306 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.304+0000 7f7fccea4640 1 -- 192.168.123.103:0/1083995525 wait complete. 2026-03-09T16:12:03.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.305+0000 7f7fccea4640 1 Processor -- start 2026-03-09T16:12:03.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.305+0000 7f7fccea4640 1 -- start start 2026-03-09T16:12:03.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.305+0000 7f7fccea4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc8072a40 0x7f7fc8112c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.305+0000 7f7fccea4640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7fc8113180 0x7f7fc81b9b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.305+0000 7f7fccea4640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7fc8113670 con 0x7f7fc8072a40 2026-03-09T16:12:03.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.305+0000 7f7fccea4640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7fc81137e0 con 0x7f7fc8113180 2026-03-09T16:12:03.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.305+0000 7f7fc77fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc8072a40 0x7f7fc8112c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.305+0000 7f7fc77fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc8072a40 0x7f7fc8112c40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55926/0 (socket says 192.168.123.103:55926) 2026-03-09T16:12:03.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.305+0000 7f7fc77fe640 1 -- 192.168.123.103:0/2368094659 learned_addr learned my addr 192.168.123.103:0/2368094659 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:03.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.307+0000 7f7fc77fe640 1 -- 192.168.123.103:0/2368094659 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7fc8113180 msgr2=0x7f7fc81b9b70 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T16:12:03.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.307+0000 7f7fc77fe640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7fc8113180 0x7f7fc81b9b70 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.307+0000 7f7fc77fe640 1 -- 192.168.123.103:0/2368094659 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7fb0009660 con 0x7f7fc8072a40 2026-03-09T16:12:03.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.307+0000 7f7fc77fe640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc8072a40 
0x7f7fc8112c40 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7f7fb002f750 tx=0x7f7fb0004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.308+0000 7f7fc4ff9640 1 -- 192.168.123.103:0/2368094659 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7fb003d070 con 0x7f7fc8072a40 2026-03-09T16:12:03.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.308+0000 7f7fccea4640 1 -- 192.168.123.103:0/2368094659 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7fc81ba0b0 con 0x7f7fc8072a40 2026-03-09T16:12:03.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.308+0000 7f7fccea4640 1 -- 192.168.123.103:0/2368094659 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7fc81ba570 con 0x7f7fc8072a40 2026-03-09T16:12:03.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.308+0000 7f7fc4ff9640 1 -- 192.168.123.103:0/2368094659 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f7fb0004590 con 0x7f7fc8072a40 2026-03-09T16:12:03.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.308+0000 7f7fc4ff9640 1 -- 192.168.123.103:0/2368094659 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7fb0031070 con 0x7f7fc8072a40 2026-03-09T16:12:03.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.310+0000 7f7fc4ff9640 1 -- 192.168.123.103:0/2368094659 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f7fb0038820 con 0x7f7fc8072a40 2026-03-09T16:12:03.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.311+0000 7f7fc4ff9640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7fac076290 0x7f7fac078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.311+0000 7f7fc4ff9640 1 -- 192.168.123.103:0/2368094659 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f7fb00bbde0 con 0x7f7fc8072a40 2026-03-09T16:12:03.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.312+0000 7f7fccea4640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072] conn(0x7f7f940015e0 0x7f7f94003aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:03.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.312+0000 7f7fc6ffd640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7fac076290 0x7f7fac078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.312+0000 7f7fc7fff640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072] conn(0x7f7f940015e0 0x7f7f94003aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:03.315 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.313+0000 7f7fc6ffd640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7fac076290 0x7f7fac078750 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f7fb800ac00 tx=0x7f7fb8009250 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.313+0000 7f7fc7fff640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072] conn(0x7f7f940015e0 0x7f7f94003aa0 crc :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:03.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.313+0000 7f7fccea4640 1 -- 192.168.123.103:0/2368094659 --> [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f7f94006c40 con 0x7f7f940015e0 2026-03-09T16:12:03.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.315+0000 7f7fc4ff9640 1 -- 192.168.123.103:0/2368094659 <== osd.1 v2:192.168.123.103:6810/3417259072 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+26930 (crc 0 0 0) 0x7f7f94006c40 con 0x7f7f940015e0 2026-03-09T16:12:03.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.320+0000 7f60009c9640 1 -- 192.168.123.103:0/4072325133 --> [v2:192.168.123.105:6808/1555294449,v1:192.168.123.105:6809/1555294449] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f5ffc108570 con 0x7f5fc407bc70 2026-03-09T16:12:03.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f5fdbfff640 1 -- 192.168.123.103:0/4072325133 <== osd.4 v2:192.168.123.105:6808/1555294449 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f5ffc108570 con 0x7f5fc407bc70 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.321+0000 7f6550310640 1 -- 192.168.123.103:0/3107331493 --> [v2:192.168.123.105:6816/262747247,v1:192.168.123.105:6817/262747247] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f6514005d20 con 0x7f65140015e0 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653f7fe640 1 -- 192.168.123.103:0/3107331493 <== osd.5 v2:192.168.123.105:6816/262747247 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f6514005d20 con 0x7f65140015e0 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653d7fa640 1 -- 192.168.123.103:0/3107331493 >> [v2:192.168.123.105:6816/262747247,v1:192.168.123.105:6817/262747247] conn(0x7f65140015e0 msgr2=0x7f6514003aa0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653d7fa640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.105:6816/262747247,v1:192.168.123.105:6817/262747247] conn(0x7f65140015e0 0x7f6514003aa0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653d7fa640 1 -- 192.168.123.103:0/3107331493 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6528076290 msgr2=0x7f6528078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.326 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653d7fa640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6528076290 0x7f6528078750 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f654002fc40 tx=0x7f6540002750 comp rx=0 tx=0).stop 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653d7fa640 1 -- 192.168.123.103:0/3107331493 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6548072140 msgr2=0x7f654807d570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653d7fa640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6548072140 0x7f654807d570 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f654400b4f0 tx=0x7f654400b9c0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653d7fa640 1 -- 192.168.123.103:0/3107331493 shutdown_connections 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653d7fa640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6528076290 0x7f6528078750 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653d7fa640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f654807dab0 0x7f654807df10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653d7fa640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.105:6816/262747247,v1:192.168.123.105:6817/262747247] conn(0x7f65140015e0 0x7f6514003aa0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653d7fa640 1 --2- 192.168.123.103:0/3107331493 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6548072140 0x7f654807d570 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653d7fa640 1 -- 192.168.123.103:0/3107331493 >> 192.168.123.103:0/3107331493 conn(0x7f654806c7e0 msgr2=0x7f654806f8a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653d7fa640 1 -- 192.168.123.103:0/3107331493 shutdown_connections 2026-03-09T16:12:03.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.322+0000 7f653d7fa640 1 -- 192.168.123.103:0/3107331493 wait complete. 
2026-03-09T16:12:03.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.325+0000 7f60009c9640 1 -- 192.168.123.103:0/4072325133 >> [v2:192.168.123.105:6808/1555294449,v1:192.168.123.105:6809/1555294449] conn(0x7f5fc407bc70 msgr2=0x7f5fc407e090 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.326+0000 7f60009c9640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.105:6808/1555294449,v1:192.168.123.105:6809/1555294449] conn(0x7f5fc407bc70 0x7f5fc407e090 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.334 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.329+0000 7f60009c9640 1 -- 192.168.123.103:0/4072325133 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5fc40761c0 msgr2=0x7f5fc4078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.334 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.329+0000 7f60009c9640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5fc40761c0 0x7f5fc4078680 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f5fe4002410 tx=0x7f5fe4005c50 comp rx=0 tx=0).stop 2026-03-09T16:12:03.334 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.329+0000 7f60009c9640 1 -- 192.168.123.103:0/4072325133 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ffc072120 msgr2=0x7f5ffc112c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.334 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.329+0000 7f60009c9640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ffc072120 0x7f5ffc112c60 secure :-1 s=READY pgs=210 cs=0 l=1 rev1=1 crypto rx=0x7f5ff000e990 tx=0x7f5ff000ee60 comp rx=0 tx=0).stop 2026-03-09T16:12:03.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.329+0000 7f60009c9640 1 -- 192.168.123.103:0/4072325133 shutdown_connections 2026-03-09T16:12:03.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.329+0000 7f60009c9640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.105:6808/1555294449,v1:192.168.123.105:6809/1555294449] conn(0x7f5fc407bc70 0x7f5fc407e090 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.329+0000 7f60009c9640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5fc40761c0 0x7f5fc4078680 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.329+0000 7f60009c9640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ffc1131a0 0x7f5ffc1b9ad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.329+0000 7f60009c9640 1 --2- 192.168.123.103:0/4072325133 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ffc072120 0x7f5ffc112c60 unknown :-1 s=CLOSED pgs=210 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.329+0000 7f60009c9640 1 -- 
192.168.123.103:0/4072325133 >> 192.168.123.103:0/4072325133 conn(0x7f5ffc06c7d0 msgr2=0x7f5ffc070eb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:03.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.329+0000 7f60009c9640 1 -- 192.168.123.103:0/4072325133 shutdown_connections 2026-03-09T16:12:03.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.329+0000 7f60009c9640 1 -- 192.168.123.103:0/4072325133 wait complete. 2026-03-09T16:12:03.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.338+0000 7fd60c942640 1 -- 192.168.123.103:0/336203536 --> [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7fd608108570 con 0x7fd5d807bd40 2026-03-09T16:12:03.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.339+0000 7fd604ff9640 1 -- 192.168.123.103:0/336203536 <== osd.2 v2:192.168.123.103:6818/2017087007 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7fd608108570 con 0x7fd5d807bd40 2026-03-09T16:12:03.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.355+0000 7fd60c942640 1 -- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007] conn(0x7fd5d807bd40 msgr2=0x7fd5d807e160 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.355+0000 7fd60c942640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007] conn(0x7fd5d807bd40 0x7fd5d807e160 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.362+0000 7fd60c942640 1 -- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd5d8076290 msgr2=0x7fd5d8078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.362+0000 7fd60c942640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd5d8076290 0x7fd5d8078750 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fd5f0002410 tx=0x7fd5f003a040 comp rx=0 tx=0).stop 2026-03-09T16:12:03.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.362+0000 7fd60c942640 1 -- 192.168.123.103:0/336203536 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd608072120 msgr2=0x7fd608112c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.362+0000 7fd60c942640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd608072120 0x7fd608112c00 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fd5fc00e990 tx=0x7fd5fc00ee60 comp rx=0 tx=0).stop 2026-03-09T16:12:03.371 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.367+0000 7fd60c942640 1 -- 192.168.123.103:0/336203536 shutdown_connections 2026-03-09T16:12:03.372 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.367+0000 7fd60c942640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:6818/2017087007,v1:192.168.123.103:6819/2017087007] conn(0x7fd5d807bd40 0x7fd5d807e160 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.372 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.367+0000 7fd60c942640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd5d8076290 0x7fd5d8078750 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.372 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.367+0000 7fd60c942640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd608113140 0x7fd6081b9ad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.372 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.367+0000 7fd60c942640 1 --2- 192.168.123.103:0/336203536 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd608072120 0x7fd608112c00 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.372 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.367+0000 7fd60c942640 1 -- 192.168.123.103:0/336203536 >> 192.168.123.103:0/336203536 conn(0x7fd60806c7d0 msgr2=0x7fd608070eb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:03.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.372+0000 7fd60c942640 1 -- 192.168.123.103:0/336203536 shutdown_connections 2026-03-09T16:12:03.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.372+0000 7fd60c942640 1 -- 192.168.123.103:0/336203536 wait complete. 2026-03-09T16:12:03.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.378+0000 7f7fccea4640 1 -- 192.168.123.103:0/2368094659 --> [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f7f94005d20 con 0x7f7f940015e0 2026-03-09T16:12:03.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.380+0000 7f7fc4ff9640 1 -- 192.168.123.103:0/2368094659 <== osd.1 v2:192.168.123.103:6810/3417259072 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f7f94005d20 con 0x7f7f940015e0 2026-03-09T16:12:03.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.382+0000 7f7fccea4640 1 -- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072] conn(0x7f7f940015e0 msgr2=0x7f7f94003aa0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.382+0000 7f7fccea4640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072] conn(0x7f7f940015e0 0x7f7f94003aa0 crc :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.385+0000 7f7fccea4640 1 -- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7fac076290 msgr2=0x7f7fac078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.385+0000 7f7fccea4640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7fac076290 0x7f7fac078750 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f7fb800ac00 tx=0x7f7fb8009250 comp rx=0 tx=0).stop 2026-03-09T16:12:03.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.385+0000 7f7fccea4640 1 -- 
192.168.123.103:0/2368094659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc8072a40 msgr2=0x7f7fc8112c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:03.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.385+0000 7f7fccea4640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc8072a40 0x7f7fc8112c40 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7f7fb002f750 tx=0x7f7fb0004290 comp rx=0 tx=0).stop 2026-03-09T16:12:03.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.388+0000 7f7fccea4640 1 -- 192.168.123.103:0/2368094659 shutdown_connections 2026-03-09T16:12:03.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.388+0000 7f7fccea4640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:6810/3417259072,v1:192.168.123.103:6811/3417259072] conn(0x7f7f940015e0 0x7f7f94003aa0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.388+0000 7f7fccea4640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7fac076290 0x7f7fac078750 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.388+0000 7f7fccea4640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7fc8113180 0x7f7fc81b9b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.388+0000 7f7fccea4640 1 --2- 192.168.123.103:0/2368094659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc8072a40 0x7f7fc8112c40 unknown :-1 s=CLOSED pgs=214 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:03.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.388+0000 7f7fccea4640 1 -- 192.168.123.103:0/2368094659 >> 192.168.123.103:0/2368094659 conn(0x7f7fc806c7d0 msgr2=0x7f7fc8070eb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:03.389 INFO:teuthology.orchestra.run.vm03.stdout:107374182406 2026-03-09T16:12:03.389 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd last-stat-seq osd.3 2026-03-09T16:12:03.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.388+0000 7f7fccea4640 1 -- 192.168.123.103:0/2368094659 shutdown_connections 2026-03-09T16:12:03.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:03.388+0000 7f7fccea4640 1 -- 192.168.123.103:0/2368094659 wait complete. 
2026-03-09T16:12:03.446 INFO:teuthology.orchestra.run.vm03.stdout:128849018885 2026-03-09T16:12:03.446 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd last-stat-seq osd.4 2026-03-09T16:12:03.480 INFO:teuthology.orchestra.run.vm03.stdout:146028888067 2026-03-09T16:12:03.480 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd last-stat-seq osd.5 2026-03-09T16:12:03.521 INFO:teuthology.orchestra.run.vm03.stdout:77309411336 2026-03-09T16:12:03.522 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd last-stat-seq osd.2 2026-03-09T16:12:03.533 INFO:teuthology.orchestra.run.vm03.stdout:60129542154 2026-03-09T16:12:03.533 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd last-stat-seq osd.1 2026-03-09T16:12:03.542 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:03.921 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:04.038 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:04.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.081+0000 7f76ec944640 1 -- 192.168.123.103:0/1184819666 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f76e4072140 msgr2=0x7f76e4072520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:04.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.081+0000 7f76ec944640 1 --2- 192.168.123.103:0/1184819666 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f76e4072140 0x7f76e4072520 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7f76d8007980 tx=0x7f76d8031140 comp rx=0 tx=0).stop 2026-03-09T16:12:04.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.081+0000 7f76ec944640 1 -- 192.168.123.103:0/1184819666 shutdown_connections 2026-03-09T16:12:04.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.081+0000 7f76ec944640 1 --2- 192.168.123.103:0/1184819666 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f76e4072af0 0x7f76e410ba70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.081+0000 7f76ec944640 1 --2- 192.168.123.103:0/1184819666 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f76e4072140 0x7f76e4072520 secure :-1 s=CLOSED pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7f76d8007980 tx=0x7f76d8031140 comp rx=0 tx=0).stop 2026-03-09T16:12:04.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.081+0000 7f76ec944640 1 -- 192.168.123.103:0/1184819666 >> 192.168.123.103:0/1184819666 conn(0x7f76e406c7e0 msgr2=0x7f76e406cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:04.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.087+0000 7f76ec944640 1 -- 192.168.123.103:0/1184819666 shutdown_connections 2026-03-09T16:12:04.091 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.088+0000 7f76ec944640 1 -- 192.168.123.103:0/1184819666 wait complete. 2026-03-09T16:12:04.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.088+0000 7f76ec944640 1 Processor -- start 2026-03-09T16:12:04.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.088+0000 7f76ec944640 1 -- start start 2026-03-09T16:12:04.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.089+0000 7f76ec944640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f76e4072af0 0x7f76e407dbc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.089+0000 7f76ec944640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f76e407e100 0x7f76e4082460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.089+0000 7f76ec944640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f76e407e5f0 con 0x7f76e4072af0 2026-03-09T16:12:04.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.089+0000 7f76ec944640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f76e407e760 con 0x7f76e407e100 2026-03-09T16:12:04.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.089+0000 7f76ea6b9640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f76e4072af0 0x7f76e407dbc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:04.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.089+0000 7f76ea6b9640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f76e4072af0 0x7f76e407dbc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55940/0 (socket says 192.168.123.103:55940) 2026-03-09T16:12:04.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.089+0000 7f76ea6b9640 1 -- 192.168.123.103:0/2012857306 learned_addr learned my addr 192.168.123.103:0/2012857306 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:04.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.089+0000 7f76ea6b9640 1 -- 192.168.123.103:0/2012857306 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f76e407e100 msgr2=0x7f76e4082460 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T16:12:04.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.089+0000 7f76ea6b9640 1 --2- 192.168.123.103:0/2012857306 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f76e407e100 0x7f76e4082460 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.089+0000 7f76ea6b9640 1 -- 192.168.123.103:0/2012857306 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f76d80075d0 con 0x7f76e4072af0 2026-03-09T16:12:04.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.089+0000 7f76ea6b9640 1 --2- 192.168.123.103:0/2012857306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f76e4072af0 
0x7f76e407dbc0 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7f76d80040c0 tx=0x7f76d8002910 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:04.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.090+0000 7f76d77fe640 1 -- 192.168.123.103:0/2012857306 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f76d800f030 con 0x7f76e4072af0 2026-03-09T16:12:04.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.090+0000 7f76d77fe640 1 -- 192.168.123.103:0/2012857306 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f76d8002a00 con 0x7f76e4072af0 2026-03-09T16:12:04.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.090+0000 7f76d77fe640 1 -- 192.168.123.103:0/2012857306 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f76d80417d0 con 0x7f76e4072af0 2026-03-09T16:12:04.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.090+0000 7f76ec944640 1 -- 192.168.123.103:0/2012857306 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f76e40829a0 con 0x7f76e4072af0 2026-03-09T16:12:04.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.090+0000 7f76ec944640 1 -- 192.168.123.103:0/2012857306 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f76e4082e60 con 0x7f76e4072af0 2026-03-09T16:12:04.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.091+0000 7f76ec944640 1 -- 192.168.123.103:0/2012857306 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f76b4005350 con 0x7f76e4072af0 2026-03-09T16:12:04.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.094+0000 7f76d77fe640 1 -- 192.168.123.103:0/2012857306 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f76d8049050 con 0x7f76e4072af0 2026-03-09T16:12:04.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.094+0000 7f76d77fe640 1 --2- 192.168.123.103:0/2012857306 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f76b8076170 0x7f76b8078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.094+0000 7f76d77fe640 1 -- 192.168.123.103:0/2012857306 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f76d80bb5a0 con 0x7f76e4072af0 2026-03-09T16:12:04.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.097+0000 7f76e9eb8640 1 --2- 192.168.123.103:0/2012857306 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f76b8076170 0x7f76b8078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:04.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.097+0000 7f76e9eb8640 1 --2- 192.168.123.103:0/2012857306 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f76b8076170 0x7f76b8078630 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f76e00060b0 tx=0x7f76e0006040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:04.104 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.097+0000 7f76d77fe640 1 -- 192.168.123.103:0/2012857306 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f76d8084fb0 con 0x7f76e4072af0 2026-03-09T16:12:04.164 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:03 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:12:04.185 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:04.189 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:04.256 INFO:teuthology.orchestra.run.vm03.stdout:38654705677 2026-03-09T16:12:04.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.249+0000 7f76ec944640 1 -- 192.168.123.103:0/2012857306 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 0} v 0) v1 -- 0x7f76b40051c0 con 0x7f76e4072af0 2026-03-09T16:12:04.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.251+0000 7f76d77fe640 1 -- 192.168.123.103:0/2012857306 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 0}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f76d8084950 con 0x7f76e4072af0 2026-03-09T16:12:04.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.255+0000 7f76d57fa640 1 -- 192.168.123.103:0/2012857306 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f76b8076170 msgr2=0x7f76b8078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:04.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.255+0000 7f76d57fa640 1 --2- 192.168.123.103:0/2012857306 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f76b8076170 0x7f76b8078630 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f76e00060b0 tx=0x7f76e0006040 comp rx=0 tx=0).stop 2026-03-09T16:12:04.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.255+0000 7f76d57fa640 1 -- 192.168.123.103:0/2012857306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f76e4072af0 msgr2=0x7f76e407dbc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:04.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.255+0000 7f76d57fa640 1 --2- 192.168.123.103:0/2012857306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f76e4072af0 0x7f76e407dbc0 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7f76d80040c0 tx=0x7f76d8002910 comp rx=0 tx=0).stop 2026-03-09T16:12:04.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.255+0000 7f76d57fa640 1 -- 192.168.123.103:0/2012857306 shutdown_connections 2026-03-09T16:12:04.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.255+0000 7f76d57fa640 1 --2- 192.168.123.103:0/2012857306 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f76b8076170 0x7f76b8078630 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.255+0000 7f76d57fa640 1 --2- 192.168.123.103:0/2012857306 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f76e407e100 0x7f76e4082460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.255+0000 7f76d57fa640 1 --2- 192.168.123.103:0/2012857306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f76e4072af0 0x7f76e407dbc0 secure :-1 s=CLOSED pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7f76d80040c0 tx=0x7f76d8002910 comp rx=0 tx=0).stop 2026-03-09T16:12:04.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.255+0000 7f76d57fa640 1 -- 192.168.123.103:0/2012857306 >> 192.168.123.103:0/2012857306 conn(0x7f76e406c7e0 msgr2=0x7f76e406fdf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:04.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.263+0000 7f76d57fa640 1 -- 192.168.123.103:0/2012857306 shutdown_connections 2026-03-09T16:12:04.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.264+0000 7f76d57fa640 1 -- 192.168.123.103:0/2012857306 wait complete. 2026-03-09T16:12:04.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:03 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:12:04.375 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:04.396 INFO:tasks.cephadm.ceph_manager.ceph:need seq 38654705676 got 38654705677 for osd.0 2026-03-09T16:12:04.396 DEBUG:teuthology.parallel:result is None 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.634+0000 7fe50793c640 1 -- 192.168.123.103:0/1515358817 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5000720e0 msgr2=0x7fe500072520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.634+0000 7fe50793c640 1 --2- 192.168.123.103:0/1515358817 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5000720e0 0x7fe500072520 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7fe4f000bb70 tx=0x7fe4f0030fe0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.634+0000 7fe50793c640 1 -- 192.168.123.103:0/1515358817 shutdown_connections 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.634+0000 7fe50793c640 1 --2- 192.168.123.103:0/1515358817 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5000720e0 0x7fe500072520 unknown :-1 s=CLOSED pgs=217 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.634+0000 7fe50793c640 1 --2- 192.168.123.103:0/1515358817 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50010d110 0x7fe50010d4f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.634+0000 7fe50793c640 1 -- 192.168.123.103:0/1515358817 >> 192.168.123.103:0/1515358817 conn(0x7fe50006b7f0 msgr2=0x7fe50006bc00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.634+0000 7fe50793c640 1 -- 192.168.123.103:0/1515358817 shutdown_connections 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.634+0000 
7fe50793c640 1 -- 192.168.123.103:0/1515358817 wait complete. 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.635+0000 7fe50793c640 1 Processor -- start 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.635+0000 7fe50793c640 1 -- start start 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.635+0000 7fe50793c640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50010d110 0x7fe50007d470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.635+0000 7fe50793c640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe50007d9b0 0x7fe50007de10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.635+0000 7fe50793c640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe500084490 con 0x7fe50007d9b0 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.635+0000 7fe50793c640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe5000845d0 con 0x7fe50010d110 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.638+0000 7fe5056b1640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50010d110 0x7fe50007d470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.638+0000 7fe5056b1640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50010d110 0x7fe50007d470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47618/0 (socket says 192.168.123.103:47618) 2026-03-09T16:12:04.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.638+0000 7fe5056b1640 1 -- 192.168.123.103:0/3541541449 learned_addr learned my addr 192.168.123.103:0/3541541449 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:04.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.638+0000 7fe5056b1640 1 -- 192.168.123.103:0/3541541449 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe50007d9b0 msgr2=0x7fe50007de10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:04.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.638+0000 7fe5056b1640 1 --2- 192.168.123.103:0/3541541449 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe50007d9b0 0x7fe50007de10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.638+0000 7fe5056b1640 1 -- 192.168.123.103:0/3541541449 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe4f000b820 con 0x7fe50010d110 2026-03-09T16:12:04.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.639+0000 7fe5056b1640 1 --2- 192.168.123.103:0/3541541449 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50010d110 0x7fe50007d470 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto 
rx=0x7fe4fc0136b0 tx=0x7fe4fc013b80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:04.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.639+0000 7fe4f67fc640 1 -- 192.168.123.103:0/3541541449 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe4fc00a0d0 con 0x7fe50010d110 2026-03-09T16:12:04.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.639+0000 7fe50793c640 1 -- 192.168.123.103:0/3541541449 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe500081f90 con 0x7fe50010d110 2026-03-09T16:12:04.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.639+0000 7fe50793c640 1 -- 192.168.123.103:0/3541541449 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe5000824e0 con 0x7fe50010d110 2026-03-09T16:12:04.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.639+0000 7fe4f67fc640 1 -- 192.168.123.103:0/3541541449 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe4fc004d60 con 0x7fe50010d110 2026-03-09T16:12:04.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.640+0000 7fe4f67fc640 1 -- 192.168.123.103:0/3541541449 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe4fc005740 con 0x7fe50010d110 2026-03-09T16:12:04.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.641+0000 7fe4f67fc640 1 -- 192.168.123.103:0/3541541449 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fe4fc022050 con 0x7fe50010d110 2026-03-09T16:12:04.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.644+0000 7fe4f67fc640 1 --2- 192.168.123.103:0/3541541449 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe4d8076290 0x7fe4d8078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.645+0000 7fe504eb0640 1 --2- 192.168.123.103:0/3541541449 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe4d8076290 0x7fe4d8078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:04.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.645+0000 7fe4f67fc640 1 -- 192.168.123.103:0/3541541449 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7fe4fc09be20 con 0x7fe50010d110 2026-03-09T16:12:04.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.646+0000 7fe504eb0640 1 --2- 192.168.123.103:0/3541541449 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe4d8076290 0x7fe4d8078750 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fe4f000b440 tx=0x7fe4f0005d90 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:04.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.646+0000 7fe50793c640 1 -- 192.168.123.103:0/3541541449 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe4d0005350 con 0x7fe50010d110 2026-03-09T16:12:04.653 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.651+0000 7fe4f67fc640 1 -- 192.168.123.103:0/3541541449 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fe4fc065770 con 0x7fe50010d110 2026-03-09T16:12:04.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.829+0000 7f32ed7dd640 1 -- 192.168.123.103:0/2954670950 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32e8072a40 msgr2=0x7f32e810ca90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:04.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.829+0000 7f32ed7dd640 1 --2- 192.168.123.103:0/2954670950 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32e8072a40 0x7f32e810ca90 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7f32d00099b0 tx=0x7f32d002f220 comp rx=0 tx=0).stop 2026-03-09T16:12:04.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.833+0000 7f32ed7dd640 1 -- 192.168.123.103:0/2954670950 shutdown_connections 2026-03-09T16:12:04.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.833+0000 7f32ed7dd640 1 --2- 192.168.123.103:0/2954670950 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32e8072a40 0x7f32e810ca90 unknown :-1 s=CLOSED pgs=218 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.833+0000 7f32ed7dd640 1 --2- 192.168.123.103:0/2954670950 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32e8072120 0x7f32e8072500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.833+0000 7f32ed7dd640 1 -- 192.168.123.103:0/2954670950 >> 192.168.123.103:0/2954670950 conn(0x7f32e806c7d0 msgr2=0x7f32e806cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:04.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.833+0000 7f32ed7dd640 1 -- 192.168.123.103:0/2954670950 shutdown_connections 2026-03-09T16:12:04.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.834+0000 7f32ed7dd640 1 -- 192.168.123.103:0/2954670950 wait complete. 
2026-03-09T16:12:04.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.834+0000 7f32ed7dd640 1 Processor -- start 2026-03-09T16:12:04.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.837+0000 7f32ed7dd640 1 -- start start 2026-03-09T16:12:04.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.838+0000 7f32ed7dd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32e8072120 0x7f32e81a7600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.838+0000 7f32ed7dd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32e81a7b40 0x7f32e81abf40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.838+0000 7f32ed7dd640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f32e81a8140 con 0x7f32e81a7b40 2026-03-09T16:12:04.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.838+0000 7f32ed7dd640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f32e81a82b0 con 0x7f32e8072120 2026-03-09T16:12:04.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.846+0000 7f32e7fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32e8072120 0x7f32e81a7600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:04.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.849+0000 7f32e7fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32e8072120 0x7f32e81a7600 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47630/0 (socket says 192.168.123.103:47630) 2026-03-09T16:12:04.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.849+0000 7f32e7fff640 1 -- 192.168.123.103:0/1382645047 learned_addr learned my addr 192.168.123.103:0/1382645047 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:04.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.849+0000 7f32e77fe640 1 --2- 192.168.123.103:0/1382645047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32e81a7b40 0x7f32e81abf40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:04.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.852+0000 7f32e7fff640 1 -- 192.168.123.103:0/1382645047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32e81a7b40 msgr2=0x7f32e81abf40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:04.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.852+0000 7f32e7fff640 1 --2- 192.168.123.103:0/1382645047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32e81a7b40 0x7f32e81abf40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.853+0000 7f32e7fff640 1 -- 192.168.123.103:0/1382645047 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f32d0009660 con 0x7f32e8072120 2026-03-09T16:12:04.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.854+0000 7f32e7fff640 1 --2- 192.168.123.103:0/1382645047 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32e8072120 0x7f32e81a7600 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f32dc00e9b0 tx=0x7f32dc00ee80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:04.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.854+0000 7f32e57fa640 1 -- 192.168.123.103:0/1382645047 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f32dc00cd90 con 0x7f32e8072120 2026-03-09T16:12:04.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.854+0000 7f32ed7dd640 1 -- 192.168.123.103:0/1382645047 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f32e81ac540 con 0x7f32e8072120 2026-03-09T16:12:04.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.854+0000 7f32ed7dd640 1 -- 192.168.123.103:0/1382645047 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f32e81aca90 con 0x7f32e8072120 2026-03-09T16:12:04.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.857+0000 7f32e57fa640 1 -- 192.168.123.103:0/1382645047 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f32dc004590 con 0x7f32e8072120 2026-03-09T16:12:04.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.857+0000 7f32e57fa640 1 -- 192.168.123.103:0/1382645047 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f32dc010640 con 0x7f32e8072120 2026-03-09T16:12:04.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.859+0000 7f32e57fa640 1 -- 192.168.123.103:0/1382645047 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f32dc0107a0 con 0x7f32e8072120 2026-03-09T16:12:04.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.861+0000 7f32e57fa640 1 --2- 192.168.123.103:0/1382645047 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f32b8076290 0x7f32b8078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.864+0000 7f32e77fe640 1 --2- 192.168.123.103:0/1382645047 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f32b8076290 0x7f32b8078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:04.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.864+0000 7f32e57fa640 1 -- 192.168.123.103:0/1382645047 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f32dc014070 con 0x7f32e8072120 2026-03-09T16:12:04.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.864+0000 7f32e77fe640 1 --2- 192.168.123.103:0/1382645047 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f32b8076290 0x7f32b8078750 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f32d0002410 tx=0x7f32d003a040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:04.865 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.865+0000 7f32ed7dd640 1 -- 192.168.123.103:0/1382645047 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f32b4005350 con 0x7f32e8072120 2026-03-09T16:12:04.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.878+0000 7f32e57fa640 1 -- 192.168.123.103:0/1382645047 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f32dc062660 con 0x7f32e8072120 2026-03-09T16:12:04.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.921+0000 7f4cfffff640 1 -- 192.168.123.103:0/1237390519 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4d00072120 msgr2=0x7f4d00072500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:04.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.921+0000 7f4cfffff640 1 --2- 192.168.123.103:0/1237390519 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4d00072120 0x7f4d00072500 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7f4cf40099b0 tx=0x7f4cf402f240 comp rx=0 tx=0).stop 2026-03-09T16:12:04.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.925+0000 7f4cfffff640 1 -- 192.168.123.103:0/1237390519 shutdown_connections 2026-03-09T16:12:04.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.925+0000 7f4cfffff640 1 --2- 192.168.123.103:0/1237390519 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d00072a40 0x7f4d0010ca90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.925+0000 7f4cfffff640 1 --2- 192.168.123.103:0/1237390519 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4d00072120 0x7f4d00072500 unknown :-1 s=CLOSED pgs=219 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.925+0000 7f4cfffff640 1 -- 192.168.123.103:0/1237390519 >> 192.168.123.103:0/1237390519 conn(0x7f4d0006c7d0 msgr2=0x7f4d0006cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:04.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.926+0000 7f4cfffff640 1 -- 192.168.123.103:0/1237390519 shutdown_connections 2026-03-09T16:12:04.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.926+0000 7f4cfffff640 1 -- 192.168.123.103:0/1237390519 wait complete. 
2026-03-09T16:12:04.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.926+0000 7f4cfffff640 1 Processor -- start 2026-03-09T16:12:04.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.926+0000 7f4cfffff640 1 -- start start 2026-03-09T16:12:04.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.927+0000 7f4cfffff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d00072120 0x7f4d001a75e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.927+0000 7f4cfffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4d00072a40 0x7f4d001a7b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.927+0000 7f4cfffff640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d001a81b0 con 0x7f4d00072a40 2026-03-09T16:12:04.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.927+0000 7f4cfffff640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d001abf20 con 0x7f4d00072120 2026-03-09T16:12:04.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.927+0000 7f4cfe7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4d00072a40 0x7f4d001a7b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:04.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.927+0000 7f4cfeffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d00072120 0x7f4d001a75e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:04.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.927+0000 7f4cfeffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d00072120 0x7f4d001a75e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47652/0 (socket says 192.168.123.103:47652) 2026-03-09T16:12:04.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.927+0000 7f4cfeffd640 1 -- 192.168.123.103:0/1123688902 learned_addr learned my addr 192.168.123.103:0/1123688902 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:04.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.928+0000 7f4cfe7fc640 1 -- 192.168.123.103:0/1123688902 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d00072120 msgr2=0x7f4d001a75e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:04.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.928+0000 7f4cfe7fc640 1 --2- 192.168.123.103:0/1123688902 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d00072120 0x7f4d001a75e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.928+0000 7f4cfe7fc640 1 -- 192.168.123.103:0/1123688902 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4cf4009660 con 0x7f4d00072a40 
2026-03-09T16:12:04.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.928+0000 7f4cfe7fc640 1 --2- 192.168.123.103:0/1123688902 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4d00072a40 0x7f4d001a7b20 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f4ce800e970 tx=0x7f4ce800ee40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:04.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.928+0000 7f4cdffff640 1 -- 192.168.123.103:0/1123688902 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4ce800ccb0 con 0x7f4d00072a40 2026-03-09T16:12:04.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.928+0000 7f4cdffff640 1 -- 192.168.123.103:0/1123688902 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f4ce8004590 con 0x7f4d00072a40 2026-03-09T16:12:04.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.928+0000 7f4cfffff640 1 -- 192.168.123.103:0/1123688902 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4d001ac200 con 0x7f4d00072a40 2026-03-09T16:12:04.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.928+0000 7f4cfffff640 1 -- 192.168.123.103:0/1123688902 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4d001ac750 con 0x7f4d00072a40 2026-03-09T16:12:04.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.929+0000 7f9bfddc6640 1 -- 192.168.123.103:0/3596413146 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bf8072cf0 msgr2=0x7f9bf810cd90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:04.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.929+0000 7f9bfddc6640 1 --2- 192.168.123.103:0/3596413146 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bf8072cf0 0x7f9bf810cd90 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f9be00098e0 tx=0x7f9be002f1b0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.930+0000 7f9bfddc6640 1 -- 192.168.123.103:0/3596413146 shutdown_connections 2026-03-09T16:12:04.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.930+0000 7f9bfddc6640 1 --2- 192.168.123.103:0/3596413146 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bf8072cf0 0x7f9bf810cd90 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.930+0000 7f9bfddc6640 1 --2- 192.168.123.103:0/3596413146 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9bf8072340 0x7f9bf8072720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.930+0000 7f9bfddc6640 1 -- 192.168.123.103:0/3596413146 >> 192.168.123.103:0/3596413146 conn(0x7f9bf806b7f0 msgr2=0x7f9bf806bc00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:04.933 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.932+0000 7f4cfffff640 1 -- 192.168.123.103:0/1123688902 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4d00108570 con 0x7f4d00072a40 2026-03-09T16:12:04.934 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:04 
vm03 ceph-mon[51019]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:04.934 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:04 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/2012857306' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-09T16:12:04.935 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.934+0000 7f9bfddc6640 1 -- 192.168.123.103:0/3596413146 shutdown_connections 2026-03-09T16:12:04.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.932+0000 7f4cdffff640 1 -- 192.168.123.103:0/1123688902 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4ce8010640 con 0x7f4d00072a40 2026-03-09T16:12:04.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.932+0000 7f4cdffff640 1 -- 192.168.123.103:0/1123688902 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f4ce8010870 con 0x7f4d00072a40 2026-03-09T16:12:04.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.934+0000 7f4cdffff640 1 --2- 192.168.123.103:0/1123688902 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f4cd40761c0 0x7f4cd4078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.934+0000 7f4cdffff640 1 -- 192.168.123.103:0/1123688902 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f4ce8014070 con 0x7f4d00072a40 2026-03-09T16:12:04.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.937+0000 7f4cfeffd640 1 --2- 192.168.123.103:0/1123688902 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f4cd40761c0 0x7f4cd4078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:04.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.938+0000 7f4cfeffd640 1 --2- 192.168.123.103:0/1123688902 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f4cd40761c0 0x7f4cd4078680 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f4cf4002410 tx=0x7f4cf403a040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:04.940 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.939+0000 7f4cdffff640 1 -- 192.168.123.103:0/1123688902 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f4ce8062570 con 0x7f4d00072a40 2026-03-09T16:12:04.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.939+0000 7f9bfddc6640 1 -- 192.168.123.103:0/3596413146 wait complete. 
2026-03-09T16:12:04.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.940+0000 7f9bfddc6640 1 Processor -- start 2026-03-09T16:12:04.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.940+0000 7f9bfddc6640 1 -- start start 2026-03-09T16:12:04.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.940+0000 7f9bfddc6640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9bf8072340 0x7f9bf81ad620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.940+0000 7f9bfddc6640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bf8072cf0 0x7f9bf81adb60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.940+0000 7f9bfddc6640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9bf81a7710 con 0x7f9bf8072340 2026-03-09T16:12:04.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.940+0000 7f9bfddc6640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9bf81a7880 con 0x7f9bf8072cf0 2026-03-09T16:12:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.941+0000 7f9beffff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bf8072cf0 0x7f9bf81adb60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.941+0000 7f9beffff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bf8072cf0 0x7f9bf81adb60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47654/0 (socket says 192.168.123.103:47654) 2026-03-09T16:12:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.941+0000 7f9beffff640 1 -- 192.168.123.103:0/1804096548 learned_addr learned my addr 192.168.123.103:0/1804096548 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.941+0000 7f9beffff640 1 -- 192.168.123.103:0/1804096548 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9bf8072340 msgr2=0x7f9bf81ad620 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:12:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.941+0000 7f9beffff640 1 --2- 192.168.123.103:0/1804096548 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9bf8072340 0x7f9bf81ad620 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.941+0000 7f9beffff640 1 -- 192.168.123.103:0/1804096548 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9be0009590 con 0x7f9bf8072cf0 2026-03-09T16:12:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.941+0000 7f9beffff640 1 --2- 192.168.123.103:0/1804096548 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bf8072cf0 0x7f9bf81adb60 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f9be002f6c0 tx=0x7f9be0004290 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.941+0000 7f9bf57fa640 1 -- 192.168.123.103:0/1804096548 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9be003d070 con 0x7f9bf8072cf0 2026-03-09T16:12:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.941+0000 7f9bf57fa640 1 -- 192.168.123.103:0/1804096548 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9be0004510 con 0x7f9bf8072cf0 2026-03-09T16:12:04.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.941+0000 7f9bf57fa640 1 -- 192.168.123.103:0/1804096548 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9be0031070 con 0x7f9bf8072cf0 2026-03-09T16:12:04.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.943+0000 7f9bfddc6640 1 -- 192.168.123.103:0/1804096548 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9bf81a79c0 con 0x7f9bf8072cf0 2026-03-09T16:12:04.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.943+0000 7f9bfddc6640 1 -- 192.168.123.103:0/1804096548 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9bf81a7f10 con 0x7f9bf8072cf0 2026-03-09T16:12:04.945 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.944+0000 7f9bfddc6640 1 -- 192.168.123.103:0/1804096548 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9bb8005350 con 0x7f9bf8072cf0 2026-03-09T16:12:04.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.944+0000 7f9bf57fa640 1 -- 192.168.123.103:0/1804096548 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f9be00384b0 con 0x7f9bf8072cf0 2026-03-09T16:12:04.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.945+0000 7f9bf57fa640 1 --2- 192.168.123.103:0/1804096548 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9bd40989d0 0x7f9bd409ae90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.945+0000 7f9bf57fa640 1 -- 192.168.123.103:0/1804096548 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f9be00bbb00 con 0x7f9bf8072cf0 2026-03-09T16:12:04.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.950+0000 7f9bf77fe640 1 --2- 192.168.123.103:0/1804096548 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9bd40989d0 0x7f9bd409ae90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:04.953 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.952+0000 7f9bf57fa640 1 -- 192.168.123.103:0/1804096548 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f9be00853d0 con 0x7f9bf8072cf0 2026-03-09T16:12:04.953 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.953+0000 7f9bf77fe640 1 --2- 192.168.123.103:0/1804096548 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9bd40989d0 0x7f9bd409ae90 secure :-1 s=READY 
pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f9bd8009b70 tx=0x7f9bd8009340 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:04.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.974+0000 7f7773724640 1 -- 192.168.123.103:0/4085149932 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f776c076040 msgr2=0x7f776c111160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:04.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.974+0000 7f7773724640 1 --2- 192.168.123.103:0/4085149932 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f776c076040 0x7f776c111160 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7f7758009a00 tx=0x7f775802f290 comp rx=0 tx=0).stop 2026-03-09T16:12:04.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.979+0000 7f7773724640 1 -- 192.168.123.103:0/4085149932 shutdown_connections 2026-03-09T16:12:04.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.979+0000 7f7773724640 1 --2- 192.168.123.103:0/4085149932 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f776c076040 0x7f776c111160 unknown :-1 s=CLOSED pgs=221 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.979+0000 7f7773724640 1 --2- 192.168.123.103:0/4085149932 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f776c075720 0x7f776c075b00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.979+0000 7f7773724640 1 -- 192.168.123.103:0/4085149932 >> 192.168.123.103:0/4085149932 conn(0x7f776c0fe540 msgr2=0x7f776c100960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:04.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.979+0000 7f7773724640 1 -- 192.168.123.103:0/4085149932 shutdown_connections 2026-03-09T16:12:04.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.979+0000 7f7773724640 1 -- 192.168.123.103:0/4085149932 wait complete. 
2026-03-09T16:12:04.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.979+0000 7f7773724640 1 Processor -- start 2026-03-09T16:12:04.982 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.981+0000 7f7773724640 1 -- start start 2026-03-09T16:12:04.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.983+0000 7f7773724640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f776c075720 0x7f776c19ecd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.983+0000 7f7773724640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f776c076040 0x7f776c19f210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.983+0000 7f7773724640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f776c19f8f0 con 0x7f776c076040 2026-03-09T16:12:04.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.983+0000 7f7773724640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f776c1a2560 con 0x7f776c075720 2026-03-09T16:12:04.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.983+0000 7f7771f21640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f776c076040 0x7f776c19f210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:04.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.983+0000 7f7771f21640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f776c076040 0x7f776c19f210 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56024/0 (socket says 192.168.123.103:56024) 2026-03-09T16:12:04.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.983+0000 7f7771f21640 1 -- 192.168.123.103:0/584587005 learned_addr learned my addr 192.168.123.103:0/584587005 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:04.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.983+0000 7f7771f21640 1 -- 192.168.123.103:0/584587005 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f776c075720 msgr2=0x7f776c19ecd0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:12:04.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.983+0000 7f7771f21640 1 --2- 192.168.123.103:0/584587005 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f776c075720 0x7f776c19ecd0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.983+0000 7f7771f21640 1 -- 192.168.123.103:0/584587005 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7758009660 con 0x7f776c076040 2026-03-09T16:12:04.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.983+0000 7f7771f21640 1 --2- 192.168.123.103:0/584587005 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f776c076040 0x7f776c19f210 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f775802f7a0 tx=0x7f7758031d40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:04.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.984+0000 7f77577fe640 1 -- 192.168.123.103:0/584587005 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7758002a30 con 0x7f776c076040 2026-03-09T16:12:04.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.984+0000 7f77577fe640 1 -- 192.168.123.103:0/584587005 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f7758031eb0 con 0x7f776c076040 2026-03-09T16:12:04.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.984+0000 7f77577fe640 1 -- 192.168.123.103:0/584587005 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7758031280 con 0x7f776c076040 2026-03-09T16:12:04.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.984+0000 7f7773724640 1 -- 192.168.123.103:0/584587005 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f776c1a27e0 con 0x7f776c076040 2026-03-09T16:12:04.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.985+0000 7f7773724640 1 -- 192.168.123.103:0/584587005 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f776c1a2c50 con 0x7f776c076040 2026-03-09T16:12:04.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.986+0000 7f77577fe640 1 -- 192.168.123.103:0/584587005 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f775803f070 con 0x7f776c076040 2026-03-09T16:12:04.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.987+0000 7f7773724640 1 -- 192.168.123.103:0/584587005 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f773c005350 con 0x7f776c076040 2026-03-09T16:12:04.993 INFO:teuthology.orchestra.run.vm03.stdout:107374182406 2026-03-09T16:12:04.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.990+0000 7fe50793c640 1 -- 192.168.123.103:0/3541541449 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 3} v 0) v1 -- 0x7fe4d00051c0 con 0x7fe50010d110 2026-03-09T16:12:04.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.991+0000 7fe4f67fc640 1 -- 192.168.123.103:0/3541541449 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 3}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7fe4fc00a450 con 0x7fe50010d110 2026-03-09T16:12:04.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.993+0000 7f77577fe640 1 --2- 192.168.123.103:0/584587005 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7740076000 0x7f77400784c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:04.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.993+0000 7f77577fe640 1 -- 192.168.123.103:0/584587005 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f77580c0170 con 0x7f776c076040 2026-03-09T16:12:04.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.993+0000 7f77577fe640 1 -- 192.168.123.103:0/584587005 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f77580c1050 con 
0x7f776c076040 2026-03-09T16:12:04.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.993+0000 7f7772722640 1 --2- 192.168.123.103:0/584587005 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7740076000 0x7f77400784c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:04.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.994+0000 7f7772722640 1 --2- 192.168.123.103:0/584587005 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7740076000 0x7f77400784c0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f7764009770 tx=0x7f7764006cd0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:04.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.995+0000 7fe4cbfff640 1 -- 192.168.123.103:0/3541541449 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe4d8076290 msgr2=0x7fe4d8078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:04.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.995+0000 7fe4cbfff640 1 --2- 192.168.123.103:0/3541541449 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe4d8076290 0x7fe4d8078750 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fe4f000b440 tx=0x7fe4f0005d90 comp rx=0 tx=0).stop 2026-03-09T16:12:04.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.997+0000 7fe4cbfff640 1 -- 192.168.123.103:0/3541541449 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50010d110 msgr2=0x7fe50007d470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:04.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.997+0000 7fe4cbfff640 1 --2- 192.168.123.103:0/3541541449 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50010d110 0x7fe50007d470 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7fe4fc0136b0 tx=0x7fe4fc013b80 comp rx=0 tx=0).stop 2026-03-09T16:12:04.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.997+0000 7fe4cbfff640 1 -- 192.168.123.103:0/3541541449 shutdown_connections 2026-03-09T16:12:04.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.997+0000 7fe4cbfff640 1 --2- 192.168.123.103:0/3541541449 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe4d8076290 0x7fe4d8078750 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.997+0000 7fe4cbfff640 1 --2- 192.168.123.103:0/3541541449 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe50007d9b0 0x7fe50007de10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.997+0000 7fe4cbfff640 1 --2- 192.168.123.103:0/3541541449 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50010d110 0x7fe50007d470 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:04.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.997+0000 7fe4cbfff640 1 -- 192.168.123.103:0/3541541449 >> 192.168.123.103:0/3541541449 conn(0x7fe50006b7f0 msgr2=0x7fe5000715f0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T16:12:04.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.997+0000 7fe4cbfff640 1 -- 192.168.123.103:0/3541541449 shutdown_connections 2026-03-09T16:12:04.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:04.997+0000 7fe4cbfff640 1 -- 192.168.123.103:0/3541541449 wait complete. 2026-03-09T16:12:05.065 INFO:tasks.cephadm.ceph_manager.ceph:need seq 107374182406 got 107374182406 for osd.3 2026-03-09T16:12:05.065 DEBUG:teuthology.parallel:result is None 2026-03-09T16:12:05.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.154+0000 7f32ed7dd640 1 -- 192.168.123.103:0/1382645047 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 1} v 0) v1 -- 0x7f32b40051c0 con 0x7f32e8072120 2026-03-09T16:12:05.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.156+0000 7f32e57fa640 1 -- 192.168.123.103:0/1382645047 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 1}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f32dc062000 con 0x7f32e8072120 2026-03-09T16:12:05.161 INFO:teuthology.orchestra.run.vm03.stdout:60129542154 2026-03-09T16:12:05.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.164+0000 7f32c6ffd640 1 -- 192.168.123.103:0/1382645047 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f32b8076290 msgr2=0x7f32b8078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:05.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.164+0000 7f32c6ffd640 1 --2- 192.168.123.103:0/1382645047 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f32b8076290 0x7f32b8078750 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f32d0002410 tx=0x7f32d003a040 comp rx=0 tx=0).stop 2026-03-09T16:12:05.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.164+0000 7f32c6ffd640 1 -- 192.168.123.103:0/1382645047 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32e8072120 msgr2=0x7f32e81a7600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:05.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.164+0000 7f32c6ffd640 1 --2- 192.168.123.103:0/1382645047 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32e8072120 0x7f32e81a7600 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f32dc00e9b0 tx=0x7f32dc00ee80 comp rx=0 tx=0).stop 2026-03-09T16:12:05.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.164+0000 7f32c6ffd640 1 -- 192.168.123.103:0/1382645047 shutdown_connections 2026-03-09T16:12:05.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.164+0000 7f32c6ffd640 1 --2- 192.168.123.103:0/1382645047 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f32b8076290 0x7f32b8078750 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.164+0000 7f32c6ffd640 1 --2- 192.168.123.103:0/1382645047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32e81a7b40 0x7f32e81abf40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.164+0000 7f32c6ffd640 1 --2- 192.168.123.103:0/1382645047 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32e8072120 0x7f32e81a7600 
unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.164+0000 7f32c6ffd640 1 -- 192.168.123.103:0/1382645047 >> 192.168.123.103:0/1382645047 conn(0x7f32e806c7d0 msgr2=0x7f32e8070eb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:05.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.165+0000 7f32c6ffd640 1 -- 192.168.123.103:0/1382645047 shutdown_connections 2026-03-09T16:12:05.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.165+0000 7f32c6ffd640 1 -- 192.168.123.103:0/1382645047 wait complete. 2026-03-09T16:12:05.166 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.163+0000 7f9bfddc6640 1 -- 192.168.123.103:0/1804096548 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 4} v 0) v1 -- 0x7f9bb80051c0 con 0x7f9bf8072cf0 2026-03-09T16:12:05.166 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.164+0000 7f9bf57fa640 1 -- 192.168.123.103:0/1804096548 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 4}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f9be0046020 con 0x7f9bf8072cf0 2026-03-09T16:12:05.167 INFO:teuthology.orchestra.run.vm03.stdout:128849018885 2026-03-09T16:12:05.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.172+0000 7f9bfddc6640 1 -- 192.168.123.103:0/1804096548 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9bd40989d0 msgr2=0x7f9bd409ae90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:05.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.172+0000 7f9bfddc6640 1 --2- 192.168.123.103:0/1804096548 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9bd40989d0 0x7f9bd409ae90 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f9bd8009b70 tx=0x7f9bd8009340 comp rx=0 tx=0).stop 2026-03-09T16:12:05.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.172+0000 7f9bfddc6640 1 -- 192.168.123.103:0/1804096548 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bf8072cf0 msgr2=0x7f9bf81adb60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:05.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.175+0000 7f9bfddc6640 1 --2- 192.168.123.103:0/1804096548 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bf8072cf0 0x7f9bf81adb60 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f9be002f6c0 tx=0x7f9be0004290 comp rx=0 tx=0).stop 2026-03-09T16:12:05.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.175+0000 7f9bfddc6640 1 -- 192.168.123.103:0/1804096548 shutdown_connections 2026-03-09T16:12:05.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.175+0000 7f9bfddc6640 1 --2- 192.168.123.103:0/1804096548 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9bd40989d0 0x7f9bd409ae90 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.175+0000 7f9bfddc6640 1 --2- 192.168.123.103:0/1804096548 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bf8072cf0 0x7f9bf81adb60 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.176 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.175+0000 7f9bfddc6640 1 --2- 192.168.123.103:0/1804096548 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9bf8072340 0x7f9bf81ad620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.175+0000 7f9bfddc6640 1 -- 192.168.123.103:0/1804096548 >> 192.168.123.103:0/1804096548 conn(0x7f9bf806b7f0 msgr2=0x7f9bf810e5c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:05.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.175+0000 7f9bfddc6640 1 -- 192.168.123.103:0/1804096548 shutdown_connections 2026-03-09T16:12:05.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.177+0000 7f9bfddc6640 1 -- 192.168.123.103:0/1804096548 wait complete. 2026-03-09T16:12:05.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.230+0000 7f4cfffff640 1 -- 192.168.123.103:0/1123688902 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 5} v 0) v1 -- 0x7f4d0011c9c0 con 0x7f4d00072a40 2026-03-09T16:12:05.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.231+0000 7f4cdffff640 1 -- 192.168.123.103:0/1123688902 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 5}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f4ce8061f10 con 0x7f4d00072a40 2026-03-09T16:12:05.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.232+0000 7f7773724640 1 -- 192.168.123.103:0/584587005 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 2} v 0) v1 -- 0x7f773c0058d0 con 0x7f776c076040 2026-03-09T16:12:05.236 INFO:teuthology.orchestra.run.vm03.stdout:146028888067 2026-03-09T16:12:05.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.236+0000 7f77577fe640 1 -- 192.168.123.103:0/584587005 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 2}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f7758038e60 con 0x7f776c076040 2026-03-09T16:12:05.237 INFO:teuthology.orchestra.run.vm03.stdout:77309411336 2026-03-09T16:12:05.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.237+0000 7f4cfffff640 1 -- 192.168.123.103:0/1123688902 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f4cd40761c0 msgr2=0x7f4cd4078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:05.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.237+0000 7f4cfffff640 1 --2- 192.168.123.103:0/1123688902 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f4cd40761c0 0x7f4cd4078680 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f4cf4002410 tx=0x7f4cf403a040 comp rx=0 tx=0).stop 2026-03-09T16:12:05.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.238+0000 7f4cfffff640 1 -- 192.168.123.103:0/1123688902 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4d00072a40 msgr2=0x7f4d001a7b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:05.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.238+0000 7f4cfffff640 1 --2- 192.168.123.103:0/1123688902 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4d00072a40 0x7f4d001a7b20 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f4ce800e970 tx=0x7f4ce800ee40 comp 
rx=0 tx=0).stop 2026-03-09T16:12:05.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.238+0000 7f4cfffff640 1 -- 192.168.123.103:0/1123688902 shutdown_connections 2026-03-09T16:12:05.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.238+0000 7f4cfffff640 1 --2- 192.168.123.103:0/1123688902 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f4cd40761c0 0x7f4cd4078680 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.238+0000 7f4cfffff640 1 --2- 192.168.123.103:0/1123688902 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4d00072a40 0x7f4d001a7b20 unknown :-1 s=CLOSED pgs=220 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.238+0000 7f4cfffff640 1 --2- 192.168.123.103:0/1123688902 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d00072120 0x7f4d001a75e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.238+0000 7f4cfffff640 1 -- 192.168.123.103:0/1123688902 >> 192.168.123.103:0/1123688902 conn(0x7f4d0006c7d0 msgr2=0x7f4d0010dde0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:05.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.238+0000 7f4cfffff640 1 -- 192.168.123.103:0/1123688902 shutdown_connections 2026-03-09T16:12:05.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.238+0000 7f4cfffff640 1 -- 192.168.123.103:0/1123688902 wait complete. 2026-03-09T16:12:05.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.242+0000 7f7773724640 1 -- 192.168.123.103:0/584587005 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7740076000 msgr2=0x7f77400784c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:05.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.244+0000 7f7773724640 1 --2- 192.168.123.103:0/584587005 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7740076000 0x7f77400784c0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f7764009770 tx=0x7f7764006cd0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.244+0000 7f7773724640 1 -- 192.168.123.103:0/584587005 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f776c076040 msgr2=0x7f776c19f210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:05.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.244+0000 7f7773724640 1 --2- 192.168.123.103:0/584587005 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f776c076040 0x7f776c19f210 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f775802f7a0 tx=0x7f7758031d40 comp rx=0 tx=0).stop 2026-03-09T16:12:05.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.244+0000 7f7773724640 1 -- 192.168.123.103:0/584587005 shutdown_connections 2026-03-09T16:12:05.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.244+0000 7f7773724640 1 --2- 192.168.123.103:0/584587005 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f7740076000 0x7f77400784c0 secure :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f7764009770 tx=0x7f7764006cd0 comp rx=0 
tx=0).stop 2026-03-09T16:12:05.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.244+0000 7f7773724640 1 --2- 192.168.123.103:0/584587005 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f776c076040 0x7f776c19f210 secure :-1 s=CLOSED pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f775802f7a0 tx=0x7f7758031d40 comp rx=0 tx=0).stop 2026-03-09T16:12:05.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.244+0000 7f7773724640 1 --2- 192.168.123.103:0/584587005 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f776c075720 0x7f776c19ecd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.244+0000 7f7773724640 1 -- 192.168.123.103:0/584587005 >> 192.168.123.103:0/584587005 conn(0x7f776c0fe540 msgr2=0x7f776c0ffc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:05.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.245+0000 7f7773724640 1 -- 192.168.123.103:0/584587005 shutdown_connections 2026-03-09T16:12:05.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.245+0000 7f7773724640 1 -- 192.168.123.103:0/584587005 wait complete. 2026-03-09T16:12:05.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:04 vm05 ceph-mon[58702]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:05.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:04 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/2012857306' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-09T16:12:05.318 INFO:tasks.cephadm.ceph_manager.ceph:need seq 60129542154 got 60129542154 for osd.1 2026-03-09T16:12:05.318 DEBUG:teuthology.parallel:result is None 2026-03-09T16:12:05.326 INFO:tasks.cephadm.ceph_manager.ceph:need seq 146028888067 got 146028888067 for osd.5 2026-03-09T16:12:05.326 DEBUG:teuthology.parallel:result is None 2026-03-09T16:12:05.339 INFO:tasks.cephadm.ceph_manager.ceph:need seq 128849018885 got 128849018885 for osd.4 2026-03-09T16:12:05.339 DEBUG:teuthology.parallel:result is None 2026-03-09T16:12:05.349 INFO:tasks.cephadm.ceph_manager.ceph:need seq 77309411336 got 77309411336 for osd.2 2026-03-09T16:12:05.349 DEBUG:teuthology.parallel:result is None 2026-03-09T16:12:05.349 INFO:tasks.cephadm.ceph_manager.ceph:waiting for clean 2026-03-09T16:12:05.349 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph pg dump --format=json 2026-03-09T16:12:05.555 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:05.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.799+0000 7efd387a5640 1 -- 192.168.123.103:0/289403377 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd30106780 msgr2=0x7efd30106b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:05.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.799+0000 7efd387a5640 1 --2- 192.168.123.103:0/289403377 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd30106780 0x7efd30106b60 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7efd180099b0 tx=0x7efd1802f220 comp rx=0 tx=0).stop 2026-03-09T16:12:05.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.800+0000 7efd387a5640 1 -- 
192.168.123.103:0/289403377 shutdown_connections 2026-03-09T16:12:05.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.800+0000 7efd387a5640 1 --2- 192.168.123.103:0/289403377 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efd30100780 0x7efd30100be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.800+0000 7efd387a5640 1 --2- 192.168.123.103:0/289403377 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd30106780 0x7efd30106b60 secure :-1 s=CLOSED pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7efd180099b0 tx=0x7efd1802f220 comp rx=0 tx=0).stop 2026-03-09T16:12:05.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.800+0000 7efd387a5640 1 -- 192.168.123.103:0/289403377 >> 192.168.123.103:0/289403377 conn(0x7efd300fc460 msgr2=0x7efd300fe880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:05.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.800+0000 7efd387a5640 1 -- 192.168.123.103:0/289403377 shutdown_connections 2026-03-09T16:12:05.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.800+0000 7efd387a5640 1 -- 192.168.123.103:0/289403377 wait complete. 2026-03-09T16:12:05.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.800+0000 7efd387a5640 1 Processor -- start 2026-03-09T16:12:05.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.800+0000 7efd387a5640 1 -- start start 2026-03-09T16:12:05.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.800+0000 7efd387a5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd30100780 0x7efd3019b3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:05.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.800+0000 7efd387a5640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efd3019b930 0x7efd30196460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:05.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.800+0000 7efd387a5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd301969a0 con 0x7efd30100780 2026-03-09T16:12:05.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.800+0000 7efd387a5640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd30196b10 con 0x7efd3019b930 2026-03-09T16:12:05.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.801+0000 7efd35d19640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efd3019b930 0x7efd30196460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:05.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.801+0000 7efd35d19640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efd3019b930 0x7efd30196460 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47676/0 (socket says 192.168.123.103:47676) 2026-03-09T16:12:05.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.801+0000 7efd35d19640 1 -- 192.168.123.103:0/3030539719 learned_addr learned my addr 192.168.123.103:0/3030539719 (peer_addr_for_me 
v2:192.168.123.103:0/0) 2026-03-09T16:12:05.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.801+0000 7efd3651a640 1 --2- 192.168.123.103:0/3030539719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd30100780 0x7efd3019b3f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:05.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.801+0000 7efd35d19640 1 -- 192.168.123.103:0/3030539719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd30100780 msgr2=0x7efd3019b3f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:05.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.801+0000 7efd35d19640 1 --2- 192.168.123.103:0/3030539719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd30100780 0x7efd3019b3f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.801+0000 7efd35d19640 1 -- 192.168.123.103:0/3030539719 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efd18009660 con 0x7efd3019b930 2026-03-09T16:12:05.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.801+0000 7efd3651a640 1 --2- 192.168.123.103:0/3030539719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd30100780 0x7efd3019b3f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:12:05.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.801+0000 7efd35d19640 1 --2- 192.168.123.103:0/3030539719 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efd3019b930 0x7efd30196460 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7efd2000ca30 tx=0x7efd2000cf00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:05.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.802+0000 7efd277fe640 1 -- 192.168.123.103:0/3030539719 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efd20004430 con 0x7efd3019b930 2026-03-09T16:12:05.803 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.802+0000 7efd387a5640 1 -- 192.168.123.103:0/3030539719 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efd30196d10 con 0x7efd3019b930 2026-03-09T16:12:05.803 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.802+0000 7efd387a5640 1 -- 192.168.123.103:0/3030539719 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efd30197230 con 0x7efd3019b930 2026-03-09T16:12:05.803 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.802+0000 7efd277fe640 1 -- 192.168.123.103:0/3030539719 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7efd20004590 con 0x7efd3019b930 2026-03-09T16:12:05.803 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.802+0000 7efd277fe640 1 -- 192.168.123.103:0/3030539719 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efd2000f660 con 0x7efd3019b930 2026-03-09T16:12:05.804 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.803+0000 7efd257fa640 1 -- 192.168.123.103:0/3030539719 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efd30101ec0 con 0x7efd3019b930 2026-03-09T16:12:05.804 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.803+0000 7efd277fe640 1 -- 192.168.123.103:0/3030539719 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7efd20002870 con 0x7efd3019b930 2026-03-09T16:12:05.804 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.804+0000 7efd277fe640 1 --2- 192.168.123.103:0/3030539719 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7efd0c0761c0 0x7efd0c078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:05.804 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.804+0000 7efd277fe640 1 -- 192.168.123.103:0/3030539719 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7efd200975a0 con 0x7efd3019b930 2026-03-09T16:12:05.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.804+0000 7efd3651a640 1 --2- 192.168.123.103:0/3030539719 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7efd0c0761c0 0x7efd0c078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:05.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.804+0000 7efd3651a640 1 --2- 192.168.123.103:0/3030539719 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7efd0c0761c0 0x7efd0c078680 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7efd18002410 tx=0x7efd18005c50 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:05.807 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.806+0000 7efd277fe640 1 -- 192.168.123.103:0/3030539719 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7efd20060e70 con 0x7efd3019b930 2026-03-09T16:12:05.895 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.894+0000 7efd257fa640 1 -- 192.168.123.103:0/3030539719 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7efd3010b850 con 0x7efd0c0761c0 2026-03-09T16:12:05.895 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.895+0000 7efd277fe640 1 -- 192.168.123.103:0/3030539719 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19136 (secure 0 0 0) 0x7efd3010b850 con 0x7efd0c0761c0 2026-03-09T16:12:05.896 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:12:05.896 INFO:teuthology.orchestra.run.vm03.stderr:dumped all 2026-03-09T16:12:05.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.897+0000 7efd257fa640 1 -- 192.168.123.103:0/3030539719 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7efd0c0761c0 msgr2=0x7efd0c078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:05.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.897+0000 7efd257fa640 1 --2- 192.168.123.103:0/3030539719 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7efd0c0761c0 0x7efd0c078680 secure :-1 
s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7efd18002410 tx=0x7efd18005c50 comp rx=0 tx=0).stop 2026-03-09T16:12:05.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.897+0000 7efd257fa640 1 -- 192.168.123.103:0/3030539719 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efd3019b930 msgr2=0x7efd30196460 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:05.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.897+0000 7efd257fa640 1 --2- 192.168.123.103:0/3030539719 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efd3019b930 0x7efd30196460 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7efd2000ca30 tx=0x7efd2000cf00 comp rx=0 tx=0).stop 2026-03-09T16:12:05.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.897+0000 7efd257fa640 1 -- 192.168.123.103:0/3030539719 shutdown_connections 2026-03-09T16:12:05.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.897+0000 7efd257fa640 1 --2- 192.168.123.103:0/3030539719 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7efd0c0761c0 0x7efd0c078680 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.897+0000 7efd257fa640 1 --2- 192.168.123.103:0/3030539719 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efd3019b930 0x7efd30196460 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.897+0000 7efd257fa640 1 --2- 192.168.123.103:0/3030539719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd30100780 0x7efd3019b3f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:05.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.897+0000 7efd257fa640 1 -- 192.168.123.103:0/3030539719 >> 192.168.123.103:0/3030539719 conn(0x7efd300fc460 msgr2=0x7efd300fd710 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:05.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.898+0000 7efd257fa640 1 -- 192.168.123.103:0/3030539719 shutdown_connections 2026-03-09T16:12:05.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:05.898+0000 7efd257fa640 1 -- 192.168.123.103:0/3030539719 wait complete. 
2026-03-09T16:12:05.957 INFO:teuthology.orchestra.run.vm03.stdout:{"pg_ready":true,"pg_map":{"version":69,"stamp":"2026-03-09T16:12:05.836501+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":164248,"kb_used_data":3228,"kb_used_omap":9,"kb_used_meta":160886,"kb_avail":125640296,"statfs":{"total":128823853056,"available":128655663104,"internally_reserved":0,"allocated":3305472,"data_stored":2109222,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":9532,"internal_metadata":164747972},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"9.162211"},"pg_stats":[{"pgid":"1.0","version":"20'32","reported_seq":81,"reported_epoch":34,"state":"active+clean","last_fresh":"2026-03-09T16:11:56.675580+0000","last_change":"2026-03-09T16:11:46.785830+0000",
"last_active":"2026-03-09T16:11:56.675580+0000","last_peered":"2026-03-09T16:11:56.675580+0000","last_clean":"2026-03-09T16:11:56.675580+0000","last_became_active":"2026-03-09T16:11:46.785641+0000","last_became_peered":"2026-03-09T16:11:46.785641+0000","last_unstale":"2026-03-09T16:11:56.675580+0000","last_undegraded":"2026-03-09T16:11:56.675580+0000","last_fullsized":"2026-03-09T16:11:56.675580+0000","mapping_epoch":29,"log_start":"0'0","ondisk_log_start":"0'0","created":19,"last_epoch_clean":30,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-09T16:11:30.401760+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-09T16:11:30.401760+0000","last_clean_scrub_stamp":"2026-03-09T16:11:30.401760+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-10T18:18:15.445003+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":34,"seq":146028888067,"num_pgs":0,"n
um_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27144,"kb_used_data":312,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20940280,"statfs":{"total":21470642176,"available":21442846720,"internally_reserved":0,"allocated":319488,"data_stored":121897,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.38900000000000001}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.40600000000000003}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59699999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.33100000000000002}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.52300000000000002}]}]},{"osd":4,"up_from":30,"seq":128849018885,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27152,"kb_used_data":312,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20940272,"statfs":{"total":21470642176,"available":21442838528,"internally_reserved":0,"allocated":319488,"data_stored":121897,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48599999999999999}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.375}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.57999999999999996}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59699999999999998}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60599999999999998}]}]},{"osd":3,"up_from":25,"seq":107374182407,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27604,"kb_used_data":764,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20939820,"statfs":{"total":21470642176,"available":21442375680,"internally_reserved":0,"allocated":782336,"data_stored":581177,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1587,"internal_metadata":27457997},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.61899999999999999}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66100000000000003}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.76800000000000002}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.628}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56799999999999995}]}]},{"osd":2,"up_from":18,"seq":77309411337,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27148,"kb_used_data":312,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20940276,"statfs":{"total":21470642176,"available":21442842624,"internally_reserved":0,"allocated":319488,"data_stored":121897,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.439}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.40600000000000003}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.46400000000000002}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.41799999999999998}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.434}]}]},{"osd":1,"up_from":14,"seq":60129542155,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27600,"kb_used_data":764,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20939824,"statfs":{"total":21470642176,"available":21442379776,"internally_reserved":0,"allocated":782336,"data_stored":581177,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.63600000000000001}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.50900000000000001}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.62}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.55600000000000005}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64800000000000002}]}]},{"osd":0,"up_from":9,"seq":38654705677,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27600,"kb_used_data":764,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20939824,"statfs":{"total":21470642176,"available":21442379776,"internally_reserved":0,"allocated":782336,"data_stored":581177,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.33800000000000002}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.76000000000000001}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54200000000000004}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.55700000000000005}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.47399999999999998}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-09T16:12:05.957 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph pg dump --format=json 2026-03-09T16:12:06.111 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:06.136 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:05 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/3541541449' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-09T16:12:06.136 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:05 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/1382645047' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-09T16:12:06.136 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:05 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/1804096548' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-09T16:12:06.136 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:05 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/1123688902' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-09T16:12:06.136 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:05 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/584587005' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-09T16:12:06.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:05 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/3541541449' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-09T16:12:06.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:05 vm05 ceph-mon[58702]: from='client.? 
192.168.123.103:0/1382645047' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-09T16:12:06.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:05 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/1804096548' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-09T16:12:06.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:05 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/1123688902' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-09T16:12:06.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:05 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/584587005' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-09T16:12:06.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.341+0000 7f661cf94640 1 -- 192.168.123.103:0/2400755675 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f66180ff170 msgr2=0x7f66180ff5b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:06.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.341+0000 7f661cf94640 1 --2- 192.168.123.103:0/2400755675 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f66180ff170 0x7f66180ff5b0 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7f660c0099e0 tx=0x7f660c02f260 comp rx=0 tx=0).stop 2026-03-09T16:12:06.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.342+0000 7f661cf94640 1 -- 192.168.123.103:0/2400755675 shutdown_connections 2026-03-09T16:12:06.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.342+0000 7f661cf94640 1 --2- 192.168.123.103:0/2400755675 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f66180ff170 0x7f66180ff5b0 unknown :-1 s=CLOSED pgs=224 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:06.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.342+0000 7f661cf94640 1 --2- 192.168.123.103:0/2400755675 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6618101820 0x7f66180fec30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:06.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.342+0000 7f661cf94640 1 -- 192.168.123.103:0/2400755675 >> 192.168.123.103:0/2400755675 conn(0x7f66180faa80 msgr2=0x7f66180fcea0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:06.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.342+0000 7f661cf94640 1 -- 192.168.123.103:0/2400755675 shutdown_connections 2026-03-09T16:12:06.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.342+0000 7f661cf94640 1 -- 192.168.123.103:0/2400755675 wait complete. 
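The `cephadm shell ... ceph pg dump --format=json` call issued above (together with the per-OSD `osd last-stat-seq` queries dispatched to the monitors) is how the harness confirms that PG statistics have been flushed and that every PG is reported active+clean before declaring the cluster clean. A minimal sketch of that kind of check, assuming the cephadm path, image, and fsid shown in this log; the helper names are hypothetical and this is not the teuthology implementation itself:

```python
import json
import subprocess

# Values taken from the log above.
FSID = "2b05df78-1bd2-11f1-83c0-c950214d6edc"
IMAGE = "quay.ceph.io/ceph-ci/ceph:reef"
CEPHADM = "/home/ubuntu/cephtest/cephadm"

def pg_dump():
    # Run "ceph pg dump --format=json" inside a cephadm shell, as the harness does above.
    out = subprocess.check_output([
        "sudo", CEPHADM, "--image", IMAGE, "shell", "--fsid", FSID, "--",
        "ceph", "pg", "dump", "--format=json",
    ])
    # Non-JSON noise may precede the body; keep everything from the first brace onwards.
    return json.loads(out[out.index(b"{"):])

def all_pgs_clean(dump):
    # pg_map.pg_stats carries one entry per PG with a "state" string such as "active+clean".
    return all("active+clean" in pg["state"]
               for pg in dump["pg_map"]["pg_stats"])

if __name__ == "__main__":
    print("clean!" if all_pgs_clean(pg_dump()) else "not clean yet")
```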
2026-03-09T16:12:06.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.343+0000 7f661cf94640 1 Processor -- start 2026-03-09T16:12:06.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.343+0000 7f661cf94640 1 -- start start 2026-03-09T16:12:06.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.343+0000 7f661cf94640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f66180ff170 0x7f66181a0880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:06.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.343+0000 7f661cf94640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6618101820 0x7f66181a0dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:06.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.343+0000 7f661cf94640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f66181a13e0 con 0x7f6618101820 2026-03-09T16:12:06.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.343+0000 7f661cf94640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f661819a970 con 0x7f66180ff170 2026-03-09T16:12:06.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.344+0000 7f6615d74640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6618101820 0x7f66181a0dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:06.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.344+0000 7f6615d74640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6618101820 0x7f66181a0dc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56064/0 (socket says 192.168.123.103:56064) 2026-03-09T16:12:06.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.344+0000 7f6615d74640 1 -- 192.168.123.103:0/1640411777 learned_addr learned my addr 192.168.123.103:0/1640411777 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:06.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.344+0000 7f6616575640 1 --2- 192.168.123.103:0/1640411777 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f66180ff170 0x7f66181a0880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:06.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.344+0000 7f6615d74640 1 -- 192.168.123.103:0/1640411777 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f66180ff170 msgr2=0x7f66181a0880 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:06.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.344+0000 7f6615d74640 1 --2- 192.168.123.103:0/1640411777 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f66180ff170 0x7f66181a0880 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:06.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.344+0000 7f6615d74640 1 -- 192.168.123.103:0/1640411777 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f660c009660 con 0x7f6618101820 2026-03-09T16:12:06.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.344+0000 7f6615d74640 1 --2- 192.168.123.103:0/1640411777 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6618101820 0x7f66181a0dc0 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7f660c02f770 tx=0x7f660c004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:06.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.345+0000 7f65fb7fe640 1 -- 192.168.123.103:0/1640411777 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f660c03d070 con 0x7f6618101820 2026-03-09T16:12:06.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.345+0000 7f65fb7fe640 1 -- 192.168.123.103:0/1640411777 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f660c004400 con 0x7f6618101820 2026-03-09T16:12:06.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.345+0000 7f661cf94640 1 -- 192.168.123.103:0/1640411777 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f661819abf0 con 0x7f6618101820 2026-03-09T16:12:06.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.345+0000 7f661cf94640 1 -- 192.168.123.103:0/1640411777 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f661819b140 con 0x7f6618101820 2026-03-09T16:12:06.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.346+0000 7f65fb7fe640 1 -- 192.168.123.103:0/1640411777 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f660c041620 con 0x7f6618101820 2026-03-09T16:12:06.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.347+0000 7f661cf94640 1 -- 192.168.123.103:0/1640411777 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f65dc005350 con 0x7f6618101820 2026-03-09T16:12:06.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.347+0000 7f65fb7fe640 1 -- 192.168.123.103:0/1640411777 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f660c02fc90 con 0x7f6618101820 2026-03-09T16:12:06.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.348+0000 7f65fb7fe640 1 --2- 192.168.123.103:0/1640411777 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f65f0075fb0 0x7f65f0078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:06.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.348+0000 7f65fb7fe640 1 -- 192.168.123.103:0/1640411777 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f660c0bc660 con 0x7f6618101820 2026-03-09T16:12:06.350 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.350+0000 7f65fb7fe640 1 -- 192.168.123.103:0/1640411777 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f660c086020 con 0x7f6618101820 2026-03-09T16:12:06.350 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.350+0000 7f6616575640 1 --2- 192.168.123.103:0/1640411777 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f65f0075fb0 0x7f65f0078470 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:06.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.350+0000 7f6616575640 1 --2- 192.168.123.103:0/1640411777 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f65f0075fb0 0x7f65f0078470 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f6604005fd0 tx=0x7f6604005950 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:06.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.435+0000 7f661cf94640 1 -- 192.168.123.103:0/1640411777 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f65dc002bf0 con 0x7f65f0075fb0 2026-03-09T16:12:06.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.436+0000 7f65fb7fe640 1 -- 192.168.123.103:0/1640411777 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19136 (secure 0 0 0) 0x7f65dc002bf0 con 0x7f65f0075fb0 2026-03-09T16:12:06.437 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:12:06.437 INFO:teuthology.orchestra.run.vm03.stderr:dumped all 2026-03-09T16:12:06.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.438+0000 7f661cf94640 1 -- 192.168.123.103:0/1640411777 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f65f0075fb0 msgr2=0x7f65f0078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:06.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.438+0000 7f661cf94640 1 --2- 192.168.123.103:0/1640411777 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f65f0075fb0 0x7f65f0078470 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f6604005fd0 tx=0x7f6604005950 comp rx=0 tx=0).stop 2026-03-09T16:12:06.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.439+0000 7f661cf94640 1 -- 192.168.123.103:0/1640411777 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6618101820 msgr2=0x7f66181a0dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:06.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.439+0000 7f661cf94640 1 --2- 192.168.123.103:0/1640411777 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6618101820 0x7f66181a0dc0 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7f660c02f770 tx=0x7f660c004290 comp rx=0 tx=0).stop 2026-03-09T16:12:06.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.439+0000 7f661cf94640 1 -- 192.168.123.103:0/1640411777 shutdown_connections 2026-03-09T16:12:06.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.439+0000 7f661cf94640 1 --2- 192.168.123.103:0/1640411777 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f65f0075fb0 0x7f65f0078470 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:06.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.439+0000 7f661cf94640 1 --2- 192.168.123.103:0/1640411777 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6618101820 0x7f66181a0dc0 unknown :-1 s=CLOSED pgs=225 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:06.440 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.439+0000 7f661cf94640 1 --2- 192.168.123.103:0/1640411777 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f66180ff170 0x7f66181a0880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:06.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.439+0000 7f661cf94640 1 -- 192.168.123.103:0/1640411777 >> 192.168.123.103:0/1640411777 conn(0x7f66180faa80 msgr2=0x7f661810c640 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:06.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.439+0000 7f661cf94640 1 -- 192.168.123.103:0/1640411777 shutdown_connections 2026-03-09T16:12:06.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.439+0000 7f661cf94640 1 -- 192.168.123.103:0/1640411777 wait complete. 2026-03-09T16:12:06.481 INFO:teuthology.orchestra.run.vm03.stdout:{"pg_ready":true,"pg_map":{"version":69,"stamp":"2026-03-09T16:12:05.836501+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":164248,"kb_used_data":3228,"kb_used_omap":9,"kb_used_meta":160886,"kb_avail":125640296,"statfs":{"total":128823853056,"available":128655663104,"internally_reserved":0,"allocated":3305472,"data_stored":2109222,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":9532,"internal_metadata":164747972},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_byt
es_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"9.162211"},"pg_stats":[{"pgid":"1.0","version":"20'32","reported_seq":81,"reported_epoch":34,"state":"active+clean","last_fresh":"2026-03-09T16:11:56.675580+0000","last_change":"2026-03-09T16:11:46.785830+0000","last_active":"2026-03-09T16:11:56.675580+0000","last_peered":"2026-03-09T16:11:56.675580+0000","last_clean":"2026-03-09T16:11:56.675580+0000","last_became_active":"2026-03-09T16:11:46.785641+0000","last_became_peered":"2026-03-09T16:11:46.785641+0000","last_unstale":"2026-03-09T16:11:56.675580+0000","last_undegraded":"2026-03-09T16:11:56.675580+0000","last_fullsized":"2026-03-09T16:11:56.675580+0000","mapping_epoch":29,"log_start":"0'0","ondisk_log_start":"0'0","created":19,"last_epoch_clean":30,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-09T16:11:30.401760+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-09T16:11:30.401760+0000","last_clean_scrub_stamp":"2026-03-09T16:11:30.401760+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-10T18:18:15.445003+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":34,"seq":146028888067,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27144,"kb_used_data":312,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20940280,"statfs":{"total":21470642176,"available":21442846720,"internally_reserved":0,"allocated":319488,"data_stored":121897,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.38900000000000001}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.40600000000000003}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59699999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.33100000000000002}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.52300000000000002}]}]},{"osd":4,"up_from":30,"seq":128849018885,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27152,"kb_used_data":312,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20940272,"statfs":{"total":21470642176,"available":21442838528,"internally_reserved":0,"allocated":319488,"data_stored":121897,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48599999999999999}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.375}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.57999999999999996}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59699999999999998}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60599999999999998}]}]},{"osd":3,"up_from":25,"seq":107374182407,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27604,"kb_used_data":764,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20939820,"statfs":{"total":21470642176,"available":21442375680,"internally_reserved":0,"allocated":782336,"data_stored":581177,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1587,"internal_metadata":27457997},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.61899999999999999}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66100000000000003}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.76800000000000002}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.628}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56799999999999995}]}]},{"osd":2,"up_from":18,"seq":77309411337,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27148,"kb_used_data":312,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20940276,"statfs":{"total":21470642176,"available":21442842624,"internally_reserved":0,"allocated":319488,"data_stored":121897,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.439}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.40600000000000003}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.46400000000000002}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.41799999999999998}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.434}]}]},{"osd":1,"up_from":14,"seq":60129542155,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27600,"kb_used_data":764,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20939824,"statfs":{"total":21470642176,"available":21442379776,"internally_reserved":0,"allocated":782336,"data_stored":581177,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.63600000000000001}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.50900000000000001}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.62}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.55600000000000005}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64800000000000002}]}]},{"osd":0,"up_from":9,"seq":38654705677,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27600,"kb_used_data":764,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20939824,"statfs":{"total":21470642176,"available":21442379776,"internally_reserved":0,"allocated":782336,"data_stored":581177,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.33800000000000002}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.76000000000000001}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54200000000000004}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.55700000000000005}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.47399999999999998}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-09T16:12:06.482 INFO:tasks.cephadm.ceph_manager.ceph:clean! 2026-03-09T16:12:06.482 INFO:tasks.ceph:Waiting until ceph cluster ceph is healthy... 2026-03-09T16:12:06.482 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy 2026-03-09T16:12:06.482 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph health --format=json 2026-03-09T16:12:06.630 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:06.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.839+0000 7fd7b10c4640 1 -- 192.168.123.103:0/1940084754 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd7ac0ff700 msgr2=0x7fd7ac0ffae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:06.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.839+0000 7fd7b10c4640 1 --2- 192.168.123.103:0/1940084754 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd7ac0ff700 0x7fd7ac0ffae0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fd7a00098e0 tx=0x7fd7a002f1e0 comp rx=0 tx=0).stop 2026-03-09T16:12:06.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.840+0000 7fd7b10c4640 1 -- 192.168.123.103:0/1940084754 shutdown_connections 2026-03-09T16:12:06.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.840+0000 7fd7b10c4640 1 --2- 192.168.123.103:0/1940084754 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7ac100020 0x7fd7ac1044f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:06.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.840+0000 7fd7b10c4640 1 --2- 192.168.123.103:0/1940084754 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd7ac0ff700 0x7fd7ac0ffae0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:06.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.840+0000 7fd7b10c4640 1 -- 192.168.123.103:0/1940084754 >> 192.168.123.103:0/1940084754 
conn(0x7fd7ac0fb330 msgr2=0x7fd7ac0fd750 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:06.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.840+0000 7fd7b10c4640 1 -- 192.168.123.103:0/1940084754 shutdown_connections 2026-03-09T16:12:06.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.840+0000 7fd7b10c4640 1 -- 192.168.123.103:0/1940084754 wait complete. 2026-03-09T16:12:06.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.840+0000 7fd7b10c4640 1 Processor -- start 2026-03-09T16:12:06.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.841+0000 7fd7b10c4640 1 -- start start 2026-03-09T16:12:06.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.841+0000 7fd7b10c4640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd7ac0ff700 0x7fd7ac19ee50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:06.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.841+0000 7fd7b10c4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7ac100020 0x7fd7ac19f390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:06.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.841+0000 7fd7b10c4640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd7ac19fa20 con 0x7fd7ac100020 2026-03-09T16:12:06.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.841+0000 7fd7b10c4640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd7ac1a3790 con 0x7fd7ac0ff700 2026-03-09T16:12:06.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.841+0000 7fd7aad76640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd7ac0ff700 0x7fd7ac19ee50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:06.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.841+0000 7fd7aad76640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd7ac0ff700 0x7fd7ac19ee50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47716/0 (socket says 192.168.123.103:47716) 2026-03-09T16:12:06.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.841+0000 7fd7aad76640 1 -- 192.168.123.103:0/2406537049 learned_addr learned my addr 192.168.123.103:0/2406537049 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:06.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.841+0000 7fd7aa575640 1 --2- 192.168.123.103:0/2406537049 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7ac100020 0x7fd7ac19f390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:06.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.841+0000 7fd7aad76640 1 -- 192.168.123.103:0/2406537049 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7ac100020 msgr2=0x7fd7ac19f390 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:06.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.841+0000 7fd7aad76640 1 --2- 192.168.123.103:0/2406537049 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7ac100020 0x7fd7ac19f390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:06.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.841+0000 7fd7aad76640 1 -- 192.168.123.103:0/2406537049 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd794009660 con 0x7fd7ac0ff700 2026-03-09T16:12:06.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.841+0000 7fd7aa575640 1 --2- 192.168.123.103:0/2406537049 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7ac100020 0x7fd7ac19f390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:12:06.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.842+0000 7fd7aad76640 1 --2- 192.168.123.103:0/2406537049 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd7ac0ff700 0x7fd7ac19ee50 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fd7a00043d0 tx=0x7fd7a0004400 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:06.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.842+0000 7fd78bfff640 1 -- 192.168.123.103:0/2406537049 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd7a003d070 con 0x7fd7ac0ff700 2026-03-09T16:12:06.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.842+0000 7fd78bfff640 1 -- 192.168.123.103:0/2406537049 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd7a002fc50 con 0x7fd7ac0ff700 2026-03-09T16:12:06.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.842+0000 7fd78bfff640 1 -- 192.168.123.103:0/2406537049 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd7a0041730 con 0x7fd7ac0ff700 2026-03-09T16:12:06.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.842+0000 7fd7b10c4640 1 -- 192.168.123.103:0/2406537049 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd7a0009590 con 0x7fd7ac0ff700 2026-03-09T16:12:06.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.842+0000 7fd7b10c4640 1 -- 192.168.123.103:0/2406537049 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd7ac1a3d70 con 0x7fd7ac0ff700 2026-03-09T16:12:06.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.843+0000 7fd7b10c4640 1 -- 192.168.123.103:0/2406537049 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd770005350 con 0x7fd7ac0ff700 2026-03-09T16:12:06.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.844+0000 7fd78bfff640 1 -- 192.168.123.103:0/2406537049 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fd7a0038470 con 0x7fd7ac0ff700 2026-03-09T16:12:06.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.845+0000 7fd78bfff640 1 --2- 192.168.123.103:0/2406537049 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd780076290 0x7fd780078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:06.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.845+0000 
7fd78bfff640 1 -- 192.168.123.103:0/2406537049 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7fd7a00bc580 con 0x7fd7ac0ff700 2026-03-09T16:12:06.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.845+0000 7fd7aa575640 1 --2- 192.168.123.103:0/2406537049 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd780076290 0x7fd780078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:06.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.845+0000 7fd7aa575640 1 --2- 192.168.123.103:0/2406537049 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd780076290 0x7fd780078750 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7fd7ac1a0400 tx=0x7fd794009340 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:06.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.846+0000 7fd78bfff640 1 -- 192.168.123.103:0/2406537049 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fd7a0085ed0 con 0x7fd7ac0ff700 2026-03-09T16:12:06.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.956+0000 7fd7b10c4640 1 -- 192.168.123.103:0/2406537049 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "format": "json"} v 0) v1 -- 0x7fd7700051c0 con 0x7fd7ac0ff700 2026-03-09T16:12:06.957 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:06 vm03 ceph-mon[51019]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:06.957 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:06 vm03 ceph-mon[51019]: from='client.24237 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T16:12:06.957 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:06 vm03 ceph-mon[51019]: from='client.14438 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T16:12:06.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.958+0000 7fd78bfff640 1 -- 192.168.123.103:0/2406537049 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "format": "json"}]=0 v0) v1 ==== 72+0+46 (secure 0 0 0) 0x7fd7a0085870 con 0x7fd7ac0ff700 2026-03-09T16:12:06.959 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:12:06.959 INFO:teuthology.orchestra.run.vm03.stdout:{"status":"HEALTH_OK","checks":{},"mutes":[]} 2026-03-09T16:12:06.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.960+0000 7fd7b10c4640 1 -- 192.168.123.103:0/2406537049 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd780076290 msgr2=0x7fd780078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:06.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.960+0000 7fd7b10c4640 1 --2- 192.168.123.103:0/2406537049 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd780076290 0x7fd780078750 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7fd7ac1a0400 tx=0x7fd794009340 comp rx=0 tx=0).stop 2026-03-09T16:12:06.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.960+0000 7fd7b10c4640 1 -- 
192.168.123.103:0/2406537049 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd7ac0ff700 msgr2=0x7fd7ac19ee50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:06.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.960+0000 7fd7b10c4640 1 --2- 192.168.123.103:0/2406537049 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd7ac0ff700 0x7fd7ac19ee50 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fd7a00043d0 tx=0x7fd7a0004400 comp rx=0 tx=0).stop 2026-03-09T16:12:06.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.960+0000 7fd7b10c4640 1 -- 192.168.123.103:0/2406537049 shutdown_connections 2026-03-09T16:12:06.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.960+0000 7fd7b10c4640 1 --2- 192.168.123.103:0/2406537049 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd780076290 0x7fd780078750 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:06.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.960+0000 7fd7b10c4640 1 --2- 192.168.123.103:0/2406537049 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7ac100020 0x7fd7ac19f390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:06.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.960+0000 7fd7b10c4640 1 --2- 192.168.123.103:0/2406537049 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd7ac0ff700 0x7fd7ac19ee50 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:06.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.960+0000 7fd7b10c4640 1 -- 192.168.123.103:0/2406537049 >> 192.168.123.103:0/2406537049 conn(0x7fd7ac0fb330 msgr2=0x7fd7ac0fcc10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:06.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.961+0000 7fd7b10c4640 1 -- 192.168.123.103:0/2406537049 shutdown_connections 2026-03-09T16:12:06.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:06.961+0000 7fd7b10c4640 1 -- 192.168.123.103:0/2406537049 wait complete. 2026-03-09T16:12:07.014 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy done 2026-03-09T16:12:07.014 INFO:tasks.cephadm:Setup complete, yielding 2026-03-09T16:12:07.014 INFO:teuthology.run_tasks:Running task print... 2026-03-09T16:12:07.016 INFO:teuthology.task.print:**** done end installing reef cephadm ... 2026-03-09T16:12:07.016 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-09T16:12:07.018 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T16:12:07.018 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'ceph config set mgr mgr/cephadm/use_repo_digest true --force' 2026-03-09T16:12:07.169 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:07.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:06 vm05 ceph-mon[58702]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:07.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:06 vm05 ceph-mon[58702]: from='client.24237 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T16:12:07.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:06 vm05 ceph-mon[58702]: from='client.14438 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T16:12:07.393 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.392+0000 7f384d9b3640 1 -- 192.168.123.103:0/1670954015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3848102a60 msgr2=0x7f3848102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:07.393 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.392+0000 7f384d9b3640 1 --2- 192.168.123.103:0/1670954015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3848102a60 0x7f3848102e60 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7f38300099b0 tx=0x7f383002f220 comp rx=0 tx=0).stop 2026-03-09T16:12:07.393 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.392+0000 7f384d9b3640 1 -- 192.168.123.103:0/1670954015 shutdown_connections 2026-03-09T16:12:07.393 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.392+0000 7f384d9b3640 1 --2- 192.168.123.103:0/1670954015 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3848103c60 0x7f38481040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:07.393 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.392+0000 7f384d9b3640 1 --2- 192.168.123.103:0/1670954015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3848102a60 0x7f3848102e60 unknown :-1 s=CLOSED pgs=226 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:07.393 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.392+0000 7f384d9b3640 1 -- 192.168.123.103:0/1670954015 >> 192.168.123.103:0/1670954015 conn(0x7f38480fe250 msgr2=0x7f3848100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:07.393 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.393+0000 7f384d9b3640 1 -- 192.168.123.103:0/1670954015 shutdown_connections 2026-03-09T16:12:07.393 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.393+0000 7f384d9b3640 1 -- 192.168.123.103:0/1670954015 wait complete. 
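The cephadm.shell task just executed `ceph config set mgr mgr/cephadm/use_repo_digest true --force`; per the cephadm documentation, this mgr option makes the orchestrator resolve container image names to repo digests so that every daemon runs exactly the same image. A minimal set-and-verify sketch over the same cephadm shell wrapper (hypothetical helper names, shown only to illustrate the pattern):

```python
import subprocess

def ceph(cephadm, image, fsid, *args):
    # Run an arbitrary ceph CLI command inside a cephadm shell.
    return subprocess.check_output([
        "sudo", cephadm, "--image", image, "shell", "--fsid", fsid, "--",
        "ceph", *args,
    ]).decode()

def enable_repo_digest(cephadm, image, fsid):
    # Mirror the command from the log, then read the option back to confirm it stuck.
    ceph(cephadm, image, fsid, "config", "set", "mgr",
         "mgr/cephadm/use_repo_digest", "true", "--force")
    return ceph(cephadm, image, fsid, "config", "get", "mgr",
                "mgr/cephadm/use_repo_digest").strip()
```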
2026-03-09T16:12:07.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.393+0000 7f384d9b3640 1 Processor -- start 2026-03-09T16:12:07.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.393+0000 7f384d9b3640 1 -- start start 2026-03-09T16:12:07.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.394+0000 7f384d9b3640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3848102a60 0x7f384819a460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:07.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.394+0000 7f384d9b3640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3848103c60 0x7f384819a9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:07.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.394+0000 7f384d9b3640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f384819af70 con 0x7f3848103c60 2026-03-09T16:12:07.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.394+0000 7f384d9b3640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f384819b0e0 con 0x7f3848102a60 2026-03-09T16:12:07.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.394+0000 7f3846ffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3848102a60 0x7f384819a460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:07.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.394+0000 7f38467fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3848103c60 0x7f384819a9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:07.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.394+0000 7f38467fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3848103c60 0x7f384819a9a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56112/0 (socket says 192.168.123.103:56112) 2026-03-09T16:12:07.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.394+0000 7f38467fc640 1 -- 192.168.123.103:0/1883952047 learned_addr learned my addr 192.168.123.103:0/1883952047 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:07.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.394+0000 7f38467fc640 1 -- 192.168.123.103:0/1883952047 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3848102a60 msgr2=0x7f384819a460 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:07.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.394+0000 7f38467fc640 1 --2- 192.168.123.103:0/1883952047 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3848102a60 0x7f384819a460 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:07.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.394+0000 7f38467fc640 1 -- 192.168.123.103:0/1883952047 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3830009660 con 0x7f3848103c60 
2026-03-09T16:12:07.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.394+0000 7f38467fc640 1 --2- 192.168.123.103:0/1883952047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3848103c60 0x7f384819a9a0 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7f383c00da60 tx=0x7f383c00df30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:07.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.395+0000 7f384c9b1640 1 -- 192.168.123.103:0/1883952047 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f383c004280 con 0x7f3848103c60 2026-03-09T16:12:07.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.395+0000 7f384c9b1640 1 -- 192.168.123.103:0/1883952047 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f383c0076c0 con 0x7f3848103c60 2026-03-09T16:12:07.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.395+0000 7f384c9b1640 1 -- 192.168.123.103:0/1883952047 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f383c010460 con 0x7f3848103c60 2026-03-09T16:12:07.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.395+0000 7f384d9b3640 1 -- 192.168.123.103:0/1883952047 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f384819fb80 con 0x7f3848103c60 2026-03-09T16:12:07.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.395+0000 7f384d9b3640 1 -- 192.168.123.103:0/1883952047 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f38481a00d0 con 0x7f3848103c60 2026-03-09T16:12:07.399 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.396+0000 7f384d9b3640 1 -- 192.168.123.103:0/1883952047 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3814005350 con 0x7f3848103c60 2026-03-09T16:12:07.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.399+0000 7f384c9b1640 1 -- 192.168.123.103:0/1883952047 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f383c0026e0 con 0x7f3848103c60 2026-03-09T16:12:07.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.400+0000 7f384c9b1640 1 --2- 192.168.123.103:0/1883952047 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f38200761c0 0x7f3820078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:07.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.400+0000 7f3846ffd640 1 --2- 192.168.123.103:0/1883952047 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f38200761c0 0x7f3820078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:07.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.400+0000 7f384c9b1640 1 -- 192.168.123.103:0/1883952047 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f383c097990 con 0x7f3848103c60 2026-03-09T16:12:07.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.400+0000 7f384c9b1640 1 -- 192.168.123.103:0/1883952047 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f383c0c58c0 con 0x7f3848103c60 2026-03-09T16:12:07.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.400+0000 7f3846ffd640 1 --2- 192.168.123.103:0/1883952047 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f38200761c0 0x7f3820078680 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f3830002c20 tx=0x7f383003a040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:07.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.494+0000 7f384d9b3640 1 -- 192.168.123.103:0/1883952047 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/use_repo_digest}] v 0) v1 -- 0x7f38140051c0 con 0x7f3848103c60 2026-03-09T16:12:07.505 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.502+0000 7f384c9b1640 1 -- 192.168.123.103:0/1883952047 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/use_repo_digest}]=0 v16)=0 v16) v1 ==== 143+0+0 (secure 0 0 0) 0x7f383c0612e0 con 0x7f3848103c60 2026-03-09T16:12:07.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.509+0000 7f384d9b3640 1 -- 192.168.123.103:0/1883952047 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f38200761c0 msgr2=0x7f3820078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:07.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.509+0000 7f384d9b3640 1 --2- 192.168.123.103:0/1883952047 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f38200761c0 0x7f3820078680 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f3830002c20 tx=0x7f383003a040 comp rx=0 tx=0).stop 2026-03-09T16:12:07.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.509+0000 7f384d9b3640 1 -- 192.168.123.103:0/1883952047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3848103c60 msgr2=0x7f384819a9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:07.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.509+0000 7f384d9b3640 1 --2- 192.168.123.103:0/1883952047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3848103c60 0x7f384819a9a0 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7f383c00da60 tx=0x7f383c00df30 comp rx=0 tx=0).stop 2026-03-09T16:12:07.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.509+0000 7f384d9b3640 1 -- 192.168.123.103:0/1883952047 shutdown_connections 2026-03-09T16:12:07.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.509+0000 7f384d9b3640 1 --2- 192.168.123.103:0/1883952047 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f38200761c0 0x7f3820078680 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:07.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.509+0000 7f384d9b3640 1 --2- 192.168.123.103:0/1883952047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3848103c60 0x7f384819a9a0 unknown :-1 s=CLOSED pgs=227 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:07.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.509+0000 7f384d9b3640 1 --2- 192.168.123.103:0/1883952047 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3848102a60 0x7f384819a460 
unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:07.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.509+0000 7f384d9b3640 1 -- 192.168.123.103:0/1883952047 >> 192.168.123.103:0/1883952047 conn(0x7f38480fe250 msgr2=0x7f38480ffd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:07.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.509+0000 7f384d9b3640 1 -- 192.168.123.103:0/1883952047 shutdown_connections 2026-03-09T16:12:07.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.510+0000 7f384d9b3640 1 -- 192.168.123.103:0/1883952047 wait complete. 2026-03-09T16:12:07.561 INFO:teuthology.run_tasks:Running task print... 2026-03-09T16:12:07.563 INFO:teuthology.task.print:**** done cephadm.shell ceph config set mgr... 2026-03-09T16:12:07.563 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T16:12:07.565 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T16:12:07.565 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'ceph orch status' 2026-03-09T16:12:07.739 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:07.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.973+0000 7fe3bc2d5640 1 -- 192.168.123.103:0/1502858803 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe3b41019f0 msgr2=0x7fe3b4101e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:07.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.973+0000 7fe3bc2d5640 1 --2- 192.168.123.103:0/1502858803 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe3b41019f0 0x7fe3b4101e70 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fe3a40098e0 tx=0x7fe3a402f190 comp rx=0 tx=0).stop 2026-03-09T16:12:07.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.974+0000 7fe3bc2d5640 1 -- 192.168.123.103:0/1502858803 shutdown_connections 2026-03-09T16:12:07.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.974+0000 7fe3bc2d5640 1 --2- 192.168.123.103:0/1502858803 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe3b41019f0 0x7fe3b4101e70 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:07.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.974+0000 7fe3bc2d5640 1 --2- 192.168.123.103:0/1502858803 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe3b41007f0 0x7fe3b4100bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:07.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.974+0000 7fe3bc2d5640 1 -- 192.168.123.103:0/1502858803 >> 192.168.123.103:0/1502858803 conn(0x7fe3b40fbf80 msgr2=0x7fe3b40fe3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:07.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.974+0000 7fe3bc2d5640 1 -- 192.168.123.103:0/1502858803 shutdown_connections 2026-03-09T16:12:07.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.974+0000 7fe3bc2d5640 1 -- 192.168.123.103:0/1502858803 wait complete. 
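The following cephadm.shell task runs 'ceph orch status' through the same wrapper; its three-line plain-text reply (Backend, Available, Paused) appears a little further down. A machine-readable variant, assuming this reef build accepts the usual --format flag on orchestrator commands, could look like:

    # Assumed variant, not part of the suite: request JSON so a script can
    # assert that the cephadm backend is available and not paused.
    sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef \
        shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
        --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- \
        bash -c 'ceph orch status --format json-pretty'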
2026-03-09T16:12:07.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.974+0000 7fe3bc2d5640 1 Processor -- start 2026-03-09T16:12:07.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.974+0000 7fe3bc2d5640 1 -- start start 2026-03-09T16:12:07.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.975+0000 7fe3bc2d5640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe3b41007f0 0x7fe3b419a420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:07.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.975+0000 7fe3bc2d5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe3b41019f0 0x7fe3b419a960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:07.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.975+0000 7fe3bc2d5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe3b419af30 con 0x7fe3b41019f0 2026-03-09T16:12:07.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.975+0000 7fe3bc2d5640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe3b419b0a0 con 0x7fe3b41007f0 2026-03-09T16:12:07.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.975+0000 7fe3b9849640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe3b41019f0 0x7fe3b419a960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:07.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.975+0000 7fe3b9849640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe3b41019f0 0x7fe3b419a960 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56130/0 (socket says 192.168.123.103:56130) 2026-03-09T16:12:07.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.975+0000 7fe3b9849640 1 -- 192.168.123.103:0/788549731 learned_addr learned my addr 192.168.123.103:0/788549731 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:07.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.975+0000 7fe3b9849640 1 -- 192.168.123.103:0/788549731 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe3b41007f0 msgr2=0x7fe3b419a420 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T16:12:07.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.975+0000 7fe3b9849640 1 --2- 192.168.123.103:0/788549731 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe3b41007f0 0x7fe3b419a420 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:07.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.975+0000 7fe3b9849640 1 -- 192.168.123.103:0/788549731 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe3a8009660 con 0x7fe3b41019f0 2026-03-09T16:12:07.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.975+0000 7fe3b9849640 1 --2- 192.168.123.103:0/788549731 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe3b41019f0 0x7fe3b419a960 secure :-1 s=READY pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7fe3a4031c30 tx=0x7fe3a4031c60 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:07.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.976+0000 7fe39b7fe640 1 -- 192.168.123.103:0/788549731 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe3a403d070 con 0x7fe3b41019f0 2026-03-09T16:12:07.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.976+0000 7fe3bc2d5640 1 -- 192.168.123.103:0/788549731 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe3a4009590 con 0x7fe3b41019f0 2026-03-09T16:12:07.978 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.976+0000 7fe3bc2d5640 1 -- 192.168.123.103:0/788549731 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe3b419fe40 con 0x7fe3b41019f0 2026-03-09T16:12:07.978 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.976+0000 7fe39b7fe640 1 -- 192.168.123.103:0/788549731 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe3a402fe70 con 0x7fe3b41019f0 2026-03-09T16:12:07.978 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.976+0000 7fe39b7fe640 1 -- 192.168.123.103:0/788549731 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe3a4038550 con 0x7fe3b41019f0 2026-03-09T16:12:07.978 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.977+0000 7fe3997fa640 1 -- 192.168.123.103:0/788549731 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe380005350 con 0x7fe3b41019f0 2026-03-09T16:12:07.978 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.978+0000 7fe39b7fe640 1 -- 192.168.123.103:0/788549731 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fe3a4049050 con 0x7fe3b41019f0 2026-03-09T16:12:07.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.978+0000 7fe39b7fe640 1 --2- 192.168.123.103:0/788549731 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe3900761c0 0x7fe390078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:07.982 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.979+0000 7fe3ba04a640 1 --2- 192.168.123.103:0/788549731 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe3900761c0 0x7fe390078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:07.982 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.981+0000 7fe39b7fe640 1 -- 192.168.123.103:0/788549731 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7fe3a40bc4b0 con 0x7fe3b41019f0 2026-03-09T16:12:07.982 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.981+0000 7fe3ba04a640 1 --2- 192.168.123.103:0/788549731 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe3900761c0 0x7fe390078680 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fe3b4101850 tx=0x7fe3a8009340 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:07.982 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:07.981+0000 7fe39b7fe640 1 -- 192.168.123.103:0/788549731 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fe3a4081f90 con 0x7fe3b41019f0 2026-03-09T16:12:08.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.078+0000 7fe3997fa640 1 -- 192.168.123.103:0/788549731 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe380002bf0 con 0x7fe3900761c0 2026-03-09T16:12:08.079 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:07 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/2406537049' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-09T16:12:08.079 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:07 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/1883952047' entity='client.admin' 2026-03-09T16:12:08.079 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:07 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:12:08.079 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:07 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:08.079 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:07 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:12:08.079 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:07 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:08.082 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.081+0000 7fe39b7fe640 1 -- 192.168.123.103:0/788549731 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+43 (secure 0 0 0) 0x7fe380002bf0 con 0x7fe3900761c0 2026-03-09T16:12:08.082 INFO:teuthology.orchestra.run.vm03.stdout:Backend: cephadm 2026-03-09T16:12:08.082 INFO:teuthology.orchestra.run.vm03.stdout:Available: Yes 2026-03-09T16:12:08.082 INFO:teuthology.orchestra.run.vm03.stdout:Paused: No 2026-03-09T16:12:08.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.083+0000 7fe3997fa640 1 -- 192.168.123.103:0/788549731 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe3900761c0 msgr2=0x7fe390078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:08.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.083+0000 7fe3997fa640 1 --2- 192.168.123.103:0/788549731 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe3900761c0 0x7fe390078680 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fe3b4101850 tx=0x7fe3a8009340 comp rx=0 tx=0).stop 2026-03-09T16:12:08.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.083+0000 7fe3997fa640 1 -- 192.168.123.103:0/788549731 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe3b41019f0 msgr2=0x7fe3b419a960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:08.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.083+0000 7fe3997fa640 1 --2- 192.168.123.103:0/788549731 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe3b41019f0 0x7fe3b419a960 secure :-1 s=READY pgs=228 cs=0 l=1 
rev1=1 crypto rx=0x7fe3a4031c30 tx=0x7fe3a4031c60 comp rx=0 tx=0).stop 2026-03-09T16:12:08.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.084+0000 7fe3997fa640 1 -- 192.168.123.103:0/788549731 shutdown_connections 2026-03-09T16:12:08.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.084+0000 7fe3997fa640 1 --2- 192.168.123.103:0/788549731 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fe3900761c0 0x7fe390078680 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:08.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.084+0000 7fe3997fa640 1 --2- 192.168.123.103:0/788549731 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe3b41019f0 0x7fe3b419a960 unknown :-1 s=CLOSED pgs=228 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:08.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.084+0000 7fe3997fa640 1 --2- 192.168.123.103:0/788549731 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe3b41007f0 0x7fe3b419a420 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:08.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.084+0000 7fe3997fa640 1 -- 192.168.123.103:0/788549731 >> 192.168.123.103:0/788549731 conn(0x7fe3b40fbf80 msgr2=0x7fe3b40fda90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:08.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.084+0000 7fe3997fa640 1 -- 192.168.123.103:0/788549731 shutdown_connections 2026-03-09T16:12:08.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.084+0000 7fe3997fa640 1 -- 192.168.123.103:0/788549731 wait complete. 2026-03-09T16:12:08.143 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'ceph orch ps' 2026-03-09T16:12:08.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:07 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/2406537049' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-09T16:12:08.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:07 vm05 ceph-mon[58702]: from='client.? 
192.168.123.103:0/1883952047' entity='client.admin' 2026-03-09T16:12:08.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:07 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:12:08.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:07 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:08.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:07 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:12:08.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:07 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:08.291 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:08.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.542+0000 7f839ffff640 1 -- 192.168.123.103:0/3685443853 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83a01001c0 msgr2=0x7f83a00fe2a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:08.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.542+0000 7f839ffff640 1 --2- 192.168.123.103:0/3685443853 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83a01001c0 0x7f83a00fe2a0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f83940099b0 tx=0x7f839402f220 comp rx=0 tx=0).stop 2026-03-09T16:12:08.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.543+0000 7f839ffff640 1 -- 192.168.123.103:0/3685443853 shutdown_connections 2026-03-09T16:12:08.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.543+0000 7f839ffff640 1 --2- 192.168.123.103:0/3685443853 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83a01001c0 0x7f83a00fe2a0 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:08.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.543+0000 7f839ffff640 1 --2- 192.168.123.103:0/3685443853 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f83a00ff810 0x7f83a00ffbf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:08.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.543+0000 7f839ffff640 1 -- 192.168.123.103:0/3685443853 >> 192.168.123.103:0/3685443853 conn(0x7f83a00f9f80 msgr2=0x7f83a00fc3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:08.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.543+0000 7f839ffff640 1 -- 192.168.123.103:0/3685443853 shutdown_connections 2026-03-09T16:12:08.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.543+0000 7f839ffff640 1 -- 192.168.123.103:0/3685443853 wait complete. 
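'ceph orch ps' is answered by the active mgr (mgr.14225 on vm03) as an mgr_command; the full daemon table follows below. If a scripted check were wanted, the listing could be narrowed per daemon type, assuming the usual orch ps filters and --format are available in this build (the flag names here are assumptions, not taken from the log):

    # Assumed helper, not part of the suite: list only mgr daemons as JSON;
    # the --daemon_type filter and the JSON field layout are assumptions.
    sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef \
        shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
        --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- \
        bash -c 'ceph orch ps --daemon_type mgr --format json'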
2026-03-09T16:12:08.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.543+0000 7f839ffff640 1 Processor -- start 2026-03-09T16:12:08.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.543+0000 7f839ffff640 1 -- start start 2026-03-09T16:12:08.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.544+0000 7f839ffff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83a00ff810 0x7f83a01057b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:08.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.544+0000 7f839ffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f83a01001c0 0x7f83a0105cf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:08.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.544+0000 7f839ffff640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f83a0102210 con 0x7f83a01001c0 2026-03-09T16:12:08.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.544+0000 7f839ffff640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f83a0102380 con 0x7f83a00ff810 2026-03-09T16:12:08.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.544+0000 7f839e7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f83a01001c0 0x7f83a0105cf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:08.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.544+0000 7f839e7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f83a01001c0 0x7f83a0105cf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56146/0 (socket says 192.168.123.103:56146) 2026-03-09T16:12:08.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.544+0000 7f839e7fc640 1 -- 192.168.123.103:0/1478612113 learned_addr learned my addr 192.168.123.103:0/1478612113 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:08.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.544+0000 7f839effd640 1 --2- 192.168.123.103:0/1478612113 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83a00ff810 0x7f83a01057b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:08.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.545+0000 7f839e7fc640 1 -- 192.168.123.103:0/1478612113 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83a00ff810 msgr2=0x7f83a01057b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:08.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.545+0000 7f839e7fc640 1 --2- 192.168.123.103:0/1478612113 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83a00ff810 0x7f83a01057b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:08.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.545+0000 7f839e7fc640 1 -- 192.168.123.103:0/1478612113 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f8388009590 con 0x7f83a01001c0 2026-03-09T16:12:08.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.545+0000 7f839e7fc640 1 --2- 192.168.123.103:0/1478612113 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f83a01001c0 0x7f83a0105cf0 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f8394005bb0 tx=0x7f8394002f60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:08.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.545+0000 7f83a4843640 1 -- 192.168.123.103:0/1478612113 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f839403d070 con 0x7f83a01001c0 2026-03-09T16:12:08.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.545+0000 7f83a4843640 1 -- 192.168.123.103:0/1478612113 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8394038730 con 0x7f83a01001c0 2026-03-09T16:12:08.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.546+0000 7f83a4843640 1 -- 192.168.123.103:0/1478612113 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8394041760 con 0x7f83a01001c0 2026-03-09T16:12:08.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.546+0000 7f839ffff640 1 -- 192.168.123.103:0/1478612113 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8394009660 con 0x7f83a01001c0 2026-03-09T16:12:08.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.546+0000 7f839ffff640 1 -- 192.168.123.103:0/1478612113 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f83a0102930 con 0x7f83a01001c0 2026-03-09T16:12:08.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.546+0000 7f839ffff640 1 -- 192.168.123.103:0/1478612113 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f836c005350 con 0x7f83a01001c0 2026-03-09T16:12:08.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.548+0000 7f83a4843640 1 -- 192.168.123.103:0/1478612113 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f83940388a0 con 0x7f83a01001c0 2026-03-09T16:12:08.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.548+0000 7f83a4843640 1 --2- 192.168.123.103:0/1478612113 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8378076000 0x7f83780784c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:08.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.548+0000 7f83a4843640 1 -- 192.168.123.103:0/1478612113 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f83940bc470 con 0x7f83a01001c0 2026-03-09T16:12:08.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.550+0000 7f83a4843640 1 -- 192.168.123.103:0/1478612113 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f8394085e70 con 0x7f83a01001c0 2026-03-09T16:12:08.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.550+0000 7f839effd640 1 --2- 192.168.123.103:0/1478612113 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8378076000 0x7f83780784c0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:08.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.550+0000 7f839effd640 1 --2- 192.168.123.103:0/1478612113 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8378076000 0x7f83780784c0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f8388009810 tx=0x7f8388009290 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:08.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.644+0000 7f839ffff640 1 -- 192.168.123.103:0/1478612113 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f836c002bf0 con 0x7f8378076000 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.648+0000 7f83a4843640 1 -- 192.168.123.103:0/1478612113 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+2940 (secure 0 0 0) 0x7f836c002bf0 con 0x7f8378076000 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (75s) 39s ago 116s 22.6M - 0.25.0 c8568f914cd2 062551060e4c 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (2m) 39s ago 2m 8158k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (90s) 15s ago 89s 8342k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (2m) 39s ago 2m 7625k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 05e9be717970 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (89s) 15s ago 89s 7608k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 32f80ccecaa9 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (74s) 39s ago 105s 76.9M - 9.4.7 954c08fa6188 9b9ef5226e00 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:9283,8765,8443 running (2m) 39s ago 2m 530M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 55454b4aaab2 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (85s) 15s ago 85s 487M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 a411a05027bd 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (2m) 39s ago 2m 48.9M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 b86752d320b6 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (84s) 15s ago 83s 46.3M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 90242efb0978 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (119s) 39s ago 119s 13.6M - 1.5.0 0da6a335fe13 8c7f00e55632 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (86s) 15s ago 86s 14.6M - 1.5.0 0da6a335fe13 4c3ab3bdf8cf 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (66s) 39s ago 66s 41.7M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2ea78f0d62f8 2026-03-09T16:12:08.649 
INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (56s) 39s ago 56s 63.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6169f9824413 2026-03-09T16:12:08.649 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (44s) 39s ago 44s 38.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 31188175e77b 2026-03-09T16:12:08.650 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (34s) 15s ago 34s 64.2M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d95aab347c9f 2026-03-09T16:12:08.650 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (25s) 15s ago 25s 40.6M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 5076005b452d 2026-03-09T16:12:08.650 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (16s) 15s ago 16s 12.2M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 56fb3849b087 2026-03-09T16:12:08.650 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (68s) 39s ago 99s 32.5M - 2.43.0 a07b618ecd1d 89a8f084cd57 2026-03-09T16:12:08.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.651+0000 7f839ffff640 1 -- 192.168.123.103:0/1478612113 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8378076000 msgr2=0x7f83780784c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:08.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.651+0000 7f839ffff640 1 --2- 192.168.123.103:0/1478612113 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8378076000 0x7f83780784c0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f8388009810 tx=0x7f8388009290 comp rx=0 tx=0).stop 2026-03-09T16:12:08.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.651+0000 7f839ffff640 1 -- 192.168.123.103:0/1478612113 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f83a01001c0 msgr2=0x7f83a0105cf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:08.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.651+0000 7f839ffff640 1 --2- 192.168.123.103:0/1478612113 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f83a01001c0 0x7f83a0105cf0 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f8394005bb0 tx=0x7f8394002f60 comp rx=0 tx=0).stop 2026-03-09T16:12:08.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.651+0000 7f839ffff640 1 -- 192.168.123.103:0/1478612113 shutdown_connections 2026-03-09T16:12:08.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.651+0000 7f839ffff640 1 --2- 192.168.123.103:0/1478612113 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8378076000 0x7f83780784c0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:08.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.651+0000 7f839ffff640 1 --2- 192.168.123.103:0/1478612113 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f83a01001c0 0x7f83a0105cf0 unknown :-1 s=CLOSED pgs=229 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:08.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.651+0000 7f839ffff640 1 --2- 192.168.123.103:0/1478612113 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83a00ff810 0x7f83a01057b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:08.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.651+0000 7f839ffff640 1 -- 192.168.123.103:0/1478612113 >> 
192.168.123.103:0/1478612113 conn(0x7f83a00f9f80 msgr2=0x7f83a00fbb50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:08.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.651+0000 7f839ffff640 1 -- 192.168.123.103:0/1478612113 shutdown_connections 2026-03-09T16:12:08.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:08.651+0000 7f839ffff640 1 -- 192.168.123.103:0/1478612113 wait complete. 2026-03-09T16:12:08.714 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'ceph orch ls' 2026-03-09T16:12:08.872 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:09.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.092+0000 7f6f4590e640 1 -- 192.168.123.103:0/4174105979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f40102a60 msgr2=0x7f6f40102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:09.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.092+0000 7f6f4590e640 1 --2- 192.168.123.103:0/4174105979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f40102a60 0x7f6f40102e60 secure :-1 s=READY pgs=230 cs=0 l=1 rev1=1 crypto rx=0x7f6f2c0099b0 tx=0x7f6f2c02f220 comp rx=0 tx=0).stop 2026-03-09T16:12:09.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.093+0000 7f6f4590e640 1 -- 192.168.123.103:0/4174105979 shutdown_connections 2026-03-09T16:12:09.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.093+0000 7f6f4590e640 1 --2- 192.168.123.103:0/4174105979 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f40103c60 0x7f6f401040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:09.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.093+0000 7f6f4590e640 1 --2- 192.168.123.103:0/4174105979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f40102a60 0x7f6f40102e60 unknown :-1 s=CLOSED pgs=230 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:09.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.093+0000 7f6f4590e640 1 -- 192.168.123.103:0/4174105979 >> 192.168.123.103:0/4174105979 conn(0x7f6f400fe250 msgr2=0x7f6f40100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:09.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.093+0000 7f6f4590e640 1 -- 192.168.123.103:0/4174105979 shutdown_connections 2026-03-09T16:12:09.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.093+0000 7f6f4590e640 1 -- 192.168.123.103:0/4174105979 wait complete. 
2026-03-09T16:12:09.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.093+0000 7f6f4590e640 1 Processor -- start 2026-03-09T16:12:09.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.093+0000 7f6f4590e640 1 -- start start 2026-03-09T16:12:09.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.094+0000 7f6f4590e640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f40102a60 0x7f6f4019a410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:09.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.094+0000 7f6f4590e640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f40103c60 0x7f6f4019a950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:09.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.094+0000 7f6f4590e640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f4019ae90 con 0x7f6f40103c60 2026-03-09T16:12:09.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.094+0000 7f6f4590e640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f4019b000 con 0x7f6f40102a60 2026-03-09T16:12:09.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.094+0000 7f6f3effd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f40102a60 0x7f6f4019a410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:09.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.094+0000 7f6f3effd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f40102a60 0x7f6f4019a410 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:52220/0 (socket says 192.168.123.103:52220) 2026-03-09T16:12:09.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.094+0000 7f6f3effd640 1 -- 192.168.123.103:0/1823798468 learned_addr learned my addr 192.168.123.103:0/1823798468 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:09.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.094+0000 7f6f3e7fc640 1 --2- 192.168.123.103:0/1823798468 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f40103c60 0x7f6f4019a950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:09.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.095+0000 7f6f3effd640 1 -- 192.168.123.103:0/1823798468 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f40103c60 msgr2=0x7f6f4019a950 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:09.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.095+0000 7f6f3effd640 1 --2- 192.168.123.103:0/1823798468 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f40103c60 0x7f6f4019a950 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:09.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.095+0000 7f6f3effd640 1 -- 192.168.123.103:0/1823798468 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f6f2c009660 con 0x7f6f40102a60 2026-03-09T16:12:09.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.095+0000 7f6f3e7fc640 1 --2- 192.168.123.103:0/1823798468 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f40103c60 0x7f6f4019a950 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T16:12:09.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.095+0000 7f6f3effd640 1 --2- 192.168.123.103:0/1823798468 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f40102a60 0x7f6f4019a410 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f6f2c02f730 tx=0x7f6f2c002980 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:09.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.095+0000 7f6f4490c640 1 -- 192.168.123.103:0/1823798468 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6f2c03d070 con 0x7f6f40102a60 2026-03-09T16:12:09.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.096+0000 7f6f4490c640 1 -- 192.168.123.103:0/1823798468 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6f2c02fd50 con 0x7f6f40102a60 2026-03-09T16:12:09.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.096+0000 7f6f4490c640 1 -- 192.168.123.103:0/1823798468 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6f2c041a30 con 0x7f6f40102a60 2026-03-09T16:12:09.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.096+0000 7f6f4590e640 1 -- 192.168.123.103:0/1823798468 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6f4019fa80 con 0x7f6f40102a60 2026-03-09T16:12:09.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.096+0000 7f6f4590e640 1 -- 192.168.123.103:0/1823798468 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6f4019ff20 con 0x7f6f40102a60 2026-03-09T16:12:09.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.097+0000 7f6f4490c640 1 -- 192.168.123.103:0/1823798468 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f6f2c038730 con 0x7f6f40102a60 2026-03-09T16:12:09.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.097+0000 7f6f4590e640 1 -- 192.168.123.103:0/1823798468 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6f0c005350 con 0x7f6f40102a60 2026-03-09T16:12:09.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.098+0000 7f6f4490c640 1 --2- 192.168.123.103:0/1823798468 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6f18076170 0x7f6f18078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:09.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.098+0000 7f6f4490c640 1 -- 192.168.123.103:0/1823798468 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f6f2c0bc190 con 0x7f6f40102a60 2026-03-09T16:12:09.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.098+0000 7f6f3e7fc640 1 --2- 192.168.123.103:0/1823798468 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6f18076170 
0x7f6f18078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:09.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.099+0000 7f6f3e7fc640 1 --2- 192.168.123.103:0/1823798468 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6f18076170 0x7f6f18078630 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f6f4019b930 tx=0x7f6f3400a400 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:09.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.101+0000 7f6f4490c640 1 -- 192.168.123.103:0/1823798468 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6f2c085b10 con 0x7f6f40102a60 2026-03-09T16:12:09.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.200+0000 7f6f4590e640 1 -- 192.168.123.103:0/1823798468 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f6f0c002bf0 con 0x7f6f18076170 2026-03-09T16:12:09.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.204+0000 7f6f4490c640 1 -- 192.168.123.103:0/1823798468 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1150 (secure 0 0 0) 0x7f6f0c002bf0 con 0x7f6f18076170 2026-03-09T16:12:09.205 INFO:teuthology.orchestra.run.vm03.stdout:NAME PORTS RUNNING REFRESHED AGE PLACEMENT 2026-03-09T16:12:09.205 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager ?:9093,9094 1/1 39s ago 2m count:1 2026-03-09T16:12:09.205 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter 2/2 39s ago 2m * 2026-03-09T16:12:09.205 INFO:teuthology.orchestra.run.vm03.stdout:crash 2/2 39s ago 2m * 2026-03-09T16:12:09.205 INFO:teuthology.orchestra.run.vm03.stdout:grafana ?:3000 1/1 39s ago 2m count:1 2026-03-09T16:12:09.205 INFO:teuthology.orchestra.run.vm03.stdout:mgr 2/2 39s ago 2m count:2 2026-03-09T16:12:09.205 INFO:teuthology.orchestra.run.vm03.stdout:mon 2/2 39s ago 2m vm03:192.168.123.103=vm03;vm05:192.168.123.105=vm05;count:2 2026-03-09T16:12:09.205 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter ?:9100 2/2 39s ago 2m * 2026-03-09T16:12:09.205 INFO:teuthology.orchestra.run.vm03.stdout:osd 6 39s ago - 2026-03-09T16:12:09.205 INFO:teuthology.orchestra.run.vm03.stdout:prometheus ?:9095 1/1 39s ago 2m count:1 2026-03-09T16:12:09.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.206+0000 7f6f4590e640 1 -- 192.168.123.103:0/1823798468 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6f18076170 msgr2=0x7f6f18078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:09.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.206+0000 7f6f4590e640 1 --2- 192.168.123.103:0/1823798468 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6f18076170 0x7f6f18078630 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f6f4019b930 tx=0x7f6f3400a400 comp rx=0 tx=0).stop 2026-03-09T16:12:09.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.207+0000 7f6f4590e640 1 -- 192.168.123.103:0/1823798468 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f40102a60 msgr2=0x7f6f4019a410 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-09T16:12:09.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.207+0000 7f6f4590e640 1 --2- 192.168.123.103:0/1823798468 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f40102a60 0x7f6f4019a410 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f6f2c02f730 tx=0x7f6f2c002980 comp rx=0 tx=0).stop 2026-03-09T16:12:09.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.207+0000 7f6f4590e640 1 -- 192.168.123.103:0/1823798468 shutdown_connections 2026-03-09T16:12:09.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.207+0000 7f6f4590e640 1 --2- 192.168.123.103:0/1823798468 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6f18076170 0x7f6f18078630 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:09.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.207+0000 7f6f4590e640 1 --2- 192.168.123.103:0/1823798468 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f40103c60 0x7f6f4019a950 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:09.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.207+0000 7f6f4590e640 1 --2- 192.168.123.103:0/1823798468 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f40102a60 0x7f6f4019a410 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:09.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.207+0000 7f6f4590e640 1 -- 192.168.123.103:0/1823798468 >> 192.168.123.103:0/1823798468 conn(0x7f6f400fe250 msgr2=0x7f6f400ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:09.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.207+0000 7f6f4590e640 1 -- 192.168.123.103:0/1823798468 shutdown_connections 2026-03-09T16:12:09.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.207+0000 7f6f4590e640 1 -- 192.168.123.103:0/1823798468 wait complete. 
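At this point 'ceph orch ls' shows every service at its target count (alertmanager, grafana and prometheus 1/1; mgr, mon, crash, ceph-exporter and node-exporter 2/2; six OSDs), which is the state the suite wants before the staggered mgr upgrade begins. A small polling sketch in the same spirit, assuming the JSON form of orch ls exposes per-service running and size counts (the .status.* paths are assumptions about the schema):

    # Assumed wait loop, not part of the suite: poll until every service
    # reports running == size. Field paths are assumptions; jq is required.
    CEPHADM="sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef \
        shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
        --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc --"
    until $CEPHADM bash -c 'ceph orch ls --format json' \
          | jq -e 'all(.[]; .status.running == .status.size)' >/dev/null; do
        sleep 10
    done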
2026-03-09T16:12:09.270 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'ceph orch host ls' 2026-03-09T16:12:09.420 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:09.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.651+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/3756590180 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f94102a80 msgr2=0x7f5f94102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:09.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.651+0000 7f5f9bd2f640 1 --2- 192.168.123.103:0/3756590180 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f94102a80 0x7f5f94102e80 secure :-1 s=READY pgs=231 cs=0 l=1 rev1=1 crypto rx=0x7f5f880099b0 tx=0x7f5f8802f240 comp rx=0 tx=0).stop 2026-03-09T16:12:09.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.651+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/3756590180 shutdown_connections 2026-03-09T16:12:09.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.651+0000 7f5f9bd2f640 1 --2- 192.168.123.103:0/3756590180 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f94103c80 0x7f5f94104100 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:09.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.651+0000 7f5f9bd2f640 1 --2- 192.168.123.103:0/3756590180 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f94102a80 0x7f5f94102e80 unknown :-1 s=CLOSED pgs=231 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:09.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.651+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/3756590180 >> 192.168.123.103:0/3756590180 conn(0x7f5f940fe250 msgr2=0x7f5f94100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:09.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.652+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/3756590180 shutdown_connections 2026-03-09T16:12:09.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.652+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/3756590180 wait complete. 
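The last command in this cephadm.shell block is 'ceph orch host ls'; with vm03 and vm05 both carrying daemons it should report the two enrolled hosts. Reusing the wrapper variable from the sketch above, a JSON variant of the same check could be (again an assumption, not something the test runs):

    # Assumed variant, not part of the suite: dump the host inventory as JSON.
    $CEPHADM bash -c 'ceph orch host ls --format json'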
2026-03-09T16:12:09.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.652+0000 7f5f9bd2f640 1 Processor -- start 2026-03-09T16:12:09.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.652+0000 7f5f9bd2f640 1 -- start start 2026-03-09T16:12:09.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.653+0000 7f5f9bd2f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f94102a80 0x7f5f9419a450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:09.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.653+0000 7f5f99aa4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f94102a80 0x7f5f9419a450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:09.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.653+0000 7f5f99aa4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f94102a80 0x7f5f9419a450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48852/0 (socket says 192.168.123.103:48852) 2026-03-09T16:12:09.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.653+0000 7f5f9bd2f640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f94103c80 0x7f5f9419a990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:09.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.653+0000 7f5f99aa4640 1 -- 192.168.123.103:0/2247947017 learned_addr learned my addr 192.168.123.103:0/2247947017 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:09.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.653+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/2247947017 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5f9419af60 con 0x7f5f94102a80 2026-03-09T16:12:09.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.653+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/2247947017 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5f9419b0d0 con 0x7f5f94103c80 2026-03-09T16:12:09.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.654+0000 7f5f992a3640 1 --2- 192.168.123.103:0/2247947017 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f94103c80 0x7f5f9419a990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:09.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.654+0000 7f5f99aa4640 1 -- 192.168.123.103:0/2247947017 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f94103c80 msgr2=0x7f5f9419a990 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:09.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.654+0000 7f5f99aa4640 1 --2- 192.168.123.103:0/2247947017 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f94103c80 0x7f5f9419a990 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:09.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.654+0000 7f5f99aa4640 1 -- 192.168.123.103:0/2247947017 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
-- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5f88009660 con 0x7f5f94102a80 2026-03-09T16:12:09.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.654+0000 7f5f99aa4640 1 --2- 192.168.123.103:0/2247947017 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f94102a80 0x7f5f9419a450 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7f5f880042c0 tx=0x7f5f880042f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:09.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.655+0000 7f5f82ffd640 1 -- 192.168.123.103:0/2247947017 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5f8803d070 con 0x7f5f94102a80 2026-03-09T16:12:09.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.655+0000 7f5f82ffd640 1 -- 192.168.123.103:0/2247947017 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5f8802fcb0 con 0x7f5f94102a80 2026-03-09T16:12:09.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.655+0000 7f5f82ffd640 1 -- 192.168.123.103:0/2247947017 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5f880417b0 con 0x7f5f94102a80 2026-03-09T16:12:09.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.655+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/2247947017 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5f9419fb10 con 0x7f5f94102a80 2026-03-09T16:12:09.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.655+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/2247947017 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5f94075910 con 0x7f5f94102a80 2026-03-09T16:12:09.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.656+0000 7f5f82ffd640 1 -- 192.168.123.103:0/2247947017 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f5f88049050 con 0x7f5f94102a80 2026-03-09T16:12:09.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.656+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/2247947017 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5f5c005350 con 0x7f5f94102a80 2026-03-09T16:12:09.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.659+0000 7f5f82ffd640 1 --2- 192.168.123.103:0/2247947017 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5f6c0761c0 0x7f5f6c078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:09.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.659+0000 7f5f82ffd640 1 -- 192.168.123.103:0/2247947017 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f5f8802fe20 con 0x7f5f94102a80 2026-03-09T16:12:09.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.660+0000 7f5f992a3640 1 --2- 192.168.123.103:0/2247947017 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5f6c0761c0 0x7f5f6c078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:09.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.660+0000 7f5f82ffd640 1 -- 192.168.123.103:0/2247947017 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f5f88038730 con 0x7f5f94102a80 2026-03-09T16:12:09.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.660+0000 7f5f992a3640 1 --2- 192.168.123.103:0/2247947017 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5f6c0761c0 0x7f5f6c078680 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f5f9419b970 tx=0x7f5f84009290 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:09.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:09 vm03 ceph-mon[51019]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:09.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:09 vm03 ceph-mon[51019]: from='client.14448 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:12:09.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.757+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/2247947017 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f5f5c002bf0 con 0x7f5f6c0761c0 2026-03-09T16:12:09.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.760+0000 7f5f82ffd640 1 -- 192.168.123.103:0/2247947017 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+139 (secure 0 0 0) 0x7f5f5c002bf0 con 0x7f5f6c0761c0 2026-03-09T16:12:09.761 INFO:teuthology.orchestra.run.vm03.stdout:HOST ADDR LABELS STATUS 2026-03-09T16:12:09.761 INFO:teuthology.orchestra.run.vm03.stdout:vm03 192.168.123.103 2026-03-09T16:12:09.761 INFO:teuthology.orchestra.run.vm03.stdout:vm05 192.168.123.105 2026-03-09T16:12:09.761 INFO:teuthology.orchestra.run.vm03.stdout:2 hosts in cluster 2026-03-09T16:12:09.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.762+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/2247947017 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5f6c0761c0 msgr2=0x7f5f6c078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:09.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.762+0000 7f5f9bd2f640 1 --2- 192.168.123.103:0/2247947017 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5f6c0761c0 0x7f5f6c078680 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f5f9419b970 tx=0x7f5f84009290 comp rx=0 tx=0).stop 2026-03-09T16:12:09.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.762+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/2247947017 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f94102a80 msgr2=0x7f5f9419a450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:09.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.762+0000 7f5f9bd2f640 1 --2- 192.168.123.103:0/2247947017 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f94102a80 0x7f5f9419a450 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7f5f880042c0 tx=0x7f5f880042f0 comp rx=0 tx=0).stop 2026-03-09T16:12:09.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.762+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/2247947017 shutdown_connections 2026-03-09T16:12:09.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.762+0000 
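The 'ceph orch host ls' output above confirms that both test nodes (vm03 at 192.168.123.103 and vm05 at 192.168.123.105) are registered with the orchestrator before the CephFS volume is created. A minimal sketch of re-running the same check by hand, reusing the image, fsid and keyring from the logged command line (the --format option is a standard ceph CLI flag added here only for illustration; it is not part of the logged command):

  # list orchestrator-managed hosts, machine-readable (run on vm03)
  sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell \
    -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
    --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc \
    -- bash -c 'ceph orch host ls --format json-pretty'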
7f5f9bd2f640 1 --2- 192.168.123.103:0/2247947017 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5f6c0761c0 0x7f5f6c078680 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:09.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.762+0000 7f5f9bd2f640 1 --2- 192.168.123.103:0/2247947017 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f94103c80 0x7f5f9419a990 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:09.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.762+0000 7f5f9bd2f640 1 --2- 192.168.123.103:0/2247947017 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f94102a80 0x7f5f9419a450 secure :-1 s=CLOSED pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7f5f880042c0 tx=0x7f5f880042f0 comp rx=0 tx=0).stop 2026-03-09T16:12:09.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.762+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/2247947017 >> 192.168.123.103:0/2247947017 conn(0x7f5f940fe250 msgr2=0x7f5f940ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:09.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.763+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/2247947017 shutdown_connections 2026-03-09T16:12:09.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:09.763+0000 7f5f9bd2f640 1 -- 192.168.123.103:0/2247947017 wait complete. 2026-03-09T16:12:09.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:09 vm05 ceph-mon[58702]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:09.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:09 vm05 ceph-mon[58702]: from='client.14448 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:12:09.829 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'ceph orch device ls' 2026-03-09T16:12:09.984 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:10.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.203+0000 7f88b3050640 1 -- 192.168.123.103:0/3257244727 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88ac103c60 msgr2=0x7f88ac1040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:10.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.203+0000 7f88b3050640 1 --2- 192.168.123.103:0/3257244727 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88ac103c60 0x7f88ac1040e0 secure :-1 s=READY pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7f889c009a00 tx=0x7f889c02f280 comp rx=0 tx=0).stop 2026-03-09T16:12:10.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.204+0000 7f88b3050640 1 -- 192.168.123.103:0/3257244727 shutdown_connections 2026-03-09T16:12:10.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.204+0000 7f88b3050640 1 --2- 192.168.123.103:0/3257244727 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88ac103c60 0x7f88ac1040e0 unknown :-1 s=CLOSED pgs=233 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:10.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.204+0000 7f88b3050640 1 --2- 
192.168.123.103:0/3257244727 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88ac102a60 0x7f88ac102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:10.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.204+0000 7f88b3050640 1 -- 192.168.123.103:0/3257244727 >> 192.168.123.103:0/3257244727 conn(0x7f88ac0fe250 msgr2=0x7f88ac100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:10.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.204+0000 7f88b3050640 1 -- 192.168.123.103:0/3257244727 shutdown_connections 2026-03-09T16:12:10.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.204+0000 7f88b3050640 1 -- 192.168.123.103:0/3257244727 wait complete. 2026-03-09T16:12:10.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.204+0000 7f88b3050640 1 Processor -- start 2026-03-09T16:12:10.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.205+0000 7f88b3050640 1 -- start start 2026-03-09T16:12:10.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.205+0000 7f88b3050640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88ac102a60 0x7f88ac19a300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:10.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.205+0000 7f88b3050640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88ac103c60 0x7f88ac19a840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:10.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.205+0000 7f88b3050640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f88ac19ae10 con 0x7f88ac102a60 2026-03-09T16:12:10.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.205+0000 7f88b3050640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f88ac19af80 con 0x7f88ac103c60 2026-03-09T16:12:10.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.205+0000 7f88b0dc5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88ac102a60 0x7f88ac19a300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:10.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.205+0000 7f88b0dc5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88ac102a60 0x7f88ac19a300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48872/0 (socket says 192.168.123.103:48872) 2026-03-09T16:12:10.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.205+0000 7f88b0dc5640 1 -- 192.168.123.103:0/2986047573 learned_addr learned my addr 192.168.123.103:0/2986047573 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:10.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.205+0000 7f88a3fff640 1 --2- 192.168.123.103:0/2986047573 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88ac103c60 0x7f88ac19a840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:10.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.206+0000 
7f88b0dc5640 1 -- 192.168.123.103:0/2986047573 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88ac103c60 msgr2=0x7f88ac19a840 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:10.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.206+0000 7f88b0dc5640 1 --2- 192.168.123.103:0/2986047573 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88ac103c60 0x7f88ac19a840 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:10.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.206+0000 7f88b0dc5640 1 -- 192.168.123.103:0/2986047573 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f889c009660 con 0x7f88ac102a60 2026-03-09T16:12:10.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.206+0000 7f88b0dc5640 1 --2- 192.168.123.103:0/2986047573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88ac102a60 0x7f88ac19a300 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7f889400d930 tx=0x7f889400de00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:10.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.206+0000 7f88a1ffb640 1 -- 192.168.123.103:0/2986047573 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8894004490 con 0x7f88ac102a60 2026-03-09T16:12:10.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.206+0000 7f88a1ffb640 1 -- 192.168.123.103:0/2986047573 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f88940045f0 con 0x7f88ac102a60 2026-03-09T16:12:10.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.206+0000 7f88a1ffb640 1 -- 192.168.123.103:0/2986047573 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8894002ea0 con 0x7f88ac102a60 2026-03-09T16:12:10.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.207+0000 7f88b3050640 1 -- 192.168.123.103:0/2986047573 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f88ac19fa20 con 0x7f88ac102a60 2026-03-09T16:12:10.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.207+0000 7f88b3050640 1 -- 192.168.123.103:0/2986047573 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f88ac19fef0 con 0x7f88ac102a60 2026-03-09T16:12:10.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.208+0000 7f88a1ffb640 1 -- 192.168.123.103:0/2986047573 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f889400b840 con 0x7f88ac102a60 2026-03-09T16:12:10.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.208+0000 7f88b3050640 1 -- 192.168.123.103:0/2986047573 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f88ac102e60 con 0x7f88ac102a60 2026-03-09T16:12:10.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.209+0000 7f88a1ffb640 1 --2- 192.168.123.103:0/2986047573 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f887c075fb0 0x7f887c078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:10.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.209+0000 
7f88a1ffb640 1 -- 192.168.123.103:0/2986047573 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f8894097320 con 0x7f88ac102a60 2026-03-09T16:12:10.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.211+0000 7f88a3fff640 1 --2- 192.168.123.103:0/2986047573 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f887c075fb0 0x7f887c078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:10.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.211+0000 7f88a3fff640 1 --2- 192.168.123.103:0/2986047573 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f887c075fb0 0x7f887c078470 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f88ac19b820 tx=0x7f889c002f70 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:10.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.211+0000 7f88a1ffb640 1 -- 192.168.123.103:0/2986047573 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f8894060c70 con 0x7f88ac102a60 2026-03-09T16:12:10.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.301+0000 7f88b3050640 1 -- 192.168.123.103:0/2986047573 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch device ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f88ac107ea0 con 0x7f887c075fb0 2026-03-09T16:12:10.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.303+0000 7f88a1ffb640 1 -- 192.168.123.103:0/2986047573 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1617 (secure 0 0 0) 0x7f88ac107ea0 con 0x7f887c075fb0 2026-03-09T16:12:10.304 INFO:teuthology.orchestra.run.vm03.stdout:HOST PATH TYPE DEVICE ID SIZE AVAILABLE REFRESHED REJECT REASONS 2026-03-09T16:12:10.305 INFO:teuthology.orchestra.run.vm03.stdout:vm03 /dev/sr0 hdd QEMU_DVD-ROM_QM00003 366k No 43s ago Has a FileSystem, Insufficient space (<5GB) 2026-03-09T16:12:10.305 INFO:teuthology.orchestra.run.vm03.stdout:vm03 /dev/vdb hdd DWNBRSTVMM03001 20.0G Yes 43s ago 2026-03-09T16:12:10.305 INFO:teuthology.orchestra.run.vm03.stdout:vm03 /dev/vdc hdd DWNBRSTVMM03002 20.0G No 43s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-09T16:12:10.305 INFO:teuthology.orchestra.run.vm03.stdout:vm03 /dev/vdd hdd DWNBRSTVMM03003 20.0G No 43s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-09T16:12:10.305 INFO:teuthology.orchestra.run.vm03.stdout:vm03 /dev/vde hdd DWNBRSTVMM03004 20.0G No 43s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-09T16:12:10.305 INFO:teuthology.orchestra.run.vm03.stdout:vm05 /dev/sr0 hdd QEMU_DVD-ROM_QM00003 366k No 15s ago Has a FileSystem, Insufficient space (<5GB) 2026-03-09T16:12:10.305 INFO:teuthology.orchestra.run.vm03.stdout:vm05 /dev/vdb hdd DWNBRSTVMM05001 20.0G Yes 15s ago 2026-03-09T16:12:10.305 INFO:teuthology.orchestra.run.vm03.stdout:vm05 /dev/vdc hdd DWNBRSTVMM05002 20.0G No 15s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-09T16:12:10.305 INFO:teuthology.orchestra.run.vm03.stdout:vm05 /dev/vdd hdd DWNBRSTVMM05003 20.0G No 15s ago Has a FileSystem, Insufficient 
space (<10 extents) on vgs, LVM detected 2026-03-09T16:12:10.305 INFO:teuthology.orchestra.run.vm03.stdout:vm05 /dev/vde hdd DWNBRSTVMM05004 20.0G No 15s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-09T16:12:10.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.306+0000 7f88b3050640 1 -- 192.168.123.103:0/2986047573 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f887c075fb0 msgr2=0x7f887c078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:10.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.306+0000 7f88b3050640 1 --2- 192.168.123.103:0/2986047573 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f887c075fb0 0x7f887c078470 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f88ac19b820 tx=0x7f889c002f70 comp rx=0 tx=0).stop 2026-03-09T16:12:10.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.306+0000 7f88b3050640 1 -- 192.168.123.103:0/2986047573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88ac102a60 msgr2=0x7f88ac19a300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:10.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.306+0000 7f88b3050640 1 --2- 192.168.123.103:0/2986047573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88ac102a60 0x7f88ac19a300 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7f889400d930 tx=0x7f889400de00 comp rx=0 tx=0).stop 2026-03-09T16:12:10.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.306+0000 7f88b3050640 1 -- 192.168.123.103:0/2986047573 shutdown_connections 2026-03-09T16:12:10.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.306+0000 7f88b3050640 1 --2- 192.168.123.103:0/2986047573 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f887c075fb0 0x7f887c078470 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:10.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.306+0000 7f88b3050640 1 --2- 192.168.123.103:0/2986047573 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88ac103c60 0x7f88ac19a840 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:10.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.306+0000 7f88b3050640 1 --2- 192.168.123.103:0/2986047573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88ac102a60 0x7f88ac19a300 unknown :-1 s=CLOSED pgs=234 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:10.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.306+0000 7f88b3050640 1 -- 192.168.123.103:0/2986047573 >> 192.168.123.103:0/2986047573 conn(0x7f88ac0fe250 msgr2=0x7f88ac0ffce0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:10.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.306+0000 7f88b3050640 1 -- 192.168.123.103:0/2986047573 shutdown_connections 2026-03-09T16:12:10.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.306+0000 7f88b3050640 1 -- 192.168.123.103:0/2986047573 wait complete. 2026-03-09T16:12:10.369 INFO:teuthology.run_tasks:Running task cephadm.shell... 
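The 'ceph orch device ls' table above shows why only /dev/vdb on vm03 and vm05 is reported as available: /dev/sr0 is far too small, and the remaining virtio disks already carry a filesystem and LVM metadata (presumably from the OSDs already deployed on them), so cephadm rejects them for new OSDs. A short sketch of narrowing the same listing to one host; the hostname argument and --wide flag are standard 'ceph orch device ls' options, but treat the exact flags as an assumption to verify against this Ceph build:

  # inside the cephadm shell started above; --wide prints the full reject reasons
  ceph orch device ls vm03 --wide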
2026-03-09T16:12:10.373 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T16:12:10.373 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'ceph fs volume create cephfs --placement=4' 2026-03-09T16:12:10.521 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:10.554 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:10 vm03 ceph-mon[51019]: from='client.14452 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:12:10.554 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:10 vm03 ceph-mon[51019]: from='client.24257 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:12:10.788 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.787+0000 7f825719a640 1 -- 192.168.123.103:0/1434097538 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8250102a80 msgr2=0x7f8250102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:10.788 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.787+0000 7f825719a640 1 --2- 192.168.123.103:0/1434097538 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8250102a80 0x7f8250102e80 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7f8240009a00 tx=0x7f824002f280 comp rx=0 tx=0).stop 2026-03-09T16:12:10.788 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.787+0000 7f825719a640 1 -- 192.168.123.103:0/1434097538 shutdown_connections 2026-03-09T16:12:10.788 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.787+0000 7f825719a640 1 --2- 192.168.123.103:0/1434097538 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8250103c80 0x7f8250104100 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:10.788 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.787+0000 7f825719a640 1 --2- 192.168.123.103:0/1434097538 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8250102a80 0x7f8250102e80 unknown :-1 s=CLOSED pgs=235 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:10.788 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.787+0000 7f825719a640 1 -- 192.168.123.103:0/1434097538 >> 192.168.123.103:0/1434097538 conn(0x7f82500fe250 msgr2=0x7f8250100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:10.788 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.787+0000 7f825719a640 1 -- 192.168.123.103:0/1434097538 shutdown_connections 2026-03-09T16:12:10.788 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.787+0000 7f825719a640 1 -- 192.168.123.103:0/1434097538 wait complete. 
2026-03-09T16:12:10.788 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.787+0000 7f825719a640 1 Processor -- start 2026-03-09T16:12:10.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.788+0000 7f825719a640 1 -- start start 2026-03-09T16:12:10.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.788+0000 7f825719a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8250102a80 0x7f8250195f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:10.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.788+0000 7f825719a640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8250103c80 0x7f8250196440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:10.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.788+0000 7f825719a640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f82501969d0 con 0x7f8250102a80 2026-03-09T16:12:10.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.788+0000 7f825719a640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8250196b40 con 0x7f8250103c80 2026-03-09T16:12:10.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.788+0000 7f8254f0f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8250102a80 0x7f8250195f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:10.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.788+0000 7f8247fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8250103c80 0x7f8250196440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:10.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.788+0000 7f8247fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8250103c80 0x7f8250196440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:52266/0 (socket says 192.168.123.103:52266) 2026-03-09T16:12:10.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.788+0000 7f8247fff640 1 -- 192.168.123.103:0/1607140821 learned_addr learned my addr 192.168.123.103:0/1607140821 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:10.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.788+0000 7f8254f0f640 1 -- 192.168.123.103:0/1607140821 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8250103c80 msgr2=0x7f8250196440 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:10.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.788+0000 7f8254f0f640 1 --2- 192.168.123.103:0/1607140821 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8250103c80 0x7f8250196440 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:10.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.788+0000 7f8254f0f640 1 -- 192.168.123.103:0/1607140821 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8240009660 con 0x7f8250102a80 
2026-03-09T16:12:10.790 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.789+0000 7f8254f0f640 1 --2- 192.168.123.103:0/1607140821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8250102a80 0x7f8250195f00 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7f8240009a00 tx=0x7f8240031d30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:10.790 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.789+0000 7f8245ffb640 1 -- 192.168.123.103:0/1607140821 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8240031ea0 con 0x7f8250102a80 2026-03-09T16:12:10.790 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.789+0000 7f8245ffb640 1 -- 192.168.123.103:0/1607140821 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f82400043d0 con 0x7f8250102a80 2026-03-09T16:12:10.791 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.789+0000 7f825719a640 1 -- 192.168.123.103:0/1607140821 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f825019b5b0 con 0x7f8250102a80 2026-03-09T16:12:10.791 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.789+0000 7f825719a640 1 -- 192.168.123.103:0/1607140821 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f825019baa0 con 0x7f8250102a80 2026-03-09T16:12:10.791 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.790+0000 7f8245ffb640 1 -- 192.168.123.103:0/1607140821 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8240031260 con 0x7f8250102a80 2026-03-09T16:12:10.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.791+0000 7f8245ffb640 1 -- 192.168.123.103:0/1607140821 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f824003f070 con 0x7f8250102a80 2026-03-09T16:12:10.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.791+0000 7f823f7fe640 1 -- 192.168.123.103:0/1607140821 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f825010b6b0 con 0x7f8250102a80 2026-03-09T16:12:10.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.792+0000 7f8245ffb640 1 --2- 192.168.123.103:0/1607140821 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8220076290 0x7f8220078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:10.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.792+0000 7f8245ffb640 1 -- 192.168.123.103:0/1607140821 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f82400bcef0 con 0x7f8250102a80 2026-03-09T16:12:10.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.794+0000 7f8247fff640 1 --2- 192.168.123.103:0/1607140821 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8220076290 0x7f8220078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:10.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.794+0000 7f8245ffb640 1 -- 192.168.123.103:0/1607140821 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f82400868f0 con 0x7f8250102a80 2026-03-09T16:12:10.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.794+0000 7f8247fff640 1 --2- 192.168.123.103:0/1607140821 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8220076290 0x7f8220078750 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f8250197030 tx=0x7f823800a480 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:10.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:10.897+0000 7f823f7fe640 1 -- 192.168.123.103:0/1607140821 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}) v1 -- 0x7f825010b8e0 con 0x7f8220076290 2026-03-09T16:12:11.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:10 vm05 ceph-mon[58702]: from='client.14452 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:12:11.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:10 vm05 ceph-mon[58702]: from='client.24257 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:12:11.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:11 vm03 ceph-mon[51019]: from='client.14460 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:12:11.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:11 vm03 ceph-mon[51019]: pgmap v71: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:11.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:11 vm03 ceph-mon[51019]: from='client.14464 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:12:11.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:11 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch 2026-03-09T16:12:12.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:11 vm05 ceph-mon[58702]: from='client.14460 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:12:12.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:11 vm05 ceph-mon[58702]: pgmap v71: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:12.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:11 vm05 ceph-mon[58702]: from='client.14464 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:12:12.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:11 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch 2026-03-09T16:12:12.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:12.586+0000 7f8245ffb640 1 -- 192.168.123.103:0/1607140821 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f825010b8e0 con 0x7f8220076290 2026-03-09T16:12:12.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:12.588+0000 7f823f7fe640 1 -- 192.168.123.103:0/1607140821 >> 
[v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8220076290 msgr2=0x7f8220078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:12.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:12.588+0000 7f823f7fe640 1 --2- 192.168.123.103:0/1607140821 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8220076290 0x7f8220078750 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f8250197030 tx=0x7f823800a480 comp rx=0 tx=0).stop 2026-03-09T16:12:12.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:12.588+0000 7f823f7fe640 1 -- 192.168.123.103:0/1607140821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8250102a80 msgr2=0x7f8250195f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:12.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:12.588+0000 7f823f7fe640 1 --2- 192.168.123.103:0/1607140821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8250102a80 0x7f8250195f00 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7f8240009a00 tx=0x7f8240031d30 comp rx=0 tx=0).stop 2026-03-09T16:12:12.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:12.589+0000 7f823f7fe640 1 -- 192.168.123.103:0/1607140821 shutdown_connections 2026-03-09T16:12:12.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:12.589+0000 7f823f7fe640 1 --2- 192.168.123.103:0/1607140821 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8220076290 0x7f8220078750 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:12.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:12.589+0000 7f823f7fe640 1 --2- 192.168.123.103:0/1607140821 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8250103c80 0x7f8250196440 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:12.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:12.589+0000 7f823f7fe640 1 --2- 192.168.123.103:0/1607140821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8250102a80 0x7f8250195f00 unknown :-1 s=CLOSED pgs=236 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:12.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:12.589+0000 7f823f7fe640 1 -- 192.168.123.103:0/1607140821 >> 192.168.123.103:0/1607140821 conn(0x7f82500fe250 msgr2=0x7f82500ffa00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:12.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:12.589+0000 7f823f7fe640 1 -- 192.168.123.103:0/1607140821 shutdown_connections 2026-03-09T16:12:12.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:12.589+0000 7f823f7fe640 1 -- 192.168.123.103:0/1607140821 wait complete. 
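'ceph fs volume create cephfs --placement=4' asks the mgr volumes module to do the whole filesystem setup in one step: the surrounding mon log lines show it creating the cephfs.cephfs.meta and cephfs.cephfs.data pools, running 'fs new', and handing an MDS placement of 4 to the orchestrator. A rough sketch of the individual steps the single command expands to, shown only for illustration (pool names follow the defaults visible in the mon log; the exact orchestrator spec syntax should be verified against this release):

  # what 'ceph fs volume create cephfs --placement=4' roughly performs, step by step
  ceph osd pool create cephfs.cephfs.meta
  ceph osd pool create cephfs.cephfs.data --bulk
  ceph fs new cephfs cephfs.cephfs.meta cephfs.cephfs.data
  ceph orch apply mds cephfs --placement=4   # schedule the MDS daemons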
2026-03-09T16:12:12.641 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'ceph fs dump' 2026-03-09T16:12:12.817 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:12.855 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:12 vm03 ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03[51015]: 2026-03-09T16:12:12.559+0000 7f3a5a9e9640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T16:12:12.855 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:12 vm03 ceph-mon[51019]: from='client.14468 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:12:12.855 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:12 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished 2026-03-09T16:12:12.855 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:12 vm03 ceph-mon[51019]: osdmap e36: 6 total, 6 up, 6 in 2026-03-09T16:12:12.855 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:12 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch 2026-03-09T16:12:12.855 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:12 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-09T16:12:12.855 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:12 vm03 ceph-mon[51019]: osdmap e37: 6 total, 6 up, 6 in 2026-03-09T16:12:12.855 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:12 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-09T16:12:12.855 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:12 vm03 ceph-mon[51019]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T16:12:12.855 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:12 vm03 ceph-mon[51019]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-09T16:12:13.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:12 vm05 ceph-mon[58702]: from='client.14468 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:12:13.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:12 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished 2026-03-09T16:12:13.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:12 vm05 ceph-mon[58702]: osdmap e36: 6 total, 6 up, 6 in 2026-03-09T16:12:13.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:12 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"bulk": true, "prefix": "osd pool create", 
"pool": "cephfs.cephfs.data"}]: dispatch 2026-03-09T16:12:13.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:12 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-09T16:12:13.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:12 vm05 ceph-mon[58702]: osdmap e37: 6 total, 6 up, 6 in 2026-03-09T16:12:13.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:12 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-09T16:12:13.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:12 vm05 ceph-mon[58702]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T16:12:13.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:12 vm05 ceph-mon[58702]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-09T16:12:13.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.273+0000 7f0be7577640 1 -- 192.168.123.103:0/1599887065 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0be8072370 msgr2=0x7f0be810c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:13.277 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.273+0000 7f0be7577640 1 --2- 192.168.123.103:0/1599887065 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0be8072370 0x7f0be810c590 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7f0be001c4b0 tx=0x7f0be0040860 comp rx=0 tx=0).stop 2026-03-09T16:12:13.277 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.276+0000 7f0be7577640 1 -- 192.168.123.103:0/1599887065 shutdown_connections 2026-03-09T16:12:13.277 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.276+0000 7f0be7577640 1 --2- 192.168.123.103:0/1599887065 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0be8072370 0x7f0be810c590 unknown :-1 s=CLOSED pgs=237 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:13.277 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.276+0000 7f0be7577640 1 --2- 192.168.123.103:0/1599887065 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0be80719a0 0x7f0be8071da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:13.277 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.276+0000 7f0be7577640 1 -- 192.168.123.103:0/1599887065 >> 192.168.123.103:0/1599887065 conn(0x7f0be806d4f0 msgr2=0x7f0be806f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:13.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.276+0000 7f0be7577640 1 -- 192.168.123.103:0/1599887065 shutdown_connections 2026-03-09T16:12:13.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.278+0000 7f0be7577640 1 -- 192.168.123.103:0/1599887065 wait complete. 
2026-03-09T16:12:13.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.278+0000 7f0be7577640 1 Processor -- start 2026-03-09T16:12:13.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.278+0000 7f0be7577640 1 -- start start 2026-03-09T16:12:13.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.278+0000 7f0be7577640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0be80719a0 0x7f0be819e6c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:13.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.278+0000 7f0be7577640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0be819ec00 0x7f0be81a3c70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:13.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.278+0000 7f0be7577640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0be819f080 con 0x7f0be80719a0 2026-03-09T16:12:13.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.278+0000 7f0be7577640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0be819f1f0 con 0x7f0be819ec00 2026-03-09T16:12:13.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.280+0000 7f0be6575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0be80719a0 0x7f0be819e6c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:13.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.280+0000 7f0be6575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0be80719a0 0x7f0be819e6c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48916/0 (socket says 192.168.123.103:48916) 2026-03-09T16:12:13.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.280+0000 7f0be6575640 1 -- 192.168.123.103:0/2009516019 learned_addr learned my addr 192.168.123.103:0/2009516019 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:13.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.280+0000 7f0be6575640 1 -- 192.168.123.103:0/2009516019 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0be819ec00 msgr2=0x7f0be81a3c70 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:12:13.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.280+0000 7f0be6575640 1 --2- 192.168.123.103:0/2009516019 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0be819ec00 0x7f0be81a3c70 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:13.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.280+0000 7f0be6575640 1 -- 192.168.123.103:0/2009516019 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0be0009d00 con 0x7f0be80719a0 2026-03-09T16:12:13.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.280+0000 7f0be6575640 1 --2- 192.168.123.103:0/2009516019 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0be80719a0 0x7f0be819e6c0 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7f0bd800c910 tx=0x7f0bd800cde0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:13.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.281+0000 7f0bd77fe640 1 -- 192.168.123.103:0/2009516019 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0bd8007c20 con 0x7f0be80719a0 2026-03-09T16:12:13.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.282+0000 7f0be7577640 1 -- 192.168.123.103:0/2009516019 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0be81a4210 con 0x7f0be80719a0 2026-03-09T16:12:13.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.282+0000 7f0be7577640 1 -- 192.168.123.103:0/2009516019 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0be81a4710 con 0x7f0be80719a0 2026-03-09T16:12:13.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.284+0000 7f0bd77fe640 1 -- 192.168.123.103:0/2009516019 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0bd8007d80 con 0x7f0be80719a0 2026-03-09T16:12:13.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.286+0000 7f0bd77fe640 1 -- 192.168.123.103:0/2009516019 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0bd800f830 con 0x7f0be80719a0 2026-03-09T16:12:13.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.286+0000 7f0bd77fe640 1 -- 192.168.123.103:0/2009516019 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f0bd800fab0 con 0x7f0be80719a0 2026-03-09T16:12:13.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.287+0000 7f0bd77fe640 1 --2- 192.168.123.103:0/2009516019 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0bbc076360 0x7f0bbc078820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:13.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.287+0000 7f0be5d74640 1 --2- 192.168.123.103:0/2009516019 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0bbc076360 0x7f0bbc078820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:13.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.287+0000 7f0be5d74640 1 --2- 192.168.123.103:0/2009516019 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0bbc076360 0x7f0bbc078820 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f0be001c480 tx=0x7f0be0002750 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:13.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.288+0000 7f0bd77fe640 1 -- 192.168.123.103:0/2009516019 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f0bd8013070 con 0x7f0be80719a0 2026-03-09T16:12:13.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.288+0000 7f0be7577640 1 -- 192.168.123.103:0/2009516019 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0bac005350 con 0x7f0be80719a0 2026-03-09T16:12:13.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.293+0000 7f0bd77fe640 1 -- 192.168.123.103:0/2009516019 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f0bd80627b0 con 0x7f0be80719a0 2026-03-09T16:12:13.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.421+0000 7f0be7577640 1 -- 192.168.123.103:0/2009516019 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f0bac005e10 con 0x7f0be80719a0 2026-03-09T16:12:13.423 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.423+0000 7f0bd77fe640 1 -- 192.168.123.103:0/2009516019 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 2 v2) v1 ==== 75+0+1114 (secure 0 0 0) 0x7f0bd8062150 con 0x7f0be80719a0 2026-03-09T16:12:13.423 INFO:teuthology.orchestra.run.vm03.stdout:e2 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:epoch 2 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T16:12:12.560035+0000 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T16:12:12.560067+0000 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:in 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:up {} 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:stopped 
2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 0 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:12:13.424 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 2 2026-03-09T16:12:13.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.425+0000 7f0bd57fa640 1 -- 192.168.123.103:0/2009516019 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0bbc076360 msgr2=0x7f0bbc078820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:13.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.425+0000 7f0bd57fa640 1 --2- 192.168.123.103:0/2009516019 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0bbc076360 0x7f0bbc078820 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f0be001c480 tx=0x7f0be0002750 comp rx=0 tx=0).stop 2026-03-09T16:12:13.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.425+0000 7f0bd57fa640 1 -- 192.168.123.103:0/2009516019 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0be80719a0 msgr2=0x7f0be819e6c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:13.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.425+0000 7f0bd57fa640 1 --2- 192.168.123.103:0/2009516019 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0be80719a0 0x7f0be819e6c0 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7f0bd800c910 tx=0x7f0bd800cde0 comp rx=0 tx=0).stop 2026-03-09T16:12:13.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.426+0000 7f0bd57fa640 1 -- 192.168.123.103:0/2009516019 shutdown_connections 2026-03-09T16:12:13.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.426+0000 7f0bd57fa640 1 --2- 192.168.123.103:0/2009516019 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0bbc076360 0x7f0bbc078820 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:13.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.426+0000 7f0bd57fa640 1 --2- 192.168.123.103:0/2009516019 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0be819ec00 0x7f0be81a3c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:13.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.426+0000 7f0bd57fa640 1 --2- 192.168.123.103:0/2009516019 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0be80719a0 0x7f0be819e6c0 unknown :-1 s=CLOSED pgs=238 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:13.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.426+0000 7f0bd57fa640 1 -- 192.168.123.103:0/2009516019 >> 192.168.123.103:0/2009516019 conn(0x7f0be806d4f0 msgr2=0x7f0be8070380 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:13.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.426+0000 7f0bd57fa640 1 -- 
192.168.123.103:0/2009516019 shutdown_connections 2026-03-09T16:12:13.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:13.426+0000 7f0bd57fa640 1 -- 192.168.123.103:0/2009516019 wait complete. 2026-03-09T16:12:13.500 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T16:12:13.503 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T16:12:13.503 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'ceph fs set cephfs max_mds 1' 2026-03-09T16:12:13.743 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: pgmap v73: 33 pgs: 32 unknown, 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED) 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: osdmap e38: 6 total, 6 up, 6 in 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: fsmap cephfs:0 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: Saving service mds.cephfs spec with placement count:4 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kygyjl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", 
"entity": "mds.cephfs.vm03.kygyjl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:13.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:13 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/2009516019' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: pgmap v73: 33 pgs: 32 unknown, 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED) 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: osdmap e38: 6 total, 6 up, 6 in 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: fsmap cephfs:0 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: Saving service mds.cephfs spec with placement count:4 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kygyjl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kygyjl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: from='mgr.14225 
192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:13.980 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:13 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/2009516019' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:12:14.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.028+0000 7f9b15601640 1 -- 192.168.123.103:0/1862183038 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b10075ba0 msgr2=0x7f9b10075fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:14.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.028+0000 7f9b15601640 1 --2- 192.168.123.103:0/1862183038 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b10075ba0 0x7f9b10075fa0 secure :-1 s=READY pgs=241 cs=0 l=1 rev1=1 crypto rx=0x7f9af80098e0 tx=0x7f9af802f1d0 comp rx=0 tx=0).stop 2026-03-09T16:12:14.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.030+0000 7f9b15601640 1 -- 192.168.123.103:0/1862183038 shutdown_connections 2026-03-09T16:12:14.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.030+0000 7f9b15601640 1 --2- 192.168.123.103:0/1862183038 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b10076df0 0x7f9b10077250 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:14.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.030+0000 7f9b15601640 1 --2- 192.168.123.103:0/1862183038 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b10075ba0 0x7f9b10075fa0 unknown :-1 s=CLOSED pgs=241 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:14.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.030+0000 7f9b15601640 1 -- 192.168.123.103:0/1862183038 >> 192.168.123.103:0/1862183038 conn(0x7f9b100fe250 msgr2=0x7f9b10100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:14.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.031+0000 7f9b15601640 1 -- 192.168.123.103:0/1862183038 shutdown_connections 2026-03-09T16:12:14.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.031+0000 7f9b15601640 1 -- 192.168.123.103:0/1862183038 wait complete. 
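(The cephadm.shell task above runs 'ceph fs set cephfs max_mds 1' inside the quay.ceph.io/ceph-ci/ceph:reef container; the messenger traffic that follows is the ceph CLI inside that shell bootstrapping its mon session before it dispatches the command. A rough by-hand equivalent, sketched only from values visible in this log (image, fsid and cephadm path are taken from the command line logged above, not assumed):

  sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell \
      -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
      --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc \
      -- bash -c 'ceph fs set cephfs max_mds 1'
)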
2026-03-09T16:12:14.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.031+0000 7f9b15601640 1 Processor -- start 2026-03-09T16:12:14.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.032+0000 7f9b15601640 1 -- start start 2026-03-09T16:12:14.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.032+0000 7f9b15601640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b10076df0 0x7f9b1019eb70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:14.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.032+0000 7f9b15601640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b1019f0b0 0x7f9b101a40d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:14.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.032+0000 7f9b15601640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9b1019f530 con 0x7f9b1019f0b0 2026-03-09T16:12:14.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.032+0000 7f9b15601640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9b1019f6a0 con 0x7f9b10076df0 2026-03-09T16:12:14.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.032+0000 7f9b0e7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b1019f0b0 0x7f9b101a40d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:14.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.032+0000 7f9b0e7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b1019f0b0 0x7f9b101a40d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48960/0 (socket says 192.168.123.103:48960) 2026-03-09T16:12:14.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.032+0000 7f9b0e7fc640 1 -- 192.168.123.103:0/1938974074 learned_addr learned my addr 192.168.123.103:0/1938974074 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:14.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.032+0000 7f9b0effd640 1 --2- 192.168.123.103:0/1938974074 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b10076df0 0x7f9b1019eb70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:14.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.033+0000 7f9b0e7fc640 1 -- 192.168.123.103:0/1938974074 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b10076df0 msgr2=0x7f9b1019eb70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:14.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.033+0000 7f9b0e7fc640 1 --2- 192.168.123.103:0/1938974074 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b10076df0 0x7f9b1019eb70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:14.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.033+0000 7f9b0e7fc640 1 -- 192.168.123.103:0/1938974074 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f9af8009590 con 0x7f9b1019f0b0 2026-03-09T16:12:14.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.033+0000 7f9b0effd640 1 --2- 192.168.123.103:0/1938974074 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b10076df0 0x7f9b1019eb70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:12:14.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.033+0000 7f9b0e7fc640 1 --2- 192.168.123.103:0/1938974074 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b1019f0b0 0x7f9b101a40d0 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7f9b0400e990 tx=0x7f9b0400ee60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:14.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.033+0000 7f9aeffff640 1 -- 192.168.123.103:0/1938974074 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9b0400cd30 con 0x7f9b1019f0b0 2026-03-09T16:12:14.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.033+0000 7f9aeffff640 1 -- 192.168.123.103:0/1938974074 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9b0400ce90 con 0x7f9b1019f0b0 2026-03-09T16:12:14.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.033+0000 7f9b15601640 1 -- 192.168.123.103:0/1938974074 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9b101a4670 con 0x7f9b1019f0b0 2026-03-09T16:12:14.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.033+0000 7f9b15601640 1 -- 192.168.123.103:0/1938974074 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9b101a4b70 con 0x7f9b1019f0b0 2026-03-09T16:12:14.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.034+0000 7f9aeffff640 1 -- 192.168.123.103:0/1938974074 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9b040106b0 con 0x7f9b1019f0b0 2026-03-09T16:12:14.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.035+0000 7f9b15601640 1 -- 192.168.123.103:0/1938974074 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9adc005350 con 0x7f9b1019f0b0 2026-03-09T16:12:14.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.036+0000 7f9aeffff640 1 -- 192.168.123.103:0/1938974074 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f9b040026e0 con 0x7f9b1019f0b0 2026-03-09T16:12:14.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.036+0000 7f9aeffff640 1 --2- 192.168.123.103:0/1938974074 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9ae80761c0 0x7f9ae8078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:14.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.036+0000 7f9aeffff640 1 -- 192.168.123.103:0/1938974074 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f9b04014070 con 0x7f9b1019f0b0 2026-03-09T16:12:14.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.039+0000 7f9b0effd640 1 --2- 192.168.123.103:0/1938974074 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9ae80761c0 
0x7f9ae8078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:14.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.039+0000 7f9aeffff640 1 -- 192.168.123.103:0/1938974074 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f9b04061620 con 0x7f9b1019f0b0 2026-03-09T16:12:14.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.039+0000 7f9b0effd640 1 --2- 192.168.123.103:0/1938974074 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9ae80761c0 0x7f9ae8078680 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f9af8002b10 tx=0x7f9af8002920 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:14.157 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.156+0000 7f9b15601640 1 -- 192.168.123.103:0/1938974074 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"} v 0) v1 -- 0x7f9adc005b80 con 0x7f9b1019f0b0 2026-03-09T16:12:14.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.606+0000 7f9aeffff640 1 -- 192.168.123.103:0/1938974074 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]=0 v3) v1 ==== 105+0+0 (secure 0 0 0) 0x7f9b04060fc0 con 0x7f9b1019f0b0 2026-03-09T16:12:14.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.608+0000 7f9b15601640 1 -- 192.168.123.103:0/1938974074 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9ae80761c0 msgr2=0x7f9ae8078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:14.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.608+0000 7f9b15601640 1 --2- 192.168.123.103:0/1938974074 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9ae80761c0 0x7f9ae8078680 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f9af8002b10 tx=0x7f9af8002920 comp rx=0 tx=0).stop 2026-03-09T16:12:14.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.608+0000 7f9b15601640 1 -- 192.168.123.103:0/1938974074 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b1019f0b0 msgr2=0x7f9b101a40d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:14.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.608+0000 7f9b15601640 1 --2- 192.168.123.103:0/1938974074 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b1019f0b0 0x7f9b101a40d0 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7f9b0400e990 tx=0x7f9b0400ee60 comp rx=0 tx=0).stop 2026-03-09T16:12:14.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.609+0000 7f9b15601640 1 -- 192.168.123.103:0/1938974074 shutdown_connections 2026-03-09T16:12:14.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.609+0000 7f9b15601640 1 --2- 192.168.123.103:0/1938974074 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9ae80761c0 0x7f9ae8078680 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:14.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.609+0000 7f9b15601640 1 --2- 192.168.123.103:0/1938974074 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b1019f0b0 0x7f9b101a40d0 unknown :-1 s=CLOSED pgs=242 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:14.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.609+0000 7f9b15601640 1 --2- 192.168.123.103:0/1938974074 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b10076df0 0x7f9b1019eb70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:14.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.609+0000 7f9b15601640 1 -- 192.168.123.103:0/1938974074 >> 192.168.123.103:0/1938974074 conn(0x7f9b100fe250 msgr2=0x7f9b100ffba0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:14.610 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.609+0000 7f9b15601640 1 -- 192.168.123.103:0/1938974074 shutdown_connections 2026-03-09T16:12:14.610 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:14.609+0000 7f9b15601640 1 -- 192.168.123.103:0/1938974074 wait complete. 2026-03-09T16:12:14.702 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T16:12:14.705 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T16:12:14.705 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'ceph fs set cephfs allow_standby_replay false' 2026-03-09T16:12:14.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:14 vm05 ceph-mon[58702]: Deploying daemon mds.cephfs.vm03.kygyjl on vm03 2026-03-09T16:12:14.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:14 vm05 ceph-mon[58702]: osdmap e39: 6 total, 6 up, 6 in 2026-03-09T16:12:14.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:14 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:14.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:14 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:14.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:14 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:14.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:14 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.jgzfvu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:12:14.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:14 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.jgzfvu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T16:12:14.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:14 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:14.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:14 vm05 ceph-mon[58702]: from='client.? 
192.168.123.103:0/1938974074' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch 2026-03-09T16:12:14.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:14 vm05 ceph-mon[58702]: osdmap e40: 6 total, 6 up, 6 in 2026-03-09T16:12:14.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:14 vm03 ceph-mon[51019]: Deploying daemon mds.cephfs.vm03.kygyjl on vm03 2026-03-09T16:12:14.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:14 vm03 ceph-mon[51019]: osdmap e39: 6 total, 6 up, 6 in 2026-03-09T16:12:14.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:14 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:14.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:14 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:14.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:14 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:14.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:14 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.jgzfvu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:12:14.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:14 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.jgzfvu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T16:12:14.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:14 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:14.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:14 vm03 ceph-mon[51019]: from='client.? 
192.168.123.103:0/1938974074' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch 2026-03-09T16:12:14.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:14 vm03 ceph-mon[51019]: osdmap e40: 6 total, 6 up, 6 in 2026-03-09T16:12:14.894 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:15.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5c509b640 1 -- 192.168.123.103:0/3559536836 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5c0071c20 msgr2=0x7fd5c0072020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:15.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5c509b640 1 --2- 192.168.123.103:0/3559536836 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5c0071c20 0x7fd5c0072020 secure :-1 s=READY pgs=245 cs=0 l=1 rev1=1 crypto rx=0x7fd5b0009930 tx=0x7fd5b002f1d0 comp rx=0 tx=0).stop 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5c509b640 1 -- 192.168.123.103:0/3559536836 shutdown_connections 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5c509b640 1 --2- 192.168.123.103:0/3559536836 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5c00725f0 0x7fd5c0077360 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5c509b640 1 --2- 192.168.123.103:0/3559536836 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5c0071c20 0x7fd5c0072020 unknown :-1 s=CLOSED pgs=245 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5c509b640 1 -- 192.168.123.103:0/3559536836 >> 192.168.123.103:0/3559536836 conn(0x7fd5c006d660 msgr2=0x7fd5c006faa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5c509b640 1 -- 192.168.123.103:0/3559536836 shutdown_connections 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5c509b640 1 -- 192.168.123.103:0/3559536836 wait complete. 
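(At this point the suite has pinned max_mds to 1 and the next cephadm.shell task disables standby-replay with 'ceph fs set cephfs allow_standby_replay false', while the mons log cephadm deploying the mds.cephfs daemons. One way to confirm both settings took effect is sketched below; 'ceph fs get cephfs' is a standard command not used by this test, and the grep pattern assumes its output carries the same "max_mds" and "flags" fields seen in the fs dump earlier in this log:

  sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell \
      -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
      --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc \
      -- bash -c 'ceph fs get cephfs | egrep "max_mds|flags"'
)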
2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5c509b640 1 Processor -- start 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5c509b640 1 -- start start 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5c509b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5c00725f0 0x7fd5c007cc00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5c509b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5c007d140 0x7fd5c007f5d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5c509b640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd5c0133640 con 0x7fd5c007d140 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5c509b640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd5c007fb40 con 0x7fd5c00725f0 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5be575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5c007d140 0x7fd5c007f5d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5be575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5c007d140 0x7fd5c007f5d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48974/0 (socket says 192.168.123.103:48974) 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.364+0000 7fd5be575640 1 -- 192.168.123.103:0/3310629291 learned_addr learned my addr 192.168.123.103:0/3310629291 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.365+0000 7fd5be575640 1 -- 192.168.123.103:0/3310629291 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5c00725f0 msgr2=0x7fd5c007cc00 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.365+0000 7fd5be575640 1 --2- 192.168.123.103:0/3310629291 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5c00725f0 0x7fd5c007cc00 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.365+0000 7fd5be575640 1 -- 192.168.123.103:0/3310629291 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd5b0009590 con 0x7fd5c007d140 2026-03-09T16:12:15.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.365+0000 7fd5be575640 1 --2- 192.168.123.103:0/3310629291 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5c007d140 0x7fd5c007f5d0 secure :-1 s=READY pgs=246 cs=0 l=1 rev1=1 crypto rx=0x7fd5b800c340 tx=0x7fd5b800ef90 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:15.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.365+0000 7fd59ffff640 1 -- 192.168.123.103:0/3310629291 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd5b8009280 con 0x7fd5c007d140 2026-03-09T16:12:15.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.365+0000 7fd5c509b640 1 -- 192.168.123.103:0/3310629291 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd5c007fdc0 con 0x7fd5c007d140 2026-03-09T16:12:15.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.366+0000 7fd59ffff640 1 -- 192.168.123.103:0/3310629291 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd5b800f040 con 0x7fd5c007d140 2026-03-09T16:12:15.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.366+0000 7fd59ffff640 1 -- 192.168.123.103:0/3310629291 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd5b80075c0 con 0x7fd5c007d140 2026-03-09T16:12:15.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.366+0000 7fd5c509b640 1 -- 192.168.123.103:0/3310629291 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd5c0080310 con 0x7fd5c007d140 2026-03-09T16:12:15.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.368+0000 7fd59ffff640 1 -- 192.168.123.103:0/3310629291 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fd5b8007720 con 0x7fd5c007d140 2026-03-09T16:12:15.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.369+0000 7fd59ffff640 1 --2- 192.168.123.103:0/3310629291 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd5a00762c0 0x7fd5a0078780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:15.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.369+0000 7fd5bed76640 1 --2- 192.168.123.103:0/3310629291 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd5a00762c0 0x7fd5a0078780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:15.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.369+0000 7fd5c509b640 1 -- 192.168.123.103:0/3310629291 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd5c0072020 con 0x7fd5c007d140 2026-03-09T16:12:15.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.369+0000 7fd59ffff640 1 -- 192.168.123.103:0/3310629291 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7fd5b8098460 con 0x7fd5c007d140 2026-03-09T16:12:15.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.373+0000 7fd5bed76640 1 --2- 192.168.123.103:0/3310629291 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd5a00762c0 0x7fd5a0078780 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fd5b002f6e0 tx=0x7fd5b00023d0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:15.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.373+0000 7fd59ffff640 1 -- 192.168.123.103:0/3310629291 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fd5b80619f0 con 0x7fd5c007d140 2026-03-09T16:12:15.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.488+0000 7fd5c509b640 1 -- 192.168.123.103:0/3310629291 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"} v 0) v1 -- 0x7fd5c00805e0 con 0x7fd5c007d140 2026-03-09T16:12:15.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.644+0000 7fd59ffff640 1 -- 192.168.123.103:0/3310629291 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]=0 v5) v1 ==== 122+0+0 (secure 0 0 0) 0x7fd5b8061390 con 0x7fd5c007d140 2026-03-09T16:12:15.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.647+0000 7fd59dffb640 1 -- 192.168.123.103:0/3310629291 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd5a00762c0 msgr2=0x7fd5a0078780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:15.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.647+0000 7fd59dffb640 1 --2- 192.168.123.103:0/3310629291 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd5a00762c0 0x7fd5a0078780 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fd5b002f6e0 tx=0x7fd5b00023d0 comp rx=0 tx=0).stop 2026-03-09T16:12:15.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.647+0000 7fd59dffb640 1 -- 192.168.123.103:0/3310629291 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5c007d140 msgr2=0x7fd5c007f5d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:15.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.647+0000 7fd59dffb640 1 --2- 192.168.123.103:0/3310629291 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5c007d140 0x7fd5c007f5d0 secure :-1 s=READY pgs=246 cs=0 l=1 rev1=1 crypto rx=0x7fd5b800c340 tx=0x7fd5b800ef90 comp rx=0 tx=0).stop 2026-03-09T16:12:15.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.647+0000 7fd59dffb640 1 -- 192.168.123.103:0/3310629291 shutdown_connections 2026-03-09T16:12:15.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.647+0000 7fd59dffb640 1 --2- 192.168.123.103:0/3310629291 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fd5a00762c0 0x7fd5a0078780 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:15.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.647+0000 7fd59dffb640 1 --2- 192.168.123.103:0/3310629291 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5c007d140 0x7fd5c007f5d0 unknown :-1 s=CLOSED pgs=246 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:15.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.647+0000 7fd59dffb640 1 --2- 192.168.123.103:0/3310629291 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5c00725f0 0x7fd5c007cc00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:15.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.647+0000 7fd59dffb640 1 -- 192.168.123.103:0/3310629291 >> 192.168.123.103:0/3310629291 conn(0x7fd5c006d660 msgr2=0x7fd5c00758f0 unknown 
:-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:15.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.647+0000 7fd59dffb640 1 -- 192.168.123.103:0/3310629291 shutdown_connections 2026-03-09T16:12:15.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:15.647+0000 7fd59dffb640 1 -- 192.168.123.103:0/3310629291 wait complete. 2026-03-09T16:12:15.729 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T16:12:15.731 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T16:12:15.731 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'ceph fs set cephfs inline_data true --yes-i-really-really-mean-it' 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: Deploying daemon mds.cephfs.vm05.jgzfvu on vm05 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: pgmap v77: 65 pgs: 17 creating+peering, 36 unknown, 12 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/1938974074' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: mds.? [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] up:boot 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: daemon mds.cephfs.vm05.jgzfvu assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: fsmap cephfs:0 1 up:standby 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kygyjl"}]: dispatch 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: mds.? 
[v2:192.168.123.105:6824/3577484575,v1:192.168.123.105:6825/3577484575] up:boot 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: fsmap cephfs:1 {0=cephfs.vm05.jgzfvu=up:creating} 1 up:standby 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.jgzfvu"}]: dispatch 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kntrco", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kntrco", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: daemon mds.cephfs.vm05.jgzfvu is now active in filesystem cephfs as rank 0 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled) 2026-03-09T16:12:15.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: Cluster is now healthy 2026-03-09T16:12:15.893 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:15 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/3310629291' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-09T16:12:15.905 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: Deploying daemon mds.cephfs.vm05.jgzfvu on vm05 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: pgmap v77: 65 pgs: 17 creating+peering, 36 unknown, 12 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: from='client.? 
192.168.123.103:0/1938974074' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: mds.? [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] up:boot 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: daemon mds.cephfs.vm05.jgzfvu assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: fsmap cephfs:0 1 up:standby 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kygyjl"}]: dispatch 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: mds.? [v2:192.168.123.105:6824/3577484575,v1:192.168.123.105:6825/3577484575] up:boot 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: fsmap cephfs:1 {0=cephfs.vm05.jgzfvu=up:creating} 1 up:standby 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.jgzfvu"}]: dispatch 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kntrco", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kntrco", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: daemon mds.cephfs.vm05.jgzfvu is now active in filesystem cephfs as rank 0 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled) 2026-03-09T16:12:15.941 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: Cluster is now healthy 2026-03-09T16:12:15.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:15 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/3310629291' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-09T16:12:16.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.152+0000 7f6b10a68640 1 -- 192.168.123.103:0/1528915927 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b0c1008e0 msgr2=0x7f6b0c100d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:16.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.152+0000 7f6b10a68640 1 --2- 192.168.123.103:0/1528915927 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b0c1008e0 0x7f6b0c100d60 secure :-1 s=READY pgs=249 cs=0 l=1 rev1=1 crypto rx=0x7f6b00009a00 tx=0x7f6b0002f280 comp rx=0 tx=0).stop 2026-03-09T16:12:16.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.153+0000 7f6b10a68640 1 -- 192.168.123.103:0/1528915927 shutdown_connections 2026-03-09T16:12:16.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.153+0000 7f6b10a68640 1 --2- 192.168.123.103:0/1528915927 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b0c1008e0 0x7f6b0c100d60 secure :-1 s=CLOSED pgs=249 cs=0 l=1 rev1=1 crypto rx=0x7f6b00009a00 tx=0x7f6b0002f280 comp rx=0 tx=0).stop 2026-03-09T16:12:16.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.153+0000 7f6b10a68640 1 --2- 192.168.123.103:0/1528915927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b0c0ff6e0 0x7f6b0c0ffae0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:16.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.153+0000 7f6b10a68640 1 -- 192.168.123.103:0/1528915927 >> 192.168.123.103:0/1528915927 conn(0x7f6b0c0fae50 msgr2=0x7f6b0c0fd2b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:16.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.154+0000 7f6b10a68640 1 -- 192.168.123.103:0/1528915927 shutdown_connections 2026-03-09T16:12:16.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.154+0000 7f6b10a68640 1 -- 192.168.123.103:0/1528915927 wait complete. 
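(The mon entries above show mds.cephfs.vm05.jgzfvu taking rank 0, the MDS_ALL_DOWN, MDS_UP_LESS_THAN_MAX and POOL_APP_NOT_ENABLED health checks clearing, and the cluster reported healthy, while the third cephadm.shell task issues 'ceph fs set cephfs inline_data true --yes-i-really-really-mean-it'. A by-hand check of the same state, sketched with standard ceph commands that are not part of the test itself:

  sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell \
      -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
      --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc \
      -- bash -c 'ceph fs status cephfs && ceph health detail'
)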
2026-03-09T16:12:16.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.154+0000 7f6b10a68640 1 Processor -- start 2026-03-09T16:12:16.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.155+0000 7f6b10a68640 1 -- start start 2026-03-09T16:12:16.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.155+0000 7f6b10a68640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b0c0ff6e0 0x7f6b0c19a790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:16.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.155+0000 7f6b10a68640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b0c19acd0 0x7f6b0c19fd40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:16.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.155+0000 7f6b10a68640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6b0c19b150 con 0x7f6b0c19acd0 2026-03-09T16:12:16.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.155+0000 7f6b10a68640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6b0c19b2c0 con 0x7f6b0c0ff6e0 2026-03-09T16:12:16.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.155+0000 7f6b0a575640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b0c0ff6e0 0x7f6b0c19a790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:16.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.155+0000 7f6b0a575640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b0c0ff6e0 0x7f6b0c19a790 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:52366/0 (socket says 192.168.123.103:52366) 2026-03-09T16:12:16.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.155+0000 7f6b0a575640 1 -- 192.168.123.103:0/635382590 learned_addr learned my addr 192.168.123.103:0/635382590 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:16.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.155+0000 7f6b0a575640 1 -- 192.168.123.103:0/635382590 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b0c19acd0 msgr2=0x7f6b0c19fd40 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:12:16.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.156+0000 7f6b09d74640 1 --2- 192.168.123.103:0/635382590 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b0c19acd0 0x7f6b0c19fd40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:16.157 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.156+0000 7f6b0a575640 1 --2- 192.168.123.103:0/635382590 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b0c19acd0 0x7f6b0c19fd40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:16.157 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.156+0000 7f6b0a575640 1 -- 192.168.123.103:0/635382590 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6b00009660 con 
0x7f6b0c0ff6e0 2026-03-09T16:12:16.157 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.156+0000 7f6b09d74640 1 --2- 192.168.123.103:0/635382590 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b0c19acd0 0x7f6b0c19fd40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:12:16.157 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.156+0000 7f6b0a575640 1 --2- 192.168.123.103:0/635382590 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b0c0ff6e0 0x7f6b0c19a790 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f6af400e940 tx=0x7f6af400ee10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:16.157 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.157+0000 7f6af37fe640 1 -- 192.168.123.103:0/635382590 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6af400cd30 con 0x7f6b0c0ff6e0 2026-03-09T16:12:16.158 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.157+0000 7f6b10a68640 1 -- 192.168.123.103:0/635382590 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6b0c1a02e0 con 0x7f6b0c0ff6e0 2026-03-09T16:12:16.158 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.157+0000 7f6b10a68640 1 -- 192.168.123.103:0/635382590 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6b0c1a07e0 con 0x7f6b0c0ff6e0 2026-03-09T16:12:16.158 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.157+0000 7f6af37fe640 1 -- 192.168.123.103:0/635382590 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6af400ce90 con 0x7f6b0c0ff6e0 2026-03-09T16:12:16.158 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.157+0000 7f6af37fe640 1 -- 192.168.123.103:0/635382590 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6af4022430 con 0x7f6b0c0ff6e0 2026-03-09T16:12:16.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.158+0000 7f6af37fe640 1 -- 192.168.123.103:0/635382590 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f6af4022590 con 0x7f6b0c0ff6e0 2026-03-09T16:12:16.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.159+0000 7f6b10a68640 1 -- 192.168.123.103:0/635382590 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6b0c109480 con 0x7f6b0c0ff6e0 2026-03-09T16:12:16.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.159+0000 7f6af37fe640 1 --2- 192.168.123.103:0/635382590 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6adc076170 0x7f6adc078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:16.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.159+0000 7f6af37fe640 1 -- 192.168.123.103:0/635382590 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f6af4014070 con 0x7f6b0c0ff6e0 2026-03-09T16:12:16.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.160+0000 7f6b09d74640 1 --2- 192.168.123.103:0/635382590 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6adc076170 0x7f6adc078630 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:16.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.160+0000 7f6b09d74640 1 --2- 192.168.123.103:0/635382590 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6adc076170 0x7f6adc078630 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f6b00002c80 tx=0x7f6b000023d0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:16.164 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.163+0000 7f6af37fe640 1 -- 192.168.123.103:0/635382590 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6af40cc820 con 0x7f6b0c0ff6e0 2026-03-09T16:12:16.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.277+0000 7f6b10a68640 1 -- 192.168.123.103:0/635382590 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true} v 0) v1 -- 0x7f6b0c105ea0 con 0x7f6b0c0ff6e0 2026-03-09T16:12:16.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: Deploying daemon mds.cephfs.vm03.kntrco on vm03 2026-03-09T16:12:16.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/3310629291' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]': finished 2026-03-09T16:12:16.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: mds.? 
[v2:192.168.123.105:6824/3577484575,v1:192.168.123.105:6825/3577484575] up:active 2026-03-09T16:12:16.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: fsmap cephfs:1 {0=cephfs.vm05.jgzfvu=up:active} 1 up:standby 2026-03-09T16:12:16.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: fsmap cephfs:1 {0=cephfs.vm05.jgzfvu=up:active} 1 up:standby 2026-03-09T16:12:16.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.sqhria", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:12:16.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.sqhria", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T16:12:16.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:16.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/635382590' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]: dispatch 2026-03-09T16:12:16.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: from='client.? 
' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]: dispatch 2026-03-09T16:12:16.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:16 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.716+0000 7f6af37fe640 1 -- 192.168.123.103:0/635382590 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]=0 inline data enabled v7) v1 ==== 168+0+0 (secure 0 0 0) 0x7f6af4062020 con 0x7f6b0c0ff6e0 2026-03-09T16:12:16.718 INFO:teuthology.orchestra.run.vm03.stderr:inline data enabled 2026-03-09T16:12:16.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.721+0000 7f6b10a68640 1 -- 192.168.123.103:0/635382590 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6adc076170 msgr2=0x7f6adc078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:16.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.721+0000 7f6b10a68640 1 --2- 192.168.123.103:0/635382590 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6adc076170 0x7f6adc078630 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f6b00002c80 tx=0x7f6b000023d0 comp rx=0 tx=0).stop 2026-03-09T16:12:16.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.721+0000 7f6b10a68640 1 -- 192.168.123.103:0/635382590 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b0c0ff6e0 msgr2=0x7f6b0c19a790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:16.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.721+0000 7f6b10a68640 1 --2- 192.168.123.103:0/635382590 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b0c0ff6e0 0x7f6b0c19a790 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f6af400e940 tx=0x7f6af400ee10 comp rx=0 tx=0).stop 2026-03-09T16:12:16.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.721+0000 7f6b10a68640 1 -- 192.168.123.103:0/635382590 shutdown_connections 2026-03-09T16:12:16.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.721+0000 7f6b10a68640 1 --2- 192.168.123.103:0/635382590 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6adc076170 0x7f6adc078630 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:16.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.721+0000 7f6b10a68640 1 --2- 192.168.123.103:0/635382590 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b0c19acd0 
0x7f6b0c19fd40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:16.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.722+0000 7f6b10a68640 1 --2- 192.168.123.103:0/635382590 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b0c0ff6e0 0x7f6b0c19a790 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:16.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.722+0000 7f6b10a68640 1 -- 192.168.123.103:0/635382590 >> 192.168.123.103:0/635382590 conn(0x7f6b0c0fae50 msgr2=0x7f6b0c0fcb80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:16.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.722+0000 7f6b10a68640 1 -- 192.168.123.103:0/635382590 shutdown_connections 2026-03-09T16:12:16.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:16.722+0000 7f6b10a68640 1 -- 192.168.123.103:0/635382590 wait complete. 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: Deploying daemon mds.cephfs.vm03.kntrco on vm03 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/3310629291' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]': finished 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: mds.? [v2:192.168.123.105:6824/3577484575,v1:192.168.123.105:6825/3577484575] up:active 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: fsmap cephfs:1 {0=cephfs.vm05.jgzfvu=up:active} 1 up:standby 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: fsmap cephfs:1 {0=cephfs.vm05.jgzfvu=up:active} 1 up:standby 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.sqhria", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.sqhria", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: from='client.? 
192.168.123.103:0/635382590' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]: dispatch 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: from='client.? ' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]: dispatch 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.777 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.777 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.777 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:16 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:16.777 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T16:12:16.779 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T16:12:16.780 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'ceph fs dump' 2026-03-09T16:12:16.977 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:17.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.285+0000 7f8947445640 1 -- 192.168.123.103:0/2826060262 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8940072390 msgr2=0x7f894010c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:17.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.285+0000 7f8947445640 1 --2- 192.168.123.103:0/2826060262 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8940072390 0x7f894010c590 secure :-1 s=READY pgs=250 cs=0 l=1 rev1=1 crypto rx=0x7f89340099b0 tx=0x7f893402f240 comp rx=0 tx=0).stop 2026-03-09T16:12:17.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.286+0000 7f8947445640 1 -- 192.168.123.103:0/2826060262 shutdown_connections 2026-03-09T16:12:17.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.286+0000 7f8947445640 1 --2- 192.168.123.103:0/2826060262 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8940072390 0x7f894010c590 unknown :-1 s=CLOSED pgs=250 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:17.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.286+0000 7f8947445640 1 --2- 192.168.123.103:0/2826060262 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89400719c0 0x7f8940071dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:17.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.286+0000 7f8947445640 1 -- 192.168.123.103:0/2826060262 >> 192.168.123.103:0/2826060262 
conn(0x7f894006d4f0 msgr2=0x7f894006f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:17.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.289+0000 7f8947445640 1 -- 192.168.123.103:0/2826060262 shutdown_connections 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.289+0000 7f8947445640 1 -- 192.168.123.103:0/2826060262 wait complete. 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.289+0000 7f8947445640 1 Processor -- start 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.290+0000 7f8947445640 1 -- start start 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.290+0000 7f8947445640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89400719c0 0x7f89401a71a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.290+0000 7f8947445640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89401a76e0 0x7f89400773c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.290+0000 7f8947445640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89401a7bf0 con 0x7f89400719c0 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.290+0000 7f8947445640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89401a7d60 con 0x7f89401a76e0 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.290+0000 7f89451ba640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89400719c0 0x7f89401a71a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.290+0000 7f89451ba640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89400719c0 0x7f89401a71a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:49038/0 (socket says 192.168.123.103:49038) 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.290+0000 7f89451ba640 1 -- 192.168.123.103:0/3792987500 learned_addr learned my addr 192.168.123.103:0/3792987500 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.290+0000 7f89451ba640 1 -- 192.168.123.103:0/3792987500 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89401a76e0 msgr2=0x7f89400773c0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.290+0000 7f89451ba640 1 --2- 192.168.123.103:0/3792987500 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89401a76e0 0x7f89400773c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.290+0000 7f89451ba640 1 -- 192.168.123.103:0/3792987500 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f8934009660 con 0x7f89400719c0 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.290+0000 7f89451ba640 1 --2- 192.168.123.103:0/3792987500 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89400719c0 0x7f89401a71a0 secure :-1 s=READY pgs=251 cs=0 l=1 rev1=1 crypto rx=0x7f893000b750 tx=0x7f893000bc20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:17.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.292+0000 7f892e7fc640 1 -- 192.168.123.103:0/3792987500 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8930004070 con 0x7f89400719c0 2026-03-09T16:12:17.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.292+0000 7f8947445640 1 -- 192.168.123.103:0/3792987500 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8940077900 con 0x7f89400719c0 2026-03-09T16:12:17.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.292+0000 7f8947445640 1 -- 192.168.123.103:0/3792987500 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8940077e50 con 0x7f89400719c0 2026-03-09T16:12:17.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.294+0000 7f892e7fc640 1 -- 192.168.123.103:0/3792987500 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8930002780 con 0x7f89400719c0 2026-03-09T16:12:17.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.295+0000 7f892e7fc640 1 -- 192.168.123.103:0/3792987500 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f893000ca90 con 0x7f89400719c0 2026-03-09T16:12:17.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.295+0000 7f892e7fc640 1 -- 192.168.123.103:0/3792987500 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f893000ccb0 con 0x7f89400719c0 2026-03-09T16:12:17.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.296+0000 7f8947445640 1 -- 192.168.123.103:0/3792987500 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8908005350 con 0x7f89400719c0 2026-03-09T16:12:17.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.299+0000 7f892e7fc640 1 --2- 192.168.123.103:0/3792987500 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8914076080 0x7f8914078540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:17.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.299+0000 7f89449b9640 1 --2- 192.168.123.103:0/3792987500 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8914076080 0x7f8914078540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:17.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.300+0000 7f89449b9640 1 --2- 192.168.123.103:0/3792987500 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8914076080 0x7f8914078540 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f89340099b0 tx=0x7f89340023d0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:17.301 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.300+0000 7f892e7fc640 1 -- 192.168.123.103:0/3792987500 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f8930097da0 con 0x7f89400719c0 2026-03-09T16:12:17.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.300+0000 7f892e7fc640 1 -- 192.168.123.103:0/3792987500 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f8930004220 con 0x7f89400719c0 2026-03-09T16:12:17.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.449+0000 7f8947445640 1 -- 192.168.123.103:0/3792987500 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f8908005e10 con 0x7f89400719c0 2026-03-09T16:12:17.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.450+0000 7f892e7fc640 1 -- 192.168.123.103:0/3792987500 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 7 v7) v1 ==== 75+0+1794 (secure 0 0 0) 0x7f89300614c0 con 0x7f89400719c0 2026-03-09T16:12:17.452 INFO:teuthology.orchestra.run.vm03.stdout:e7 2026-03-09T16:12:17.452 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T16:12:17.452 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:12:17.452 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-09T16:12:17.452 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:12:17.452 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-09T16:12:17.452 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-09T16:12:17.452 INFO:teuthology.orchestra.run.vm03.stdout:epoch 7 2026-03-09T16:12:17.452 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T16:12:17.452 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T16:12:12.560035+0000 2026-03-09T16:12:17.452 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T16:12:16.709564+0000 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:12:17.453 
INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:in 0 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14484} 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.jgzfvu{0:14484} state up:active seq 2 addr [v2:192.168.123.105:6824/3577484575,v1:192.168.123.105:6825/3577484575] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons: 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kygyjl{-1:14476} state up:standby seq 1 addr [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kntrco{-1:14492} state up:standby seq 1 addr [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.sqhria{-1:24287} state up:standby seq 1 addr [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:12:17.453 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 7 2026-03-09T16:12:17.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.454+0000 7f890ffff640 1 -- 192.168.123.103:0/3792987500 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8914076080 msgr2=0x7f8914078540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:17.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.454+0000 7f890ffff640 1 --2- 192.168.123.103:0/3792987500 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8914076080 0x7f8914078540 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f89340099b0 tx=0x7f89340023d0 comp rx=0 tx=0).stop 2026-03-09T16:12:17.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.455+0000 7f890ffff640 1 -- 192.168.123.103:0/3792987500 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89400719c0 msgr2=0x7f89401a71a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:17.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.455+0000 7f890ffff640 1 --2- 192.168.123.103:0/3792987500 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89400719c0 0x7f89401a71a0 secure :-1 s=READY pgs=251 cs=0 l=1 rev1=1 crypto rx=0x7f893000b750 tx=0x7f893000bc20 comp rx=0 tx=0).stop 2026-03-09T16:12:17.456 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.455+0000 7f890ffff640 1 -- 192.168.123.103:0/3792987500 shutdown_connections 2026-03-09T16:12:17.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.455+0000 7f890ffff640 1 --2- 192.168.123.103:0/3792987500 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f8914076080 0x7f8914078540 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:17.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.455+0000 7f890ffff640 1 --2- 192.168.123.103:0/3792987500 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89401a76e0 0x7f89400773c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:17.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.455+0000 7f890ffff640 1 --2- 192.168.123.103:0/3792987500 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89400719c0 0x7f89401a71a0 unknown :-1 s=CLOSED pgs=251 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:17.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.455+0000 7f890ffff640 1 -- 192.168.123.103:0/3792987500 >> 192.168.123.103:0/3792987500 conn(0x7f894006d4f0 msgr2=0x7f8940070470 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:17.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.455+0000 7f890ffff640 1 -- 192.168.123.103:0/3792987500 shutdown_connections 2026-03-09T16:12:17.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.455+0000 7f890ffff640 1 -- 192.168.123.103:0/3792987500 wait complete. 2026-03-09T16:12:17.519 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'ceph --format=json fs dump | jq -e ".filesystems | length == 1"' 2026-03-09T16:12:17.718 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:17.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:17 vm03 ceph-mon[51019]: Deploying daemon mds.cephfs.vm05.sqhria on vm05 2026-03-09T16:12:17.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:17 vm03 ceph-mon[51019]: pgmap v79: 65 pgs: 17 creating+peering, 48 active+clean; 449 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 1 op/s 2026-03-09T16:12:17.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:17 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:12:17.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:17 vm03 ceph-mon[51019]: Health check failed: 1 filesystem with deprecated feature inline_data (FS_INLINE_DATA_DEPRECATED) 2026-03-09T16:12:17.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:17 vm03 ceph-mon[51019]: from='client.? ' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]': finished 2026-03-09T16:12:17.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:17 vm03 ceph-mon[51019]: mds.? 
[v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] up:boot 2026-03-09T16:12:17.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:17 vm03 ceph-mon[51019]: mds.? [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] up:boot 2026-03-09T16:12:17.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:17 vm03 ceph-mon[51019]: fsmap cephfs:1 {0=cephfs.vm05.jgzfvu=up:active} 3 up:standby 2026-03-09T16:12:17.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:17 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kntrco"}]: dispatch 2026-03-09T16:12:17.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:17 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.sqhria"}]: dispatch 2026-03-09T16:12:17.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:17 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:17.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:17 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:17.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:17 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/3792987500' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:12:17.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.997+0000 7f72dc87e640 1 -- 192.168.123.103:0/652450885 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72d4076df0 msgr2=0x7f72d4077250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:17.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.997+0000 7f72dc87e640 1 --2- 192.168.123.103:0/652450885 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72d4076df0 0x7f72d4077250 secure :-1 s=READY pgs=252 cs=0 l=1 rev1=1 crypto rx=0x7f72c00099b0 tx=0x7f72c002f220 comp rx=0 tx=0).stop 2026-03-09T16:12:17.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.998+0000 7f72dc87e640 1 -- 192.168.123.103:0/652450885 shutdown_connections 2026-03-09T16:12:17.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.998+0000 7f72dc87e640 1 --2- 192.168.123.103:0/652450885 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72d4076df0 0x7f72d4077250 unknown :-1 s=CLOSED pgs=252 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:17.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.998+0000 7f72dc87e640 1 --2- 192.168.123.103:0/652450885 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72d4075ba0 0x7f72d4075fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:17.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.998+0000 7f72dc87e640 1 -- 192.168.123.103:0/652450885 >> 192.168.123.103:0/652450885 conn(0x7f72d40fe250 msgr2=0x7f72d4100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:17.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.998+0000 7f72dc87e640 1 -- 192.168.123.103:0/652450885 shutdown_connections 2026-03-09T16:12:17.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.998+0000 7f72dc87e640 1 -- 192.168.123.103:0/652450885 wait complete. 
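[note] The shell one-liner dispatched just above turns a status query into an assertion: with jq's -e flag the exit status reflects the final expression, so the command (and therefore the cephadm.shell task step) fails unless exactly one filesystem is present in the fs dump. A minimal form of the same check, leaving out the --image and --fsid pinning the harness passes to cephadm, would be:

    sudo cephadm shell -- bash -c 'ceph --format=json fs dump | jq -e ".filesystems | length == 1"'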
2026-03-09T16:12:17.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.999+0000 7f72dc87e640 1 Processor -- start 2026-03-09T16:12:17.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.999+0000 7f72dc87e640 1 -- start start 2026-03-09T16:12:18.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.999+0000 7f72dc87e640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72d4075ba0 0x7f72d419e9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:18.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.999+0000 7f72dc87e640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72d4076df0 0x7f72d419eef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:18.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.999+0000 7f72dc87e640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f72d419f4c0 con 0x7f72d4076df0 2026-03-09T16:12:18.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.999+0000 7f72dc87e640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f72d419f630 con 0x7f72d4075ba0 2026-03-09T16:12:18.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.999+0000 7f72d9df2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72d4076df0 0x7f72d419eef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:18.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.999+0000 7f72d9df2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72d4076df0 0x7f72d419eef0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:49058/0 (socket says 192.168.123.103:49058) 2026-03-09T16:12:18.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:17.999+0000 7f72d9df2640 1 -- 192.168.123.103:0/1429470193 learned_addr learned my addr 192.168.123.103:0/1429470193 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:18.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.000+0000 7f72d9df2640 1 -- 192.168.123.103:0/1429470193 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72d4075ba0 msgr2=0x7f72d419e9b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:12:18.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.000+0000 7f72da5f3640 1 --2- 192.168.123.103:0/1429470193 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72d4075ba0 0x7f72d419e9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:18.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.000+0000 7f72d9df2640 1 --2- 192.168.123.103:0/1429470193 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72d4075ba0 0x7f72d419e9b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:18.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.000+0000 7f72d9df2640 1 -- 192.168.123.103:0/1429470193 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f72c0009660 con 
0x7f72d4076df0 2026-03-09T16:12:18.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.001+0000 7f72da5f3640 1 --2- 192.168.123.103:0/1429470193 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72d4075ba0 0x7f72d419e9b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:12:18.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.001+0000 7f72d9df2640 1 --2- 192.168.123.103:0/1429470193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72d4076df0 0x7f72d419eef0 secure :-1 s=READY pgs=253 cs=0 l=1 rev1=1 crypto rx=0x7f72c0009ae0 tx=0x7f72c0002980 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:18.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.001+0000 7f72bf7fe640 1 -- 192.168.123.103:0/1429470193 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f72c003d070 con 0x7f72d4076df0 2026-03-09T16:12:18.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.001+0000 7f72bf7fe640 1 -- 192.168.123.103:0/1429470193 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f72c0031df0 con 0x7f72d4076df0 2026-03-09T16:12:18.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.001+0000 7f72bf7fe640 1 -- 192.168.123.103:0/1429470193 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f72c0031280 con 0x7f72d4076df0 2026-03-09T16:12:18.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.001+0000 7f72dc87e640 1 -- 192.168.123.103:0/1429470193 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f72d41a4070 con 0x7f72d4076df0 2026-03-09T16:12:18.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.001+0000 7f72dc87e640 1 -- 192.168.123.103:0/1429470193 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f72d41a45e0 con 0x7f72d4076df0 2026-03-09T16:12:18.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.003+0000 7f72bf7fe640 1 -- 192.168.123.103:0/1429470193 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f72c0049050 con 0x7f72d4076df0 2026-03-09T16:12:18.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.003+0000 7f72bf7fe640 1 --2- 192.168.123.103:0/1429470193 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f72b00761c0 0x7f72b0078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:18.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.003+0000 7f72da5f3640 1 --2- 192.168.123.103:0/1429470193 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f72b00761c0 0x7f72b0078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:18.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.004+0000 7f72bf7fe640 1 -- 192.168.123.103:0/1429470193 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f72c00bd410 con 0x7f72d4076df0 2026-03-09T16:12:18.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.004+0000 7f72dc87e640 1 -- 192.168.123.103:0/1429470193 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f72d410fc00 con 0x7f72d4076df0 2026-03-09T16:12:18.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.007+0000 7f72da5f3640 1 --2- 192.168.123.103:0/1429470193 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f72b00761c0 0x7f72b0078680 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f72cc006fd0 tx=0x7f72cc008040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:18.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.008+0000 7f72bf7fe640 1 -- 192.168.123.103:0/1429470193 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f72c00869c0 con 0x7f72d4076df0 2026-03-09T16:12:18.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:17 vm05 ceph-mon[58702]: Deploying daemon mds.cephfs.vm05.sqhria on vm05 2026-03-09T16:12:18.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:17 vm05 ceph-mon[58702]: pgmap v79: 65 pgs: 17 creating+peering, 48 active+clean; 449 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 1 op/s 2026-03-09T16:12:18.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:17 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:12:18.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:17 vm05 ceph-mon[58702]: Health check failed: 1 filesystem with deprecated feature inline_data (FS_INLINE_DATA_DEPRECATED) 2026-03-09T16:12:18.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:17 vm05 ceph-mon[58702]: from='client.? ' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]': finished 2026-03-09T16:12:18.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:17 vm05 ceph-mon[58702]: mds.? [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] up:boot 2026-03-09T16:12:18.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:17 vm05 ceph-mon[58702]: mds.? [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] up:boot 2026-03-09T16:12:18.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:17 vm05 ceph-mon[58702]: fsmap cephfs:1 {0=cephfs.vm05.jgzfvu=up:active} 3 up:standby 2026-03-09T16:12:18.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:17 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kntrco"}]: dispatch 2026-03-09T16:12:18.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:17 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.sqhria"}]: dispatch 2026-03-09T16:12:18.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:17 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:18.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:17 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:18.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:17 vm05 ceph-mon[58702]: from='client.? 
192.168.123.103:0/3792987500' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:12:18.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.164+0000 7f72dc87e640 1 -- 192.168.123.103:0/1429470193 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f72d410fde0 con 0x7f72d4076df0 2026-03-09T16:12:18.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.164+0000 7f72bf7fe640 1 -- 192.168.123.103:0/1429470193 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 7 v7) v1 ==== 93+0+4838 (secure 0 0 0) 0x7f72c0086360 con 0x7f72d4076df0 2026-03-09T16:12:18.166 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 7 2026-03-09T16:12:18.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.169+0000 7f72dc87e640 1 -- 192.168.123.103:0/1429470193 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f72b00761c0 msgr2=0x7f72b0078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:18.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.169+0000 7f72dc87e640 1 --2- 192.168.123.103:0/1429470193 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f72b00761c0 0x7f72b0078680 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f72cc006fd0 tx=0x7f72cc008040 comp rx=0 tx=0).stop 2026-03-09T16:12:18.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.169+0000 7f72dc87e640 1 -- 192.168.123.103:0/1429470193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72d4076df0 msgr2=0x7f72d419eef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:18.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.169+0000 7f72dc87e640 1 --2- 192.168.123.103:0/1429470193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72d4076df0 0x7f72d419eef0 secure :-1 s=READY pgs=253 cs=0 l=1 rev1=1 crypto rx=0x7f72c0009ae0 tx=0x7f72c0002980 comp rx=0 tx=0).stop 2026-03-09T16:12:18.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.172+0000 7f72dc87e640 1 -- 192.168.123.103:0/1429470193 shutdown_connections 2026-03-09T16:12:18.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.172+0000 7f72dc87e640 1 --2- 192.168.123.103:0/1429470193 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f72b00761c0 0x7f72b0078680 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:18.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.172+0000 7f72dc87e640 1 --2- 192.168.123.103:0/1429470193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72d4076df0 0x7f72d419eef0 unknown :-1 s=CLOSED pgs=253 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:18.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.172+0000 7f72dc87e640 1 --2- 192.168.123.103:0/1429470193 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72d4075ba0 0x7f72d419e9b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:18.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.172+0000 7f72dc87e640 1 -- 192.168.123.103:0/1429470193 >> 192.168.123.103:0/1429470193 conn(0x7f72d40fe250 msgr2=0x7f72d40ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:18.173 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.172+0000 7f72dc87e640 1 -- 192.168.123.103:0/1429470193 shutdown_connections 2026-03-09T16:12:18.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.172+0000 7f72dc87e640 1 -- 192.168.123.103:0/1429470193 wait complete. 2026-03-09T16:12:18.180 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:12:18.225 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done' 2026-03-09T16:12:18.436 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:18.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.718+0000 7f5504df7640 1 -- 192.168.123.103:0/2337656889 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5500072370 msgr2=0x7f550010c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:18.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.718+0000 7f5504df7640 1 --2- 192.168.123.103:0/2337656889 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5500072370 0x7f550010c590 secure :-1 s=READY pgs=254 cs=0 l=1 rev1=1 crypto rx=0x7f54f00099b0 tx=0x7f54f002f220 comp rx=0 tx=0).stop 2026-03-09T16:12:18.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.719+0000 7f5504df7640 1 -- 192.168.123.103:0/2337656889 shutdown_connections 2026-03-09T16:12:18.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.719+0000 7f5504df7640 1 --2- 192.168.123.103:0/2337656889 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5500072370 0x7f550010c590 unknown :-1 s=CLOSED pgs=254 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:18.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.719+0000 7f5504df7640 1 --2- 192.168.123.103:0/2337656889 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55000719a0 0x7f5500071da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:18.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.719+0000 7f5504df7640 1 -- 192.168.123.103:0/2337656889 >> 192.168.123.103:0/2337656889 conn(0x7f550006d4f0 msgr2=0x7f550006f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.719+0000 7f5504df7640 1 -- 192.168.123.103:0/2337656889 shutdown_connections 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.720+0000 7f5504df7640 1 -- 192.168.123.103:0/2337656889 wait complete. 
2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.720+0000 7f5504df7640 1 Processor -- start 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.720+0000 7f5504df7640 1 -- start start 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.720+0000 7f5504df7640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55000719a0 0x7f55001a7160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.720+0000 7f5504df7640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5500072370 0x7f55001a76a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.720+0000 7f5504df7640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f55001a7c70 con 0x7f5500072370 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.720+0000 7f5504df7640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f55001a7de0 con 0x7f55000719a0 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.721+0000 7f54feffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5500072370 0x7f55001a76a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.721+0000 7f54ff7fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55000719a0 0x7f55001a7160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.721+0000 7f54ff7fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55000719a0 0x7f55001a7160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:58640/0 (socket says 192.168.123.103:58640) 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.721+0000 7f54ff7fe640 1 -- 192.168.123.103:0/3135764628 learned_addr learned my addr 192.168.123.103:0/3135764628 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.721+0000 7f54feffd640 1 -- 192.168.123.103:0/3135764628 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55000719a0 msgr2=0x7f55001a7160 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.721+0000 7f54feffd640 1 --2- 192.168.123.103:0/3135764628 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55000719a0 0x7f55001a7160 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.721+0000 7f54feffd640 1 -- 192.168.123.103:0/3135764628 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f54f0009660 con 0x7f5500072370 
2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.721+0000 7f54feffd640 1 --2- 192.168.123.103:0/3135764628 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5500072370 0x7f55001a76a0 secure :-1 s=READY pgs=255 cs=0 l=1 rev1=1 crypto rx=0x7f54f0002c20 tx=0x7f54f0002910 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.721+0000 7f54fcff9640 1 -- 192.168.123.103:0/3135764628 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f54f003d070 con 0x7f5500072370 2026-03-09T16:12:18.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.721+0000 7f54fcff9640 1 -- 192.168.123.103:0/3135764628 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f54f0002e20 con 0x7f5500072370 2026-03-09T16:12:18.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.722+0000 7f54fcff9640 1 -- 192.168.123.103:0/3135764628 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f54f00416b0 con 0x7f5500072370 2026-03-09T16:12:18.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.723+0000 7f5504df7640 1 -- 192.168.123.103:0/3135764628 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f550010ee50 con 0x7f5500072370 2026-03-09T16:12:18.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.723+0000 7f5504df7640 1 -- 192.168.123.103:0/3135764628 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f550010f2c0 con 0x7f5500072370 2026-03-09T16:12:18.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.724+0000 7f5504df7640 1 -- 192.168.123.103:0/3135764628 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5500071da0 con 0x7f5500072370 2026-03-09T16:12:18.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.726+0000 7f54fcff9640 1 -- 192.168.123.103:0/3135764628 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f54f002fc90 con 0x7f5500072370 2026-03-09T16:12:18.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.727+0000 7f54fcff9640 1 --2- 192.168.123.103:0/3135764628 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f54d4076000 0x7f54d40784c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:18.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.727+0000 7f54fcff9640 1 -- 192.168.123.103:0/3135764628 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f54f00bc790 con 0x7f5500072370 2026-03-09T16:12:18.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.729+0000 7f54fcff9640 1 -- 192.168.123.103:0/3135764628 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f54f0085e00 con 0x7f5500072370 2026-03-09T16:12:18.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.729+0000 7f54ff7fe640 1 --2- 192.168.123.103:0/3135764628 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f54d4076000 0x7f54d40784c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:18.731 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:18 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/1429470193' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T16:12:18.731 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:18 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:18.731 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:18 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:18.731 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:18 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:18.731 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:18 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:12:18.731 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:18 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:18.731 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:18 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:12:18.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.733+0000 7f54ff7fe640 1 --2- 192.168.123.103:0/3135764628 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f54d4076000 0x7f54d40784c0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f54ec0059c0 tx=0x7f54ec009290 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:18.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.873+0000 7f5504df7640 1 -- 192.168.123.103:0/3135764628 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mds versions", "format": "json"} v 0) v1 -- 0x7f55001184d0 con 0x7f5500072370 2026-03-09T16:12:18.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.877+0000 7f54fcff9640 1 -- 192.168.123.103:0/3135764628 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mds versions", "format": "json"}]=0 v9) v1 ==== 78+0+98 (secure 0 0 0) 0x7f54f00857a0 con 0x7f5500072370 2026-03-09T16:12:18.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.881+0000 7f5504df7640 1 -- 192.168.123.103:0/3135764628 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f54d4076000 msgr2=0x7f54d40784c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:18.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.881+0000 7f5504df7640 1 --2- 192.168.123.103:0/3135764628 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f54d4076000 0x7f54d40784c0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f54ec0059c0 tx=0x7f54ec009290 comp rx=0 tx=0).stop 2026-03-09T16:12:18.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.881+0000 7f5504df7640 1 -- 192.168.123.103:0/3135764628 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5500072370 msgr2=0x7f55001a76a0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:18.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.881+0000 7f5504df7640 1 --2- 192.168.123.103:0/3135764628 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5500072370 0x7f55001a76a0 secure :-1 s=READY pgs=255 cs=0 l=1 rev1=1 crypto rx=0x7f54f0002c20 tx=0x7f54f0002910 comp rx=0 tx=0).stop 2026-03-09T16:12:18.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.881+0000 7f5504df7640 1 -- 192.168.123.103:0/3135764628 shutdown_connections 2026-03-09T16:12:18.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.881+0000 7f5504df7640 1 --2- 192.168.123.103:0/3135764628 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f54d4076000 0x7f54d40784c0 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:18.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.881+0000 7f5504df7640 1 --2- 192.168.123.103:0/3135764628 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5500072370 0x7f55001a76a0 unknown :-1 s=CLOSED pgs=255 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:18.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.881+0000 7f5504df7640 1 --2- 192.168.123.103:0/3135764628 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55000719a0 0x7f55001a7160 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:18.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.881+0000 7f5504df7640 1 -- 192.168.123.103:0/3135764628 >> 192.168.123.103:0/3135764628 conn(0x7f550006d4f0 msgr2=0x7f550010a860 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:18.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.882+0000 7f5504df7640 1 -- 192.168.123.103:0/3135764628 shutdown_connections 2026-03-09T16:12:18.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:18.882+0000 7f5504df7640 1 -- 192.168.123.103:0/3135764628 wait complete. 2026-03-09T16:12:18.891 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:12:18.929 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:18 vm05 ceph-mon[58702]: from='client.? 
192.168.123.103:0/1429470193' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T16:12:18.929 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:18 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:18.929 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:18 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:18.929 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:18 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:18.929 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:18 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:12:18.929 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:18 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:18.929 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:18 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:12:18.941 INFO:teuthology.run_tasks:Running task fs.pre_upgrade_save... 2026-03-09T16:12:18.945 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 2026-03-09T16:12:19.112 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.387+0000 7fa5a0a30640 1 -- 192.168.123.103:0/930810525 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa59c071ec0 msgr2=0x7fa59c072320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.387+0000 7fa5a0a30640 1 --2- 192.168.123.103:0/930810525 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa59c071ec0 0x7fa59c072320 secure :-1 s=READY pgs=256 cs=0 l=1 rev1=1 crypto rx=0x7fa58c0099b0 tx=0x7fa58c02f220 comp rx=0 tx=0).stop 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.388+0000 7fa5a0a30640 1 -- 192.168.123.103:0/930810525 shutdown_connections 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.388+0000 7fa5a0a30640 1 --2- 192.168.123.103:0/930810525 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa59c071ec0 0x7fa59c072320 unknown :-1 s=CLOSED pgs=256 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.388+0000 7fa5a0a30640 1 --2- 192.168.123.103:0/930810525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa59c10ae70 0x7fa59c10b250 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.388+0000 7fa5a0a30640 1 -- 192.168.123.103:0/930810525 >> 192.168.123.103:0/930810525 conn(0x7fa59c06c7d0 msgr2=0x7fa59c06cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:19.391 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.388+0000 7fa5a0a30640 1 -- 192.168.123.103:0/930810525 shutdown_connections 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.388+0000 7fa5a0a30640 1 -- 192.168.123.103:0/930810525 wait complete. 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.389+0000 7fa5a0a30640 1 Processor -- start 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.389+0000 7fa5a0a30640 1 -- start start 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.389+0000 7fa5a0a30640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa59c071ec0 0x7fa59c110ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.389+0000 7fa5a0a30640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa59c10ae70 0x7fa59c111220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.389+0000 7fa5a0a30640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa59c10be50 con 0x7fa59c10ae70 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.389+0000 7fa5a0a30640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa59c10bfc0 con 0x7fa59c071ec0 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.389+0000 7fa59affd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa59c10ae70 0x7fa59c111220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.389+0000 7fa59affd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa59c10ae70 0x7fa59c111220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:38378/0 (socket says 192.168.123.103:38378) 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.389+0000 7fa59affd640 1 -- 192.168.123.103:0/143191036 learned_addr learned my addr 192.168.123.103:0/143191036 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.389+0000 7fa59b7fe640 1 --2- 192.168.123.103:0/143191036 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa59c071ec0 0x7fa59c110ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.390+0000 7fa59affd640 1 -- 192.168.123.103:0/143191036 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa59c071ec0 msgr2=0x7fa59c110ce0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.390+0000 7fa59affd640 1 --2- 192.168.123.103:0/143191036 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa59c071ec0 0x7fa59c110ce0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.390+0000 7fa59affd640 1 -- 192.168.123.103:0/143191036 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa58c009660 con 0x7fa59c10ae70 2026-03-09T16:12:19.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.390+0000 7fa59affd640 1 --2- 192.168.123.103:0/143191036 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa59c10ae70 0x7fa59c111220 secure :-1 s=READY pgs=257 cs=0 l=1 rev1=1 crypto rx=0x7fa58c002410 tx=0x7fa58c004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:19.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.390+0000 7fa598ff9640 1 -- 192.168.123.103:0/143191036 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa58c03d070 con 0x7fa59c10ae70 2026-03-09T16:12:19.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.390+0000 7fa598ff9640 1 -- 192.168.123.103:0/143191036 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa58c038730 con 0x7fa59c10ae70 2026-03-09T16:12:19.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.390+0000 7fa598ff9640 1 -- 192.168.123.103:0/143191036 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa58c041620 con 0x7fa59c10ae70 2026-03-09T16:12:19.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.390+0000 7fa5a0a30640 1 -- 192.168.123.103:0/143191036 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa59c10c240 con 0x7fa59c10ae70 2026-03-09T16:12:19.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.390+0000 7fa5a0a30640 1 -- 192.168.123.103:0/143191036 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa59c10c730 con 0x7fa59c10ae70 2026-03-09T16:12:19.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.391+0000 7fa5a0a30640 1 -- 192.168.123.103:0/143191036 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa568005350 con 0x7fa59c10ae70 2026-03-09T16:12:19.399 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.396+0000 7fa598ff9640 1 -- 192.168.123.103:0/143191036 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fa58c0388a0 con 0x7fa59c10ae70 2026-03-09T16:12:19.399 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.397+0000 7fa598ff9640 1 --2- 192.168.123.103:0/143191036 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa574076170 0x7fa574078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:19.399 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.397+0000 7fa598ff9640 1 -- 192.168.123.103:0/143191036 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fa58c0bc7b0 con 0x7fa59c10ae70 2026-03-09T16:12:19.399 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.397+0000 7fa598ff9640 1 -- 192.168.123.103:0/143191036 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fa58c0bcc40 con 0x7fa59c10ae70 
2026-03-09T16:12:19.399 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.397+0000 7fa59b7fe640 1 --2- 192.168.123.103:0/143191036 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa574076170 0x7fa574078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:19.399 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.398+0000 7fa59b7fe640 1 --2- 192.168.123.103:0/143191036 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa574076170 0x7fa574078630 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fa590009d50 tx=0x7fa59000b040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:19.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.530+0000 7fa5a0a30640 1 -- 192.168.123.103:0/143191036 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7fa5680051c0 con 0x7fa59c10ae70 2026-03-09T16:12:19.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.532+0000 7fa598ff9640 1 -- 192.168.123.103:0/143191036 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 9 v9) v1 ==== 93+0+4055 (secure 0 0 0) 0x7fa58c085e20 con 0x7fa59c10ae70 2026-03-09T16:12:19.533 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:12:19.533 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":9,"default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14492,"name":"cephfs.vm03.kntrco","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/3419491835","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3419491835},{"type":"v1","addr":"192.168.123.103:6829","nonce":3419491835}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":7},{"gid":24287,"name":"cephfs.vm05.sqhria","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no 
anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":7}],"filesystems":[{"mdsmap":{"epoch":9,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:12:18.629355+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":41,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14476},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14476":{"gid":14476,"name":"cephfs.vm03.kygyjl","rank":0,"incarnation":9,"state":"up:replay","state_seq":2,"addr":"192.168.123.103:6827/1622851291","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":1622851291},{"type":"v1","addr":"192.168.123.103:6827","nonce":1622851291}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1},"id":1}]} 2026-03-09T16:12:19.533 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 9 2026-03-09T16:12:19.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.533+0000 7fa5a0a30640 1 -- 192.168.123.103:0/143191036 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa574076170 msgr2=0x7fa574078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:19.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.533+0000 7fa5a0a30640 1 --2- 192.168.123.103:0/143191036 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa574076170 0x7fa574078630 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fa590009d50 tx=0x7fa59000b040 comp rx=0 tx=0).stop 2026-03-09T16:12:19.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.533+0000 7fa5a0a30640 1 -- 192.168.123.103:0/143191036 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa59c10ae70 msgr2=0x7fa59c111220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:19.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.533+0000 7fa5a0a30640 1 --2- 192.168.123.103:0/143191036 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa59c10ae70 0x7fa59c111220 secure :-1 s=READY pgs=257 cs=0 l=1 rev1=1 crypto rx=0x7fa58c002410 tx=0x7fa58c004290 comp rx=0 tx=0).stop 2026-03-09T16:12:19.535 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.534+0000 7fa5a0a30640 1 -- 192.168.123.103:0/143191036 shutdown_connections 2026-03-09T16:12:19.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.534+0000 7fa5a0a30640 1 --2- 192.168.123.103:0/143191036 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa574076170 0x7fa574078630 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:19.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.534+0000 7fa5a0a30640 1 --2- 192.168.123.103:0/143191036 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa59c10ae70 0x7fa59c111220 unknown :-1 s=CLOSED pgs=257 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:19.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.534+0000 7fa5a0a30640 1 --2- 192.168.123.103:0/143191036 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa59c071ec0 0x7fa59c110ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:19.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.534+0000 7fa5a0a30640 1 -- 192.168.123.103:0/143191036 >> 192.168.123.103:0/143191036 conn(0x7fa59c06c7d0 msgr2=0x7fa59c1a01a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:19.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.534+0000 7fa5a0a30640 1 -- 192.168.123.103:0/143191036 shutdown_connections 2026-03-09T16:12:19.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:19.534+0000 7fa5a0a30640 1 -- 192.168.123.103:0/143191036 wait complete. 2026-03-09T16:12:19.589 DEBUG:tasks.fs:fs fscid=1,name=cephfs state = {'epoch': 9, 'max_mds': 1, 'flags': 18} 2026-03-09T16:12:19.589 INFO:teuthology.run_tasks:Running task ceph-fuse... 2026-03-09T16:12:19.600 INFO:tasks.ceph_fuse:Running ceph_fuse task... 
2026-03-09T16:12:19.600 INFO:tasks.ceph_fuse:config is {'client.0': {}, 'client.1': {}} 2026-03-09T16:12:19.600 INFO:tasks.ceph_fuse:client.0 config is {} 2026-03-09T16:12:19.600 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-09T16:12:19.600 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-09T16:12:19.600 INFO:tasks.ceph_fuse:client.1 config is {} 2026-03-09T16:12:19.600 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-09T16:12:19.600 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-09T16:12:19.600 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:19.600 DEBUG:teuthology.orchestra.run.vm05:> ip netns list 2026-03-09T16:12:19.620 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:19.620 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link delete ceph-brx 2026-03-09T16:12:19.691 INFO:teuthology.orchestra.run.vm05.stderr:Cannot find device "ceph-brx" 2026-03-09T16:12:19.692 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T16:12:19.692 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:19.692 DEBUG:teuthology.orchestra.run.vm03:> ip netns list 2026-03-09T16:12:19.723 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:19.723 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link delete ceph-brx 2026-03-09T16:12:19.811 INFO:teuthology.orchestra.run.vm03.stderr:Cannot find device "ceph-brx" 2026-03-09T16:12:19.812 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T16:12:19.812 INFO:tasks.ceph_fuse:Mounting ceph-fuse clients... 2026-03-09T16:12:19.812 DEBUG:tasks.ceph_fuse:passing mntargs=[] 2026-03-09T16:12:19.812 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs ls 2026-03-09T16:12:19.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:19 vm03 ceph-mon[51019]: pgmap v80: 65 pgs: 6 creating+peering, 59 active+clean; 449 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 579 B/s wr, 1 op/s 2026-03-09T16:12:19.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:19 vm03 ceph-mon[51019]: mds.? [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] up:standby 2026-03-09T16:12:19.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:19 vm03 ceph-mon[51019]: Dropping low affinity active daemon mds.cephfs.vm05.jgzfvu in favor of higher affinity standby. 
2026-03-09T16:12:19.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:19 vm03 ceph-mon[51019]: Replacing daemon mds.cephfs.vm05.jgzfvu as rank 0 with standby daemon mds.cephfs.vm03.kygyjl 2026-03-09T16:12:19.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:19 vm03 ceph-mon[51019]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T16:12:19.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:19 vm03 ceph-mon[51019]: fsmap cephfs:1 {0=cephfs.vm05.jgzfvu=up:active} 3 up:standby 2026-03-09T16:12:19.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:19 vm03 ceph-mon[51019]: osdmap e41: 6 total, 6 up, 6 in 2026-03-09T16:12:19.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:19 vm03 ceph-mon[51019]: fsmap cephfs:1/1 {0=cephfs.vm03.kygyjl=up:replay} 2 up:standby 2026-03-09T16:12:19.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:19 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:19.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:19 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:12:19.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:19 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/3135764628' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-09T16:12:19.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:19 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:19.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:19 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:19.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:19 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/143191036' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T16:12:19.999 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:20.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:19 vm05 ceph-mon[58702]: pgmap v80: 65 pgs: 6 creating+peering, 59 active+clean; 449 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 579 B/s wr, 1 op/s 2026-03-09T16:12:20.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:19 vm05 ceph-mon[58702]: mds.? [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] up:standby 2026-03-09T16:12:20.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:19 vm05 ceph-mon[58702]: Dropping low affinity active daemon mds.cephfs.vm05.jgzfvu in favor of higher affinity standby. 
2026-03-09T16:12:20.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:19 vm05 ceph-mon[58702]: Replacing daemon mds.cephfs.vm05.jgzfvu as rank 0 with standby daemon mds.cephfs.vm03.kygyjl 2026-03-09T16:12:20.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:19 vm05 ceph-mon[58702]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T16:12:20.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:19 vm05 ceph-mon[58702]: fsmap cephfs:1 {0=cephfs.vm05.jgzfvu=up:active} 3 up:standby 2026-03-09T16:12:20.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:19 vm05 ceph-mon[58702]: osdmap e41: 6 total, 6 up, 6 in 2026-03-09T16:12:20.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:19 vm05 ceph-mon[58702]: fsmap cephfs:1/1 {0=cephfs.vm03.kygyjl=up:replay} 2 up:standby 2026-03-09T16:12:20.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:19 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:20.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:19 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:12:20.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:19 vm05 ceph-mon[58702]: from='client.? 192.168.123.103:0/3135764628' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-09T16:12:20.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:19 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:20.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:19 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:20.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:19 vm05 ceph-mon[58702]: from='client.? 
192.168.123.103:0/143191036' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T16:12:20.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.306+0000 7fcba1931640 1 -- 192.168.123.103:0/1502958763 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcb9c105520 msgr2=0x7fcb9c105900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:20.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.306+0000 7fcba1931640 1 --2- 192.168.123.103:0/1502958763 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcb9c105520 0x7fcb9c105900 secure :-1 s=READY pgs=258 cs=0 l=1 rev1=1 crypto rx=0x7fcb900099b0 tx=0x7fcb9002f240 comp rx=0 tx=0).stop 2026-03-09T16:12:20.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.308+0000 7fcba1931640 1 -- 192.168.123.103:0/1502958763 shutdown_connections 2026-03-09T16:12:20.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.308+0000 7fcba1931640 1 --2- 192.168.123.103:0/1502958763 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb9c068600 0x7fcb9c068a40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:20.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.308+0000 7fcba1931640 1 --2- 192.168.123.103:0/1502958763 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcb9c105520 0x7fcb9c105900 unknown :-1 s=CLOSED pgs=258 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:20.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.308+0000 7fcba1931640 1 -- 192.168.123.103:0/1502958763 >> 192.168.123.103:0/1502958763 conn(0x7fcb9c075250 msgr2=0x7fcb9c075660 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:20.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.308+0000 7fcba1931640 1 -- 192.168.123.103:0/1502958763 shutdown_connections 2026-03-09T16:12:20.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.308+0000 7fcba1931640 1 -- 192.168.123.103:0/1502958763 wait complete. 
2026-03-09T16:12:20.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.309+0000 7fcba1931640 1 Processor -- start 2026-03-09T16:12:20.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.310+0000 7fcba1931640 1 -- start start 2026-03-09T16:12:20.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.310+0000 7fcba1931640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb9c068600 0x7fcb9c19ecc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:20.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.310+0000 7fcba1931640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcb9c105520 0x7fcb9c19f200 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:20.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.310+0000 7fcba1931640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb9c19f8e0 con 0x7fcb9c105520 2026-03-09T16:12:20.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.310+0000 7fcba1931640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb9c1a2550 con 0x7fcb9c068600 2026-03-09T16:12:20.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.310+0000 7fcb9affd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb9c068600 0x7fcb9c19ecc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:20.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.310+0000 7fcb9a7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcb9c105520 0x7fcb9c19f200 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:20.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.310+0000 7fcb9a7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcb9c105520 0x7fcb9c19f200 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:38404/0 (socket says 192.168.123.103:38404) 2026-03-09T16:12:20.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.310+0000 7fcb9a7fc640 1 -- 192.168.123.103:0/3927386079 learned_addr learned my addr 192.168.123.103:0/3927386079 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:20.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.310+0000 7fcb9a7fc640 1 -- 192.168.123.103:0/3927386079 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb9c068600 msgr2=0x7fcb9c19ecc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:20.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.310+0000 7fcb9a7fc640 1 --2- 192.168.123.103:0/3927386079 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb9c068600 0x7fcb9c19ecc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:20.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.310+0000 7fcb9a7fc640 1 -- 192.168.123.103:0/3927386079 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcb90009660 con 0x7fcb9c105520 
2026-03-09T16:12:20.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.310+0000 7fcb9a7fc640 1 --2- 192.168.123.103:0/3927386079 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcb9c105520 0x7fcb9c19f200 secure :-1 s=READY pgs=259 cs=0 l=1 rev1=1 crypto rx=0x7fcb8400d8d0 tx=0x7fcb8400dda0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:20.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.311+0000 7fcba092f640 1 -- 192.168.123.103:0/3927386079 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcb84004490 con 0x7fcb9c105520 2026-03-09T16:12:20.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.311+0000 7fcba092f640 1 -- 192.168.123.103:0/3927386079 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fcb840076c0 con 0x7fcb9c105520 2026-03-09T16:12:20.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.311+0000 7fcba092f640 1 -- 192.168.123.103:0/3927386079 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcb84002e90 con 0x7fcb9c105520 2026-03-09T16:12:20.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.312+0000 7fcba1931640 1 -- 192.168.123.103:0/3927386079 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcb9c1a2830 con 0x7fcb9c105520 2026-03-09T16:12:20.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.312+0000 7fcba1931640 1 -- 192.168.123.103:0/3927386079 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcb9c1a2d00 con 0x7fcb9c105520 2026-03-09T16:12:20.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.313+0000 7fcba092f640 1 -- 192.168.123.103:0/3927386079 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fcb8400b840 con 0x7fcb9c105520 2026-03-09T16:12:20.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.313+0000 7fcba1931640 1 -- 192.168.123.103:0/3927386079 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcb60005350 con 0x7fcb9c105520 2026-03-09T16:12:20.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.317+0000 7fcba092f640 1 --2- 192.168.123.103:0/3927386079 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcb70075fb0 0x7fcb70078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:20.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.317+0000 7fcb9affd640 1 --2- 192.168.123.103:0/3927386079 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcb70075fb0 0x7fcb70078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:20.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.317+0000 7fcba092f640 1 -- 192.168.123.103:0/3927386079 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fcb84096d90 con 0x7fcb9c105520 2026-03-09T16:12:20.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.317+0000 7fcb9affd640 1 --2- 192.168.123.103:0/3927386079 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] 
conn(0x7fcb70075fb0 0x7fcb70078470 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7fcb90002410 tx=0x7fcb9003a040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:20.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.317+0000 7fcba092f640 1 -- 192.168.123.103:0/3927386079 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fcb8409c050 con 0x7fcb9c105520 2026-03-09T16:12:20.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.445+0000 7fcba1931640 1 -- 192.168.123.103:0/3927386079 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7fcb600058d0 con 0x7fcb9c105520 2026-03-09T16:12:20.447 INFO:teuthology.orchestra.run.vm03.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-09T16:12:20.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.446+0000 7fcba092f640 1 -- 192.168.123.103:0/3927386079 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v10) v1 ==== 53+0+83 (secure 0 0 0) 0x7fcb84060360 con 0x7fcb9c105520 2026-03-09T16:12:20.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.451+0000 7fcba1931640 1 -- 192.168.123.103:0/3927386079 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcb70075fb0 msgr2=0x7fcb70078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:20.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.451+0000 7fcba1931640 1 --2- 192.168.123.103:0/3927386079 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcb70075fb0 0x7fcb70078470 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7fcb90002410 tx=0x7fcb9003a040 comp rx=0 tx=0).stop 2026-03-09T16:12:20.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.451+0000 7fcba1931640 1 -- 192.168.123.103:0/3927386079 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcb9c105520 msgr2=0x7fcb9c19f200 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:20.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.451+0000 7fcba1931640 1 --2- 192.168.123.103:0/3927386079 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcb9c105520 0x7fcb9c19f200 secure :-1 s=READY pgs=259 cs=0 l=1 rev1=1 crypto rx=0x7fcb8400d8d0 tx=0x7fcb8400dda0 comp rx=0 tx=0).stop 2026-03-09T16:12:20.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.452+0000 7fcba1931640 1 -- 192.168.123.103:0/3927386079 shutdown_connections 2026-03-09T16:12:20.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.452+0000 7fcba1931640 1 --2- 192.168.123.103:0/3927386079 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcb70075fb0 0x7fcb70078470 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:20.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.452+0000 7fcba1931640 1 --2- 192.168.123.103:0/3927386079 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcb9c105520 0x7fcb9c19f200 unknown :-1 s=CLOSED pgs=259 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:20.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.452+0000 7fcba1931640 1 --2- 192.168.123.103:0/3927386079 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb9c068600 0x7fcb9c19ecc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:20.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.452+0000 7fcba1931640 1 -- 192.168.123.103:0/3927386079 >> 192.168.123.103:0/3927386079 conn(0x7fcb9c075250 msgr2=0x7fcb9c111b70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:20.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.452+0000 7fcba1931640 1 -- 192.168.123.103:0/3927386079 shutdown_connections 2026-03-09T16:12:20.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20.452+0000 7fcba1931640 1 -- 192.168.123.103:0/3927386079 wait complete. 2026-03-09T16:12:20.501 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-09T16:12:20.501 INFO:tasks.cephfs.mount:Mounting Ceph FS. Following are details of mount; remember "None" represents Python type None - 2026-03-09T16:12:20.501 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm03.local 2026-03-09T16:12:20.501 INFO:tasks.cephfs.mount:self.client.name = client.0 2026-03-09T16:12:20.501 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-09T16:12:20.501 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-09T16:12:20.501 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-09T16:12:20.501 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-09T16:12:20.501 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.0' 2026-03-09T16:12:20.501 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:20.502 DEBUG:teuthology.orchestra.run.vm03:> ip addr 2026-03-09T16:12:20.562 INFO:teuthology.orchestra.run.vm03.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-09T16:12:20.562 INFO:teuthology.orchestra.run.vm03.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-09T16:12:20.562 INFO:teuthology.orchestra.run.vm03.stdout: inet 127.0.0.1/8 scope host lo 2026-03-09T16:12:20.562 INFO:teuthology.orchestra.run.vm03.stdout: valid_lft forever preferred_lft forever 2026-03-09T16:12:20.562 INFO:teuthology.orchestra.run.vm03.stdout: inet6 ::1/128 scope host 2026-03-09T16:12:20.562 INFO:teuthology.orchestra.run.vm03.stdout: valid_lft forever preferred_lft forever 2026-03-09T16:12:20.562 INFO:teuthology.orchestra.run.vm03.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-09T16:12:20.562 INFO:teuthology.orchestra.run.vm03.stdout: link/ether 52:55:00:00:00:03 brd ff:ff:ff:ff:ff:ff 2026-03-09T16:12:20.562 INFO:teuthology.orchestra.run.vm03.stdout: altname enp0s3 2026-03-09T16:12:20.562 INFO:teuthology.orchestra.run.vm03.stdout: altname ens3 2026-03-09T16:12:20.562 INFO:teuthology.orchestra.run.vm03.stdout: inet 192.168.123.103/24 brd 192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-09T16:12:20.562 INFO:teuthology.orchestra.run.vm03.stdout: valid_lft 3100sec preferred_lft 3100sec 2026-03-09T16:12:20.562 INFO:teuthology.orchestra.run.vm03.stdout: inet6 fe80::5055:ff:fe00:3/64 scope link noprefixroute 2026-03-09T16:12:20.562 INFO:teuthology.orchestra.run.vm03.stdout: valid_lft forever preferred_lft forever 2026-03-09T16:12:20.563 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-09T16:12:20.563 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T16:12:20.563 
DEBUG:teuthology.orchestra.run.vm03:> set -e 2026-03-09T16:12:20.563 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link add name ceph-brx type bridge 2026-03-09T16:12:20.563 DEBUG:teuthology.orchestra.run.vm03:> sudo ip addr flush dev ceph-brx 2026-03-09T16:12:20.563 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link set ceph-brx up 2026-03-09T16:12:20.563 DEBUG:teuthology.orchestra.run.vm03:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-09T16:12:20.563 DEBUG:teuthology.orchestra.run.vm03:> ') 2026-03-09T16:12:20.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T16:12:20.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:20 vm03 ceph-mon[51019]: mds.? [v2:192.168.123.105:6824/1621230713,v1:192.168.123.105:6825/1621230713] up:boot 2026-03-09T16:12:20.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:20 vm03 ceph-mon[51019]: mds.? [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] up:reconnect 2026-03-09T16:12:20.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:20 vm03 ceph-mon[51019]: fsmap cephfs:1/1 {0=cephfs.vm03.kygyjl=up:reconnect} 3 up:standby 2026-03-09T16:12:20.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:20 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.jgzfvu"}]: dispatch 2026-03-09T16:12:20.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:20 vm03 ceph-mon[51019]: pgmap v82: 65 pgs: 65 active+clean; 451 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 816 B/s rd, 2.2 KiB/s wr, 7 op/s 2026-03-09T16:12:20.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:20 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:20.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:20 vm03 ceph-mon[51019]: from='client.? 192.168.123.103:0/3927386079' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-09T16:12:20.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:20 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:20.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:20 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:20.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:20 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:20.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:20 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:12:20.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:20 vm03 ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:20.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
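The block above shows the FUSE mount task bootstrapping a `ceph-brx` bridge on vm03 (create the bridge, flush it, bring it up, give it 192.168.159.254/20) before it builds a per-mount network namespace. The following is a minimal Python sketch of that same sequence, assuming root and iproute2 on the host; the helper name is hypothetical and this is illustrative, not the teuthology implementation (which pipes the commands through `sudo ... bash -c 'set -e; ...'` under stdin-killer).

# Sketch of the bridge bootstrap shown above on vm03 (assumptions: root, iproute2).
import subprocess

BRX = "ceph-brx"
BRX_ADDR = "192.168.159.254/20"   # gateway address later used by every mount netns
BRX_BRD = "192.168.159.255"

def setup_bridge():
    cmds = [
        ["ip", "link", "add", "name", BRX, "type", "bridge"],
        ["ip", "addr", "flush", "dev", BRX],
        ["ip", "link", "set", BRX, "up"],
        ["ip", "addr", "add", BRX_ADDR, "brd", BRX_BRD, "dev", BRX],
    ]
    for cmd in cmds:
        # fail fast on the first non-zero exit, like the `set -e` in the log
        subprocess.run(["sudo"] + cmd, check=True)

if __name__ == "__main__":
    setup_bridge()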
2026-03-09T16:12:20.736 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:20.736 DEBUG:teuthology.orchestra.run.vm03:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-09T16:12:20.808 INFO:teuthology.orchestra.run.vm03.stdout:1 2026-03-09T16:12:20.810 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:20.810 DEBUG:teuthology.orchestra.run.vm03:> ip r 2026-03-09T16:12:20.866 INFO:teuthology.orchestra.run.vm03.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.103 metric 100 2026-03-09T16:12:20.866 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.103 metric 100 2026-03-09T16:12:20.866 INFO:teuthology.orchestra.run.vm03.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-09T16:12:20.866 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T16:12:20.866 DEBUG:teuthology.orchestra.run.vm03:> set -e 2026-03-09T16:12:20.867 DEBUG:teuthology.orchestra.run.vm03:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-09T16:12:20.867 DEBUG:teuthology.orchestra.run.vm03:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-09T16:12:20.867 DEBUG:teuthology.orchestra.run.vm03:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-09T16:12:20.867 DEBUG:teuthology.orchestra.run.vm03:> ') 2026-03-09T16:12:20.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:20 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T16:12:21.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:21 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T16:12:21.017 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:21.018 DEBUG:teuthology.orchestra.run.vm03:> ip netns list 2026-03-09T16:12:21.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:20 vm05 ceph-mon[58702]: mds.? [v2:192.168.123.105:6824/1621230713,v1:192.168.123.105:6825/1621230713] up:boot 2026-03-09T16:12:21.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:20 vm05 ceph-mon[58702]: mds.? [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] up:reconnect 2026-03-09T16:12:21.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:20 vm05 ceph-mon[58702]: fsmap cephfs:1/1 {0=cephfs.vm03.kygyjl=up:reconnect} 3 up:standby 2026-03-09T16:12:21.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:20 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.jgzfvu"}]: dispatch 2026-03-09T16:12:21.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:20 vm05 ceph-mon[58702]: pgmap v82: 65 pgs: 65 active+clean; 451 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 816 B/s rd, 2.2 KiB/s wr, 7 op/s 2026-03-09T16:12:21.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:20 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:21.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:20 vm05 ceph-mon[58702]: from='client.? 
192.168.123.103:0/3927386079' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-09T16:12:21.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:20 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:21.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:20 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:21.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:20 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:12:21.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:20 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:12:21.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:20 vm05 ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:21.074 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:21.074 DEBUG:teuthology.orchestra.run.vm03:> ip netns list-id 2026-03-09T16:12:21.128 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T16:12:21.128 DEBUG:teuthology.orchestra.run.vm03:> set -e 2026-03-09T16:12:21.128 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T16:12:21.128 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.0 0 2026-03-09T16:12:21.128 DEBUG:teuthology.orchestra.run.vm03:> ') 2026-03-09T16:12:21.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:21 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T16:12:21.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:21 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T16:12:21.228 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.0' with 192.168.144.1/20 2026-03-09T16:12:21.228 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T16:12:21.228 DEBUG:teuthology.orchestra.run.vm03:> set -e 2026-03-09T16:12:21.228 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.0 type veth peer name brx.0 2026-03-09T16:12:21.228 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-09T16:12:21.228 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set veth0 up 2026-03-09T16:12:21.228 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set lo up 2026-03-09T16:12:21.228 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip route add default via 192.168.159.254 2026-03-09T16:12:21.228 DEBUG:teuthology.orchestra.run.vm03:> ') 2026-03-09T16:12:21.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:21 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T16:12:21.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:21 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
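Between the bridge creation and the mount itself, the log shows the task enabling IP forwarding, adding FORWARD/MASQUERADE iptables rules for ceph-brx, creating the per-mount namespace `ceph-ns--home-ubuntu-cephtest-mnt.0`, and wiring a veth pair: `veth0` inside the netns at 192.168.144.1/20 with a default route via the bridge, and `brx.0` on the host side (enslaved to ceph-brx just below this point in the log). A hedged Python sketch of that plumbing follows; names and addresses mirror this run, `sysctl -w` stands in for the `echo 1 | sudo tee` the log uses, and the function itself is illustrative rather than the teuthology code.

# Sketch of the per-mount netns plumbing for vm03, mount.0 (assumptions: root,
# iproute2 and iptables present; illustrative helper, not teuthology itself).
import subprocess

def sh(*cmd):
    subprocess.run(["sudo"] + list(cmd), check=True)

def setup_mount_netns(netns="ceph-ns--home-ubuntu-cephtest-mnt.0",
                      addr="192.168.144.1/20",
                      gw="192.168.159.254"):
    # allow traffic from the bridge out through eth0 and masquerade it
    sh("sysctl", "-w", "net.ipv4.ip_forward=1")
    sh("iptables", "-A", "FORWARD", "-o", "eth0", "-i", "ceph-brx", "-j", "ACCEPT")
    sh("iptables", "-A", "FORWARD", "-i", "eth0", "-o", "ceph-brx", "-j", "ACCEPT")
    sh("iptables", "-t", "nat", "-A", "POSTROUTING",
       "-s", "192.168.159.254/20", "-o", "eth0", "-j", "MASQUERADE")
    # namespace plus a veth pair: veth0 lives in the netns, brx.0 joins the bridge
    sh("ip", "netns", "add", netns)
    sh("ip", "netns", "set", netns, "0")
    sh("ip", "link", "add", "veth0", "netns", netns, "type", "veth",
       "peer", "name", "brx.0")
    sh("ip", "netns", "exec", netns, "ip", "addr", "add", addr,
       "brd", "192.168.159.255", "dev", "veth0")
    sh("ip", "netns", "exec", netns, "ip", "link", "set", "veth0", "up")
    sh("ip", "netns", "exec", netns, "ip", "link", "set", "lo", "up")
    sh("ip", "netns", "exec", netns, "ip", "route", "add", "default", "via", gw)
    sh("ip", "link", "set", "brx.0", "up")
    sh("ip", "link", "set", "dev", "brx.0", "master", "ceph-brx")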
2026-03-09T16:12:21.373 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T16:12:21.373 DEBUG:teuthology.orchestra.run.vm03:> set -e 2026-03-09T16:12:21.373 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link set brx.0 up 2026-03-09T16:12:21.374 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link set dev brx.0 master ceph-brx 2026-03-09T16:12:21.374 DEBUG:teuthology.orchestra.run.vm03:> ') 2026-03-09T16:12:21.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:21 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T16:12:21.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:21 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T16:12:21.479 INFO:tasks.cephfs.fuse_mount:Client client.0 config is {} 2026-03-09T16:12:21.479 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-09T16:12:21.479 DEBUG:teuthology.orchestra.run.vm03:> mkdir -p -v /home/ubuntu/cephtest/mnt.0 2026-03-09T16:12:21.534 INFO:teuthology.orchestra.run.vm03.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.0' 2026-03-09T16:12:21.534 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-09T16:12:21.534 DEBUG:teuthology.orchestra.run.vm03:> chmod 0000 /home/ubuntu/cephtest/mnt.0 2026-03-09T16:12:21.595 DEBUG:teuthology.orchestra.run.vm03:> sudo modprobe fuse 2026-03-09T16:12:21.661 DEBUG:teuthology.orchestra.run.vm03:> cat /proc/self/mounts | awk '{print $2}' 2026-03-09T16:12:21.718 INFO:teuthology.orchestra.run.vm03.stdout:/proc 2026-03-09T16:12:21.718 INFO:teuthology.orchestra.run.vm03.stdout:/sys 2026-03-09T16:12:21.718 INFO:teuthology.orchestra.run.vm03.stdout:/dev 2026-03-09T16:12:21.718 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/security 2026-03-09T16:12:21.718 INFO:teuthology.orchestra.run.vm03.stdout:/dev/shm 2026-03-09T16:12:21.718 INFO:teuthology.orchestra.run.vm03.stdout:/dev/pts 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/run 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/cgroup 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/pstore 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/bpf 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/config 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/ 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/nfs/rpc_pipefs 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/selinux 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/dev/mqueue 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/dev/hugepages 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/debug 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/tracing 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-sysctl.service 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/fuse/connections 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-sysusers.service 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-09T16:12:21.719 
INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/run/user/1000 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/829c97f49c3f8b716655a9b5c8e6cc0d05afa2314329781402510a200ffab14b/merged 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/3765126a3e9da3bb3c693c4db737f2dc7fb734fae337796ebb82545ab0e93e27/merged 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/run/user/0 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/551c725788ba5095bac3a17e474876ad9e35b5e766a36869d78a0da1d8be8986/merged 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/fede52af5f080efff8c68c3c634e65e3daa813ffc876ee7b8cd28a6ede659c3b/merged 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/416206c19b42d07f6f6b8d15b96876795549e00a34c9ed60a1ff8a3f1bb1f1af/merged 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/46a59001d7bfbb248028a5e46ed6a002bcec3843cc475180a1c0c4f313ea6242/merged 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/15b28911e3cad691df069c3b330c072694a284de56340ada20c551dc4b698988/merged 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/2da9843106e21292de6cacc24ac08e515a88d47747bbb58c6c7e5587b4490863/merged 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/4f8b748b7d8f68400f9c34ba05bd36914ae4eaa91921a8967a58f0d0b52bae93/merged 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/9ea3316c45ab6b9e9be0b105d252880c34464ed681a635d5f9d75cbb8dc6e75a/merged 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/284856404a2c2910ca7b0da802b59d9f3df09f5b523acd208a45fdbaed161aec/merged 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/0d2857638bc56eb11af822f1358dfa5bd71b316ab767f541ebbb27a0619c5764/merged 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/2f2885408f3789f78dd96e200d03e0df0d745c987604afe078e957b2f58d019b/merged 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/run/netns 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T16:12:21.719 INFO:teuthology.orchestra.run.vm03.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T16:12:21.720 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:21.720 DEBUG:teuthology.orchestra.run.vm03:> ls /sys/fs/fuse/connections 2026-03-09T16:12:21.776 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: [] 2026-03-09T16:12:21.776 DEBUG:teuthology.orchestra.run.vm03:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket 
'/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.0 --id 0) 2026-03-09T16:12:21.818 DEBUG:teuthology.orchestra.run.vm03:> sudo modprobe fuse 2026-03-09T16:12:21.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:21 vm03.local ceph-mon[51019]: mds.? [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] up:rejoin 2026-03-09T16:12:21.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:21 vm03.local ceph-mon[51019]: mds.? [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] up:standby 2026-03-09T16:12:21.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:21 vm03.local ceph-mon[51019]: mds.? [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] up:standby 2026-03-09T16:12:21.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:21 vm03.local ceph-mon[51019]: fsmap cephfs:1/1 {0=cephfs.vm03.kygyjl=up:rejoin} 3 up:standby 2026-03-09T16:12:21.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:21 vm03.local ceph-mon[51019]: daemon mds.cephfs.vm03.kygyjl is now active in filesystem cephfs as rank 0 2026-03-09T16:12:21.847 DEBUG:teuthology.orchestra.run.vm03:> cat /proc/self/mounts | awk '{print $2}' 2026-03-09T16:12:21.892 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm03.stderr:2026-03-09T16:12:21.891+0000 7f5fed9cb580 -1 init, newargv = 0x55c27dea7150 newargc=15 2026-03-09T16:12:21.893 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm03.stderr:ceph-fuse[98426]: starting ceph client 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/proc 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/sys 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/dev 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/security 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/dev/shm 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/dev/pts 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/run 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/cgroup 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/pstore 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/bpf 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/config 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/ 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/nfs/rpc_pipefs 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/selinux 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/dev/mqueue 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/dev/hugepages 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/debug 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/tracing 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-sysctl.service 2026-03-09T16:12:21.911 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/fuse/connections 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-sysusers.service 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-tmpfiles-setup.service 
2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/run/user/1000 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/829c97f49c3f8b716655a9b5c8e6cc0d05afa2314329781402510a200ffab14b/merged 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/3765126a3e9da3bb3c693c4db737f2dc7fb734fae337796ebb82545ab0e93e27/merged 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/run/user/0 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/551c725788ba5095bac3a17e474876ad9e35b5e766a36869d78a0da1d8be8986/merged 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/fede52af5f080efff8c68c3c634e65e3daa813ffc876ee7b8cd28a6ede659c3b/merged 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/416206c19b42d07f6f6b8d15b96876795549e00a34c9ed60a1ff8a3f1bb1f1af/merged 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/46a59001d7bfbb248028a5e46ed6a002bcec3843cc475180a1c0c4f313ea6242/merged 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/15b28911e3cad691df069c3b330c072694a284de56340ada20c551dc4b698988/merged 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/2da9843106e21292de6cacc24ac08e515a88d47747bbb58c6c7e5587b4490863/merged 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/4f8b748b7d8f68400f9c34ba05bd36914ae4eaa91921a8967a58f0d0b52bae93/merged 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/9ea3316c45ab6b9e9be0b105d252880c34464ed681a635d5f9d75cbb8dc6e75a/merged 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/284856404a2c2910ca7b0da802b59d9f3df09f5b523acd208a45fdbaed161aec/merged 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/0d2857638bc56eb11af822f1358dfa5bd71b316ab767f541ebbb27a0619c5764/merged 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/2f2885408f3789f78dd96e200d03e0df0d745c987604afe078e957b2f58d019b/merged 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/run/netns 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run.vm03.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T16:12:21.912 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:21.912 DEBUG:teuthology.orchestra.run.vm03:> ls /sys/fs/fuse/connections 2026-03-09T16:12:21.926 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm03.stderr:ceph-fuse[98426]: starting fuse 2026-03-09T16:12:21.944 INFO:teuthology.orchestra.run.vm03.stdout:77 2026-03-09T16:12:21.944 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [77] 2026-03-09T16:12:21.944 DEBUG:teuthology.orchestra.run.vm03:> sudo stdin-killer -- python3 -c ' 2026-03-09T16:12:21.944 DEBUG:teuthology.orchestra.run.vm03:> import glob 2026-03-09T16:12:21.944 
DEBUG:teuthology.orchestra.run.vm03:> import re 2026-03-09T16:12:21.944 DEBUG:teuthology.orchestra.run.vm03:> import os 2026-03-09T16:12:21.944 DEBUG:teuthology.orchestra.run.vm03:> import subprocess 2026-03-09T16:12:21.944 DEBUG:teuthology.orchestra.run.vm03:> 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> def _find_admin_socket(client_name): 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> asok_path = "/var/run/ceph/ceph-client.0.*.asok" 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> files = glob.glob(asok_path) 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> mountpoint = "/home/ubuntu/cephtest/mnt.0" 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> # Given a non-glob path, it better be there 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> if "*" not in asok_path: 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> assert(len(files) == 1) 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> return files[0] 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> for f in files: 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> pid = re.match(".*\.(\d+)\.asok$", f).group(1) 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> if os.path.exists("/proc/{0}".format(pid)): 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f: 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> contents = proc_f.read() 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> if mountpoint in contents: 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> return f 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> raise RuntimeError("Client socket {0} not found".format(client_name)) 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> print(_find_admin_socket("client.0")) 2026-03-09T16:12:21.945 DEBUG:teuthology.orchestra.run.vm03:> ' 2026-03-09T16:12:22.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:21 vm05 ceph-mon[58702]: mds.? [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] up:rejoin 2026-03-09T16:12:22.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:21 vm05 ceph-mon[58702]: mds.? [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] up:standby 2026-03-09T16:12:22.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:21 vm05 ceph-mon[58702]: mds.? [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] up:standby 2026-03-09T16:12:22.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:21 vm05 ceph-mon[58702]: fsmap cephfs:1/1 {0=cephfs.vm03.kygyjl=up:rejoin} 3 up:standby 2026-03-09T16:12:22.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:21 vm05 ceph-mon[58702]: daemon mds.cephfs.vm03.kygyjl is now active in filesystem cephfs as rank 0 2026-03-09T16:12:22.042 INFO:teuthology.orchestra.run.vm03.stdout:/var/run/ceph/ceph-client.0.98426.asok 2026-03-09T16:12:22.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
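The `python3 -c` snippet the log just ran locates the live ceph-fuse admin socket: it globs /var/run/ceph/ceph-client.0.*.asok and keeps the socket whose owning PID is alive and whose /proc/<pid>/cmdline mentions the mountpoint. For readability, here is the same logic restated as a standalone script (paths are the ones from this run; behaviour matches the inline snippet above).

# Standalone restatement of the admin-socket lookup the log runs via
# `sudo stdin-killer -- python3 -c ...` (same logic, paths from this run).
import glob
import os
import re

def find_admin_socket(asok_glob="/var/run/ceph/ceph-client.0.*.asok",
                      mountpoint="/home/ubuntu/cephtest/mnt.0"):
    files = glob.glob(asok_glob)
    # a non-glob path must resolve to exactly one socket
    if "*" not in asok_glob:
        assert len(files) == 1
        return files[0]
    for f in files:
        m = re.match(r".*\.(\d+)\.asok$", f)
        if m is None:
            continue
        pid = m.group(1)
        # the socket is live if its PID still exists and that process was
        # started against our mountpoint
        if os.path.exists(f"/proc/{pid}"):
            with open(f"/proc/{pid}/cmdline") as proc_f:
                if mountpoint in proc_f.read():
                    return f
    raise RuntimeError(f"Client socket for {mountpoint} not found")

if __name__ == "__main__":
    print(find_admin_socket())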
2026-03-09T16:12:22.050 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.0.98426.asok 2026-03-09T16:12:22.050 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:22.050 DEBUG:teuthology.orchestra.run.vm03:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.0.98426.asok status 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "metadata": { 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "ceph_sha1": "ab47f43c099b2cbae6e21342fe673ce251da54d6", 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "ceph_version": "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)", 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "entity_id": "0", 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "hostname": "vm03.local", 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "mount_point": "/home/ubuntu/cephtest/mnt.0", 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "pid": "98426", 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "root": "/" 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "dentry_count": 0, 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "dentry_pinned_count": 0, 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "id": 14522, 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "inst": { 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "name": { 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "type": "client", 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "num": 14522 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "addr": { 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "type": "v1", 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "addr": "192.168.144.1:0", 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "nonce": 2261305767 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "addr": { 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "type": "v1", 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "addr": "192.168.144.1:0", 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: "nonce": 2261305767 2026-03-09T16:12:22.161 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:12:22.162 INFO:teuthology.orchestra.run.vm03.stdout: "inst_str": "client.14522 192.168.144.1:0/2261305767", 2026-03-09T16:12:22.162 INFO:teuthology.orchestra.run.vm03.stdout: "addr_str": "192.168.144.1:0/2261305767", 2026-03-09T16:12:22.162 INFO:teuthology.orchestra.run.vm03.stdout: "inode_count": 1, 2026-03-09T16:12:22.162 INFO:teuthology.orchestra.run.vm03.stdout: "mds_epoch": 12, 2026-03-09T16:12:22.162 INFO:teuthology.orchestra.run.vm03.stdout: "osd_epoch": 41, 2026-03-09T16:12:22.162 INFO:teuthology.orchestra.run.vm03.stdout: "osd_epoch_barrier": 0, 2026-03-09T16:12:22.162 INFO:teuthology.orchestra.run.vm03.stdout: "blocklisted": false, 2026-03-09T16:12:22.162 
INFO:teuthology.orchestra.run.vm03.stdout: "fs_name": "cephfs" 2026-03-09T16:12:22.162 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:12:22.169 DEBUG:tasks.ceph_fuse:passing mntargs=[] 2026-03-09T16:12:22.169 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs ls 2026-03-09T16:12:22.329 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:22.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.591+0000 7fb213c8b640 1 -- 192.168.123.103:0/1869338607 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb20c073510 msgr2=0x7fb20c0738f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:22.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.591+0000 7fb213c8b640 1 --2- 192.168.123.103:0/1869338607 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb20c073510 0x7fb20c0738f0 secure :-1 s=READY pgs=262 cs=0 l=1 rev1=1 crypto rx=0x7fb1f40099b0 tx=0x7fb1f402f2b0 comp rx=0 tx=0).stop 2026-03-09T16:12:22.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.592+0000 7fb213c8b640 1 -- 192.168.123.103:0/1869338607 shutdown_connections 2026-03-09T16:12:22.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.592+0000 7fb213c8b640 1 --2- 192.168.123.103:0/1869338607 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb20c073e30 0x7fb20c10cb80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:22.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.592+0000 7fb213c8b640 1 --2- 192.168.123.103:0/1869338607 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb20c073510 0x7fb20c0738f0 unknown :-1 s=CLOSED pgs=262 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:22.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.592+0000 7fb213c8b640 1 -- 192.168.123.103:0/1869338607 >> 192.168.123.103:0/1869338607 conn(0x7fb20c0fc460 msgr2=0x7fb20c0fe880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:22.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.592+0000 7fb213c8b640 1 -- 192.168.123.103:0/1869338607 shutdown_connections 2026-03-09T16:12:22.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.592+0000 7fb213c8b640 1 -- 192.168.123.103:0/1869338607 wait complete. 
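With the socket path known (here /var/run/ceph/ceph-client.0.98426.asok), the log queries it with `sudo ceph --admin-daemon <asok> status` and then double-checks the filesystem from inside the container with `cephadm ... shell -- ceph fs ls`. The sketch below only covers the admin-socket query; the command and JSON field names are the ones visible above, while the field selection and error handling are my own illustrative choices, not part of the test harness.

# Sketch: read mount state back from the ceph-fuse admin socket, as the log
# does with `sudo ceph --admin-daemon <asok> status`. Field names match the
# JSON shown above; the parsing is illustrative only.
import json
import subprocess

def fuse_mount_status(asok="/var/run/ceph/ceph-client.0.98426.asok"):
    out = subprocess.run(
        ["sudo", "ceph", "--admin-daemon", asok, "status"],
        check=True, capture_output=True, text=True,
    ).stdout
    status = json.loads(out)
    return {
        "fs_name": status.get("fs_name"),
        "mount_point": status["metadata"]["mount_point"],
        "blocklisted": status.get("blocklisted"),
        "inst": status.get("inst_str"),
    }

if __name__ == "__main__":
    print(fuse_mount_status())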
2026-03-09T16:12:22.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.592+0000 7fb213c8b640 1 Processor -- start 2026-03-09T16:12:22.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.592+0000 7fb213c8b640 1 -- start start 2026-03-09T16:12:22.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.593+0000 7fb213c8b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb20c073510 0x7fb20c1006e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:22.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.593+0000 7fb213c8b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb20c073e30 0x7fb20c100c20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:22.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.593+0000 7fb213c8b640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb20c104760 con 0x7fb20c073e30 2026-03-09T16:12:22.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.593+0000 7fb213c8b640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb20c1048d0 con 0x7fb20c073510 2026-03-09T16:12:22.594 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.593+0000 7fb211a00640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb20c073510 0x7fb20c1006e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:22.594 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.593+0000 7fb211a00640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb20c073510 0x7fb20c1006e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:58704/0 (socket says 192.168.123.103:58704) 2026-03-09T16:12:22.594 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.593+0000 7fb211a00640 1 -- 192.168.123.103:0/3507703434 learned_addr learned my addr 192.168.123.103:0/3507703434 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:22.594 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.593+0000 7fb2111ff640 1 --2- 192.168.123.103:0/3507703434 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb20c073e30 0x7fb20c100c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:22.594 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.593+0000 7fb211a00640 1 -- 192.168.123.103:0/3507703434 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb20c073e30 msgr2=0x7fb20c100c20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:22.594 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.593+0000 7fb211a00640 1 --2- 192.168.123.103:0/3507703434 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb20c073e30 0x7fb20c100c20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:22.594 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.593+0000 7fb211a00640 1 -- 192.168.123.103:0/3507703434 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fb1f4009660 con 0x7fb20c073510 2026-03-09T16:12:22.594 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.594+0000 7fb211a00640 1 --2- 192.168.123.103:0/3507703434 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb20c073510 0x7fb20c1006e0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fb1f4002410 tx=0x7fb1f4004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:22.594 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.594+0000 7fb2111ff640 1 --2- 192.168.123.103:0/3507703434 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb20c073e30 0x7fb20c100c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T16:12:22.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.594+0000 7fb202ffd640 1 -- 192.168.123.103:0/3507703434 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb1f403d070 con 0x7fb20c073510 2026-03-09T16:12:22.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.594+0000 7fb202ffd640 1 -- 192.168.123.103:0/3507703434 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb1f4031ea0 con 0x7fb20c073510 2026-03-09T16:12:22.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.594+0000 7fb213c8b640 1 -- 192.168.123.103:0/3507703434 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb20c1011c0 con 0x7fb20c073510 2026-03-09T16:12:22.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.594+0000 7fb213c8b640 1 -- 192.168.123.103:0/3507703434 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb20c1a90d0 con 0x7fb20c073510 2026-03-09T16:12:22.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.595+0000 7fb202ffd640 1 -- 192.168.123.103:0/3507703434 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb1f4038680 con 0x7fb20c073510 2026-03-09T16:12:22.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.596+0000 7fb213c8b640 1 -- 192.168.123.103:0/3507703434 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb20c074c50 con 0x7fb20c073510 2026-03-09T16:12:22.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.596+0000 7fb202ffd640 1 -- 192.168.123.103:0/3507703434 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fb1f4049050 con 0x7fb20c073510 2026-03-09T16:12:22.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.596+0000 7fb202ffd640 1 --2- 192.168.123.103:0/3507703434 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fb1d8076000 0x7fb1d80784c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:22.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.597+0000 7fb2111ff640 1 --2- 192.168.123.103:0/3507703434 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fb1d8076000 0x7fb1d80784c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:22.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.597+0000 7fb2111ff640 1 --2- 
192.168.123.103:0/3507703434 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fb1d8076000 0x7fb1d80784c0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fb20c101e60 tx=0x7fb1fc00a680 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:22.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.597+0000 7fb202ffd640 1 -- 192.168.123.103:0/3507703434 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fb1f40bc390 con 0x7fb20c073510 2026-03-09T16:12:22.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.599+0000 7fb202ffd640 1 -- 192.168.123.103:0/3507703434 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fb1f4085950 con 0x7fb20c073510 2026-03-09T16:12:22.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.715+0000 7fb213c8b640 1 -- 192.168.123.103:0/3507703434 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7fb20c0738f0 con 0x7fb20c073510 2026-03-09T16:12:22.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.716+0000 7fb202ffd640 1 -- 192.168.123.103:0/3507703434 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v12) v1 ==== 53+0+83 (secure 0 0 0) 0x7fb1f40852f0 con 0x7fb20c073510 2026-03-09T16:12:22.717 INFO:teuthology.orchestra.run.vm03.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-09T16:12:22.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.718+0000 7fb213c8b640 1 -- 192.168.123.103:0/3507703434 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fb1d8076000 msgr2=0x7fb1d80784c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:22.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.718+0000 7fb213c8b640 1 --2- 192.168.123.103:0/3507703434 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fb1d8076000 0x7fb1d80784c0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fb20c101e60 tx=0x7fb1fc00a680 comp rx=0 tx=0).stop 2026-03-09T16:12:22.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.718+0000 7fb213c8b640 1 -- 192.168.123.103:0/3507703434 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb20c073510 msgr2=0x7fb20c1006e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:22.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.718+0000 7fb213c8b640 1 --2- 192.168.123.103:0/3507703434 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb20c073510 0x7fb20c1006e0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fb1f4002410 tx=0x7fb1f4004290 comp rx=0 tx=0).stop 2026-03-09T16:12:22.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.718+0000 7fb213c8b640 1 -- 192.168.123.103:0/3507703434 shutdown_connections 2026-03-09T16:12:22.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.718+0000 7fb213c8b640 1 --2- 192.168.123.103:0/3507703434 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fb1d8076000 0x7fb1d80784c0 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:22.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.718+0000 
7fb213c8b640 1 --2- 192.168.123.103:0/3507703434 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb20c073e30 0x7fb20c100c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:22.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.718+0000 7fb213c8b640 1 --2- 192.168.123.103:0/3507703434 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb20c073510 0x7fb20c1006e0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:22.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.718+0000 7fb213c8b640 1 -- 192.168.123.103:0/3507703434 >> 192.168.123.103:0/3507703434 conn(0x7fb20c0fc460 msgr2=0x7fb20c10c1a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:22.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.719+0000 7fb213c8b640 1 -- 192.168.123.103:0/3507703434 shutdown_connections 2026-03-09T16:12:22.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:22.719+0000 7fb213c8b640 1 -- 192.168.123.103:0/3507703434 wait complete. 2026-03-09T16:12:22.778 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-09T16:12:22.778 INFO:tasks.cephfs.mount:Mounting Ceph FS. Following are details of mount; remember "None" represents Python type None - 2026-03-09T16:12:22.778 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm05.local 2026-03-09T16:12:22.778 INFO:tasks.cephfs.mount:self.client.name = client.1 2026-03-09T16:12:22.779 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-09T16:12:22.779 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-09T16:12:22.779 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-09T16:12:22.779 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-09T16:12:22.779 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.1' 2026-03-09T16:12:22.779 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:22.779 DEBUG:teuthology.orchestra.run.vm05:> ip addr 2026-03-09T16:12:22.794 INFO:teuthology.orchestra.run.vm05.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-09T16:12:22.794 INFO:teuthology.orchestra.run.vm05.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-09T16:12:22.794 INFO:teuthology.orchestra.run.vm05.stdout: inet 127.0.0.1/8 scope host lo 2026-03-09T16:12:22.794 INFO:teuthology.orchestra.run.vm05.stdout: valid_lft forever preferred_lft forever 2026-03-09T16:12:22.794 INFO:teuthology.orchestra.run.vm05.stdout: inet6 ::1/128 scope host 2026-03-09T16:12:22.795 INFO:teuthology.orchestra.run.vm05.stdout: valid_lft forever preferred_lft forever 2026-03-09T16:12:22.795 INFO:teuthology.orchestra.run.vm05.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-09T16:12:22.795 INFO:teuthology.orchestra.run.vm05.stdout: link/ether 52:55:00:00:00:05 brd ff:ff:ff:ff:ff:ff 2026-03-09T16:12:22.795 INFO:teuthology.orchestra.run.vm05.stdout: altname enp0s3 2026-03-09T16:12:22.795 INFO:teuthology.orchestra.run.vm05.stdout: altname ens3 2026-03-09T16:12:22.795 INFO:teuthology.orchestra.run.vm05.stdout: inet 192.168.123.105/24 brd 192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-09T16:12:22.795 INFO:teuthology.orchestra.run.vm05.stdout: valid_lft 3067sec preferred_lft 3067sec 2026-03-09T16:12:22.795 INFO:teuthology.orchestra.run.vm05.stdout: inet6 fe80::5055:ff:fe00:5/64 scope 
link noprefixroute 2026-03-09T16:12:22.795 INFO:teuthology.orchestra.run.vm05.stdout: valid_lft forever preferred_lft forever 2026-03-09T16:12:22.795 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-09T16:12:22.795 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T16:12:22.795 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-09T16:12:22.795 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link add name ceph-brx type bridge 2026-03-09T16:12:22.795 DEBUG:teuthology.orchestra.run.vm05:> sudo ip addr flush dev ceph-brx 2026-03-09T16:12:22.795 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link set ceph-brx up 2026-03-09T16:12:22.795 DEBUG:teuthology.orchestra.run.vm05:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-09T16:12:22.795 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-09T16:12:22.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:12:22 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T16:12:22.920 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:22 vm05 ceph-mon[58702]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T16:12:22.920 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:22 vm05 ceph-mon[58702]: mds.? [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] up:active 2026-03-09T16:12:22.920 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:22 vm05 ceph-mon[58702]: fsmap cephfs:1 {0=cephfs.vm03.kygyjl=up:active} 3 up:standby 2026-03-09T16:12:22.920 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:22 vm05 ceph-mon[58702]: pgmap v83: 65 pgs: 65 active+clean; 451 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 895 B/s rd, 1.7 KiB/s wr, 6 op/s 2026-03-09T16:12:22.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:12:22 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-09T16:12:22.958 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:22.958 DEBUG:teuthology.orchestra.run.vm05:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-09T16:12:23.033 INFO:teuthology.orchestra.run.vm05.stdout:1 2026-03-09T16:12:23.034 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:23.034 DEBUG:teuthology.orchestra.run.vm05:> ip r 2026-03-09T16:12:23.088 INFO:teuthology.orchestra.run.vm05.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.105 metric 100 2026-03-09T16:12:23.088 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.105 metric 100 2026-03-09T16:12:23.088 INFO:teuthology.orchestra.run.vm05.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-09T16:12:23.088 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T16:12:23.089 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-09T16:12:23.089 DEBUG:teuthology.orchestra.run.vm05:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-09T16:12:23.089 DEBUG:teuthology.orchestra.run.vm05:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-09T16:12:23.089 DEBUG:teuthology.orchestra.run.vm05:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-09T16:12:23.089 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-09T16:12:23.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:22 vm03.local ceph-mon[51019]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T16:12:23.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:22 vm03.local ceph-mon[51019]: mds.? [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] up:active 2026-03-09T16:12:23.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:22 vm03.local ceph-mon[51019]: fsmap cephfs:1 {0=cephfs.vm03.kygyjl=up:active} 3 up:standby 2026-03-09T16:12:23.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:22 vm03.local ceph-mon[51019]: pgmap v83: 65 pgs: 65 active+clean; 451 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 895 B/s rd, 1.7 KiB/s wr, 6 op/s 2026-03-09T16:12:23.165 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:12:23 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T16:12:23.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:12:23 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-09T16:12:23.230 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:23.230 DEBUG:teuthology.orchestra.run.vm05:> ip netns list 2026-03-09T16:12:23.286 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:23.286 DEBUG:teuthology.orchestra.run.vm05:> ip netns list-id 2026-03-09T16:12:23.342 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T16:12:23.342 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-09T16:12:23.342 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-09T16:12:23.342 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.1 0 2026-03-09T16:12:23.342 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-09T16:12:23.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:12:23 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T16:12:23.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:12:23 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T16:12:23.474 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.1' with 192.168.144.1/20 2026-03-09T16:12:23.474 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T16:12:23.474 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-09T16:12:23.474 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.1 type veth peer name brx.0 2026-03-09T16:12:23.474 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-09T16:12:23.474 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set veth0 up 2026-03-09T16:12:23.474 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set lo up 2026-03-09T16:12:23.474 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip route add default via 192.168.159.254 2026-03-09T16:12:23.474 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-09T16:12:23.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:12:23 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T16:12:23.623 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:12:23 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T16:12:23.626 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T16:12:23.626 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-09T16:12:23.626 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link set brx.0 up 2026-03-09T16:12:23.626 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link set dev brx.0 master ceph-brx 2026-03-09T16:12:23.626 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-09T16:12:23.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:12:23 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T16:12:23.725 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:23 vm05.local ceph-mon[58702]: from='client.? 
192.168.123.103:0/3507703434' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-09T16:12:23.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:12:23 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T16:12:23.752 INFO:tasks.cephfs.fuse_mount:Client client.1 config is {} 2026-03-09T16:12:23.752 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-09T16:12:23.752 DEBUG:teuthology.orchestra.run.vm05:> mkdir -p -v /home/ubuntu/cephtest/mnt.1 2026-03-09T16:12:23.769 INFO:teuthology.orchestra.run.vm05.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.1' 2026-03-09T16:12:23.769 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-09T16:12:23.769 DEBUG:teuthology.orchestra.run.vm05:> chmod 0000 /home/ubuntu/cephtest/mnt.1 2026-03-09T16:12:23.825 DEBUG:teuthology.orchestra.run.vm05:> sudo modprobe fuse 2026-03-09T16:12:23.891 DEBUG:teuthology.orchestra.run.vm05:> cat /proc/self/mounts | awk '{print $2}' 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/proc 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/sys 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/dev 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/security 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/dev/shm 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/dev/pts 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/run 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/cgroup 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/pstore 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/bpf 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/config 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/ 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/nfs/rpc_pipefs 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/selinux 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/dev/hugepages 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/debug 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/dev/mqueue 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/tracing 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-sysctl.service 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/fuse/connections 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-sysusers.service 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/run/user/1000 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/run/user/0 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay 2026-03-09T16:12:23.950 
INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/b38b4904d7e9039ccc34d301a4f97efc2e64114b419de903a6cc8f977e6bad0a/merged 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/cda2bfe61a110f3619f379aa8c810c792e7a34bcd8b252239ffb99f4335e577e/merged 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/61b748ccb15506f18fcfb383cacaa1bb243126cb40fca09c46bcce954476ea51/merged 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/427237548c9562dd0e835064532731d774f62234344939cf812b847261ddb293/merged 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/bd0872b59ebc6f356c58f233e86389c396c8328d1cdcd95aa6fec715ecba9773/merged 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/e9d606d3036af1155de018269844ad85b15e27e0e345e59e4db90453b2c417cc/merged 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/c6b1bab71e682956a1aed774294dbeaf57133f2d34e323b55b4e9c4e785eb7dc/merged 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/83d159bcd347ec6937cf34efe54b298fb85f1de590817f492b82142927fbef95/merged 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/83ceecf63c6bdbba48a5a6c02a92969ce481c09d0a52b13e119d333b988f07f5/merged 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/95982ef82d2ce378c6c1c55e5da3fc8c8f9e41b8ba0125a79d412929600b7876/merged 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns 2026-03-09T16:12:23.950 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-09T16:12:23.951 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-09T16:12:23.951 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:23.951 DEBUG:teuthology.orchestra.run.vm05:> ls /sys/fs/fuse/connections 2026-03-09T16:12:24.009 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: [] 2026-03-09T16:12:24.009 DEBUG:teuthology.orchestra.run.vm05:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.1 --id 1) 2026-03-09T16:12:24.051 DEBUG:teuthology.orchestra.run.vm05:> sudo modprobe fuse 2026-03-09T16:12:24.077 DEBUG:teuthology.orchestra.run.vm05:> cat /proc/self/mounts | awk '{print $2}' 2026-03-09T16:12:24.128 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm05.stderr:ceph-fuse[83944]: starting ceph client 2026-03-09T16:12:24.128 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm05.stderr:2026-03-09T16:12:24.127+0000 7f660ed6b580 -1 init, newargv = 0x5623dab6efe0 newargc=15 2026-03-09T16:12:24.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:23 vm03.local ceph-mon[51019]: from='client.? 
192.168.123.103:0/3507703434' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-09T16:12:24.146 INFO:teuthology.orchestra.run.vm05.stdout:/proc 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/sys 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/dev 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/security 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/dev/shm 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/dev/pts 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/run 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/cgroup 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/pstore 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/bpf 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/config 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/ 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/nfs/rpc_pipefs 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/selinux 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/dev/hugepages 2026-03-09T16:12:24.149 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/debug 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/dev/mqueue 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/tracing 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-sysctl.service 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/fuse/connections 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-sysusers.service 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/run/user/1000 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/run/user/0 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/b38b4904d7e9039ccc34d301a4f97efc2e64114b419de903a6cc8f977e6bad0a/merged 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/cda2bfe61a110f3619f379aa8c810c792e7a34bcd8b252239ffb99f4335e577e/merged 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/61b748ccb15506f18fcfb383cacaa1bb243126cb40fca09c46bcce954476ea51/merged 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/427237548c9562dd0e835064532731d774f62234344939cf812b847261ddb293/merged 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/bd0872b59ebc6f356c58f233e86389c396c8328d1cdcd95aa6fec715ecba9773/merged 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/e9d606d3036af1155de018269844ad85b15e27e0e345e59e4db90453b2c417cc/merged 2026-03-09T16:12:24.150 
INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/c6b1bab71e682956a1aed774294dbeaf57133f2d34e323b55b4e9c4e785eb7dc/merged 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/83d159bcd347ec6937cf34efe54b298fb85f1de590817f492b82142927fbef95/merged 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/83ceecf63c6bdbba48a5a6c02a92969ce481c09d0a52b13e119d333b988f07f5/merged 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/95982ef82d2ce378c6c1c55e5da3fc8c8f9e41b8ba0125a79d412929600b7876/merged 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-09T16:12:24.150 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:24.150 DEBUG:teuthology.orchestra.run.vm05:> ls /sys/fs/fuse/connections 2026-03-09T16:12:24.165 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm05.stderr:ceph-fuse[83944]: starting fuse 2026-03-09T16:12:24.186 INFO:teuthology.orchestra.run.vm05.stdout:90 2026-03-09T16:12:24.186 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [90] 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> sudo stdin-killer -- python3 -c ' 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> import glob 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> import re 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> import os 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> import subprocess 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> def _find_admin_socket(client_name): 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> asok_path = "/var/run/ceph/ceph-client.1.*.asok" 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> files = glob.glob(asok_path) 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> mountpoint = "/home/ubuntu/cephtest/mnt.1" 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> # Given a non-glob path, it better be there 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> if "*" not in asok_path: 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> assert(len(files) == 1) 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> return files[0] 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> for f in files: 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> pid = re.match(".*\.(\d+)\.asok$", f).group(1) 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> if os.path.exists("/proc/{0}".format(pid)): 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f: 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> contents = proc_f.read() 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> if mountpoint in contents: 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> return f 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> raise RuntimeError("Client socket {0} not 
found".format(client_name)) 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> print(_find_admin_socket("client.1")) 2026-03-09T16:12:24.186 DEBUG:teuthology.orchestra.run.vm05:> ' 2026-03-09T16:12:24.285 INFO:teuthology.orchestra.run.vm05.stdout:/var/run/ceph/ceph-client.1.83944.asok 2026-03-09T16:12:24.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-09T16:12:24 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T16:12:24.293 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.1.83944.asok 2026-03-09T16:12:24.294 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:24.294 DEBUG:teuthology.orchestra.run.vm05:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.1.83944.asok status 2026-03-09T16:12:24.403 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-09T16:12:24.403 INFO:teuthology.orchestra.run.vm05.stdout: "metadata": { 2026-03-09T16:12:24.403 INFO:teuthology.orchestra.run.vm05.stdout: "ceph_sha1": "ab47f43c099b2cbae6e21342fe673ce251da54d6", 2026-03-09T16:12:24.403 INFO:teuthology.orchestra.run.vm05.stdout: "ceph_version": "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)", 2026-03-09T16:12:24.403 INFO:teuthology.orchestra.run.vm05.stdout: "entity_id": "1", 2026-03-09T16:12:24.403 INFO:teuthology.orchestra.run.vm05.stdout: "hostname": "vm05.local", 2026-03-09T16:12:24.403 INFO:teuthology.orchestra.run.vm05.stdout: "mount_point": "/home/ubuntu/cephtest/mnt.1", 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "pid": "83944", 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "root": "/" 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "dentry_count": 0, 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "dentry_pinned_count": 0, 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "id": 24311, 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "inst": { 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "name": { 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "type": "client", 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "num": 24311 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "addr": { 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "type": "v1", 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "addr": "192.168.144.1:0", 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "nonce": 4162745798 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "addr": { 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "type": "v1", 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "addr": "192.168.144.1:0", 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "nonce": 4162745798 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "inst_str": "client.24311 192.168.144.1:0/4162745798", 2026-03-09T16:12:24.404 
INFO:teuthology.orchestra.run.vm05.stdout: "addr_str": "192.168.144.1:0/4162745798", 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "inode_count": 1, 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "mds_epoch": 12, 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "osd_epoch": 41, 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "osd_epoch_barrier": 0, 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "blocklisted": false, 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout: "fs_name": "cephfs" 2026-03-09T16:12:24.404 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-09T16:12:24.412 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:24.412 DEBUG:teuthology.orchestra.run.vm03:> stat --file-system '--printf=%T 2026-03-09T16:12:24.412 DEBUG:teuthology.orchestra.run.vm03:> ' -- /home/ubuntu/cephtest/mnt.0 2026-03-09T16:12:24.431 INFO:teuthology.orchestra.run.vm03.stdout:fuseblk 2026-03-09T16:12:24.431 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.0 2026-03-09T16:12:24.431 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:24.431 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.0 2026-03-09T16:12:24.503 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:24.503 DEBUG:teuthology.orchestra.run.vm05:> stat --file-system '--printf=%T 2026-03-09T16:12:24.503 DEBUG:teuthology.orchestra.run.vm05:> ' -- /home/ubuntu/cephtest/mnt.1 2026-03-09T16:12:24.521 INFO:teuthology.orchestra.run.vm05.stdout:fuseblk 2026-03-09T16:12:24.521 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.1 2026-03-09T16:12:24.521 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:12:24.522 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.1 2026-03-09T16:12:24.592 INFO:teuthology.run_tasks:Running task print... 2026-03-09T16:12:24.595 INFO:teuthology.task.print:**** done client 2026-03-09T16:12:24.595 INFO:teuthology.run_tasks:Running task parallel... 2026-03-09T16:12:24.598 INFO:teuthology.task.parallel:starting parallel... 2026-03-09T16:12:24.598 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-09T16:12:24.599 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-09T16:12:24.599 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T16:12:24.599 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force' 2026-03-09T16:12:24.599 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-09T16:12:24.599 INFO:teuthology.task.sequential:In sequential, running task workunit... 2026-03-09T16:12:24.601 INFO:tasks.workunit:Pulling workunits from ref 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-09T16:12:24.601 INFO:tasks.workunit:Making a separate scratch dir for every client... 
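Note: the fuse_mount task above finds each client's admin socket by globbing /var/run/ceph/ceph-client.<id>.*.asok, keeping the socket whose owning PID has the expected mountpoint on its command line, and then queries it with `ceph --admin-daemon <asok> status`. A minimal standalone sketch of that discover-and-query pattern, assuming it runs as root on the client host and that the client id "1" and mountpoint below are the ones from this job:

#!/usr/bin/env python3
# Sketch of the admin-socket discovery used by tasks.cephfs.fuse_mount:
# glob the candidate sockets, keep the one whose owning process is alive and
# was started for this mountpoint, then ask it for "status".
import glob
import json
import os
import re
import subprocess

CLIENT_ID = "1"                                # assumed client id
MOUNTPOINT = "/home/ubuntu/cephtest/mnt.1"     # assumed mountpoint

def find_admin_socket(client_id, mountpoint):
    pattern = f"/var/run/ceph/ceph-client.{client_id}.*.asok"
    for path in glob.glob(pattern):
        m = re.match(r".*\.(\d+)\.asok$", path)
        if not m:
            continue
        pid = m.group(1)
        cmdline = f"/proc/{pid}/cmdline"
        # Only accept sockets whose ceph-fuse process still exists and
        # mentions the expected mountpoint on its command line.
        if os.path.exists(cmdline):
            with open(cmdline) as f:
                if mountpoint in f.read():
                    return path
    raise RuntimeError(f"no admin socket found for client.{client_id}")

if __name__ == "__main__":
    asok = find_admin_socket(CLIENT_ID, MOUNTPOINT)
    out = subprocess.check_output(["ceph", "--admin-daemon", asok, "status"])
    status = json.loads(out)
    # The test only needs to confirm the client is mounted and not blocklisted.
    print(asok, status.get("fs_name"), status.get("blocklisted"))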
2026-03-09T16:12:24.601 INFO:tasks.workunit:timeout=3h 2026-03-09T16:12:24.601 INFO:tasks.workunit:cleanup=True 2026-03-09T16:12:24.601 DEBUG:teuthology.orchestra.run.vm03:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-09T16:12:24.623 INFO:teuthology.orchestra.run.vm03.stdout: File: /home/ubuntu/cephtest/mnt.0 2026-03-09T16:12:24.623 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-09T16:12:24.623 INFO:teuthology.orchestra.run.vm03.stdout:Device: 4dh/77d Inode: 1 Links: 2 2026-03-09T16:12:24.623 INFO:teuthology.orchestra.run.vm03.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-09T16:12:24.623 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-09T16:12:24.623 INFO:teuthology.orchestra.run.vm03.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-09T16:12:24.623 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-09 16:12:14.621128348 +0000 2026-03-09T16:12:24.623 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-09 16:12:24.500776091 +0000 2026-03-09T16:12:24.623 INFO:teuthology.orchestra.run.vm03.stdout: Birth: - 2026-03-09T16:12:24.623 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.0 2026-03-09T16:12:24.623 DEBUG:teuthology.orchestra.run.vm03:> cd -- /home/ubuntu/cephtest/mnt.0 && sudo install -d -m 0755 --owner=ubuntu -- client.0 2026-03-09T16:12:24.683 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:24 vm05.local ceph-mon[58702]: pgmap v84: 65 pgs: 65 active+clean; 451 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 2.8 KiB/s rd, 1.5 KiB/s wr, 7 op/s 2026-03-09T16:12:24.708 DEBUG:teuthology.orchestra.run.vm05:> stat -- /home/ubuntu/cephtest/mnt.1 2026-03-09T16:12:24.729 INFO:teuthology.orchestra.run.vm05.stdout: File: /home/ubuntu/cephtest/mnt.1 2026-03-09T16:12:24.729 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-09T16:12:24.729 INFO:teuthology.orchestra.run.vm05.stdout:Device: 5ah/90d Inode: 1 Links: 3 2026-03-09T16:12:24.729 INFO:teuthology.orchestra.run.vm05.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-09T16:12:24.729 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-09T16:12:24.729 INFO:teuthology.orchestra.run.vm05.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-09T16:12:24.729 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-09 16:12:24.702048762 +0000 2026-03-09T16:12:24.729 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-09 16:12:24.702048762 +0000 2026-03-09T16:12:24.729 INFO:teuthology.orchestra.run.vm05.stdout: Birth: - 2026-03-09T16:12:24.729 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.1 2026-03-09T16:12:24.730 DEBUG:teuthology.orchestra.run.vm05:> cd -- /home/ubuntu/cephtest/mnt.1 && sudo install -d -m 0755 --owner=ubuntu -- client.1 2026-03-09T16:12:24.772 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:24.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:24 vm03.local ceph-mon[51019]: pgmap v84: 65 pgs: 65 active+clean; 451 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 2.8 KiB/s rd, 1.5 KiB/s wr, 7 op/s 2026-03-09T16:12:24.809 DEBUG:teuthology.orchestra.run.vm03:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd 
/home/ubuntu/cephtest/clone.client.0 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-09T16:12:24.809 DEBUG:teuthology.orchestra.run.vm05:> rm -rf /home/ubuntu/cephtest/clone.client.1 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.1 && cd /home/ubuntu/cephtest/clone.client.1 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-09T16:12:24.842 INFO:tasks.workunit.client.0.vm03.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-09T16:12:24.869 INFO:tasks.workunit.client.1.vm05.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.1'... 2026-03-09T16:12:25.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.069+0000 7fa9801ab640 1 -- 192.168.123.103:0/290107039 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa978103c80 msgr2=0x7fa978104100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:25.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.069+0000 7fa9801ab640 1 --2- 192.168.123.103:0/290107039 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa978103c80 0x7fa978104100 secure :-1 s=READY pgs=263 cs=0 l=1 rev1=1 crypto rx=0x7fa96c009a00 tx=0x7fa96c02f270 comp rx=0 tx=0).stop 2026-03-09T16:12:25.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.071+0000 7fa9801ab640 1 -- 192.168.123.103:0/290107039 shutdown_connections 2026-03-09T16:12:25.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.071+0000 7fa9801ab640 1 --2- 192.168.123.103:0/290107039 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa978103c80 0x7fa978104100 unknown :-1 s=CLOSED pgs=263 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:25.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.071+0000 7fa9801ab640 1 --2- 192.168.123.103:0/290107039 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa978102a80 0x7fa978102e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:25.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.071+0000 7fa9801ab640 1 -- 192.168.123.103:0/290107039 >> 192.168.123.103:0/290107039 conn(0x7fa9780fe230 msgr2=0x7fa978100650 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:25.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.071+0000 7fa9801ab640 1 -- 192.168.123.103:0/290107039 shutdown_connections 2026-03-09T16:12:25.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.072+0000 7fa9801ab640 1 -- 192.168.123.103:0/290107039 wait complete. 
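Note: the workunit task gives every client its own scratch directory inside the CephFS mount and a fresh clone of the workunit repo at the pinned ref before running anything. A rough sketch of that per-client preparation, with the repo URL and ref copied from the log and the two-client role list assumed for illustration:

#!/usr/bin/env python3
# Sketch of the per-client workunit preparation seen above: create
# <mount>/client.<id> owned by the test user, then clone the workunit
# sources into a per-client directory pinned to the tested ref.
import subprocess

REPO = "https://github.com/kshtsk/ceph.git"               # from the log
REF = "569c3e99c9b32a51b4eaf08731c728f4513ed589"          # from the log
CLIENTS = {"0": "/home/ubuntu/cephtest/mnt.0",            # assumed roles
           "1": "/home/ubuntu/cephtest/mnt.1"}

def run(cmd, **kw):
    print("+", " ".join(cmd))
    subprocess.check_call(cmd, **kw)

for cid, mnt in CLIENTS.items():
    scratch = f"client.{cid}"
    clone = f"/home/ubuntu/cephtest/clone.client.{cid}"
    # Scratch dir inside the mount, owned by the unprivileged test user.
    run(["sudo", "install", "-d", "-m", "0755", "--owner=ubuntu", "--", scratch],
        cwd=mnt)
    # Fresh clone of the workunit repo, checked out at the pinned sha1.
    run(["rm", "-rf", clone])
    run(["git", "clone", REPO, clone])
    run(["git", "checkout", REF], cwd=clone)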
2026-03-09T16:12:25.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.072+0000 7fa9801ab640 1 Processor -- start 2026-03-09T16:12:25.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.072+0000 7fa9801ab640 1 -- start start 2026-03-09T16:12:25.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.072+0000 7fa9801ab640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa978102a80 0x7fa97819a480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:25.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.073+0000 7fa9801ab640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa978103c80 0x7fa97819a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:25.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.073+0000 7fa9801ab640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa97819af90 con 0x7fa978103c80 2026-03-09T16:12:25.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.073+0000 7fa9801ab640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa97819b100 con 0x7fa978102a80 2026-03-09T16:12:25.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.073+0000 7fa97df20640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa978102a80 0x7fa97819a480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:25.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.073+0000 7fa97df20640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa978102a80 0x7fa97819a480 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:58714/0 (socket says 192.168.123.103:58714) 2026-03-09T16:12:25.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.073+0000 7fa97df20640 1 -- 192.168.123.103:0/2354001911 learned_addr learned my addr 192.168.123.103:0/2354001911 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:25.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.073+0000 7fa97df20640 1 -- 192.168.123.103:0/2354001911 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa978103c80 msgr2=0x7fa97819a9c0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:12:25.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.074+0000 7fa97d71f640 1 --2- 192.168.123.103:0/2354001911 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa978103c80 0x7fa97819a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:25.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.074+0000 7fa97df20640 1 --2- 192.168.123.103:0/2354001911 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa978103c80 0x7fa97819a9c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:25.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.074+0000 7fa97df20640 1 -- 192.168.123.103:0/2354001911 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa96c009660 con 
0x7fa978102a80 2026-03-09T16:12:25.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.074+0000 7fa97d71f640 1 --2- 192.168.123.103:0/2354001911 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa978103c80 0x7fa97819a9c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:12:25.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.074+0000 7fa97df20640 1 --2- 192.168.123.103:0/2354001911 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa978102a80 0x7fa97819a480 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fa96800d900 tx=0x7fa96800ddd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:25.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.074+0000 7fa966ffd640 1 -- 192.168.123.103:0/2354001911 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa968004490 con 0x7fa978102a80 2026-03-09T16:12:25.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.074+0000 7fa966ffd640 1 -- 192.168.123.103:0/2354001911 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa96800bd00 con 0x7fa978102a80 2026-03-09T16:12:25.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.074+0000 7fa9801ab640 1 -- 192.168.123.103:0/2354001911 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa97819fba0 con 0x7fa978102a80 2026-03-09T16:12:25.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.074+0000 7fa9801ab640 1 -- 192.168.123.103:0/2354001911 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa9781a00f0 con 0x7fa978102a80 2026-03-09T16:12:25.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.075+0000 7fa9801ab640 1 -- 192.168.123.103:0/2354001911 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa97810b6b0 con 0x7fa978102a80 2026-03-09T16:12:25.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.075+0000 7fa966ffd640 1 -- 192.168.123.103:0/2354001911 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa968010460 con 0x7fa978102a80 2026-03-09T16:12:25.077 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.076+0000 7fa966ffd640 1 -- 192.168.123.103:0/2354001911 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fa9680105c0 con 0x7fa978102a80 2026-03-09T16:12:25.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.076+0000 7fa966ffd640 1 --2- 192.168.123.103:0/2354001911 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa9540761c0 0x7fa954078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:25.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.077+0000 7fa97d71f640 1 --2- 192.168.123.103:0/2354001911 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa9540761c0 0x7fa954078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:25.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.077+0000 7fa966ffd640 1 -- 192.168.123.103:0/2354001911 <== mon.1 
v2:192.168.123.105:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fa968098e30 con 0x7fa978102a80 2026-03-09T16:12:25.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.077+0000 7fa97d71f640 1 --2- 192.168.123.103:0/2354001911 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa9540761c0 0x7fa954078680 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fa96c002c80 tx=0x7fa96c0023d0 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:25.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.078+0000 7fa966ffd640 1 -- 192.168.123.103:0/2354001911 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fa9680623f0 con 0x7fa978102a80 2026-03-09T16:12:25.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.181+0000 7fa9801ab640 1 -- 192.168.123.103:0/2354001911 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}] v 0) v1 -- 0x7fa97819fd30 con 0x7fa978102a80 2026-03-09T16:12:25.184 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.183+0000 7fa966ffd640 1 -- 192.168.123.103:0/2354001911 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}]=0 v17)=0 v17) v1 ==== 155+0+0 (secure 0 0 0) 0x7fa97819fd30 con 0x7fa978102a80 2026-03-09T16:12:25.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.185+0000 7fa9801ab640 1 -- 192.168.123.103:0/2354001911 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa9540761c0 msgr2=0x7fa954078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:25.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.185+0000 7fa9801ab640 1 --2- 192.168.123.103:0/2354001911 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa9540761c0 0x7fa954078680 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fa96c002c80 tx=0x7fa96c0023d0 comp rx=0 tx=0).stop 2026-03-09T16:12:25.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.186+0000 7fa9801ab640 1 -- 192.168.123.103:0/2354001911 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa978102a80 msgr2=0x7fa97819a480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:25.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.186+0000 7fa9801ab640 1 --2- 192.168.123.103:0/2354001911 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa978102a80 0x7fa97819a480 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fa96800d900 tx=0x7fa96800ddd0 comp rx=0 tx=0).stop 2026-03-09T16:12:25.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.186+0000 7fa9801ab640 1 -- 192.168.123.103:0/2354001911 shutdown_connections 2026-03-09T16:12:25.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.186+0000 7fa9801ab640 1 --2- 192.168.123.103:0/2354001911 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa9540761c0 0x7fa954078680 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:25.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.186+0000 7fa9801ab640 1 --2- 192.168.123.103:0/2354001911 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa978103c80 0x7fa97819a9c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:25.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.186+0000 7fa9801ab640 1 --2- 192.168.123.103:0/2354001911 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa978102a80 0x7fa97819a480 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:25.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.187+0000 7fa9801ab640 1 -- 192.168.123.103:0/2354001911 >> 192.168.123.103:0/2354001911 conn(0x7fa9780fe230 msgr2=0x7fa9780ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:25.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.187+0000 7fa9801ab640 1 -- 192.168.123.103:0/2354001911 shutdown_connections 2026-03-09T16:12:25.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.187+0000 7fa9801ab640 1 -- 192.168.123.103:0/2354001911 wait complete. 2026-03-09T16:12:25.255 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force' 2026-03-09T16:12:25.462 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:25.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.720+0000 7fc34e2c7640 1 -- 192.168.123.103:0/4196793068 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3400a4840 msgr2=0x7fc3400a4c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:25.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.720+0000 7fc34e2c7640 1 --2- 192.168.123.103:0/4196793068 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3400a4840 0x7fc3400a4c40 secure :-1 s=READY pgs=264 cs=0 l=1 rev1=1 crypto rx=0x7fc33801cb30 tx=0x7fc338040420 comp rx=0 tx=0).stop 2026-03-09T16:12:25.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.720+0000 7fc34e2c7640 1 -- 192.168.123.103:0/4196793068 shutdown_connections 2026-03-09T16:12:25.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.720+0000 7fc34e2c7640 1 --2- 192.168.123.103:0/4196793068 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc3400a5930 0x7fc3400a5d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:25.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.720+0000 7fc34e2c7640 1 --2- 192.168.123.103:0/4196793068 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3400a4840 0x7fc3400a4c40 unknown :-1 s=CLOSED pgs=264 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:25.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.720+0000 7fc34e2c7640 1 -- 192.168.123.103:0/4196793068 >> 192.168.123.103:0/4196793068 conn(0x7fc34009fe90 msgr2=0x7fc3400a22f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:25.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.721+0000 7fc34e2c7640 1 -- 192.168.123.103:0/4196793068 shutdown_connections 2026-03-09T16:12:25.722 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.721+0000 7fc34e2c7640 1 -- 192.168.123.103:0/4196793068 wait complete. 2026-03-09T16:12:25.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.722+0000 7fc34e2c7640 1 Processor -- start 2026-03-09T16:12:25.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.722+0000 7fc34e2c7640 1 -- start start 2026-03-09T16:12:25.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.722+0000 7fc34e2c7640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc3400a4840 0x7fc34013c0c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:25.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.722+0000 7fc34e2c7640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3400a5930 0x7fc34013c600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:25.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.722+0000 7fc34e2c7640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc34013cbd0 con 0x7fc3400a5930 2026-03-09T16:12:25.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.722+0000 7fc34e2c7640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc34013cd40 con 0x7fc3400a4840 2026-03-09T16:12:25.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.723+0000 7fc34cac4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3400a5930 0x7fc34013c600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:25.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.723+0000 7fc34cac4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3400a5930 0x7fc34013c600 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:38470/0 (socket says 192.168.123.103:38470) 2026-03-09T16:12:25.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.723+0000 7fc34cac4640 1 -- 192.168.123.103:0/568563748 learned_addr learned my addr 192.168.123.103:0/568563748 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:25.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.723+0000 7fc34d2c5640 1 --2- 192.168.123.103:0/568563748 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc3400a4840 0x7fc34013c0c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:25.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.723+0000 7fc34d2c5640 1 -- 192.168.123.103:0/568563748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3400a5930 msgr2=0x7fc34013c600 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:25.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.723+0000 7fc34d2c5640 1 --2- 192.168.123.103:0/568563748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3400a5930 0x7fc34013c600 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:25.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.723+0000 7fc34d2c5640 1 -- 
192.168.123.103:0/568563748 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc33801c790 con 0x7fc3400a4840 2026-03-09T16:12:25.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.724+0000 7fc34d2c5640 1 --2- 192.168.123.103:0/568563748 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc3400a4840 0x7fc34013c0c0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fc338040930 tx=0x7fc338040f10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:25.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.724+0000 7fc3367fc640 1 -- 192.168.123.103:0/568563748 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc33804f070 con 0x7fc3400a4840 2026-03-09T16:12:25.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.724+0000 7fc34e2c7640 1 -- 192.168.123.103:0/568563748 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc340141780 con 0x7fc3400a4840 2026-03-09T16:12:25.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.724+0000 7fc34e2c7640 1 -- 192.168.123.103:0/568563748 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc340141c40 con 0x7fc3400a4840 2026-03-09T16:12:25.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.725+0000 7fc3367fc640 1 -- 192.168.123.103:0/568563748 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc338002d30 con 0x7fc3400a4840 2026-03-09T16:12:25.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.725+0000 7fc3367fc640 1 -- 192.168.123.103:0/568563748 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc338055430 con 0x7fc3400a4840 2026-03-09T16:12:25.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.726+0000 7fc3367fc640 1 -- 192.168.123.103:0/568563748 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fc338055590 con 0x7fc3400a4840 2026-03-09T16:12:25.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.727+0000 7fc3367fc640 1 --2- 192.168.123.103:0/568563748 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fc31c0761c0 0x7fc31c078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:25.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.727+0000 7fc34cac4640 1 --2- 192.168.123.103:0/568563748 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fc31c0761c0 0x7fc31c078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:25.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.727+0000 7fc3367fc640 1 -- 192.168.123.103:0/568563748 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fc3380cec20 con 0x7fc3400a4840 2026-03-09T16:12:25.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.727+0000 7fc34e2c7640 1 -- 192.168.123.103:0/568563748 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc310005350 con 0x7fc3400a4840 2026-03-09T16:12:25.728 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.728+0000 7fc34cac4640 1 --2- 192.168.123.103:0/568563748 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fc31c0761c0 0x7fc31c078680 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fc34013d5e0 tx=0x7fc34400b040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:25.731 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.730+0000 7fc3367fc640 1 -- 192.168.123.103:0/568563748 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fc338098160 con 0x7fc3400a4840 2026-03-09T16:12:25.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.828+0000 7fc34e2c7640 1 -- 192.168.123.103:0/568563748 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}] v 0) v1 -- 0x7fc3100051c0 con 0x7fc3400a4840 2026-03-09T16:12:25.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.829+0000 7fc3367fc640 1 -- 192.168.123.103:0/568563748 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}]=0 v17)=0 v17) v1 ==== 163+0+0 (secure 0 0 0) 0x7fc338097b00 con 0x7fc3400a4840 2026-03-09T16:12:25.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.832+0000 7fc34e2c7640 1 -- 192.168.123.103:0/568563748 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fc31c0761c0 msgr2=0x7fc31c078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:25.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.832+0000 7fc34e2c7640 1 --2- 192.168.123.103:0/568563748 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fc31c0761c0 0x7fc31c078680 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fc34013d5e0 tx=0x7fc34400b040 comp rx=0 tx=0).stop 2026-03-09T16:12:25.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.832+0000 7fc34e2c7640 1 -- 192.168.123.103:0/568563748 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc3400a4840 msgr2=0x7fc34013c0c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:25.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.832+0000 7fc34e2c7640 1 --2- 192.168.123.103:0/568563748 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc3400a4840 0x7fc34013c0c0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fc338040930 tx=0x7fc338040f10 comp rx=0 tx=0).stop 2026-03-09T16:12:25.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.832+0000 7fc34e2c7640 1 -- 192.168.123.103:0/568563748 shutdown_connections 2026-03-09T16:12:25.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.832+0000 7fc34e2c7640 1 --2- 192.168.123.103:0/568563748 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fc31c0761c0 0x7fc31c078680 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:25.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.832+0000 7fc34e2c7640 1 --2- 192.168.123.103:0/568563748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3400a5930 0x7fc34013c600 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
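Note: each `ceph config set` in this sequence (the two global_id_reclaim warnings above, and log_to_journald just below) is wrapped in `cephadm shell`, so it runs in a container of the starting image with the host's ceph.conf and admin keyring mounted in. A hedged sketch of driving that same pattern, with the image, fsid and sha1 copied from the log:

#!/usr/bin/env python3
# Sketch of the "cephadm shell -- bash -c 'ceph config set ...'" pattern
# used by the cephadm.shell task in this run.
import subprocess

IMAGE = "quay.ceph.io/ceph-ci/ceph:reef"                     # from the log
FSID = "2b05df78-1bd2-11f1-83c0-c950214d6edc"                # from the log
SHA1 = "e911bdebe5c8faa3800735d1568fcdca65db60df"            # from the log

SETTINGS = [  # (who, option, value) triples taken from this run
    ("mon", "mon_warn_on_insecure_global_id_reclaim", "false"),
    ("mon", "mon_warn_on_insecure_global_id_reclaim_allowed", "false"),
    ("global", "log_to_journald", "false"),
]

for who, opt, val in SETTINGS:
    inner = f"ceph config set {who} {opt} {val} --force"
    subprocess.check_call([
        "sudo", "/home/ubuntu/cephtest/cephadm",
        "--image", IMAGE, "shell",
        "-c", "/etc/ceph/ceph.conf",
        "-k", "/etc/ceph/ceph.client.admin.keyring",
        "--fsid", FSID,
        "-e", f"sha1={SHA1}",
        "--", "bash", "-c", inner,
    ])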
2026-03-09T16:12:25.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.832+0000 7fc34e2c7640 1 --2- 192.168.123.103:0/568563748 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc3400a4840 0x7fc34013c0c0 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:25.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.832+0000 7fc34e2c7640 1 -- 192.168.123.103:0/568563748 >> 192.168.123.103:0/568563748 conn(0x7fc34009fe90 msgr2=0x7fc3400a1730 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:25.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.832+0000 7fc34e2c7640 1 -- 192.168.123.103:0/568563748 shutdown_connections 2026-03-09T16:12:25.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:25.832+0000 7fc34e2c7640 1 -- 192.168.123.103:0/568563748 wait complete. 2026-03-09T16:12:25.878 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set global log_to_journald false --force' 2026-03-09T16:12:26.039 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:26.067 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:25 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:26.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:25 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:26.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.332+0000 7f5f84a8f640 1 -- 192.168.123.103:0/3500734475 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f80075ba0 msgr2=0x7f5f80075fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:26.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.332+0000 7f5f84a8f640 1 --2- 192.168.123.103:0/3500734475 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f80075ba0 0x7f5f80075fa0 secure :-1 s=READY pgs=265 cs=0 l=1 rev1=1 crypto rx=0x7f5f640099b0 tx=0x7f5f6402f220 comp rx=0 tx=0).stop 2026-03-09T16:12:26.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.334+0000 7f5f84a8f640 1 -- 192.168.123.103:0/3500734475 shutdown_connections 2026-03-09T16:12:26.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.334+0000 7f5f84a8f640 1 --2- 192.168.123.103:0/3500734475 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f80076df0 0x7f5f80077250 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:26.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.334+0000 7f5f84a8f640 1 --2- 192.168.123.103:0/3500734475 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f80075ba0 0x7f5f80075fa0 unknown :-1 s=CLOSED pgs=265 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:26.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.334+0000 7f5f84a8f640 1 -- 192.168.123.103:0/3500734475 >> 192.168.123.103:0/3500734475 conn(0x7f5f800fe060 msgr2=0x7f5f80100480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:26.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.334+0000 
7f5f84a8f640 1 -- 192.168.123.103:0/3500734475 shutdown_connections 2026-03-09T16:12:26.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.334+0000 7f5f84a8f640 1 -- 192.168.123.103:0/3500734475 wait complete. 2026-03-09T16:12:26.336 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.335+0000 7f5f84a8f640 1 Processor -- start 2026-03-09T16:12:26.336 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.335+0000 7f5f84a8f640 1 -- start start 2026-03-09T16:12:26.336 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.335+0000 7f5f84a8f640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f80075ba0 0x7f5f8019e810 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:26.336 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.335+0000 7f5f84a8f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f80076df0 0x7f5f8019ed50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:26.336 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.335+0000 7f5f84a8f640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5f8019f320 con 0x7f5f80076df0 2026-03-09T16:12:26.336 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.335+0000 7f5f84a8f640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5f8019f490 con 0x7f5f80075ba0 2026-03-09T16:12:26.336 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.336+0000 7f5f7effd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f80076df0 0x7f5f8019ed50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:26.336 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.336+0000 7f5f7effd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f80076df0 0x7f5f8019ed50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:38486/0 (socket says 192.168.123.103:38486) 2026-03-09T16:12:26.336 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.336+0000 7f5f7effd640 1 -- 192.168.123.103:0/1218816500 learned_addr learned my addr 192.168.123.103:0/1218816500 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:26.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.336+0000 7f5f7f7fe640 1 --2- 192.168.123.103:0/1218816500 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f80075ba0 0x7f5f8019e810 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:26.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.336+0000 7f5f7effd640 1 -- 192.168.123.103:0/1218816500 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f80075ba0 msgr2=0x7f5f8019e810 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:26.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.336+0000 7f5f7effd640 1 --2- 192.168.123.103:0/1218816500 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f80075ba0 0x7f5f8019e810 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:26.337 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.336+0000 7f5f7effd640 1 -- 192.168.123.103:0/1218816500 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5f64009660 con 0x7f5f80076df0 2026-03-09T16:12:26.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.336+0000 7f5f7effd640 1 --2- 192.168.123.103:0/1218816500 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f80076df0 0x7f5f8019ed50 secure :-1 s=READY pgs=266 cs=0 l=1 rev1=1 crypto rx=0x7f5f6c00b730 tx=0x7f5f6c00bc00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:26.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.337+0000 7f5f7cff9640 1 -- 192.168.123.103:0/1218816500 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5f6c004280 con 0x7f5f80076df0 2026-03-09T16:12:26.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.337+0000 7f5f84a8f640 1 -- 192.168.123.103:0/1218816500 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5f801a3f30 con 0x7f5f80076df0 2026-03-09T16:12:26.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.337+0000 7f5f84a8f640 1 -- 192.168.123.103:0/1218816500 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5f801a44d0 con 0x7f5f80076df0 2026-03-09T16:12:26.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.337+0000 7f5f7cff9640 1 -- 192.168.123.103:0/1218816500 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5f6c0043e0 con 0x7f5f80076df0 2026-03-09T16:12:26.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.337+0000 7f5f7cff9640 1 -- 192.168.123.103:0/1218816500 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5f6c00ca90 con 0x7f5f80076df0 2026-03-09T16:12:26.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.339+0000 7f5f7cff9640 1 -- 192.168.123.103:0/1218816500 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f5f6c00cbf0 con 0x7f5f80076df0 2026-03-09T16:12:26.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.340+0000 7f5f7cff9640 1 --2- 192.168.123.103:0/1218816500 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5f500761c0 0x7f5f50078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:26.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.340+0000 7f5f7f7fe640 1 --2- 192.168.123.103:0/1218816500 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5f500761c0 0x7f5f50078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:26.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.340+0000 7f5f7cff9640 1 -- 192.168.123.103:0/1218816500 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f5f6c095e60 con 0x7f5f80076df0 2026-03-09T16:12:26.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.341+0000 7f5f84a8f640 1 -- 192.168.123.103:0/1218816500 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f5f8010fa50 con 0x7f5f80076df0 2026-03-09T16:12:26.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.344+0000 7f5f7f7fe640 1 --2- 192.168.123.103:0/1218816500 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5f500761c0 0x7f5f50078680 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f5f64002c20 tx=0x7f5f6403a040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:26.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.344+0000 7f5f7cff9640 1 -- 192.168.123.103:0/1218816500 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f5f6c09b050 con 0x7f5f80076df0 2026-03-09T16:12:26.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.442+0000 7f5f84a8f640 1 -- 192.168.123.103:0/1218816500 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=log_to_journald}] v 0) v1 -- 0x7f5f801a49f0 con 0x7f5f80076df0 2026-03-09T16:12:26.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.443+0000 7f5f7cff9640 1 -- 192.168.123.103:0/1218816500 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=log_to_journald}]=0 v17) v1 ==== 135+0+0 (secure 0 0 0) 0x7f5f6c05f540 con 0x7f5f80076df0 2026-03-09T16:12:26.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.446+0000 7f5f84a8f640 1 -- 192.168.123.103:0/1218816500 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5f500761c0 msgr2=0x7f5f50078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:26.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.446+0000 7f5f84a8f640 1 --2- 192.168.123.103:0/1218816500 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5f500761c0 0x7f5f50078680 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f5f64002c20 tx=0x7f5f6403a040 comp rx=0 tx=0).stop 2026-03-09T16:12:26.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.446+0000 7f5f84a8f640 1 -- 192.168.123.103:0/1218816500 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f80076df0 msgr2=0x7f5f8019ed50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:26.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.446+0000 7f5f84a8f640 1 --2- 192.168.123.103:0/1218816500 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f80076df0 0x7f5f8019ed50 secure :-1 s=READY pgs=266 cs=0 l=1 rev1=1 crypto rx=0x7f5f6c00b730 tx=0x7f5f6c00bc00 comp rx=0 tx=0).stop 2026-03-09T16:12:26.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.446+0000 7f5f84a8f640 1 -- 192.168.123.103:0/1218816500 shutdown_connections 2026-03-09T16:12:26.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.446+0000 7f5f84a8f640 1 --2- 192.168.123.103:0/1218816500 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f5f500761c0 0x7f5f50078680 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:26.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.446+0000 7f5f84a8f640 1 --2- 192.168.123.103:0/1218816500 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f80076df0 0x7f5f8019ed50 unknown :-1 s=CLOSED pgs=266 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T16:12:26.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.446+0000 7f5f84a8f640 1 --2- 192.168.123.103:0/1218816500 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f80075ba0 0x7f5f8019e810 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:26.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.446+0000 7f5f84a8f640 1 -- 192.168.123.103:0/1218816500 >> 192.168.123.103:0/1218816500 conn(0x7f5f800fe060 msgr2=0x7f5f800ffba0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:26.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.447+0000 7f5f84a8f640 1 -- 192.168.123.103:0/1218816500 shutdown_connections 2026-03-09T16:12:26.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.447+0000 7f5f84a8f640 1 -- 192.168.123.103:0/1218816500 wait complete. 2026-03-09T16:12:26.518 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1 --daemon-types mgr' 2026-03-09T16:12:26.687 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:12:26.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.944+0000 7f946ad17640 1 -- 192.168.123.103:0/1673348429 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f946410b920 msgr2=0x7f946410bd20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:26.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.944+0000 7f946ad17640 1 --2- 192.168.123.103:0/1673348429 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f946410b920 0x7f946410bd20 secure :-1 s=READY pgs=267 cs=0 l=1 rev1=1 crypto rx=0x7f9454009a00 tx=0x7f945402f270 comp rx=0 tx=0).stop 2026-03-09T16:12:26.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.945+0000 7f946ad17640 1 -- 192.168.123.103:0/1673348429 shutdown_connections 2026-03-09T16:12:26.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.945+0000 7f946ad17640 1 --2- 192.168.123.103:0/1673348429 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f946410c6a0 0x7f946410cb00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:26.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.945+0000 7f946ad17640 1 --2- 192.168.123.103:0/1673348429 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f946410b920 0x7f946410bd20 unknown :-1 s=CLOSED pgs=267 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:26.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.945+0000 7f946ad17640 1 -- 192.168.123.103:0/1673348429 >> 192.168.123.103:0/1673348429 conn(0x7f946406a890 msgr2=0x7f946406acc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:12:26.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.945+0000 7f946ad17640 1 -- 192.168.123.103:0/1673348429 shutdown_connections 2026-03-09T16:12:26.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.945+0000 7f946ad17640 1 -- 192.168.123.103:0/1673348429 wait complete. 
2026-03-09T16:12:26.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.945+0000 7f946ad17640 1 Processor -- start 2026-03-09T16:12:26.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.946+0000 7f946ad17640 1 -- start start 2026-03-09T16:12:26.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.946+0000 7f946ad17640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f946410b920 0x7f946411b230 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:26.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.946+0000 7f946ad17640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f946410c6a0 0x7f9464117c90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:26.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.946+0000 7f946ad17640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f94641181d0 con 0x7f946410b920 2026-03-09T16:12:26.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.946+0000 7f946ad17640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9464118340 con 0x7f946410c6a0 2026-03-09T16:12:26.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.947+0000 7f945bfff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f946410c6a0 0x7f9464117c90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:26.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.947+0000 7f945bfff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f946410c6a0 0x7f9464117c90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:58764/0 (socket says 192.168.123.103:58764) 2026-03-09T16:12:26.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.947+0000 7f945bfff640 1 -- 192.168.123.103:0/617577796 learned_addr learned my addr 192.168.123.103:0/617577796 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:12:26.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.947+0000 7f9468a8c640 1 --2- 192.168.123.103:0/617577796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f946410b920 0x7f946411b230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:26.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.947+0000 7f945bfff640 1 -- 192.168.123.103:0/617577796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f946410b920 msgr2=0x7f946411b230 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:12:26.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.947+0000 7f945bfff640 1 --2- 192.168.123.103:0/617577796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f946410b920 0x7f946411b230 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:12:26.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.947+0000 7f945bfff640 1 -- 192.168.123.103:0/617577796 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9454009660 con 
0x7f946410c6a0 2026-03-09T16:12:26.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.947+0000 7f9468a8c640 1 --2- 192.168.123.103:0/617577796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f946410b920 0x7f946411b230 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T16:12:26.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.948+0000 7f945bfff640 1 --2- 192.168.123.103:0/617577796 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f946410c6a0 0x7f9464117c90 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f944c00d8d0 tx=0x7f944c00dda0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:26.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.948+0000 7f9459ffb640 1 -- 192.168.123.103:0/617577796 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f944c004490 con 0x7f946410c6a0 2026-03-09T16:12:26.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.948+0000 7f946ad17640 1 -- 192.168.123.103:0/617577796 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9464118620 con 0x7f946410c6a0 2026-03-09T16:12:26.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.948+0000 7f946ad17640 1 -- 192.168.123.103:0/617577796 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f94641b5b40 con 0x7f946410c6a0 2026-03-09T16:12:26.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.949+0000 7f9459ffb640 1 -- 192.168.123.103:0/617577796 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f944c00bd00 con 0x7f946410c6a0 2026-03-09T16:12:26.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.950+0000 7f9459ffb640 1 -- 192.168.123.103:0/617577796 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f944c010460 con 0x7f946410c6a0 2026-03-09T16:12:26.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.950+0000 7f9459ffb640 1 -- 192.168.123.103:0/617577796 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f944c010610 con 0x7f946410c6a0 2026-03-09T16:12:26.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.950+0000 7f946ad17640 1 -- 192.168.123.103:0/617577796 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f942c005350 con 0x7f946410c6a0 2026-03-09T16:12:26.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.951+0000 7f9459ffb640 1 --2- 192.168.123.103:0/617577796 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9440075f60 0x7f9440078420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:12:26.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.951+0000 7f9459ffb640 1 -- 192.168.123.103:0/617577796 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f944c097a70 con 0x7f946410c6a0 2026-03-09T16:12:26.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.951+0000 7f9468a8c640 1 --2- 192.168.123.103:0/617577796 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9440075f60 0x7f9440078420 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:12:26.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.952+0000 7f9468a8c640 1 --2- 192.168.123.103:0/617577796 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9440075f60 0x7f9440078420 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f9454005bb0 tx=0x7f9454005b40 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:12:26.954 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:26.954+0000 7f9459ffb640 1 -- 192.168.123.103:0/617577796 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f944c060fb0 con 0x7f946410c6a0 2026-03-09T16:12:27.071 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:26 vm03.local ceph-mon[51019]: pgmap v85: 65 pgs: 65 active+clean; 453 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 8.0 KiB/s rd, 1.6 KiB/s wr, 12 op/s 2026-03-09T16:12:27.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:12:27.067+0000 7f946ad17640 1 -- 192.168.123.103:0/617577796 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "daemon_types": "mgr", "target": ["mon-mgr", ""]}) v1 -- 0x7f942c002ca0 con 0x7f9440075f60 2026-03-09T16:12:27.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:26 vm05.local ceph-mon[58702]: pgmap v85: 65 pgs: 65 active+clean; 453 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 8.0 KiB/s rd, 1.6 KiB/s wr, 12 op/s 2026-03-09T16:12:28.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:27 vm05.local ceph-mon[58702]: from='client.24325 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "daemon_types": "mgr", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:12:28.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:27 vm03.local ceph-mon[51019]: from='client.24325 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "daemon_types": "mgr", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:12:29.205 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:28 vm05.local ceph-mon[58702]: pgmap v86: 65 pgs: 65 active+clean; 453 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 8.0 KiB/s rd, 1.5 KiB/s wr, 11 op/s 2026-03-09T16:12:29.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:28 vm03.local ceph-mon[51019]: pgmap v86: 65 pgs: 65 active+clean; 453 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 8.0 KiB/s rd, 1.5 KiB/s wr, 11 op/s 2026-03-09T16:12:30.992 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:30 vm03.local ceph-mon[51019]: pgmap v87: 65 pgs: 65 active+clean; 457 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 7.1 KiB/s rd, 1005 B/s wr, 10 op/s 2026-03-09T16:12:31.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:30 vm05.local ceph-mon[58702]: pgmap v87: 65 pgs: 65 active+clean; 457 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 7.1 KiB/s rd, 1005 B/s wr, 10 op/s 2026-03-09T16:12:33.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:33 vm05.local ceph-mon[58702]: pgmap v88: 65 pgs: 65 active+clean; 457 KiB data, 161 MiB 
used, 120 GiB / 120 GiB avail; 6.2 KiB/s rd, 767 B/s wr, 6 op/s 2026-03-09T16:12:33.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:33 vm03.local ceph-mon[51019]: pgmap v88: 65 pgs: 65 active+clean; 457 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 6.2 KiB/s rd, 767 B/s wr, 6 op/s 2026-03-09T16:12:35.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:35 vm05.local ceph-mon[58702]: pgmap v89: 65 pgs: 65 active+clean; 457 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 6.1 KiB/s rd, 767 B/s wr, 6 op/s 2026-03-09T16:12:35.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:35 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:35.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:35 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:12:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:35 vm03.local ceph-mon[51019]: pgmap v89: 65 pgs: 65 active+clean; 457 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 6.1 KiB/s rd, 767 B/s wr, 6 op/s 2026-03-09T16:12:35.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:35 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:12:35.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:35 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:12:37.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:37 vm03.local ceph-mon[51019]: pgmap v90: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 4.5 KiB/s rd, 1.2 KiB/s wr, 5 op/s 2026-03-09T16:12:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:37 vm05.local ceph-mon[58702]: pgmap v90: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 4.5 KiB/s rd, 1.2 KiB/s wr, 5 op/s 2026-03-09T16:12:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:39 vm05.local ceph-mon[58702]: pgmap v91: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 853 B/s wr, 0 op/s 2026-03-09T16:12:39.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:39 vm03.local ceph-mon[51019]: pgmap v91: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 853 B/s wr, 0 op/s 2026-03-09T16:12:41.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:41 vm03.local ceph-mon[51019]: pgmap v92: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 853 B/s wr, 0 op/s 2026-03-09T16:12:41.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:41 vm05.local ceph-mon[58702]: pgmap v92: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 853 B/s wr, 0 op/s 2026-03-09T16:12:43.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:43 vm05.local ceph-mon[58702]: pgmap v93: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-09T16:12:43.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:43 vm03.local ceph-mon[51019]: pgmap v93: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-09T16:12:45.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:45 vm05.local ceph-mon[58702]: pgmap v94: 65 pgs: 65 active+clean; 460 KiB data, 161 
MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-09T16:12:45.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:45 vm03.local ceph-mon[51019]: pgmap v94: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-09T16:12:47.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:47 vm05.local ceph-mon[58702]: pgmap v95: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-09T16:12:47.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:47 vm03.local ceph-mon[51019]: pgmap v95: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-09T16:12:49.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:49 vm05.local ceph-mon[58702]: pgmap v96: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:49.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:49 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:12:49.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:49 vm03.local ceph-mon[51019]: pgmap v96: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:49.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:49 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:12:51.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:51 vm03.local ceph-mon[51019]: pgmap v97: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:51.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:51 vm05.local ceph-mon[58702]: pgmap v97: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:53.510 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:53 vm05.local ceph-mon[58702]: pgmap v98: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:53.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:53 vm03.local ceph-mon[51019]: pgmap v98: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:54.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:54 vm05.local ceph-mon[58702]: pgmap v99: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:55.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:54 vm03.local ceph-mon[51019]: pgmap v99: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:57.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:57 vm03.local ceph-mon[51019]: pgmap v100: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:57.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:57 vm05.local ceph-mon[58702]: pgmap v100: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:59.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:12:59 vm03.local ceph-mon[51019]: pgmap v101: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:12:59.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:12:59 vm05.local ceph-mon[58702]: pgmap v101: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB 
used, 120 GiB / 120 GiB avail 2026-03-09T16:13:02.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:02 vm05.local ceph-mon[58702]: pgmap v102: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:02.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:02 vm03.local ceph-mon[51019]: pgmap v102: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:03.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:03 vm05.local ceph-mon[58702]: pgmap v103: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:03.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:03 vm03.local ceph-mon[51019]: pgmap v103: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:04.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:04 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:13:04.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:04 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:13:05.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:05 vm05.local ceph-mon[58702]: pgmap v104: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:05.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:05 vm03.local ceph-mon[51019]: pgmap v104: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:07.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:07 vm05.local ceph-mon[58702]: pgmap v105: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:07.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:07 vm03.local ceph-mon[51019]: pgmap v105: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:09.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:09 vm05.local ceph-mon[58702]: pgmap v106: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:09.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:09 vm03.local ceph-mon[51019]: pgmap v106: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:11.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:11 vm03.local ceph-mon[51019]: pgmap v107: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:11 vm05.local ceph-mon[58702]: pgmap v107: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:13.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:13 vm05.local ceph-mon[58702]: pgmap v108: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:13.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:13 vm03.local ceph-mon[51019]: pgmap v108: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:16.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:15 vm05.local ceph-mon[58702]: pgmap v109: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:16.306 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:15 vm03.local ceph-mon[51019]: pgmap v109: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:17.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:17 vm05.local ceph-mon[58702]: pgmap v110: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:17.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:17 vm03.local ceph-mon[51019]: pgmap v110: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:19.725 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:19 vm03.local ceph-mon[51019]: pgmap v111: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:19.725 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:19 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:13:19.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:19 vm05.local ceph-mon[58702]: pgmap v111: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:19.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:19 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:13:19.802 INFO:teuthology.orchestra.run.vm03.stdout:Initiating upgrade to quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:13:19.803 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:19.800+0000 7f9459ffb640 1 -- 192.168.123.103:0/617577796 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+89 (secure 0 0 0) 0x7f942c002ca0 con 0x7f9440075f60 2026-03-09T16:13:19.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:19.808+0000 7f946ad17640 1 -- 192.168.123.103:0/617577796 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9440075f60 msgr2=0x7f9440078420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:19.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:19.808+0000 7f946ad17640 1 --2- 192.168.123.103:0/617577796 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9440075f60 0x7f9440078420 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f9454005bb0 tx=0x7f9454005b40 comp rx=0 tx=0).stop 2026-03-09T16:13:19.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:19.808+0000 7f946ad17640 1 -- 192.168.123.103:0/617577796 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f946410c6a0 msgr2=0x7f9464117c90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:19.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:19.808+0000 7f946ad17640 1 --2- 192.168.123.103:0/617577796 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f946410c6a0 0x7f9464117c90 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f944c00d8d0 tx=0x7f944c00dda0 comp rx=0 tx=0).stop 2026-03-09T16:13:19.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:19.811+0000 7f946ad17640 1 -- 192.168.123.103:0/617577796 shutdown_connections 2026-03-09T16:13:19.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:19.811+0000 7f946ad17640 1 --2- 192.168.123.103:0/617577796 >> 
[v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f9440075f60 0x7f9440078420 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:19.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:19.811+0000 7f946ad17640 1 --2- 192.168.123.103:0/617577796 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f946410c6a0 0x7f9464117c90 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:19.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:19.811+0000 7f946ad17640 1 --2- 192.168.123.103:0/617577796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f946410b920 0x7f946411b230 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:19.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:19.811+0000 7f946ad17640 1 -- 192.168.123.103:0/617577796 >> 192.168.123.103:0/617577796 conn(0x7f946406a890 msgr2=0x7f946410ada0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:19.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:19.811+0000 7f946ad17640 1 -- 192.168.123.103:0/617577796 shutdown_connections 2026-03-09T16:13:19.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:19.811+0000 7f946ad17640 1 -- 192.168.123.103:0/617577796 wait complete. 2026-03-09T16:13:19.881 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'while ceph orch upgrade status | jq '"'"'.in_progress'"'"' | grep true && ! 
ceph orch upgrade status | jq '"'"'.message'"'"' | grep Error ; do ceph orch ps ; ceph versions ; ceph orch upgrade status ; sleep 30 ; done' 2026-03-09T16:13:20.210 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:13:20.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.631+0000 7f0af6788640 1 -- 192.168.123.103:0/2418410007 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0af00725f0 msgr2=0x7f0af0077360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:20.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.631+0000 7f0af6788640 1 --2- 192.168.123.103:0/2418410007 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0af00725f0 0x7f0af0077360 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f0ae800d330 tx=0x7f0ae80319e0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.631+0000 7f0af6788640 1 -- 192.168.123.103:0/2418410007 shutdown_connections 2026-03-09T16:13:20.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.631+0000 7f0af6788640 1 --2- 192.168.123.103:0/2418410007 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0af00725f0 0x7f0af0077360 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.631+0000 7f0af6788640 1 --2- 192.168.123.103:0/2418410007 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0af0071c20 0x7f0af0072020 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.631+0000 7f0af6788640 1 -- 192.168.123.103:0/2418410007 >> 192.168.123.103:0/2418410007 conn(0x7f0af006d660 msgr2=0x7f0af006faa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:20.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.631+0000 7f0af6788640 1 -- 192.168.123.103:0/2418410007 shutdown_connections 2026-03-09T16:13:20.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.631+0000 7f0af6788640 1 -- 192.168.123.103:0/2418410007 wait complete. 
2026-03-09T16:13:20.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.631+0000 7f0af6788640 1 Processor -- start 2026-03-09T16:13:20.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.631+0000 7f0af6788640 1 -- start start 2026-03-09T16:13:20.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.631+0000 7f0af6788640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0af0071c20 0x7f0af0084160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:20.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.631+0000 7f0af6788640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0af00827b0 0x7f0af0082c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:20.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.631+0000 7f0af6788640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0af0083170 con 0x7f0af0071c20 2026-03-09T16:13:20.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.631+0000 7f0af6788640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0af00832e0 con 0x7f0af00827b0 2026-03-09T16:13:20.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.633+0000 7f0aeffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0af0071c20 0x7f0af0084160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:20.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.633+0000 7f0aeffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0af0071c20 0x7f0af0084160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41238/0 (socket says 192.168.123.103:41238) 2026-03-09T16:13:20.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.633+0000 7f0aeffff640 1 -- 192.168.123.103:0/4210040876 learned_addr learned my addr 192.168.123.103:0/4210040876 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:13:20.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.633+0000 7f0aeffff640 1 -- 192.168.123.103:0/4210040876 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0af00827b0 msgr2=0x7f0af0082c30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:20.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.633+0000 7f0aeffff640 1 --2- 192.168.123.103:0/4210040876 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0af00827b0 0x7f0af0082c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.633+0000 7f0aeffff640 1 -- 192.168.123.103:0/4210040876 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0ae8009d00 con 0x7f0af0071c20 2026-03-09T16:13:20.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.634+0000 7f0aeffff640 1 --2- 192.168.123.103:0/4210040876 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0af0071c20 0x7f0af0084160 secure :-1 s=READY pgs=268 cs=0 l=1 rev1=1 crypto rx=0x7f0ae000a9e0 tx=0x7f0ae000e500 comp rx=0 tx=0).ready 
entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:20.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.634+0000 7f0aed7fa640 1 -- 192.168.123.103:0/4210040876 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ae000f040 con 0x7f0af0071c20 2026-03-09T16:13:20.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.634+0000 7f0af6788640 1 -- 192.168.123.103:0/4210040876 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0af0083560 con 0x7f0af0071c20 2026-03-09T16:13:20.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.634+0000 7f0af6788640 1 -- 192.168.123.103:0/4210040876 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0af012efc0 con 0x7f0af0071c20 2026-03-09T16:13:20.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.635+0000 7f0aed7fa640 1 -- 192.168.123.103:0/4210040876 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0ae0004590 con 0x7f0af0071c20 2026-03-09T16:13:20.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.635+0000 7f0aed7fa640 1 -- 192.168.123.103:0/4210040876 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ae0002a00 con 0x7f0af0071c20 2026-03-09T16:13:20.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.636+0000 7f0aed7fa640 1 -- 192.168.123.103:0/4210040876 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f0ae0012030 con 0x7f0af0071c20 2026-03-09T16:13:20.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.636+0000 7f0aed7fa640 1 --2- 192.168.123.103:0/4210040876 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0ac0076290 0x7f0ac0078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:20.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.636+0000 7f0aef7fe640 1 --2- 192.168.123.103:0/4210040876 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0ac0076290 0x7f0ac0078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:20.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.637+0000 7f0aed7fa640 1 -- 192.168.123.103:0/4210040876 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f0ae0098ab0 con 0x7f0af0071c20 2026-03-09T16:13:20.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.637+0000 7f0aef7fe640 1 --2- 192.168.123.103:0/4210040876 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0ac0076290 0x7f0ac0078750 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f0af0083ee0 tx=0x7f0ae8009c00 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:20.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.637+0000 7f0af6788640 1 -- 192.168.123.103:0/4210040876 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0ab4005350 con 0x7f0af0071c20 2026-03-09T16:13:20.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.640+0000 7f0aed7fa640 1 -- 192.168.123.103:0/4210040876 <== 
mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f0ae0061fe0 con 0x7f0af0071c20 2026-03-09T16:13:20.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.772+0000 7f0af6788640 1 -- 192.168.123.103:0/4210040876 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f0ab4002bf0 con 0x7f0ac0076290 2026-03-09T16:13:20.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.773+0000 7f0aed7fa640 1 -- 192.168.123.103:0/4210040876 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+259 (secure 0 0 0) 0x7f0ab4002bf0 con 0x7f0ac0076290 2026-03-09T16:13:20.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.779+0000 7f0aceffd640 1 -- 192.168.123.103:0/4210040876 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0ac0076290 msgr2=0x7f0ac0078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:20.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.779+0000 7f0aceffd640 1 --2- 192.168.123.103:0/4210040876 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0ac0076290 0x7f0ac0078750 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f0af0083ee0 tx=0x7f0ae8009c00 comp rx=0 tx=0).stop 2026-03-09T16:13:20.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.779+0000 7f0aceffd640 1 -- 192.168.123.103:0/4210040876 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0af0071c20 msgr2=0x7f0af0084160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:20.781 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.779+0000 7f0aceffd640 1 --2- 192.168.123.103:0/4210040876 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0af0071c20 0x7f0af0084160 secure :-1 s=READY pgs=268 cs=0 l=1 rev1=1 crypto rx=0x7f0ae000a9e0 tx=0x7f0ae000e500 comp rx=0 tx=0).stop 2026-03-09T16:13:20.781 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.779+0000 7f0aceffd640 1 -- 192.168.123.103:0/4210040876 shutdown_connections 2026-03-09T16:13:20.781 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.779+0000 7f0aceffd640 1 --2- 192.168.123.103:0/4210040876 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0ac0076290 0x7f0ac0078750 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.781 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.779+0000 7f0aceffd640 1 --2- 192.168.123.103:0/4210040876 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0af00827b0 0x7f0af0082c30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.781 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.779+0000 7f0aceffd640 1 --2- 192.168.123.103:0/4210040876 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0af0071c20 0x7f0af0084160 unknown :-1 s=CLOSED pgs=268 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.781 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.779+0000 7f0aceffd640 1 -- 192.168.123.103:0/4210040876 >> 192.168.123.103:0/4210040876 conn(0x7f0af006d660 msgr2=0x7f0af006ece0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:20.782 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.780+0000 7f0aceffd640 1 -- 192.168.123.103:0/4210040876 shutdown_connections 2026-03-09T16:13:20.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.780+0000 7f0aceffd640 1 -- 192.168.123.103:0/4210040876 wait complete. 2026-03-09T16:13:20.790 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:13:20.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.857+0000 7f6ba3653640 1 -- 192.168.123.103:0/1106912892 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b9c071a50 msgr2=0x7f6b9c071e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.857+0000 7f6ba3653640 1 --2- 192.168.123.103:0/1106912892 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b9c071a50 0x7f6b9c071e50 secure :-1 s=READY pgs=269 cs=0 l=1 rev1=1 crypto rx=0x7f6b900099b0 tx=0x7f6b9002efc0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.857+0000 7f6ba3653640 1 -- 192.168.123.103:0/1106912892 shutdown_connections 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.857+0000 7f6ba3653640 1 --2- 192.168.123.103:0/1106912892 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b9c072420 0x7f6b9c077190 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.857+0000 7f6ba3653640 1 --2- 192.168.123.103:0/1106912892 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b9c071a50 0x7f6b9c071e50 unknown :-1 s=CLOSED pgs=269 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.857+0000 7f6ba3653640 1 -- 192.168.123.103:0/1106912892 >> 192.168.123.103:0/1106912892 conn(0x7f6b9c06d4f0 msgr2=0x7f6b9c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.858+0000 7f6ba3653640 1 -- 192.168.123.103:0/1106912892 shutdown_connections 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.858+0000 7f6ba3653640 1 -- 192.168.123.103:0/1106912892 wait complete. 
2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.858+0000 7f6ba3653640 1 Processor -- start 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.858+0000 7f6ba3653640 1 -- start start 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.858+0000 7f6ba3653640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b9c072420 0x7f6b9c084070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.858+0000 7f6ba3653640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b9c0826c0 0x7f6b9c082b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.858+0000 7f6ba3653640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6b9c0845b0 con 0x7f6b9c072420 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.858+0000 7f6ba3653640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6b9c083080 con 0x7f6b9c0826c0 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.858+0000 7f6ba2651640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b9c072420 0x7f6b9c084070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.858+0000 7f6ba2651640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b9c072420 0x7f6b9c084070 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41260/0 (socket says 192.168.123.103:41260) 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.858+0000 7f6ba2651640 1 -- 192.168.123.103:0/3914908890 learned_addr learned my addr 192.168.123.103:0/3914908890 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.858+0000 7f6ba1e50640 1 --2- 192.168.123.103:0/3914908890 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b9c0826c0 0x7f6b9c082b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.859+0000 7f6ba2651640 1 -- 192.168.123.103:0/3914908890 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b9c0826c0 msgr2=0x7f6b9c082b40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.859+0000 7f6ba2651640 1 --2- 192.168.123.103:0/3914908890 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b9c0826c0 0x7f6b9c082b40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.859+0000 7f6ba2651640 1 -- 192.168.123.103:0/3914908890 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f6b90009660 con 0x7f6b9c072420 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.859+0000 7f6ba2651640 1 --2- 192.168.123.103:0/3914908890 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b9c072420 0x7f6b9c084070 secure :-1 s=READY pgs=270 cs=0 l=1 rev1=1 crypto rx=0x7f6b9002f4d0 tx=0x7f6b900095b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.860+0000 7f6b8f7fe640 1 -- 192.168.123.103:0/3914908890 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6b90002960 con 0x7f6b9c072420 2026-03-09T16:13:20.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.860+0000 7f6ba3653640 1 -- 192.168.123.103:0/3914908890 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6b9c083300 con 0x7f6b9c072420 2026-03-09T16:13:20.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.860+0000 7f6ba3653640 1 -- 192.168.123.103:0/3914908890 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6b9c1b5bc0 con 0x7f6b9c072420 2026-03-09T16:13:20.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.861+0000 7f6ba3653640 1 -- 192.168.123.103:0/3914908890 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6b6c005350 con 0x7f6b9c072420 2026-03-09T16:13:20.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.862+0000 7f6b8f7fe640 1 -- 192.168.123.103:0/3914908890 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6b90043ce0 con 0x7f6b9c072420 2026-03-09T16:13:20.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.862+0000 7f6b8f7fe640 1 -- 192.168.123.103:0/3914908890 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6b9002f820 con 0x7f6b9c072420 2026-03-09T16:13:20.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.863+0000 7f6b8f7fe640 1 -- 192.168.123.103:0/3914908890 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f6b9002fa00 con 0x7f6b9c072420 2026-03-09T16:13:20.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.863+0000 7f6b8f7fe640 1 --2- 192.168.123.103:0/3914908890 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6b70076360 0x7f6b70078820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:20.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.864+0000 7f6ba1e50640 1 --2- 192.168.123.103:0/3914908890 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6b70076360 0x7f6b70078820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:20.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.864+0000 7f6ba1e50640 1 --2- 192.168.123.103:0/3914908890 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6b70076360 0x7f6b70078820 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f6b9c083df0 tx=0x7f6b9800a040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:20.865 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.864+0000 7f6b8f7fe640 1 -- 192.168.123.103:0/3914908890 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f6b900c5c50 con 0x7f6b9c072420 2026-03-09T16:13:20.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.865+0000 7f6b8f7fe640 1 -- 192.168.123.103:0/3914908890 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6b9008f290 con 0x7f6b9c072420 2026-03-09T16:13:20.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.968+0000 7f6ba3653640 1 -- 192.168.123.103:0/3914908890 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6b6c002bf0 con 0x7f6b70076360 2026-03-09T16:13:20.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.970+0000 7f6b8f7fe640 1 -- 192.168.123.103:0/3914908890 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+259 (secure 0 0 0) 0x7f6b6c002bf0 con 0x7f6b70076360 2026-03-09T16:13:20.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.974+0000 7f6b8d7fa640 1 -- 192.168.123.103:0/3914908890 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6b70076360 msgr2=0x7f6b70078820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:20.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.974+0000 7f6b8d7fa640 1 --2- 192.168.123.103:0/3914908890 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6b70076360 0x7f6b70078820 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f6b9c083df0 tx=0x7f6b9800a040 comp rx=0 tx=0).stop 2026-03-09T16:13:20.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.974+0000 7f6b8d7fa640 1 -- 192.168.123.103:0/3914908890 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b9c072420 msgr2=0x7f6b9c084070 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:20.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.974+0000 7f6b8d7fa640 1 --2- 192.168.123.103:0/3914908890 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b9c072420 0x7f6b9c084070 secure :-1 s=READY pgs=270 cs=0 l=1 rev1=1 crypto rx=0x7f6b9002f4d0 tx=0x7f6b900095b0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.974+0000 7f6b8d7fa640 1 -- 192.168.123.103:0/3914908890 shutdown_connections 2026-03-09T16:13:20.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.974+0000 7f6b8d7fa640 1 --2- 192.168.123.103:0/3914908890 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6b70076360 0x7f6b70078820 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.974+0000 7f6b8d7fa640 1 --2- 192.168.123.103:0/3914908890 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b9c0826c0 0x7f6b9c082b40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.975+0000 7f6b8d7fa640 1 --2- 192.168.123.103:0/3914908890 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7f6b9c072420 0x7f6b9c084070 unknown :-1 s=CLOSED pgs=270 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:20.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.975+0000 7f6b8d7fa640 1 -- 192.168.123.103:0/3914908890 >> 192.168.123.103:0/3914908890 conn(0x7f6b9c06d4f0 msgr2=0x7f6b9c0753c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:20.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.975+0000 7f6b8d7fa640 1 -- 192.168.123.103:0/3914908890 shutdown_connections 2026-03-09T16:13:20.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:20.975+0000 7f6b8d7fa640 1 -- 192.168.123.103:0/3914908890 wait complete. 2026-03-09T16:13:21.036 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:20 vm03.local ceph-mon[51019]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:13:21.036 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:20 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:13:21.036 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:20 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:13:21.036 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:20 vm03.local ceph-mon[51019]: pgmap v112: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:21.036 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:20 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:13:21.036 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:20 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:13:21.036 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:20 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:13:21.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.033+0000 7f668b8c3640 1 -- 192.168.123.103:0/1648885840 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6684072440 msgr2=0x7f66840771b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:21.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.033+0000 7f668b8c3640 1 --2- 192.168.123.103:0/1648885840 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6684072440 0x7f66840771b0 secure :-1 s=READY pgs=271 cs=0 l=1 rev1=1 crypto rx=0x7f667c00b0a0 tx=0x7f667c02f4c0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.036+0000 7f668b8c3640 1 -- 192.168.123.103:0/1648885840 shutdown_connections 2026-03-09T16:13:21.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.036+0000 7f668b8c3640 1 --2- 192.168.123.103:0/1648885840 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6684072440 0x7f66840771b0 unknown :-1 s=CLOSED pgs=271 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.036+0000 7f668b8c3640 1 --2- 192.168.123.103:0/1648885840 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6684071a70 0x7f6684071e70 unknown 
:-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.036+0000 7f668b8c3640 1 -- 192.168.123.103:0/1648885840 >> 192.168.123.103:0/1648885840 conn(0x7f668406d4f0 msgr2=0x7f668406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:21.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.036+0000 7f668b8c3640 1 -- 192.168.123.103:0/1648885840 shutdown_connections 2026-03-09T16:13:21.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.036+0000 7f668b8c3640 1 -- 192.168.123.103:0/1648885840 wait complete. 2026-03-09T16:13:21.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.036+0000 7f668b8c3640 1 Processor -- start 2026-03-09T16:13:21.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.036+0000 7f668b8c3640 1 -- start start 2026-03-09T16:13:21.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.037+0000 7f668b8c3640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6684071a70 0x7f6684084160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:21.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.037+0000 7f668b8c3640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f66840827b0 0x7f6684082c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:21.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.037+0000 7f668b8c3640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6684083170 con 0x7f6684071a70 2026-03-09T16:13:21.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.037+0000 7f668b8c3640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f66840832e0 con 0x7f66840827b0 2026-03-09T16:13:21.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.037+0000 7f6689638640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6684071a70 0x7f6684084160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:21.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.037+0000 7f6689638640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6684071a70 0x7f6684084160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41280/0 (socket says 192.168.123.103:41280) 2026-03-09T16:13:21.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.037+0000 7f6689638640 1 -- 192.168.123.103:0/3378332852 learned_addr learned my addr 192.168.123.103:0/3378332852 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:13:21.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.037+0000 7f6688e37640 1 --2- 192.168.123.103:0/3378332852 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f66840827b0 0x7f6684082c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:21.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.037+0000 7f6689638640 1 -- 192.168.123.103:0/3378332852 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f66840827b0 
msgr2=0x7f6684082c30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:21.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.037+0000 7f6689638640 1 --2- 192.168.123.103:0/3378332852 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f66840827b0 0x7f6684082c30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.037+0000 7f6689638640 1 -- 192.168.123.103:0/3378332852 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f667c009d00 con 0x7f6684071a70 2026-03-09T16:13:21.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.037+0000 7f6689638640 1 --2- 192.168.123.103:0/3378332852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6684071a70 0x7f6684084160 secure :-1 s=READY pgs=272 cs=0 l=1 rev1=1 crypto rx=0x7f668000b700 tx=0x7f668000bbd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:21.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.039+0000 7f667a7fc640 1 -- 192.168.123.103:0/3378332852 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6680010c40 con 0x7f6684071a70 2026-03-09T16:13:21.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.039+0000 7f668b8c3640 1 -- 192.168.123.103:0/3378332852 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f66840835c0 con 0x7f6684071a70 2026-03-09T16:13:21.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.039+0000 7f668b8c3640 1 -- 192.168.123.103:0/3378332852 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f66841b5c10 con 0x7f6684071a70 2026-03-09T16:13:21.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.040+0000 7f667a7fc640 1 -- 192.168.123.103:0/3378332852 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f66800045b0 con 0x7f6684071a70 2026-03-09T16:13:21.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.040+0000 7f667a7fc640 1 -- 192.168.123.103:0/3378332852 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6680019470 con 0x7f6684071a70 2026-03-09T16:13:21.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.041+0000 7f667a7fc640 1 -- 192.168.123.103:0/3378332852 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f66800223e0 con 0x7f6684071a70 2026-03-09T16:13:21.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.041+0000 7f667a7fc640 1 --2- 192.168.123.103:0/3378332852 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f66680762c0 0x7f6668078780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:21.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.042+0000 7f6688e37640 1 --2- 192.168.123.103:0/3378332852 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f66680762c0 0x7f6668078780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:21.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.042+0000 7f6688e37640 1 --2- 
192.168.123.103:0/3378332852 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f66680762c0 0x7f6668078780 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f667c004a70 tx=0x7f667c03a040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:21.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.042+0000 7f667a7fc640 1 -- 192.168.123.103:0/3378332852 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f668009c280 con 0x7f6684071a70 2026-03-09T16:13:21.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.042+0000 7f668b8c3640 1 -- 192.168.123.103:0/3378332852 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6650005350 con 0x7f6684071a70 2026-03-09T16:13:21.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.046+0000 7f667a7fc640 1 -- 192.168.123.103:0/3378332852 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6680065950 con 0x7f6684071a70 2026-03-09T16:13:21.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.145+0000 7f668b8c3640 1 -- 192.168.123.103:0/3378332852 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f6650002bf0 con 0x7f66680762c0 2026-03-09T16:13:21.152 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T16:13:21.152 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (2m) 60s ago 3m 24.7M - 0.25.0 c8568f914cd2 062551060e4c 2026-03-09T16:13:21.152 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (3m) 60s ago 3m 8300k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9 2026-03-09T16:13:21.152 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (2m) 61s ago 2m 8502k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd 2026-03-09T16:13:21.152 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (3m) 60s ago 3m 7625k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 05e9be717970 2026-03-09T16:13:21.152 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (2m) 61s ago 2m 7608k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 32f80ccecaa9 2026-03-09T16:13:21.152 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (2m) 60s ago 2m 82.6M - 9.4.7 954c08fa6188 9b9ef5226e00 2026-03-09T16:13:21.152 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (65s) 60s ago 65s 14.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8e7e3eb06891 2026-03-09T16:13:21.152 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (67s) 60s ago 67s 18.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 f23b1415c23e 2026-03-09T16:13:21.152 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (66s) 61s ago 66s 11.2M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 fbf69f4859f1 2026-03-09T16:13:21.152 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (64s) 61s ago 64s 14.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 e7155e6e0a47 2026-03-09T16:13:21.152 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:9283,8765,8443 running (3m) 60s ago 3m 540M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 
55454b4aaab2 2026-03-09T16:13:21.152 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (2m) 61s ago 2m 488M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 a411a05027bd 2026-03-09T16:13:21.153 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (3m) 60s ago 3m 54.8M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 b86752d320b6 2026-03-09T16:13:21.153 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (2m) 61s ago 2m 50.0M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 90242efb0978 2026-03-09T16:13:21.153 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (3m) 60s ago 3m 14.1M - 1.5.0 0da6a335fe13 8c7f00e55632 2026-03-09T16:13:21.153 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (2m) 61s ago 2m 15.0M - 1.5.0 0da6a335fe13 4c3ab3bdf8cf 2026-03-09T16:13:21.153 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (2m) 60s ago 2m 48.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2ea78f0d62f8 2026-03-09T16:13:21.153 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (2m) 60s ago 2m 68.3M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6169f9824413 2026-03-09T16:13:21.153 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (117s) 60s ago 117s 47.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 31188175e77b 2026-03-09T16:13:21.153 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (106s) 61s ago 106s 66.4M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d95aab347c9f 2026-03-09T16:13:21.153 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (97s) 61s ago 97s 45.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 5076005b452d 2026-03-09T16:13:21.153 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (88s) 61s ago 88s 43.4M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 56fb3849b087 2026-03-09T16:13:21.153 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (2m) 60s ago 2m 37.2M - 2.43.0 a07b618ecd1d 89a8f084cd57 2026-03-09T16:13:21.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.151+0000 7f667a7fc640 1 -- 192.168.123.103:0/3378332852 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3624 (secure 0 0 0) 0x7f6650002bf0 con 0x7f66680762c0 2026-03-09T16:13:21.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.153+0000 7f664ffff640 1 -- 192.168.123.103:0/3378332852 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f66680762c0 msgr2=0x7f6668078780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:21.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.153+0000 7f664ffff640 1 --2- 192.168.123.103:0/3378332852 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f66680762c0 0x7f6668078780 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f667c004a70 tx=0x7f667c03a040 comp rx=0 tx=0).stop 2026-03-09T16:13:21.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.153+0000 7f664ffff640 1 -- 192.168.123.103:0/3378332852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6684071a70 msgr2=0x7f6684084160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:21.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.153+0000 7f664ffff640 1 --2- 192.168.123.103:0/3378332852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6684071a70 0x7f6684084160 secure :-1 s=READY pgs=272 cs=0 l=1 rev1=1 crypto rx=0x7f668000b700 
tx=0x7f668000bbd0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.153+0000 7f664ffff640 1 -- 192.168.123.103:0/3378332852 shutdown_connections 2026-03-09T16:13:21.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.153+0000 7f664ffff640 1 --2- 192.168.123.103:0/3378332852 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f66680762c0 0x7f6668078780 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.153+0000 7f664ffff640 1 --2- 192.168.123.103:0/3378332852 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f66840827b0 0x7f6684082c30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.153+0000 7f664ffff640 1 --2- 192.168.123.103:0/3378332852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6684071a70 0x7f6684084160 unknown :-1 s=CLOSED pgs=272 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.153+0000 7f664ffff640 1 -- 192.168.123.103:0/3378332852 >> 192.168.123.103:0/3378332852 conn(0x7f668406d4f0 msgr2=0x7f66840704b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:21.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.154+0000 7f664ffff640 1 -- 192.168.123.103:0/3378332852 shutdown_connections 2026-03-09T16:13:21.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.155+0000 7f664ffff640 1 -- 192.168.123.103:0/3378332852 wait complete. 2026-03-09T16:13:21.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.230+0000 7fec789a5640 1 -- 192.168.123.103:0/1096796143 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec74100620 msgr2=0x7fec74100a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:21.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.230+0000 7fec789a5640 1 --2- 192.168.123.103:0/1096796143 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec74100620 0x7fec74100a20 secure :-1 s=READY pgs=273 cs=0 l=1 rev1=1 crypto rx=0x7fec680099b0 tx=0x7fec6802f240 comp rx=0 tx=0).stop 2026-03-09T16:13:21.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.231+0000 7fec789a5640 1 -- 192.168.123.103:0/1096796143 shutdown_connections 2026-03-09T16:13:21.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.231+0000 7fec789a5640 1 --2- 192.168.123.103:0/1096796143 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec74101820 0x7fec74101ca0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.231+0000 7fec789a5640 1 --2- 192.168.123.103:0/1096796143 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec74100620 0x7fec74100a20 unknown :-1 s=CLOSED pgs=273 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.231+0000 7fec789a5640 1 -- 192.168.123.103:0/1096796143 >> 192.168.123.103:0/1096796143 conn(0x7fec740fbdb0 msgr2=0x7fec740fe1f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:21.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.231+0000 7fec789a5640 1 
-- 192.168.123.103:0/1096796143 shutdown_connections 2026-03-09T16:13:21.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.231+0000 7fec789a5640 1 -- 192.168.123.103:0/1096796143 wait complete. 2026-03-09T16:13:21.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.231+0000 7fec789a5640 1 Processor -- start 2026-03-09T16:13:21.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.231+0000 7fec789a5640 1 -- start start 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.231+0000 7fec789a5640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec74100620 0x7fec741982a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.231+0000 7fec789a5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec74101820 0x7fec741987e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.231+0000 7fec789a5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fec74198db0 con 0x7fec74101820 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.231+0000 7fec789a5640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fec74198f20 con 0x7fec74100620 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.232+0000 7fec72575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec74101820 0x7fec741987e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.232+0000 7fec72575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec74101820 0x7fec741987e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41298/0 (socket says 192.168.123.103:41298) 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.232+0000 7fec72575640 1 -- 192.168.123.103:0/3607138821 learned_addr learned my addr 192.168.123.103:0/3607138821 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.232+0000 7fec72d76640 1 --2- 192.168.123.103:0/3607138821 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec74100620 0x7fec741982a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.232+0000 7fec72d76640 1 -- 192.168.123.103:0/3607138821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec74101820 msgr2=0x7fec741987e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.232+0000 7fec72d76640 1 --2- 192.168.123.103:0/3607138821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec74101820 0x7fec741987e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.235 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.232+0000 7fec72d76640 1 -- 192.168.123.103:0/3607138821 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fec68009660 con 0x7fec74100620 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.232+0000 7fec72d76640 1 --2- 192.168.123.103:0/3607138821 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec74100620 0x7fec741982a0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fec6802f750 tx=0x7fec68032000 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.232+0000 7fec53fff640 1 -- 192.168.123.103:0/3607138821 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fec6803d070 con 0x7fec74100620 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.232+0000 7fec789a5640 1 -- 192.168.123.103:0/3607138821 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fec74073460 con 0x7fec74100620 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.233+0000 7fec789a5640 1 -- 192.168.123.103:0/3607138821 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fec74073950 con 0x7fec74100620 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.233+0000 7fec53fff640 1 -- 192.168.123.103:0/3607138821 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fec68004520 con 0x7fec74100620 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.233+0000 7fec53fff640 1 -- 192.168.123.103:0/3607138821 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fec6804b430 con 0x7fec74100620 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.233+0000 7fec51ffb640 1 -- 192.168.123.103:0/3607138821 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fec74109c10 con 0x7fec74100620 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.234+0000 7fec53fff640 1 -- 192.168.123.103:0/3607138821 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fec68049050 con 0x7fec74100620 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.234+0000 7fec53fff640 1 --2- 192.168.123.103:0/3607138821 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fec4c075fb0 0x7fec4c078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.234+0000 7fec72575640 1 --2- 192.168.123.103:0/3607138821 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fec4c075fb0 0x7fec4c078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.234+0000 7fec53fff640 1 -- 192.168.123.103:0/3607138821 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 
0x7fec680bc2c0 con 0x7fec74100620 2026-03-09T16:13:21.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.235+0000 7fec72575640 1 --2- 192.168.123.103:0/3607138821 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fec4c075fb0 0x7fec4c078470 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fec741997c0 tx=0x7fec5c008040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:21.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.237+0000 7fec53fff640 1 -- 192.168.123.103:0/3607138821 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fec68085800 con 0x7fec74100620 2026-03-09T16:13:21.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:20 vm05.local ceph-mon[58702]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:13:21.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:20 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:13:21.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:20 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:13:21.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:20 vm05.local ceph-mon[58702]: pgmap v112: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:21.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:20 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:13:21.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:20 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:13:21.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:20 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:13:21.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.448+0000 7fec51ffb640 1 -- 192.168.123.103:0/3607138821 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fec74061980 con 0x7fec74100620 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: 
}, 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 14 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.451+0000 7fec53fff640 1 -- 192.168.123.103:0/3607138821 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7fec680851a0 con 0x7fec74100620 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.454+0000 7fec789a5640 1 -- 192.168.123.103:0/3607138821 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fec4c075fb0 msgr2=0x7fec4c078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.454+0000 7fec789a5640 1 --2- 192.168.123.103:0/3607138821 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fec4c075fb0 0x7fec4c078470 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fec741997c0 tx=0x7fec5c008040 comp rx=0 tx=0).stop 2026-03-09T16:13:21.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.454+0000 7fec789a5640 1 -- 192.168.123.103:0/3607138821 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec74100620 msgr2=0x7fec741982a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:21.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.454+0000 7fec789a5640 1 --2- 192.168.123.103:0/3607138821 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec74100620 0x7fec741982a0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fec6802f750 tx=0x7fec68032000 comp rx=0 tx=0).stop 2026-03-09T16:13:21.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.455+0000 7fec789a5640 1 -- 192.168.123.103:0/3607138821 shutdown_connections 2026-03-09T16:13:21.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.455+0000 7fec789a5640 1 --2- 192.168.123.103:0/3607138821 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fec4c075fb0 0x7fec4c078470 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.455+0000 7fec789a5640 1 --2- 192.168.123.103:0/3607138821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec74101820 0x7fec741987e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.455+0000 7fec789a5640 1 --2- 192.168.123.103:0/3607138821 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec74100620 0x7fec741982a0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.455+0000 7fec789a5640 1 
-- 192.168.123.103:0/3607138821 >> 192.168.123.103:0/3607138821 conn(0x7fec740fbdb0 msgr2=0x7fec740fda60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:21.459 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.458+0000 7fec789a5640 1 -- 192.168.123.103:0/3607138821 shutdown_connections 2026-03-09T16:13:21.459 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.458+0000 7fec789a5640 1 -- 192.168.123.103:0/3607138821 wait complete. 2026-03-09T16:13:21.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.531+0000 7f69525f6640 1 -- 192.168.123.103:0/225420616 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f694c1028b0 msgr2=0x7f694c102cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:21.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.531+0000 7f69525f6640 1 --2- 192.168.123.103:0/225420616 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f694c1028b0 0x7f694c102cb0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f693c0099b0 tx=0x7f693c02f240 comp rx=0 tx=0).stop 2026-03-09T16:13:21.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.533+0000 7f69525f6640 1 -- 192.168.123.103:0/225420616 shutdown_connections 2026-03-09T16:13:21.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.533+0000 7f69525f6640 1 --2- 192.168.123.103:0/225420616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f694c103ab0 0x7f694c103f30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.533+0000 7f69525f6640 1 --2- 192.168.123.103:0/225420616 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f694c1028b0 0x7f694c102cb0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.533+0000 7f69525f6640 1 -- 192.168.123.103:0/225420616 >> 192.168.123.103:0/225420616 conn(0x7f694c0fe060 msgr2=0x7f694c100480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:21.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.535+0000 7f69525f6640 1 -- 192.168.123.103:0/225420616 shutdown_connections 2026-03-09T16:13:21.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.535+0000 7f69525f6640 1 -- 192.168.123.103:0/225420616 wait complete. 
2026-03-09T16:13:21.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.535+0000 7f69525f6640 1 Processor -- start 2026-03-09T16:13:21.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.535+0000 7f69525f6640 1 -- start start 2026-03-09T16:13:21.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.535+0000 7f69525f6640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f694c1028b0 0x7f694c19a340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:21.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.535+0000 7f69525f6640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f694c103ab0 0x7f694c19a880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:21.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.535+0000 7f69525f6640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f694c19ae50 con 0x7f694c103ab0 2026-03-09T16:13:21.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.535+0000 7f69525f6640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f694c19afc0 con 0x7f694c1028b0 2026-03-09T16:13:21.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.535+0000 7f69515f4640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f694c1028b0 0x7f694c19a340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:21.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.536+0000 7f69515f4640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f694c1028b0 0x7f694c19a340 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:41218/0 (socket says 192.168.123.103:41218) 2026-03-09T16:13:21.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.536+0000 7f69515f4640 1 -- 192.168.123.103:0/382752979 learned_addr learned my addr 192.168.123.103:0/382752979 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:13:21.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.536+0000 7f6950df3640 1 --2- 192.168.123.103:0/382752979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f694c103ab0 0x7f694c19a880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:21.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.536+0000 7f69515f4640 1 -- 192.168.123.103:0/382752979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f694c103ab0 msgr2=0x7f694c19a880 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:21.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.536+0000 7f69515f4640 1 --2- 192.168.123.103:0/382752979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f694c103ab0 0x7f694c19a880 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.536+0000 7f69515f4640 1 -- 192.168.123.103:0/382752979 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f693c009660 con 
0x7f694c1028b0 2026-03-09T16:13:21.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.536+0000 7f69515f4640 1 --2- 192.168.123.103:0/382752979 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f694c1028b0 0x7f694c19a340 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f693c002910 tx=0x7f693c038620 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:21.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.536+0000 7f693a7fc640 1 -- 192.168.123.103:0/382752979 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f693c03d070 con 0x7f694c1028b0 2026-03-09T16:13:21.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.537+0000 7f693a7fc640 1 -- 192.168.123.103:0/382752979 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f693c02fd70 con 0x7f694c1028b0 2026-03-09T16:13:21.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.537+0000 7f693a7fc640 1 -- 192.168.123.103:0/382752979 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f693c041a90 con 0x7f694c1028b0 2026-03-09T16:13:21.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.537+0000 7f69525f6640 1 -- 192.168.123.103:0/382752979 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f694c19fa00 con 0x7f694c1028b0 2026-03-09T16:13:21.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.537+0000 7f69525f6640 1 -- 192.168.123.103:0/382752979 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f694c19ff70 con 0x7f694c1028b0 2026-03-09T16:13:21.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.538+0000 7f693a7fc640 1 -- 192.168.123.103:0/382752979 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f693c038730 con 0x7f694c1028b0 2026-03-09T16:13:21.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.538+0000 7f69525f6640 1 -- 192.168.123.103:0/382752979 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6914005350 con 0x7f694c1028b0 2026-03-09T16:13:21.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.539+0000 7f693a7fc640 1 --2- 192.168.123.103:0/382752979 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6928075fb0 0x7f6928078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:21.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.539+0000 7f693a7fc640 1 -- 192.168.123.103:0/382752979 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f693c0bc820 con 0x7f694c1028b0 2026-03-09T16:13:21.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.539+0000 7f6950df3640 1 --2- 192.168.123.103:0/382752979 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6928075fb0 0x7f6928078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:21.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.539+0000 7f6950df3640 1 --2- 192.168.123.103:0/382752979 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] 
conn(0x7f6928075fb0 0x7f6928078470 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f694c19b860 tx=0x7f6940009210 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:21.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.541+0000 7f693a7fc640 1 -- 192.168.123.103:0/382752979 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f693c085e10 con 0x7f694c1028b0 2026-03-09T16:13:21.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.674+0000 7f69525f6640 1 -- 192.168.123.103:0/382752979 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6914002bf0 con 0x7f6928075fb0 2026-03-09T16:13:21.679 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:13:21.679 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-09T16:13:21.679 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T16:13:21.679 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-09T16:13:21.679 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [], 2026-03-09T16:13:21.679 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "", 2026-03-09T16:13:21.679 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-09T16:13:21.680 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T16:13:21.680 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:13:21.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.676+0000 7f693a7fc640 1 -- 192.168.123.103:0/382752979 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+351 (secure 0 0 0) 0x7f6914002bf0 con 0x7f6928075fb0 2026-03-09T16:13:21.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.679+0000 7f691bfff640 1 -- 192.168.123.103:0/382752979 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6928075fb0 msgr2=0x7f6928078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:21.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.679+0000 7f691bfff640 1 --2- 192.168.123.103:0/382752979 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6928075fb0 0x7f6928078470 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f694c19b860 tx=0x7f6940009210 comp rx=0 tx=0).stop 2026-03-09T16:13:21.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.679+0000 7f691bfff640 1 -- 192.168.123.103:0/382752979 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f694c1028b0 msgr2=0x7f694c19a340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:21.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.679+0000 7f691bfff640 1 --2- 192.168.123.103:0/382752979 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f694c1028b0 0x7f694c19a340 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f693c002910 tx=0x7f693c038620 comp rx=0 tx=0).stop 2026-03-09T16:13:21.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.679+0000 7f691bfff640 1 -- 192.168.123.103:0/382752979 
shutdown_connections 2026-03-09T16:13:21.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.679+0000 7f691bfff640 1 --2- 192.168.123.103:0/382752979 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6928075fb0 0x7f6928078470 secure :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f694c19b860 tx=0x7f6940009210 comp rx=0 tx=0).stop 2026-03-09T16:13:21.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.679+0000 7f691bfff640 1 --2- 192.168.123.103:0/382752979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f694c103ab0 0x7f694c19a880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.679+0000 7f691bfff640 1 --2- 192.168.123.103:0/382752979 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f694c1028b0 0x7f694c19a340 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:21.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.679+0000 7f691bfff640 1 -- 192.168.123.103:0/382752979 >> 192.168.123.103:0/382752979 conn(0x7f694c0fe060 msgr2=0x7f694c0ffba0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:21.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.682+0000 7f691bfff640 1 -- 192.168.123.103:0/382752979 shutdown_connections 2026-03-09T16:13:21.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:21.682+0000 7f691bfff640 1 -- 192.168.123.103:0/382752979 wait complete. 2026-03-09T16:13:22.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:21 vm03.local ceph-mon[51019]: from='client.14548 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:22.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:21 vm03.local ceph-mon[51019]: from='client.14552 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:22.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:21 vm03.local ceph-mon[51019]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:13:22.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:21 vm03.local ceph-mon[51019]: from='client.14556 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:22.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:21 vm03.local ceph-mon[51019]: from='client.? 
192.168.123.103:0/3607138821' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:13:22.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:21 vm05.local ceph-mon[58702]: from='client.14548 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:22.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:21 vm05.local ceph-mon[58702]: from='client.14552 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:22.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:21 vm05.local ceph-mon[58702]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:13:22.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:21 vm05.local ceph-mon[58702]: from='client.14556 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:22.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:21 vm05.local ceph-mon[58702]: from='client.? 192.168.123.103:0/3607138821' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:13:23.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:22 vm03.local ceph-mon[51019]: from='client.24339 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:23.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:22 vm03.local ceph-mon[51019]: pgmap v113: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:23.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:22 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:13:23.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:22 vm03.local ceph-mon[51019]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown) 2026-03-09T16:13:23.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:22 vm03.local ceph-mon[51019]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-09T16:13:23.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:22 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:13:23.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:22 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:13:23.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:22 vm03.local ceph-mon[51019]: Upgrade: Need to upgrade myself (mgr.vm03.gbgzmu) 2026-03-09T16:13:23.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:22 vm03.local ceph-mon[51019]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm05 2026-03-09T16:13:23.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:22 vm05.local ceph-mon[58702]: from='client.24339 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:23.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:22 vm05.local ceph-mon[58702]: pgmap v113: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB 
avail 2026-03-09T16:13:23.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:22 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:13:23.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:22 vm05.local ceph-mon[58702]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown) 2026-03-09T16:13:23.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:22 vm05.local ceph-mon[58702]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-09T16:13:23.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:22 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:13:23.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:22 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:13:23.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:22 vm05.local ceph-mon[58702]: Upgrade: Need to upgrade myself (mgr.vm03.gbgzmu) 2026-03-09T16:13:23.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:22 vm05.local ceph-mon[58702]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm05 2026-03-09T16:13:25.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:25 vm03.local ceph-mon[51019]: pgmap v114: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:25.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:25 vm05.local ceph-mon[58702]: pgmap v114: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:27 vm05.local ceph-mon[58702]: pgmap v115: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:27.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:27 vm03.local ceph-mon[51019]: pgmap v115: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:29.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:29 vm05.local ceph-mon[58702]: pgmap v116: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:29.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:29 vm03.local ceph-mon[51019]: pgmap v116: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:31.489 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:31 vm03.local ceph-mon[51019]: pgmap v117: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:31.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:31 vm05.local ceph-mon[58702]: pgmap v117: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:33.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:33 vm05.local ceph-mon[58702]: pgmap v118: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:33.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:33 vm03.local ceph-mon[51019]: pgmap v118: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:34.877 
INFO:tasks.workunit.client.0.vm03.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'. 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr:state without impacting any branches by switching back to a branch. 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr:do so (now or later) by using -c with the switch command. Example: 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr: git switch -c 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr:Or undo this operation with: 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr: git switch - 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-09T16:13:34.878 INFO:tasks.workunit.client.0.vm03.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose 2026-03-09T16:13:34.884 DEBUG:teuthology.orchestra.run.vm03:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-09T16:13:34.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:34 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:13:34.904 INFO:tasks.workunit.client.0.vm03.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-09T16:13:34.906 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-09T16:13:34.906 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-09T16:13:35.007 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-09T16:13:35.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:34 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:13:35.045 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-09T16:13:35.074 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-09T16:13:35.075 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-09T16:13:35.075 
INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-09T16:13:35.104 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-09T16:13:35.107 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:13:35.107 DEBUG:teuthology.orchestra.run.vm03:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-09T16:13:35.164 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.0... 2026-03-09T16:13:35.165 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 2026-03-09T16:13:35.165 DEBUG:teuthology.orchestra.run.vm03:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/fsstress.sh 2026-03-09T16:13:35.228 INFO:tasks.workunit.client.0.vm03.stderr:+ mkdir -p fsstress 2026-03-09T16:13:35.230 INFO:tasks.workunit.client.0.vm03.stderr:+ pushd fsstress 2026-03-09T16:13:35.231 INFO:tasks.workunit.client.0.vm03.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-09T16:13:35.231 INFO:tasks.workunit.client.0.vm03.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-09T16:13:35.256 INFO:tasks.workunit.client.1.vm05.stderr:Updating files: 84% (11727/13941) Updating files: 85% (11850/13941) Updating files: 86% (11990/13941) Updating files: 87% (12129/13941) Updating files: 88% (12269/13941) Updating files: 89% (12408/13941) Updating files: 90% (12547/13941) Updating files: 91% (12687/13941) Updating files: 92% (12826/13941) Updating files: 93% (12966/13941) Updating files: 94% (13105/13941) Updating files: 95% (13244/13941) Updating files: 96% (13384/13941) Updating files: 97% (13523/13941) Updating files: 98% (13663/13941) Updating files: 99% (13802/13941) Updating files: 100% (13941/13941) Updating files: 100% (13941/13941), done. 2026-03-09T16:13:35.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:35 vm03.local ceph-mon[51019]: pgmap v119: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'. 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr: 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr:state without impacting any branches by switching back to a branch. 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr: 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr:do so (now or later) by using -c with the switch command. 
Example: 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr: 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr: git switch -c 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr: 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr:Or undo this operation with: 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr: 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr: git switch - 2026-03-09T16:13:35.912 INFO:tasks.workunit.client.1.vm05.stderr: 2026-03-09T16:13:35.913 INFO:tasks.workunit.client.1.vm05.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-09T16:13:35.913 INFO:tasks.workunit.client.1.vm05.stderr: 2026-03-09T16:13:35.913 INFO:tasks.workunit.client.1.vm05.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose 2026-03-09T16:13:35.917 DEBUG:teuthology.orchestra.run.vm05:> cd -- /home/ubuntu/cephtest/clone.client.1/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.1 2026-03-09T16:13:35.979 INFO:tasks.workunit.client.1.vm05.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-09T16:13:35.981 INFO:tasks.workunit.client.1.vm05.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-09T16:13:35.981 INFO:tasks.workunit.client.1.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-09T16:13:36.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:35 vm05.local ceph-mon[58702]: pgmap v119: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:36.043 INFO:tasks.workunit.client.1.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-09T16:13:36.091 INFO:tasks.workunit.client.1.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-09T16:13:36.130 INFO:tasks.workunit.client.1.vm05.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-09T16:13:36.131 INFO:tasks.workunit.client.1.vm05.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-09T16:13:36.131 INFO:tasks.workunit.client.1.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-09T16:13:36.162 INFO:tasks.workunit.client.1.vm05.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-09T16:13:36.166 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-09T16:13:36.166 DEBUG:teuthology.orchestra.run.vm05:> dd if=/home/ubuntu/cephtest/workunits.list.client.1 of=/dev/stdout 2026-03-09T16:13:36.221 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.1... 2026-03-09T16:13:36.222 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 
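For reference, the suites/fsstress.sh workunit invocation logged above for client.0 (and, immediately below, the analogous one for client.1) can be reproduced by hand with roughly the following shell sketch. The environment variables, wrapper binaries (adjust-ulimits, ceph-coverage) and the 3h timeout are taken from the logged command line; the sketch assumes the qa clone and the FUSE mount already exist at the usual teuthology paths.

    # Re-run the suites/fsstress.sh workunit roughly as teuthology does for client.0
    export TESTDIR=/home/ubuntu/cephtest
    export CEPH_ID=0
    export CEPH_CLI_TEST_DUP_COMMAND=1
    export CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589
    export CEPH_ARGS="--cluster ceph"
    export CEPH_BASE=$TESTDIR/clone.client.$CEPH_ID
    export CEPH_ROOT=$CEPH_BASE
    export CEPH_MNT=$TESTDIR/mnt.$CEPH_ID
    export PATH=$PATH:/usr/sbin
    mkdir -p -- "$CEPH_MNT/client.$CEPH_ID/tmp"
    cd -- "$CEPH_MNT/client.$CEPH_ID/tmp"
    # adjust-ulimits and ceph-coverage are teuthology wrapper scripts; timeout caps the workunit at 3 hours
    adjust-ulimits ceph-coverage "$TESTDIR/archive/coverage" \
        timeout 3h "$CEPH_BASE/qa/workunits/suites/fsstress.sh"

The script itself then fetches and builds the LTP fsstress tarball, which is what the wget/tar/make lines that follow correspond to.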
2026-03-09T16:13:36.222 DEBUG:teuthology.orchestra.run.vm05:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && cd -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="1" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.1 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.1 CEPH_MNT=/home/ubuntu/cephtest/mnt.1 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.1/qa/workunits/suites/fsstress.sh 2026-03-09T16:13:36.294 INFO:tasks.workunit.client.1.vm05.stderr:+ mkdir -p fsstress 2026-03-09T16:13:36.297 INFO:tasks.workunit.client.1.vm05.stderr:+ pushd fsstress 2026-03-09T16:13:36.299 INFO:tasks.workunit.client.1.vm05.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-09T16:13:36.299 INFO:tasks.workunit.client.1.vm05.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-09T16:13:36.819 INFO:tasks.workunit.client.0.vm03.stderr:+ tar xzf ltp-full.tgz 2026-03-09T16:13:37.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:37 vm03.local ceph-mon[51019]: pgmap v120: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:37.947 INFO:tasks.workunit.client.1.vm05.stderr:+ tar xzf ltp-full.tgz 2026-03-09T16:13:38.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:37 vm05.local ceph-mon[58702]: pgmap v120: 65 pgs: 65 active+clean; 460 KiB data, 161 MiB used, 120 GiB / 120 GiB avail 2026-03-09T16:13:39.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:38 vm03.local ceph-mon[51019]: pgmap v121: 65 pgs: 65 active+clean; 4.8 MiB data, 169 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 375 KiB/s wr, 4 op/s 2026-03-09T16:13:39.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:38 vm05.local ceph-mon[58702]: pgmap v121: 65 pgs: 65 active+clean; 4.8 MiB data, 169 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 375 KiB/s wr, 4 op/s 2026-03-09T16:13:41.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:41 vm05.local ceph-mon[58702]: pgmap v122: 65 pgs: 65 active+clean; 15 MiB data, 221 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 1.2 MiB/s wr, 26 op/s 2026-03-09T16:13:41.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:41 vm03.local ceph-mon[51019]: pgmap v122: 65 pgs: 65 active+clean; 15 MiB data, 221 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 1.2 MiB/s wr, 26 op/s 2026-03-09T16:13:44.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:43 vm05.local ceph-mon[58702]: pgmap v123: 65 pgs: 65 active+clean; 22 MiB data, 260 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 1.8 MiB/s wr, 42 op/s 2026-03-09T16:13:44.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:43 vm03.local ceph-mon[51019]: pgmap v123: 65 pgs: 65 active+clean; 22 MiB data, 260 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 1.8 MiB/s wr, 42 op/s 2026-03-09T16:13:45.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:45 vm05.local ceph-mon[58702]: pgmap v124: 65 pgs: 65 active+clean; 30 MiB data, 299 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 2.5 MiB/s wr, 99 op/s 2026-03-09T16:13:45.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:45 vm03.local ceph-mon[51019]: pgmap v124: 65 pgs: 65 active+clean; 30 MiB data, 299 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 2.5 MiB/s wr, 99 op/s 2026-03-09T16:13:47.054 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:47 vm03.local ceph-mon[51019]: pgmap v125: 65 pgs: 65 active+clean; 42 MiB data, 364 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 3.5 MiB/s wr, 143 op/s 2026-03-09T16:13:47.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:47 vm05.local ceph-mon[58702]: pgmap v125: 65 pgs: 65 active+clean; 42 MiB data, 364 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 3.5 MiB/s wr, 143 op/s 2026-03-09T16:13:50.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:49 vm03.local ceph-mon[51019]: pgmap v126: 65 pgs: 65 active+clean; 47 MiB data, 413 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 3.9 MiB/s wr, 173 op/s 2026-03-09T16:13:50.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:49 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:13:50.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:49 vm05.local ceph-mon[58702]: pgmap v126: 65 pgs: 65 active+clean; 47 MiB data, 413 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 3.9 MiB/s wr, 173 op/s 2026-03-09T16:13:50.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:49 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:13:51.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:50 vm03.local ceph-mon[51019]: pgmap v127: 65 pgs: 65 active+clean; 73 MiB data, 463 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 5.8 MiB/s wr, 220 op/s 2026-03-09T16:13:51.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:50 vm05.local ceph-mon[58702]: pgmap v127: 65 pgs: 65 active+clean; 73 MiB data, 463 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 5.8 MiB/s wr, 220 op/s 2026-03-09T16:13:51.749 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.748+0000 7f6265b51640 1 -- 192.168.123.103:0/2292764371 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6260072440 msgr2=0x7f62600771b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:51.749 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.748+0000 7f6265b51640 1 --2- 192.168.123.103:0/2292764371 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6260072440 0x7f62600771b0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f625800d3a0 tx=0x7f6258031690 comp rx=0 tx=0).stop 2026-03-09T16:13:51.749 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.748+0000 7f6265b51640 1 -- 192.168.123.103:0/2292764371 shutdown_connections 2026-03-09T16:13:51.749 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.748+0000 7f6265b51640 1 --2- 192.168.123.103:0/2292764371 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6260072440 0x7f62600771b0 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:51.749 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.748+0000 7f6265b51640 1 --2- 192.168.123.103:0/2292764371 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6260071a70 0x7f6260071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:51.749 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.748+0000 7f6265b51640 1 -- 192.168.123.103:0/2292764371 >> 192.168.123.103:0/2292764371 conn(0x7f626006d4f0 msgr2=0x7f626006f930 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T16:13:51.749 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.748+0000 7f6265b51640 1 -- 192.168.123.103:0/2292764371 shutdown_connections 2026-03-09T16:13:51.749 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.748+0000 7f6265b51640 1 -- 192.168.123.103:0/2292764371 wait complete. 2026-03-09T16:13:51.750 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.748+0000 7f6265b51640 1 Processor -- start 2026-03-09T16:13:51.750 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.748+0000 7f6265b51640 1 -- start start 2026-03-09T16:13:51.750 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.749+0000 7f6265b51640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6260071a70 0x7f6260084160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:51.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.749+0000 7f6265b51640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f62600827b0 0x7f6260082c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:51.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.749+0000 7f6265b51640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6260083170 con 0x7f62600827b0 2026-03-09T16:13:51.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.749+0000 7f6265b51640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f62600832e0 con 0x7f6260071a70 2026-03-09T16:13:51.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.749+0000 7f625effd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f62600827b0 0x7f6260082c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:51.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.749+0000 7f625effd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f62600827b0 0x7f6260082c30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47528/0 (socket says 192.168.123.103:47528) 2026-03-09T16:13:51.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.749+0000 7f625effd640 1 -- 192.168.123.103:0/1478055418 learned_addr learned my addr 192.168.123.103:0/1478055418 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:13:51.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.749+0000 7f625f7fe640 1 --2- 192.168.123.103:0/1478055418 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6260071a70 0x7f6260084160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:51.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.750+0000 7f625effd640 1 -- 192.168.123.103:0/1478055418 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6260071a70 msgr2=0x7f6260084160 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:51.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.750+0000 7f625effd640 1 --2- 192.168.123.103:0/1478055418 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6260071a70 0x7f6260084160 unknown :-1 s=AUTH_CONNECTING 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:51.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.750+0000 7f625effd640 1 -- 192.168.123.103:0/1478055418 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f625800d050 con 0x7f62600827b0 2026-03-09T16:13:51.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.750+0000 7f625effd640 1 --2- 192.168.123.103:0/1478055418 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f62600827b0 0x7f6260082c30 secure :-1 s=READY pgs=274 cs=0 l=1 rev1=1 crypto rx=0x7f6258031ba0 tx=0x7f625800be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:51.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.750+0000 7f625cff9640 1 -- 192.168.123.103:0/1478055418 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f625800af10 con 0x7f62600827b0 2026-03-09T16:13:51.755 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.751+0000 7f6265b51640 1 -- 192.168.123.103:0/1478055418 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6260083560 con 0x7f62600827b0 2026-03-09T16:13:51.755 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.751+0000 7f6265b51640 1 -- 192.168.123.103:0/1478055418 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f626012efc0 con 0x7f62600827b0 2026-03-09T16:13:51.755 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.752+0000 7f6265b51640 1 -- 192.168.123.103:0/1478055418 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6260072440 con 0x7f62600827b0 2026-03-09T16:13:51.755 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.755+0000 7f625cff9640 1 -- 192.168.123.103:0/1478055418 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6258034030 con 0x7f62600827b0 2026-03-09T16:13:51.757 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.757+0000 7f625cff9640 1 -- 192.168.123.103:0/1478055418 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6258008460 con 0x7f62600827b0 2026-03-09T16:13:51.757 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.757+0000 7f625cff9640 1 -- 192.168.123.103:0/1478055418 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f625804a050 con 0x7f62600827b0 2026-03-09T16:13:51.758 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.758+0000 7f625cff9640 1 --2- 192.168.123.103:0/1478055418 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6240076360 0x7f6240078820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:51.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.758+0000 7f625cff9640 1 -- 192.168.123.103:0/1478055418 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f62580bda90 con 0x7f62600827b0 2026-03-09T16:13:51.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.759+0000 7f625f7fe640 1 --2- 192.168.123.103:0/1478055418 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6240076360 0x7f6240078820 unknown :-1 s=BANNER_CONNECTING 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:51.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.763+0000 7f625f7fe640 1 --2- 192.168.123.103:0/1478055418 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6240076360 0x7f6240078820 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f62500045e0 tx=0x7f625000c040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:51.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.766+0000 7f625cff9640 1 -- 192.168.123.103:0/1478055418 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6258086fc0 con 0x7f62600827b0 2026-03-09T16:13:51.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.899+0000 7f6265b51640 1 -- 192.168.123.103:0/1478055418 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6260075d40 con 0x7f6240076360 2026-03-09T16:13:51.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.901+0000 7f625cff9640 1 -- 192.168.123.103:0/1478055418 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+434 (secure 0 0 0) 0x7f6260075d40 con 0x7f6240076360 2026-03-09T16:13:51.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.903+0000 7f6265b51640 1 -- 192.168.123.103:0/1478055418 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6240076360 msgr2=0x7f6240078820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:51.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.903+0000 7f6265b51640 1 --2- 192.168.123.103:0/1478055418 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6240076360 0x7f6240078820 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f62500045e0 tx=0x7f625000c040 comp rx=0 tx=0).stop 2026-03-09T16:13:51.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.903+0000 7f6265b51640 1 -- 192.168.123.103:0/1478055418 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f62600827b0 msgr2=0x7f6260082c30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:51.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.903+0000 7f6265b51640 1 --2- 192.168.123.103:0/1478055418 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f62600827b0 0x7f6260082c30 secure :-1 s=READY pgs=274 cs=0 l=1 rev1=1 crypto rx=0x7f6258031ba0 tx=0x7f625800be30 comp rx=0 tx=0).stop 2026-03-09T16:13:51.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.904+0000 7f6265b51640 1 -- 192.168.123.103:0/1478055418 shutdown_connections 2026-03-09T16:13:51.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.904+0000 7f6265b51640 1 --2- 192.168.123.103:0/1478055418 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f6240076360 0x7f6240078820 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:51.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.904+0000 7f6265b51640 1 --2- 192.168.123.103:0/1478055418 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f62600827b0 0x7f6260082c30 unknown :-1 
s=CLOSED pgs=274 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:51.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.904+0000 7f6265b51640 1 --2- 192.168.123.103:0/1478055418 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6260071a70 0x7f6260084160 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:51.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.904+0000 7f6265b51640 1 -- 192.168.123.103:0/1478055418 >> 192.168.123.103:0/1478055418 conn(0x7f626006d4f0 msgr2=0x7f6260073150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:51.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.904+0000 7f6265b51640 1 -- 192.168.123.103:0/1478055418 shutdown_connections 2026-03-09T16:13:51.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.904+0000 7f6265b51640 1 -- 192.168.123.103:0/1478055418 wait complete. 2026-03-09T16:13:51.917 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:13:51.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.986+0000 7fa6543d8640 1 -- 192.168.123.103:0/970105474 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa64c0719c0 msgr2=0x7fa64c071dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:51.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.986+0000 7fa6543d8640 1 --2- 192.168.123.103:0/970105474 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa64c0719c0 0x7fa64c071dc0 secure :-1 s=READY pgs=275 cs=0 l=1 rev1=1 crypto rx=0x7fa64801acf0 tx=0x7fa648040150 comp rx=0 tx=0).stop 2026-03-09T16:13:51.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.986+0000 7fa6543d8640 1 -- 192.168.123.103:0/970105474 shutdown_connections 2026-03-09T16:13:51.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.986+0000 7fa6543d8640 1 --2- 192.168.123.103:0/970105474 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa64c072390 0x7fa64c10c590 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:51.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.986+0000 7fa6543d8640 1 --2- 192.168.123.103:0/970105474 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa64c0719c0 0x7fa64c071dc0 unknown :-1 s=CLOSED pgs=275 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:51.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.986+0000 7fa6543d8640 1 -- 192.168.123.103:0/970105474 >> 192.168.123.103:0/970105474 conn(0x7fa64c06d4f0 msgr2=0x7fa64c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:51.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.986+0000 7fa6543d8640 1 -- 192.168.123.103:0/970105474 shutdown_connections 2026-03-09T16:13:51.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.987+0000 7fa6543d8640 1 -- 192.168.123.103:0/970105474 wait complete. 
2026-03-09T16:13:51.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.987+0000 7fa6543d8640 1 Processor -- start 2026-03-09T16:13:51.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.987+0000 7fa6543d8640 1 -- start start 2026-03-09T16:13:51.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.987+0000 7fa6543d8640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa64c0719c0 0x7fa64c19e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:51.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.987+0000 7fa6543d8640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa64c072390 0x7fa64c19ee90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:51.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.987+0000 7fa6543d8640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa64c19f460 con 0x7fa64c0719c0 2026-03-09T16:13:51.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.987+0000 7fa6543d8640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa64c19f5d0 con 0x7fa64c072390 2026-03-09T16:13:51.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.987+0000 7fa65214d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa64c0719c0 0x7fa64c19e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:51.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.987+0000 7fa65194c640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa64c072390 0x7fa64c19ee90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:51.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.987+0000 7fa65194c640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa64c072390 0x7fa64c19ee90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:45976/0 (socket says 192.168.123.103:45976) 2026-03-09T16:13:51.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.987+0000 7fa65194c640 1 -- 192.168.123.103:0/3674996289 learned_addr learned my addr 192.168.123.103:0/3674996289 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:13:51.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.988+0000 7fa65214d640 1 -- 192.168.123.103:0/3674996289 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa64c072390 msgr2=0x7fa64c19ee90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:51.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.988+0000 7fa65214d640 1 --2- 192.168.123.103:0/3674996289 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa64c072390 0x7fa64c19ee90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:51.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.988+0000 7fa65214d640 1 -- 192.168.123.103:0/3674996289 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa64801a9a0 con 0x7fa64c0719c0 
2026-03-09T16:13:51.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.988+0000 7fa65214d640 1 --2- 192.168.123.103:0/3674996289 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa64c0719c0 0x7fa64c19e950 secure :-1 s=READY pgs=276 cs=0 l=1 rev1=1 crypto rx=0x7fa648040fd0 tx=0x7fa64801a560 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:51.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.988+0000 7fa63b7fe640 1 -- 192.168.123.103:0/3674996289 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa648004050 con 0x7fa64c0719c0 2026-03-09T16:13:51.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.989+0000 7fa6543d8640 1 -- 192.168.123.103:0/3674996289 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa64c1a3fa0 con 0x7fa64c0719c0 2026-03-09T16:13:51.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.989+0000 7fa6543d8640 1 -- 192.168.123.103:0/3674996289 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa64c1a4490 con 0x7fa64c0719c0 2026-03-09T16:13:51.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.990+0000 7fa63b7fe640 1 -- 192.168.123.103:0/3674996289 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa6480185d0 con 0x7fa64c0719c0 2026-03-09T16:13:51.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.990+0000 7fa63b7fe640 1 -- 192.168.123.103:0/3674996289 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa6480495f0 con 0x7fa64c0719c0 2026-03-09T16:13:51.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.990+0000 7fa63b7fe640 1 -- 192.168.123.103:0/3674996289 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fa648049810 con 0x7fa64c0719c0 2026-03-09T16:13:51.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.992+0000 7fa63b7fe640 1 --2- 192.168.123.103:0/3674996289 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa624076360 0x7fa624078820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:51.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.992+0000 7fa65194c640 1 --2- 192.168.123.103:0/3674996289 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa624076360 0x7fa624078820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:51.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.992+0000 7fa63b7fe640 1 -- 192.168.123.103:0/3674996289 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fa6480cda80 con 0x7fa64c0719c0 2026-03-09T16:13:51.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.993+0000 7fa65194c640 1 --2- 192.168.123.103:0/3674996289 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa624076360 0x7fa624078820 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7fa64c19fe00 tx=0x7fa64400b040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:51.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.993+0000 
7fa6543d8640 1 -- 192.168.123.103:0/3674996289 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa614005350 con 0x7fa64c0719c0 2026-03-09T16:13:51.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:51.998+0000 7fa63b7fe640 1 -- 192.168.123.103:0/3674996289 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fa6480970f0 con 0x7fa64c0719c0 2026-03-09T16:13:52.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.132+0000 7fa6543d8640 1 -- 192.168.123.103:0/3674996289 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fa614002bf0 con 0x7fa624076360 2026-03-09T16:13:52.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.133+0000 7fa63b7fe640 1 -- 192.168.123.103:0/3674996289 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+434 (secure 0 0 0) 0x7fa614002bf0 con 0x7fa624076360 2026-03-09T16:13:52.139 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.139+0000 7fa6397fa640 1 -- 192.168.123.103:0/3674996289 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa624076360 msgr2=0x7fa624078820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:52.139 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.139+0000 7fa6397fa640 1 --2- 192.168.123.103:0/3674996289 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa624076360 0x7fa624078820 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7fa64c19fe00 tx=0x7fa64400b040 comp rx=0 tx=0).stop 2026-03-09T16:13:52.139 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.139+0000 7fa6397fa640 1 -- 192.168.123.103:0/3674996289 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa64c0719c0 msgr2=0x7fa64c19e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:52.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.139+0000 7fa6397fa640 1 --2- 192.168.123.103:0/3674996289 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa64c0719c0 0x7fa64c19e950 secure :-1 s=READY pgs=276 cs=0 l=1 rev1=1 crypto rx=0x7fa648040fd0 tx=0x7fa64801a560 comp rx=0 tx=0).stop 2026-03-09T16:13:52.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.139+0000 7fa6397fa640 1 -- 192.168.123.103:0/3674996289 shutdown_connections 2026-03-09T16:13:52.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.139+0000 7fa6397fa640 1 --2- 192.168.123.103:0/3674996289 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fa624076360 0x7fa624078820 secure :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7fa64c19fe00 tx=0x7fa64400b040 comp rx=0 tx=0).stop 2026-03-09T16:13:52.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.139+0000 7fa6397fa640 1 --2- 192.168.123.103:0/3674996289 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa64c072390 0x7fa64c19ee90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.139+0000 7fa6397fa640 1 --2- 192.168.123.103:0/3674996289 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa64c0719c0 0x7fa64c19e950 
unknown :-1 s=CLOSED pgs=276 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.139+0000 7fa6397fa640 1 -- 192.168.123.103:0/3674996289 >> 192.168.123.103:0/3674996289 conn(0x7fa64c06d4f0 msgr2=0x7fa64c0707b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:52.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.139+0000 7fa6397fa640 1 -- 192.168.123.103:0/3674996289 shutdown_connections 2026-03-09T16:13:52.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.139+0000 7fa6397fa640 1 -- 192.168.123.103:0/3674996289 wait complete. 2026-03-09T16:13:52.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.198+0000 7ffa56027640 1 -- 192.168.123.103:0/2170405859 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffa50072420 msgr2=0x7ffa50077190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:52.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.198+0000 7ffa56027640 1 --2- 192.168.123.103:0/2170405859 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffa50072420 0x7ffa50077190 secure :-1 s=READY pgs=277 cs=0 l=1 rev1=1 crypto rx=0x7ffa4800b3e0 tx=0x7ffa4802f730 comp rx=0 tx=0).stop 2026-03-09T16:13:52.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.198+0000 7ffa56027640 1 -- 192.168.123.103:0/2170405859 shutdown_connections 2026-03-09T16:13:52.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.198+0000 7ffa56027640 1 --2- 192.168.123.103:0/2170405859 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffa50072420 0x7ffa50077190 unknown :-1 s=CLOSED pgs=277 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.198+0000 7ffa56027640 1 --2- 192.168.123.103:0/2170405859 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa50071a50 0x7ffa50071e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.198+0000 7ffa56027640 1 -- 192.168.123.103:0/2170405859 >> 192.168.123.103:0/2170405859 conn(0x7ffa5006d4f0 msgr2=0x7ffa5006f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:52.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.199+0000 7ffa56027640 1 -- 192.168.123.103:0/2170405859 shutdown_connections 2026-03-09T16:13:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.199+0000 7ffa56027640 1 -- 192.168.123.103:0/2170405859 wait complete. 
2026-03-09T16:13:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.199+0000 7ffa56027640 1 Processor -- start 2026-03-09T16:13:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.199+0000 7ffa56027640 1 -- start start 2026-03-09T16:13:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.200+0000 7ffa56027640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffa50071a50 0x7ffa50084160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.200+0000 7ffa56027640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa500827b0 0x7ffa50082c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.200+0000 7ffa56027640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffa50083170 con 0x7ffa50071a50 2026-03-09T16:13:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.200+0000 7ffa56027640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffa500832e0 con 0x7ffa500827b0 2026-03-09T16:13:52.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.201+0000 7ffa55025640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffa50071a50 0x7ffa50084160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:52.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.201+0000 7ffa55025640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffa50071a50 0x7ffa50084160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47566/0 (socket says 192.168.123.103:47566) 2026-03-09T16:13:52.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.201+0000 7ffa55025640 1 -- 192.168.123.103:0/2350304042 learned_addr learned my addr 192.168.123.103:0/2350304042 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:13:52.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.201+0000 7ffa54824640 1 --2- 192.168.123.103:0/2350304042 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa500827b0 0x7ffa50082c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:52.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.201+0000 7ffa54824640 1 -- 192.168.123.103:0/2350304042 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffa50071a50 msgr2=0x7ffa50084160 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:52.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.201+0000 7ffa54824640 1 --2- 192.168.123.103:0/2350304042 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffa50071a50 0x7ffa50084160 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.201+0000 7ffa54824640 1 -- 192.168.123.103:0/2350304042 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7ffa48009d00 con 0x7ffa500827b0 2026-03-09T16:13:52.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.201+0000 7ffa54824640 1 --2- 192.168.123.103:0/2350304042 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa500827b0 0x7ffa50082c30 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7ffa500730c0 tx=0x7ffa48009c70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:52.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.203+0000 7ffa467fc640 1 -- 192.168.123.103:0/2350304042 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffa48041070 con 0x7ffa500827b0 2026-03-09T16:13:52.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.203+0000 7ffa467fc640 1 -- 192.168.123.103:0/2350304042 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ffa48032070 con 0x7ffa500827b0 2026-03-09T16:13:52.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.203+0000 7ffa467fc640 1 -- 192.168.123.103:0/2350304042 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffa4803c740 con 0x7ffa500827b0 2026-03-09T16:13:52.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.203+0000 7ffa56027640 1 -- 192.168.123.103:0/2350304042 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ffa50083560 con 0x7ffa500827b0 2026-03-09T16:13:52.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.203+0000 7ffa56027640 1 -- 192.168.123.103:0/2350304042 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ffa501b5c10 con 0x7ffa500827b0 2026-03-09T16:13:52.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.206+0000 7ffa467fc640 1 -- 192.168.123.103:0/2350304042 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7ffa4803c8a0 con 0x7ffa500827b0 2026-03-09T16:13:52.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.206+0000 7ffa467fc640 1 --2- 192.168.123.103:0/2350304042 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ffa38076330 0x7ffa380787f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:52.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.206+0000 7ffa467fc640 1 -- 192.168.123.103:0/2350304042 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7ffa480c1530 con 0x7ffa500827b0 2026-03-09T16:13:52.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.206+0000 7ffa55025640 1 --2- 192.168.123.103:0/2350304042 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ffa38076330 0x7ffa380787f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:52.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.206+0000 7ffa56027640 1 -- 192.168.123.103:0/2350304042 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ffa20005350 con 0x7ffa500827b0 2026-03-09T16:13:52.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.207+0000 7ffa55025640 1 --2- 192.168.123.103:0/2350304042 >> 
[v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ffa38076330 0x7ffa380787f0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7ffa4c004660 tx=0x7ffa4c009290 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:52.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.213+0000 7ffa467fc640 1 -- 192.168.123.103:0/2350304042 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7ffa4808ab10 con 0x7ffa500827b0 2026-03-09T16:13:52.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.314+0000 7ffa56027640 1 -- 192.168.123.103:0/2350304042 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7ffa20002bf0 con 0x7ffa38076330 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (2m) 91s ago 3m 24.7M - 0.25.0 c8568f914cd2 062551060e4c 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (3m) 91s ago 3m 8300k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (3m) 92s ago 3m 8502k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (3m) 91s ago 3m 7625k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 05e9be717970 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (3m) 92s ago 3m 7608k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 32f80ccecaa9 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (2m) 91s ago 3m 82.6M - 9.4.7 954c08fa6188 9b9ef5226e00 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (96s) 91s ago 96s 14.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8e7e3eb06891 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (98s) 91s ago 98s 18.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 f23b1415c23e 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (97s) 92s ago 97s 11.2M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 fbf69f4859f1 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (95s) 92s ago 95s 14.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 e7155e6e0a47 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:9283,8765,8443 running (4m) 91s ago 4m 540M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 55454b4aaab2 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (3m) 92s ago 3m 488M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 a411a05027bd 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (4m) 91s ago 4m 54.8M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 b86752d320b6 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (3m) 92s ago 3m 50.0M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 90242efb0978 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 
*:9100 running (3m) 91s ago 3m 14.1M - 1.5.0 0da6a335fe13 8c7f00e55632 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (3m) 92s ago 3m 15.0M - 1.5.0 0da6a335fe13 4c3ab3bdf8cf 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (2m) 91s ago 2m 48.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2ea78f0d62f8 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (2m) 91s ago 2m 68.3M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6169f9824413 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (2m) 91s ago 2m 47.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 31188175e77b 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (2m) 92s ago 2m 66.4M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d95aab347c9f 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (2m) 92s ago 2m 45.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 5076005b452d 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (119s) 92s ago 119s 43.4M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 56fb3849b087 2026-03-09T16:13:52.322 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (2m) 91s ago 3m 37.2M - 2.43.0 a07b618ecd1d 89a8f084cd57 2026-03-09T16:13:52.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.319+0000 7ffa467fc640 1 -- 192.168.123.103:0/2350304042 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3624 (secure 0 0 0) 0x7ffa20002bf0 con 0x7ffa38076330 2026-03-09T16:13:52.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.322+0000 7ffa1bfff640 1 -- 192.168.123.103:0/2350304042 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ffa38076330 msgr2=0x7ffa380787f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:52.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.322+0000 7ffa1bfff640 1 --2- 192.168.123.103:0/2350304042 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ffa38076330 0x7ffa380787f0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7ffa4c004660 tx=0x7ffa4c009290 comp rx=0 tx=0).stop 2026-03-09T16:13:52.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.322+0000 7ffa1bfff640 1 -- 192.168.123.103:0/2350304042 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa500827b0 msgr2=0x7ffa50082c30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:52.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.322+0000 7ffa1bfff640 1 --2- 192.168.123.103:0/2350304042 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa500827b0 0x7ffa50082c30 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7ffa500730c0 tx=0x7ffa48009c70 comp rx=0 tx=0).stop 2026-03-09T16:13:52.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.323+0000 7ffa1bfff640 1 -- 192.168.123.103:0/2350304042 shutdown_connections 2026-03-09T16:13:52.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.323+0000 7ffa1bfff640 1 --2- 192.168.123.103:0/2350304042 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7ffa38076330 0x7ffa380787f0 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.323 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.323+0000 7ffa1bfff640 1 --2- 192.168.123.103:0/2350304042 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa500827b0 0x7ffa50082c30 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.323+0000 7ffa1bfff640 1 --2- 192.168.123.103:0/2350304042 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffa50071a50 0x7ffa50084160 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.323+0000 7ffa1bfff640 1 -- 192.168.123.103:0/2350304042 >> 192.168.123.103:0/2350304042 conn(0x7ffa5006d4f0 msgr2=0x7ffa50070490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:52.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.323+0000 7ffa1bfff640 1 -- 192.168.123.103:0/2350304042 shutdown_connections 2026-03-09T16:13:52.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.323+0000 7ffa1bfff640 1 -- 192.168.123.103:0/2350304042 wait complete. 2026-03-09T16:13:52.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.378+0000 7f09b3e68640 1 -- 192.168.123.103:0/4180695763 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f09ac072390 msgr2=0x7f09ac10c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:52.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.378+0000 7f09b3e68640 1 --2- 192.168.123.103:0/4180695763 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f09ac072390 0x7f09ac10c590 secure :-1 s=READY pgs=278 cs=0 l=1 rev1=1 crypto rx=0x7f09a400b0a0 tx=0x7f09a402f4c0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.378+0000 7f09b3e68640 1 -- 192.168.123.103:0/4180695763 shutdown_connections 2026-03-09T16:13:52.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.378+0000 7f09b3e68640 1 --2- 192.168.123.103:0/4180695763 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f09ac072390 0x7f09ac10c590 unknown :-1 s=CLOSED pgs=278 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.378+0000 7f09b3e68640 1 --2- 192.168.123.103:0/4180695763 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09ac0719c0 0x7f09ac071dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.378+0000 7f09b3e68640 1 -- 192.168.123.103:0/4180695763 >> 192.168.123.103:0/4180695763 conn(0x7f09ac06d4f0 msgr2=0x7f09ac06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:52.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.378+0000 7f09b3e68640 1 -- 192.168.123.103:0/4180695763 shutdown_connections 2026-03-09T16:13:52.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.378+0000 7f09b3e68640 1 -- 192.168.123.103:0/4180695763 wait complete. 
2026-03-09T16:13:52.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.379+0000 7f09b3e68640 1 Processor -- start 2026-03-09T16:13:52.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.379+0000 7f09b3e68640 1 -- start start 2026-03-09T16:13:52.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.379+0000 7f09b3e68640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09ac0719c0 0x7f09ac1171e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:52.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.379+0000 7f09b3e68640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f09ac072390 0x7f09ac115930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:52.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.379+0000 7f09b3e68640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f09ac115f50 con 0x7f09ac072390 2026-03-09T16:13:52.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.379+0000 7f09b3e68640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f09ac1160c0 con 0x7f09ac0719c0 2026-03-09T16:13:52.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.379+0000 7f09b13dc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f09ac072390 0x7f09ac115930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:52.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.379+0000 7f09b13dc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f09ac072390 0x7f09ac115930 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47582/0 (socket says 192.168.123.103:47582) 2026-03-09T16:13:52.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.379+0000 7f09b13dc640 1 -- 192.168.123.103:0/4249019714 learned_addr learned my addr 192.168.123.103:0/4249019714 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:13:52.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.379+0000 7f09b13dc640 1 -- 192.168.123.103:0/4249019714 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09ac0719c0 msgr2=0x7f09ac1171e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:13:52.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.379+0000 7f09b13dc640 1 --2- 192.168.123.103:0/4249019714 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09ac0719c0 0x7f09ac1171e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.379+0000 7f09b13dc640 1 -- 192.168.123.103:0/4249019714 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f09a4009d00 con 0x7f09ac072390 2026-03-09T16:13:52.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.379+0000 7f09b13dc640 1 --2- 192.168.123.103:0/4249019714 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f09ac072390 0x7f09ac115930 secure :-1 s=READY pgs=279 cs=0 l=1 rev1=1 crypto rx=0x7f09a4002790 tx=0x7f09a4007890 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:52.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.380+0000 7f099affd640 1 -- 192.168.123.103:0/4249019714 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f09a40079c0 con 0x7f09ac072390 2026-03-09T16:13:52.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.380+0000 7f099affd640 1 -- 192.168.123.103:0/4249019714 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f09a4038420 con 0x7f09ac072390 2026-03-09T16:13:52.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.380+0000 7f099affd640 1 -- 192.168.123.103:0/4249019714 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f09a40405d0 con 0x7f09ac072390 2026-03-09T16:13:52.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.380+0000 7f09b3e68640 1 -- 192.168.123.103:0/4249019714 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f09ac116340 con 0x7f09ac072390 2026-03-09T16:13:52.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.380+0000 7f09b3e68640 1 -- 192.168.123.103:0/4249019714 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f09ac1a4370 con 0x7f09ac072390 2026-03-09T16:13:52.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.381+0000 7f0998ff9640 1 -- 192.168.123.103:0/4249019714 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0980005350 con 0x7f09ac072390 2026-03-09T16:13:52.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.382+0000 7f099affd640 1 -- 192.168.123.103:0/4249019714 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f09a4048050 con 0x7f09ac072390 2026-03-09T16:13:52.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.382+0000 7f099affd640 1 --2- 192.168.123.103:0/4249019714 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0988076290 0x7f0988078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:52.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.382+0000 7f099affd640 1 -- 192.168.123.103:0/4249019714 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f09a40bc7a0 con 0x7f09ac072390 2026-03-09T16:13:52.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.385+0000 7f09b1bdd640 1 --2- 192.168.123.103:0/4249019714 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0988076290 0x7f0988078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:52.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.385+0000 7f099affd640 1 -- 192.168.123.103:0/4249019714 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f09a4085e10 con 0x7f09ac072390 2026-03-09T16:13:52.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.392+0000 7f09b1bdd640 1 --2- 192.168.123.103:0/4249019714 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0988076290 0x7f0988078750 secure :-1 s=READY 
pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f09ac116fa0 tx=0x7f09a0006d20 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:52.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.547+0000 7f0998ff9640 1 -- 192.168.123.103:0/4249019714 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f0980005e10 con 0x7f09ac072390 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.548+0000 7f099affd640 1 -- 192.168.123.103:0/4249019714 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f09a40857b0 con 0x7f09ac072390 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 14 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T16:13:52.549 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:13:52.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.552+0000 7f09b3e68640 1 -- 192.168.123.103:0/4249019714 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0988076290 msgr2=0x7f0988078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:52.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.552+0000 7f09b3e68640 1 --2- 192.168.123.103:0/4249019714 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0988076290 0x7f0988078750 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f09ac116fa0 tx=0x7f09a0006d20 comp rx=0 tx=0).stop 2026-03-09T16:13:52.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.552+0000 7f09b3e68640 1 -- 192.168.123.103:0/4249019714 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f09ac072390 msgr2=0x7f09ac115930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:52.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.552+0000 7f09b3e68640 1 --2- 192.168.123.103:0/4249019714 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f09ac072390 0x7f09ac115930 secure :-1 s=READY pgs=279 cs=0 l=1 rev1=1 crypto rx=0x7f09a4002790 tx=0x7f09a4007890 comp rx=0 tx=0).stop 2026-03-09T16:13:52.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.552+0000 7f09b3e68640 1 -- 192.168.123.103:0/4249019714 shutdown_connections 2026-03-09T16:13:52.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.552+0000 7f09b3e68640 1 --2- 192.168.123.103:0/4249019714 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7f0988076290 0x7f0988078750 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.552+0000 7f09b3e68640 1 --2- 192.168.123.103:0/4249019714 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f09ac072390 0x7f09ac115930 secure :-1 s=CLOSED pgs=279 cs=0 l=1 rev1=1 crypto rx=0x7f09a4002790 tx=0x7f09a4007890 comp rx=0 tx=0).stop 2026-03-09T16:13:52.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.552+0000 7f09b3e68640 1 --2- 192.168.123.103:0/4249019714 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09ac0719c0 0x7f09ac1171e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.552+0000 7f09b3e68640 1 -- 192.168.123.103:0/4249019714 >> 192.168.123.103:0/4249019714 conn(0x7f09ac06d4f0 msgr2=0x7f09ac070680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:52.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.552+0000 7f09b3e68640 1 -- 192.168.123.103:0/4249019714 shutdown_connections 2026-03-09T16:13:52.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.553+0000 7f09b3e68640 1 -- 192.168.123.103:0/4249019714 wait complete. 
2026-03-09T16:13:52.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.627+0000 7fcd1813d640 1 -- 192.168.123.103:0/2063546962 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd100719c0 msgr2=0x7fcd10071dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:52.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.627+0000 7fcd1813d640 1 --2- 192.168.123.103:0/2063546962 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd100719c0 0x7fcd10071dc0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fcd000099b0 tx=0x7fcd0002f240 comp rx=0 tx=0).stop 2026-03-09T16:13:52.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.628+0000 7fcd1813d640 1 -- 192.168.123.103:0/2063546962 shutdown_connections 2026-03-09T16:13:52.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.628+0000 7fcd1813d640 1 --2- 192.168.123.103:0/2063546962 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd10072300 0x7fcd10110d20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.628+0000 7fcd1813d640 1 --2- 192.168.123.103:0/2063546962 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd100719c0 0x7fcd10071dc0 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.628+0000 7fcd1813d640 1 -- 192.168.123.103:0/2063546962 >> 192.168.123.103:0/2063546962 conn(0x7fcd1006d4f0 msgr2=0x7fcd1006f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:52.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.628+0000 7fcd1813d640 1 -- 192.168.123.103:0/2063546962 shutdown_connections 2026-03-09T16:13:52.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.628+0000 7fcd1813d640 1 -- 192.168.123.103:0/2063546962 wait complete. 
2026-03-09T16:13:52.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.628+0000 7fcd1813d640 1 Processor -- start 2026-03-09T16:13:52.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.628+0000 7fcd1813d640 1 -- start start 2026-03-09T16:13:52.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.629+0000 7fcd1813d640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd100719c0 0x7fcd1011b050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:52.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.629+0000 7fcd1813d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd10072300 0x7fcd10117ab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:52.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.629+0000 7fcd1813d640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd10117ff0 con 0x7fcd10072300 2026-03-09T16:13:52.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.629+0000 7fcd1813d640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd10118160 con 0x7fcd100719c0 2026-03-09T16:13:52.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.629+0000 7fcd156b1640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd10072300 0x7fcd10117ab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:52.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.629+0000 7fcd156b1640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd10072300 0x7fcd10117ab0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47608/0 (socket says 192.168.123.103:47608) 2026-03-09T16:13:52.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.629+0000 7fcd156b1640 1 -- 192.168.123.103:0/3383105686 learned_addr learned my addr 192.168.123.103:0/3383105686 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:13:52.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.629+0000 7fcd15eb2640 1 --2- 192.168.123.103:0/3383105686 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd100719c0 0x7fcd1011b050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:52.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.629+0000 7fcd156b1640 1 -- 192.168.123.103:0/3383105686 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd100719c0 msgr2=0x7fcd1011b050 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:52.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.629+0000 7fcd156b1640 1 --2- 192.168.123.103:0/3383105686 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd100719c0 0x7fcd1011b050 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.629+0000 7fcd156b1640 1 -- 192.168.123.103:0/3383105686 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fcd04009590 con 0x7fcd10072300 2026-03-09T16:13:52.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.630+0000 7fcd156b1640 1 --2- 192.168.123.103:0/3383105686 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd10072300 0x7fcd10117ab0 secure :-1 s=READY pgs=280 cs=0 l=1 rev1=1 crypto rx=0x7fcd04002760 tx=0x7fcd04002c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:52.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.630+0000 7fccfeffd640 1 -- 192.168.123.103:0/3383105686 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd0400ecf0 con 0x7fcd10072300 2026-03-09T16:13:52.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.630+0000 7fccfeffd640 1 -- 192.168.123.103:0/3383105686 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fcd04002e90 con 0x7fcd10072300 2026-03-09T16:13:52.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.630+0000 7fccfeffd640 1 -- 192.168.123.103:0/3383105686 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd0400f660 con 0x7fcd10072300 2026-03-09T16:13:52.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.630+0000 7fcd1813d640 1 -- 192.168.123.103:0/3383105686 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcd00009660 con 0x7fcd10072300 2026-03-09T16:13:52.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.630+0000 7fcd1813d640 1 -- 192.168.123.103:0/3383105686 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcd101b57c0 con 0x7fcd10072300 2026-03-09T16:13:52.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.633+0000 7fccfeffd640 1 -- 192.168.123.103:0/3383105686 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fcd04016020 con 0x7fcd10072300 2026-03-09T16:13:52.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.633+0000 7fccfeffd640 1 --2- 192.168.123.103:0/3383105686 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcce8076290 0x7fcce8078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:13:52.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.633+0000 7fccfeffd640 1 -- 192.168.123.103:0/3383105686 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fcd04098b60 con 0x7fcd10072300 2026-03-09T16:13:52.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.633+0000 7fcd1813d640 1 -- 192.168.123.103:0/3383105686 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fccd8005350 con 0x7fcd10072300 2026-03-09T16:13:52.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.636+0000 7fcd15eb2640 1 --2- 192.168.123.103:0/3383105686 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcce8076290 0x7fcce8078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:13:52.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.636+0000 7fccfeffd640 1 -- 192.168.123.103:0/3383105686 <== mon.0 v2:192.168.123.103:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fcd04061ca0 con 0x7fcd10072300 2026-03-09T16:13:52.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.644+0000 7fcd15eb2640 1 --2- 192.168.123.103:0/3383105686 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcce8076290 0x7fcce8078750 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7fcd00002410 tx=0x7fcd0003a040 comp rx=0 tx=0).ready entity=mgr.14225 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:13:52.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.764+0000 7fcd1813d640 1 -- 192.168.123.103:0/3383105686 --> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fccd8002bf0 con 0x7fcce8076290 2026-03-09T16:13:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.765+0000 7fccfeffd640 1 -- 192.168.123.103:0/3383105686 <== mgr.14225 v2:192.168.123.103:6800/3168090362 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+434 (secure 0 0 0) 0x7fccd8002bf0 con 0x7fcce8076290 2026-03-09T16:13:52.766 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:13:52.766 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T16:13:52.766 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T16:13:52.766 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-09T16:13:52.766 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [], 2026-03-09T16:13:52.766 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "0/2 daemons upgraded", 2026-03-09T16:13:52.766 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm05", 2026-03-09T16:13:52.766 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T16:13:52.766 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:13:52.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.770+0000 7fccfcff9640 1 -- 192.168.123.103:0/3383105686 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcce8076290 msgr2=0x7fcce8078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:52.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.770+0000 7fccfcff9640 1 --2- 192.168.123.103:0/3383105686 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcce8076290 0x7fcce8078750 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7fcd00002410 tx=0x7fcd0003a040 comp rx=0 tx=0).stop 2026-03-09T16:13:52.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.770+0000 7fccfcff9640 1 -- 192.168.123.103:0/3383105686 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd10072300 msgr2=0x7fcd10117ab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:13:52.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.770+0000 7fccfcff9640 1 --2- 192.168.123.103:0/3383105686 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd10072300 0x7fcd10117ab0 secure :-1 s=READY pgs=280 cs=0 l=1 rev1=1 crypto rx=0x7fcd04002760 tx=0x7fcd04002c30 comp rx=0 tx=0).stop 2026-03-09T16:13:52.771 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.770+0000 7fccfcff9640 1 -- 192.168.123.103:0/3383105686 shutdown_connections 2026-03-09T16:13:52.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.770+0000 7fccfcff9640 1 --2- 192.168.123.103:0/3383105686 >> [v2:192.168.123.103:6800/3168090362,v1:192.168.123.103:6801/3168090362] conn(0x7fcce8076290 0x7fcce8078750 secure :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7fcd00002410 tx=0x7fcd0003a040 comp rx=0 tx=0).stop 2026-03-09T16:13:52.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.770+0000 7fccfcff9640 1 --2- 192.168.123.103:0/3383105686 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd10072300 0x7fcd10117ab0 unknown :-1 s=CLOSED pgs=280 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.770+0000 7fccfcff9640 1 --2- 192.168.123.103:0/3383105686 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd100719c0 0x7fcd1011b050 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:13:52.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.770+0000 7fccfcff9640 1 -- 192.168.123.103:0/3383105686 >> 192.168.123.103:0/3383105686 conn(0x7fcd1006d4f0 msgr2=0x7fcd1006dee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:13:52.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.771+0000 7fccfcff9640 1 -- 192.168.123.103:0/3383105686 shutdown_connections 2026-03-09T16:13:52.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:13:52.772+0000 7fccfcff9640 1 -- 192.168.123.103:0/3383105686 wait complete. 2026-03-09T16:13:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:53 vm05.local ceph-mon[58702]: pgmap v128: 65 pgs: 65 active+clean; 77 MiB data, 473 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 5.3 MiB/s wr, 211 op/s 2026-03-09T16:13:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:53 vm05.local ceph-mon[58702]: from='client.14566 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:53 vm05.local ceph-mon[58702]: from='client.14570 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:53 vm05.local ceph-mon[58702]: from='client.24349 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:53 vm05.local ceph-mon[58702]: from='client.? 
192.168.123.103:0/4249019714' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:13:53.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:53 vm03.local ceph-mon[51019]: pgmap v128: 65 pgs: 65 active+clean; 77 MiB data, 473 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 5.3 MiB/s wr, 211 op/s 2026-03-09T16:13:53.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:53 vm03.local ceph-mon[51019]: from='client.14566 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:53.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:53 vm03.local ceph-mon[51019]: from='client.14570 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:53.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:53 vm03.local ceph-mon[51019]: from='client.24349 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:53.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:53 vm03.local ceph-mon[51019]: from='client.? 192.168.123.103:0/4249019714' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:13:54.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:54 vm03.local ceph-mon[51019]: from='client.14582 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:54 vm05.local ceph-mon[58702]: from='client.14582 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:13:55.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:55 vm03.local ceph-mon[51019]: pgmap v129: 65 pgs: 65 active+clean; 87 MiB data, 519 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 5.5 MiB/s wr, 238 op/s 2026-03-09T16:13:55.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:55 vm05.local ceph-mon[58702]: pgmap v129: 65 pgs: 65 active+clean; 87 MiB data, 519 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 5.5 MiB/s wr, 238 op/s 2026-03-09T16:13:57.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:57 vm05.local ceph-mon[58702]: pgmap v130: 65 pgs: 65 active+clean; 99 MiB data, 567 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 5.9 MiB/s wr, 231 op/s 2026-03-09T16:13:57.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:57 vm03.local ceph-mon[51019]: pgmap v130: 65 pgs: 65 active+clean; 99 MiB data, 567 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 5.9 MiB/s wr, 231 op/s 2026-03-09T16:13:59.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:13:58 vm05.local ceph-mon[58702]: pgmap v131: 65 pgs: 65 active+clean; 108 MiB data, 632 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 5.6 MiB/s wr, 221 op/s 2026-03-09T16:13:59.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:13:58 vm03.local ceph-mon[51019]: pgmap v131: 65 pgs: 65 active+clean; 108 MiB data, 632 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 5.6 MiB/s wr, 221 op/s 2026-03-09T16:14:01.375 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:01 vm03.local ceph-mon[51019]: pgmap v132: 65 pgs: 65 active+clean; 119 MiB data, 713 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 6.2 MiB/s wr, 238 op/s 2026-03-09T16:14:01.528 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:01 vm05.local ceph-mon[58702]: pgmap v132: 65 pgs: 65 active+clean; 119 MiB data, 713 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 6.2 MiB/s wr, 238 op/s 
2026-03-09T16:14:03.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:03 vm05.local ceph-mon[58702]: pgmap v133: 65 pgs: 65 active+clean; 124 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 4.3 MiB/s wr, 206 op/s 2026-03-09T16:14:03.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:03 vm03.local ceph-mon[51019]: pgmap v133: 65 pgs: 65 active+clean; 124 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 4.3 MiB/s wr, 206 op/s 2026-03-09T16:14:04.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:04 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:14:04.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:04 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:14:05.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:05 vm05.local ceph-mon[58702]: pgmap v134: 65 pgs: 65 active+clean; 137 MiB data, 847 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 5.1 MiB/s wr, 248 op/s 2026-03-09T16:14:05.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:05 vm03.local ceph-mon[51019]: pgmap v134: 65 pgs: 65 active+clean; 137 MiB data, 847 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 5.1 MiB/s wr, 248 op/s 2026-03-09T16:14:07.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:07 vm05.local ceph-mon[58702]: pgmap v135: 65 pgs: 65 active+clean; 144 MiB data, 943 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 4.8 MiB/s wr, 263 op/s 2026-03-09T16:14:07.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:06 vm03.local ceph-mon[51019]: pgmap v135: 65 pgs: 65 active+clean; 144 MiB data, 943 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 4.8 MiB/s wr, 263 op/s 2026-03-09T16:14:09.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:09 vm03.local ceph-mon[51019]: pgmap v136: 65 pgs: 65 active+clean; 150 MiB data, 974 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 4.4 MiB/s wr, 264 op/s 2026-03-09T16:14:09.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:09 vm05.local ceph-mon[58702]: pgmap v136: 65 pgs: 65 active+clean; 150 MiB data, 974 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 4.4 MiB/s wr, 264 op/s 2026-03-09T16:14:11.982 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:11 vm05.local ceph-mon[58702]: pgmap v137: 65 pgs: 65 active+clean; 160 MiB data, 1002 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 4.4 MiB/s wr, 305 op/s 2026-03-09T16:14:12.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:11 vm03.local ceph-mon[51019]: pgmap v137: 65 pgs: 65 active+clean; 160 MiB data, 1002 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 4.4 MiB/s wr, 305 op/s 2026-03-09T16:14:13.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:13 vm03.local ceph-mon[51019]: pgmap v138: 65 pgs: 65 active+clean; 164 MiB data, 1003 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 3.9 MiB/s wr, 289 op/s 2026-03-09T16:14:13.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:13 vm03.local ceph-mon[51019]: Upgrade: Updating mgr.vm05.dygxfv 2026-03-09T16:14:13.498 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:13 vm05.local ceph-mon[58702]: pgmap v138: 65 pgs: 65 active+clean; 164 MiB data, 1003 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 3.9 MiB/s wr, 289 op/s 2026-03-09T16:14:13.499 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:13 vm05.local ceph-mon[58702]: Upgrade: 
Updating mgr.vm05.dygxfv 2026-03-09T16:14:14.376 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:14 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:14.376 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:14 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.dygxfv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:14:14.376 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:14 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:14:14.376 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:14 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:14:14.376 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:14 vm05.local ceph-mon[58702]: Deploying daemon mgr.vm05.dygxfv on vm05 2026-03-09T16:14:14.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:14 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:14.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:14 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.dygxfv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:14:14.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:14 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:14:14.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:14 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:14:14.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:14 vm03.local ceph-mon[51019]: Deploying daemon mgr.vm05.dygxfv on vm05 2026-03-09T16:14:14.949 INFO:tasks.workunit.client.0.vm03.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress 2026-03-09T16:14:14.953 INFO:tasks.workunit.client.0.vm03.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-09T16:14:14.953 INFO:tasks.workunit.client.0.vm03.stderr:+ make 2026-03-09T16:14:15.228 INFO:tasks.workunit.client.0.vm03.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress 2026-03-09T16:14:15.518 INFO:tasks.workunit.client.0.vm03.stderr:++ readlink -f fsstress 2026-03-09T16:14:15.520 INFO:tasks.workunit.client.0.vm03.stderr:+ BIN=/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress 2026-03-09T16:14:15.520 INFO:tasks.workunit.client.0.vm03.stderr:+ popd 2026-03-09T16:14:15.522 INFO:tasks.workunit.client.0.vm03.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-09T16:14:15.522 INFO:tasks.workunit.client.0.vm03.stderr:+ popd 2026-03-09T16:14:15.523 
INFO:tasks.workunit.client.0.vm03.stdout:~/cephtest/mnt.0/client.0/tmp 2026-03-09T16:14:15.523 INFO:tasks.workunit.client.0.vm03.stderr:++ mktemp -d -p . 2026-03-09T16:14:15.527 INFO:tasks.workunit.client.0.vm03.stderr:+ T=./tmp.rYSksf2L2n 2026-03-09T16:14:15.527 INFO:tasks.workunit.client.0.vm03.stderr:+ /home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.rYSksf2L2n -l 1 -n 1000 -p 10 -v 2026-03-09T16:14:15.531 INFO:tasks.workunit.client.0.vm03.stdout:seed = 1772547817 2026-03-09T16:14:15.537 INFO:tasks.workunit.client.0.vm03.stdout:3/0: dwrite - no filename 2026-03-09T16:14:15.537 INFO:tasks.workunit.client.0.vm03.stdout:3/1: dwrite - no filename 2026-03-09T16:14:15.537 INFO:tasks.workunit.client.0.vm03.stdout:3/2: dread - no filename 2026-03-09T16:14:15.540 INFO:tasks.workunit.client.0.vm03.stdout:4/0: rmdir - no directory 2026-03-09T16:14:15.540 INFO:tasks.workunit.client.0.vm03.stdout:4/1: write - no filename 2026-03-09T16:14:15.540 INFO:tasks.workunit.client.0.vm03.stdout:4/2: dread - no filename 2026-03-09T16:14:15.541 INFO:tasks.workunit.client.0.vm03.stdout:0/0: mkdir d0 0 2026-03-09T16:14:15.541 INFO:tasks.workunit.client.0.vm03.stdout:0/1: fsync - no filename 2026-03-09T16:14:15.541 INFO:tasks.workunit.client.0.vm03.stdout:0/2: unlink - no file 2026-03-09T16:14:15.541 INFO:tasks.workunit.client.0.vm03.stdout:0/3: dread - no filename 2026-03-09T16:14:15.541 INFO:tasks.workunit.client.0.vm03.stdout:0/4: write - no filename 2026-03-09T16:14:15.543 INFO:tasks.workunit.client.0.vm03.stdout:3/3: creat f0 x:0 0 0 2026-03-09T16:14:15.544 INFO:tasks.workunit.client.0.vm03.stdout:3/4: fdatasync f0 0 2026-03-09T16:14:15.544 INFO:tasks.workunit.client.0.vm03.stdout:3/5: stat f0 0 2026-03-09T16:14:15.544 INFO:tasks.workunit.client.0.vm03.stdout:3/6: readlink - no filename 2026-03-09T16:14:15.548 INFO:tasks.workunit.client.0.vm03.stdout:4/3: symlink l0 0 2026-03-09T16:14:15.549 INFO:tasks.workunit.client.0.vm03.stdout:4/4: chown l0 908 1 2026-03-09T16:14:15.550 INFO:tasks.workunit.client.0.vm03.stdout:3/7: rename f0 to f1 0 2026-03-09T16:14:15.551 INFO:tasks.workunit.client.0.vm03.stdout:0/5: symlink d0/l1 0 2026-03-09T16:14:15.552 INFO:tasks.workunit.client.0.vm03.stdout:0/6: stat d0 0 2026-03-09T16:14:15.552 INFO:tasks.workunit.client.0.vm03.stdout:0/7: dwrite - no filename 2026-03-09T16:14:15.568 INFO:tasks.workunit.client.0.vm03.stdout:5/0: creat f0 x:0 0 0 2026-03-09T16:14:15.569 INFO:tasks.workunit.client.0.vm03.stdout:4/5: creat f1 x:0 0 0 2026-03-09T16:14:15.572 INFO:tasks.workunit.client.0.vm03.stdout:0/8: unlink d0/l1 0 2026-03-09T16:14:15.575 INFO:tasks.workunit.client.0.vm03.stdout:8/0: dwrite - no filename 2026-03-09T16:14:15.581 INFO:tasks.workunit.client.0.vm03.stdout:6/0: creat f0 x:0 0 0 2026-03-09T16:14:15.588 INFO:tasks.workunit.client.0.vm03.stdout:3/8: dwrite f1 [0,4194304] 0 2026-03-09T16:14:15.592 INFO:tasks.workunit.client.0.vm03.stdout:5/1: mknod c1 0 2026-03-09T16:14:15.593 INFO:tasks.workunit.client.0.vm03.stdout:4/6: creat f2 x:0 0 0 2026-03-09T16:14:15.593 INFO:tasks.workunit.client.0.vm03.stdout:0/9: creat d0/f2 x:0 0 0 2026-03-09T16:14:15.593 INFO:tasks.workunit.client.0.vm03.stdout:0/10: chown d0 191002990 1 2026-03-09T16:14:15.594 INFO:tasks.workunit.client.0.vm03.stdout:4/7: write f2 [769539,31699] 0 2026-03-09T16:14:15.604 INFO:tasks.workunit.client.0.vm03.stdout:3/9: dread f1 [0,4194304] 0 2026-03-09T16:14:15.604 INFO:tasks.workunit.client.0.vm03.stdout:6/1: dwrite f0 [0,4194304] 0 
2026-03-09T16:14:15.607 INFO:tasks.workunit.client.0.vm03.stdout:5/2: mkdir d2 0 2026-03-09T16:14:15.607 INFO:tasks.workunit.client.0.vm03.stdout:5/3: read - f0 zero size 2026-03-09T16:14:15.611 INFO:tasks.workunit.client.0.vm03.stdout:0/11: creat d0/f3 x:0 0 0 2026-03-09T16:14:15.618 INFO:tasks.workunit.client.0.vm03.stdout:5/4: dwrite f0 [0,4194304] 0 2026-03-09T16:14:15.618 INFO:tasks.workunit.client.0.vm03.stdout:8/1: getdents . 0 2026-03-09T16:14:15.618 INFO:tasks.workunit.client.0.vm03.stdout:2/0: rename - no filename 2026-03-09T16:14:15.618 INFO:tasks.workunit.client.0.vm03.stdout:2/1: fsync - no filename 2026-03-09T16:14:15.618 INFO:tasks.workunit.client.0.vm03.stdout:2/2: dread - no filename 2026-03-09T16:14:15.618 INFO:tasks.workunit.client.0.vm03.stdout:2/3: dread - no filename 2026-03-09T16:14:15.618 INFO:tasks.workunit.client.0.vm03.stdout:8/2: chown . 1604 1 2026-03-09T16:14:15.618 INFO:tasks.workunit.client.0.vm03.stdout:8/3: write - no filename 2026-03-09T16:14:15.618 INFO:tasks.workunit.client.0.vm03.stdout:8/4: dread - no filename 2026-03-09T16:14:15.619 INFO:tasks.workunit.client.0.vm03.stdout:8/5: chown . 916422 1 2026-03-09T16:14:15.620 INFO:tasks.workunit.client.0.vm03.stdout:0/12: write d0/f2 [976555,119140] 0 2026-03-09T16:14:15.622 INFO:tasks.workunit.client.0.vm03.stdout:3/10: dread f1 [0,4194304] 0 2026-03-09T16:14:15.629 INFO:tasks.workunit.client.0.vm03.stdout:5/5: rename f0 to d2/f3 0 2026-03-09T16:14:15.629 INFO:tasks.workunit.client.0.vm03.stdout:5/6: readlink - no filename 2026-03-09T16:14:15.632 INFO:tasks.workunit.client.0.vm03.stdout:9/0: creat f0 x:0 0 0 2026-03-09T16:14:15.636 INFO:tasks.workunit.client.0.vm03.stdout:1/0: chown . 1 1 2026-03-09T16:14:15.636 INFO:tasks.workunit.client.0.vm03.stdout:1/1: chown . 2 1 2026-03-09T16:14:15.637 INFO:tasks.workunit.client.0.vm03.stdout:3/11: creat f2 x:0 0 0 2026-03-09T16:14:15.637 INFO:tasks.workunit.client.0.vm03.stdout:3/12: chown f1 688604 1 2026-03-09T16:14:15.640 INFO:tasks.workunit.client.0.vm03.stdout:9/1: rename f0 to f1 0 2026-03-09T16:14:15.640 INFO:tasks.workunit.client.0.vm03.stdout:2/4: getdents . 
0 2026-03-09T16:14:15.641 INFO:tasks.workunit.client.0.vm03.stdout:8/6: mknod c0 0 2026-03-09T16:14:15.642 INFO:tasks.workunit.client.0.vm03.stdout:3/13: creat f3 x:0 0 0 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:3/14: chown f3 170362 1 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:8/7: creat f1 x:0 0 0 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:1/2: mknod c0 0 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:1/3: readlink - no filename 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:2/5: creat f0 x:0 0 0 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:2/6: dread - f0 zero size 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:8/8: unlink f1 0 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:8/9: rmdir - no directory 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:1/4: creat f1 x:0 0 0 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:3/15: link f1 f4 0 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:2/7: creat f1 x:0 0 0 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:3/16: dread f4 [0,4194304] 0 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:8/10: rename c0 to c2 0 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:1/5: creat f2 x:0 0 0 2026-03-09T16:14:15.656 INFO:tasks.workunit.client.0.vm03.stdout:2/8: creat f2 x:0 0 0 2026-03-09T16:14:15.657 INFO:tasks.workunit.client.0.vm03.stdout:1/6: write f1 [869093,122148] 0 2026-03-09T16:14:15.657 INFO:tasks.workunit.client.0.vm03.stdout:3/17: mkdir d5 0 2026-03-09T16:14:15.657 INFO:tasks.workunit.client.0.vm03.stdout:8/11: creat f3 x:0 0 0 2026-03-09T16:14:15.658 INFO:tasks.workunit.client.0.vm03.stdout:3/18: chown f3 10864 1 2026-03-09T16:14:15.659 INFO:tasks.workunit.client.0.vm03.stdout:3/19: dread - f3 zero size 2026-03-09T16:14:15.660 INFO:tasks.workunit.client.0.vm03.stdout:1/7: rename c0 to c3 0 2026-03-09T16:14:15.661 INFO:tasks.workunit.client.0.vm03.stdout:8/12: mknod c4 0 2026-03-09T16:14:15.662 INFO:tasks.workunit.client.0.vm03.stdout:2/9: dwrite f1 [0,4194304] 0 2026-03-09T16:14:15.666 INFO:tasks.workunit.client.0.vm03.stdout:1/8: mkdir d4 0 2026-03-09T16:14:15.666 INFO:tasks.workunit.client.0.vm03.stdout:2/10: rename f2 to f3 0 2026-03-09T16:14:15.673 INFO:tasks.workunit.client.0.vm03.stdout:2/11: creat f4 x:0 0 0 2026-03-09T16:14:15.688 INFO:tasks.workunit.client.0.vm03.stdout:2/12: fdatasync f3 0 2026-03-09T16:14:15.688 INFO:tasks.workunit.client.0.vm03.stdout:2/13: creat f5 x:0 0 0 2026-03-09T16:14:15.688 INFO:tasks.workunit.client.0.vm03.stdout:2/14: creat f6 x:0 0 0 2026-03-09T16:14:15.688 INFO:tasks.workunit.client.0.vm03.stdout:2/15: write f3 [1015965,107339] 0 2026-03-09T16:14:15.688 INFO:tasks.workunit.client.0.vm03.stdout:2/16: read - f4 zero size 2026-03-09T16:14:15.775 INFO:tasks.workunit.client.0.vm03.stdout:0/13: fdatasync d0/f2 0 2026-03-09T16:14:15.777 INFO:tasks.workunit.client.0.vm03.stdout:0/14: write d0/f2 [1345202,126175] 0 2026-03-09T16:14:15.779 INFO:tasks.workunit.client.0.vm03.stdout:2/17: dread f3 [0,4194304] 0 2026-03-09T16:14:15.784 INFO:tasks.workunit.client.0.vm03.stdout:0/15: unlink d0/f2 0 2026-03-09T16:14:15.786 INFO:tasks.workunit.client.0.vm03.stdout:0/16: mknod d0/c4 0 2026-03-09T16:14:15.786 INFO:tasks.workunit.client.0.vm03.stdout:0/17: write d0/f3 [912163,86391] 0 2026-03-09T16:14:15.787 INFO:tasks.workunit.client.0.vm03.stdout:0/18: mknod d0/c5 0 
2026-03-09T16:14:15.796 INFO:tasks.workunit.client.0.vm03.stdout:4/8: dread f2 [0,4194304] 0 2026-03-09T16:14:15.850 INFO:tasks.workunit.client.0.vm03.stdout:4/9: fsync f1 0 2026-03-09T16:14:15.851 INFO:tasks.workunit.client.0.vm03.stdout:4/10: link f2 f3 0 2026-03-09T16:14:15.852 INFO:tasks.workunit.client.0.vm03.stdout:4/11: creat f4 x:0 0 0 2026-03-09T16:14:15.857 INFO:tasks.workunit.client.0.vm03.stdout:4/12: dwrite f3 [0,4194304] 0 2026-03-09T16:14:15.858 INFO:tasks.workunit.client.0.vm03.stdout:4/13: chown f1 464011 1 2026-03-09T16:14:15.858 INFO:tasks.workunit.client.0.vm03.stdout:4/14: read - f1 zero size 2026-03-09T16:14:15.865 INFO:tasks.workunit.client.0.vm03.stdout:4/15: dwrite f4 [0,4194304] 0 2026-03-09T16:14:15.871 INFO:tasks.workunit.client.0.vm03.stdout:4/16: mkdir d5 0 2026-03-09T16:14:15.872 INFO:tasks.workunit.client.0.vm03.stdout:4/17: dread - f1 zero size 2026-03-09T16:14:15.878 INFO:tasks.workunit.client.0.vm03.stdout:4/18: dwrite f2 [0,4194304] 0 2026-03-09T16:14:15.884 INFO:tasks.workunit.client.0.vm03.stdout:5/7: dwrite d2/f3 [4194304,4194304] 0 2026-03-09T16:14:15.895 INFO:tasks.workunit.client.0.vm03.stdout:5/8: mkdir d2/d4 0 2026-03-09T16:14:15.896 INFO:tasks.workunit.client.0.vm03.stdout:4/19: creat d5/f6 x:0 0 0 2026-03-09T16:14:15.897 INFO:tasks.workunit.client.0.vm03.stdout:5/9: getdents d2/d4 0 2026-03-09T16:14:15.898 INFO:tasks.workunit.client.0.vm03.stdout:4/20: link f2 d5/f7 0 2026-03-09T16:14:15.901 INFO:tasks.workunit.client.0.vm03.stdout:5/10: rename c1 to d2/d4/c5 0 2026-03-09T16:14:15.901 INFO:tasks.workunit.client.0.vm03.stdout:5/11: readlink - no filename 2026-03-09T16:14:15.904 INFO:tasks.workunit.client.0.vm03.stdout:4/21: creat d5/f8 x:0 0 0 2026-03-09T16:14:15.905 INFO:tasks.workunit.client.0.vm03.stdout:9/2: getdents . 0 2026-03-09T16:14:15.905 INFO:tasks.workunit.client.0.vm03.stdout:4/22: creat d5/f9 x:0 0 0 2026-03-09T16:14:15.907 INFO:tasks.workunit.client.0.vm03.stdout:4/23: rename d5/f6 to d5/fa 0 2026-03-09T16:14:15.907 INFO:tasks.workunit.client.0.vm03.stdout:9/3: getdents . 
0 2026-03-09T16:14:15.908 INFO:tasks.workunit.client.0.vm03.stdout:9/4: truncate f1 772024 0 2026-03-09T16:14:15.908 INFO:tasks.workunit.client.0.vm03.stdout:9/5: readlink - no filename 2026-03-09T16:14:15.908 INFO:tasks.workunit.client.0.vm03.stdout:4/24: mkdir d5/db 0 2026-03-09T16:14:15.909 INFO:tasks.workunit.client.0.vm03.stdout:9/6: mkdir d2 0 2026-03-09T16:14:15.910 INFO:tasks.workunit.client.0.vm03.stdout:9/7: read f1 [411748,130696] 0 2026-03-09T16:14:15.911 INFO:tasks.workunit.client.0.vm03.stdout:4/25: unlink f4 0 2026-03-09T16:14:15.912 INFO:tasks.workunit.client.0.vm03.stdout:9/8: fsync f1 0 2026-03-09T16:14:15.913 INFO:tasks.workunit.client.0.vm03.stdout:9/9: getdents d2 0 2026-03-09T16:14:15.914 INFO:tasks.workunit.client.0.vm03.stdout:9/10: dread f1 [0,4194304] 0 2026-03-09T16:14:15.916 INFO:tasks.workunit.client.0.vm03.stdout:1/9: fsync f1 0 2026-03-09T16:14:15.916 INFO:tasks.workunit.client.0.vm03.stdout:3/20: fsync f4 0 2026-03-09T16:14:15.917 INFO:tasks.workunit.client.0.vm03.stdout:3/21: fdatasync f3 0 2026-03-09T16:14:15.918 INFO:tasks.workunit.client.0.vm03.stdout:1/10: chown f2 847239 1 2026-03-09T16:14:15.918 INFO:tasks.workunit.client.0.vm03.stdout:3/22: truncate f3 784109 0 2026-03-09T16:14:15.919 INFO:tasks.workunit.client.0.vm03.stdout:9/11: dwrite f1 [0,4194304] 0 2026-03-09T16:14:15.919 INFO:tasks.workunit.client.0.vm03.stdout:1/11: write f1 [470688,114965] 0 2026-03-09T16:14:15.925 INFO:tasks.workunit.client.0.vm03.stdout:3/23: dwrite f2 [0,4194304] 0 2026-03-09T16:14:15.927 INFO:tasks.workunit.client.0.vm03.stdout:8/13: rename c2 to c5 0 2026-03-09T16:14:15.928 INFO:tasks.workunit.client.0.vm03.stdout:9/12: rename f1 to d2/f3 0 2026-03-09T16:14:15.930 INFO:tasks.workunit.client.0.vm03.stdout:8/14: creat f6 x:0 0 0 2026-03-09T16:14:15.931 INFO:tasks.workunit.client.0.vm03.stdout:3/24: rename f2 to d5/f6 0 2026-03-09T16:14:15.933 INFO:tasks.workunit.client.0.vm03.stdout:3/25: symlink d5/l7 0 2026-03-09T16:14:15.934 INFO:tasks.workunit.client.0.vm03.stdout:3/26: truncate f3 867536 0 2026-03-09T16:14:15.935 INFO:tasks.workunit.client.0.vm03.stdout:8/15: link f3 f7 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:3/27: symlink d5/l8 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:3/28: chown d5/l7 44454471 1 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:8/16: creat f8 x:0 0 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:8/17: write f6 [528400,121008] 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:8/18: read - f3 zero size 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:1/12: chown c3 402 1 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:3/29: creat d5/f9 x:0 0 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:8/19: symlink l9 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:1/13: link f2 d4/f5 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:1/14: dread - d4/f5 zero size 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:1/15: unlink d4/f5 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:1/16: dwrite f1 [0,4194304] 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/18: getdents . 
0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/19: dread - f6 zero size 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/20: write f6 [992058,119579] 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/21: creat f7 x:0 0 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/22: rename f4 to f8 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/23: creat f9 x:0 0 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/24: dread - f8 zero size 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/25: dwrite f1 [0,4194304] 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/26: creat fa x:0 0 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/27: mkdir db 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/28: mknod db/cc 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/29: link f3 db/fd 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/30: fsync f5 0 2026-03-09T16:14:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/31: dread - f8 zero size 2026-03-09T16:14:16.041 INFO:tasks.workunit.client.0.vm03.stdout:6/2: sync 2026-03-09T16:14:16.041 INFO:tasks.workunit.client.0.vm03.stdout:7/0: sync 2026-03-09T16:14:16.042 INFO:tasks.workunit.client.0.vm03.stdout:6/3: mknod c1 0 2026-03-09T16:14:16.042 INFO:tasks.workunit.client.0.vm03.stdout:6/4: readlink - no filename 2026-03-09T16:14:16.046 INFO:tasks.workunit.client.0.vm03.stdout:6/5: dwrite f0 [0,4194304] 0 2026-03-09T16:14:16.048 INFO:tasks.workunit.client.0.vm03.stdout:7/1: creat f0 x:0 0 0 2026-03-09T16:14:16.048 INFO:tasks.workunit.client.0.vm03.stdout:6/6: creat f2 x:0 0 0 2026-03-09T16:14:16.049 INFO:tasks.workunit.client.0.vm03.stdout:6/7: write f2 [186972,99285] 0 2026-03-09T16:14:16.049 INFO:tasks.workunit.client.0.vm03.stdout:7/2: write f0 [518654,124263] 0 2026-03-09T16:14:16.052 INFO:tasks.workunit.client.0.vm03.stdout:7/3: rename f0 to f1 0 2026-03-09T16:14:16.056 INFO:tasks.workunit.client.0.vm03.stdout:6/8: dwrite f0 [0,4194304] 0 2026-03-09T16:14:16.062 INFO:tasks.workunit.client.0.vm03.stdout:6/9: rename c1 to c3 0 2026-03-09T16:14:16.063 INFO:tasks.workunit.client.0.vm03.stdout:6/10: chown f2 1735739 1 2026-03-09T16:14:16.093 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:16 vm05.local ceph-mon[58702]: pgmap v139: 65 pgs: 65 active+clean; 176 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 4.4 MiB/s wr, 329 op/s 2026-03-09T16:14:16.093 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:16 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:16.093 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:16 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:16.093 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:16 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:16.131 INFO:tasks.workunit.client.0.vm03.stdout:8/20: fdatasync f6 0 2026-03-09T16:14:16.132 INFO:tasks.workunit.client.0.vm03.stdout:8/21: mkdir da 0 2026-03-09T16:14:16.133 INFO:tasks.workunit.client.0.vm03.stdout:8/22: mkdir da/db 0 2026-03-09T16:14:16.133 INFO:tasks.workunit.client.0.vm03.stdout:8/23: read - f7 zero size 2026-03-09T16:14:16.133 
INFO:tasks.workunit.client.0.vm03.stdout:8/24: chown f7 230 1 2026-03-09T16:14:16.134 INFO:tasks.workunit.client.0.vm03.stdout:8/25: creat da/fc x:0 0 0 2026-03-09T16:14:16.138 INFO:tasks.workunit.client.0.vm03.stdout:8/26: dread f6 [0,4194304] 0 2026-03-09T16:14:16.138 INFO:tasks.workunit.client.0.vm03.stdout:8/27: write f6 [1160901,81367] 0 2026-03-09T16:14:16.139 INFO:tasks.workunit.client.0.vm03.stdout:8/28: dread - f7 zero size 2026-03-09T16:14:16.139 INFO:tasks.workunit.client.0.vm03.stdout:8/29: readlink l9 0 2026-03-09T16:14:16.140 INFO:tasks.workunit.client.0.vm03.stdout:0/19: truncate d0/f3 722190 0 2026-03-09T16:14:16.141 INFO:tasks.workunit.client.0.vm03.stdout:8/30: creat da/fd x:0 0 0 2026-03-09T16:14:16.142 INFO:tasks.workunit.client.0.vm03.stdout:0/20: creat d0/f6 x:0 0 0 2026-03-09T16:14:16.146 INFO:tasks.workunit.client.0.vm03.stdout:8/31: rename f7 to da/db/fe 0 2026-03-09T16:14:16.147 INFO:tasks.workunit.client.0.vm03.stdout:0/21: dwrite d0/f6 [0,4194304] 0 2026-03-09T16:14:16.148 INFO:tasks.workunit.client.0.vm03.stdout:8/32: truncate da/fc 319206 0 2026-03-09T16:14:16.152 INFO:tasks.workunit.client.0.vm03.stdout:0/22: dwrite d0/f6 [0,4194304] 0 2026-03-09T16:14:16.158 INFO:tasks.workunit.client.0.vm03.stdout:5/12: fsync d2/f3 0 2026-03-09T16:14:16.161 INFO:tasks.workunit.client.0.vm03.stdout:4/26: fdatasync f3 0 2026-03-09T16:14:16.169 INFO:tasks.workunit.client.0.vm03.stdout:6/11: dread f2 [0,4194304] 0 2026-03-09T16:14:16.169 INFO:tasks.workunit.client.0.vm03.stdout:5/13: dwrite d2/f3 [4194304,4194304] 0 2026-03-09T16:14:16.173 INFO:tasks.workunit.client.0.vm03.stdout:6/12: dwrite f0 [0,4194304] 0 2026-03-09T16:14:16.175 INFO:tasks.workunit.client.0.vm03.stdout:8/33: rename da/fc to da/ff 0 2026-03-09T16:14:16.176 INFO:tasks.workunit.client.0.vm03.stdout:4/27: mknod d5/cc 0 2026-03-09T16:14:16.179 INFO:tasks.workunit.client.0.vm03.stdout:7/4: read f1 [282303,83776] 0 2026-03-09T16:14:16.185 INFO:tasks.workunit.client.0.vm03.stdout:7/5: stat f1 0 2026-03-09T16:14:16.185 INFO:tasks.workunit.client.0.vm03.stdout:5/14: mknod d2/d4/c6 0 2026-03-09T16:14:16.185 INFO:tasks.workunit.client.0.vm03.stdout:5/15: stat d2 0 2026-03-09T16:14:16.189 INFO:tasks.workunit.client.0.vm03.stdout:8/34: mkdir da/d10 0 2026-03-09T16:14:16.192 INFO:tasks.workunit.client.0.vm03.stdout:4/28: mkdir d5/dd 0 2026-03-09T16:14:16.206 INFO:tasks.workunit.client.0.vm03.stdout:4/29: dread - f1 zero size 2026-03-09T16:14:16.206 INFO:tasks.workunit.client.0.vm03.stdout:4/30: dread - d5/f9 zero size 2026-03-09T16:14:16.206 INFO:tasks.workunit.client.0.vm03.stdout:5/16: mkdir d2/d7 0 2026-03-09T16:14:16.206 INFO:tasks.workunit.client.0.vm03.stdout:7/6: creat f2 x:0 0 0 2026-03-09T16:14:16.206 INFO:tasks.workunit.client.0.vm03.stdout:7/7: chown f2 20 1 2026-03-09T16:14:16.206 INFO:tasks.workunit.client.0.vm03.stdout:7/8: read - f2 zero size 2026-03-09T16:14:16.206 INFO:tasks.workunit.client.0.vm03.stdout:4/31: fdatasync d5/fa 0 2026-03-09T16:14:16.206 INFO:tasks.workunit.client.0.vm03.stdout:5/17: chown d2/d4/c5 152 1 2026-03-09T16:14:16.206 INFO:tasks.workunit.client.0.vm03.stdout:5/18: dread d2/f3 [0,4194304] 0 2026-03-09T16:14:16.206 INFO:tasks.workunit.client.0.vm03.stdout:7/9: dwrite f1 [0,4194304] 0 2026-03-09T16:14:16.213 INFO:tasks.workunit.client.0.vm03.stdout:9/13: truncate d2/f3 116491 0 2026-03-09T16:14:16.213 INFO:tasks.workunit.client.0.vm03.stdout:4/32: getdents d5/db 0 2026-03-09T16:14:16.214 INFO:tasks.workunit.client.0.vm03.stdout:4/33: write d5/f7 [2744926,89158] 0 
2026-03-09T16:14:16.216 INFO:tasks.workunit.client.0.vm03.stdout:7/10: mknod c3 0 2026-03-09T16:14:16.217 INFO:tasks.workunit.client.0.vm03.stdout:5/19: mkdir d2/d7/d8 0 2026-03-09T16:14:16.221 INFO:tasks.workunit.client.0.vm03.stdout:7/11: mkdir d4 0 2026-03-09T16:14:16.228 INFO:tasks.workunit.client.0.vm03.stdout:7/12: chown f2 68537104 1 2026-03-09T16:14:16.228 INFO:tasks.workunit.client.0.vm03.stdout:5/20: mkdir d2/d4/d9 0 2026-03-09T16:14:16.228 INFO:tasks.workunit.client.0.vm03.stdout:3/30: getdents d5 0 2026-03-09T16:14:16.228 INFO:tasks.workunit.client.0.vm03.stdout:4/34: creat d5/dd/fe x:0 0 0 2026-03-09T16:14:16.228 INFO:tasks.workunit.client.0.vm03.stdout:3/31: write d5/f9 [459737,93663] 0 2026-03-09T16:14:16.228 INFO:tasks.workunit.client.0.vm03.stdout:2/32: getdents db 0 2026-03-09T16:14:16.231 INFO:tasks.workunit.client.0.vm03.stdout:4/35: getdents d5/db 0 2026-03-09T16:14:16.232 INFO:tasks.workunit.client.0.vm03.stdout:2/33: dread db/fd [0,4194304] 0 2026-03-09T16:14:16.232 INFO:tasks.workunit.client.0.vm03.stdout:5/21: link d2/d4/c5 d2/ca 0 2026-03-09T16:14:16.232 INFO:tasks.workunit.client.0.vm03.stdout:4/36: write d5/dd/fe [296362,68644] 0 2026-03-09T16:14:16.233 INFO:tasks.workunit.client.0.vm03.stdout:4/37: dread - d5/fa zero size 2026-03-09T16:14:16.234 INFO:tasks.workunit.client.0.vm03.stdout:2/34: chown f8 1 1 2026-03-09T16:14:16.236 INFO:tasks.workunit.client.0.vm03.stdout:4/38: creat d5/ff x:0 0 0 2026-03-09T16:14:16.236 INFO:tasks.workunit.client.0.vm03.stdout:2/35: creat db/fe x:0 0 0 2026-03-09T16:14:16.237 INFO:tasks.workunit.client.0.vm03.stdout:2/36: unlink f3 0 2026-03-09T16:14:16.237 INFO:tasks.workunit.client.0.vm03.stdout:2/37: truncate f7 432880 0 2026-03-09T16:14:16.237 INFO:tasks.workunit.client.0.vm03.stdout:7/13: dread f1 [0,4194304] 0 2026-03-09T16:14:16.238 INFO:tasks.workunit.client.0.vm03.stdout:2/38: rename db/fe to db/ff 0 2026-03-09T16:14:16.239 INFO:tasks.workunit.client.0.vm03.stdout:7/14: write f2 [134153,36839] 0 2026-03-09T16:14:16.239 INFO:tasks.workunit.client.0.vm03.stdout:2/39: write f9 [783156,29046] 0 2026-03-09T16:14:16.239 INFO:tasks.workunit.client.0.vm03.stdout:2/40: rename db to db/d10 22 2026-03-09T16:14:16.241 INFO:tasks.workunit.client.0.vm03.stdout:7/15: getdents d4 0 2026-03-09T16:14:16.241 INFO:tasks.workunit.client.0.vm03.stdout:2/41: mkdir db/d11 0 2026-03-09T16:14:16.243 INFO:tasks.workunit.client.0.vm03.stdout:2/42: rmdir db/d11 0 2026-03-09T16:14:16.259 INFO:tasks.workunit.client.0.vm03.stdout:2/43: truncate fa 742062 0 2026-03-09T16:14:16.259 INFO:tasks.workunit.client.0.vm03.stdout:2/44: mkdir db/d12 0 2026-03-09T16:14:16.259 INFO:tasks.workunit.client.0.vm03.stdout:2/45: creat db/f13 x:0 0 0 2026-03-09T16:14:16.259 INFO:tasks.workunit.client.0.vm03.stdout:2/46: truncate db/ff 236059 0 2026-03-09T16:14:16.265 INFO:tasks.workunit.client.0.vm03.stdout:8/35: sync 2026-03-09T16:14:16.270 INFO:tasks.workunit.client.0.vm03.stdout:6/13: getdents . 
0 2026-03-09T16:14:16.272 INFO:tasks.workunit.client.0.vm03.stdout:5/22: sync 2026-03-09T16:14:16.272 INFO:tasks.workunit.client.0.vm03.stdout:2/47: sync 2026-03-09T16:14:16.272 INFO:tasks.workunit.client.0.vm03.stdout:7/16: sync 2026-03-09T16:14:16.273 INFO:tasks.workunit.client.0.vm03.stdout:5/23: stat d2/f3 0 2026-03-09T16:14:16.290 INFO:tasks.workunit.client.0.vm03.stdout:3/32: truncate f3 113581 0 2026-03-09T16:14:16.291 INFO:tasks.workunit.client.0.vm03.stdout:7/17: truncate f2 885139 0 2026-03-09T16:14:16.293 INFO:tasks.workunit.client.0.vm03.stdout:0/23: dwrite d0/f6 [4194304,4194304] 0 2026-03-09T16:14:16.296 INFO:tasks.workunit.client.0.vm03.stdout:0/24: chown d0/c5 23 1 2026-03-09T16:14:16.297 INFO:tasks.workunit.client.0.vm03.stdout:6/14: dwrite f2 [0,4194304] 0 2026-03-09T16:14:16.298 INFO:tasks.workunit.client.0.vm03.stdout:6/15: write f2 [191009,77254] 0 2026-03-09T16:14:16.302 INFO:tasks.workunit.client.0.vm03.stdout:4/39: rmdir d5 39 2026-03-09T16:14:16.302 INFO:tasks.workunit.client.0.vm03.stdout:1/17: dwrite f2 [0,4194304] 0 2026-03-09T16:14:16.309 INFO:tasks.workunit.client.0.vm03.stdout:9/14: dwrite d2/f3 [0,4194304] 0 2026-03-09T16:14:16.315 INFO:tasks.workunit.client.0.vm03.stdout:4/40: dwrite f3 [0,4194304] 0 2026-03-09T16:14:16.317 INFO:tasks.workunit.client.0.vm03.stdout:5/24: dwrite d2/f3 [4194304,4194304] 0 2026-03-09T16:14:16.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:16 vm03.local ceph-mon[51019]: pgmap v139: 65 pgs: 65 active+clean; 176 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 4.4 MiB/s wr, 329 op/s 2026-03-09T16:14:16.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:16 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:16.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:16 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:16.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:16 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:16.394 INFO:tasks.workunit.client.0.vm03.stdout:1/18: fsync f2 0 2026-03-09T16:14:16.417 INFO:tasks.workunit.client.0.vm03.stdout:2/48: creat db/f14 x:0 0 0 2026-03-09T16:14:16.417 INFO:tasks.workunit.client.0.vm03.stdout:2/49: fsync f7 0 2026-03-09T16:14:16.418 INFO:tasks.workunit.client.0.vm03.stdout:8/36: unlink da/ff 0 2026-03-09T16:14:16.420 INFO:tasks.workunit.client.0.vm03.stdout:3/33: mknod d5/ca 0 2026-03-09T16:14:16.421 INFO:tasks.workunit.client.0.vm03.stdout:0/25: mkdir d0/d7 0 2026-03-09T16:14:16.423 INFO:tasks.workunit.client.0.vm03.stdout:6/16: unlink f2 0 2026-03-09T16:14:16.423 INFO:tasks.workunit.client.0.vm03.stdout:6/17: chown f0 61892503 1 2026-03-09T16:14:16.424 INFO:tasks.workunit.client.0.vm03.stdout:9/15: mkdir d2/d4 0 2026-03-09T16:14:16.428 INFO:tasks.workunit.client.0.vm03.stdout:1/19: mkdir d4/d6 0 2026-03-09T16:14:16.428 INFO:tasks.workunit.client.0.vm03.stdout:1/20: write f2 [276634,127602] 0 2026-03-09T16:14:16.430 INFO:tasks.workunit.client.0.vm03.stdout:8/37: unlink da/fd 0 2026-03-09T16:14:16.430 INFO:tasks.workunit.client.0.vm03.stdout:8/38: read - da/db/fe zero size 2026-03-09T16:14:16.431 INFO:tasks.workunit.client.0.vm03.stdout:7/18: symlink d4/l5 0 2026-03-09T16:14:16.435 INFO:tasks.workunit.client.0.vm03.stdout:6/18: unlink f0 0 2026-03-09T16:14:16.435 
INFO:tasks.workunit.client.0.vm03.stdout:6/19: dread - no filename 2026-03-09T16:14:16.435 INFO:tasks.workunit.client.0.vm03.stdout:6/20: fdatasync - no filename 2026-03-09T16:14:16.435 INFO:tasks.workunit.client.0.vm03.stdout:6/21: read - no filename 2026-03-09T16:14:16.436 INFO:tasks.workunit.client.0.vm03.stdout:6/22: dwrite - no filename 2026-03-09T16:14:16.436 INFO:tasks.workunit.client.0.vm03.stdout:9/16: rename d2/f3 to d2/f5 0 2026-03-09T16:14:16.437 INFO:tasks.workunit.client.0.vm03.stdout:9/17: chown d2/f5 786069226 1 2026-03-09T16:14:16.437 INFO:tasks.workunit.client.0.vm03.stdout:9/18: readlink - no filename 2026-03-09T16:14:16.437 INFO:tasks.workunit.client.0.vm03.stdout:9/19: read d2/f5 [1876821,41447] 0 2026-03-09T16:14:16.438 INFO:tasks.workunit.client.0.vm03.stdout:4/41: rename d5/ff to d5/f10 0 2026-03-09T16:14:16.439 INFO:tasks.workunit.client.0.vm03.stdout:4/42: write d5/f9 [991933,123434] 0 2026-03-09T16:14:16.440 INFO:tasks.workunit.client.0.vm03.stdout:4/43: write d5/fa [714038,103186] 0 2026-03-09T16:14:16.440 INFO:tasks.workunit.client.0.vm03.stdout:4/44: chown d5/fa 30 1 2026-03-09T16:14:16.440 INFO:tasks.workunit.client.0.vm03.stdout:4/45: fsync f3 0 2026-03-09T16:14:16.441 INFO:tasks.workunit.client.0.vm03.stdout:4/46: truncate d5/f10 317175 0 2026-03-09T16:14:16.441 INFO:tasks.workunit.client.0.vm03.stdout:4/47: chown f2 3 1 2026-03-09T16:14:16.442 INFO:tasks.workunit.client.0.vm03.stdout:4/48: truncate d5/f9 1722240 0 2026-03-09T16:14:16.442 INFO:tasks.workunit.client.0.vm03.stdout:4/49: truncate d5/f8 828356 0 2026-03-09T16:14:16.444 INFO:tasks.workunit.client.0.vm03.stdout:8/39: rename l9 to da/db/l11 0 2026-03-09T16:14:16.451 INFO:tasks.workunit.client.0.vm03.stdout:6/23: rename c3 to c4 0 2026-03-09T16:14:16.451 INFO:tasks.workunit.client.0.vm03.stdout:7/19: dwrite f1 [0,4194304] 0 2026-03-09T16:14:16.453 INFO:tasks.workunit.client.0.vm03.stdout:9/20: write d2/f5 [4774443,23845] 0 2026-03-09T16:14:16.458 INFO:tasks.workunit.client.0.vm03.stdout:4/50: mkdir d5/dd/d11 0 2026-03-09T16:14:16.460 INFO:tasks.workunit.client.0.vm03.stdout:8/40: symlink da/l12 0 2026-03-09T16:14:16.462 INFO:tasks.workunit.client.0.vm03.stdout:6/24: creat f5 x:0 0 0 2026-03-09T16:14:16.463 INFO:tasks.workunit.client.0.vm03.stdout:7/20: unlink f1 0 2026-03-09T16:14:16.463 INFO:tasks.workunit.client.0.vm03.stdout:7/21: chown d4/l5 828484 1 2026-03-09T16:14:16.465 INFO:tasks.workunit.client.0.vm03.stdout:5/25: link d2/ca d2/cb 0 2026-03-09T16:14:16.469 INFO:tasks.workunit.client.0.vm03.stdout:5/26: dwrite d2/f3 [0,4194304] 0 2026-03-09T16:14:16.483 INFO:tasks.workunit.client.0.vm03.stdout:5/27: dwrite d2/f3 [0,4194304] 0 2026-03-09T16:14:16.485 INFO:tasks.workunit.client.0.vm03.stdout:5/28: dread d2/f3 [4194304,4194304] 0 2026-03-09T16:14:16.486 INFO:tasks.workunit.client.0.vm03.stdout:8/41: creat da/db/f13 x:0 0 0 2026-03-09T16:14:16.486 INFO:tasks.workunit.client.0.vm03.stdout:8/42: readlink da/l12 0 2026-03-09T16:14:16.512 INFO:tasks.workunit.client.0.vm03.stdout:8/43: creat da/d10/f14 x:0 0 0 2026-03-09T16:14:16.513 INFO:tasks.workunit.client.0.vm03.stdout:5/29: link d2/ca d2/d4/d9/cc 0 2026-03-09T16:14:16.515 INFO:tasks.workunit.client.0.vm03.stdout:8/44: mkdir da/d15 0 2026-03-09T16:14:16.515 INFO:tasks.workunit.client.0.vm03.stdout:8/45: truncate f8 155210 0 2026-03-09T16:14:16.516 INFO:tasks.workunit.client.0.vm03.stdout:8/46: dread f8 [0,4194304] 0 2026-03-09T16:14:16.517 INFO:tasks.workunit.client.0.vm03.stdout:5/30: symlink d2/d4/ld 0 2026-03-09T16:14:16.519 
INFO:tasks.workunit.client.0.vm03.stdout:5/31: mkdir d2/d7/de 0 2026-03-09T16:14:16.525 INFO:tasks.workunit.client.0.vm03.stdout:5/32: dwrite d2/f3 [0,4194304] 0 2026-03-09T16:14:16.528 INFO:tasks.workunit.client.0.vm03.stdout:5/33: chown d2/cb 0 1 2026-03-09T16:14:16.528 INFO:tasks.workunit.client.0.vm03.stdout:5/34: rename d2 to d2/df 22 2026-03-09T16:14:16.531 INFO:tasks.workunit.client.0.vm03.stdout:5/35: dwrite d2/f3 [0,4194304] 0 2026-03-09T16:14:16.534 INFO:tasks.workunit.client.0.vm03.stdout:5/36: dread d2/f3 [4194304,4194304] 0 2026-03-09T16:14:16.615 INFO:tasks.workunit.client.0.vm03.stdout:3/34: getdents d5 0 2026-03-09T16:14:16.616 INFO:tasks.workunit.client.0.vm03.stdout:0/26: getdents d0 0 2026-03-09T16:14:16.618 INFO:tasks.workunit.client.0.vm03.stdout:3/35: creat d5/fb x:0 0 0 2026-03-09T16:14:16.622 INFO:tasks.workunit.client.0.vm03.stdout:0/27: dwrite d0/f3 [0,4194304] 0 2026-03-09T16:14:16.627 INFO:tasks.workunit.client.0.vm03.stdout:0/28: dwrite d0/f3 [0,4194304] 0 2026-03-09T16:14:16.630 INFO:tasks.workunit.client.0.vm03.stdout:3/36: rename d5/l7 to d5/lc 0 2026-03-09T16:14:16.637 INFO:tasks.workunit.client.0.vm03.stdout:3/37: dwrite d5/fb [0,4194304] 0 2026-03-09T16:14:16.647 INFO:tasks.workunit.client.0.vm03.stdout:9/21: unlink d2/f5 0 2026-03-09T16:14:16.649 INFO:tasks.workunit.client.0.vm03.stdout:0/29: creat d0/d7/f8 x:0 0 0 2026-03-09T16:14:16.650 INFO:tasks.workunit.client.0.vm03.stdout:0/30: chown d0/c5 4326 1 2026-03-09T16:14:16.651 INFO:tasks.workunit.client.0.vm03.stdout:8/47: readlink da/db/l11 0 2026-03-09T16:14:16.653 INFO:tasks.workunit.client.0.vm03.stdout:6/25: stat c4 0 2026-03-09T16:14:16.654 INFO:tasks.workunit.client.0.vm03.stdout:1/21: truncate f2 3357435 0 2026-03-09T16:14:16.664 INFO:tasks.workunit.client.0.vm03.stdout:3/38: truncate f4 2811307 0 2026-03-09T16:14:16.664 INFO:tasks.workunit.client.0.vm03.stdout:9/22: getdents d2/d4 0 2026-03-09T16:14:16.664 INFO:tasks.workunit.client.0.vm03.stdout:9/23: write - no filename 2026-03-09T16:14:16.664 INFO:tasks.workunit.client.0.vm03.stdout:0/31: creat d0/f9 x:0 0 0 2026-03-09T16:14:16.664 INFO:tasks.workunit.client.0.vm03.stdout:8/48: read f8 [2003,122293] 0 2026-03-09T16:14:16.664 INFO:tasks.workunit.client.0.vm03.stdout:6/26: rename c4 to c6 0 2026-03-09T16:14:16.667 INFO:tasks.workunit.client.0.vm03.stdout:6/27: dwrite f5 [0,4194304] 0 2026-03-09T16:14:16.667 INFO:tasks.workunit.client.0.vm03.stdout:6/28: rmdir - no directory 2026-03-09T16:14:16.673 INFO:tasks.workunit.client.0.vm03.stdout:8/49: dwrite da/d10/f14 [0,4194304] 0 2026-03-09T16:14:16.673 INFO:tasks.workunit.client.0.vm03.stdout:8/50: write da/db/f13 [24683,23203] 0 2026-03-09T16:14:16.678 INFO:tasks.workunit.client.0.vm03.stdout:4/51: truncate d5/f8 679089 0 2026-03-09T16:14:16.685 INFO:tasks.workunit.client.0.vm03.stdout:0/32: mkdir d0/da 0 2026-03-09T16:14:16.685 INFO:tasks.workunit.client.0.vm03.stdout:4/52: stat d5 0 2026-03-09T16:14:16.686 INFO:tasks.workunit.client.0.vm03.stdout:7/22: truncate f2 446953 0 2026-03-09T16:14:16.686 INFO:tasks.workunit.client.0.vm03.stdout:5/37: getdents d2/d4/d9 0 2026-03-09T16:14:16.686 INFO:tasks.workunit.client.0.vm03.stdout:6/29: creat f7 x:0 0 0 2026-03-09T16:14:16.686 INFO:tasks.workunit.client.0.vm03.stdout:6/30: chown f5 1308972 1 2026-03-09T16:14:16.686 INFO:tasks.workunit.client.0.vm03.stdout:6/31: truncate f7 518759 0 2026-03-09T16:14:16.686 INFO:tasks.workunit.client.0.vm03.stdout:6/32: write f5 [3709297,89321] 0 2026-03-09T16:14:16.686 INFO:tasks.workunit.client.0.vm03.stdout:8/51: 
mknod da/c16 0 2026-03-09T16:14:16.686 INFO:tasks.workunit.client.0.vm03.stdout:1/22: link c3 d4/c7 0 2026-03-09T16:14:16.686 INFO:tasks.workunit.client.0.vm03.stdout:7/23: mknod d4/c6 0 2026-03-09T16:14:16.687 INFO:tasks.workunit.client.0.vm03.stdout:4/53: creat d5/db/f12 x:0 0 0 2026-03-09T16:14:16.688 INFO:tasks.workunit.client.0.vm03.stdout:5/38: rename d2/d4/c5 to d2/d4/d9/c10 0 2026-03-09T16:14:16.689 INFO:tasks.workunit.client.0.vm03.stdout:8/52: mknod da/d15/c17 0 2026-03-09T16:14:16.690 INFO:tasks.workunit.client.0.vm03.stdout:4/54: creat d5/db/f13 x:0 0 0 2026-03-09T16:14:16.693 INFO:tasks.workunit.client.0.vm03.stdout:1/23: symlink d4/d6/l8 0 2026-03-09T16:14:16.709 INFO:tasks.workunit.client.0.vm03.stdout:4/55: dwrite d5/f7 [0,4194304] 0 2026-03-09T16:14:16.709 INFO:tasks.workunit.client.0.vm03.stdout:7/24: rename c3 to d4/c7 0 2026-03-09T16:14:16.709 INFO:tasks.workunit.client.0.vm03.stdout:4/56: dwrite d5/f9 [0,4194304] 0 2026-03-09T16:14:16.709 INFO:tasks.workunit.client.0.vm03.stdout:5/39: dwrite d2/f3 [4194304,4194304] 0 2026-03-09T16:14:16.713 INFO:tasks.workunit.client.0.vm03.stdout:4/57: rename f3 to d5/dd/d11/f14 0 2026-03-09T16:14:16.720 INFO:tasks.workunit.client.0.vm03.stdout:4/58: write d5/f7 [709574,77988] 0 2026-03-09T16:14:16.748 INFO:tasks.workunit.client.0.vm03.stdout:4/59: mknod d5/c15 0 2026-03-09T16:14:16.748 INFO:tasks.workunit.client.0.vm03.stdout:4/60: truncate d5/db/f13 865618 0 2026-03-09T16:14:16.748 INFO:tasks.workunit.client.0.vm03.stdout:4/61: stat d5/db 0 2026-03-09T16:14:16.748 INFO:tasks.workunit.client.0.vm03.stdout:4/62: chown d5 568 1 2026-03-09T16:14:16.748 INFO:tasks.workunit.client.0.vm03.stdout:4/63: dwrite d5/f7 [4194304,4194304] 0 2026-03-09T16:14:16.748 INFO:tasks.workunit.client.0.vm03.stdout:4/64: rmdir d5/dd/d11 39 2026-03-09T16:14:16.752 INFO:tasks.workunit.client.0.vm03.stdout:2/50: mknod db/c15 0 2026-03-09T16:14:16.766 INFO:tasks.workunit.client.0.vm03.stdout:3/39: sync 2026-03-09T16:14:16.831 INFO:tasks.workunit.client.0.vm03.stdout:0/33: sync 2026-03-09T16:14:16.831 INFO:tasks.workunit.client.0.vm03.stdout:0/34: stat d0/d7/f8 0 2026-03-09T16:14:16.832 INFO:tasks.workunit.client.0.vm03.stdout:2/51: sync 2026-03-09T16:14:16.832 INFO:tasks.workunit.client.0.vm03.stdout:8/53: sync 2026-03-09T16:14:16.832 INFO:tasks.workunit.client.0.vm03.stdout:4/65: sync 2026-03-09T16:14:16.832 INFO:tasks.workunit.client.0.vm03.stdout:8/54: readlink da/l12 0 2026-03-09T16:14:16.833 INFO:tasks.workunit.client.0.vm03.stdout:4/66: write d5/fa [709132,77450] 0 2026-03-09T16:14:16.833 INFO:tasks.workunit.client.0.vm03.stdout:4/67: fdatasync d5/db/f13 0 2026-03-09T16:14:16.835 INFO:tasks.workunit.client.0.vm03.stdout:8/55: mknod da/c18 0 2026-03-09T16:14:16.837 INFO:tasks.workunit.client.0.vm03.stdout:4/68: write d5/dd/d11/f14 [7838223,69547] 0 2026-03-09T16:14:16.838 INFO:tasks.workunit.client.0.vm03.stdout:0/35: write d0/f3 [4731379,36388] 0 2026-03-09T16:14:16.842 INFO:tasks.workunit.client.0.vm03.stdout:9/24: getdents d2 0 2026-03-09T16:14:16.842 INFO:tasks.workunit.client.0.vm03.stdout:9/25: fdatasync - no filename 2026-03-09T16:14:16.842 INFO:tasks.workunit.client.0.vm03.stdout:9/26: fsync - no filename 2026-03-09T16:14:16.842 INFO:tasks.workunit.client.0.vm03.stdout:9/27: truncate - no filename 2026-03-09T16:14:16.845 INFO:tasks.workunit.client.0.vm03.stdout:2/52: symlink db/d12/l16 0 2026-03-09T16:14:16.846 INFO:tasks.workunit.client.0.vm03.stdout:9/28: rmdir d2 39 2026-03-09T16:14:16.847 INFO:tasks.workunit.client.0.vm03.stdout:2/53: symlink 
db/d12/l17 0 2026-03-09T16:14:16.848 INFO:tasks.workunit.client.0.vm03.stdout:4/69: rename d5/db/f12 to d5/dd/f16 0 2026-03-09T16:14:16.848 INFO:tasks.workunit.client.0.vm03.stdout:4/70: chown d5/db 6 1 2026-03-09T16:14:16.849 INFO:tasks.workunit.client.0.vm03.stdout:4/71: write f1 [343178,53758] 0 2026-03-09T16:14:16.851 INFO:tasks.workunit.client.0.vm03.stdout:0/36: rename d0/c4 to d0/da/cb 0 2026-03-09T16:14:16.851 INFO:tasks.workunit.client.0.vm03.stdout:0/37: read - d0/f9 zero size 2026-03-09T16:14:16.852 INFO:tasks.workunit.client.0.vm03.stdout:9/29: mknod d2/d4/c6 0 2026-03-09T16:14:16.852 INFO:tasks.workunit.client.0.vm03.stdout:9/30: write - no filename 2026-03-09T16:14:16.852 INFO:tasks.workunit.client.0.vm03.stdout:9/31: truncate - no filename 2026-03-09T16:14:16.854 INFO:tasks.workunit.client.0.vm03.stdout:9/32: creat d2/f7 x:0 0 0 2026-03-09T16:14:16.854 INFO:tasks.workunit.client.0.vm03.stdout:9/33: readlink - no filename 2026-03-09T16:14:16.858 INFO:tasks.workunit.client.0.vm03.stdout:9/34: dwrite d2/f7 [0,4194304] 0 2026-03-09T16:14:16.861 INFO:tasks.workunit.client.0.vm03.stdout:2/54: rename f1 to db/f18 0 2026-03-09T16:14:16.868 INFO:tasks.workunit.client.0.vm03.stdout:9/35: creat d2/f8 x:0 0 0 2026-03-09T16:14:16.868 INFO:tasks.workunit.client.0.vm03.stdout:9/36: readlink - no filename 2026-03-09T16:14:16.868 INFO:tasks.workunit.client.0.vm03.stdout:9/37: truncate d2/f8 194160 0 2026-03-09T16:14:16.870 INFO:tasks.workunit.client.0.vm03.stdout:2/55: mknod db/d12/c19 0 2026-03-09T16:14:16.870 INFO:tasks.workunit.client.0.vm03.stdout:9/38: mkdir d2/d4/d9 0 2026-03-09T16:14:16.871 INFO:tasks.workunit.client.0.vm03.stdout:2/56: mknod db/d12/c1a 0 2026-03-09T16:14:16.872 INFO:tasks.workunit.client.0.vm03.stdout:2/57: write f5 [89770,54317] 0 2026-03-09T16:14:16.878 INFO:tasks.workunit.client.0.vm03.stdout:2/58: link db/d12/c19 db/d12/c1b 0 2026-03-09T16:14:16.878 INFO:tasks.workunit.client.0.vm03.stdout:2/59: chown db/f13 3 1 2026-03-09T16:14:16.882 INFO:tasks.workunit.client.0.vm03.stdout:2/60: stat db/d12/l16 0 2026-03-09T16:14:16.882 INFO:tasks.workunit.client.0.vm03.stdout:2/61: symlink db/d12/l1c 0 2026-03-09T16:14:16.925 INFO:tasks.workunit.client.0.vm03.stdout:5/40: fdatasync d2/f3 0 2026-03-09T16:14:16.926 INFO:tasks.workunit.client.0.vm03.stdout:0/38: fsync d0/f3 0 2026-03-09T16:14:16.926 INFO:tasks.workunit.client.0.vm03.stdout:0/39: readlink - no filename 2026-03-09T16:14:16.927 INFO:tasks.workunit.client.0.vm03.stdout:0/40: unlink d0/f6 0 2026-03-09T16:14:16.927 INFO:tasks.workunit.client.0.vm03.stdout:0/41: stat d0/c5 0 2026-03-09T16:14:16.936 INFO:tasks.workunit.client.0.vm03.stdout:1/24: dwrite f2 [0,4194304] 0 2026-03-09T16:14:16.939 INFO:tasks.workunit.client.0.vm03.stdout:6/33: chown c6 0 1 2026-03-09T16:14:16.942 INFO:tasks.workunit.client.0.vm03.stdout:1/25: dread f2 [0,4194304] 0 2026-03-09T16:14:16.948 INFO:tasks.workunit.client.0.vm03.stdout:6/34: sync 2026-03-09T16:14:16.948 INFO:tasks.workunit.client.0.vm03.stdout:6/35: chown f7 434 1 2026-03-09T16:14:16.954 INFO:tasks.workunit.client.0.vm03.stdout:1/26: creat d4/d6/f9 x:0 0 0 2026-03-09T16:14:16.957 INFO:tasks.workunit.client.0.vm03.stdout:4/72: rename d5/dd/d11 to d5/d17 0 2026-03-09T16:14:16.957 INFO:tasks.workunit.client.0.vm03.stdout:4/73: chown d5/dd 248705 1 2026-03-09T16:14:16.958 INFO:tasks.workunit.client.0.vm03.stdout:1/27: creat d4/fa x:0 0 0 2026-03-09T16:14:16.959 INFO:tasks.workunit.client.0.vm03.stdout:5/41: rename d2/d4 to d2/d7/de/d11 0 2026-03-09T16:14:16.962 
INFO:tasks.workunit.client.0.vm03.stdout:4/74: creat d5/d17/f18 x:0 0 0 2026-03-09T16:14:16.974 INFO:tasks.workunit.client.0.vm03.stdout:5/42: dwrite d2/f3 [4194304,4194304] 0 2026-03-09T16:14:16.974 INFO:tasks.workunit.client.0.vm03.stdout:8/56: rmdir da 39 2026-03-09T16:14:16.974 INFO:tasks.workunit.client.0.vm03.stdout:5/43: write d2/f3 [7638153,50685] 0 2026-03-09T16:14:16.974 INFO:tasks.workunit.client.0.vm03.stdout:1/28: mkdir d4/db 0 2026-03-09T16:14:16.974 INFO:tasks.workunit.client.0.vm03.stdout:1/29: chown d4/d6/l8 8267897 1 2026-03-09T16:14:16.974 INFO:tasks.workunit.client.0.vm03.stdout:5/44: chown d2/d7/de/d11/c6 3 1 2026-03-09T16:14:16.974 INFO:tasks.workunit.client.0.vm03.stdout:1/30: truncate d4/d6/f9 314806 0 2026-03-09T16:14:16.974 INFO:tasks.workunit.client.0.vm03.stdout:1/31: truncate f1 5216033 0 2026-03-09T16:14:16.974 INFO:tasks.workunit.client.0.vm03.stdout:8/57: dwrite f6 [0,4194304] 0 2026-03-09T16:14:16.974 INFO:tasks.workunit.client.0.vm03.stdout:1/32: unlink c3 0 2026-03-09T16:14:16.976 INFO:tasks.workunit.client.0.vm03.stdout:1/33: write f1 [2551454,83905] 0 2026-03-09T16:14:16.976 INFO:tasks.workunit.client.0.vm03.stdout:5/45: dread d2/f3 [0,4194304] 0 2026-03-09T16:14:16.978 INFO:tasks.workunit.client.0.vm03.stdout:5/46: symlink d2/d7/de/l12 0 2026-03-09T16:14:16.978 INFO:tasks.workunit.client.0.vm03.stdout:1/34: rename d4/d6/l8 to d4/d6/lc 0 2026-03-09T16:14:16.978 INFO:tasks.workunit.client.0.vm03.stdout:5/47: readlink d2/d7/de/d11/ld 0 2026-03-09T16:14:16.982 INFO:tasks.workunit.client.0.vm03.stdout:5/48: mknod d2/d7/de/d11/d9/c13 0 2026-03-09T16:14:16.982 INFO:tasks.workunit.client.0.vm03.stdout:5/49: readlink d2/d7/de/l12 0 2026-03-09T16:14:16.986 INFO:tasks.workunit.client.0.vm03.stdout:5/50: rename d2/d7/de/l12 to d2/l14 0 2026-03-09T16:14:16.993 INFO:tasks.workunit.client.0.vm03.stdout:5/51: write d2/f3 [8640768,70220] 0 2026-03-09T16:14:16.993 INFO:tasks.workunit.client.0.vm03.stdout:5/52: dwrite d2/f3 [0,4194304] 0 2026-03-09T16:14:16.997 INFO:tasks.workunit.client.0.vm03.stdout:5/53: dread d2/f3 [4194304,4194304] 0 2026-03-09T16:14:16.999 INFO:tasks.workunit.client.0.vm03.stdout:5/54: rmdir d2 39 2026-03-09T16:14:17.037 INFO:tasks.workunit.client.0.vm03.stdout:5/55: dwrite d2/f3 [0,4194304] 0 2026-03-09T16:14:17.037 INFO:tasks.workunit.client.0.vm03.stdout:5/56: mknod d2/c15 0 2026-03-09T16:14:17.037 INFO:tasks.workunit.client.0.vm03.stdout:5/57: dread d2/f3 [4194304,4194304] 0 2026-03-09T16:14:17.074 INFO:tasks.workunit.client.0.vm03.stdout:0/42: getdents d0/da 0 2026-03-09T16:14:17.075 INFO:tasks.workunit.client.0.vm03.stdout:9/39: write d2/f7 [4289920,30961] 0 2026-03-09T16:14:17.077 INFO:tasks.workunit.client.0.vm03.stdout:0/43: dread d0/f3 [0,4194304] 0 2026-03-09T16:14:17.079 INFO:tasks.workunit.client.0.vm03.stdout:0/44: mkdir d0/da/dc 0 2026-03-09T16:14:17.089 INFO:tasks.workunit.client.0.vm03.stdout:0/45: fdatasync d0/f3 0 2026-03-09T16:14:17.089 INFO:tasks.workunit.client.0.vm03.stdout:0/46: chown d0/da/dc 9630004 1 2026-03-09T16:14:17.089 INFO:tasks.workunit.client.0.vm03.stdout:0/47: creat d0/da/dc/fd x:0 0 0 2026-03-09T16:14:17.089 INFO:tasks.workunit.client.0.vm03.stdout:0/48: stat d0/c5 0 2026-03-09T16:14:17.089 INFO:tasks.workunit.client.0.vm03.stdout:1/35: fdatasync f2 0 2026-03-09T16:14:17.090 INFO:tasks.workunit.client.0.vm03.stdout:0/49: creat d0/da/fe x:0 0 0 2026-03-09T16:14:17.090 INFO:tasks.workunit.client.0.vm03.stdout:0/50: unlink d0/c5 0 2026-03-09T16:14:17.090 INFO:tasks.workunit.client.0.vm03.stdout:1/36: link f2 d4/fd 
0 2026-03-09T16:14:17.090 INFO:tasks.workunit.client.0.vm03.stdout:1/37: dread - d4/fa zero size 2026-03-09T16:14:17.090 INFO:tasks.workunit.client.0.vm03.stdout:0/51: dwrite d0/da/dc/fd [0,4194304] 0 2026-03-09T16:14:17.096 INFO:tasks.workunit.client.0.vm03.stdout:0/52: creat d0/da/ff x:0 0 0 2026-03-09T16:14:17.097 INFO:tasks.workunit.client.0.vm03.stdout:1/38: dwrite f2 [0,4194304] 0 2026-03-09T16:14:17.098 INFO:tasks.workunit.client.0.vm03.stdout:1/39: chown d4/d6 3336117 1 2026-03-09T16:14:17.098 INFO:tasks.workunit.client.0.vm03.stdout:0/53: creat d0/da/f10 x:0 0 0 2026-03-09T16:14:17.099 INFO:tasks.workunit.client.0.vm03.stdout:0/54: stat d0/d7 0 2026-03-09T16:14:17.102 INFO:tasks.workunit.client.0.vm03.stdout:0/55: stat d0/da/cb 0 2026-03-09T16:14:17.110 INFO:tasks.workunit.client.0.vm03.stdout:0/56: dwrite d0/da/fe [0,4194304] 0 2026-03-09T16:14:17.127 INFO:tasks.workunit.client.0.vm03.stdout:0/57: mkdir d0/da/d11 0 2026-03-09T16:14:17.127 INFO:tasks.workunit.client.0.vm03.stdout:0/58: creat d0/da/dc/f12 x:0 0 0 2026-03-09T16:14:17.127 INFO:tasks.workunit.client.0.vm03.stdout:0/59: dread d0/da/fe [0,4194304] 0 2026-03-09T16:14:17.295 INFO:tasks.workunit.client.0.vm03.stdout:9/40: fsync d2/f7 0 2026-03-09T16:14:17.295 INFO:tasks.workunit.client.0.vm03.stdout:3/40: write f1 [1609809,86326] 0 2026-03-09T16:14:17.297 INFO:tasks.workunit.client.0.vm03.stdout:2/62: dwrite db/fd [0,4194304] 0 2026-03-09T16:14:17.298 INFO:tasks.workunit.client.0.vm03.stdout:2/63: fsync fa 0 2026-03-09T16:14:17.299 INFO:tasks.workunit.client.0.vm03.stdout:7/25: truncate f2 1251645 0 2026-03-09T16:14:17.306 INFO:tasks.workunit.client.0.vm03.stdout:9/41: dread d2/f7 [0,4194304] 0 2026-03-09T16:14:17.311 INFO:tasks.workunit.client.0.vm03.stdout:6/36: truncate f5 1775568 0 2026-03-09T16:14:17.313 INFO:tasks.workunit.client.0.vm03.stdout:4/75: truncate d5/dd/fe 358942 0 2026-03-09T16:14:17.315 INFO:tasks.workunit.client.0.vm03.stdout:8/58: truncate f6 4159282 0 2026-03-09T16:14:17.318 INFO:tasks.workunit.client.0.vm03.stdout:1/40: readlink d4/d6/lc 0 2026-03-09T16:14:17.319 INFO:tasks.workunit.client.0.vm03.stdout:5/58: rename d2/d7/de/d11/d9 to d2/d7/d8/d16 0 2026-03-09T16:14:17.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:17 vm03.local ceph-mon[51019]: pgmap v140: 65 pgs: 65 active+clean; 186 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 4.2 MiB/s wr, 333 op/s 2026-03-09T16:14:17.417 INFO:tasks.workunit.client.0.vm03.stdout:7/26: creat d4/f8 x:0 0 0 2026-03-09T16:14:17.417 INFO:tasks.workunit.client.0.vm03.stdout:9/42: creat d2/d4/fa x:0 0 0 2026-03-09T16:14:17.419 INFO:tasks.workunit.client.0.vm03.stdout:4/76: creat d5/d17/f19 x:0 0 0 2026-03-09T16:14:17.420 INFO:tasks.workunit.client.0.vm03.stdout:8/59: unlink f3 0 2026-03-09T16:14:17.425 INFO:tasks.workunit.client.0.vm03.stdout:3/41: unlink f3 0 2026-03-09T16:14:17.425 INFO:tasks.workunit.client.0.vm03.stdout:3/42: fdatasync f4 0 2026-03-09T16:14:17.428 INFO:tasks.workunit.client.0.vm03.stdout:4/77: symlink d5/db/l1a 0 2026-03-09T16:14:17.430 INFO:tasks.workunit.client.0.vm03.stdout:4/78: dwrite d5/f9 [0,4194304] 0 2026-03-09T16:14:17.431 INFO:tasks.workunit.client.0.vm03.stdout:0/60: rename d0/da/f10 to d0/da/d11/f13 0 2026-03-09T16:14:17.431 INFO:tasks.workunit.client.0.vm03.stdout:9/43: rename d2 to d2/d4/d9/db 22 2026-03-09T16:14:17.431 INFO:tasks.workunit.client.0.vm03.stdout:9/44: chown d2/f8 130730 1 2026-03-09T16:14:17.432 INFO:tasks.workunit.client.0.vm03.stdout:9/45: read d2/f8 [115803,75301] 0 2026-03-09T16:14:17.432 
INFO:tasks.workunit.client.0.vm03.stdout:0/61: chown d0/da/cb 6 1 2026-03-09T16:14:17.434 INFO:tasks.workunit.client.0.vm03.stdout:4/79: dread d5/f7 [0,4194304] 0 2026-03-09T16:14:17.442 INFO:tasks.workunit.client.0.vm03.stdout:3/43: creat d5/fd x:0 0 0 2026-03-09T16:14:17.449 INFO:tasks.workunit.client.0.vm03.stdout:8/60: mknod da/db/c19 0 2026-03-09T16:14:17.451 INFO:tasks.workunit.client.0.vm03.stdout:0/62: symlink d0/da/l14 0 2026-03-09T16:14:17.452 INFO:tasks.workunit.client.0.vm03.stdout:0/63: write d0/da/dc/fd [3564367,107669] 0 2026-03-09T16:14:17.455 INFO:tasks.workunit.client.0.vm03.stdout:0/64: dwrite d0/da/dc/f12 [0,4194304] 0 2026-03-09T16:14:17.455 INFO:tasks.workunit.client.0.vm03.stdout:0/65: rename d0 to d0/d15 22 2026-03-09T16:14:17.456 INFO:tasks.workunit.client.0.vm03.stdout:0/66: chown d0/f9 241 1 2026-03-09T16:14:17.457 INFO:tasks.workunit.client.0.vm03.stdout:0/67: read - d0/d7/f8 zero size 2026-03-09T16:14:17.458 INFO:tasks.workunit.client.0.vm03.stdout:0/68: write d0/da/dc/f12 [522800,101014] 0 2026-03-09T16:14:17.458 INFO:tasks.workunit.client.0.vm03.stdout:2/64: link db/d12/l1c db/l1d 0 2026-03-09T16:14:17.460 INFO:tasks.workunit.client.0.vm03.stdout:8/61: creat da/f1a x:0 0 0 2026-03-09T16:14:17.462 INFO:tasks.workunit.client.0.vm03.stdout:5/59: getdents d2/d7 0 2026-03-09T16:14:17.467 INFO:tasks.workunit.client.0.vm03.stdout:2/65: chown db/cc 97 1 2026-03-09T16:14:17.467 INFO:tasks.workunit.client.0.vm03.stdout:2/66: readlink db/d12/l17 0 2026-03-09T16:14:17.468 INFO:tasks.workunit.client.0.vm03.stdout:0/69: write d0/da/d11/f13 [561576,9166] 0 2026-03-09T16:14:17.470 INFO:tasks.workunit.client.0.vm03.stdout:8/62: rename da/f1a to da/d15/f1b 0 2026-03-09T16:14:17.471 INFO:tasks.workunit.client.0.vm03.stdout:5/60: mknod d2/d7/d8/d16/c17 0 2026-03-09T16:14:17.491 INFO:tasks.workunit.client.0.vm03.stdout:8/63: dread da/db/f13 [0,4194304] 0 2026-03-09T16:14:17.491 INFO:tasks.workunit.client.0.vm03.stdout:5/61: write d2/f3 [1528032,74651] 0 2026-03-09T16:14:17.491 INFO:tasks.workunit.client.0.vm03.stdout:2/67: fdatasync db/ff 0 2026-03-09T16:14:17.491 INFO:tasks.workunit.client.0.vm03.stdout:5/62: mknod d2/d7/d8/c18 0 2026-03-09T16:14:17.491 INFO:tasks.workunit.client.0.vm03.stdout:5/63: dread d2/f3 [0,4194304] 0 2026-03-09T16:14:17.491 INFO:tasks.workunit.client.0.vm03.stdout:2/68: fsync db/f18 0 2026-03-09T16:14:17.491 INFO:tasks.workunit.client.0.vm03.stdout:2/69: dread - db/f14 zero size 2026-03-09T16:14:17.491 INFO:tasks.workunit.client.0.vm03.stdout:5/64: write d2/f3 [5489805,30177] 0 2026-03-09T16:14:17.491 INFO:tasks.workunit.client.0.vm03.stdout:2/70: read db/f18 [497747,91077] 0 2026-03-09T16:14:17.492 INFO:tasks.workunit.client.0.vm03.stdout:5/65: dread d2/f3 [8388608,4194304] 0 2026-03-09T16:14:17.492 INFO:tasks.workunit.client.0.vm03.stdout:2/71: mkdir db/d1e 0 2026-03-09T16:14:17.492 INFO:tasks.workunit.client.0.vm03.stdout:5/66: dread d2/f3 [0,4194304] 0 2026-03-09T16:14:17.492 INFO:tasks.workunit.client.0.vm03.stdout:5/67: mkdir d2/d7/de/d11/d19 0 2026-03-09T16:14:17.492 INFO:tasks.workunit.client.0.vm03.stdout:2/72: mkdir db/d1e/d1f 0 2026-03-09T16:14:17.492 INFO:tasks.workunit.client.0.vm03.stdout:5/68: mkdir d2/d7/d1a 0 2026-03-09T16:14:17.493 INFO:tasks.workunit.client.0.vm03.stdout:2/73: dwrite db/ff [0,4194304] 0 2026-03-09T16:14:17.494 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:17 vm05.local ceph-mon[58702]: pgmap v140: 65 pgs: 65 active+clean; 186 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 4.2 MiB/s wr, 333 op/s 
2026-03-09T16:14:17.495 INFO:tasks.workunit.client.0.vm03.stdout:2/74: dread - db/f14 zero size 2026-03-09T16:14:17.498 INFO:tasks.workunit.client.0.vm03.stdout:3/44: sync 2026-03-09T16:14:17.500 INFO:tasks.workunit.client.0.vm03.stdout:5/69: readlink d2/l14 0 2026-03-09T16:14:17.508 INFO:tasks.workunit.client.0.vm03.stdout:2/75: chown db/d12/c19 6 1 2026-03-09T16:14:17.509 INFO:tasks.workunit.client.0.vm03.stdout:3/45: symlink d5/le 0 2026-03-09T16:14:17.511 INFO:tasks.workunit.client.0.vm03.stdout:2/76: dread db/f18 [0,4194304] 0 2026-03-09T16:14:17.512 INFO:tasks.workunit.client.0.vm03.stdout:2/77: chown db/d12/l17 550845 1 2026-03-09T16:14:17.516 INFO:tasks.workunit.client.0.vm03.stdout:5/70: write d2/f3 [7304577,2606] 0 2026-03-09T16:14:17.519 INFO:tasks.workunit.client.0.vm03.stdout:2/78: creat db/d1e/f20 x:0 0 0 2026-03-09T16:14:17.520 INFO:tasks.workunit.client.0.vm03.stdout:2/79: write db/fd [2180283,101541] 0 2026-03-09T16:14:17.522 INFO:tasks.workunit.client.0.vm03.stdout:3/46: dwrite d5/f6 [4194304,4194304] 0 2026-03-09T16:14:17.533 INFO:tasks.workunit.client.0.vm03.stdout:0/70: fdatasync d0/da/d11/f13 0 2026-03-09T16:14:17.543 INFO:tasks.workunit.client.0.vm03.stdout:3/47: unlink f4 0 2026-03-09T16:14:17.543 INFO:tasks.workunit.client.0.vm03.stdout:3/48: write d5/fb [3083416,47390] 0 2026-03-09T16:14:17.545 INFO:tasks.workunit.client.0.vm03.stdout:0/71: creat d0/d7/f16 x:0 0 0 2026-03-09T16:14:17.548 INFO:tasks.workunit.client.0.vm03.stdout:3/49: rename d5/le to d5/lf 0 2026-03-09T16:14:17.551 INFO:tasks.workunit.client.0.vm03.stdout:2/80: rmdir db/d1e/d1f 0 2026-03-09T16:14:17.553 INFO:tasks.workunit.client.0.vm03.stdout:3/50: dwrite d5/f6 [0,4194304] 0 2026-03-09T16:14:17.558 INFO:tasks.workunit.client.0.vm03.stdout:9/46: rmdir d2 39 2026-03-09T16:14:17.561 INFO:tasks.workunit.client.0.vm03.stdout:2/81: dwrite db/fd [4194304,4194304] 0 2026-03-09T16:14:17.564 INFO:tasks.workunit.client.0.vm03.stdout:2/82: fdatasync f0 0 2026-03-09T16:14:17.565 INFO:tasks.workunit.client.0.vm03.stdout:2/83: fdatasync f9 0 2026-03-09T16:14:17.565 INFO:tasks.workunit.client.0.vm03.stdout:2/84: chown db/fd 11 1 2026-03-09T16:14:17.565 INFO:tasks.workunit.client.0.vm03.stdout:2/85: stat fa 0 2026-03-09T16:14:17.571 INFO:tasks.workunit.client.0.vm03.stdout:6/37: read f5 [973268,59777] 0 2026-03-09T16:14:17.573 INFO:tasks.workunit.client.0.vm03.stdout:3/51: rename d5/f9 to d5/f10 0 2026-03-09T16:14:17.579 INFO:tasks.workunit.client.0.vm03.stdout:7/27: dread f2 [0,4194304] 0 2026-03-09T16:14:17.580 INFO:tasks.workunit.client.0.vm03.stdout:2/86: creat db/d12/f21 x:0 0 0 2026-03-09T16:14:17.586 INFO:tasks.workunit.client.0.vm03.stdout:4/80: dread d5/dd/fe [0,4194304] 0 2026-03-09T16:14:17.592 INFO:tasks.workunit.client.0.vm03.stdout:1/41: dwrite d4/fd [4194304,4194304] 0 2026-03-09T16:14:17.601 INFO:tasks.workunit.client.0.vm03.stdout:6/38: rename f5 to f8 0 2026-03-09T16:14:17.601 INFO:tasks.workunit.client.0.vm03.stdout:3/52: write d5/f10 [357301,122494] 0 2026-03-09T16:14:17.602 INFO:tasks.workunit.client.0.vm03.stdout:0/72: getdents d0 0 2026-03-09T16:14:17.603 INFO:tasks.workunit.client.0.vm03.stdout:0/73: write d0/f9 [835149,77398] 0 2026-03-09T16:14:17.603 INFO:tasks.workunit.client.0.vm03.stdout:0/74: stat d0 0 2026-03-09T16:14:17.604 INFO:tasks.workunit.client.0.vm03.stdout:0/75: truncate d0/da/ff 913974 0 2026-03-09T16:14:17.604 INFO:tasks.workunit.client.0.vm03.stdout:0/76: fdatasync d0/d7/f8 0 2026-03-09T16:14:17.607 INFO:tasks.workunit.client.0.vm03.stdout:9/47: unlink d2/d4/c6 0 
2026-03-09T16:14:17.610 INFO:tasks.workunit.client.0.vm03.stdout:7/28: mknod d4/c9 0 2026-03-09T16:14:17.611 INFO:tasks.workunit.client.0.vm03.stdout:8/64: dwrite f8 [0,4194304] 0 2026-03-09T16:14:17.613 INFO:tasks.workunit.client.0.vm03.stdout:8/65: readlink da/l12 0 2026-03-09T16:14:17.613 INFO:tasks.workunit.client.0.vm03.stdout:7/29: write d4/f8 [333759,69638] 0 2026-03-09T16:14:17.621 INFO:tasks.workunit.client.0.vm03.stdout:8/66: dwrite da/d10/f14 [0,4194304] 0 2026-03-09T16:14:17.629 INFO:tasks.workunit.client.0.vm03.stdout:4/81: dwrite d5/dd/f16 [0,4194304] 0 2026-03-09T16:14:17.633 INFO:tasks.workunit.client.0.vm03.stdout:2/87: dwrite f8 [0,4194304] 0 2026-03-09T16:14:17.640 INFO:tasks.workunit.client.0.vm03.stdout:5/71: truncate d2/f3 6165877 0 2026-03-09T16:14:17.643 INFO:tasks.workunit.client.0.vm03.stdout:0/77: sync 2026-03-09T16:14:17.644 INFO:tasks.workunit.client.0.vm03.stdout:0/78: chown d0/da/dc 1 1 2026-03-09T16:14:17.644 INFO:tasks.workunit.client.0.vm03.stdout:3/53: rmdir d5 39 2026-03-09T16:14:17.646 INFO:tasks.workunit.client.0.vm03.stdout:8/67: fdatasync f6 0 2026-03-09T16:14:17.650 INFO:tasks.workunit.client.0.vm03.stdout:8/68: dwrite f6 [0,4194304] 0 2026-03-09T16:14:17.652 INFO:tasks.workunit.client.0.vm03.stdout:1/42: mknod d4/ce 0 2026-03-09T16:14:17.660 INFO:tasks.workunit.client.0.vm03.stdout:4/82: dread d5/f8 [0,4194304] 0 2026-03-09T16:14:17.662 INFO:tasks.workunit.client.0.vm03.stdout:5/72: mknod d2/d7/de/d11/c1b 0 2026-03-09T16:14:17.668 INFO:tasks.workunit.client.0.vm03.stdout:0/79: dwrite d0/da/fe [0,4194304] 0 2026-03-09T16:14:17.677 INFO:tasks.workunit.client.0.vm03.stdout:1/43: chown d4/fa 75928776 1 2026-03-09T16:14:17.679 INFO:tasks.workunit.client.0.vm03.stdout:4/83: mkdir d5/db/d1b 0 2026-03-09T16:14:17.683 INFO:tasks.workunit.client.0.vm03.stdout:0/80: creat d0/da/f17 x:0 0 0 2026-03-09T16:14:17.695 INFO:tasks.workunit.client.0.vm03.stdout:7/30: dread d4/f8 [0,4194304] 0 2026-03-09T16:14:17.695 INFO:tasks.workunit.client.0.vm03.stdout:4/84: unlink d5/d17/f19 0 2026-03-09T16:14:17.696 INFO:tasks.workunit.client.0.vm03.stdout:7/31: chown d4/c6 95202 1 2026-03-09T16:14:17.698 INFO:tasks.workunit.client.0.vm03.stdout:4/85: dread d5/dd/f16 [0,4194304] 0 2026-03-09T16:14:17.701 INFO:tasks.workunit.client.0.vm03.stdout:4/86: dread d5/fa [0,4194304] 0 2026-03-09T16:14:17.702 INFO:tasks.workunit.client.0.vm03.stdout:4/87: write d5/f9 [4131175,79733] 0 2026-03-09T16:14:17.702 INFO:tasks.workunit.client.0.vm03.stdout:5/73: sync 2026-03-09T16:14:17.702 INFO:tasks.workunit.client.0.vm03.stdout:0/81: sync 2026-03-09T16:14:17.704 INFO:tasks.workunit.client.0.vm03.stdout:1/44: mknod d4/cf 0 2026-03-09T16:14:17.712 INFO:tasks.workunit.client.0.vm03.stdout:7/32: mkdir d4/da 0 2026-03-09T16:14:17.719 INFO:tasks.workunit.client.0.vm03.stdout:4/88: write d5/f10 [419134,47255] 0 2026-03-09T16:14:17.723 INFO:tasks.workunit.client.0.vm03.stdout:0/82: rmdir d0/d7 39 2026-03-09T16:14:17.723 INFO:tasks.workunit.client.0.vm03.stdout:0/83: fsync d0/da/dc/fd 0 2026-03-09T16:14:17.728 INFO:tasks.workunit.client.0.vm03.stdout:2/88: getdents db/d12 0 2026-03-09T16:14:17.728 INFO:tasks.workunit.client.0.vm03.stdout:2/89: read - db/f13 zero size 2026-03-09T16:14:17.728 INFO:tasks.workunit.client.0.vm03.stdout:2/90: dread - db/f14 zero size 2026-03-09T16:14:17.731 INFO:tasks.workunit.client.0.vm03.stdout:3/54: read d5/f10 [281339,51] 0 2026-03-09T16:14:17.731 INFO:tasks.workunit.client.0.vm03.stdout:3/55: truncate d5/fd 238774 0 2026-03-09T16:14:17.733 
INFO:tasks.workunit.client.0.vm03.stdout:4/89: mkdir d5/db/d1c 0 2026-03-09T16:14:17.734 INFO:tasks.workunit.client.0.vm03.stdout:4/90: dread d5/f8 [0,4194304] 0 2026-03-09T16:14:17.740 INFO:tasks.workunit.client.0.vm03.stdout:4/91: stat d5/fa 0 2026-03-09T16:14:17.742 INFO:tasks.workunit.client.0.vm03.stdout:1/45: symlink d4/d6/l10 0 2026-03-09T16:14:17.742 INFO:tasks.workunit.client.0.vm03.stdout:2/91: rename db/f13 to db/d12/f22 0 2026-03-09T16:14:17.742 INFO:tasks.workunit.client.0.vm03.stdout:2/92: write db/f14 [673332,125334] 0 2026-03-09T16:14:17.748 INFO:tasks.workunit.client.0.vm03.stdout:1/46: symlink d4/d6/l11 0 2026-03-09T16:14:17.749 INFO:tasks.workunit.client.0.vm03.stdout:4/92: dwrite d5/fa [0,4194304] 0 2026-03-09T16:14:17.751 INFO:tasks.workunit.client.0.vm03.stdout:4/93: write d5/f10 [1365169,124452] 0 2026-03-09T16:14:17.754 INFO:tasks.workunit.client.0.vm03.stdout:3/56: unlink f1 0 2026-03-09T16:14:17.757 INFO:tasks.workunit.client.0.vm03.stdout:8/69: creat da/db/f1c x:0 0 0 2026-03-09T16:14:17.766 INFO:tasks.workunit.client.0.vm03.stdout:3/57: creat d5/f11 x:0 0 0 2026-03-09T16:14:17.769 INFO:tasks.workunit.client.0.vm03.stdout:6/39: dread f8 [0,4194304] 0 2026-03-09T16:14:17.769 INFO:tasks.workunit.client.0.vm03.stdout:6/40: chown c6 15333550 1 2026-03-09T16:14:17.771 INFO:tasks.workunit.client.0.vm03.stdout:1/47: sync 2026-03-09T16:14:17.776 INFO:tasks.workunit.client.0.vm03.stdout:1/48: write f2 [1082613,96178] 0 2026-03-09T16:14:17.777 INFO:tasks.workunit.client.0.vm03.stdout:9/48: getdents d2/d4 0 2026-03-09T16:14:17.777 INFO:tasks.workunit.client.0.vm03.stdout:1/49: dwrite d4/d6/f9 [0,4194304] 0 2026-03-09T16:14:17.784 INFO:tasks.workunit.client.0.vm03.stdout:0/84: getdents d0/da/dc 0 2026-03-09T16:14:17.785 INFO:tasks.workunit.client.0.vm03.stdout:0/85: write d0/da/f17 [1022758,37669] 0 2026-03-09T16:14:17.786 INFO:tasks.workunit.client.0.vm03.stdout:0/86: dread d0/f3 [4194304,4194304] 0 2026-03-09T16:14:17.789 INFO:tasks.workunit.client.0.vm03.stdout:2/93: creat db/f23 x:0 0 0 2026-03-09T16:14:17.789 INFO:tasks.workunit.client.0.vm03.stdout:2/94: fdatasync db/ff 0 2026-03-09T16:14:17.797 INFO:tasks.workunit.client.0.vm03.stdout:5/74: dwrite d2/f3 [0,4194304] 0 2026-03-09T16:14:17.799 INFO:tasks.workunit.client.0.vm03.stdout:5/75: chown d2/d7/de/d11/ld 25 1 2026-03-09T16:14:17.811 INFO:tasks.workunit.client.0.vm03.stdout:6/41: unlink f8 0 2026-03-09T16:14:17.818 INFO:tasks.workunit.client.0.vm03.stdout:7/33: dwrite f2 [0,4194304] 0 2026-03-09T16:14:17.821 INFO:tasks.workunit.client.0.vm03.stdout:7/34: dread d4/f8 [0,4194304] 0 2026-03-09T16:14:17.825 INFO:tasks.workunit.client.0.vm03.stdout:9/49: creat d2/d4/fc x:0 0 0 2026-03-09T16:14:17.825 INFO:tasks.workunit.client.0.vm03.stdout:9/50: chown d2 19956811 1 2026-03-09T16:14:17.825 INFO:tasks.workunit.client.0.vm03.stdout:9/51: chown d2/d4/fa 5899 1 2026-03-09T16:14:17.826 INFO:tasks.workunit.client.0.vm03.stdout:9/52: dread - d2/d4/fc zero size 2026-03-09T16:14:17.826 INFO:tasks.workunit.client.0.vm03.stdout:9/53: chown d2/f7 146962 1 2026-03-09T16:14:17.833 INFO:tasks.workunit.client.0.vm03.stdout:0/87: rename d0/f9 to d0/da/d11/f18 0 2026-03-09T16:14:17.833 INFO:tasks.workunit.client.0.vm03.stdout:0/88: chown d0/da/cb 0 1 2026-03-09T16:14:17.833 INFO:tasks.workunit.client.0.vm03.stdout:0/89: chown d0/da/d11/f13 54 1 2026-03-09T16:14:17.850 INFO:tasks.workunit.client.0.vm03.stdout:6/42: mkdir d9 0 2026-03-09T16:14:17.855 INFO:tasks.workunit.client.0.vm03.stdout:4/94: truncate d5/f9 147155 0 
2026-03-09T16:14:17.855 INFO:tasks.workunit.client.0.vm03.stdout:4/95: chown d5/f9 0 1 2026-03-09T16:14:17.857 INFO:tasks.workunit.client.0.vm03.stdout:4/96: dread f1 [0,4194304] 0 2026-03-09T16:14:17.857 INFO:tasks.workunit.client.0.vm03.stdout:4/97: stat d5/db/d1c 0 2026-03-09T16:14:17.858 INFO:tasks.workunit.client.0.vm03.stdout:4/98: dread d5/f8 [0,4194304] 0 2026-03-09T16:14:17.862 INFO:tasks.workunit.client.0.vm03.stdout:1/50: rename d4/d6/l11 to d4/db/l12 0 2026-03-09T16:14:17.863 INFO:tasks.workunit.client.0.vm03.stdout:0/90: write d0/da/d11/f18 [1120043,119181] 0 2026-03-09T16:14:17.865 INFO:tasks.workunit.client.0.vm03.stdout:0/91: dread d0/da/fe [0,4194304] 0 2026-03-09T16:14:17.866 INFO:tasks.workunit.client.0.vm03.stdout:0/92: write d0/da/d11/f13 [614621,83350] 0 2026-03-09T16:14:17.867 INFO:tasks.workunit.client.0.vm03.stdout:0/93: dread d0/da/ff [0,4194304] 0 2026-03-09T16:14:17.867 INFO:tasks.workunit.client.0.vm03.stdout:0/94: truncate d0/da/ff 1544297 0 2026-03-09T16:14:17.868 INFO:tasks.workunit.client.0.vm03.stdout:0/95: read d0/f3 [2974412,101089] 0 2026-03-09T16:14:17.868 INFO:tasks.workunit.client.0.vm03.stdout:0/96: truncate d0/da/d11/f13 891059 0 2026-03-09T16:14:17.869 INFO:tasks.workunit.client.0.vm03.stdout:0/97: fdatasync d0/da/ff 0 2026-03-09T16:14:17.875 INFO:tasks.workunit.client.0.vm03.stdout:2/95: truncate db/ff 695580 0 2026-03-09T16:14:17.876 INFO:tasks.workunit.client.0.vm03.stdout:5/76: mkdir d2/d7/d1a/d1c 0 2026-03-09T16:14:17.877 INFO:tasks.workunit.client.0.vm03.stdout:5/77: rename d2/d7 to d2/d7/d1a/d1c/d1d 22 2026-03-09T16:14:17.878 INFO:tasks.workunit.client.0.vm03.stdout:8/70: write da/db/f13 [444713,125923] 0 2026-03-09T16:14:17.879 INFO:tasks.workunit.client.0.vm03.stdout:7/35: mknod d4/da/cb 0 2026-03-09T16:14:17.879 INFO:tasks.workunit.client.0.vm03.stdout:5/78: dread d2/f3 [0,4194304] 0 2026-03-09T16:14:17.886 INFO:tasks.workunit.client.0.vm03.stdout:1/51: mknod d4/d6/c13 0 2026-03-09T16:14:17.886 INFO:tasks.workunit.client.0.vm03.stdout:1/52: stat f1 0 2026-03-09T16:14:17.888 INFO:tasks.workunit.client.0.vm03.stdout:1/53: dread f1 [0,4194304] 0 2026-03-09T16:14:17.889 INFO:tasks.workunit.client.0.vm03.stdout:1/54: dread - d4/fa zero size 2026-03-09T16:14:17.891 INFO:tasks.workunit.client.0.vm03.stdout:1/55: dread d4/d6/f9 [0,4194304] 0 2026-03-09T16:14:17.893 INFO:tasks.workunit.client.0.vm03.stdout:3/58: getdents d5 0 2026-03-09T16:14:17.893 INFO:tasks.workunit.client.0.vm03.stdout:3/59: read d5/fd [230793,97584] 0 2026-03-09T16:14:17.895 INFO:tasks.workunit.client.0.vm03.stdout:6/43: symlink d9/la 0 2026-03-09T16:14:17.895 INFO:tasks.workunit.client.0.vm03.stdout:6/44: read f7 [438576,130830] 0 2026-03-09T16:14:17.896 INFO:tasks.workunit.client.0.vm03.stdout:8/71: write da/d15/f1b [916557,27994] 0 2026-03-09T16:14:17.898 INFO:tasks.workunit.client.0.vm03.stdout:5/79: symlink d2/d7/d1a/l1e 0 2026-03-09T16:14:17.900 INFO:tasks.workunit.client.0.vm03.stdout:7/36: dwrite d4/f8 [0,4194304] 0 2026-03-09T16:14:17.900 INFO:tasks.workunit.client.0.vm03.stdout:9/54: rmdir d2/d4/d9 0 2026-03-09T16:14:17.905 INFO:tasks.workunit.client.0.vm03.stdout:9/55: dread d2/f7 [0,4194304] 0 2026-03-09T16:14:17.910 INFO:tasks.workunit.client.0.vm03.stdout:1/56: rmdir d4 39 2026-03-09T16:14:17.910 INFO:tasks.workunit.client.0.vm03.stdout:1/57: fdatasync f2 0 2026-03-09T16:14:17.914 INFO:tasks.workunit.client.0.vm03.stdout:3/60: rename d5/lc to d5/l12 0 2026-03-09T16:14:17.914 INFO:tasks.workunit.client.0.vm03.stdout:2/96: mknod db/c24 0 2026-03-09T16:14:17.918 
INFO:tasks.workunit.client.0.vm03.stdout:2/97: dread f7 [0,4194304] 0 2026-03-09T16:14:17.918 INFO:tasks.workunit.client.0.vm03.stdout:2/98: write db/f14 [958458,89597] 0 2026-03-09T16:14:17.918 INFO:tasks.workunit.client.0.vm03.stdout:2/99: write f8 [4854705,81254] 0 2026-03-09T16:14:17.918 INFO:tasks.workunit.client.0.vm03.stdout:2/100: write f0 [705329,36675] 0 2026-03-09T16:14:17.919 INFO:tasks.workunit.client.0.vm03.stdout:6/45: creat d9/fb x:0 0 0 2026-03-09T16:14:17.922 INFO:tasks.workunit.client.0.vm03.stdout:8/72: rmdir da 39 2026-03-09T16:14:17.924 INFO:tasks.workunit.client.0.vm03.stdout:4/99: rmdir d5/db/d1c 0 2026-03-09T16:14:17.939 INFO:tasks.workunit.client.0.vm03.stdout:7/37: write d4/f8 [4823954,2957] 0 2026-03-09T16:14:17.942 INFO:tasks.workunit.client.0.vm03.stdout:7/38: fsync d4/f8 0 2026-03-09T16:14:17.945 INFO:tasks.workunit.client.0.vm03.stdout:0/98: link d0/da/dc/fd d0/d7/f19 0 2026-03-09T16:14:17.952 INFO:tasks.workunit.client.0.vm03.stdout:9/56: dwrite d2/f8 [0,4194304] 0 2026-03-09T16:14:17.961 INFO:tasks.workunit.client.0.vm03.stdout:3/61: mkdir d5/d13 0 2026-03-09T16:14:17.963 INFO:tasks.workunit.client.0.vm03.stdout:3/62: dread d5/f6 [0,4194304] 0 2026-03-09T16:14:17.966 INFO:tasks.workunit.client.0.vm03.stdout:2/101: symlink db/d12/l25 0 2026-03-09T16:14:17.966 INFO:tasks.workunit.client.0.vm03.stdout:6/46: mknod d9/cc 0 2026-03-09T16:14:17.966 INFO:tasks.workunit.client.0.vm03.stdout:6/47: chown d9 7518170 1 2026-03-09T16:14:17.967 INFO:tasks.workunit.client.0.vm03.stdout:5/80: getdents d2/d7/d1a/d1c 0 2026-03-09T16:14:17.968 INFO:tasks.workunit.client.0.vm03.stdout:4/100: mknod d5/c1d 0 2026-03-09T16:14:17.971 INFO:tasks.workunit.client.0.vm03.stdout:9/57: creat d2/d4/fd x:0 0 0 2026-03-09T16:14:17.986 INFO:tasks.workunit.client.0.vm03.stdout:3/63: write d5/f10 [1339230,81045] 0 2026-03-09T16:14:17.986 INFO:tasks.workunit.client.0.vm03.stdout:2/102: unlink db/d12/l25 0 2026-03-09T16:14:17.986 INFO:tasks.workunit.client.0.vm03.stdout:2/103: dwrite f5 [0,4194304] 0 2026-03-09T16:14:17.986 INFO:tasks.workunit.client.0.vm03.stdout:6/48: fsync f7 0 2026-03-09T16:14:17.987 INFO:tasks.workunit.client.0.vm03.stdout:8/73: mkdir da/d1d 0 2026-03-09T16:14:17.987 INFO:tasks.workunit.client.0.vm03.stdout:4/101: rmdir d5 39 2026-03-09T16:14:17.987 INFO:tasks.workunit.client.0.vm03.stdout:1/58: symlink d4/l14 0 2026-03-09T16:14:17.987 INFO:tasks.workunit.client.0.vm03.stdout:9/58: mkdir d2/de 0 2026-03-09T16:14:17.987 INFO:tasks.workunit.client.0.vm03.stdout:9/59: write d2/d4/fc [1048534,4229] 0 2026-03-09T16:14:17.987 INFO:tasks.workunit.client.0.vm03.stdout:2/104: chown db/d12/l1c 6773070 1 2026-03-09T16:14:17.987 INFO:tasks.workunit.client.0.vm03.stdout:8/74: dwrite f8 [0,4194304] 0 2026-03-09T16:14:17.988 INFO:tasks.workunit.client.0.vm03.stdout:0/99: sync 2026-03-09T16:14:17.992 INFO:tasks.workunit.client.0.vm03.stdout:8/75: dwrite f6 [0,4194304] 0 2026-03-09T16:14:17.994 INFO:tasks.workunit.client.0.vm03.stdout:8/76: chown da/d10/f14 0 1 2026-03-09T16:14:17.994 INFO:tasks.workunit.client.0.vm03.stdout:6/49: unlink d9/la 0 2026-03-09T16:14:17.995 INFO:tasks.workunit.client.0.vm03.stdout:6/50: write f7 [1432270,111902] 0 2026-03-09T16:14:17.995 INFO:tasks.workunit.client.0.vm03.stdout:6/51: fsync d9/fb 0 2026-03-09T16:14:17.996 INFO:tasks.workunit.client.0.vm03.stdout:6/52: chown d9/cc 1154 1 2026-03-09T16:14:17.996 INFO:tasks.workunit.client.0.vm03.stdout:1/59: creat d4/d6/f15 x:0 0 0 2026-03-09T16:14:17.998 INFO:tasks.workunit.client.0.vm03.stdout:3/64: creat 
d5/d13/f14 x:0 0 0 2026-03-09T16:14:17.998 INFO:tasks.workunit.client.0.vm03.stdout:3/65: write d5/f10 [100890,112904] 0 2026-03-09T16:14:17.999 INFO:tasks.workunit.client.0.vm03.stdout:3/66: fsync d5/fb 0 2026-03-09T16:14:17.999 INFO:tasks.workunit.client.0.vm03.stdout:3/67: truncate d5/d13/f14 567756 0 2026-03-09T16:14:18.000 INFO:tasks.workunit.client.0.vm03.stdout:9/60: mkdir d2/df 0 2026-03-09T16:14:18.001 INFO:tasks.workunit.client.0.vm03.stdout:9/61: chown d2/f8 828 1 2026-03-09T16:14:18.002 INFO:tasks.workunit.client.0.vm03.stdout:7/39: getdents d4 0 2026-03-09T16:14:18.007 INFO:tasks.workunit.client.0.vm03.stdout:8/77: readlink da/db/l11 0 2026-03-09T16:14:18.007 INFO:tasks.workunit.client.0.vm03.stdout:8/78: read f8 [3915190,665] 0 2026-03-09T16:14:18.008 INFO:tasks.workunit.client.0.vm03.stdout:6/53: mknod d9/cd 0 2026-03-09T16:14:18.009 INFO:tasks.workunit.client.0.vm03.stdout:1/60: mknod d4/d6/c16 0 2026-03-09T16:14:18.012 INFO:tasks.workunit.client.0.vm03.stdout:3/68: mknod d5/c15 0 2026-03-09T16:14:18.013 INFO:tasks.workunit.client.0.vm03.stdout:3/69: chown d5/lf 5114424 1 2026-03-09T16:14:18.014 INFO:tasks.workunit.client.0.vm03.stdout:5/81: getdents d2/d7/de/d11 0 2026-03-09T16:14:18.017 INFO:tasks.workunit.client.0.vm03.stdout:7/40: mkdir d4/dc 0 2026-03-09T16:14:18.021 INFO:tasks.workunit.client.0.vm03.stdout:6/54: rename c6 to d9/ce 0 2026-03-09T16:14:18.023 INFO:tasks.workunit.client.0.vm03.stdout:6/55: truncate f7 1803907 0 2026-03-09T16:14:18.024 INFO:tasks.workunit.client.0.vm03.stdout:1/61: mknod d4/d6/c17 0 2026-03-09T16:14:18.026 INFO:tasks.workunit.client.0.vm03.stdout:9/62: creat d2/df/f10 x:0 0 0 2026-03-09T16:14:18.028 INFO:tasks.workunit.client.0.vm03.stdout:5/82: symlink d2/d7/d1a/l1f 0 2026-03-09T16:14:18.030 INFO:tasks.workunit.client.0.vm03.stdout:2/105: rename db/c15 to db/c26 0 2026-03-09T16:14:18.034 INFO:tasks.workunit.client.0.vm03.stdout:2/106: dwrite fa [0,4194304] 0 2026-03-09T16:14:18.038 INFO:tasks.workunit.client.0.vm03.stdout:0/100: creat d0/f1a x:0 0 0 2026-03-09T16:14:18.038 INFO:tasks.workunit.client.0.vm03.stdout:0/101: stat d0/da/fe 0 2026-03-09T16:14:18.040 INFO:tasks.workunit.client.0.vm03.stdout:4/102: rename d5/dd/fe to d5/dd/f1e 0 2026-03-09T16:14:18.046 INFO:tasks.workunit.client.0.vm03.stdout:1/62: fsync f1 0 2026-03-09T16:14:18.049 INFO:tasks.workunit.client.0.vm03.stdout:9/63: mkdir d2/d4/d11 0 2026-03-09T16:14:18.052 INFO:tasks.workunit.client.0.vm03.stdout:5/83: dread d2/f3 [0,4194304] 0 2026-03-09T16:14:18.052 INFO:tasks.workunit.client.0.vm03.stdout:7/41: symlink d4/ld 0 2026-03-09T16:14:18.054 INFO:tasks.workunit.client.0.vm03.stdout:8/79: sync 2026-03-09T16:14:18.056 INFO:tasks.workunit.client.0.vm03.stdout:0/102: fdatasync d0/da/ff 0 2026-03-09T16:14:18.056 INFO:tasks.workunit.client.0.vm03.stdout:0/103: chown d0/d7 1 1 2026-03-09T16:14:18.056 INFO:tasks.workunit.client.0.vm03.stdout:0/104: fdatasync d0/f3 0 2026-03-09T16:14:18.057 INFO:tasks.workunit.client.0.vm03.stdout:0/105: fdatasync d0/da/d11/f13 0 2026-03-09T16:14:18.062 INFO:tasks.workunit.client.0.vm03.stdout:0/106: dwrite d0/f1a [0,4194304] 0 2026-03-09T16:14:18.062 INFO:tasks.workunit.client.0.vm03.stdout:6/56: read f7 [981970,103343] 0 2026-03-09T16:14:18.066 INFO:tasks.workunit.client.0.vm03.stdout:1/63: mkdir d4/db/d18 0 2026-03-09T16:14:18.072 INFO:tasks.workunit.client.0.vm03.stdout:7/42: chown d4/da/cb 10 1 2026-03-09T16:14:18.073 INFO:tasks.workunit.client.0.vm03.stdout:7/43: read f2 [1494513,126105] 0 2026-03-09T16:14:18.073 
INFO:tasks.workunit.client.0.vm03.stdout:7/44: chown d4/dc 0 1 2026-03-09T16:14:18.076 INFO:tasks.workunit.client.0.vm03.stdout:6/57: dread f7 [0,4194304] 0 2026-03-09T16:14:18.079 INFO:tasks.workunit.client.0.vm03.stdout:5/84: mknod d2/d7/d1a/c20 0 2026-03-09T16:14:18.080 INFO:tasks.workunit.client.0.vm03.stdout:7/45: dread f2 [0,4194304] 0 2026-03-09T16:14:18.086 INFO:tasks.workunit.client.0.vm03.stdout:7/46: dwrite f2 [0,4194304] 0 2026-03-09T16:14:18.090 INFO:tasks.workunit.client.0.vm03.stdout:4/103: mkdir d5/dd/d1f 0 2026-03-09T16:14:18.090 INFO:tasks.workunit.client.0.vm03.stdout:9/64: mkdir d2/d4/d11/d12 0 2026-03-09T16:14:18.090 INFO:tasks.workunit.client.0.vm03.stdout:9/65: chown d2/df 53 1 2026-03-09T16:14:18.090 INFO:tasks.workunit.client.0.vm03.stdout:9/66: readlink - no filename 2026-03-09T16:14:18.090 INFO:tasks.workunit.client.0.vm03.stdout:4/104: write d5/d17/f18 [859002,96817] 0 2026-03-09T16:14:18.096 INFO:tasks.workunit.client.0.vm03.stdout:8/80: dwrite f8 [0,4194304] 0 2026-03-09T16:14:18.097 INFO:tasks.workunit.client.0.vm03.stdout:8/81: chown da/db/f1c 0 1 2026-03-09T16:14:18.098 INFO:tasks.workunit.client.0.vm03.stdout:5/85: unlink d2/f3 0 2026-03-09T16:14:18.099 INFO:tasks.workunit.client.0.vm03.stdout:6/58: fsync f7 0 2026-03-09T16:14:18.099 INFO:tasks.workunit.client.0.vm03.stdout:5/86: readlink d2/d7/de/d11/ld 0 2026-03-09T16:14:18.099 INFO:tasks.workunit.client.0.vm03.stdout:5/87: dread - no filename 2026-03-09T16:14:18.099 INFO:tasks.workunit.client.0.vm03.stdout:5/88: write - no filename 2026-03-09T16:14:18.101 INFO:tasks.workunit.client.0.vm03.stdout:6/59: dread f7 [0,4194304] 0 2026-03-09T16:14:18.108 INFO:tasks.workunit.client.0.vm03.stdout:0/107: rename d0/da/dc to d0/da/d1b 0 2026-03-09T16:14:18.112 INFO:tasks.workunit.client.0.vm03.stdout:0/108: fdatasync d0/f3 0 2026-03-09T16:14:18.112 INFO:tasks.workunit.client.0.vm03.stdout:9/67: write d2/f8 [5123644,26408] 0 2026-03-09T16:14:18.113 INFO:tasks.workunit.client.0.vm03.stdout:4/105: sync 2026-03-09T16:14:18.117 INFO:tasks.workunit.client.0.vm03.stdout:2/107: getdents db/d1e 0 2026-03-09T16:14:18.117 INFO:tasks.workunit.client.0.vm03.stdout:7/47: dread d4/f8 [0,4194304] 0 2026-03-09T16:14:18.118 INFO:tasks.workunit.client.0.vm03.stdout:5/89: mknod d2/c21 0 2026-03-09T16:14:18.119 INFO:tasks.workunit.client.0.vm03.stdout:0/109: creat d0/da/f1c x:0 0 0 2026-03-09T16:14:18.126 INFO:tasks.workunit.client.0.vm03.stdout:7/48: dwrite f2 [0,4194304] 0 2026-03-09T16:14:18.126 INFO:tasks.workunit.client.0.vm03.stdout:5/90: rename d2/d7/d1a/l1e to d2/l22 0 2026-03-09T16:14:18.126 INFO:tasks.workunit.client.0.vm03.stdout:5/91: dwrite - no filename 2026-03-09T16:14:18.126 INFO:tasks.workunit.client.0.vm03.stdout:5/92: dread - no filename 2026-03-09T16:14:18.126 INFO:tasks.workunit.client.0.vm03.stdout:5/93: dwrite - no filename 2026-03-09T16:14:18.126 INFO:tasks.workunit.client.0.vm03.stdout:9/68: link d2/d4/fc d2/d4/d11/f13 0 2026-03-09T16:14:18.126 INFO:tasks.workunit.client.0.vm03.stdout:2/108: mknod db/c27 0 2026-03-09T16:14:18.134 INFO:tasks.workunit.client.0.vm03.stdout:2/109: dread db/f14 [0,4194304] 0 2026-03-09T16:14:18.134 INFO:tasks.workunit.client.0.vm03.stdout:9/69: creat d2/df/f14 x:0 0 0 2026-03-09T16:14:18.135 INFO:tasks.workunit.client.0.vm03.stdout:9/70: stat d2/d4/fc 0 2026-03-09T16:14:18.145 INFO:tasks.workunit.client.0.vm03.stdout:9/71: rmdir d2 39 2026-03-09T16:14:18.148 INFO:tasks.workunit.client.0.vm03.stdout:7/49: creat d4/fe x:0 0 0 2026-03-09T16:14:18.153 
INFO:tasks.workunit.client.0.vm03.stdout:9/72: chown d2/f7 6530 1 2026-03-09T16:14:18.154 INFO:tasks.workunit.client.0.vm03.stdout:5/94: link d2/d7/d8/d16/cc d2/d7/de/c23 0 2026-03-09T16:14:18.156 INFO:tasks.workunit.client.0.vm03.stdout:2/110: link db/cc db/d1e/c28 0 2026-03-09T16:14:18.159 INFO:tasks.workunit.client.0.vm03.stdout:9/73: rename d2/d4/fc to d2/f15 0 2026-03-09T16:14:18.160 INFO:tasks.workunit.client.0.vm03.stdout:9/74: truncate d2/d4/fa 450496 0 2026-03-09T16:14:18.164 INFO:tasks.workunit.client.0.vm03.stdout:9/75: dwrite d2/d4/fd [0,4194304] 0 2026-03-09T16:14:18.166 INFO:tasks.workunit.client.0.vm03.stdout:9/76: write d2/f8 [3249984,54495] 0 2026-03-09T16:14:18.167 INFO:tasks.workunit.client.0.vm03.stdout:9/77: write d2/df/f14 [1016652,34335] 0 2026-03-09T16:14:18.171 INFO:tasks.workunit.client.0.vm03.stdout:6/60: getdents d9 0 2026-03-09T16:14:18.174 INFO:tasks.workunit.client.0.vm03.stdout:1/64: getdents d4/d6 0 2026-03-09T16:14:18.178 INFO:tasks.workunit.client.0.vm03.stdout:3/70: getdents d5/d13 0 2026-03-09T16:14:18.180 INFO:tasks.workunit.client.0.vm03.stdout:7/50: creat d4/dc/ff x:0 0 0 2026-03-09T16:14:18.184 INFO:tasks.workunit.client.0.vm03.stdout:8/82: truncate f8 2966301 0 2026-03-09T16:14:18.184 INFO:tasks.workunit.client.0.vm03.stdout:8/83: stat da/d15/c17 0 2026-03-09T16:14:18.184 INFO:tasks.workunit.client.0.vm03.stdout:8/84: stat da/d15/c17 0 2026-03-09T16:14:18.187 INFO:tasks.workunit.client.0.vm03.stdout:5/95: mkdir d2/d7/d8/d24 0 2026-03-09T16:14:18.187 INFO:tasks.workunit.client.0.vm03.stdout:0/110: write d0/da/fe [5131794,80656] 0 2026-03-09T16:14:18.193 INFO:tasks.workunit.client.0.vm03.stdout:4/106: fsync d5/dd/f1e 0 2026-03-09T16:14:18.193 INFO:tasks.workunit.client.0.vm03.stdout:0/111: dwrite d0/f1a [0,4194304] 0 2026-03-09T16:14:18.197 INFO:tasks.workunit.client.0.vm03.stdout:9/78: symlink d2/df/l16 0 2026-03-09T16:14:18.201 INFO:tasks.workunit.client.0.vm03.stdout:1/65: creat d4/d6/f19 x:0 0 0 2026-03-09T16:14:18.201 INFO:tasks.workunit.client.0.vm03.stdout:3/71: creat d5/f16 x:0 0 0 2026-03-09T16:14:18.201 INFO:tasks.workunit.client.0.vm03.stdout:3/72: dread - d5/f16 zero size 2026-03-09T16:14:18.202 INFO:tasks.workunit.client.0.vm03.stdout:8/85: sync 2026-03-09T16:14:18.203 INFO:tasks.workunit.client.0.vm03.stdout:2/111: dwrite db/f14 [0,4194304] 0 2026-03-09T16:14:18.203 INFO:tasks.workunit.client.0.vm03.stdout:8/86: fdatasync da/d10/f14 0 2026-03-09T16:14:18.205 INFO:tasks.workunit.client.0.vm03.stdout:7/51: symlink d4/da/l10 0 2026-03-09T16:14:18.207 INFO:tasks.workunit.client.0.vm03.stdout:4/107: truncate d5/f8 1138063 0 2026-03-09T16:14:18.210 INFO:tasks.workunit.client.0.vm03.stdout:6/61: getdents d9 0 2026-03-09T16:14:18.210 INFO:tasks.workunit.client.0.vm03.stdout:1/66: dread f1 [0,4194304] 0 2026-03-09T16:14:18.214 INFO:tasks.workunit.client.0.vm03.stdout:0/112: dread d0/da/d11/f13 [0,4194304] 0 2026-03-09T16:14:18.216 INFO:tasks.workunit.client.0.vm03.stdout:9/79: creat d2/d4/f17 x:0 0 0 2026-03-09T16:14:18.216 INFO:tasks.workunit.client.0.vm03.stdout:9/80: chown d2/df/f14 1 1 2026-03-09T16:14:18.221 INFO:tasks.workunit.client.0.vm03.stdout:3/73: symlink d5/l17 0 2026-03-09T16:14:18.229 INFO:tasks.workunit.client.0.vm03.stdout:3/74: rename d5/d13 to d5/d13/d18 22 2026-03-09T16:14:18.229 INFO:tasks.workunit.client.0.vm03.stdout:5/96: getdents d2/d7/de/d11/d19 0 2026-03-09T16:14:18.229 INFO:tasks.workunit.client.0.vm03.stdout:5/97: read - no filename 2026-03-09T16:14:18.229 INFO:tasks.workunit.client.0.vm03.stdout:5/98: dwrite - no 
filename 2026-03-09T16:14:18.229 INFO:tasks.workunit.client.0.vm03.stdout:2/112: creat db/d1e/f29 x:0 0 0 2026-03-09T16:14:18.229 INFO:tasks.workunit.client.0.vm03.stdout:2/113: fdatasync f9 0 2026-03-09T16:14:18.229 INFO:tasks.workunit.client.0.vm03.stdout:4/108: read d5/d17/f14 [2249946,13970] 0 2026-03-09T16:14:18.229 INFO:tasks.workunit.client.0.vm03.stdout:6/62: rmdir d9 39 2026-03-09T16:14:18.229 INFO:tasks.workunit.client.0.vm03.stdout:1/67: symlink d4/db/l1a 0 2026-03-09T16:14:18.231 INFO:tasks.workunit.client.0.vm03.stdout:0/113: mknod d0/d7/c1d 0 2026-03-09T16:14:18.237 INFO:tasks.workunit.client.0.vm03.stdout:5/99: rename d2/d7/d8/d16/c13 to d2/d7/de/c25 0 2026-03-09T16:14:18.239 INFO:tasks.workunit.client.0.vm03.stdout:5/100: dwrite - no filename 2026-03-09T16:14:18.240 INFO:tasks.workunit.client.0.vm03.stdout:8/87: symlink da/db/l1e 0 2026-03-09T16:14:18.240 INFO:tasks.workunit.client.0.vm03.stdout:7/52: symlink d4/l11 0 2026-03-09T16:14:18.241 INFO:tasks.workunit.client.0.vm03.stdout:7/53: truncate d4/dc/ff 96662 0 2026-03-09T16:14:18.241 INFO:tasks.workunit.client.0.vm03.stdout:7/54: dread - d4/fe zero size 2026-03-09T16:14:18.242 INFO:tasks.workunit.client.0.vm03.stdout:2/114: mkdir db/d12/d2a 0 2026-03-09T16:14:18.242 INFO:tasks.workunit.client.0.vm03.stdout:2/115: chown f0 122 1 2026-03-09T16:14:18.244 INFO:tasks.workunit.client.0.vm03.stdout:4/109: symlink d5/db/l20 0 2026-03-09T16:14:18.244 INFO:tasks.workunit.client.0.vm03.stdout:5/101: sync 2026-03-09T16:14:18.244 INFO:tasks.workunit.client.0.vm03.stdout:5/102: write - no filename 2026-03-09T16:14:18.245 INFO:tasks.workunit.client.0.vm03.stdout:5/103: stat d2/d7/de/d11/c1b 0 2026-03-09T16:14:18.245 INFO:tasks.workunit.client.0.vm03.stdout:5/104: fdatasync - no filename 2026-03-09T16:14:18.246 INFO:tasks.workunit.client.0.vm03.stdout:4/110: dread d5/db/f13 [0,4194304] 0 2026-03-09T16:14:18.248 INFO:tasks.workunit.client.0.vm03.stdout:6/63: chown d9 1883 1 2026-03-09T16:14:18.248 INFO:tasks.workunit.client.0.vm03.stdout:6/64: write d9/fb [1021420,56519] 0 2026-03-09T16:14:18.249 INFO:tasks.workunit.client.0.vm03.stdout:6/65: chown d9/cd 414938170 1 2026-03-09T16:14:18.250 INFO:tasks.workunit.client.0.vm03.stdout:6/66: fsync f7 0 2026-03-09T16:14:18.256 INFO:tasks.workunit.client.0.vm03.stdout:9/81: write d2/d4/d11/f13 [868580,104617] 0 2026-03-09T16:14:18.257 INFO:tasks.workunit.client.0.vm03.stdout:6/67: dread d9/fb [0,4194304] 0 2026-03-09T16:14:18.258 INFO:tasks.workunit.client.0.vm03.stdout:6/68: fsync f7 0 2026-03-09T16:14:18.262 INFO:tasks.workunit.client.0.vm03.stdout:1/68: dread d4/fd [0,4194304] 0 2026-03-09T16:14:18.264 INFO:tasks.workunit.client.0.vm03.stdout:8/88: creat da/d10/f1f x:0 0 0 2026-03-09T16:14:18.265 INFO:tasks.workunit.client.0.vm03.stdout:7/55: mkdir d4/da/d12 0 2026-03-09T16:14:18.266 INFO:tasks.workunit.client.0.vm03.stdout:2/116: creat db/d1e/f2b x:0 0 0 2026-03-09T16:14:18.268 INFO:tasks.workunit.client.0.vm03.stdout:5/105: creat d2/d7/de/d11/f26 x:0 0 0 2026-03-09T16:14:18.268 INFO:tasks.workunit.client.0.vm03.stdout:5/106: dread - d2/d7/de/d11/f26 zero size 2026-03-09T16:14:18.269 INFO:tasks.workunit.client.0.vm03.stdout:4/111: read f1 [156547,128262] 0 2026-03-09T16:14:18.272 INFO:tasks.workunit.client.0.vm03.stdout:9/82: creat d2/d4/d11/f18 x:0 0 0 2026-03-09T16:14:18.277 INFO:tasks.workunit.client.0.vm03.stdout:9/83: dwrite d2/d4/fd [0,4194304] 0 2026-03-09T16:14:18.278 INFO:tasks.workunit.client.0.vm03.stdout:9/84: readlink d2/df/l16 0 2026-03-09T16:14:18.278 
INFO:tasks.workunit.client.0.vm03.stdout:9/85: readlink d2/df/l16 0 2026-03-09T16:14:18.282 INFO:tasks.workunit.client.0.vm03.stdout:9/86: dread d2/d4/fd [0,4194304] 0 2026-03-09T16:14:18.282 INFO:tasks.workunit.client.0.vm03.stdout:9/87: write d2/d4/d11/f13 [960606,24846] 0 2026-03-09T16:14:18.287 INFO:tasks.workunit.client.0.vm03.stdout:3/75: link d5/l8 d5/l19 0 2026-03-09T16:14:18.292 INFO:tasks.workunit.client.0.vm03.stdout:9/88: dwrite d2/d4/d11/f18 [0,4194304] 0 2026-03-09T16:14:18.304 INFO:tasks.workunit.client.0.vm03.stdout:7/56: rmdir d4 39 2026-03-09T16:14:18.313 INFO:tasks.workunit.client.0.vm03.stdout:1/69: unlink d4/d6/c17 0 2026-03-09T16:14:18.317 INFO:tasks.workunit.client.0.vm03.stdout:1/70: write d4/fa [22232,116205] 0 2026-03-09T16:14:18.317 INFO:tasks.workunit.client.0.vm03.stdout:3/76: symlink d5/d13/l1a 0 2026-03-09T16:14:18.317 INFO:tasks.workunit.client.0.vm03.stdout:3/77: write d5/f16 [146671,95804] 0 2026-03-09T16:14:18.317 INFO:tasks.workunit.client.0.vm03.stdout:0/114: rename d0/da/f17 to d0/f1e 0 2026-03-09T16:14:18.317 INFO:tasks.workunit.client.0.vm03.stdout:0/115: readlink d0/da/l14 0 2026-03-09T16:14:18.335 INFO:tasks.workunit.client.0.vm03.stdout:3/78: sync 2026-03-09T16:14:18.343 INFO:tasks.workunit.client.0.vm03.stdout:6/69: truncate d9/fb 784141 0 2026-03-09T16:14:18.345 INFO:tasks.workunit.client.0.vm03.stdout:4/112: dwrite d5/dd/f1e [0,4194304] 0 2026-03-09T16:14:18.347 INFO:tasks.workunit.client.0.vm03.stdout:4/113: write d5/fa [2226434,64042] 0 2026-03-09T16:14:18.351 INFO:tasks.workunit.client.0.vm03.stdout:8/89: dwrite da/db/fe [0,4194304] 0 2026-03-09T16:14:18.351 INFO:tasks.workunit.client.0.vm03.stdout:8/90: write f6 [1118977,56810] 0 2026-03-09T16:14:18.371 INFO:tasks.workunit.client.0.vm03.stdout:2/117: truncate f7 140003 0 2026-03-09T16:14:18.371 INFO:tasks.workunit.client.0.vm03.stdout:9/89: symlink d2/d4/l19 0 2026-03-09T16:14:18.374 INFO:tasks.workunit.client.0.vm03.stdout:9/90: dwrite d2/f15 [0,4194304] 0 2026-03-09T16:14:18.375 INFO:tasks.workunit.client.0.vm03.stdout:5/107: mkdir d2/d7/d8/d24/d27 0 2026-03-09T16:14:18.376 INFO:tasks.workunit.client.0.vm03.stdout:5/108: fsync d2/d7/de/d11/f26 0 2026-03-09T16:14:18.377 INFO:tasks.workunit.client.0.vm03.stdout:5/109: rename d2/d7 to d2/d7/d28 22 2026-03-09T16:14:18.381 INFO:tasks.workunit.client.0.vm03.stdout:9/91: dwrite d2/d4/f17 [0,4194304] 0 2026-03-09T16:14:18.383 INFO:tasks.workunit.client.0.vm03.stdout:0/116: mknod d0/da/c1f 0 2026-03-09T16:14:18.385 INFO:tasks.workunit.client.0.vm03.stdout:9/92: chown d2/d4/f17 0 1 2026-03-09T16:14:18.386 INFO:tasks.workunit.client.0.vm03.stdout:9/93: write d2/f15 [4159513,34690] 0 2026-03-09T16:14:18.396 INFO:tasks.workunit.client.0.vm03.stdout:4/114: creat d5/d17/f21 x:0 0 0 2026-03-09T16:14:18.399 INFO:tasks.workunit.client.0.vm03.stdout:4/115: dread d5/fa [0,4194304] 0 2026-03-09T16:14:18.401 INFO:tasks.workunit.client.0.vm03.stdout:8/91: chown da/db/c19 204508 1 2026-03-09T16:14:18.402 INFO:tasks.workunit.client.0.vm03.stdout:8/92: truncate da/d10/f1f 394939 0 2026-03-09T16:14:18.403 INFO:tasks.workunit.client.0.vm03.stdout:4/116: dwrite d5/db/f13 [0,4194304] 0 2026-03-09T16:14:18.406 INFO:tasks.workunit.client.0.vm03.stdout:2/118: mknod db/d1e/c2c 0 2026-03-09T16:14:18.419 INFO:tasks.workunit.client.0.vm03.stdout:0/117: creat d0/d7/f20 x:0 0 0 2026-03-09T16:14:18.419 INFO:tasks.workunit.client.0.vm03.stdout:0/118: stat d0/da/d11/f18 0 2026-03-09T16:14:18.424 INFO:tasks.workunit.client.0.vm03.stdout:9/94: unlink d2/d4/fa 0 
2026-03-09T16:14:18.434 INFO:tasks.workunit.client.0.vm03.stdout:8/93: mknod da/c20 0 2026-03-09T16:14:18.434 INFO:tasks.workunit.client.0.vm03.stdout:7/57: unlink d4/c9 0 2026-03-09T16:14:18.435 INFO:tasks.workunit.client.0.vm03.stdout:8/94: dwrite da/d10/f14 [0,4194304] 0 2026-03-09T16:14:18.442 INFO:tasks.workunit.client.0.vm03.stdout:5/110: mkdir d2/d7/de/d11/d19/d29 0 2026-03-09T16:14:18.443 INFO:tasks.workunit.client.0.vm03.stdout:5/111: write d2/d7/de/d11/f26 [660580,89643] 0 2026-03-09T16:14:18.446 INFO:tasks.workunit.client.0.vm03.stdout:1/71: creat d4/f1b x:0 0 0 2026-03-09T16:14:18.447 INFO:tasks.workunit.client.0.vm03.stdout:1/72: dread d4/fa [0,4194304] 0 2026-03-09T16:14:18.450 INFO:tasks.workunit.client.0.vm03.stdout:0/119: symlink d0/da/l21 0 2026-03-09T16:14:18.450 INFO:tasks.workunit.client.0.vm03.stdout:0/120: fsync d0/f1a 0 2026-03-09T16:14:18.452 INFO:tasks.workunit.client.0.vm03.stdout:9/95: mknod d2/df/c1a 0 2026-03-09T16:14:18.454 INFO:tasks.workunit.client.0.vm03.stdout:7/58: chown d4/c6 80527327 1 2026-03-09T16:14:18.454 INFO:tasks.workunit.client.0.vm03.stdout:7/59: stat d4/dc/ff 0 2026-03-09T16:14:18.455 INFO:tasks.workunit.client.0.vm03.stdout:7/60: stat d4/da/d12 0 2026-03-09T16:14:18.456 INFO:tasks.workunit.client.0.vm03.stdout:8/95: unlink da/db/l11 0 2026-03-09T16:14:18.457 INFO:tasks.workunit.client.0.vm03.stdout:8/96: dread - da/db/f1c zero size 2026-03-09T16:14:18.469 INFO:tasks.workunit.client.0.vm03.stdout:8/97: dwrite f8 [0,4194304] 0 2026-03-09T16:14:18.476 INFO:tasks.workunit.client.0.vm03.stdout:8/98: dwrite f6 [0,4194304] 0 2026-03-09T16:14:18.478 INFO:tasks.workunit.client.0.vm03.stdout:4/117: creat d5/dd/f22 x:0 0 0 2026-03-09T16:14:18.484 INFO:tasks.workunit.client.0.vm03.stdout:4/118: dwrite d5/d17/f21 [0,4194304] 0 2026-03-09T16:14:18.489 INFO:tasks.workunit.client.0.vm03.stdout:1/73: sync 2026-03-09T16:14:18.491 INFO:tasks.workunit.client.0.vm03.stdout:0/121: mknod d0/c22 0 2026-03-09T16:14:18.492 INFO:tasks.workunit.client.0.vm03.stdout:0/122: truncate d0/f1e 1828996 0 2026-03-09T16:14:18.499 INFO:tasks.workunit.client.0.vm03.stdout:9/96: symlink d2/d4/d11/d12/l1b 0 2026-03-09T16:14:18.504 INFO:tasks.workunit.client.0.vm03.stdout:1/74: chown d4/c7 55273 1 2026-03-09T16:14:18.504 INFO:tasks.workunit.client.0.vm03.stdout:1/75: stat f1 0 2026-03-09T16:14:18.507 INFO:tasks.workunit.client.0.vm03.stdout:0/123: rename d0/da/cb to d0/da/c23 0 2026-03-09T16:14:18.509 INFO:tasks.workunit.client.0.vm03.stdout:1/76: dwrite d4/d6/f19 [0,4194304] 0 2026-03-09T16:14:18.518 INFO:tasks.workunit.client.0.vm03.stdout:0/124: unlink d0/d7/f16 0 2026-03-09T16:14:18.520 INFO:tasks.workunit.client.0.vm03.stdout:7/61: link d4/c6 d4/c13 0 2026-03-09T16:14:18.522 INFO:tasks.workunit.client.0.vm03.stdout:4/119: link d5/dd/f22 d5/dd/f23 0 2026-03-09T16:14:18.523 INFO:tasks.workunit.client.0.vm03.stdout:4/120: write d5/db/f13 [1133274,62493] 0 2026-03-09T16:14:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:18 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:18 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:18 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:18 vm05.local 
ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:18.528 INFO:tasks.workunit.client.0.vm03.stdout:5/112: read d2/d7/de/d11/f26 [159827,102080] 0 2026-03-09T16:14:18.534 INFO:tasks.workunit.client.0.vm03.stdout:5/113: symlink d2/d7/l2a 0 2026-03-09T16:14:18.543 INFO:tasks.workunit.client.0.vm03.stdout:7/62: mknod d4/c14 0 2026-03-09T16:14:18.543 INFO:tasks.workunit.client.0.vm03.stdout:5/114: mkdir d2/d7/de/d2b 0 2026-03-09T16:14:18.543 INFO:tasks.workunit.client.0.vm03.stdout:1/77: rename d4/d6/c16 to d4/c1c 0 2026-03-09T16:14:18.543 INFO:tasks.workunit.client.0.vm03.stdout:7/63: mkdir d4/da/d12/d15 0 2026-03-09T16:14:18.543 INFO:tasks.workunit.client.0.vm03.stdout:3/79: write d5/fd [763680,66548] 0 2026-03-09T16:14:18.544 INFO:tasks.workunit.client.0.vm03.stdout:3/80: stat d5/d13 0 2026-03-09T16:14:18.544 INFO:tasks.workunit.client.0.vm03.stdout:3/81: readlink d5/l19 0 2026-03-09T16:14:18.547 INFO:tasks.workunit.client.0.vm03.stdout:5/115: symlink d2/d7/d8/d24/d27/l2c 0 2026-03-09T16:14:18.549 INFO:tasks.workunit.client.0.vm03.stdout:2/119: rmdir db/d1e 39 2026-03-09T16:14:18.550 INFO:tasks.workunit.client.0.vm03.stdout:1/78: mkdir d4/d6/d1d 0 2026-03-09T16:14:18.551 INFO:tasks.workunit.client.0.vm03.stdout:5/116: mknod d2/d7/d8/d16/c2d 0 2026-03-09T16:14:18.552 INFO:tasks.workunit.client.0.vm03.stdout:1/79: mknod d4/db/c1e 0 2026-03-09T16:14:18.553 INFO:tasks.workunit.client.0.vm03.stdout:1/80: write d4/fd [5779275,79155] 0 2026-03-09T16:14:18.555 INFO:tasks.workunit.client.0.vm03.stdout:5/117: write d2/d7/de/d11/f26 [1174175,38407] 0 2026-03-09T16:14:18.558 INFO:tasks.workunit.client.0.vm03.stdout:2/120: creat db/f2d x:0 0 0 2026-03-09T16:14:18.559 INFO:tasks.workunit.client.0.vm03.stdout:2/121: unlink f6 0 2026-03-09T16:14:18.560 INFO:tasks.workunit.client.0.vm03.stdout:2/122: creat db/f2e x:0 0 0 2026-03-09T16:14:18.568 INFO:tasks.workunit.client.0.vm03.stdout:4/121: sync 2026-03-09T16:14:18.568 INFO:tasks.workunit.client.0.vm03.stdout:7/64: sync 2026-03-09T16:14:18.568 INFO:tasks.workunit.client.0.vm03.stdout:1/81: sync 2026-03-09T16:14:18.568 INFO:tasks.workunit.client.0.vm03.stdout:7/65: chown d4/dc/ff 15 1 2026-03-09T16:14:18.570 INFO:tasks.workunit.client.0.vm03.stdout:4/122: sync 2026-03-09T16:14:18.577 INFO:tasks.workunit.client.0.vm03.stdout:2/123: rename db/f18 to db/d1e/f2f 0 2026-03-09T16:14:18.578 INFO:tasks.workunit.client.0.vm03.stdout:7/66: dwrite d4/f8 [0,4194304] 0 2026-03-09T16:14:18.583 INFO:tasks.workunit.client.0.vm03.stdout:8/99: truncate da/db/f13 187930 0 2026-03-09T16:14:18.586 INFO:tasks.workunit.client.0.vm03.stdout:9/97: truncate d2/d4/f17 1535327 0 2026-03-09T16:14:18.588 INFO:tasks.workunit.client.0.vm03.stdout:1/82: dwrite d4/d6/f19 [0,4194304] 0 2026-03-09T16:14:18.595 INFO:tasks.workunit.client.0.vm03.stdout:8/100: dwrite f8 [0,4194304] 0 2026-03-09T16:14:18.603 INFO:tasks.workunit.client.0.vm03.stdout:0/125: dwrite d0/da/ff [0,4194304] 0 2026-03-09T16:14:18.603 INFO:tasks.workunit.client.0.vm03.stdout:0/126: write d0/da/ff [3680808,112334] 0 2026-03-09T16:14:18.604 INFO:tasks.workunit.client.0.vm03.stdout:0/127: fdatasync d0/da/d1b/f12 0 2026-03-09T16:14:18.607 INFO:tasks.workunit.client.0.vm03.stdout:5/118: dread d2/d7/de/d11/f26 [0,4194304] 0 2026-03-09T16:14:18.612 INFO:tasks.workunit.client.0.vm03.stdout:7/67: mkdir d4/da/d12/d16 0 2026-03-09T16:14:18.614 INFO:tasks.workunit.client.0.vm03.stdout:8/101: dwrite f8 [0,4194304] 0 2026-03-09T16:14:18.617 
INFO:tasks.workunit.client.0.vm03.stdout:0/128: dwrite d0/da/d11/f13 [0,4194304] 0 2026-03-09T16:14:18.624 INFO:tasks.workunit.client.0.vm03.stdout:8/102: dwrite da/db/fe [0,4194304] 0 2026-03-09T16:14:18.625 INFO:tasks.workunit.client.0.vm03.stdout:9/98: unlink d2/d4/d11/f18 0 2026-03-09T16:14:18.626 INFO:tasks.workunit.client.0.vm03.stdout:6/70: write d9/fb [1455965,42302] 0 2026-03-09T16:14:18.627 INFO:tasks.workunit.client.0.vm03.stdout:9/99: readlink d2/d4/l19 0 2026-03-09T16:14:18.632 INFO:tasks.workunit.client.0.vm03.stdout:9/100: dread d2/d4/d11/f13 [0,4194304] 0 2026-03-09T16:14:18.634 INFO:tasks.workunit.client.0.vm03.stdout:8/103: dread f8 [0,4194304] 0 2026-03-09T16:14:18.637 INFO:tasks.workunit.client.0.vm03.stdout:4/123: mkdir d5/dd/d1f/d24 0 2026-03-09T16:14:18.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:18 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:18.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:18 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:18.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:18 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:18.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:18 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:18.641 INFO:tasks.workunit.client.0.vm03.stdout:3/82: dwrite d5/f6 [4194304,4194304] 0 2026-03-09T16:14:18.643 INFO:tasks.workunit.client.0.vm03.stdout:7/68: fdatasync d4/f8 0 2026-03-09T16:14:18.643 INFO:tasks.workunit.client.0.vm03.stdout:7/69: read f2 [1823204,107001] 0 2026-03-09T16:14:18.644 INFO:tasks.workunit.client.0.vm03.stdout:7/70: dread - d4/fe zero size 2026-03-09T16:14:18.658 INFO:tasks.workunit.client.0.vm03.stdout:6/71: sync 2026-03-09T16:14:18.658 INFO:tasks.workunit.client.0.vm03.stdout:6/72: readlink - no filename 2026-03-09T16:14:18.658 INFO:tasks.workunit.client.0.vm03.stdout:6/73: chown d9/cd 0 1 2026-03-09T16:14:18.659 INFO:tasks.workunit.client.0.vm03.stdout:8/104: rename da/c18 to da/d15/c21 0 2026-03-09T16:14:18.666 INFO:tasks.workunit.client.0.vm03.stdout:0/129: mknod d0/c24 0 2026-03-09T16:14:18.675 INFO:tasks.workunit.client.0.vm03.stdout:0/130: dread d0/da/d11/f13 [0,4194304] 0 2026-03-09T16:14:18.675 INFO:tasks.workunit.client.0.vm03.stdout:0/131: readlink d0/da/l21 0 2026-03-09T16:14:18.675 INFO:tasks.workunit.client.0.vm03.stdout:9/101: creat d2/de/f1c x:0 0 0 2026-03-09T16:14:18.675 INFO:tasks.workunit.client.0.vm03.stdout:8/105: rmdir da/d10 39 2026-03-09T16:14:18.675 INFO:tasks.workunit.client.0.vm03.stdout:8/106: dread - da/db/f1c zero size 2026-03-09T16:14:18.675 INFO:tasks.workunit.client.0.vm03.stdout:1/83: creat d4/f1f x:0 0 0 2026-03-09T16:14:18.678 INFO:tasks.workunit.client.0.vm03.stdout:2/124: getdents db 0 2026-03-09T16:14:18.679 INFO:tasks.workunit.client.0.vm03.stdout:5/119: link d2/c15 d2/d7/d8/d24/d27/c2e 0 2026-03-09T16:14:18.681 INFO:tasks.workunit.client.0.vm03.stdout:9/102: creat d2/d4/d11/f1d x:0 0 0 2026-03-09T16:14:18.682 INFO:tasks.workunit.client.0.vm03.stdout:8/107: fsync da/d10/f14 0 2026-03-09T16:14:18.690 INFO:tasks.workunit.client.0.vm03.stdout:2/125: write db/d12/f22 [785708,13961] 0 2026-03-09T16:14:18.690 INFO:tasks.workunit.client.0.vm03.stdout:5/120: fsync d2/d7/de/d11/f26 0 2026-03-09T16:14:18.690 
INFO:tasks.workunit.client.0.vm03.stdout:2/126: truncate db/d12/f21 332829 0 2026-03-09T16:14:18.690 INFO:tasks.workunit.client.0.vm03.stdout:9/103: creat d2/d4/d11/d12/f1e x:0 0 0 2026-03-09T16:14:18.690 INFO:tasks.workunit.client.0.vm03.stdout:9/104: chown d2/df/f10 2860 1 2026-03-09T16:14:18.690 INFO:tasks.workunit.client.0.vm03.stdout:1/84: truncate d4/d6/f9 3300614 0 2026-03-09T16:14:18.690 INFO:tasks.workunit.client.0.vm03.stdout:5/121: dwrite d2/d7/de/d11/f26 [0,4194304] 0 2026-03-09T16:14:18.692 INFO:tasks.workunit.client.0.vm03.stdout:3/83: getdents d5/d13 0 2026-03-09T16:14:18.693 INFO:tasks.workunit.client.0.vm03.stdout:2/127: mkdir db/d1e/d30 0 2026-03-09T16:14:18.693 INFO:tasks.workunit.client.0.vm03.stdout:2/128: write f0 [1519459,114291] 0 2026-03-09T16:14:18.694 INFO:tasks.workunit.client.0.vm03.stdout:9/105: mkdir d2/d4/d1f 0 2026-03-09T16:14:18.695 INFO:tasks.workunit.client.0.vm03.stdout:2/129: write f5 [1474948,1570] 0 2026-03-09T16:14:18.698 INFO:tasks.workunit.client.0.vm03.stdout:5/122: creat d2/d7/de/f2f x:0 0 0 2026-03-09T16:14:18.698 INFO:tasks.workunit.client.0.vm03.stdout:3/84: creat d5/f1b x:0 0 0 2026-03-09T16:14:18.703 INFO:tasks.workunit.client.0.vm03.stdout:2/130: readlink db/d12/l16 0 2026-03-09T16:14:18.704 INFO:tasks.workunit.client.0.vm03.stdout:1/85: getdents d4/d6/d1d 0 2026-03-09T16:14:18.704 INFO:tasks.workunit.client.0.vm03.stdout:3/85: mknod d5/d13/c1c 0 2026-03-09T16:14:18.705 INFO:tasks.workunit.client.0.vm03.stdout:5/123: mknod d2/d7/d8/d16/c30 0 2026-03-09T16:14:18.708 INFO:tasks.workunit.client.0.vm03.stdout:8/108: getdents da/d10 0 2026-03-09T16:14:18.709 INFO:tasks.workunit.client.0.vm03.stdout:8/109: fdatasync f6 0 2026-03-09T16:14:18.709 INFO:tasks.workunit.client.0.vm03.stdout:9/106: dwrite d2/d4/fd [0,4194304] 0 2026-03-09T16:14:18.709 INFO:tasks.workunit.client.0.vm03.stdout:8/110: write da/d10/f14 [1286965,88148] 0 2026-03-09T16:14:18.710 INFO:tasks.workunit.client.0.vm03.stdout:8/111: write f6 [3087527,130621] 0 2026-03-09T16:14:18.715 INFO:tasks.workunit.client.0.vm03.stdout:9/107: write d2/d4/d11/d12/f1e [657127,89652] 0 2026-03-09T16:14:18.720 INFO:tasks.workunit.client.0.vm03.stdout:5/124: dread d2/d7/de/d11/f26 [0,4194304] 0 2026-03-09T16:14:18.723 INFO:tasks.workunit.client.0.vm03.stdout:7/71: sync 2026-03-09T16:14:18.734 INFO:tasks.workunit.client.0.vm03.stdout:0/132: sync 2026-03-09T16:14:18.749 INFO:tasks.workunit.client.0.vm03.stdout:3/86: unlink d5/fd 0 2026-03-09T16:14:18.752 INFO:tasks.workunit.client.0.vm03.stdout:6/74: dread d9/fb [0,4194304] 0 2026-03-09T16:14:18.755 INFO:tasks.workunit.client.0.vm03.stdout:9/108: dwrite d2/d4/d11/f13 [4194304,4194304] 0 2026-03-09T16:14:18.757 INFO:tasks.workunit.client.0.vm03.stdout:7/72: chown d4/c13 40385255 1 2026-03-09T16:14:18.757 INFO:tasks.workunit.client.0.vm03.stdout:6/75: dread f7 [0,4194304] 0 2026-03-09T16:14:18.757 INFO:tasks.workunit.client.0.vm03.stdout:0/133: rename d0/da/l21 to d0/da/d11/l25 0 2026-03-09T16:14:18.759 INFO:tasks.workunit.client.0.vm03.stdout:1/86: mkdir d4/d6/d1d/d20 0 2026-03-09T16:14:18.761 INFO:tasks.workunit.client.0.vm03.stdout:8/112: symlink da/d1d/l22 0 2026-03-09T16:14:18.763 INFO:tasks.workunit.client.0.vm03.stdout:8/113: chown da/d10/f1f 9447 1 2026-03-09T16:14:18.765 INFO:tasks.workunit.client.0.vm03.stdout:3/87: creat d5/d13/f1d x:0 0 0 2026-03-09T16:14:18.770 INFO:tasks.workunit.client.0.vm03.stdout:7/73: dwrite d4/dc/ff [0,4194304] 0 2026-03-09T16:14:18.780 INFO:tasks.workunit.client.0.vm03.stdout:3/88: write d5/fb [1754065,113574] 0 
2026-03-09T16:14:18.781 INFO:tasks.workunit.client.0.vm03.stdout:3/89: dread d5/d13/f14 [0,4194304] 0 2026-03-09T16:14:18.781 INFO:tasks.workunit.client.0.vm03.stdout:5/125: unlink d2/cb 0 2026-03-09T16:14:18.781 INFO:tasks.workunit.client.0.vm03.stdout:6/76: creat d9/ff x:0 0 0 2026-03-09T16:14:18.781 INFO:tasks.workunit.client.0.vm03.stdout:8/114: dwrite da/d10/f1f [0,4194304] 0 2026-03-09T16:14:18.781 INFO:tasks.workunit.client.0.vm03.stdout:9/109: dread d2/f8 [4194304,4194304] 0 2026-03-09T16:14:18.781 INFO:tasks.workunit.client.0.vm03.stdout:1/87: creat d4/db/f21 x:0 0 0 2026-03-09T16:14:18.782 INFO:tasks.workunit.client.0.vm03.stdout:3/90: write d5/f6 [3916695,29346] 0 2026-03-09T16:14:18.783 INFO:tasks.workunit.client.0.vm03.stdout:7/74: dwrite d4/f8 [0,4194304] 0 2026-03-09T16:14:18.789 INFO:tasks.workunit.client.0.vm03.stdout:3/91: write d5/f6 [4521187,30180] 0 2026-03-09T16:14:18.791 INFO:tasks.workunit.client.0.vm03.stdout:0/134: rename d0/da/fe to d0/da/f26 0 2026-03-09T16:14:18.796 INFO:tasks.workunit.client.0.vm03.stdout:1/88: dwrite d4/f1f [0,4194304] 0 2026-03-09T16:14:18.805 INFO:tasks.workunit.client.0.vm03.stdout:1/89: dwrite f2 [0,4194304] 0 2026-03-09T16:14:18.829 INFO:tasks.workunit.client.0.vm03.stdout:6/77: symlink d9/l10 0 2026-03-09T16:14:18.833 INFO:tasks.workunit.client.0.vm03.stdout:8/115: write f8 [3100706,35332] 0 2026-03-09T16:14:18.838 INFO:tasks.workunit.client.0.vm03.stdout:9/110: mknod d2/d4/c20 0 2026-03-09T16:14:18.840 INFO:tasks.workunit.client.0.vm03.stdout:7/75: write f2 [697558,88804] 0 2026-03-09T16:14:18.842 INFO:tasks.workunit.client.0.vm03.stdout:3/92: mkdir d5/d1e 0 2026-03-09T16:14:18.842 INFO:tasks.workunit.client.0.vm03.stdout:3/93: chown d5/ca 3 1 2026-03-09T16:14:18.846 INFO:tasks.workunit.client.0.vm03.stdout:6/78: mknod d9/c11 0 2026-03-09T16:14:18.848 INFO:tasks.workunit.client.0.vm03.stdout:7/76: symlink d4/dc/l17 0 2026-03-09T16:14:18.858 INFO:tasks.workunit.client.0.vm03.stdout:1/90: symlink d4/d6/l22 0 2026-03-09T16:14:18.858 INFO:tasks.workunit.client.0.vm03.stdout:1/91: chown d4/f1b 91869244 1 2026-03-09T16:14:18.859 INFO:tasks.workunit.client.0.vm03.stdout:1/92: chown d4/db 32169 1 2026-03-09T16:14:18.859 INFO:tasks.workunit.client.0.vm03.stdout:8/116: getdents da/d10 0 2026-03-09T16:14:18.859 INFO:tasks.workunit.client.0.vm03.stdout:8/117: write da/d10/f1f [3322196,79589] 0 2026-03-09T16:14:18.859 INFO:tasks.workunit.client.0.vm03.stdout:8/118: chown da/d10 884 1 2026-03-09T16:14:18.859 INFO:tasks.workunit.client.0.vm03.stdout:0/135: rename d0/da/d1b/f12 to d0/f27 0 2026-03-09T16:14:18.859 INFO:tasks.workunit.client.0.vm03.stdout:0/136: dread d0/da/d11/f13 [0,4194304] 0 2026-03-09T16:14:18.863 INFO:tasks.workunit.client.0.vm03.stdout:3/94: rename d5/f1b to d5/d13/f1f 0 2026-03-09T16:14:18.874 INFO:tasks.workunit.client.0.vm03.stdout:1/93: mkdir d4/d6/d1d/d20/d23 0 2026-03-09T16:14:18.874 INFO:tasks.workunit.client.0.vm03.stdout:1/94: dread d4/fd [0,4194304] 0 2026-03-09T16:14:18.874 INFO:tasks.workunit.client.0.vm03.stdout:1/95: dread - d4/f1b zero size 2026-03-09T16:14:18.874 INFO:tasks.workunit.client.0.vm03.stdout:1/96: write d4/f1b [107981,68422] 0 2026-03-09T16:14:18.883 INFO:tasks.workunit.client.0.vm03.stdout:1/97: mkdir d4/d6/d1d/d24 0 2026-03-09T16:14:18.891 INFO:tasks.workunit.client.0.vm03.stdout:1/98: mkdir d4/d6/d1d/d24/d25 0 2026-03-09T16:14:18.922 INFO:tasks.workunit.client.0.vm03.stdout:8/119: dread da/db/f13 [0,4194304] 0 2026-03-09T16:14:18.933 INFO:tasks.workunit.client.0.vm03.stdout:9/111: dread 
d2/d4/d11/d12/f1e [0,4194304] 0 2026-03-09T16:14:18.934 INFO:tasks.workunit.client.0.vm03.stdout:9/112: symlink d2/d4/d11/d12/l21 0 2026-03-09T16:14:18.935 INFO:tasks.workunit.client.0.vm03.stdout:9/113: chown d2/d4/l19 32057310 1 2026-03-09T16:14:18.936 INFO:tasks.workunit.client.0.vm03.stdout:9/114: creat d2/df/f22 x:0 0 0 2026-03-09T16:14:18.937 INFO:tasks.workunit.client.0.vm03.stdout:9/115: creat d2/d4/d1f/f23 x:0 0 0 2026-03-09T16:14:18.937 INFO:tasks.workunit.client.0.vm03.stdout:9/116: symlink d2/df/l24 0 2026-03-09T16:14:18.952 INFO:tasks.workunit.client.0.vm03.stdout:3/95: sync 2026-03-09T16:14:18.955 INFO:tasks.workunit.client.0.vm03.stdout:8/120: sync 2026-03-09T16:14:18.956 INFO:tasks.workunit.client.0.vm03.stdout:9/117: sync 2026-03-09T16:14:18.956 INFO:tasks.workunit.client.0.vm03.stdout:5/126: fdatasync d2/d7/de/d11/f26 0 2026-03-09T16:14:18.959 INFO:tasks.workunit.client.0.vm03.stdout:5/127: fdatasync d2/d7/de/d11/f26 0 2026-03-09T16:14:18.964 INFO:tasks.workunit.client.0.vm03.stdout:5/128: dwrite d2/d7/de/d11/f26 [0,4194304] 0 2026-03-09T16:14:18.967 INFO:tasks.workunit.client.0.vm03.stdout:5/129: write d2/d7/de/f2f [344336,130940] 0 2026-03-09T16:14:18.968 INFO:tasks.workunit.client.0.vm03.stdout:3/96: dwrite d5/d13/f1f [0,4194304] 0 2026-03-09T16:14:18.970 INFO:tasks.workunit.client.0.vm03.stdout:5/130: sync 2026-03-09T16:14:18.971 INFO:tasks.workunit.client.0.vm03.stdout:5/131: chown d2/d7/d8/c18 6182 1 2026-03-09T16:14:18.974 INFO:tasks.workunit.client.0.vm03.stdout:7/77: fdatasync d4/f8 0 2026-03-09T16:14:18.974 INFO:tasks.workunit.client.0.vm03.stdout:5/132: sync 2026-03-09T16:14:18.975 INFO:tasks.workunit.client.0.vm03.stdout:7/78: write d4/f8 [442314,74448] 0 2026-03-09T16:14:18.976 INFO:tasks.workunit.client.0.vm03.stdout:3/97: dread d5/f16 [0,4194304] 0 2026-03-09T16:14:18.996 INFO:tasks.workunit.client.0.vm03.stdout:7/79: mkdir d4/da/d18 0 2026-03-09T16:14:18.998 INFO:tasks.workunit.client.0.vm03.stdout:3/98: creat d5/d13/f20 x:0 0 0 2026-03-09T16:14:18.999 INFO:tasks.workunit.client.0.vm03.stdout:3/99: dread - d5/f11 zero size 2026-03-09T16:14:19.002 INFO:tasks.workunit.client.0.vm03.stdout:9/118: link d2/f15 d2/d4/d1f/f25 0 2026-03-09T16:14:19.003 INFO:tasks.workunit.client.0.vm03.stdout:7/80: mkdir d4/da/d19 0 2026-03-09T16:14:19.004 INFO:tasks.workunit.client.0.vm03.stdout:3/100: symlink d5/l21 0 2026-03-09T16:14:19.008 INFO:tasks.workunit.client.0.vm03.stdout:3/101: dwrite d5/d13/f1f [0,4194304] 0 2026-03-09T16:14:19.018 INFO:tasks.workunit.client.0.vm03.stdout:9/119: symlink d2/d4/l26 0 2026-03-09T16:14:19.020 INFO:tasks.workunit.client.0.vm03.stdout:7/81: rename f2 to d4/dc/f1a 0 2026-03-09T16:14:19.020 INFO:tasks.workunit.client.0.vm03.stdout:7/82: stat d4/da/d18 0 2026-03-09T16:14:19.026 INFO:tasks.workunit.client.0.vm03.stdout:3/102: dwrite d5/d13/f14 [0,4194304] 0 2026-03-09T16:14:19.030 INFO:tasks.workunit.client.0.vm03.stdout:9/120: mknod d2/d4/d1f/c27 0 2026-03-09T16:14:19.032 INFO:tasks.workunit.client.0.vm03.stdout:9/121: mkdir d2/d4/d11/d12/d28 0 2026-03-09T16:14:19.043 INFO:tasks.workunit.client.0.vm03.stdout:9/122: fdatasync d2/de/f1c 0 2026-03-09T16:14:19.043 INFO:tasks.workunit.client.0.vm03.stdout:9/123: dread - d2/df/f22 zero size 2026-03-09T16:14:19.043 INFO:tasks.workunit.client.0.vm03.stdout:8/121: getdents da/d15 0 2026-03-09T16:14:19.043 INFO:tasks.workunit.client.0.vm03.stdout:9/124: mkdir d2/d4/d11/d29 0 2026-03-09T16:14:19.043 INFO:tasks.workunit.client.0.vm03.stdout:9/125: stat d2/f15 0 2026-03-09T16:14:19.043 
INFO:tasks.workunit.client.0.vm03.stdout:4/124: dwrite f1 [0,4194304] 0 2026-03-09T16:14:19.044 INFO:tasks.workunit.client.0.vm03.stdout:3/103: sync 2026-03-09T16:14:19.045 INFO:tasks.workunit.client.0.vm03.stdout:3/104: rename d5/d1e to d5/d1e/d22 22 2026-03-09T16:14:19.045 INFO:tasks.workunit.client.0.vm03.stdout:3/105: readlink d5/lf 0 2026-03-09T16:14:19.045 INFO:tasks.workunit.client.0.vm03.stdout:3/106: chown d5/d13/f14 3 1 2026-03-09T16:14:19.051 INFO:tasks.workunit.client.0.vm03.stdout:9/126: mkdir d2/d4/d11/d29/d2a 0 2026-03-09T16:14:19.053 INFO:tasks.workunit.client.0.vm03.stdout:8/122: dread da/d15/f1b [0,4194304] 0 2026-03-09T16:14:19.055 INFO:tasks.workunit.client.0.vm03.stdout:3/107: mknod d5/d13/c23 0 2026-03-09T16:14:19.066 INFO:tasks.workunit.client.0.vm03.stdout:9/127: dwrite d2/d4/d11/f1d [0,4194304] 0 2026-03-09T16:14:19.066 INFO:tasks.workunit.client.0.vm03.stdout:8/123: dread da/d10/f1f [0,4194304] 0 2026-03-09T16:14:19.066 INFO:tasks.workunit.client.0.vm03.stdout:3/108: dread d5/d13/f1f [0,4194304] 0 2026-03-09T16:14:19.066 INFO:tasks.workunit.client.0.vm03.stdout:3/109: dread - d5/f11 zero size 2026-03-09T16:14:19.066 INFO:tasks.workunit.client.0.vm03.stdout:3/110: write d5/f10 [1000100,105683] 0 2026-03-09T16:14:19.075 INFO:tasks.workunit.client.0.vm03.stdout:8/124: creat da/d10/f23 x:0 0 0 2026-03-09T16:14:19.079 INFO:tasks.workunit.client.0.vm03.stdout:4/125: getdents d5 0 2026-03-09T16:14:19.093 INFO:tasks.workunit.client.0.vm03.stdout:8/125: dwrite f8 [0,4194304] 0 2026-03-09T16:14:19.093 INFO:tasks.workunit.client.0.vm03.stdout:3/111: write d5/f16 [113758,94478] 0 2026-03-09T16:14:19.093 INFO:tasks.workunit.client.0.vm03.stdout:4/126: mkdir d5/db/d25 0 2026-03-09T16:14:19.093 INFO:tasks.workunit.client.0.vm03.stdout:4/127: rmdir d5/db/d1b 0 2026-03-09T16:14:19.093 INFO:tasks.workunit.client.0.vm03.stdout:4/128: fdatasync d5/d17/f21 0 2026-03-09T16:14:19.103 INFO:tasks.workunit.client.0.vm03.stdout:5/133: fdatasync d2/d7/de/d11/f26 0 2026-03-09T16:14:19.105 INFO:tasks.workunit.client.0.vm03.stdout:5/134: mkdir d2/d7/de/d11/d19/d31 0 2026-03-09T16:14:19.105 INFO:tasks.workunit.client.0.vm03.stdout:5/135: write d2/d7/de/f2f [1036105,23785] 0 2026-03-09T16:14:19.106 INFO:tasks.workunit.client.0.vm03.stdout:5/136: fdatasync d2/d7/de/d11/f26 0 2026-03-09T16:14:19.125 INFO:tasks.workunit.client.0.vm03.stdout:5/137: dread d2/d7/de/f2f [0,4194304] 0 2026-03-09T16:14:19.128 INFO:tasks.workunit.client.0.vm03.stdout:5/138: creat d2/d7/de/d11/f32 x:0 0 0 2026-03-09T16:14:19.130 INFO:tasks.workunit.client.0.vm03.stdout:2/131: truncate f5 415124 0 2026-03-09T16:14:19.133 INFO:tasks.workunit.client.0.vm03.stdout:5/139: mkdir d2/d7/de/d33 0 2026-03-09T16:14:19.136 INFO:tasks.workunit.client.0.vm03.stdout:2/132: fsync db/d1e/f2f 0 2026-03-09T16:14:19.136 INFO:tasks.workunit.client.0.vm03.stdout:2/133: fdatasync db/f2e 0 2026-03-09T16:14:19.137 INFO:tasks.workunit.client.0.vm03.stdout:2/134: chown db/c24 1 1 2026-03-09T16:14:19.143 INFO:tasks.workunit.client.0.vm03.stdout:2/135: creat db/f31 x:0 0 0 2026-03-09T16:14:19.146 INFO:tasks.workunit.client.0.vm03.stdout:5/140: link d2/d7/d8/c18 d2/d7/d8/d24/d27/c34 0 2026-03-09T16:14:19.148 INFO:tasks.workunit.client.0.vm03.stdout:5/141: write d2/d7/de/d11/f32 [22367,70421] 0 2026-03-09T16:14:19.154 INFO:tasks.workunit.client.0.vm03.stdout:5/142: chown d2/ca 1230 1 2026-03-09T16:14:19.158 INFO:tasks.workunit.client.0.vm03.stdout:5/143: mkdir d2/d7/de/d11/d19/d31/d35 0 2026-03-09T16:14:19.159 
INFO:tasks.workunit.client.0.vm03.stdout:5/144: write d2/d7/de/d11/f26 [2164458,42056] 0 2026-03-09T16:14:19.161 INFO:tasks.workunit.client.0.vm03.stdout:5/145: creat d2/d7/d8/f36 x:0 0 0 2026-03-09T16:14:19.164 INFO:tasks.workunit.client.0.vm03.stdout:5/146: mkdir d2/d37 0 2026-03-09T16:14:19.170 INFO:tasks.workunit.client.0.vm03.stdout:0/137: getdents d0/da 0 2026-03-09T16:14:19.174 INFO:tasks.workunit.client.0.vm03.stdout:0/138: creat d0/f28 x:0 0 0 2026-03-09T16:14:19.180 INFO:tasks.workunit.client.0.vm03.stdout:0/139: getdents d0/da/d11 0 2026-03-09T16:14:19.181 INFO:tasks.workunit.client.0.vm03.stdout:0/140: dread d0/f27 [0,4194304] 0 2026-03-09T16:14:19.181 INFO:tasks.workunit.client.0.vm03.stdout:5/147: sync 2026-03-09T16:14:19.184 INFO:tasks.workunit.client.0.vm03.stdout:5/148: rename d2/d37 to d2/d7/de/d11/d38 0 2026-03-09T16:14:19.190 INFO:tasks.workunit.client.0.vm03.stdout:5/149: mkdir d2/d7/de/d11/d19/d31/d35/d39 0 2026-03-09T16:14:19.192 INFO:tasks.workunit.client.0.vm03.stdout:5/150: write d2/d7/de/f2f [1761053,43853] 0 2026-03-09T16:14:19.195 INFO:tasks.workunit.client.0.vm03.stdout:5/151: rename d2/d7/de/d2b to d2/d7/de/d3a 0 2026-03-09T16:14:19.197 INFO:tasks.workunit.client.0.vm03.stdout:5/152: mkdir d2/d7/de/d11/d38/d3b 0 2026-03-09T16:14:19.199 INFO:tasks.workunit.client.0.vm03.stdout:5/153: mkdir d2/d7/d3c 0 2026-03-09T16:14:19.199 INFO:tasks.workunit.client.0.vm03.stdout:5/154: fdatasync d2/d7/d8/f36 0 2026-03-09T16:14:19.200 INFO:tasks.workunit.client.0.vm03.stdout:5/155: fsync d2/d7/de/d11/f26 0 2026-03-09T16:14:19.201 INFO:tasks.workunit.client.0.vm03.stdout:5/156: write d2/d7/de/d11/f32 [448321,124925] 0 2026-03-09T16:14:19.202 INFO:tasks.workunit.client.0.vm03.stdout:5/157: stat d2/d7/de/d11/d19/d31/d35/d39 0 2026-03-09T16:14:19.208 INFO:tasks.workunit.client.0.vm03.stdout:6/79: rmdir d9 39 2026-03-09T16:14:19.213 INFO:tasks.workunit.client.0.vm03.stdout:6/80: symlink d9/l12 0 2026-03-09T16:14:19.213 INFO:tasks.workunit.client.0.vm03.stdout:6/81: dread - d9/ff zero size 2026-03-09T16:14:19.217 INFO:tasks.workunit.client.0.vm03.stdout:1/99: getdents d4/d6/d1d/d24 0 2026-03-09T16:14:19.218 INFO:tasks.workunit.client.0.vm03.stdout:1/100: write d4/f1b [268939,83223] 0 2026-03-09T16:14:19.219 INFO:tasks.workunit.client.0.vm03.stdout:1/101: write d4/db/f21 [656798,76655] 0 2026-03-09T16:14:19.220 INFO:tasks.workunit.client.0.vm03.stdout:6/82: unlink d9/l10 0 2026-03-09T16:14:19.224 INFO:tasks.workunit.client.0.vm03.stdout:6/83: creat d9/f13 x:0 0 0 2026-03-09T16:14:19.229 INFO:tasks.workunit.client.0.vm03.stdout:1/102: rmdir d4/db/d18 0 2026-03-09T16:14:19.229 INFO:tasks.workunit.client.0.vm03.stdout:6/84: dwrite d9/fb [0,4194304] 0 2026-03-09T16:14:19.231 INFO:tasks.workunit.client.0.vm03.stdout:6/85: chown d9/cc 992 1 2026-03-09T16:14:19.234 INFO:tasks.workunit.client.0.vm03.stdout:1/103: dread d4/d6/f19 [0,4194304] 0 2026-03-09T16:14:19.241 INFO:tasks.workunit.client.0.vm03.stdout:6/86: mkdir d9/d14 0 2026-03-09T16:14:19.242 INFO:tasks.workunit.client.0.vm03.stdout:6/87: truncate d9/ff 927456 0 2026-03-09T16:14:19.242 INFO:tasks.workunit.client.0.vm03.stdout:6/88: readlink d9/l12 0 2026-03-09T16:14:19.242 INFO:tasks.workunit.client.0.vm03.stdout:6/89: readlink d9/l12 0 2026-03-09T16:14:19.246 INFO:tasks.workunit.client.0.vm03.stdout:6/90: rename d9/fb to d9/f15 0 2026-03-09T16:14:19.251 INFO:tasks.workunit.client.0.vm03.stdout:6/91: creat d9/d14/f16 x:0 0 0 2026-03-09T16:14:19.252 INFO:tasks.workunit.client.0.vm03.stdout:6/92: mkdir d9/d17 0 
2026-03-09T16:14:19.252 INFO:tasks.workunit.client.0.vm03.stdout:6/93: truncate d9/ff 1469830 0 2026-03-09T16:14:19.256 INFO:tasks.workunit.client.0.vm03.stdout:6/94: dwrite d9/d14/f16 [0,4194304] 0 2026-03-09T16:14:19.257 INFO:tasks.workunit.client.0.vm03.stdout:6/95: stat d9/d14 0 2026-03-09T16:14:19.261 INFO:tasks.workunit.client.0.vm03.stdout:6/96: dread d9/d14/f16 [0,4194304] 0 2026-03-09T16:14:19.262 INFO:tasks.workunit.client.0.vm03.stdout:6/97: readlink d9/l12 0 2026-03-09T16:14:19.262 INFO:tasks.workunit.client.0.vm03.stdout:6/98: write d9/f13 [310636,26128] 0 2026-03-09T16:14:19.270 INFO:tasks.workunit.client.0.vm03.stdout:3/112: write d5/d13/f14 [5087202,126607] 0 2026-03-09T16:14:19.274 INFO:tasks.workunit.client.0.vm03.stdout:7/83: truncate d4/dc/f1a 946739 0 2026-03-09T16:14:19.276 INFO:tasks.workunit.client.0.vm03.stdout:9/128: getdents d2/d4/d11 0 2026-03-09T16:14:19.276 INFO:tasks.workunit.client.0.vm03.stdout:9/129: write d2/f7 [3090906,122475] 0 2026-03-09T16:14:19.279 INFO:tasks.workunit.client.0.vm03.stdout:3/113: creat d5/f24 x:0 0 0 2026-03-09T16:14:19.283 INFO:tasks.workunit.client.0.vm03.stdout:3/114: dwrite d5/fb [4194304,4194304] 0 2026-03-09T16:14:19.288 INFO:tasks.workunit.client.0.vm03.stdout:9/130: symlink d2/de/l2b 0 2026-03-09T16:14:19.298 INFO:tasks.workunit.client.0.vm03.stdout:3/115: dwrite d5/d13/f1f [0,4194304] 0 2026-03-09T16:14:19.302 INFO:tasks.workunit.client.0.vm03.stdout:3/116: truncate d5/d13/f20 636145 0 2026-03-09T16:14:19.318 INFO:tasks.workunit.client.0.vm03.stdout:8/126: truncate da/d10/f14 3720337 0 2026-03-09T16:14:19.320 INFO:tasks.workunit.client.0.vm03.stdout:4/129: getdents d5/db 0 2026-03-09T16:14:19.331 INFO:tasks.workunit.client.0.vm03.stdout:9/131: fsync d2/f7 0 2026-03-09T16:14:19.358 INFO:tasks.workunit.client.0.vm03.stdout:0/141: truncate d0/da/d11/f18 145744 0 2026-03-09T16:14:19.386 INFO:tasks.workunit.client.0.vm03.stdout:6/99: truncate d9/f13 192810 0 2026-03-09T16:14:19.443 INFO:tasks.workunit.client.0.vm03.stdout:7/84: dwrite d4/dc/f1a [0,4194304] 0 2026-03-09T16:14:19.459 INFO:tasks.workunit.client.0.vm03.stdout:6/100: dwrite d9/d14/f16 [0,4194304] 0 2026-03-09T16:14:19.460 INFO:tasks.workunit.client.0.vm03.stdout:6/101: dread d9/ff [0,4194304] 0 2026-03-09T16:14:19.461 INFO:tasks.workunit.client.0.vm03.stdout:6/102: read d9/ff [1228642,120933] 0 2026-03-09T16:14:19.461 INFO:tasks.workunit.client.0.vm03.stdout:6/103: chown d9/d14 6533 1 2026-03-09T16:14:19.461 INFO:tasks.workunit.client.0.vm03.stdout:6/104: chown d9/cd 26320 1 2026-03-09T16:14:19.468 INFO:tasks.workunit.client.0.vm03.stdout:4/130: creat d5/db/d25/f26 x:0 0 0 2026-03-09T16:14:19.473 INFO:tasks.workunit.client.0.vm03.stdout:8/127: write da/d10/f1f [2603335,99917] 0 2026-03-09T16:14:19.474 INFO:tasks.workunit.client.0.vm03.stdout:8/128: fdatasync da/db/fe 0 2026-03-09T16:14:19.474 INFO:tasks.workunit.client.0.vm03.stdout:8/129: chown da/d10/f1f 2126054551 1 2026-03-09T16:14:19.475 INFO:tasks.workunit.client.0.vm03.stdout:5/158: rmdir d2/d7/de/d3a 0 2026-03-09T16:14:19.477 INFO:tasks.workunit.client.0.vm03.stdout:3/117: getdents d5/d13 0 2026-03-09T16:14:19.479 INFO:tasks.workunit.client.0.vm03.stdout:9/132: dwrite d2/f8 [0,4194304] 0 2026-03-09T16:14:19.481 INFO:tasks.workunit.client.0.vm03.stdout:9/133: dread d2/f7 [0,4194304] 0 2026-03-09T16:14:19.482 INFO:tasks.workunit.client.0.vm03.stdout:9/134: dread - d2/d4/d1f/f23 zero size 2026-03-09T16:14:19.483 INFO:tasks.workunit.client.0.vm03.stdout:9/135: chown d2/d4/d11/d12/l21 969704 1 
2026-03-09T16:14:19.488 INFO:tasks.workunit.client.0.vm03.stdout:9/136: dwrite d2/df/f10 [0,4194304] 0 2026-03-09T16:14:19.489 INFO:tasks.workunit.client.0.vm03.stdout:4/131: symlink d5/dd/d1f/l27 0 2026-03-09T16:14:19.492 INFO:tasks.workunit.client.0.vm03.stdout:8/130: chown c5 879287586 1 2026-03-09T16:14:19.499 INFO:tasks.workunit.client.0.vm03.stdout:0/142: rename d0/d7/f20 to d0/f29 0 2026-03-09T16:14:19.500 INFO:tasks.workunit.client.0.vm03.stdout:0/143: chown d0/f3 3769228 1 2026-03-09T16:14:19.506 INFO:tasks.workunit.client.0.vm03.stdout:1/104: write d4/d6/f9 [300571,129188] 0 2026-03-09T16:14:19.510 INFO:tasks.workunit.client.0.vm03.stdout:7/85: truncate d4/dc/ff 2708705 0 2026-03-09T16:14:19.513 INFO:tasks.workunit.client.0.vm03.stdout:6/105: rename d9/c11 to d9/d14/c18 0 2026-03-09T16:14:19.517 INFO:tasks.workunit.client.0.vm03.stdout:8/131: rename da/d10 to da/d10/d24 22 2026-03-09T16:14:19.517 INFO:tasks.workunit.client.0.vm03.stdout:8/132: read da/db/f13 [60589,82074] 0 2026-03-09T16:14:19.517 INFO:tasks.workunit.client.0.vm03.stdout:8/133: dread f6 [0,4194304] 0 2026-03-09T16:14:19.517 INFO:tasks.workunit.client.0.vm03.stdout:8/134: fsync f8 0 2026-03-09T16:14:19.518 INFO:tasks.workunit.client.0.vm03.stdout:5/159: mkdir d2/d7/d3c/d3d 0 2026-03-09T16:14:19.519 INFO:tasks.workunit.client.0.vm03.stdout:5/160: read d2/d7/de/d11/f26 [4054143,57014] 0 2026-03-09T16:14:19.520 INFO:tasks.workunit.client.0.vm03.stdout:1/105: mknod d4/d6/d1d/c26 0 2026-03-09T16:14:19.520 INFO:tasks.workunit.client.0.vm03.stdout:9/137: creat d2/d4/d11/d12/d28/f2c x:0 0 0 2026-03-09T16:14:19.521 INFO:tasks.workunit.client.0.vm03.stdout:2/136: getdents db/d12 0 2026-03-09T16:14:19.525 INFO:tasks.workunit.client.0.vm03.stdout:5/161: dwrite d2/d7/d8/f36 [0,4194304] 0 2026-03-09T16:14:19.529 INFO:tasks.workunit.client.0.vm03.stdout:9/138: mknod d2/d4/d11/d29/c2d 0 2026-03-09T16:14:19.534 INFO:tasks.workunit.client.0.vm03.stdout:5/162: dread d2/d7/de/f2f [0,4194304] 0 2026-03-09T16:14:19.535 INFO:tasks.workunit.client.0.vm03.stdout:1/106: dwrite f2 [0,4194304] 0 2026-03-09T16:14:19.538 INFO:tasks.workunit.client.0.vm03.stdout:5/163: chown d2/d7/de/d11/d19/d31/d35 46 1 2026-03-09T16:14:19.543 INFO:tasks.workunit.client.0.vm03.stdout:5/164: dread d2/d7/d8/f36 [0,4194304] 0 2026-03-09T16:14:19.543 INFO:tasks.workunit.client.0.vm03.stdout:2/137: symlink db/d12/l32 0 2026-03-09T16:14:19.544 INFO:tasks.workunit.client.0.vm03.stdout:5/165: write d2/d7/de/d11/f26 [432960,16354] 0 2026-03-09T16:14:19.546 INFO:tasks.workunit.client.0.vm03.stdout:5/166: write d2/d7/de/d11/f32 [827906,103051] 0 2026-03-09T16:14:19.554 INFO:tasks.workunit.client.0.vm03.stdout:6/106: symlink d9/d17/l19 0 2026-03-09T16:14:19.556 INFO:tasks.workunit.client.0.vm03.stdout:6/107: dread d9/d14/f16 [0,4194304] 0 2026-03-09T16:14:19.558 INFO:tasks.workunit.client.0.vm03.stdout:8/135: mknod da/db/c25 0 2026-03-09T16:14:19.560 INFO:tasks.workunit.client.0.vm03.stdout:7/86: rename d4/dc/ff to d4/da/d18/f1b 0 2026-03-09T16:14:19.563 INFO:tasks.workunit.client.0.vm03.stdout:4/132: dwrite d5/f7 [4194304,4194304] 0 2026-03-09T16:14:19.576 INFO:tasks.workunit.client.0.vm03.stdout:0/144: dwrite d0/d7/f19 [0,4194304] 0 2026-03-09T16:14:19.576 INFO:tasks.workunit.client.0.vm03.stdout:3/118: getdents d5 0 2026-03-09T16:14:19.576 INFO:tasks.workunit.client.0.vm03.stdout:3/119: dread d5/d13/f20 [0,4194304] 0 2026-03-09T16:14:19.576 INFO:tasks.workunit.client.0.vm03.stdout:4/133: dread d5/f7 [4194304,4194304] 0 2026-03-09T16:14:19.586 
INFO:tasks.workunit.client.0.vm03.stdout:6/108: dwrite d9/f13 [0,4194304] 0 2026-03-09T16:14:19.588 INFO:tasks.workunit.client.0.vm03.stdout:8/136: rmdir da/d10 39 2026-03-09T16:14:19.598 INFO:tasks.workunit.client.0.vm03.stdout:0/145: mkdir d0/d7/d2a 0 2026-03-09T16:14:19.601 INFO:tasks.workunit.client.0.vm03.stdout:4/134: creat d5/db/f28 x:0 0 0 2026-03-09T16:14:19.602 INFO:tasks.workunit.client.0.vm03.stdout:4/135: chown d5/db/l1a 227365 1 2026-03-09T16:14:19.602 INFO:tasks.workunit.client.0.vm03.stdout:4/136: chown d5/dd/d1f/d24 61 1 2026-03-09T16:14:19.605 INFO:tasks.workunit.client.0.vm03.stdout:2/138: symlink db/l33 0 2026-03-09T16:14:19.606 INFO:tasks.workunit.client.0.vm03.stdout:5/167: symlink d2/d7/de/d11/d19/d29/l3e 0 2026-03-09T16:14:19.610 INFO:tasks.workunit.client.0.vm03.stdout:8/137: mknod da/d1d/c26 0 2026-03-09T16:14:19.611 INFO:tasks.workunit.client.0.vm03.stdout:7/87: mknod d4/c1c 0 2026-03-09T16:14:19.616 INFO:tasks.workunit.client.0.vm03.stdout:0/146: dwrite d0/da/f26 [0,4194304] 0 2026-03-09T16:14:19.617 INFO:tasks.workunit.client.0.vm03.stdout:3/120: getdents d5/d1e 0 2026-03-09T16:14:19.627 INFO:tasks.workunit.client.0.vm03.stdout:5/168: chown d2/d7/d8/d16/c10 4217 1 2026-03-09T16:14:19.627 INFO:tasks.workunit.client.0.vm03.stdout:5/169: chown d2/d7/l2a 870 1 2026-03-09T16:14:19.628 INFO:tasks.workunit.client.0.vm03.stdout:8/138: symlink da/d15/l27 0 2026-03-09T16:14:19.633 INFO:tasks.workunit.client.0.vm03.stdout:8/139: dwrite da/db/f1c [0,4194304] 0 2026-03-09T16:14:19.645 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:19 vm03.local ceph-mon[51019]: pgmap v141: 65 pgs: 65 active+clean; 260 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 90 KiB/s rd, 15 MiB/s wr, 303 op/s 2026-03-09T16:14:19.645 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:19 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:14:19.652 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:19 vm05.local ceph-mon[58702]: pgmap v141: 65 pgs: 65 active+clean; 260 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 90 KiB/s rd, 15 MiB/s wr, 303 op/s 2026-03-09T16:14:19.655 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:19 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:14:19.655 INFO:tasks.workunit.client.0.vm03.stdout:0/147: symlink d0/da/d11/l2b 0 2026-03-09T16:14:19.655 INFO:tasks.workunit.client.0.vm03.stdout:0/148: write d0/f1a [2694721,38171] 0 2026-03-09T16:14:19.657 INFO:tasks.workunit.client.0.vm03.stdout:3/121: rename d5/d13/f1f to d5/d13/f25 0 2026-03-09T16:14:19.657 INFO:tasks.workunit.client.0.vm03.stdout:3/122: write d5/fb [6642277,73615] 0 2026-03-09T16:14:19.658 INFO:tasks.workunit.client.0.vm03.stdout:3/123: dread d5/f16 [0,4194304] 0 2026-03-09T16:14:19.665 INFO:tasks.workunit.client.0.vm03.stdout:5/170: fsync d2/d7/d8/f36 0 2026-03-09T16:14:19.672 INFO:tasks.workunit.client.0.vm03.stdout:8/140: fdatasync da/d10/f23 0 2026-03-09T16:14:19.673 INFO:tasks.workunit.client.0.vm03.stdout:8/141: write da/db/fe [2983652,105115] 0 2026-03-09T16:14:19.680 INFO:tasks.workunit.client.0.vm03.stdout:0/149: creat d0/da/f2c x:0 0 0 2026-03-09T16:14:19.683 INFO:tasks.workunit.client.0.vm03.stdout:9/139: write d2/d4/d1f/f25 [8169052,123601] 0 2026-03-09T16:14:19.693 INFO:tasks.workunit.client.0.vm03.stdout:5/171: chown 
d2/l22 464 1 2026-03-09T16:14:19.694 INFO:tasks.workunit.client.0.vm03.stdout:5/172: chown d2/d7/d8/d16/cc 3817628 1 2026-03-09T16:14:19.695 INFO:tasks.workunit.client.0.vm03.stdout:8/142: mkdir da/d10/d28 0 2026-03-09T16:14:19.695 INFO:tasks.workunit.client.0.vm03.stdout:8/143: read f8 [3674399,3378] 0 2026-03-09T16:14:19.701 INFO:tasks.workunit.client.0.vm03.stdout:0/150: rename d0/da/d11/l2b to d0/d7/d2a/l2d 0 2026-03-09T16:14:19.703 INFO:tasks.workunit.client.0.vm03.stdout:1/107: truncate d4/f1f 1544333 0 2026-03-09T16:14:19.704 INFO:tasks.workunit.client.0.vm03.stdout:9/140: symlink d2/d4/d11/d12/l2e 0 2026-03-09T16:14:19.704 INFO:tasks.workunit.client.0.vm03.stdout:9/141: chown d2/d4/d11/d29 19576 1 2026-03-09T16:14:19.705 INFO:tasks.workunit.client.0.vm03.stdout:9/142: fdatasync d2/f7 0 2026-03-09T16:14:19.707 INFO:tasks.workunit.client.0.vm03.stdout:1/108: dwrite d4/f1b [0,4194304] 0 2026-03-09T16:14:19.712 INFO:tasks.workunit.client.0.vm03.stdout:9/143: dwrite d2/df/f10 [0,4194304] 0 2026-03-09T16:14:19.714 INFO:tasks.workunit.client.0.vm03.stdout:3/124: link d5/d13/f25 d5/d1e/f26 0 2026-03-09T16:14:19.718 INFO:tasks.workunit.client.0.vm03.stdout:2/139: rmdir db/d1e/d30 0 2026-03-09T16:14:19.719 INFO:tasks.workunit.client.0.vm03.stdout:2/140: write f0 [2245032,80067] 0 2026-03-09T16:14:19.721 INFO:tasks.workunit.client.0.vm03.stdout:3/125: dwrite d5/f24 [0,4194304] 0 2026-03-09T16:14:19.730 INFO:tasks.workunit.client.0.vm03.stdout:6/109: dwrite d9/ff [0,4194304] 0 2026-03-09T16:14:19.734 INFO:tasks.workunit.client.0.vm03.stdout:6/110: dwrite d9/f13 [0,4194304] 0 2026-03-09T16:14:19.748 INFO:tasks.workunit.client.0.vm03.stdout:8/144: unlink da/db/f13 0 2026-03-09T16:14:19.755 INFO:tasks.workunit.client.0.vm03.stdout:8/145: write da/db/f1c [3391958,116807] 0 2026-03-09T16:14:19.755 INFO:tasks.workunit.client.0.vm03.stdout:8/146: dread da/db/fe [0,4194304] 0 2026-03-09T16:14:19.757 INFO:tasks.workunit.client.0.vm03.stdout:0/151: rename d0/da/f26 to d0/da/d11/f2e 0 2026-03-09T16:14:19.762 INFO:tasks.workunit.client.0.vm03.stdout:9/144: rmdir d2/d4/d11/d29 39 2026-03-09T16:14:19.765 INFO:tasks.workunit.client.0.vm03.stdout:9/145: chown d2/df/f22 527 1 2026-03-09T16:14:19.766 INFO:tasks.workunit.client.0.vm03.stdout:9/146: dread d2/d4/d11/d12/f1e [0,4194304] 0 2026-03-09T16:14:19.766 INFO:tasks.workunit.client.0.vm03.stdout:9/147: write d2/df/f22 [940217,123946] 0 2026-03-09T16:14:19.768 INFO:tasks.workunit.client.0.vm03.stdout:4/137: getdents d5 0 2026-03-09T16:14:19.768 INFO:tasks.workunit.client.0.vm03.stdout:4/138: dread - d5/db/f28 zero size 2026-03-09T16:14:19.768 INFO:tasks.workunit.client.0.vm03.stdout:4/139: readlink l0 0 2026-03-09T16:14:19.771 INFO:tasks.workunit.client.0.vm03.stdout:4/140: dread d5/dd/f1e [0,4194304] 0 2026-03-09T16:14:19.772 INFO:tasks.workunit.client.0.vm03.stdout:4/141: chown d5/db/f28 3 1 2026-03-09T16:14:19.777 INFO:tasks.workunit.client.0.vm03.stdout:2/141: write db/f23 [288979,89445] 0 2026-03-09T16:14:19.782 INFO:tasks.workunit.client.0.vm03.stdout:6/111: unlink d9/d14/f16 0 2026-03-09T16:14:19.782 INFO:tasks.workunit.client.0.vm03.stdout:5/173: mkdir d2/d7/d1a/d1c/d3f 0 2026-03-09T16:14:19.787 INFO:tasks.workunit.client.0.vm03.stdout:0/152: mkdir d0/da/d1b/d2f 0 2026-03-09T16:14:19.789 INFO:tasks.workunit.client.0.vm03.stdout:0/153: dread d0/f27 [0,4194304] 0 2026-03-09T16:14:19.792 INFO:tasks.workunit.client.0.vm03.stdout:4/142: rename d5/db/l20 to d5/l29 0 2026-03-09T16:14:19.798 INFO:tasks.workunit.client.0.vm03.stdout:9/148: dread d2/df/f14 
[0,4194304] 0 2026-03-09T16:14:19.801 INFO:tasks.workunit.client.0.vm03.stdout:9/149: dwrite d2/f8 [4194304,4194304] 0 2026-03-09T16:14:19.803 INFO:tasks.workunit.client.0.vm03.stdout:0/154: mknod d0/da/c30 0 2026-03-09T16:14:19.809 INFO:tasks.workunit.client.0.vm03.stdout:5/174: symlink d2/d7/de/d11/d19/d31/d35/d39/l40 0 2026-03-09T16:14:19.812 INFO:tasks.workunit.client.0.vm03.stdout:9/150: write d2/df/f14 [1951279,10917] 0 2026-03-09T16:14:19.814 INFO:tasks.workunit.client.0.vm03.stdout:2/142: creat db/f34 x:0 0 0 2026-03-09T16:14:19.814 INFO:tasks.workunit.client.0.vm03.stdout:2/143: fdatasync f0 0 2026-03-09T16:14:19.816 INFO:tasks.workunit.client.0.vm03.stdout:3/126: getdents d5/d13 0 2026-03-09T16:14:19.816 INFO:tasks.workunit.client.0.vm03.stdout:5/175: fdatasync d2/d7/de/f2f 0 2026-03-09T16:14:19.820 INFO:tasks.workunit.client.0.vm03.stdout:0/155: creat d0/da/d1b/d2f/f31 x:0 0 0 2026-03-09T16:14:19.821 INFO:tasks.workunit.client.0.vm03.stdout:0/156: truncate d0/d7/f8 203566 0 2026-03-09T16:14:19.825 INFO:tasks.workunit.client.0.vm03.stdout:3/127: mkdir d5/d27 0 2026-03-09T16:14:19.826 INFO:tasks.workunit.client.0.vm03.stdout:3/128: chown d5/l21 84772 1 2026-03-09T16:14:19.826 INFO:tasks.workunit.client.0.vm03.stdout:3/129: chown d5/fb 15486 1 2026-03-09T16:14:19.827 INFO:tasks.workunit.client.0.vm03.stdout:6/112: fsync d9/f13 0 2026-03-09T16:14:19.828 INFO:tasks.workunit.client.0.vm03.stdout:8/147: fdatasync da/db/f1c 0 2026-03-09T16:14:19.836 INFO:tasks.workunit.client.0.vm03.stdout:9/151: link d2/df/f22 d2/d4/d11/d12/d28/f2f 0 2026-03-09T16:14:19.836 INFO:tasks.workunit.client.0.vm03.stdout:9/152: dread d2/f7 [4194304,4194304] 0 2026-03-09T16:14:19.837 INFO:tasks.workunit.client.0.vm03.stdout:9/153: chown d2/df/f10 35360710 1 2026-03-09T16:14:19.838 INFO:tasks.workunit.client.0.vm03.stdout:0/157: mkdir d0/da/d11/d32 0 2026-03-09T16:14:19.839 INFO:tasks.workunit.client.0.vm03.stdout:0/158: stat d0/f27 0 2026-03-09T16:14:19.847 INFO:tasks.workunit.client.0.vm03.stdout:8/148: chown da/d10/f14 3 1 2026-03-09T16:14:19.848 INFO:tasks.workunit.client.0.vm03.stdout:8/149: dread da/d15/f1b [0,4194304] 0 2026-03-09T16:14:19.851 INFO:tasks.workunit.client.0.vm03.stdout:9/154: mknod d2/d4/c30 0 2026-03-09T16:14:19.853 INFO:tasks.workunit.client.0.vm03.stdout:2/144: mknod db/d12/d2a/c35 0 2026-03-09T16:14:19.854 INFO:tasks.workunit.client.0.vm03.stdout:2/145: write db/d12/f21 [232197,4275] 0 2026-03-09T16:14:19.854 INFO:tasks.workunit.client.0.vm03.stdout:2/146: read f9 [520864,31756] 0 2026-03-09T16:14:19.858 INFO:tasks.workunit.client.0.vm03.stdout:9/155: creat d2/d4/f31 x:0 0 0 2026-03-09T16:14:19.859 INFO:tasks.workunit.client.0.vm03.stdout:9/156: dread d2/d4/f17 [0,4194304] 0 2026-03-09T16:14:19.860 INFO:tasks.workunit.client.0.vm03.stdout:7/88: write d4/da/d18/f1b [2537588,36180] 0 2026-03-09T16:14:19.864 INFO:tasks.workunit.client.0.vm03.stdout:0/159: link d0/f3 d0/da/d1b/d2f/f33 0 2026-03-09T16:14:19.870 INFO:tasks.workunit.client.0.vm03.stdout:0/160: chown d0/d7/d2a 1336507732 1 2026-03-09T16:14:19.870 INFO:tasks.workunit.client.0.vm03.stdout:7/89: dwrite d4/da/d18/f1b [0,4194304] 0 2026-03-09T16:14:19.870 INFO:tasks.workunit.client.0.vm03.stdout:7/90: dwrite d4/da/d18/f1b [0,4194304] 0 2026-03-09T16:14:19.878 INFO:tasks.workunit.client.0.vm03.stdout:4/143: truncate d5/dd/f1e 411839 0 2026-03-09T16:14:19.878 INFO:tasks.workunit.client.0.vm03.stdout:4/144: stat d5/cc 0 2026-03-09T16:14:19.880 INFO:tasks.workunit.client.0.vm03.stdout:8/150: creat da/d10/d28/f29 x:0 0 0 
2026-03-09T16:14:19.884 INFO:tasks.workunit.client.0.vm03.stdout:0/161: chown d0/da/d11/f18 2 1 2026-03-09T16:14:19.886 INFO:tasks.workunit.client.0.vm03.stdout:4/145: mknod d5/db/c2a 0 2026-03-09T16:14:19.891 INFO:tasks.workunit.client.0.vm03.stdout:6/113: getdents d9/d17 0 2026-03-09T16:14:19.891 INFO:tasks.workunit.client.0.vm03.stdout:6/114: read d9/f13 [921371,74348] 0 2026-03-09T16:14:19.893 INFO:tasks.workunit.client.0.vm03.stdout:7/91: symlink d4/da/d12/d16/l1d 0 2026-03-09T16:14:19.894 INFO:tasks.workunit.client.0.vm03.stdout:8/151: dread da/db/fe [0,4194304] 0 2026-03-09T16:14:19.895 INFO:tasks.workunit.client.0.vm03.stdout:0/162: unlink d0/f27 0 2026-03-09T16:14:19.896 INFO:tasks.workunit.client.0.vm03.stdout:6/115: symlink d9/l1a 0 2026-03-09T16:14:19.897 INFO:tasks.workunit.client.0.vm03.stdout:7/92: mknod d4/da/d12/c1e 0 2026-03-09T16:14:19.898 INFO:tasks.workunit.client.0.vm03.stdout:7/93: dread - d4/fe zero size 2026-03-09T16:14:19.899 INFO:tasks.workunit.client.0.vm03.stdout:0/163: fdatasync d0/da/d11/f2e 0 2026-03-09T16:14:19.908 INFO:tasks.workunit.client.0.vm03.stdout:8/152: symlink da/db/l2a 0 2026-03-09T16:14:19.908 INFO:tasks.workunit.client.0.vm03.stdout:6/116: rename d9/d14/c18 to d9/d14/c1b 0 2026-03-09T16:14:19.908 INFO:tasks.workunit.client.0.vm03.stdout:8/153: rmdir da/d10 39 2026-03-09T16:14:19.908 INFO:tasks.workunit.client.0.vm03.stdout:7/94: symlink d4/da/d12/d15/l1f 0 2026-03-09T16:14:19.908 INFO:tasks.workunit.client.0.vm03.stdout:4/146: getdents d5/dd/d1f 0 2026-03-09T16:14:19.908 INFO:tasks.workunit.client.0.vm03.stdout:4/147: chown d5/dd/d1f/l27 32045 1 2026-03-09T16:14:19.908 INFO:tasks.workunit.client.0.vm03.stdout:4/148: fdatasync d5/d17/f21 0 2026-03-09T16:14:19.908 INFO:tasks.workunit.client.0.vm03.stdout:6/117: creat d9/d17/f1c x:0 0 0 2026-03-09T16:14:19.908 INFO:tasks.workunit.client.0.vm03.stdout:8/154: unlink f6 0 2026-03-09T16:14:19.909 INFO:tasks.workunit.client.0.vm03.stdout:7/95: creat d4/da/f20 x:0 0 0 2026-03-09T16:14:19.910 INFO:tasks.workunit.client.0.vm03.stdout:5/176: sync 2026-03-09T16:14:19.910 INFO:tasks.workunit.client.0.vm03.stdout:9/157: sync 2026-03-09T16:14:19.910 INFO:tasks.workunit.client.0.vm03.stdout:7/96: write d4/f8 [2791100,70021] 0 2026-03-09T16:14:19.911 INFO:tasks.workunit.client.0.vm03.stdout:9/158: fsync d2/de/f1c 0 2026-03-09T16:14:19.912 INFO:tasks.workunit.client.0.vm03.stdout:4/149: rename d5/f10 to d5/d17/f2b 0 2026-03-09T16:14:19.914 INFO:tasks.workunit.client.0.vm03.stdout:9/159: dread d2/f7 [0,4194304] 0 2026-03-09T16:14:19.918 INFO:tasks.workunit.client.0.vm03.stdout:7/97: unlink d4/c1c 0 2026-03-09T16:14:19.921 INFO:tasks.workunit.client.0.vm03.stdout:6/118: link d9/f15 d9/d14/f1d 0 2026-03-09T16:14:19.922 INFO:tasks.workunit.client.0.vm03.stdout:6/119: chown d9/d17/f1c 130939764 1 2026-03-09T16:14:19.925 INFO:tasks.workunit.client.0.vm03.stdout:0/164: dread d0/f3 [0,4194304] 0 2026-03-09T16:14:19.926 INFO:tasks.workunit.client.0.vm03.stdout:0/165: fdatasync d0/f28 0 2026-03-09T16:14:19.926 INFO:tasks.workunit.client.0.vm03.stdout:6/120: dwrite d9/d17/f1c [0,4194304] 0 2026-03-09T16:14:19.929 INFO:tasks.workunit.client.0.vm03.stdout:6/121: write d9/f13 [474666,20045] 0 2026-03-09T16:14:19.938 INFO:tasks.workunit.client.0.vm03.stdout:8/155: symlink da/d10/l2b 0 2026-03-09T16:14:19.945 INFO:tasks.workunit.client.0.vm03.stdout:5/177: truncate d2/d7/d8/f36 112159 0 2026-03-09T16:14:19.958 INFO:tasks.workunit.client.0.vm03.stdout:7/98: symlink d4/da/l21 0 2026-03-09T16:14:19.958 
INFO:tasks.workunit.client.0.vm03.stdout:5/178: dwrite d2/d7/de/d11/f26 [4194304,4194304] 0 2026-03-09T16:14:19.958 INFO:tasks.workunit.client.0.vm03.stdout:5/179: chown d2/d7/de/d11/c1b 222679 1 2026-03-09T16:14:19.959 INFO:tasks.workunit.client.0.vm03.stdout:1/109: truncate d4/f1f 771686 0 2026-03-09T16:14:19.959 INFO:tasks.workunit.client.0.vm03.stdout:3/130: write d5/d13/f20 [1028727,68380] 0 2026-03-09T16:14:19.960 INFO:tasks.workunit.client.0.vm03.stdout:3/131: chown d5/d13/c1c 244680769 1 2026-03-09T16:14:19.962 INFO:tasks.workunit.client.0.vm03.stdout:1/110: dread d4/fd [4194304,4194304] 0 2026-03-09T16:14:19.969 INFO:tasks.workunit.client.0.vm03.stdout:8/156: sync 2026-03-09T16:14:19.971 INFO:tasks.workunit.client.0.vm03.stdout:3/132: dread d5/f6 [0,4194304] 0 2026-03-09T16:14:19.975 INFO:tasks.workunit.client.0.vm03.stdout:2/147: truncate fa 3998346 0 2026-03-09T16:14:19.975 INFO:tasks.workunit.client.0.vm03.stdout:8/157: dwrite da/db/f1c [0,4194304] 0 2026-03-09T16:14:19.979 INFO:tasks.workunit.client.0.vm03.stdout:8/158: dread f8 [0,4194304] 0 2026-03-09T16:14:19.986 INFO:tasks.workunit.client.0.vm03.stdout:8/159: dread da/db/fe [0,4194304] 0 2026-03-09T16:14:19.987 INFO:tasks.workunit.client.0.vm03.stdout:8/160: fsync da/d10/d28/f29 0 2026-03-09T16:14:19.996 INFO:tasks.workunit.client.0.vm03.stdout:0/166: mknod d0/d7/c34 0 2026-03-09T16:14:20.000 INFO:tasks.workunit.client.0.vm03.stdout:7/99: mkdir d4/da/d18/d22 0 2026-03-09T16:14:20.010 INFO:tasks.workunit.client.0.vm03.stdout:2/148: mknod db/d1e/c36 0 2026-03-09T16:14:20.011 INFO:tasks.workunit.client.0.vm03.stdout:8/161: creat da/d10/d28/f2c x:0 0 0 2026-03-09T16:14:20.013 INFO:tasks.workunit.client.0.vm03.stdout:4/150: write d5/f8 [1700914,45601] 0 2026-03-09T16:14:20.014 INFO:tasks.workunit.client.0.vm03.stdout:9/160: link d2/df/l16 d2/df/l32 0 2026-03-09T16:14:20.014 INFO:tasks.workunit.client.0.vm03.stdout:6/122: truncate d9/f15 3956941 0 2026-03-09T16:14:20.016 INFO:tasks.workunit.client.0.vm03.stdout:7/100: mknod d4/da/c23 0 2026-03-09T16:14:20.016 INFO:tasks.workunit.client.0.vm03.stdout:7/101: fdatasync d4/fe 0 2026-03-09T16:14:20.017 INFO:tasks.workunit.client.0.vm03.stdout:7/102: chown d4/da/c23 38532 1 2026-03-09T16:14:20.035 INFO:tasks.workunit.client.0.vm03.stdout:2/149: rename db/d12/f22 to db/d12/f37 0 2026-03-09T16:14:20.043 INFO:tasks.workunit.client.0.vm03.stdout:4/151: symlink d5/dd/d1f/l2c 0 2026-03-09T16:14:20.047 INFO:tasks.workunit.client.0.vm03.stdout:5/180: dwrite d2/d7/de/f2f [0,4194304] 0 2026-03-09T16:14:20.048 INFO:tasks.workunit.client.0.vm03.stdout:9/161: creat d2/f33 x:0 0 0 2026-03-09T16:14:20.048 INFO:tasks.workunit.client.0.vm03.stdout:5/181: stat d2/d7/de/d11/d38 0 2026-03-09T16:14:20.049 INFO:tasks.workunit.client.0.vm03.stdout:9/162: chown d2/d4/c20 0 1 2026-03-09T16:14:20.055 INFO:tasks.workunit.client.0.vm03.stdout:3/133: dwrite d5/d13/f25 [0,4194304] 0 2026-03-09T16:14:20.056 INFO:tasks.workunit.client.0.vm03.stdout:3/134: write d5/d13/f25 [2390043,122093] 0 2026-03-09T16:14:20.064 INFO:tasks.workunit.client.0.vm03.stdout:7/103: rename d4/da/d12 to d4/da/d18/d22/d24 0 2026-03-09T16:14:20.065 INFO:tasks.workunit.client.0.vm03.stdout:7/104: readlink d4/da/d18/d22/d24/d16/l1d 0 2026-03-09T16:14:20.067 INFO:tasks.workunit.client.0.vm03.stdout:7/105: dread d4/dc/f1a [0,4194304] 0 2026-03-09T16:14:20.070 INFO:tasks.workunit.client.0.vm03.stdout:6/123: rmdir d9 39 2026-03-09T16:14:20.074 INFO:tasks.workunit.client.0.vm03.stdout:2/150: chown db/fd 2128 1 2026-03-09T16:14:20.074 
INFO:tasks.workunit.client.0.vm03.stdout:2/151: fdatasync f0 0 2026-03-09T16:14:20.076 INFO:tasks.workunit.client.0.vm03.stdout:1/111: truncate d4/d6/f9 2320371 0 2026-03-09T16:14:20.076 INFO:tasks.workunit.client.0.vm03.stdout:1/112: readlink d4/db/l1a 0 2026-03-09T16:14:20.083 INFO:tasks.workunit.client.0.vm03.stdout:8/162: write da/db/fe [3499144,82605] 0 2026-03-09T16:14:20.091 INFO:tasks.workunit.client.0.vm03.stdout:7/106: mknod d4/da/c25 0 2026-03-09T16:14:20.095 INFO:tasks.workunit.client.0.vm03.stdout:2/152: creat db/d12/d2a/f38 x:0 0 0 2026-03-09T16:14:20.102 INFO:tasks.workunit.client.0.vm03.stdout:0/167: truncate d0/da/d1b/fd 1981621 0 2026-03-09T16:14:20.105 INFO:tasks.workunit.client.0.vm03.stdout:0/168: dwrite d0/d7/f8 [0,4194304] 0 2026-03-09T16:14:20.108 INFO:tasks.workunit.client.0.vm03.stdout:0/169: write d0/da/f2c [87412,6726] 0 2026-03-09T16:14:20.108 INFO:tasks.workunit.client.0.vm03.stdout:3/135: symlink d5/d27/l28 0 2026-03-09T16:14:20.109 INFO:tasks.workunit.client.0.vm03.stdout:0/170: write d0/f28 [345651,99190] 0 2026-03-09T16:14:20.109 INFO:tasks.workunit.client.0.vm03.stdout:0/171: chown d0/da/d11/f13 451 1 2026-03-09T16:14:20.119 INFO:tasks.workunit.client.0.vm03.stdout:2/153: rename db/d1e/f29 to db/d12/f39 0 2026-03-09T16:14:20.123 INFO:tasks.workunit.client.0.vm03.stdout:0/172: fdatasync d0/f3 0 2026-03-09T16:14:20.124 INFO:tasks.workunit.client.0.vm03.stdout:6/124: truncate d9/f15 1778316 0 2026-03-09T16:14:20.126 INFO:tasks.workunit.client.0.vm03.stdout:9/163: getdents d2/df 0 2026-03-09T16:14:20.127 INFO:tasks.workunit.client.0.vm03.stdout:9/164: truncate d2/d4/d11/d12/d28/f2c 166271 0 2026-03-09T16:14:20.128 INFO:tasks.workunit.client.0.vm03.stdout:0/173: unlink d0/f28 0 2026-03-09T16:14:20.129 INFO:tasks.workunit.client.0.vm03.stdout:2/154: symlink db/l3a 0 2026-03-09T16:14:20.130 INFO:tasks.workunit.client.0.vm03.stdout:2/155: dread - db/f31 zero size 2026-03-09T16:14:20.131 INFO:tasks.workunit.client.0.vm03.stdout:9/165: chown d2/df/l16 376 1 2026-03-09T16:14:20.133 INFO:tasks.workunit.client.0.vm03.stdout:9/166: unlink d2/d4/f31 0 2026-03-09T16:14:20.134 INFO:tasks.workunit.client.0.vm03.stdout:9/167: readlink d2/d4/d11/d12/l1b 0 2026-03-09T16:14:20.134 INFO:tasks.workunit.client.0.vm03.stdout:9/168: chown d2/d4 63 1 2026-03-09T16:14:20.135 INFO:tasks.workunit.client.0.vm03.stdout:0/174: mkdir d0/da/d11/d32/d35 0 2026-03-09T16:14:20.137 INFO:tasks.workunit.client.0.vm03.stdout:2/156: mkdir db/d3b 0 2026-03-09T16:14:20.138 INFO:tasks.workunit.client.0.vm03.stdout:9/169: creat d2/df/f34 x:0 0 0 2026-03-09T16:14:20.141 INFO:tasks.workunit.client.0.vm03.stdout:9/170: dwrite d2/df/f34 [0,4194304] 0 2026-03-09T16:14:20.144 INFO:tasks.workunit.client.0.vm03.stdout:6/125: getdents d9/d17 0 2026-03-09T16:14:20.144 INFO:tasks.workunit.client.0.vm03.stdout:9/171: stat d2/df/f14 0 2026-03-09T16:14:20.145 INFO:tasks.workunit.client.0.vm03.stdout:6/126: fdatasync d9/f13 0 2026-03-09T16:14:20.145 INFO:tasks.workunit.client.0.vm03.stdout:9/172: chown d2/d4/d1f/c27 1 1 2026-03-09T16:14:20.147 INFO:tasks.workunit.client.0.vm03.stdout:9/173: truncate d2/d4/d11/d12/d28/f2c 176056 0 2026-03-09T16:14:20.151 INFO:tasks.workunit.client.0.vm03.stdout:3/136: sync 2026-03-09T16:14:20.153 INFO:tasks.workunit.client.0.vm03.stdout:0/175: sync 2026-03-09T16:14:20.154 INFO:tasks.workunit.client.0.vm03.stdout:9/174: sync 2026-03-09T16:14:20.154 INFO:tasks.workunit.client.0.vm03.stdout:9/175: fsync d2/df/f10 0 2026-03-09T16:14:20.157 INFO:tasks.workunit.client.0.vm03.stdout:0/176: 
dread d0/d7/f8 [0,4194304] 0 2026-03-09T16:14:20.160 INFO:tasks.workunit.client.0.vm03.stdout:2/157: symlink db/l3c 0 2026-03-09T16:14:20.160 INFO:tasks.workunit.client.0.vm03.stdout:9/176: dwrite d2/d4/d1f/f25 [4194304,4194304] 0 2026-03-09T16:14:20.168 INFO:tasks.workunit.client.0.vm03.stdout:6/127: creat d9/f1e x:0 0 0 2026-03-09T16:14:20.170 INFO:tasks.workunit.client.0.vm03.stdout:3/137: unlink d5/f6 0 2026-03-09T16:14:20.170 INFO:tasks.workunit.client.0.vm03.stdout:3/138: write d5/d13/f20 [108026,101392] 0 2026-03-09T16:14:20.173 INFO:tasks.workunit.client.0.vm03.stdout:3/139: dread d5/d13/f25 [0,4194304] 0 2026-03-09T16:14:20.178 INFO:tasks.workunit.client.0.vm03.stdout:9/177: creat d2/d4/d11/d12/f35 x:0 0 0 2026-03-09T16:14:20.179 INFO:tasks.workunit.client.0.vm03.stdout:2/158: mknod db/d12/c3d 0 2026-03-09T16:14:20.190 INFO:tasks.workunit.client.0.vm03.stdout:3/140: creat d5/d13/f29 x:0 0 0 2026-03-09T16:14:20.191 INFO:tasks.workunit.client.0.vm03.stdout:3/141: chown d5/d13/f14 672987 1 2026-03-09T16:14:20.191 INFO:tasks.workunit.client.0.vm03.stdout:4/152: dwrite d5/dd/f22 [0,4194304] 0 2026-03-09T16:14:20.191 INFO:tasks.workunit.client.0.vm03.stdout:3/142: write d5/f24 [2026388,57402] 0 2026-03-09T16:14:20.198 INFO:tasks.workunit.client.0.vm03.stdout:9/178: rename d2/d4/d1f/c27 to d2/de/c36 0 2026-03-09T16:14:20.199 INFO:tasks.workunit.client.0.vm03.stdout:9/179: truncate d2/d4/d11/d12/f1e 1693298 0 2026-03-09T16:14:20.200 INFO:tasks.workunit.client.0.vm03.stdout:9/180: write d2/df/f10 [873516,95900] 0 2026-03-09T16:14:20.202 INFO:tasks.workunit.client.0.vm03.stdout:2/159: rmdir db/d12 39 2026-03-09T16:14:20.202 INFO:tasks.workunit.client.0.vm03.stdout:2/160: write db/f31 [1043796,21520] 0 2026-03-09T16:14:20.203 INFO:tasks.workunit.client.0.vm03.stdout:3/143: sync 2026-03-09T16:14:20.228 INFO:tasks.workunit.client.0.vm03.stdout:9/181: mknod d2/de/c37 0 2026-03-09T16:14:20.232 INFO:tasks.workunit.client.0.vm03.stdout:9/182: dread d2/d4/d11/f13 [0,4194304] 0 2026-03-09T16:14:20.236 INFO:tasks.workunit.client.0.vm03.stdout:4/153: link d5/fa d5/dd/d1f/f2d 0 2026-03-09T16:14:20.237 INFO:tasks.workunit.client.0.vm03.stdout:4/154: dread - d5/db/f28 zero size 2026-03-09T16:14:20.241 INFO:tasks.workunit.client.0.vm03.stdout:2/161: symlink db/d3b/l3e 0 2026-03-09T16:14:20.247 INFO:tasks.workunit.client.0.vm03.stdout:5/182: truncate d2/d7/de/f2f 792891 0 2026-03-09T16:14:20.250 INFO:tasks.workunit.client.0.vm03.stdout:1/113: dwrite d4/fd [4194304,4194304] 0 2026-03-09T16:14:20.257 INFO:tasks.workunit.client.0.vm03.stdout:2/162: dwrite f8 [0,4194304] 0 2026-03-09T16:14:20.266 INFO:tasks.workunit.client.0.vm03.stdout:0/177: read d0/d7/f19 [583176,88633] 0 2026-03-09T16:14:20.272 INFO:tasks.workunit.client.0.vm03.stdout:5/183: mknod d2/c41 0 2026-03-09T16:14:20.278 INFO:tasks.workunit.client.0.vm03.stdout:7/107: truncate d4/dc/f1a 540059 0 2026-03-09T16:14:20.279 INFO:tasks.workunit.client.0.vm03.stdout:7/108: chown d4/c14 12726429 1 2026-03-09T16:14:20.279 INFO:tasks.workunit.client.0.vm03.stdout:5/184: dwrite d2/d7/de/d11/f32 [0,4194304] 0 2026-03-09T16:14:20.281 INFO:tasks.workunit.client.0.vm03.stdout:4/155: mkdir d5/dd/d1f/d24/d2e 0 2026-03-09T16:14:20.292 INFO:tasks.workunit.client.0.vm03.stdout:2/163: creat db/d3b/f3f x:0 0 0 2026-03-09T16:14:20.295 INFO:tasks.workunit.client.0.vm03.stdout:0/178: symlink d0/da/d1b/l36 0 2026-03-09T16:14:20.297 INFO:tasks.workunit.client.0.vm03.stdout:8/163: truncate da/db/fe 992456 0 2026-03-09T16:14:20.301 
INFO:tasks.workunit.client.0.vm03.stdout:9/183: mkdir d2/d4/d11/d29/d2a/d38 0 2026-03-09T16:14:20.308 INFO:tasks.workunit.client.0.vm03.stdout:5/185: dread d2/d7/d8/f36 [0,4194304] 0 2026-03-09T16:14:20.317 INFO:tasks.workunit.client.0.vm03.stdout:1/114: symlink d4/d6/l27 0 2026-03-09T16:14:20.319 INFO:tasks.workunit.client.0.vm03.stdout:5/186: fsync d2/d7/de/d11/f26 0 2026-03-09T16:14:20.322 INFO:tasks.workunit.client.0.vm03.stdout:0/179: rename d0/da/d1b/d2f to d0/da/d11/d32/d37 0 2026-03-09T16:14:20.324 INFO:tasks.workunit.client.0.vm03.stdout:0/180: fsync d0/da/d11/d32/d37/f33 0 2026-03-09T16:14:20.324 INFO:tasks.workunit.client.0.vm03.stdout:0/181: chown d0/d7/f8 389134800 1 2026-03-09T16:14:20.325 INFO:tasks.workunit.client.0.vm03.stdout:8/164: mknod da/d10/d28/c2d 0 2026-03-09T16:14:20.326 INFO:tasks.workunit.client.0.vm03.stdout:8/165: chown da/d15/c17 3 1 2026-03-09T16:14:20.330 INFO:tasks.workunit.client.0.vm03.stdout:8/166: chown da/c16 22977398 1 2026-03-09T16:14:20.330 INFO:tasks.workunit.client.0.vm03.stdout:8/167: readlink da/d10/l2b 0 2026-03-09T16:14:20.331 INFO:tasks.workunit.client.0.vm03.stdout:9/184: symlink d2/df/l39 0 2026-03-09T16:14:20.333 INFO:tasks.workunit.client.0.vm03.stdout:6/128: dread d9/d14/f1d [0,4194304] 0 2026-03-09T16:14:20.350 INFO:tasks.workunit.client.0.vm03.stdout:3/144: getdents d5 0 2026-03-09T16:14:20.353 INFO:tasks.workunit.client.0.vm03.stdout:3/145: dread d5/fb [4194304,4194304] 0 2026-03-09T16:14:20.527 INFO:tasks.workunit.client.0.vm03.stdout:2/164: creat db/d12/d2a/f40 x:0 0 0 2026-03-09T16:14:20.527 INFO:tasks.workunit.client.0.vm03.stdout:2/165: stat db/d12/c1b 0 2026-03-09T16:14:20.541 INFO:tasks.workunit.client.0.vm03.stdout:5/187: creat d2/d7/de/d11/d19/d31/f42 x:0 0 0 2026-03-09T16:14:20.546 INFO:tasks.workunit.client.0.vm03.stdout:9/185: symlink d2/d4/d11/d29/d2a/l3a 0 2026-03-09T16:14:20.550 INFO:tasks.workunit.client.0.vm03.stdout:7/109: creat d4/f26 x:0 0 0 2026-03-09T16:14:20.552 INFO:tasks.workunit.client.0.vm03.stdout:6/129: unlink d9/l1a 0 2026-03-09T16:14:20.553 INFO:tasks.workunit.client.0.vm03.stdout:3/146: unlink d5/l19 0 2026-03-09T16:14:20.564 INFO:tasks.workunit.client.0.vm03.stdout:1/115: creat d4/d6/d1d/d20/d23/f28 x:0 0 0 2026-03-09T16:14:20.568 INFO:tasks.workunit.client.0.vm03.stdout:5/188: mkdir d2/d7/d8/d24/d27/d43 0 2026-03-09T16:14:20.568 INFO:tasks.workunit.client.0.vm03.stdout:5/189: dwrite d2/d7/de/d11/d19/d31/f42 [0,4194304] 0 2026-03-09T16:14:20.582 INFO:tasks.workunit.client.0.vm03.stdout:0/182: creat d0/da/d11/d32/d35/f38 x:0 0 0 2026-03-09T16:14:20.583 INFO:tasks.workunit.client.0.vm03.stdout:0/183: write d0/da/d11/d32/d35/f38 [828994,117616] 0 2026-03-09T16:14:20.598 INFO:tasks.workunit.client.0.vm03.stdout:7/110: symlink d4/da/d18/d22/d24/d15/l27 0 2026-03-09T16:14:20.600 INFO:tasks.workunit.client.0.vm03.stdout:6/130: creat d9/f1f x:0 0 0 2026-03-09T16:14:20.606 INFO:tasks.workunit.client.0.vm03.stdout:2/166: fdatasync f7 0 2026-03-09T16:14:20.606 INFO:tasks.workunit.client.0.vm03.stdout:2/167: stat db/l3a 0 2026-03-09T16:14:20.613 INFO:tasks.workunit.client.0.vm03.stdout:5/190: creat d2/d7/de/d11/f44 x:0 0 0 2026-03-09T16:14:20.618 INFO:tasks.workunit.client.0.vm03.stdout:0/184: rmdir d0/d7/d2a 39 2026-03-09T16:14:20.618 INFO:tasks.workunit.client.0.vm03.stdout:0/185: fsync d0/da/ff 0 2026-03-09T16:14:20.620 INFO:tasks.workunit.client.0.vm03.stdout:4/156: truncate f1 694464 0 2026-03-09T16:14:20.621 INFO:tasks.workunit.client.0.vm03.stdout:8/168: link da/d15/l27 da/d15/l2e 0 2026-03-09T16:14:20.626 
INFO:tasks.workunit.client.0.vm03.stdout:6/131: chown d9/d14/c1b 1 1 2026-03-09T16:14:20.626 INFO:tasks.workunit.client.0.vm03.stdout:6/132: chown d9/d17 251118870 1 2026-03-09T16:14:20.628 INFO:tasks.workunit.client.0.vm03.stdout:6/133: chown d9/f1e 1 1 2026-03-09T16:14:20.631 INFO:tasks.workunit.client.0.vm03.stdout:6/134: dwrite f7 [0,4194304] 0 2026-03-09T16:14:20.636 INFO:tasks.workunit.client.0.vm03.stdout:6/135: dwrite d9/d17/f1c [0,4194304] 0 2026-03-09T16:14:20.638 INFO:tasks.workunit.client.0.vm03.stdout:2/168: creat db/d12/d2a/f41 x:0 0 0 2026-03-09T16:14:20.646 INFO:tasks.workunit.client.0.vm03.stdout:1/116: mknod d4/c29 0 2026-03-09T16:14:20.651 INFO:tasks.workunit.client.0.vm03.stdout:9/186: dwrite d2/f15 [4194304,4194304] 0 2026-03-09T16:14:20.652 INFO:tasks.workunit.client.0.vm03.stdout:9/187: chown d2/de/f1c 145712 1 2026-03-09T16:14:20.658 INFO:tasks.workunit.client.0.vm03.stdout:8/169: creat da/d15/f2f x:0 0 0 2026-03-09T16:14:20.659 INFO:tasks.workunit.client.0.vm03.stdout:8/170: write da/d10/d28/f29 [1008659,120365] 0 2026-03-09T16:14:20.666 INFO:tasks.workunit.client.0.vm03.stdout:7/111: creat d4/da/d19/f28 x:0 0 0 2026-03-09T16:14:20.688 INFO:tasks.workunit.client.0.vm03.stdout:7/112: write d4/da/d19/f28 [298896,40490] 0 2026-03-09T16:14:20.688 INFO:tasks.workunit.client.0.vm03.stdout:7/113: write d4/da/d18/f1b [5154703,9895] 0 2026-03-09T16:14:20.688 INFO:tasks.workunit.client.0.vm03.stdout:6/136: creat d9/f20 x:0 0 0 2026-03-09T16:14:20.688 INFO:tasks.workunit.client.0.vm03.stdout:2/169: mkdir db/d12/d42 0 2026-03-09T16:14:20.688 INFO:tasks.workunit.client.0.vm03.stdout:1/117: readlink d4/d6/l10 0 2026-03-09T16:14:20.688 INFO:tasks.workunit.client.0.vm03.stdout:9/188: write d2/d4/d11/d12/f1e [2531602,75365] 0 2026-03-09T16:14:20.688 INFO:tasks.workunit.client.0.vm03.stdout:9/189: chown d2/d4/fd 0 1 2026-03-09T16:14:20.688 INFO:tasks.workunit.client.0.vm03.stdout:9/190: chown d2/df/f34 7316 1 2026-03-09T16:14:20.688 INFO:tasks.workunit.client.0.vm03.stdout:7/114: creat d4/da/d18/f29 x:0 0 0 2026-03-09T16:14:20.688 INFO:tasks.workunit.client.0.vm03.stdout:3/147: getdents d5/d27 0 2026-03-09T16:14:20.689 INFO:tasks.workunit.client.0.vm03.stdout:3/148: dread d5/f24 [0,4194304] 0 2026-03-09T16:14:20.692 INFO:tasks.workunit.client.0.vm03.stdout:2/170: symlink db/d1e/l43 0 2026-03-09T16:14:20.693 INFO:tasks.workunit.client.0.vm03.stdout:1/118: creat d4/d6/d1d/d20/f2a x:0 0 0 2026-03-09T16:14:20.702 INFO:tasks.workunit.client.0.vm03.stdout:9/191: write d2/d4/f17 [198952,3454] 0 2026-03-09T16:14:20.706 INFO:tasks.workunit.client.0.vm03.stdout:9/192: dwrite d2/d4/d11/d12/d28/f2c [0,4194304] 0 2026-03-09T16:14:20.710 INFO:tasks.workunit.client.0.vm03.stdout:9/193: write d2/f8 [8306367,97297] 0 2026-03-09T16:14:20.710 INFO:tasks.workunit.client.0.vm03.stdout:0/186: link d0/da/d1b/fd d0/da/d11/d32/d37/f39 0 2026-03-09T16:14:20.710 INFO:tasks.workunit.client.0.vm03.stdout:4/157: rmdir d5/dd/d1f/d24/d2e 0 2026-03-09T16:14:20.710 INFO:tasks.workunit.client.0.vm03.stdout:8/171: mkdir da/db/d30 0 2026-03-09T16:14:20.711 INFO:tasks.workunit.client.0.vm03.stdout:5/191: sync 2026-03-09T16:14:20.711 INFO:tasks.workunit.client.0.vm03.stdout:2/171: sync 2026-03-09T16:14:20.715 INFO:tasks.workunit.client.0.vm03.stdout:2/172: dwrite f0 [0,4194304] 0 2026-03-09T16:14:20.728 INFO:tasks.workunit.client.0.vm03.stdout:6/137: fsync f7 0 2026-03-09T16:14:20.743 INFO:tasks.workunit.client.0.vm03.stdout:4/158: creat d5/db/f2f x:0 0 0 2026-03-09T16:14:20.755 
INFO:tasks.workunit.client.0.vm03.stdout:6/138: creat d9/f21 x:0 0 0 2026-03-09T16:14:20.762 INFO:tasks.workunit.client.0.vm03.stdout:0/187: symlink d0/d7/l3a 0 2026-03-09T16:14:20.770 INFO:tasks.workunit.client.0.vm03.stdout:4/159: mknod d5/d17/c30 0 2026-03-09T16:14:20.780 INFO:tasks.workunit.client.0.vm03.stdout:8/172: symlink da/db/l31 0 2026-03-09T16:14:20.781 INFO:tasks.workunit.client.0.vm03.stdout:3/149: write d5/d13/f25 [3650857,71419] 0 2026-03-09T16:14:20.781 INFO:tasks.workunit.client.0.vm03.stdout:3/150: fsync d5/d13/f1d 0 2026-03-09T16:14:20.783 INFO:tasks.workunit.client.0.vm03.stdout:5/192: symlink d2/d7/d1a/d1c/d3f/l45 0 2026-03-09T16:14:20.785 INFO:tasks.workunit.client.0.vm03.stdout:3/151: fsync d5/d13/f25 0 2026-03-09T16:14:20.786 INFO:tasks.workunit.client.0.vm03.stdout:3/152: chown d5/d13/f20 0 1 2026-03-09T16:14:20.787 INFO:tasks.workunit.client.0.vm03.stdout:3/153: truncate d5/d13/f1d 823866 0 2026-03-09T16:14:20.787 INFO:tasks.workunit.client.0.vm03.stdout:5/193: dwrite d2/d7/de/d11/f44 [0,4194304] 0 2026-03-09T16:14:20.800 INFO:tasks.workunit.client.0.vm03.stdout:6/139: rename d9/d17 to d9/d22 0 2026-03-09T16:14:20.808 INFO:tasks.workunit.client.0.vm03.stdout:6/140: stat d9/cc 0 2026-03-09T16:14:20.809 INFO:tasks.workunit.client.0.vm03.stdout:0/188: dread d0/da/d1b/fd [0,4194304] 0 2026-03-09T16:14:20.809 INFO:tasks.workunit.client.0.vm03.stdout:0/189: write d0/da/f1c [833003,44382] 0 2026-03-09T16:14:20.810 INFO:tasks.workunit.client.0.vm03.stdout:4/160: mkdir d5/db/d25/d31 0 2026-03-09T16:14:20.819 INFO:tasks.workunit.client.0.vm03.stdout:7/115: write d4/dc/f1a [1564719,38128] 0 2026-03-09T16:14:20.819 INFO:tasks.workunit.client.0.vm03.stdout:7/116: chown d4/c13 650 1 2026-03-09T16:14:20.826 INFO:tasks.workunit.client.0.vm03.stdout:1/119: truncate d4/f1b 339199 0 2026-03-09T16:14:20.826 INFO:tasks.workunit.client.0.vm03.stdout:9/194: truncate d2/d4/f17 1324098 0 2026-03-09T16:14:20.826 INFO:tasks.workunit.client.0.vm03.stdout:1/120: write d4/fd [5972069,29189] 0 2026-03-09T16:14:20.827 INFO:tasks.workunit.client.0.vm03.stdout:1/121: read - d4/d6/d1d/d20/d23/f28 zero size 2026-03-09T16:14:20.829 INFO:tasks.workunit.client.0.vm03.stdout:9/195: dread d2/df/f34 [0,4194304] 0 2026-03-09T16:14:20.831 INFO:tasks.workunit.client.0.vm03.stdout:3/154: symlink d5/d27/l2a 0 2026-03-09T16:14:20.834 INFO:tasks.workunit.client.0.vm03.stdout:3/155: dwrite d5/d13/f29 [0,4194304] 0 2026-03-09T16:14:20.836 INFO:tasks.workunit.client.0.vm03.stdout:3/156: write d5/d1e/f26 [272630,7098] 0 2026-03-09T16:14:20.847 INFO:tasks.workunit.client.0.vm03.stdout:5/194: dwrite d2/d7/de/f2f [0,4194304] 0 2026-03-09T16:14:20.853 INFO:tasks.workunit.client.0.vm03.stdout:2/173: rmdir db/d12/d42 0 2026-03-09T16:14:20.854 INFO:tasks.workunit.client.0.vm03.stdout:2/174: write f9 [449798,111054] 0 2026-03-09T16:14:20.854 INFO:tasks.workunit.client.0.vm03.stdout:5/195: dwrite d2/d7/de/f2f [0,4194304] 0 2026-03-09T16:14:20.867 INFO:tasks.workunit.client.0.vm03.stdout:8/173: dwrite da/d10/f14 [0,4194304] 0 2026-03-09T16:14:20.870 INFO:tasks.workunit.client.0.vm03.stdout:0/190: mknod d0/da/d11/c3b 0 2026-03-09T16:14:20.875 INFO:tasks.workunit.client.0.vm03.stdout:7/117: creat d4/da/d18/d22/d24/d15/f2a x:0 0 0 2026-03-09T16:14:20.878 INFO:tasks.workunit.client.0.vm03.stdout:3/157: sync 2026-03-09T16:14:20.889 INFO:tasks.workunit.client.0.vm03.stdout:1/122: rmdir d4/d6/d1d/d20 39 2026-03-09T16:14:20.889 INFO:tasks.workunit.client.0.vm03.stdout:1/123: read d4/fd [3229622,65708] 0 2026-03-09T16:14:20.889 
INFO:tasks.workunit.client.0.vm03.stdout:2/175: dread - db/f2e zero size 2026-03-09T16:14:20.889 INFO:tasks.workunit.client.0.vm03.stdout:5/196: symlink d2/d7/d8/l46 0 2026-03-09T16:14:20.891 INFO:tasks.workunit.client.0.vm03.stdout:6/141: truncate f7 267503 0 2026-03-09T16:14:20.891 INFO:tasks.workunit.client.0.vm03.stdout:6/142: rename d9 to d9/d14/d23 22 2026-03-09T16:14:20.892 INFO:tasks.workunit.client.0.vm03.stdout:5/197: dread d2/d7/de/d11/d19/d31/f42 [0,4194304] 0 2026-03-09T16:14:20.895 INFO:tasks.workunit.client.0.vm03.stdout:0/191: truncate d0/d7/f8 1014511 0 2026-03-09T16:14:20.896 INFO:tasks.workunit.client.0.vm03.stdout:0/192: write d0/da/f2c [1129221,73778] 0 2026-03-09T16:14:20.898 INFO:tasks.workunit.client.0.vm03.stdout:4/161: mkdir d5/dd/d32 0 2026-03-09T16:14:20.907 INFO:tasks.workunit.client.0.vm03.stdout:4/162: write d5/d17/f18 [655982,59475] 0 2026-03-09T16:14:20.907 INFO:tasks.workunit.client.0.vm03.stdout:7/118: mkdir d4/da/d18/d22/d24/d16/d2b 0 2026-03-09T16:14:20.907 INFO:tasks.workunit.client.0.vm03.stdout:3/158: creat d5/f2b x:0 0 0 2026-03-09T16:14:20.907 INFO:tasks.workunit.client.0.vm03.stdout:9/196: unlink d2/df/f34 0 2026-03-09T16:14:20.909 INFO:tasks.workunit.client.0.vm03.stdout:9/197: dwrite d2/f33 [0,4194304] 0 2026-03-09T16:14:20.909 INFO:tasks.workunit.client.0.vm03.stdout:5/198: sync 2026-03-09T16:14:20.909 INFO:tasks.workunit.client.0.vm03.stdout:3/159: sync 2026-03-09T16:14:20.910 INFO:tasks.workunit.client.0.vm03.stdout:5/199: readlink d2/d7/d1a/l1f 0 2026-03-09T16:14:20.932 INFO:tasks.workunit.client.0.vm03.stdout:6/143: creat d9/d22/f24 x:0 0 0 2026-03-09T16:14:20.932 INFO:tasks.workunit.client.0.vm03.stdout:6/144: write d9/f1e [222984,48830] 0 2026-03-09T16:14:20.934 INFO:tasks.workunit.client.0.vm03.stdout:8/174: mkdir da/d32 0 2026-03-09T16:14:20.937 INFO:tasks.workunit.client.0.vm03.stdout:8/175: dwrite da/d10/f14 [0,4194304] 0 2026-03-09T16:14:20.938 INFO:tasks.workunit.client.0.vm03.stdout:8/176: chown da/db/fe 120186819 1 2026-03-09T16:14:20.954 INFO:tasks.workunit.client.0.vm03.stdout:4/163: dwrite d5/dd/f16 [0,4194304] 0 2026-03-09T16:14:20.955 INFO:tasks.workunit.client.0.vm03.stdout:4/164: dread - d5/db/f2f zero size 2026-03-09T16:14:20.963 INFO:tasks.workunit.client.0.vm03.stdout:2/176: write f7 [1154762,3618] 0 2026-03-09T16:14:20.964 INFO:tasks.workunit.client.0.vm03.stdout:7/119: rmdir d4 39 2026-03-09T16:14:20.966 INFO:tasks.workunit.client.0.vm03.stdout:1/124: mknod d4/d6/c2b 0 2026-03-09T16:14:20.968 INFO:tasks.workunit.client.0.vm03.stdout:3/160: unlink d5/d13/f14 0 2026-03-09T16:14:20.969 INFO:tasks.workunit.client.0.vm03.stdout:5/200: mknod d2/d7/c47 0 2026-03-09T16:14:20.970 INFO:tasks.workunit.client.0.vm03.stdout:5/201: read d2/d7/de/f2f [375586,77169] 0 2026-03-09T16:14:20.971 INFO:tasks.workunit.client.0.vm03.stdout:5/202: truncate d2/d7/de/d11/f44 4684406 0 2026-03-09T16:14:20.975 INFO:tasks.workunit.client.0.vm03.stdout:6/145: dwrite d9/f15 [0,4194304] 0 2026-03-09T16:14:20.975 INFO:tasks.workunit.client.0.vm03.stdout:4/165: sync 2026-03-09T16:14:20.976 INFO:tasks.workunit.client.0.vm03.stdout:4/166: dread - d5/db/f2f zero size 2026-03-09T16:14:20.976 INFO:tasks.workunit.client.0.vm03.stdout:4/167: chown d5/dd/f16 2076687 1 2026-03-09T16:14:20.978 INFO:tasks.workunit.client.0.vm03.stdout:4/168: truncate d5/db/d25/f26 912175 0 2026-03-09T16:14:20.980 INFO:tasks.workunit.client.0.vm03.stdout:8/177: unlink da/c20 0 2026-03-09T16:14:20.984 INFO:tasks.workunit.client.0.vm03.stdout:4/169: sync 2026-03-09T16:14:20.992 
INFO:tasks.workunit.client.0.vm03.stdout:0/193: link d0/da/f2c d0/da/d11/d32/d35/f3c 0 2026-03-09T16:14:21.000 INFO:tasks.workunit.client.0.vm03.stdout:0/194: chown d0/d7 4624 1 2026-03-09T16:14:21.001 INFO:tasks.workunit.client.0.vm03.stdout:0/195: dwrite d0/f3 [0,4194304] 0 2026-03-09T16:14:21.001 INFO:tasks.workunit.client.0.vm03.stdout:2/177: mknod db/d3b/c44 0 2026-03-09T16:14:21.006 INFO:tasks.workunit.client.0.vm03.stdout:5/203: creat d2/d7/de/f48 x:0 0 0 2026-03-09T16:14:21.007 INFO:tasks.workunit.client.0.vm03.stdout:9/198: truncate d2/df/f22 891921 0 2026-03-09T16:14:21.007 INFO:tasks.workunit.client.0.vm03.stdout:9/199: chown d2/df/l24 463757292 1 2026-03-09T16:14:21.011 INFO:tasks.workunit.client.0.vm03.stdout:9/200: dwrite d2/d4/d11/d12/f1e [0,4194304] 0 2026-03-09T16:14:21.019 INFO:tasks.workunit.client.0.vm03.stdout:6/146: symlink d9/d14/l25 0 2026-03-09T16:14:21.020 INFO:tasks.workunit.client.0.vm03.stdout:8/178: creat da/d10/f33 x:0 0 0 2026-03-09T16:14:21.021 INFO:tasks.workunit.client.0.vm03.stdout:8/179: stat da/db/fe 0 2026-03-09T16:14:21.022 INFO:tasks.workunit.client.0.vm03.stdout:4/170: rename d5/dd/d1f/d24 to d5/db/d25/d31/d33 0 2026-03-09T16:14:21.026 INFO:tasks.workunit.client.0.vm03.stdout:7/120: symlink d4/da/d18/d22/d24/l2c 0 2026-03-09T16:14:21.028 INFO:tasks.workunit.client.0.vm03.stdout:4/171: dread d5/d17/f2b [0,4194304] 0 2026-03-09T16:14:21.030 INFO:tasks.workunit.client.0.vm03.stdout:5/204: unlink d2/d7/de/d11/d19/d29/l3e 0 2026-03-09T16:14:21.034 INFO:tasks.workunit.client.0.vm03.stdout:6/147: unlink d9/f21 0 2026-03-09T16:14:21.037 INFO:tasks.workunit.client.0.vm03.stdout:9/201: sync 2026-03-09T16:14:21.039 INFO:tasks.workunit.client.0.vm03.stdout:8/180: dwrite da/d15/f1b [0,4194304] 0 2026-03-09T16:14:21.042 INFO:tasks.workunit.client.0.vm03.stdout:9/202: dwrite d2/d4/fd [0,4194304] 0 2026-03-09T16:14:21.044 INFO:tasks.workunit.client.0.vm03.stdout:9/203: fdatasync d2/d4/d1f/f25 0 2026-03-09T16:14:21.049 INFO:tasks.workunit.client.0.vm03.stdout:8/181: dwrite da/d10/f14 [0,4194304] 0 2026-03-09T16:14:21.060 INFO:tasks.workunit.client.0.vm03.stdout:2/178: rename db/d12/d2a/f41 to db/d1e/f45 0 2026-03-09T16:14:21.075 INFO:tasks.workunit.client.0.vm03.stdout:7/121: unlink d4/da/d19/f28 0 2026-03-09T16:14:21.075 INFO:tasks.workunit.client.0.vm03.stdout:1/125: link d4/ce d4/d6/d1d/d24/c2c 0 2026-03-09T16:14:21.075 INFO:tasks.workunit.client.0.vm03.stdout:5/205: unlink d2/d7/de/f2f 0 2026-03-09T16:14:21.075 INFO:tasks.workunit.client.0.vm03.stdout:5/206: write d2/d7/de/d11/f32 [1524781,15335] 0 2026-03-09T16:14:21.075 INFO:tasks.workunit.client.0.vm03.stdout:6/148: symlink d9/d22/l26 0 2026-03-09T16:14:21.087 INFO:tasks.workunit.client.0.vm03.stdout:2/179: unlink db/d1e/f2b 0 2026-03-09T16:14:21.089 INFO:tasks.workunit.client.0.vm03.stdout:0/196: creat d0/d7/f3d x:0 0 0 2026-03-09T16:14:21.089 INFO:tasks.workunit.client.0.vm03.stdout:0/197: write d0/d7/f3d [710167,28147] 0 2026-03-09T16:14:21.095 INFO:tasks.workunit.client.0.vm03.stdout:2/180: dread db/f31 [0,4194304] 0 2026-03-09T16:14:21.096 INFO:tasks.workunit.client.0.vm03.stdout:2/181: dread db/f31 [0,4194304] 0 2026-03-09T16:14:21.097 INFO:tasks.workunit.client.0.vm03.stdout:2/182: chown db/d12/d2a/f38 12 1 2026-03-09T16:14:21.097 INFO:tasks.workunit.client.0.vm03.stdout:2/183: dread - db/d1e/f45 zero size 2026-03-09T16:14:21.098 INFO:tasks.workunit.client.0.vm03.stdout:2/184: truncate db/d3b/f3f 865850 0 2026-03-09T16:14:21.102 INFO:tasks.workunit.client.0.vm03.stdout:3/161: getdents d5/d1e 0 
2026-03-09T16:14:21.106 INFO:tasks.workunit.client.0.vm03.stdout:3/162: dwrite d5/f10 [0,4194304] 0 2026-03-09T16:14:21.111 INFO:tasks.workunit.client.0.vm03.stdout:5/207: mknod d2/c49 0 2026-03-09T16:14:21.119 INFO:tasks.workunit.client.0.vm03.stdout:9/204: symlink d2/d4/d11/d29/d2a/d38/l3b 0 2026-03-09T16:14:21.127 INFO:tasks.workunit.client.0.vm03.stdout:4/172: write d5/fa [2497972,56964] 0 2026-03-09T16:14:21.135 INFO:tasks.workunit.client.0.vm03.stdout:7/122: mkdir d4/d2d 0 2026-03-09T16:14:21.136 INFO:tasks.workunit.client.0.vm03.stdout:7/123: readlink d4/da/d18/d22/d24/d15/l1f 0 2026-03-09T16:14:21.140 INFO:tasks.workunit.client.0.vm03.stdout:3/163: creat d5/d13/f2c x:0 0 0 2026-03-09T16:14:21.140 INFO:tasks.workunit.client.0.vm03.stdout:3/164: fdatasync d5/f10 0 2026-03-09T16:14:21.141 INFO:tasks.workunit.client.0.vm03.stdout:3/165: readlink d5/l21 0 2026-03-09T16:14:21.142 INFO:tasks.workunit.client.0.vm03.stdout:5/208: fsync d2/d7/de/d11/d19/d31/f42 0 2026-03-09T16:14:21.143 INFO:tasks.workunit.client.0.vm03.stdout:5/209: chown d2/d7/de/d11/d19/d31/f42 1893 1 2026-03-09T16:14:21.145 INFO:tasks.workunit.client.0.vm03.stdout:5/210: dread d2/d7/de/d11/f44 [0,4194304] 0 2026-03-09T16:14:21.146 INFO:tasks.workunit.client.0.vm03.stdout:5/211: fsync d2/d7/de/d11/f26 0 2026-03-09T16:14:21.148 INFO:tasks.workunit.client.0.vm03.stdout:5/212: truncate d2/d7/de/d11/d19/d31/f42 4697061 0 2026-03-09T16:14:21.150 INFO:tasks.workunit.client.0.vm03.stdout:5/213: dread d2/d7/de/d11/d19/d31/f42 [4194304,4194304] 0 2026-03-09T16:14:21.158 INFO:tasks.workunit.client.0.vm03.stdout:9/205: mknod d2/df/c3c 0 2026-03-09T16:14:21.158 INFO:tasks.workunit.client.0.vm03.stdout:8/182: creat da/db/f34 x:0 0 0 2026-03-09T16:14:21.158 INFO:tasks.workunit.client.0.vm03.stdout:9/206: fdatasync d2/d4/d1f/f23 0 2026-03-09T16:14:21.159 INFO:tasks.workunit.client.0.vm03.stdout:4/173: write d5/d17/f14 [408780,90711] 0 2026-03-09T16:14:21.160 INFO:tasks.workunit.client.0.vm03.stdout:4/174: read - d5/db/f2f zero size 2026-03-09T16:14:21.161 INFO:tasks.workunit.client.0.vm03.stdout:4/175: readlink l0 0 2026-03-09T16:14:21.161 INFO:tasks.workunit.client.0.vm03.stdout:4/176: readlink d5/dd/d1f/l2c 0 2026-03-09T16:14:21.162 INFO:tasks.workunit.client.0.vm03.stdout:8/183: dread da/db/f1c [0,4194304] 0 2026-03-09T16:14:21.168 INFO:tasks.workunit.client.0.vm03.stdout:8/184: dwrite da/d10/f1f [0,4194304] 0 2026-03-09T16:14:21.169 INFO:tasks.workunit.client.0.vm03.stdout:8/185: write da/db/f34 [534531,105158] 0 2026-03-09T16:14:21.171 INFO:tasks.workunit.client.0.vm03.stdout:8/186: write da/d15/f2f [521377,74597] 0 2026-03-09T16:14:21.183 INFO:tasks.workunit.client.0.vm03.stdout:1/126: dwrite d4/f1b [0,4194304] 0 2026-03-09T16:14:21.205 INFO:tasks.workunit.client.0.vm03.stdout:5/214: creat d2/d7/de/d11/d19/d31/d35/d39/f4a x:0 0 0 2026-03-09T16:14:21.224 INFO:tasks.workunit.client.0.vm03.stdout:4/177: creat d5/db/f34 x:0 0 0 2026-03-09T16:14:21.237 INFO:tasks.workunit.client.0.vm03.stdout:1/127: unlink d4/db/l1a 0 2026-03-09T16:14:21.238 INFO:tasks.workunit.client.0.vm03.stdout:1/128: readlink d4/l14 0 2026-03-09T16:14:21.241 INFO:tasks.workunit.client.0.vm03.stdout:7/124: mknod d4/da/d18/d22/d24/d16/d2b/c2e 0 2026-03-09T16:14:21.242 INFO:tasks.workunit.client.0.vm03.stdout:7/125: chown d4/da/l21 3 1 2026-03-09T16:14:21.245 INFO:tasks.workunit.client.0.vm03.stdout:7/126: dwrite d4/da/d18/f1b [0,4194304] 0 2026-03-09T16:14:21.263 INFO:tasks.workunit.client.0.vm03.stdout:6/149: getdents d9 0 2026-03-09T16:14:21.266 
INFO:tasks.workunit.client.0.vm03.stdout:6/150: dwrite d9/d22/f24 [0,4194304] 0 2026-03-09T16:14:21.270 INFO:tasks.workunit.client.0.vm03.stdout:6/151: dwrite d9/d22/f24 [0,4194304] 0 2026-03-09T16:14:21.271 INFO:tasks.workunit.client.0.vm03.stdout:6/152: write d9/ff [2801676,128513] 0 2026-03-09T16:14:21.272 INFO:tasks.workunit.client.0.vm03.stdout:6/153: write d9/f20 [293813,99864] 0 2026-03-09T16:14:21.280 INFO:tasks.workunit.client.0.vm03.stdout:3/166: truncate d5/d13/f29 4080913 0 2026-03-09T16:14:21.282 INFO:tasks.workunit.client.0.vm03.stdout:4/178: mknod d5/c35 0 2026-03-09T16:14:21.283 INFO:tasks.workunit.client.0.vm03.stdout:4/179: write d5/d17/f18 [25394,40096] 0 2026-03-09T16:14:21.284 INFO:tasks.workunit.client.0.vm03.stdout:4/180: write d5/dd/f23 [769956,47394] 0 2026-03-09T16:14:21.288 INFO:tasks.workunit.client.0.vm03.stdout:4/181: dwrite d5/f8 [0,4194304] 0 2026-03-09T16:14:21.290 INFO:tasks.workunit.client.0.vm03.stdout:4/182: chown d5/db/l1a 61693 1 2026-03-09T16:14:21.291 INFO:tasks.workunit.client.0.vm03.stdout:4/183: write d5/db/f34 [1004805,6854] 0 2026-03-09T16:14:21.292 INFO:tasks.workunit.client.0.vm03.stdout:4/184: write d5/fa [3975641,115654] 0 2026-03-09T16:14:21.294 INFO:tasks.workunit.client.0.vm03.stdout:0/198: rename d0/d7/d2a to d0/d7/d3e 0 2026-03-09T16:14:21.311 INFO:tasks.workunit.client.0.vm03.stdout:9/207: truncate d2/d4/d11/f1d 790756 0 2026-03-09T16:14:21.326 INFO:tasks.workunit.client.0.vm03.stdout:2/185: link db/c27 db/d12/d2a/c46 0 2026-03-09T16:14:21.326 INFO:tasks.workunit.client.0.vm03.stdout:2/186: dread - db/d1e/f20 zero size 2026-03-09T16:14:21.326 INFO:tasks.workunit.client.0.vm03.stdout:5/215: mkdir d2/d7/d8/d24/d27/d43/d4b 0 2026-03-09T16:14:21.326 INFO:tasks.workunit.client.0.vm03.stdout:6/154: creat d9/d22/f27 x:0 0 0 2026-03-09T16:14:21.326 INFO:tasks.workunit.client.0.vm03.stdout:4/185: stat d5/dd/f1e 0 2026-03-09T16:14:21.327 INFO:tasks.workunit.client.0.vm03.stdout:4/186: write d5/fa [2656143,15691] 0 2026-03-09T16:14:21.336 INFO:tasks.workunit.client.0.vm03.stdout:8/187: link da/d10/f33 da/f35 0 2026-03-09T16:14:21.336 INFO:tasks.workunit.client.0.vm03.stdout:8/188: write da/d10/d28/f2c [8718,75618] 0 2026-03-09T16:14:21.337 INFO:tasks.workunit.client.0.vm03.stdout:8/189: write da/d10/f1f [1622526,66459] 0 2026-03-09T16:14:21.340 INFO:tasks.workunit.client.0.vm03.stdout:9/208: creat d2/d4/d11/d12/f3d x:0 0 0 2026-03-09T16:14:21.346 INFO:tasks.workunit.client.0.vm03.stdout:2/187: creat db/d1e/f47 x:0 0 0 2026-03-09T16:14:21.347 INFO:tasks.workunit.client.0.vm03.stdout:2/188: dread f0 [0,4194304] 0 2026-03-09T16:14:21.350 INFO:tasks.workunit.client.0.vm03.stdout:2/189: stat db/d12/f37 0 2026-03-09T16:14:21.350 INFO:tasks.workunit.client.0.vm03.stdout:7/127: link d4/da/d18/f1b d4/da/d18/d22/d24/f2f 0 2026-03-09T16:14:21.355 INFO:tasks.workunit.client.0.vm03.stdout:5/216: fdatasync d2/d7/de/d11/d19/d31/f42 0 2026-03-09T16:14:21.361 INFO:tasks.workunit.client.0.vm03.stdout:6/155: creat d9/d14/f28 x:0 0 0 2026-03-09T16:14:21.361 INFO:tasks.workunit.client.0.vm03.stdout:6/156: stat d9 0 2026-03-09T16:14:21.363 INFO:tasks.workunit.client.0.vm03.stdout:6/157: dread d9/d22/f1c [0,4194304] 0 2026-03-09T16:14:21.364 INFO:tasks.workunit.client.0.vm03.stdout:6/158: write d9/d14/f1d [940628,19279] 0 2026-03-09T16:14:21.366 INFO:tasks.workunit.client.0.vm03.stdout:6/159: dread d9/f15 [0,4194304] 0 2026-03-09T16:14:21.374 INFO:tasks.workunit.client.0.vm03.stdout:3/167: rmdir d5/d13 39 2026-03-09T16:14:21.393 
INFO:tasks.workunit.client.0.vm03.stdout:4/187: mknod d5/db/c36 0 2026-03-09T16:14:21.393 INFO:tasks.workunit.client.0.vm03.stdout:4/188: chown d5/dd/d1f/f2d 30489978 1 2026-03-09T16:14:21.397 INFO:tasks.workunit.client.0.vm03.stdout:4/189: dwrite d5/dd/d1f/f2d [0,4194304] 0 2026-03-09T16:14:21.398 INFO:tasks.workunit.client.0.vm03.stdout:1/129: write d4/d6/f9 [559306,116935] 0 2026-03-09T16:14:21.398 INFO:tasks.workunit.client.0.vm03.stdout:1/130: chown d4/fd 12860106 1 2026-03-09T16:14:21.400 INFO:tasks.workunit.client.0.vm03.stdout:4/190: chown d5/db/l1a 6700419 1 2026-03-09T16:14:21.405 INFO:tasks.workunit.client.0.vm03.stdout:1/131: dwrite f1 [0,4194304] 0 2026-03-09T16:14:21.418 INFO:tasks.workunit.client.0.vm03.stdout:9/209: creat d2/d4/f3e x:0 0 0 2026-03-09T16:14:21.425 INFO:tasks.workunit.client.0.vm03.stdout:2/190: creat db/d12/f48 x:0 0 0 2026-03-09T16:14:21.429 INFO:tasks.workunit.client.0.vm03.stdout:7/128: creat d4/da/d18/d22/d24/f30 x:0 0 0 2026-03-09T16:14:21.433 INFO:tasks.workunit.client.0.vm03.stdout:7/129: dwrite d4/f26 [0,4194304] 0 2026-03-09T16:14:21.434 INFO:tasks.workunit.client.0.vm03.stdout:7/130: write d4/dc/f1a [895024,69431] 0 2026-03-09T16:14:21.435 INFO:tasks.workunit.client.0.vm03.stdout:7/131: readlink d4/ld 0 2026-03-09T16:14:21.440 INFO:tasks.workunit.client.0.vm03.stdout:7/132: dwrite d4/fe [0,4194304] 0 2026-03-09T16:14:21.461 INFO:tasks.workunit.client.0.vm03.stdout:1/132: fsync d4/d6/f9 0 2026-03-09T16:14:21.463 INFO:tasks.workunit.client.0.vm03.stdout:8/190: creat da/db/d30/f36 x:0 0 0 2026-03-09T16:14:21.467 INFO:tasks.workunit.client.0.vm03.stdout:2/191: creat db/d12/f49 x:0 0 0 2026-03-09T16:14:21.468 INFO:tasks.workunit.client.0.vm03.stdout:2/192: chown db/d12/l1c 62841 1 2026-03-09T16:14:21.476 INFO:tasks.workunit.client.0.vm03.stdout:6/160: link d9/ff d9/d14/f29 0 2026-03-09T16:14:21.488 INFO:tasks.workunit.client.0.vm03.stdout:7/133: fdatasync d4/da/d18/d22/d24/f2f 0 2026-03-09T16:14:21.490 INFO:tasks.workunit.client.0.vm03.stdout:0/199: link d0/c24 d0/d7/c3f 0 2026-03-09T16:14:21.494 INFO:tasks.workunit.client.0.vm03.stdout:1/133: chown d4/d6/lc 0 1 2026-03-09T16:14:21.499 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:21.499 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:21.499 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:14:21.499 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:14:21.499 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:21.499 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:21.499 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.14225 
192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr fail", "who": "vm03.gbgzmu"}]: dispatch 2026-03-09T16:14:21.499 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: Activating manager daemon vm05.dygxfv 2026-03-09T16:14:21.499 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "mgr fail", "who": "vm03.gbgzmu"}]': finished 2026-03-09T16:14:21.499 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: osdmap e42: 6 total, 6 up, 6 in 2026-03-09T16:14:21.499 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: mgrmap e19: vm05.dygxfv(active, starting, since 0.0142017s) 2026-03-09T16:14:21.499 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: Active manager daemon vm05.dygxfv restarted 2026-03-09T16:14:21.499 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: Activating manager daemon vm05.dygxfv 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/crt"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/key"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.? 
192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: osdmap e43: 6 total, 6 up, 6 in 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: mgrmap e20: vm05.dygxfv(active, starting, since 0.0765659s) 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kygyjl"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kntrco"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.sqhria"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.jgzfvu"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mgr metadata", "who": "vm05.dygxfv", "id": "vm05.dygxfv"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' 
cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:21 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr fail", "who": "vm03.gbgzmu"}]: dispatch 2026-03-09T16:14:21.500 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: Activating manager daemon vm05.dygxfv 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.14225 192.168.123.103:0/1183707335' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "mgr fail", "who": "vm03.gbgzmu"}]': finished 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: osdmap e42: 6 total, 6 up, 6 in 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: mgrmap e19: vm05.dygxfv(active, starting, since 0.0142017s) 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: Active manager daemon vm05.dygxfv restarted 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: Activating manager daemon vm05.dygxfv 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.? 
192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/crt"}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.? 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.? 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/key"}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.? 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: osdmap e43: 6 total, 6 up, 6 in 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: mgrmap e20: vm05.dygxfv(active, starting, since 0.0765659s) 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kygyjl"}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kntrco"}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.sqhria"}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.jgzfvu"}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mgr metadata", "who": "vm05.dygxfv", "id": "vm05.dygxfv"}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: 
from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T16:14:21.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:21 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T16:14:21.508 INFO:tasks.workunit.client.0.vm03.stdout:5/217: link d2/d7/d8/d16/c10 d2/d7/d8/d24/d27/d43/d4b/c4c 0 2026-03-09T16:14:21.509 INFO:tasks.workunit.client.0.vm03.stdout:5/218: read - d2/d7/de/d11/d19/d31/d35/d39/f4a zero size 2026-03-09T16:14:21.512 INFO:tasks.workunit.client.0.vm03.stdout:5/219: dwrite d2/d7/de/d11/f32 [0,4194304] 0 2026-03-09T16:14:21.513 INFO:tasks.workunit.client.0.vm03.stdout:8/191: sync 2026-03-09T16:14:21.514 INFO:tasks.workunit.client.0.vm03.stdout:8/192: dread - da/f35 zero size 2026-03-09T16:14:21.515 INFO:tasks.workunit.client.0.vm03.stdout:8/193: fdatasync f8 0 2026-03-09T16:14:21.515 INFO:tasks.workunit.client.0.vm03.stdout:8/194: stat da/d15 0 2026-03-09T16:14:21.515 INFO:tasks.workunit.client.0.vm03.stdout:8/195: write da/d10/d28/f2c [689233,9340] 0 2026-03-09T16:14:21.516 INFO:tasks.workunit.client.0.vm03.stdout:8/196: fdatasync da/db/f34 0 2026-03-09T16:14:21.517 INFO:tasks.workunit.client.0.vm03.stdout:8/197: chown da/d10/d28/c2d 15420706 1 2026-03-09T16:14:21.524 INFO:tasks.workunit.client.0.vm03.stdout:8/198: sync 2026-03-09T16:14:21.527 INFO:tasks.workunit.client.0.vm03.stdout:8/199: dread da/d10/f14 [0,4194304] 0 2026-03-09T16:14:21.536 INFO:tasks.workunit.client.0.vm03.stdout:2/193: rename db/d1e/l43 to db/d12/d2a/l4a 0 2026-03-09T16:14:21.536 INFO:tasks.workunit.client.0.vm03.stdout:2/194: read - db/d12/f48 zero size 2026-03-09T16:14:21.550 INFO:tasks.workunit.client.0.vm03.stdout:4/191: dwrite d5/f8 [4194304,4194304] 0 2026-03-09T16:14:21.556 INFO:tasks.workunit.client.0.vm03.stdout:4/192: dwrite d5/dd/f22 [0,4194304] 0 2026-03-09T16:14:21.581 INFO:tasks.workunit.client.0.vm03.stdout:6/161: symlink d9/l2a 0 2026-03-09T16:14:21.584 INFO:tasks.workunit.client.0.vm03.stdout:3/168: link d5/d27/l28 d5/d27/l2d 0 2026-03-09T16:14:21.586 INFO:tasks.workunit.client.0.vm03.stdout:7/134: rmdir d4/da/d18 39 2026-03-09T16:14:21.587 INFO:tasks.workunit.client.0.vm03.stdout:7/135: write d4/f26 [973416,84257] 0 2026-03-09T16:14:21.589 
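(Editor's note) The journalctl@ceph.mon records above capture the staggered-mgr upgrade step: the outgoing active mgr (vm03.gbgzmu) regenerates the minimal conf, dumps config, then issues "mgr fail" against itself; the monitors activate the standby (vm05.dygxfv), which re-queries mon/mds/osd/mgr metadata, and the mgrmap epoch bumps from e19 to e20. A minimal sketch of how one might drive and wait for that handover from a client node, assuming the ceph CLI is on PATH with a keyring that permits "mgr fail" and "mgr stat", and that "mgr stat --format json" reports active_name (the polling helper itself is hypothetical, not part of this test):

    import json
    import subprocess
    import time

    def active_mgr() -> str:
        """Return the currently active mgr name via `ceph mgr stat`."""
        out = subprocess.run(
            ["ceph", "mgr", "stat", "--format", "json"],
            check=True, capture_output=True, text=True,
        ).stdout
        return json.loads(out).get("active_name", "")

    def fail_over_and_wait(old_active: str, timeout: float = 120.0) -> str:
        """Ask the mons to fail the given mgr, then poll until a different
        daemon reports as active (mirroring the 'Activating manager daemon' /
        'Active manager daemon ... restarted' lines in the mon log above)."""
        subprocess.run(["ceph", "mgr", "fail", old_active], check=True)
        deadline = time.time() + timeout
        while time.time() < deadline:
            name = active_mgr()
            if name and name != old_active:
                return name
            time.sleep(2)
        raise TimeoutError(f"no new active mgr within {timeout}s")

The mgrmap epoch bump (e19 -> e20) recorded by both mons corresponds to the activation such a poll loop would observe.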
INFO:tasks.workunit.client.0.vm03.stdout:9/210: dwrite d2/df/f22 [0,4194304] 0 2026-03-09T16:14:21.602 INFO:tasks.workunit.client.0.vm03.stdout:1/134: mknod d4/d6/d1d/d24/c2d 0 2026-03-09T16:14:21.614 INFO:tasks.workunit.client.0.vm03.stdout:5/220: creat d2/d7/d1a/f4d x:0 0 0 2026-03-09T16:14:21.624 INFO:tasks.workunit.client.0.vm03.stdout:1/135: sync 2026-03-09T16:14:21.626 INFO:tasks.workunit.client.0.vm03.stdout:9/211: fdatasync d2/df/f22 0 2026-03-09T16:14:21.634 INFO:tasks.workunit.client.0.vm03.stdout:0/200: rename d0/da/d11/l25 to d0/da/d11/d32/d35/l40 0 2026-03-09T16:14:21.634 INFO:tasks.workunit.client.0.vm03.stdout:0/201: fsync d0/d7/f3d 0 2026-03-09T16:14:21.636 INFO:tasks.workunit.client.0.vm03.stdout:2/195: readlink db/l1d 0 2026-03-09T16:14:21.648 INFO:tasks.workunit.client.0.vm03.stdout:6/162: fsync d9/d14/f1d 0 2026-03-09T16:14:21.656 INFO:tasks.workunit.client.0.vm03.stdout:3/169: dread d5/d13/f25 [0,4194304] 0 2026-03-09T16:14:21.656 INFO:tasks.workunit.client.0.vm03.stdout:3/170: fsync d5/f2b 0 2026-03-09T16:14:21.664 INFO:tasks.workunit.client.0.vm03.stdout:7/136: write d4/da/d18/d22/d24/f30 [16589,66270] 0 2026-03-09T16:14:21.665 INFO:tasks.workunit.client.0.vm03.stdout:7/137: truncate d4/da/f20 1020049 0 2026-03-09T16:14:21.673 INFO:tasks.workunit.client.0.vm03.stdout:5/221: symlink d2/d7/d8/d24/d27/d43/l4e 0 2026-03-09T16:14:21.677 INFO:tasks.workunit.client.0.vm03.stdout:1/136: creat d4/db/f2e x:0 0 0 2026-03-09T16:14:21.683 INFO:tasks.workunit.client.0.vm03.stdout:8/200: symlink da/d32/l37 0 2026-03-09T16:14:21.691 INFO:tasks.workunit.client.0.vm03.stdout:0/202: creat d0/da/d11/d32/d37/f41 x:0 0 0 2026-03-09T16:14:21.694 INFO:tasks.workunit.client.0.vm03.stdout:0/203: write d0/da/d11/d32/d37/f33 [365381,113741] 0 2026-03-09T16:14:21.695 INFO:tasks.workunit.client.0.vm03.stdout:2/196: unlink db/d12/f48 0 2026-03-09T16:14:21.705 INFO:tasks.workunit.client.0.vm03.stdout:4/193: mknod d5/dd/c37 0 2026-03-09T16:14:21.705 INFO:tasks.workunit.client.0.vm03.stdout:4/194: readlink d5/dd/d1f/l27 0 2026-03-09T16:14:21.707 INFO:tasks.workunit.client.0.vm03.stdout:6/163: mknod d9/d22/c2b 0 2026-03-09T16:14:21.719 INFO:tasks.workunit.client.0.vm03.stdout:9/212: truncate d2/d4/d11/f1d 1319761 0 2026-03-09T16:14:21.730 INFO:tasks.workunit.client.0.vm03.stdout:0/204: write d0/f29 [73012,54197] 0 2026-03-09T16:14:21.730 INFO:tasks.workunit.client.0.vm03.stdout:0/205: stat d0/da/d1b/l36 0 2026-03-09T16:14:21.731 INFO:tasks.workunit.client.0.vm03.stdout:0/206: chown d0/da/c23 231 1 2026-03-09T16:14:21.734 INFO:tasks.workunit.client.0.vm03.stdout:2/197: creat db/d12/f4b x:0 0 0 2026-03-09T16:14:21.743 INFO:tasks.workunit.client.0.vm03.stdout:6/164: rename d9/f13 to d9/f2c 0 2026-03-09T16:14:21.750 INFO:tasks.workunit.client.0.vm03.stdout:6/165: dwrite d9/f20 [0,4194304] 0 2026-03-09T16:14:21.762 INFO:tasks.workunit.client.0.vm03.stdout:3/171: mkdir d5/d2e 0 2026-03-09T16:14:21.770 INFO:tasks.workunit.client.0.vm03.stdout:7/138: unlink d4/c7 0 2026-03-09T16:14:21.771 INFO:tasks.workunit.client.0.vm03.stdout:7/139: write d4/da/d18/f29 [729629,50473] 0 2026-03-09T16:14:21.772 INFO:tasks.workunit.client.0.vm03.stdout:7/140: write d4/dc/f1a [1476672,35842] 0 2026-03-09T16:14:21.772 INFO:tasks.workunit.client.0.vm03.stdout:7/141: chown d4/f26 57857401 1 2026-03-09T16:14:21.785 INFO:tasks.workunit.client.0.vm03.stdout:1/137: mknod d4/d6/d1d/d24/d25/c2f 0 2026-03-09T16:14:21.785 INFO:tasks.workunit.client.0.vm03.stdout:7/142: dwrite d4/dc/f1a [0,4194304] 0 2026-03-09T16:14:21.785 
INFO:tasks.workunit.client.0.vm03.stdout:1/138: chown d4/c29 12 1 2026-03-09T16:14:21.791 INFO:tasks.workunit.client.0.vm03.stdout:7/143: write d4/f26 [381552,21253] 0 2026-03-09T16:14:21.794 INFO:tasks.workunit.client.0.vm03.stdout:7/144: write d4/f8 [5270918,95595] 0 2026-03-09T16:14:21.802 INFO:tasks.workunit.client.0.vm03.stdout:9/213: mknod d2/df/c3f 0 2026-03-09T16:14:21.807 INFO:tasks.workunit.client.0.vm03.stdout:9/214: dwrite d2/df/f22 [4194304,4194304] 0 2026-03-09T16:14:21.826 INFO:tasks.workunit.client.0.vm03.stdout:4/195: mknod d5/dd/c38 0 2026-03-09T16:14:21.826 INFO:tasks.workunit.client.0.vm03.stdout:2/198: creat db/d1e/f4c x:0 0 0 2026-03-09T16:14:21.843 INFO:tasks.workunit.client.0.vm03.stdout:0/207: truncate d0/da/f2c 897539 0 2026-03-09T16:14:21.859 INFO:tasks.workunit.client.0.vm03.stdout:2/199: unlink f8 0 2026-03-09T16:14:21.859 INFO:tasks.workunit.client.0.vm03.stdout:2/200: readlink db/d12/l1c 0 2026-03-09T16:14:21.863 INFO:tasks.workunit.client.0.vm03.stdout:5/222: getdents d2 0 2026-03-09T16:14:21.864 INFO:tasks.workunit.client.0.vm03.stdout:5/223: stat d2/d7/d8/d24/d27/l2c 0 2026-03-09T16:14:21.868 INFO:tasks.workunit.client.0.vm03.stdout:5/224: dwrite d2/d7/d1a/f4d [0,4194304] 0 2026-03-09T16:14:21.884 INFO:tasks.workunit.client.0.vm03.stdout:3/172: mknod d5/c2f 0 2026-03-09T16:14:21.886 INFO:tasks.workunit.client.0.vm03.stdout:7/145: mknod d4/da/c31 0 2026-03-09T16:14:21.891 INFO:tasks.workunit.client.0.vm03.stdout:8/201: rename da/d15/c21 to da/c38 0 2026-03-09T16:14:21.898 INFO:tasks.workunit.client.0.vm03.stdout:6/166: truncate d9/f20 3208256 0 2026-03-09T16:14:21.899 INFO:tasks.workunit.client.0.vm03.stdout:6/167: fsync d9/d22/f27 0 2026-03-09T16:14:21.904 INFO:tasks.workunit.client.0.vm03.stdout:2/201: dwrite f5 [0,4194304] 0 2026-03-09T16:14:21.906 INFO:tasks.workunit.client.0.vm03.stdout:2/202: stat db/d12/l32 0 2026-03-09T16:14:21.909 INFO:tasks.workunit.client.0.vm03.stdout:2/203: dread f9 [0,4194304] 0 2026-03-09T16:14:21.919 INFO:tasks.workunit.client.0.vm03.stdout:5/225: symlink d2/d7/d8/d16/l4f 0 2026-03-09T16:14:21.926 INFO:tasks.workunit.client.0.vm03.stdout:5/226: write d2/d7/de/d11/d19/d31/d35/d39/f4a [329779,53453] 0 2026-03-09T16:14:21.927 INFO:tasks.workunit.client.0.vm03.stdout:5/227: dread d2/d7/de/d11/f26 [4194304,4194304] 0 2026-03-09T16:14:21.927 INFO:tasks.workunit.client.0.vm03.stdout:5/228: write d2/d7/de/d11/d19/d31/d35/d39/f4a [1316595,46349] 0 2026-03-09T16:14:21.937 INFO:tasks.workunit.client.0.vm03.stdout:3/173: dwrite d5/f16 [0,4194304] 0 2026-03-09T16:14:21.943 INFO:tasks.workunit.client.0.vm03.stdout:3/174: dwrite d5/d13/f1d [0,4194304] 0 2026-03-09T16:14:21.948 INFO:tasks.workunit.client.0.vm03.stdout:3/175: chown d5/d27/l28 178072149 1 2026-03-09T16:14:21.954 INFO:tasks.workunit.client.0.vm03.stdout:0/208: mkdir d0/d42 0 2026-03-09T16:14:21.958 INFO:tasks.workunit.client.0.vm03.stdout:0/209: dwrite d0/da/d11/f2e [0,4194304] 0 2026-03-09T16:14:21.960 INFO:tasks.workunit.client.0.vm03.stdout:0/210: fsync d0/f3 0 2026-03-09T16:14:21.962 INFO:tasks.workunit.client.0.vm03.stdout:0/211: dread d0/da/f1c [0,4194304] 0 2026-03-09T16:14:21.964 INFO:tasks.workunit.client.0.vm03.stdout:8/202: mknod da/d10/c39 0 2026-03-09T16:14:21.964 INFO:tasks.workunit.client.0.vm03.stdout:8/203: chown da/db/d30 110795 1 2026-03-09T16:14:21.965 INFO:tasks.workunit.client.0.vm03.stdout:6/168: creat d9/d22/f2d x:0 0 0 2026-03-09T16:14:21.966 INFO:tasks.workunit.client.0.vm03.stdout:2/204: symlink db/d12/l4d 0 2026-03-09T16:14:21.967 
INFO:tasks.workunit.client.0.vm03.stdout:4/196: rmdir d5/dd/d32 0 2026-03-09T16:14:21.967 INFO:tasks.workunit.client.0.vm03.stdout:4/197: dread - d5/db/f2f zero size 2026-03-09T16:14:21.967 INFO:tasks.workunit.client.0.vm03.stdout:4/198: fdatasync d5/db/f2f 0 2026-03-09T16:14:21.971 INFO:tasks.workunit.client.0.vm03.stdout:4/199: dread d5/db/f34 [0,4194304] 0 2026-03-09T16:14:21.971 INFO:tasks.workunit.client.0.vm03.stdout:5/229: symlink d2/d7/l50 0 2026-03-09T16:14:21.972 INFO:tasks.workunit.client.0.vm03.stdout:7/146: creat d4/d2d/f32 x:0 0 0 2026-03-09T16:14:21.973 INFO:tasks.workunit.client.0.vm03.stdout:4/200: write d5/db/f28 [906891,97135] 0 2026-03-09T16:14:21.982 INFO:tasks.workunit.client.0.vm03.stdout:5/230: dread d2/d7/de/d11/f26 [0,4194304] 0 2026-03-09T16:14:21.983 INFO:tasks.workunit.client.0.vm03.stdout:5/231: fdatasync d2/d7/de/d11/d19/d31/f42 0 2026-03-09T16:14:21.986 INFO:tasks.workunit.client.0.vm03.stdout:9/215: getdents d2/d4/d11/d12/d28 0 2026-03-09T16:14:21.989 INFO:tasks.workunit.client.0.vm03.stdout:9/216: dread d2/d4/d11/d12/d28/f2c [0,4194304] 0 2026-03-09T16:14:21.991 INFO:tasks.workunit.client.0.vm03.stdout:1/139: rename d4/f1f to d4/d6/d1d/d20/d23/f30 0 2026-03-09T16:14:21.991 INFO:tasks.workunit.client.0.vm03.stdout:9/217: truncate d2/d4/f3e 459083 0 2026-03-09T16:14:22.009 INFO:tasks.workunit.client.0.vm03.stdout:8/204: symlink da/db/d30/l3a 0 2026-03-09T16:14:22.009 INFO:tasks.workunit.client.0.vm03.stdout:8/205: write da/d10/f1f [4341988,97371] 0 2026-03-09T16:14:22.016 INFO:tasks.workunit.client.0.vm03.stdout:2/205: rename db/d1e/c2c to db/d1e/c4e 0 2026-03-09T16:14:22.023 INFO:tasks.workunit.client.0.vm03.stdout:4/201: unlink d5/db/l1a 0 2026-03-09T16:14:22.025 INFO:tasks.workunit.client.0.vm03.stdout:4/202: dread d5/dd/f1e [0,4194304] 0 2026-03-09T16:14:22.029 INFO:tasks.workunit.client.0.vm03.stdout:7/147: dwrite d4/da/d18/d22/d24/f2f [0,4194304] 0 2026-03-09T16:14:22.032 INFO:tasks.workunit.client.0.vm03.stdout:7/148: dread d4/da/d18/d22/d24/f30 [0,4194304] 0 2026-03-09T16:14:22.034 INFO:tasks.workunit.client.0.vm03.stdout:5/232: symlink d2/d7/d8/d24/l51 0 2026-03-09T16:14:22.037 INFO:tasks.workunit.client.0.vm03.stdout:5/233: dread d2/d7/de/d11/d19/d31/f42 [0,4194304] 0 2026-03-09T16:14:22.056 INFO:tasks.workunit.client.0.vm03.stdout:9/218: symlink d2/d4/d11/d12/l40 0 2026-03-09T16:14:22.056 INFO:tasks.workunit.client.0.vm03.stdout:9/219: chown d2/df/l32 2118438 1 2026-03-09T16:14:22.056 INFO:tasks.workunit.client.0.vm03.stdout:9/220: chown d2/d4 0 1 2026-03-09T16:14:22.057 INFO:tasks.workunit.client.0.vm03.stdout:9/221: write d2/df/f14 [1806204,24211] 0 2026-03-09T16:14:22.068 INFO:tasks.workunit.client.0.vm03.stdout:8/206: mkdir da/d1d/d3b 0 2026-03-09T16:14:22.079 INFO:tasks.workunit.client.0.vm03.stdout:4/203: rmdir d5/d17 39 2026-03-09T16:14:22.083 INFO:tasks.workunit.client.0.vm03.stdout:7/149: rmdir d4/da/d18/d22/d24/d16 39 2026-03-09T16:14:22.085 INFO:tasks.workunit.client.0.vm03.stdout:5/234: mkdir d2/d7/de/d11/d38/d52 0 2026-03-09T16:14:22.086 INFO:tasks.workunit.client.0.vm03.stdout:5/235: chown d2/d7/d8/d16/l4f 20904791 1 2026-03-09T16:14:22.086 INFO:tasks.workunit.client.0.vm03.stdout:5/236: fsync d2/d7/de/d11/f32 0 2026-03-09T16:14:22.087 INFO:tasks.workunit.client.0.vm03.stdout:3/176: link d5/l17 d5/d27/l30 0 2026-03-09T16:14:22.090 INFO:tasks.workunit.client.0.vm03.stdout:0/212: rename d0/da/d11/d32/d35/f3c to d0/da/d11/f43 0 2026-03-09T16:14:22.091 INFO:tasks.workunit.client.0.vm03.stdout:0/213: write d0/f1a [1300822,48869] 0 
2026-03-09T16:14:22.101 INFO:tasks.workunit.client.0.vm03.stdout:6/169: link d9/cd d9/d14/c2e 0 2026-03-09T16:14:22.113 INFO:tasks.workunit.client.0.vm03.stdout:5/237: symlink d2/d7/d1a/d1c/l53 0 2026-03-09T16:14:22.116 INFO:tasks.workunit.client.0.vm03.stdout:8/207: sync 2026-03-09T16:14:22.117 INFO:tasks.workunit.client.0.vm03.stdout:8/208: stat da/c16 0 2026-03-09T16:14:22.117 INFO:tasks.workunit.client.0.vm03.stdout:8/209: chown da/d10/l2b 659564 1 2026-03-09T16:14:22.117 INFO:tasks.workunit.client.0.vm03.stdout:8/210: fdatasync da/db/f34 0 2026-03-09T16:14:22.120 INFO:tasks.workunit.client.0.vm03.stdout:6/170: dread d9/ff [0,4194304] 0 2026-03-09T16:14:22.121 INFO:tasks.workunit.client.0.vm03.stdout:8/211: dread da/d10/d28/f2c [0,4194304] 0 2026-03-09T16:14:22.121 INFO:tasks.workunit.client.0.vm03.stdout:8/212: dread - da/db/d30/f36 zero size 2026-03-09T16:14:22.125 INFO:tasks.workunit.client.0.vm03.stdout:1/140: truncate d4/f1b 3264352 0 2026-03-09T16:14:22.128 INFO:tasks.workunit.client.0.vm03.stdout:2/206: rename db/l3c to db/d12/d2a/l4f 0 2026-03-09T16:14:22.133 INFO:tasks.workunit.client.0.vm03.stdout:0/214: symlink d0/da/d1b/l44 0 2026-03-09T16:14:22.151 INFO:tasks.workunit.client.0.vm03.stdout:6/171: symlink d9/l2f 0 2026-03-09T16:14:22.153 INFO:tasks.workunit.client.0.vm03.stdout:8/213: mknod da/db/d30/c3c 0 2026-03-09T16:14:22.155 INFO:tasks.workunit.client.0.vm03.stdout:2/207: dread - db/f2d zero size 2026-03-09T16:14:22.156 INFO:tasks.workunit.client.0.vm03.stdout:3/177: rename d5/f24 to d5/d1e/f31 0 2026-03-09T16:14:22.158 INFO:tasks.workunit.client.0.vm03.stdout:0/215: mkdir d0/d7/d3e/d45 0 2026-03-09T16:14:22.158 INFO:tasks.workunit.client.0.vm03.stdout:0/216: write d0/f29 [202431,102025] 0 2026-03-09T16:14:22.160 INFO:tasks.workunit.client.0.vm03.stdout:9/222: creat d2/d4/d11/f41 x:0 0 0 2026-03-09T16:14:22.163 INFO:tasks.workunit.client.0.vm03.stdout:2/208: sync 2026-03-09T16:14:22.164 INFO:tasks.workunit.client.0.vm03.stdout:9/223: dwrite d2/d4/d1f/f23 [0,4194304] 0 2026-03-09T16:14:22.164 INFO:tasks.workunit.client.0.vm03.stdout:9/224: write d2/f8 [7367044,16028] 0 2026-03-09T16:14:22.179 INFO:tasks.workunit.client.0.vm03.stdout:6/172: unlink d9/f2c 0 2026-03-09T16:14:22.180 INFO:tasks.workunit.client.0.vm03.stdout:6/173: dread - d9/f1f zero size 2026-03-09T16:14:22.181 INFO:tasks.workunit.client.0.vm03.stdout:8/214: mkdir da/db/d30/d3d 0 2026-03-09T16:14:22.182 INFO:tasks.workunit.client.0.vm03.stdout:1/141: mkdir d4/d31 0 2026-03-09T16:14:22.183 INFO:tasks.workunit.client.0.vm03.stdout:1/142: fsync d4/db/f21 0 2026-03-09T16:14:22.187 INFO:tasks.workunit.client.0.vm03.stdout:5/238: rename d2/d7/de/d11/d19/d31/d35/d39 to d2/d7/de/d54 0 2026-03-09T16:14:22.188 INFO:tasks.workunit.client.0.vm03.stdout:3/178: mknod d5/d13/c32 0 2026-03-09T16:14:22.192 INFO:tasks.workunit.client.0.vm03.stdout:0/217: creat d0/da/d1b/f46 x:0 0 0 2026-03-09T16:14:22.193 INFO:tasks.workunit.client.0.vm03.stdout:3/179: sync 2026-03-09T16:14:22.195 INFO:tasks.workunit.client.0.vm03.stdout:4/204: getdents d5/db/d25/d31 0 2026-03-09T16:14:22.202 INFO:tasks.workunit.client.0.vm03.stdout:2/209: chown db/ff 51329 1 2026-03-09T16:14:22.206 INFO:tasks.workunit.client.0.vm03.stdout:2/210: dread f7 [0,4194304] 0 2026-03-09T16:14:22.207 INFO:tasks.workunit.client.0.vm03.stdout:2/211: truncate db/d12/f39 836658 0 2026-03-09T16:14:22.214 INFO:tasks.workunit.client.0.vm03.stdout:9/225: rmdir d2/d4/d11/d12 39 2026-03-09T16:14:22.217 INFO:tasks.workunit.client.0.vm03.stdout:7/150: getdents d4/da/d18/d22/d24 
0 2026-03-09T16:14:22.221 INFO:tasks.workunit.client.0.vm03.stdout:6/174: dwrite d9/ff [0,4194304] 0 2026-03-09T16:14:22.239 INFO:tasks.workunit.client.0.vm03.stdout:0/218: mkdir d0/da/d11/d32/d35/d47 0 2026-03-09T16:14:22.250 INFO:tasks.workunit.client.0.vm03.stdout:7/151: creat d4/da/d18/d22/f33 x:0 0 0 2026-03-09T16:14:22.253 INFO:tasks.workunit.client.0.vm03.stdout:6/175: mknod d9/c30 0 2026-03-09T16:14:22.256 INFO:tasks.workunit.client.0.vm03.stdout:6/176: dread d9/d14/f29 [0,4194304] 0 2026-03-09T16:14:22.262 INFO:tasks.workunit.client.0.vm03.stdout:5/239: mknod d2/d7/de/d11/d38/d52/c55 0 2026-03-09T16:14:22.262 INFO:tasks.workunit.client.0.vm03.stdout:3/180: write d5/d1e/f26 [1030207,93065] 0 2026-03-09T16:14:22.263 INFO:tasks.workunit.client.0.vm03.stdout:6/177: dwrite d9/d22/f24 [0,4194304] 0 2026-03-09T16:14:22.264 INFO:tasks.workunit.client.0.vm03.stdout:5/240: fdatasync d2/d7/de/d54/f4a 0 2026-03-09T16:14:22.268 INFO:tasks.workunit.client.0.vm03.stdout:3/181: dread d5/d13/f1d [0,4194304] 0 2026-03-09T16:14:22.268 INFO:tasks.workunit.client.0.vm03.stdout:3/182: fdatasync d5/d13/f2c 0 2026-03-09T16:14:22.269 INFO:tasks.workunit.client.0.vm03.stdout:9/226: dread d2/d4/d11/f1d [0,4194304] 0 2026-03-09T16:14:22.269 INFO:tasks.workunit.client.0.vm03.stdout:3/183: write d5/f11 [146685,84644] 0 2026-03-09T16:14:22.271 INFO:tasks.workunit.client.0.vm03.stdout:5/241: sync 2026-03-09T16:14:22.274 INFO:tasks.workunit.client.0.vm03.stdout:5/242: dread d2/d7/de/d11/d19/d31/f42 [0,4194304] 0 2026-03-09T16:14:22.277 INFO:tasks.workunit.client.0.vm03.stdout:4/205: creat d5/d17/f39 x:0 0 0 2026-03-09T16:14:22.283 INFO:tasks.workunit.client.0.vm03.stdout:4/206: dwrite d5/dd/f22 [0,4194304] 0 2026-03-09T16:14:22.289 INFO:tasks.workunit.client.0.vm03.stdout:4/207: dread d5/f8 [0,4194304] 0 2026-03-09T16:14:22.293 INFO:tasks.workunit.client.0.vm03.stdout:8/215: rmdir da/db/d30/d3d 0 2026-03-09T16:14:22.294 INFO:tasks.workunit.client.0.vm03.stdout:1/143: rename d4/d6/lc to d4/d6/l32 0 2026-03-09T16:14:22.296 INFO:tasks.workunit.client.0.vm03.stdout:3/184: rmdir d5/d1e 39 2026-03-09T16:14:22.300 INFO:tasks.workunit.client.0.vm03.stdout:9/227: rename d2/d4/d11/f1d to d2/df/f42 0 2026-03-09T16:14:22.301 INFO:tasks.workunit.client.0.vm03.stdout:9/228: stat d2/d4/d11/d29 0 2026-03-09T16:14:22.306 INFO:tasks.workunit.client.0.vm03.stdout:9/229: dwrite d2/d4/d1f/f23 [0,4194304] 0 2026-03-09T16:14:22.308 INFO:tasks.workunit.client.0.vm03.stdout:0/219: truncate d0/da/d11/d32/d37/f33 3553648 0 2026-03-09T16:14:22.309 INFO:tasks.workunit.client.0.vm03.stdout:7/152: link d4/dc/f1a d4/da/d18/d22/d24/d15/f34 0 2026-03-09T16:14:22.316 INFO:tasks.workunit.client.0.vm03.stdout:1/144: truncate d4/fd 4148646 0 2026-03-09T16:14:22.317 INFO:tasks.workunit.client.0.vm03.stdout:1/145: write d4/db/f21 [1042477,12996] 0 2026-03-09T16:14:22.323 INFO:tasks.workunit.client.0.vm03.stdout:3/185: dread d5/d13/f29 [0,4194304] 0 2026-03-09T16:14:22.326 INFO:tasks.workunit.client.0.vm03.stdout:2/212: link db/c27 db/c50 0 2026-03-09T16:14:22.327 INFO:tasks.workunit.client.0.vm03.stdout:4/208: fsync d5/dd/f22 0 2026-03-09T16:14:22.327 INFO:tasks.workunit.client.0.vm03.stdout:4/209: chown f2 1115 1 2026-03-09T16:14:22.327 INFO:tasks.workunit.client.0.vm03.stdout:7/153: rmdir d4/da/d18/d22/d24/d16 39 2026-03-09T16:14:22.329 INFO:tasks.workunit.client.0.vm03.stdout:1/146: mkdir d4/d6/d1d/d20/d23/d33 0 2026-03-09T16:14:22.336 INFO:tasks.workunit.client.0.vm03.stdout:9/230: symlink d2/d4/d11/d12/l43 0 2026-03-09T16:14:22.341 
INFO:tasks.workunit.client.0.vm03.stdout:9/231: stat d2/d4/d11/d12/l40 0 2026-03-09T16:14:22.341 INFO:tasks.workunit.client.0.vm03.stdout:9/232: write d2/f15 [2512426,68567] 0 2026-03-09T16:14:22.341 INFO:tasks.workunit.client.0.vm03.stdout:4/210: dread d5/d17/f21 [0,4194304] 0 2026-03-09T16:14:22.341 INFO:tasks.workunit.client.0.vm03.stdout:4/211: chown d5/dd/d1f/f2d 6 1 2026-03-09T16:14:22.343 INFO:tasks.workunit.client.0.vm03.stdout:4/212: chown d5/dd/f1e 110195 1 2026-03-09T16:14:22.346 INFO:tasks.workunit.client.0.vm03.stdout:6/178: rmdir d9 39 2026-03-09T16:14:22.351 INFO:tasks.workunit.client.0.vm03.stdout:4/213: dread d5/f7 [0,4194304] 0 2026-03-09T16:14:22.351 INFO:tasks.workunit.client.0.vm03.stdout:3/186: rmdir d5/d27 39 2026-03-09T16:14:22.362 INFO:tasks.workunit.client.0.vm03.stdout:5/243: dwrite d2/d7/d8/f36 [0,4194304] 0 2026-03-09T16:14:22.363 INFO:tasks.workunit.client.0.vm03.stdout:5/244: write d2/d7/de/f48 [625912,83049] 0 2026-03-09T16:14:22.369 INFO:tasks.workunit.client.0.vm03.stdout:8/216: write da/db/f1c [3937023,54389] 0 2026-03-09T16:14:22.370 INFO:tasks.workunit.client.0.vm03.stdout:8/217: truncate da/db/fe 1299085 0 2026-03-09T16:14:22.378 INFO:tasks.workunit.client.0.vm03.stdout:0/220: rename d0/da/d11 to d0/d7/d48 0 2026-03-09T16:14:22.388 INFO:tasks.workunit.client.0.vm03.stdout:6/179: chown d9/d22/c2b 78795489 1 2026-03-09T16:14:22.388 INFO:tasks.workunit.client.0.vm03.stdout:4/214: creat d5/db/f3a x:0 0 0 2026-03-09T16:14:22.388 INFO:tasks.workunit.client.0.vm03.stdout:2/213: fdatasync fa 0 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: Manager daemon vm05.dygxfv is now available 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: Migrating agent root cert to cert store 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: Migrating agent root key to cert store 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: Checking for cert/key for grafana.vm03 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: Migrating grafana.vm03 cert to cert store 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: Migrating grafana.vm03 key to cert store 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local 
ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.dygxfv/mirror_snapshot_schedule"}]: dispatch 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.dygxfv/mirror_snapshot_schedule"}]: dispatch 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.dygxfv/trash_purge_schedule"}]: dispatch 2026-03-09T16:14:22.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:22 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.dygxfv/trash_purge_schedule"}]: dispatch 2026-03-09T16:14:22.391 INFO:tasks.workunit.client.0.vm03.stdout:7/154: chown d4/da/d18/d22/d24/d16 4112129 1 2026-03-09T16:14:22.393 INFO:tasks.workunit.client.0.vm03.stdout:0/221: unlink d0/d7/f19 0 2026-03-09T16:14:22.394 INFO:tasks.workunit.client.0.vm03.stdout:9/233: truncate d2/df/f42 218786 0 2026-03-09T16:14:22.396 INFO:tasks.workunit.client.0.vm03.stdout:4/215: mknod d5/db/d25/d31/d33/c3b 0 2026-03-09T16:14:22.397 INFO:tasks.workunit.client.0.vm03.stdout:8/218: sync 2026-03-09T16:14:22.397 INFO:tasks.workunit.client.0.vm03.stdout:6/180: sync 2026-03-09T16:14:22.401 INFO:tasks.workunit.client.0.vm03.stdout:5/245: creat d2/d7/d3c/d3d/f56 x:0 0 0 2026-03-09T16:14:22.408 INFO:tasks.workunit.client.0.vm03.stdout:9/234: creat d2/d4/d1f/f44 x:0 0 0 2026-03-09T16:14:22.411 INFO:tasks.workunit.client.0.vm03.stdout:3/187: creat d5/f33 x:0 0 0 2026-03-09T16:14:22.413 INFO:tasks.workunit.client.0.vm03.stdout:8/219: creat da/d10/f3e x:0 0 0 2026-03-09T16:14:22.425 INFO:tasks.workunit.client.0.vm03.stdout:6/181: dwrite d9/d14/f1d [0,4194304] 0 2026-03-09T16:14:22.428 INFO:tasks.workunit.client.0.vm03.stdout:5/246: creat d2/d7/de/d11/d38/f57 x:0 0 0 2026-03-09T16:14:22.429 INFO:tasks.workunit.client.0.vm03.stdout:5/247: write d2/d7/de/d11/d19/d31/f42 [5665016,106767] 0 2026-03-09T16:14:22.433 INFO:tasks.workunit.client.0.vm03.stdout:5/248: dwrite d2/d7/de/d11/d19/d31/f42 [0,4194304] 0 2026-03-09T16:14:22.441 INFO:tasks.workunit.client.0.vm03.stdout:7/155: rmdir d4/da/d18/d22/d24/d15 39 2026-03-09T16:14:22.443 INFO:tasks.workunit.client.0.vm03.stdout:1/147: getdents d4/d6/d1d/d24/d25 0 2026-03-09T16:14:22.449 INFO:tasks.workunit.client.0.vm03.stdout:3/188: mkdir d5/d13/d34 0 2026-03-09T16:14:22.452 INFO:tasks.workunit.client.0.vm03.stdout:8/220: symlink da/d10/d28/l3f 0 2026-03-09T16:14:22.455 INFO:tasks.workunit.client.0.vm03.stdout:2/214: link db/d12/c1b db/d1e/c51 0 2026-03-09T16:14:22.457 INFO:tasks.workunit.client.0.vm03.stdout:2/215: truncate db/f2d 143131 0 2026-03-09T16:14:22.457 INFO:tasks.workunit.client.0.vm03.stdout:2/216: dread - db/d1e/f45 zero size 2026-03-09T16:14:22.459 INFO:tasks.workunit.client.0.vm03.stdout:6/182: creat d9/d14/f31 x:0 0 0 2026-03-09T16:14:22.466 INFO:tasks.workunit.client.0.vm03.stdout:7/156: dwrite 
d4/da/d18/d22/d24/d15/f2a [0,4194304] 0 2026-03-09T16:14:22.472 INFO:tasks.workunit.client.0.vm03.stdout:1/148: dread d4/d6/d1d/d20/d23/f30 [0,4194304] 0 2026-03-09T16:14:22.475 INFO:tasks.workunit.client.0.vm03.stdout:1/149: dread d4/db/f21 [0,4194304] 0 2026-03-09T16:14:22.478 INFO:tasks.workunit.client.0.vm03.stdout:0/222: truncate d0/da/d1b/fd 1610631 0 2026-03-09T16:14:22.478 INFO:tasks.workunit.client.0.vm03.stdout:0/223: fsync d0/f29 0 2026-03-09T16:14:22.488 INFO:tasks.workunit.client.0.vm03.stdout:4/216: creat d5/f3c x:0 0 0 2026-03-09T16:14:22.490 INFO:tasks.workunit.client.0.vm03.stdout:8/221: unlink da/db/l1e 0 2026-03-09T16:14:22.494 INFO:tasks.workunit.client.0.vm03.stdout:2/217: mkdir db/d3b/d52 0 2026-03-09T16:14:22.494 INFO:tasks.workunit.client.0.vm03.stdout:2/218: dread - db/d1e/f47 zero size 2026-03-09T16:14:22.499 INFO:tasks.workunit.client.0.vm03.stdout:6/183: dread f7 [0,4194304] 0 2026-03-09T16:14:22.500 INFO:tasks.workunit.client.0.vm03.stdout:6/184: write d9/f15 [2318000,24549] 0 2026-03-09T16:14:22.503 INFO:tasks.workunit.client.0.vm03.stdout:9/235: truncate d2/f33 3723004 0 2026-03-09T16:14:22.506 INFO:tasks.workunit.client.0.vm03.stdout:5/249: rename d2/d7/d8/d24/d27/c2e to d2/d7/d8/d16/c58 0 2026-03-09T16:14:22.509 INFO:tasks.workunit.client.0.vm03.stdout:7/157: symlink d4/d2d/l35 0 2026-03-09T16:14:22.519 INFO:tasks.workunit.client.0.vm03.stdout:4/217: unlink d5/dd/d1f/l2c 0 2026-03-09T16:14:22.520 INFO:tasks.workunit.client.0.vm03.stdout:3/189: mknod d5/d2e/c35 0 2026-03-09T16:14:22.521 INFO:tasks.workunit.client.0.vm03.stdout:3/190: readlink d5/l12 0 2026-03-09T16:14:22.524 INFO:tasks.workunit.client.0.vm03.stdout:8/222: rmdir da 39 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: Manager daemon vm05.dygxfv is now available 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: Migrating agent root cert to cert store 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: Migrating agent root key to cert store 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: Checking for cert/key for grafana.vm03 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: Migrating grafana.vm03 cert to cert store 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: Migrating grafana.vm03 key to cert store 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' 
cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.dygxfv/mirror_snapshot_schedule"}]: dispatch 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.dygxfv/mirror_snapshot_schedule"}]: dispatch 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.dygxfv/trash_purge_schedule"}]: dispatch 2026-03-09T16:14:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:22 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.dygxfv/trash_purge_schedule"}]: dispatch 2026-03-09T16:14:22.528 INFO:tasks.workunit.client.0.vm03.stdout:2/219: creat db/d3b/f53 x:0 0 0 2026-03-09T16:14:22.531 INFO:tasks.workunit.client.0.vm03.stdout:2/220: dread db/d12/f37 [0,4194304] 0 2026-03-09T16:14:22.534 INFO:tasks.workunit.client.0.vm03.stdout:6/185: mknod d9/d14/c32 0 2026-03-09T16:14:22.538 INFO:tasks.workunit.client.0.vm03.stdout:9/236: creat d2/d4/d11/d12/f45 x:0 0 0 2026-03-09T16:14:22.540 INFO:tasks.workunit.client.0.vm03.stdout:0/224: rename d0/da/d1b/l36 to d0/d7/d48/d32/d37/l49 0 2026-03-09T16:14:22.544 INFO:tasks.workunit.client.0.vm03.stdout:5/250: mknod d2/d7/d8/d24/c59 0 2026-03-09T16:14:22.553 INFO:tasks.workunit.client.0.vm03.stdout:4/218: dread f1 [0,4194304] 0 2026-03-09T16:14:22.554 INFO:tasks.workunit.client.0.vm03.stdout:8/223: fsync da/d10/f23 0 2026-03-09T16:14:22.558 INFO:tasks.workunit.client.0.vm03.stdout:8/224: dread da/db/fe [0,4194304] 0 2026-03-09T16:14:22.560 INFO:tasks.workunit.client.0.vm03.stdout:2/221: creat db/d1e/f54 x:0 0 0 2026-03-09T16:14:22.561 INFO:tasks.workunit.client.0.vm03.stdout:8/225: dread da/d15/f2f [0,4194304] 0 2026-03-09T16:14:22.561 INFO:tasks.workunit.client.0.vm03.stdout:2/222: truncate db/f34 574177 0 2026-03-09T16:14:22.562 INFO:tasks.workunit.client.0.vm03.stdout:2/223: truncate db/f23 929900 0 2026-03-09T16:14:22.565 INFO:tasks.workunit.client.0.vm03.stdout:6/186: rmdir d9/d22 39 2026-03-09T16:14:22.565 INFO:tasks.workunit.client.0.vm03.stdout:2/224: dwrite f5 [0,4194304] 0 2026-03-09T16:14:22.565 INFO:tasks.workunit.client.0.vm03.stdout:6/187: readlink d9/l2a 0 2026-03-09T16:14:22.568 INFO:tasks.workunit.client.0.vm03.stdout:6/188: dread d9/f1e [0,4194304] 0 2026-03-09T16:14:22.576 INFO:tasks.workunit.client.0.vm03.stdout:9/237: mkdir d2/d4/d11/d29/d2a/d46 0 2026-03-09T16:14:22.579 INFO:tasks.workunit.client.0.vm03.stdout:0/225: unlink d0/f1a 0 2026-03-09T16:14:22.580 INFO:tasks.workunit.client.0.vm03.stdout:0/226: chown d0/f1e 2 1 2026-03-09T16:14:22.582 INFO:tasks.workunit.client.0.vm03.stdout:5/251: creat d2/f5a x:0 0 0 2026-03-09T16:14:22.585 INFO:tasks.workunit.client.0.vm03.stdout:5/252: dread d2/d7/de/f48 
[0,4194304] 0 2026-03-09T16:14:22.591 INFO:tasks.workunit.client.0.vm03.stdout:4/219: symlink d5/db/l3d 0 2026-03-09T16:14:22.591 INFO:tasks.workunit.client.0.vm03.stdout:4/220: stat d5/dd/f22 0 2026-03-09T16:14:22.598 INFO:tasks.workunit.client.0.vm03.stdout:7/158: truncate d4/fe 3539677 0 2026-03-09T16:14:22.599 INFO:tasks.workunit.client.0.vm03.stdout:7/159: fdatasync d4/f26 0 2026-03-09T16:14:22.602 INFO:tasks.workunit.client.0.vm03.stdout:7/160: dread d4/f26 [0,4194304] 0 2026-03-09T16:14:22.603 INFO:tasks.workunit.client.0.vm03.stdout:8/226: fdatasync da/d10/f33 0 2026-03-09T16:14:22.612 INFO:tasks.workunit.client.0.vm03.stdout:6/189: write d9/d22/f24 [4649486,10623] 0 2026-03-09T16:14:22.621 INFO:tasks.workunit.client.0.vm03.stdout:0/227: creat d0/d7/d48/f4a x:0 0 0 2026-03-09T16:14:22.622 INFO:tasks.workunit.client.0.vm03.stdout:0/228: chown d0/d7/d48 1 1 2026-03-09T16:14:22.622 INFO:tasks.workunit.client.0.vm03.stdout:0/229: fdatasync d0/da/d1b/f46 0 2026-03-09T16:14:22.630 INFO:tasks.workunit.client.0.vm03.stdout:0/230: write d0/da/d1b/f46 [2890,121882] 0 2026-03-09T16:14:22.635 INFO:tasks.workunit.client.0.vm03.stdout:1/150: getdents d4/db 0 2026-03-09T16:14:22.641 INFO:tasks.workunit.client.0.vm03.stdout:4/221: unlink d5/d17/f14 0 2026-03-09T16:14:22.653 INFO:tasks.workunit.client.0.vm03.stdout:3/191: dwrite d5/d1e/f31 [4194304,4194304] 0 2026-03-09T16:14:22.684 INFO:tasks.workunit.client.0.vm03.stdout:9/238: creat d2/d4/d11/d29/d2a/d46/f47 x:0 0 0 2026-03-09T16:14:22.691 INFO:tasks.workunit.client.0.vm03.stdout:0/231: creat d0/d7/d48/d32/d35/f4b x:0 0 0 2026-03-09T16:14:22.692 INFO:tasks.workunit.client.0.vm03.stdout:0/232: chown d0/d7/d48/d32/d35/d47 0 1 2026-03-09T16:14:22.692 INFO:tasks.workunit.client.0.vm03.stdout:0/233: fsync d0/d7/d48/f2e 0 2026-03-09T16:14:22.693 INFO:tasks.workunit.client.0.vm03.stdout:0/234: write d0/f29 [1248074,115434] 0 2026-03-09T16:14:22.698 INFO:tasks.workunit.client.0.vm03.stdout:1/151: symlink d4/d6/d1d/d20/l34 0 2026-03-09T16:14:22.703 INFO:tasks.workunit.client.0.vm03.stdout:4/222: rmdir d5/dd/d1f 39 2026-03-09T16:14:22.703 INFO:tasks.workunit.client.0.vm03.stdout:4/223: truncate d5/d17/f2b 2496416 0 2026-03-09T16:14:22.704 INFO:tasks.workunit.client.0.vm03.stdout:4/224: write d5/dd/f16 [147563,129838] 0 2026-03-09T16:14:22.713 INFO:tasks.workunit.client.0.vm03.stdout:7/161: truncate d4/f26 1345418 0 2026-03-09T16:14:22.719 INFO:tasks.workunit.client.0.vm03.stdout:8/227: link da/d10/f23 da/d15/f40 0 2026-03-09T16:14:22.723 INFO:tasks.workunit.client.0.vm03.stdout:2/225: link db/f14 db/f55 0 2026-03-09T16:14:22.732 INFO:tasks.workunit.client.0.vm03.stdout:9/239: rmdir d2 39 2026-03-09T16:14:22.745 INFO:tasks.workunit.client.0.vm03.stdout:4/225: mknod d5/db/d25/c3e 0 2026-03-09T16:14:22.746 INFO:tasks.workunit.client.0.vm03.stdout:4/226: chown d5/d17/c30 206 1 2026-03-09T16:14:22.748 INFO:tasks.workunit.client.0.vm03.stdout:7/162: mkdir d4/da/d19/d36 0 2026-03-09T16:14:22.748 INFO:tasks.workunit.client.0.vm03.stdout:7/163: stat d4/da/d18/d22/d24/d15/l27 0 2026-03-09T16:14:22.752 INFO:tasks.workunit.client.0.vm03.stdout:8/228: dread da/d10/f14 [0,4194304] 0 2026-03-09T16:14:22.755 INFO:tasks.workunit.client.0.vm03.stdout:2/226: rmdir db/d12 39 2026-03-09T16:14:22.756 INFO:tasks.workunit.client.0.vm03.stdout:6/190: creat d9/f33 x:0 0 0 2026-03-09T16:14:22.757 INFO:tasks.workunit.client.0.vm03.stdout:6/191: fdatasync d9/d14/f28 0 2026-03-09T16:14:22.763 INFO:tasks.workunit.client.0.vm03.stdout:5/253: getdents d2/d7/d8 0 2026-03-09T16:14:22.764 
INFO:tasks.workunit.client.0.vm03.stdout:5/254: write d2/d7/d1a/f4d [4692327,82467] 0 2026-03-09T16:14:22.765 INFO:tasks.workunit.client.0.vm03.stdout:5/255: write d2/d7/de/d11/d38/f57 [454615,93386] 0 2026-03-09T16:14:22.772 INFO:tasks.workunit.client.0.vm03.stdout:1/152: creat d4/d6/d1d/d20/d23/d33/f35 x:0 0 0 2026-03-09T16:14:22.772 INFO:tasks.workunit.client.0.vm03.stdout:1/153: dread - d4/db/f2e zero size 2026-03-09T16:14:22.782 INFO:tasks.workunit.client.0.vm03.stdout:2/227: chown db/d12/f49 16 1 2026-03-09T16:14:22.783 INFO:tasks.workunit.client.0.vm03.stdout:2/228: truncate db/d12/f39 929280 0 2026-03-09T16:14:22.785 INFO:tasks.workunit.client.0.vm03.stdout:8/229: write da/d10/f33 [515783,115350] 0 2026-03-09T16:14:22.785 INFO:tasks.workunit.client.0.vm03.stdout:6/192: mknod d9/d22/c34 0 2026-03-09T16:14:22.792 INFO:tasks.workunit.client.0.vm03.stdout:9/240: creat d2/de/f48 x:0 0 0 2026-03-09T16:14:22.794 INFO:tasks.workunit.client.0.vm03.stdout:8/230: dread da/f35 [0,4194304] 0 2026-03-09T16:14:22.814 INFO:tasks.workunit.client.0.vm03.stdout:3/192: getdents d5/d13 0 2026-03-09T16:14:22.815 INFO:tasks.workunit.client.0.vm03.stdout:7/164: dread d4/f26 [0,4194304] 0 2026-03-09T16:14:22.816 INFO:tasks.workunit.client.0.vm03.stdout:5/256: write d2/d7/de/d11/f44 [1363707,94944] 0 2026-03-09T16:14:22.818 INFO:tasks.workunit.client.0.vm03.stdout:2/229: symlink db/d12/d2a/l56 0 2026-03-09T16:14:22.819 INFO:tasks.workunit.client.0.vm03.stdout:2/230: read - db/d1e/f54 zero size 2026-03-09T16:14:22.820 INFO:tasks.workunit.client.0.vm03.stdout:6/193: rmdir d9/d22 39 2026-03-09T16:14:22.828 INFO:tasks.workunit.client.0.vm03.stdout:9/241: creat d2/d4/d11/d29/d2a/d38/f49 x:0 0 0 2026-03-09T16:14:22.829 INFO:tasks.workunit.client.0.vm03.stdout:8/231: truncate da/d10/d28/f2c 1729962 0 2026-03-09T16:14:22.831 INFO:tasks.workunit.client.0.vm03.stdout:0/235: rename d0/c24 to d0/d7/c4c 0 2026-03-09T16:14:22.832 INFO:tasks.workunit.client.0.vm03.stdout:0/236: write d0/f29 [658327,59112] 0 2026-03-09T16:14:22.833 INFO:tasks.workunit.client.0.vm03.stdout:1/154: getdents d4/d31 0 2026-03-09T16:14:22.838 INFO:tasks.workunit.client.0.vm03.stdout:7/165: creat d4/da/d18/f37 x:0 0 0 2026-03-09T16:14:22.844 INFO:tasks.workunit.client.0.vm03.stdout:5/257: symlink d2/d7/d1a/d1c/l5b 0 2026-03-09T16:14:22.850 INFO:tasks.workunit.client.0.vm03.stdout:2/231: creat db/d12/f57 x:0 0 0 2026-03-09T16:14:22.851 INFO:tasks.workunit.client.0.vm03.stdout:6/194: truncate d9/d22/f24 5127045 0 2026-03-09T16:14:22.856 INFO:tasks.workunit.client.0.vm03.stdout:8/232: unlink da/db/d30/c3c 0 2026-03-09T16:14:22.863 INFO:tasks.workunit.client.0.vm03.stdout:4/227: rename l0 to d5/dd/d1f/l3f 0 2026-03-09T16:14:22.872 INFO:tasks.workunit.client.0.vm03.stdout:5/258: mkdir d2/d7/d8/d16/d5c 0 2026-03-09T16:14:22.874 INFO:tasks.workunit.client.0.vm03.stdout:2/232: write f0 [1130914,33483] 0 2026-03-09T16:14:22.877 INFO:tasks.workunit.client.0.vm03.stdout:8/233: write da/d10/f33 [357133,8876] 0 2026-03-09T16:14:22.891 INFO:tasks.workunit.client.0.vm03.stdout:9/242: rename d2/de/f48 to d2/d4/d11/d29/d2a/f4a 0 2026-03-09T16:14:22.892 INFO:tasks.workunit.client.0.vm03.stdout:9/243: write d2/d4/d1f/f23 [3150215,6285] 0 2026-03-09T16:14:22.892 INFO:tasks.workunit.client.0.vm03.stdout:9/244: dread - d2/d4/d11/d12/f3d zero size 2026-03-09T16:14:22.893 INFO:tasks.workunit.client.0.vm03.stdout:9/245: readlink d2/d4/d11/d29/d2a/l3a 0 2026-03-09T16:14:22.893 INFO:tasks.workunit.client.0.vm03.stdout:9/246: chown d2/d4/d11/d29/d2a/f4a 446 1 
2026-03-09T16:14:22.897 INFO:tasks.workunit.client.0.vm03.stdout:1/155: link d4/d6/d1d/d20/f2a d4/d6/d1d/d20/d23/d33/f36 0 2026-03-09T16:14:22.898 INFO:tasks.workunit.client.0.vm03.stdout:1/156: dread - d4/d6/d1d/d20/d23/d33/f36 zero size 2026-03-09T16:14:22.898 INFO:tasks.workunit.client.0.vm03.stdout:1/157: stat d4/cf 0 2026-03-09T16:14:22.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.899+0000 7f5190f25640 1 -- 192.168.123.103:0/2910321797 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f518c072370 msgr2=0x7f518c10c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:22.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.899+0000 7f5190f25640 1 --2- 192.168.123.103:0/2910321797 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f518c072370 0x7f518c10c590 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f518400b0a0 tx=0x7f518402f4c0 comp rx=0 tx=0).stop 2026-03-09T16:14:22.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.900+0000 7f5190f25640 1 -- 192.168.123.103:0/2910321797 shutdown_connections 2026-03-09T16:14:22.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.900+0000 7f5190f25640 1 --2- 192.168.123.103:0/2910321797 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f518c072370 0x7f518c10c590 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:22.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.900+0000 7f5190f25640 1 --2- 192.168.123.103:0/2910321797 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c0719a0 0x7f518c071da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:22.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.900+0000 7f5190f25640 1 -- 192.168.123.103:0/2910321797 >> 192.168.123.103:0/2910321797 conn(0x7f518c06d4f0 msgr2=0x7f518c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:22.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.900+0000 7f5190f25640 1 -- 192.168.123.103:0/2910321797 shutdown_connections 2026-03-09T16:14:22.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.900+0000 7f5190f25640 1 -- 192.168.123.103:0/2910321797 wait complete. 
2026-03-09T16:14:22.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.901+0000 7f5190f25640 1 Processor -- start 2026-03-09T16:14:22.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.902+0000 7f5190f25640 1 -- start start 2026-03-09T16:14:22.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.902+0000 7f5190f25640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c0719a0 0x7f518c115900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:22.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.902+0000 7f5190f25640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f518c1172b0 0x7f518c115e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:22.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.902+0000 7f5190f25640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f518c116410 con 0x7f518c0719a0 2026-03-09T16:14:22.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.902+0000 7f5190f25640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f518c116580 con 0x7f518c1172b0 2026-03-09T16:14:22.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.902+0000 7f518affd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f518c1172b0 0x7f518c115e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:22.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.903+0000 7f518affd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f518c1172b0 0x7f518c115e40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:38184/0 (socket says 192.168.123.103:38184) 2026-03-09T16:14:22.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.903+0000 7f518affd640 1 -- 192.168.123.103:0/3965537129 learned_addr learned my addr 192.168.123.103:0/3965537129 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:14:22.905 INFO:tasks.workunit.client.0.vm03.stdout:7/166: symlink d4/da/d19/d36/l38 0 2026-03-09T16:14:22.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.904+0000 7f518b7fe640 1 --2- 192.168.123.103:0/3965537129 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c0719a0 0x7f518c115900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:22.905 INFO:tasks.workunit.client.0.vm03.stdout:7/167: write d4/da/f20 [1473554,52110] 0 2026-03-09T16:14:22.906 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.905+0000 7f518affd640 1 -- 192.168.123.103:0/3965537129 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c0719a0 msgr2=0x7f518c115900 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:22.906 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.905+0000 7f518affd640 1 --2- 192.168.123.103:0/3965537129 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c0719a0 0x7f518c115900 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:22.906 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.905+0000 7f518affd640 1 -- 192.168.123.103:0/3965537129 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5184009d00 con 0x7f518c1172b0 2026-03-09T16:14:22.907 INFO:tasks.workunit.client.0.vm03.stdout:5/259: chown d2/d7/d8/c18 245 1 2026-03-09T16:14:22.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.907+0000 7f518affd640 1 --2- 192.168.123.103:0/3965537129 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f518c1172b0 0x7f518c115e40 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f5184009fd0 tx=0x7f5184009300 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:22.908 INFO:tasks.workunit.client.0.vm03.stdout:5/260: read d2/d7/de/d54/f4a [722444,114916] 0 2026-03-09T16:14:22.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.908+0000 7f5188ff9640 1 -- 192.168.123.103:0/3965537129 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f51840048f0 con 0x7f518c1172b0 2026-03-09T16:14:22.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.908+0000 7f5188ff9640 1 -- 192.168.123.103:0/3965537129 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f51840093f0 con 0x7f518c1172b0 2026-03-09T16:14:22.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.909+0000 7f5188ff9640 1 -- 192.168.123.103:0/3965537129 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f51840408e0 con 0x7f518c1172b0 2026-03-09T16:14:22.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.909+0000 7f5190f25640 1 -- 192.168.123.103:0/3965537129 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f518c1b5700 con 0x7f518c1172b0 2026-03-09T16:14:22.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.909+0000 7f5190f25640 1 -- 192.168.123.103:0/3965537129 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f518c1b5a60 con 0x7f518c1172b0 2026-03-09T16:14:22.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.910+0000 7f5190f25640 1 -- 192.168.123.103:0/3965537129 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f518c072370 con 0x7f518c1172b0 2026-03-09T16:14:22.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.910+0000 7f5188ff9640 1 -- 192.168.123.103:0/3965537129 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 21) v1 ==== 50039+0+0 (secure 0 0 0) 0x7f5184040a40 con 0x7f518c1172b0 2026-03-09T16:14:22.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.910+0000 7f5188ff9640 1 --2- 192.168.123.103:0/3965537129 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f516003d920 0x7f516003fde0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:22.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.910+0000 7f5188ff9640 1 -- 192.168.123.103:0/3965537129 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f5184077b60 con 0x7f518c1172b0 2026-03-09T16:14:22.916 INFO:tasks.workunit.client.0.vm03.stdout:2/233: rmdir db/d12 39 2026-03-09T16:14:22.917 INFO:tasks.workunit.client.0.vm03.stdout:8/234: 
mknod da/d32/c41 0 2026-03-09T16:14:22.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.917+0000 7f5188ff9640 1 -- 192.168.123.103:0/3965537129 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f51840074c0 con 0x7f518c1172b0 2026-03-09T16:14:22.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.917+0000 7f518b7fe640 1 --2- 192.168.123.103:0/3965537129 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f516003d920 0x7f516003fde0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:22.920 INFO:tasks.workunit.client.0.vm03.stdout:3/193: rename d5/d27/l30 to d5/d2e/l36 0 2026-03-09T16:14:22.923 INFO:tasks.workunit.client.0.vm03.stdout:9/247: fdatasync d2/d4/d11/d29/d2a/f4a 0 2026-03-09T16:14:22.926 INFO:tasks.workunit.client.0.vm03.stdout:9/248: dwrite d2/d4/d1f/f44 [0,4194304] 0 2026-03-09T16:14:22.928 INFO:tasks.workunit.client.0.vm03.stdout:9/249: write d2/d4/d11/d12/f45 [914857,16601] 0 2026-03-09T16:14:22.929 INFO:tasks.workunit.client.0.vm03.stdout:9/250: stat d2/d4/d11/d12/f45 0 2026-03-09T16:14:22.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:22.930+0000 7f518b7fe640 1 --2- 192.168.123.103:0/3965537129 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f516003d920 0x7f516003fde0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f518c071800 tx=0x7f517c009290 comp rx=0 tx=0).ready entity=mgr.24357 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:22.933 INFO:tasks.workunit.client.0.vm03.stdout:0/237: creat d0/f4d x:0 0 0 2026-03-09T16:14:22.934 INFO:tasks.workunit.client.0.vm03.stdout:0/238: write d0/d7/f3d [1332485,55825] 0 2026-03-09T16:14:22.934 INFO:tasks.workunit.client.0.vm03.stdout:0/239: fsync d0/d7/d48/d32/d37/f41 0 2026-03-09T16:14:22.934 INFO:tasks.workunit.client.0.vm03.stdout:0/240: readlink d0/d7/l3a 0 2026-03-09T16:14:22.939 INFO:tasks.workunit.client.0.vm03.stdout:1/158: fdatasync d4/d6/f19 0 2026-03-09T16:14:22.943 INFO:tasks.workunit.client.0.vm03.stdout:7/168: creat d4/da/d18/d22/d24/d16/f39 x:0 0 0 2026-03-09T16:14:22.948 INFO:tasks.workunit.client.0.vm03.stdout:6/195: creat d9/f35 x:0 0 0 2026-03-09T16:14:22.951 INFO:tasks.workunit.client.0.vm03.stdout:8/235: symlink da/db/d30/l42 0 2026-03-09T16:14:22.951 INFO:tasks.workunit.client.0.vm03.stdout:8/236: write f8 [902317,75132] 0 2026-03-09T16:14:22.953 INFO:tasks.workunit.client.0.vm03.stdout:3/194: mkdir d5/d13/d37 0 2026-03-09T16:14:22.955 INFO:tasks.workunit.client.0.vm03.stdout:3/195: dread d5/d1e/f26 [0,4194304] 0 2026-03-09T16:14:22.956 INFO:tasks.workunit.client.0.vm03.stdout:3/196: write d5/f16 [4105666,67436] 0 2026-03-09T16:14:22.958 INFO:tasks.workunit.client.0.vm03.stdout:9/251: write d2/d4/d11/d12/d28/f2c [3221835,121418] 0 2026-03-09T16:14:22.964 INFO:tasks.workunit.client.0.vm03.stdout:7/169: unlink d4/da/c23 0 2026-03-09T16:14:22.966 INFO:tasks.workunit.client.0.vm03.stdout:5/261: unlink d2/d7/de/c23 0 2026-03-09T16:14:22.969 INFO:tasks.workunit.client.0.vm03.stdout:2/234: creat db/d12/d2a/f58 x:0 0 0 2026-03-09T16:14:22.971 INFO:tasks.workunit.client.0.vm03.stdout:6/196: fdatasync d9/d22/f1c 0 2026-03-09T16:14:22.981 INFO:tasks.workunit.client.0.vm03.stdout:4/228: dwrite d5/f7 [4194304,4194304] 0 2026-03-09T16:14:22.986 INFO:tasks.workunit.client.0.vm03.stdout:9/252: write 
d2/d4/f17 [939707,77268] 0 2026-03-09T16:14:22.987 INFO:tasks.workunit.client.0.vm03.stdout:9/253: truncate d2/d4/d11/f41 899554 0 2026-03-09T16:14:22.995 INFO:tasks.workunit.client.0.vm03.stdout:7/170: mknod d4/da/d18/d22/c3a 0 2026-03-09T16:14:22.999 INFO:tasks.workunit.client.0.vm03.stdout:7/171: dwrite d4/da/f20 [0,4194304] 0 2026-03-09T16:14:23.004 INFO:tasks.workunit.client.0.vm03.stdout:5/262: mknod d2/d7/de/d11/d19/c5d 0 2026-03-09T16:14:23.004 INFO:tasks.workunit.client.0.vm03.stdout:5/263: chown d2/l22 94503 1 2026-03-09T16:14:23.005 INFO:tasks.workunit.client.0.vm03.stdout:5/264: dread d2/d7/de/d11/d19/d31/f42 [0,4194304] 0 2026-03-09T16:14:23.005 INFO:tasks.workunit.client.0.vm03.stdout:5/265: chown d2/d7/de/d11/d19 98235 1 2026-03-09T16:14:23.006 INFO:tasks.workunit.client.0.vm03.stdout:5/266: chown d2/d7/de/d54/f4a 844 1 2026-03-09T16:14:23.007 INFO:tasks.workunit.client.0.vm03.stdout:5/267: read d2/d7/de/d11/f44 [518951,30718] 0 2026-03-09T16:14:23.010 INFO:tasks.workunit.client.0.vm03.stdout:5/268: write d2/d7/de/d11/d19/d31/f42 [4405049,59298] 0 2026-03-09T16:14:23.017 INFO:tasks.workunit.client.0.vm03.stdout:5/269: dwrite d2/d7/d1a/f4d [0,4194304] 0 2026-03-09T16:14:23.019 INFO:tasks.workunit.client.0.vm03.stdout:5/270: read d2/d7/d1a/f4d [2278504,44565] 0 2026-03-09T16:14:23.042 INFO:tasks.workunit.client.0.vm03.stdout:3/197: dwrite d5/fb [4194304,4194304] 0 2026-03-09T16:14:23.063 INFO:tasks.workunit.client.0.vm03.stdout:8/237: mkdir da/db/d43 0 2026-03-09T16:14:23.079 INFO:tasks.workunit.client.0.vm03.stdout:9/254: rename d2/d4/d11/d29/c2d to d2/de/c4b 0 2026-03-09T16:14:23.080 INFO:tasks.workunit.client.0.vm03.stdout:9/255: chown d2/d4/d11/d29/d2a/d38 32502 1 2026-03-09T16:14:23.089 INFO:tasks.workunit.client.0.vm03.stdout:0/241: creat d0/f4e x:0 0 0 2026-03-09T16:14:23.091 INFO:tasks.workunit.client.0.vm03.stdout:1/159: rename d4/d6/c2b to d4/d6/c37 0 2026-03-09T16:14:23.109 INFO:tasks.workunit.client.0.vm03.stdout:5/271: creat d2/d7/d1a/d1c/f5e x:0 0 0 2026-03-09T16:14:23.117 INFO:tasks.workunit.client.0.vm03.stdout:7/172: dwrite d4/f26 [0,4194304] 0 2026-03-09T16:14:23.118 INFO:tasks.workunit.client.0.vm03.stdout:7/173: stat d4/d2d/l35 0 2026-03-09T16:14:23.122 INFO:tasks.workunit.client.0.vm03.stdout:7/174: dread d4/f26 [0,4194304] 0 2026-03-09T16:14:23.132 INFO:tasks.workunit.client.0.vm03.stdout:3/198: symlink d5/d1e/l38 0 2026-03-09T16:14:23.132 INFO:tasks.workunit.client.0.vm03.stdout:3/199: chown d5/d13/c1c 252363736 1 2026-03-09T16:14:23.132 INFO:tasks.workunit.client.0.vm03.stdout:3/200: read d5/d13/f1d [2620306,130259] 0 2026-03-09T16:14:23.132 INFO:tasks.workunit.client.0.vm03.stdout:3/201: fdatasync d5/f11 0 2026-03-09T16:14:23.132 INFO:tasks.workunit.client.0.vm03.stdout:3/202: read - d5/d13/f2c zero size 2026-03-09T16:14:23.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.130+0000 7f5190f25640 1 -- 192.168.123.103:0/3965537129 --> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f518c10b590 con 0x7f516003d920 2026-03-09T16:14:23.133 INFO:tasks.workunit.client.0.vm03.stdout:4/229: mkdir d5/d40 0 2026-03-09T16:14:23.134 INFO:tasks.workunit.client.0.vm03.stdout:9/256: creat d2/d4/d11/d12/f4c x:0 0 0 2026-03-09T16:14:23.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.134+0000 7f5188ff9640 1 -- 192.168.123.103:0/3965537129 <== mgr.24357 v2:192.168.123.105:6828/2751989419 1 ==== mgr_command_reply(tid 0: 0 ) v1 
==== 8+0+310 (secure 0 0 0) 0x7f518c10b590 con 0x7f516003d920 2026-03-09T16:14:23.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.137+0000 7f516a7fc640 1 -- 192.168.123.103:0/3965537129 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f516003d920 msgr2=0x7f516003fde0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:23.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.137+0000 7f516a7fc640 1 --2- 192.168.123.103:0/3965537129 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f516003d920 0x7f516003fde0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f518c071800 tx=0x7f517c009290 comp rx=0 tx=0).stop 2026-03-09T16:14:23.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.137+0000 7f516a7fc640 1 -- 192.168.123.103:0/3965537129 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f518c1172b0 msgr2=0x7f518c115e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:23.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.137+0000 7f516a7fc640 1 --2- 192.168.123.103:0/3965537129 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f518c1172b0 0x7f518c115e40 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f5184009fd0 tx=0x7f5184009300 comp rx=0 tx=0).stop 2026-03-09T16:14:23.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.137+0000 7f516a7fc640 1 -- 192.168.123.103:0/3965537129 shutdown_connections 2026-03-09T16:14:23.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.137+0000 7f516a7fc640 1 --2- 192.168.123.103:0/3965537129 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f516003d920 0x7f516003fde0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.137+0000 7f516a7fc640 1 --2- 192.168.123.103:0/3965537129 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f518c1172b0 0x7f518c115e40 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.137+0000 7f516a7fc640 1 --2- 192.168.123.103:0/3965537129 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c0719a0 0x7f518c115900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.137+0000 7f516a7fc640 1 -- 192.168.123.103:0/3965537129 >> 192.168.123.103:0/3965537129 conn(0x7f518c06d4f0 msgr2=0x7f518c0703a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:23.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.138+0000 7f516a7fc640 1 -- 192.168.123.103:0/3965537129 shutdown_connections 2026-03-09T16:14:23.139 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.138+0000 7f516a7fc640 1 -- 192.168.123.103:0/3965537129 wait complete. 
2026-03-09T16:14:23.147 INFO:tasks.workunit.client.0.vm03.stdout:1/160: fsync d4/d6/d1d/d20/d23/f30 0 2026-03-09T16:14:23.148 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:14:23.153 INFO:tasks.workunit.client.0.vm03.stdout:5/272: symlink d2/d7/d8/d16/l5f 0 2026-03-09T16:14:23.156 INFO:tasks.workunit.client.0.vm03.stdout:2/235: rename db/d3b to db/d59 0 2026-03-09T16:14:23.163 INFO:tasks.workunit.client.0.vm03.stdout:3/203: rmdir d5/d13 39 2026-03-09T16:14:23.166 INFO:tasks.workunit.client.0.vm03.stdout:3/204: dwrite d5/f16 [0,4194304] 0 2026-03-09T16:14:23.168 INFO:tasks.workunit.client.0.vm03.stdout:6/197: creat d9/f36 x:0 0 0 2026-03-09T16:14:23.180 INFO:tasks.workunit.client.0.vm03.stdout:9/257: mkdir d2/d4/d11/d29/d2a/d4d 0 2026-03-09T16:14:23.180 INFO:tasks.workunit.client.0.vm03.stdout:9/258: fdatasync d2/d4/f17 0 2026-03-09T16:14:23.180 INFO:tasks.workunit.client.0.vm03.stdout:0/242: fsync d0/da/f2c 0 2026-03-09T16:14:23.180 INFO:tasks.workunit.client.0.vm03.stdout:0/243: chown d0/d7/d48/d32 27710114 1 2026-03-09T16:14:23.190 INFO:tasks.workunit.client.0.vm03.stdout:5/273: symlink d2/d7/d8/d24/l60 0 2026-03-09T16:14:23.214 INFO:tasks.workunit.client.0.vm03.stdout:6/198: dread d9/ff [0,4194304] 0 2026-03-09T16:14:23.214 INFO:tasks.workunit.client.0.vm03.stdout:6/199: truncate d9/d14/f28 384829 0 2026-03-09T16:14:23.214 INFO:tasks.workunit.client.0.vm03.stdout:6/200: truncate d9/d14/f28 1084342 0 2026-03-09T16:14:23.214 INFO:tasks.workunit.client.0.vm03.stdout:6/201: readlink d9/l2f 0 2026-03-09T16:14:23.214 INFO:tasks.workunit.client.0.vm03.stdout:6/202: dread d9/ff [0,4194304] 0 2026-03-09T16:14:23.214 INFO:tasks.workunit.client.0.vm03.stdout:6/203: stat d9/d22/f1c 0 2026-03-09T16:14:23.214 INFO:tasks.workunit.client.0.vm03.stdout:6/204: chown d9/d14 29 1 2026-03-09T16:14:23.214 INFO:tasks.workunit.client.0.vm03.stdout:9/259: rename d2/d4/d11/d12/f4c to d2/d4/d11/d29/f4e 0 2026-03-09T16:14:23.215 INFO:tasks.workunit.client.0.vm03.stdout:9/260: fsync d2/d4/d11/d12/f45 0 2026-03-09T16:14:23.215 INFO:tasks.workunit.client.0.vm03.stdout:9/261: readlink d2/df/l16 0 2026-03-09T16:14:23.215 INFO:tasks.workunit.client.0.vm03.stdout:9/262: chown d2/df 43108 1 2026-03-09T16:14:23.216 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.212+0000 7fbda4404640 1 -- 192.168.123.103:0/3845207289 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd9c072440 msgr2=0x7fbd9c0771b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:23.216 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.212+0000 7fbda4404640 1 --2- 192.168.123.103:0/3845207289 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd9c072440 0x7fbd9c0771b0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fbd94009040 tx=0x7fbd9402fc10 comp rx=0 tx=0).stop 2026-03-09T16:14:23.216 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.215+0000 7fbda4404640 1 -- 192.168.123.103:0/3845207289 shutdown_connections 2026-03-09T16:14:23.216 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.215+0000 7fbda4404640 1 --2- 192.168.123.103:0/3845207289 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd9c072440 0x7fbd9c0771b0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.216 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.215+0000 7fbda4404640 1 --2- 192.168.123.103:0/3845207289 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd9c071a70 
0x7fbd9c071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.215+0000 7fbda4404640 1 -- 192.168.123.103:0/3845207289 >> 192.168.123.103:0/3845207289 conn(0x7fbd9c06d4f0 msgr2=0x7fbd9c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:23.218 INFO:tasks.workunit.client.0.vm03.stdout:9/263: dread d2/df/f22 [0,4194304] 0 2026-03-09T16:14:23.218 INFO:tasks.workunit.client.0.vm03.stdout:9/264: chown d2 82257301 1 2026-03-09T16:14:23.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.216+0000 7fbda4404640 1 -- 192.168.123.103:0/3845207289 shutdown_connections 2026-03-09T16:14:23.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.216+0000 7fbda4404640 1 -- 192.168.123.103:0/3845207289 wait complete. 2026-03-09T16:14:23.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.218+0000 7fbda4404640 1 Processor -- start 2026-03-09T16:14:23.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.218+0000 7fbda4404640 1 -- start start 2026-03-09T16:14:23.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.218+0000 7fbda4404640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd9c071a70 0x7fbd9c084110 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:23.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.218+0000 7fbda4404640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd9c082760 0x7fbd9c082be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:23.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.218+0000 7fbda4404640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd9c084650 con 0x7fbd9c071a70 2026-03-09T16:14:23.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.218+0000 7fbda4404640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd9c083150 con 0x7fbd9c082760 2026-03-09T16:14:23.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.218+0000 7fbda2179640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd9c071a70 0x7fbd9c084110 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:23.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.219+0000 7fbda2179640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd9c071a70 0x7fbd9c084110 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:50338/0 (socket says 192.168.123.103:50338) 2026-03-09T16:14:23.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.219+0000 7fbda2179640 1 -- 192.168.123.103:0/2673014302 learned_addr learned my addr 192.168.123.103:0/2673014302 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:14:23.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.219+0000 7fbda1978640 1 --2- 192.168.123.103:0/2673014302 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd9c082760 0x7fbd9c082be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T16:14:23.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.219+0000 7fbda2179640 1 -- 192.168.123.103:0/2673014302 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd9c082760 msgr2=0x7fbd9c082be0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:23.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.219+0000 7fbda2179640 1 --2- 192.168.123.103:0/2673014302 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd9c082760 0x7fbd9c082be0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.219+0000 7fbda2179640 1 -- 192.168.123.103:0/2673014302 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbd94008cf0 con 0x7fbd9c071a70 2026-03-09T16:14:23.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.220+0000 7fbda2179640 1 --2- 192.168.123.103:0/2673014302 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd9c071a70 0x7fbd9c084110 secure :-1 s=READY pgs=285 cs=0 l=1 rev1=1 crypto rx=0x7fbd98009870 tx=0x7fbd98009d40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:23.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.221+0000 7fbd937fe640 1 -- 192.168.123.103:0/2673014302 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbd98010040 con 0x7fbd9c071a70 2026-03-09T16:14:23.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.222+0000 7fbda4404640 1 -- 192.168.123.103:0/2673014302 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbd9c083430 con 0x7fbd9c071a70 2026-03-09T16:14:23.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.222+0000 7fbda4404640 1 -- 192.168.123.103:0/2673014302 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbd9c12efc0 con 0x7fbd9c071a70 2026-03-09T16:14:23.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.223+0000 7fbd937fe640 1 -- 192.168.123.103:0/2673014302 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbd9800ecf0 con 0x7fbd9c071a70 2026-03-09T16:14:23.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.223+0000 7fbd917fa640 1 -- 192.168.123.103:0/2673014302 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbd9c079e60 con 0x7fbd9c071a70 2026-03-09T16:14:23.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.223+0000 7fbd937fe640 1 -- 192.168.123.103:0/2673014302 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbd98002cf0 con 0x7fbd9c071a70 2026-03-09T16:14:23.224 INFO:tasks.workunit.client.0.vm03.stdout:0/244: creat d0/d7/d3e/f4f x:0 0 0 2026-03-09T16:14:23.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.224+0000 7fbd937fe640 1 -- 192.168.123.103:0/2673014302 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 21) v1 ==== 50039+0+0 (secure 0 0 0) 0x7fbd9800e830 con 0x7fbd9c071a70 2026-03-09T16:14:23.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.225+0000 7fbd937fe640 1 --2- 192.168.123.103:0/2673014302 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7fbd8403db30 
0x7fbd8403fff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:23.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.225+0000 7fbd937fe640 1 -- 192.168.123.103:0/2673014302 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7fbd98052570 con 0x7fbd9c071a70 2026-03-09T16:14:23.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.225+0000 7fbda1978640 1 --2- 192.168.123.103:0/2673014302 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7fbd8403db30 0x7fbd8403fff0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:23.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.226+0000 7fbda1978640 1 --2- 192.168.123.103:0/2673014302 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7fbd8403db30 0x7fbd8403fff0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fbd94002790 tx=0x7fbd940061f0 comp rx=0 tx=0).ready entity=mgr.24357 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:23.229 INFO:tasks.workunit.client.0.vm03.stdout:1/161: creat d4/d31/f38 x:0 0 0 2026-03-09T16:14:23.230 INFO:tasks.workunit.client.0.vm03.stdout:5/274: mknod d2/d7/de/d11/c61 0 2026-03-09T16:14:23.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.236+0000 7fbd937fe640 1 -- 192.168.123.103:0/2673014302 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7fbd9801ba90 con 0x7fbd9c071a70 2026-03-09T16:14:23.239 INFO:tasks.workunit.client.0.vm03.stdout:7/175: creat d4/f3b x:0 0 0 2026-03-09T16:14:23.240 INFO:tasks.workunit.client.0.vm03.stdout:7/176: readlink d4/da/d19/d36/l38 0 2026-03-09T16:14:23.240 INFO:tasks.workunit.client.0.vm03.stdout:6/205: creat d9/d22/f37 x:0 0 0 2026-03-09T16:14:23.242 INFO:tasks.workunit.client.0.vm03.stdout:7/177: write d4/f3b [329271,76509] 0 2026-03-09T16:14:23.244 INFO:tasks.workunit.client.0.vm03.stdout:3/205: dread d5/d13/f25 [0,4194304] 0 2026-03-09T16:14:23.246 INFO:tasks.workunit.client.0.vm03.stdout:9/265: unlink d2/d4/fd 0 2026-03-09T16:14:23.247 INFO:tasks.workunit.client.0.vm03.stdout:9/266: dread - d2/d4/d11/d29/f4e zero size 2026-03-09T16:14:23.251 INFO:tasks.workunit.client.0.vm03.stdout:9/267: dwrite d2/df/f14 [0,4194304] 0 2026-03-09T16:14:23.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.263+0000 7fbd937fe640 1 -- 192.168.123.103:0/2673014302 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 22) v1 ==== 50339+0+0 (secure 0 0 0) 0x7fbd980501b0 con 0x7fbd9c071a70 2026-03-09T16:14:23.273 INFO:tasks.workunit.client.0.vm03.stdout:8/238: write da/d15/f40 [904232,75820] 0 2026-03-09T16:14:23.278 INFO:tasks.workunit.client.0.vm03.stdout:0/245: dread d0/f3 [0,4194304] 0 2026-03-09T16:14:23.279 INFO:tasks.workunit.client.0.vm03.stdout:2/236: rmdir db 39 2026-03-09T16:14:23.287 INFO:tasks.workunit.client.0.vm03.stdout:5/275: rename d2/d7/d8/d16/c30 to d2/d7/d3c/c62 0 2026-03-09T16:14:23.287 INFO:tasks.workunit.client.0.vm03.stdout:5/276: write d2/d7/d8/f36 [741054,124010] 0 2026-03-09T16:14:23.289 INFO:tasks.workunit.client.0.vm03.stdout:4/230: getdents d5/db/d25 0 2026-03-09T16:14:23.290 INFO:tasks.workunit.client.0.vm03.stdout:4/231: chown d5/db/d25/c3e 122268164 1 2026-03-09T16:14:23.291 INFO:tasks.workunit.client.0.vm03.stdout:5/277: 
dwrite d2/d7/de/d11/d38/f57 [0,4194304] 0 2026-03-09T16:14:23.295 INFO:tasks.workunit.client.0.vm03.stdout:7/178: mknod d4/da/d18/d22/d24/d16/d2b/c3c 0 2026-03-09T16:14:23.295 INFO:tasks.workunit.client.0.vm03.stdout:7/179: chown d4/da/d18/d22/f33 0 1 2026-03-09T16:14:23.299 INFO:tasks.workunit.client.0.vm03.stdout:7/180: dwrite d4/da/d18/d22/f33 [0,4194304] 0 2026-03-09T16:14:23.303 INFO:tasks.workunit.client.0.vm03.stdout:6/206: symlink d9/d22/l38 0 2026-03-09T16:14:23.303 INFO:tasks.workunit.client.0.vm03.stdout:6/207: stat d9/d22/c2b 0 2026-03-09T16:14:23.304 INFO:tasks.workunit.client.0.vm03.stdout:6/208: dread - d9/f33 zero size 2026-03-09T16:14:23.305 INFO:tasks.workunit.client.0.vm03.stdout:1/162: mkdir d4/d39 0 2026-03-09T16:14:23.305 INFO:tasks.workunit.client.0.vm03.stdout:1/163: stat d4/db/f21 0 2026-03-09T16:14:23.306 INFO:tasks.workunit.client.0.vm03.stdout:1/164: truncate d4/d6/f9 2794628 0 2026-03-09T16:14:23.324 INFO:tasks.workunit.client.0.vm03.stdout:4/232: chown d5/l29 2865059 1 2026-03-09T16:14:23.326 INFO:tasks.workunit.client.0.vm03.stdout:4/233: dread d5/fa [0,4194304] 0 2026-03-09T16:14:23.331 INFO:tasks.workunit.client.0.vm03.stdout:5/278: unlink d2/d7/d8/c18 0 2026-03-09T16:14:23.334 INFO:tasks.workunit.client.0.vm03.stdout:3/206: symlink d5/d13/d34/l39 0 2026-03-09T16:14:23.337 INFO:tasks.workunit.client.0.vm03.stdout:3/207: dread d5/f10 [0,4194304] 0 2026-03-09T16:14:23.340 INFO:tasks.workunit.client.0.vm03.stdout:3/208: dread d5/d13/f25 [0,4194304] 0 2026-03-09T16:14:23.351 INFO:tasks.workunit.client.0.vm03.stdout:9/268: write d2/df/f14 [5153510,92617] 0 2026-03-09T16:14:23.351 INFO:tasks.workunit.client.0.vm03.stdout:9/269: chown d2/f7 30 1 2026-03-09T16:14:23.354 INFO:tasks.workunit.client.0.vm03.stdout:1/165: write d4/d6/d1d/d20/d23/d33/f36 [969539,1662] 0 2026-03-09T16:14:23.362 INFO:tasks.workunit.client.0.vm03.stdout:0/246: mknod d0/d42/c50 0 2026-03-09T16:14:23.367 INFO:tasks.workunit.client.0.vm03.stdout:2/237: mknod db/d59/c5a 0 2026-03-09T16:14:23.369 INFO:tasks.workunit.client.0.vm03.stdout:5/279: symlink d2/d7/d3c/l63 0 2026-03-09T16:14:23.370 INFO:tasks.workunit.client.0.vm03.stdout:3/209: mkdir d5/d13/d3a 0 2026-03-09T16:14:23.370 INFO:tasks.workunit.client.0.vm03.stdout:3/210: chown d5/f16 515889737 1 2026-03-09T16:14:23.372 INFO:tasks.workunit.client.0.vm03.stdout:9/270: symlink d2/d4/d11/d29/d2a/d46/l4f 0 2026-03-09T16:14:23.373 INFO:tasks.workunit.client.0.vm03.stdout:9/271: write d2/de/f1c [579435,88386] 0 2026-03-09T16:14:23.373 INFO:tasks.workunit.client.0.vm03.stdout:9/272: write d2/f15 [2295733,58650] 0 2026-03-09T16:14:23.385 INFO:tasks.workunit.client.0.vm03.stdout:6/209: dwrite d9/f20 [0,4194304] 0 2026-03-09T16:14:23.388 INFO:tasks.workunit.client.0.vm03.stdout:6/210: dread d9/f20 [0,4194304] 0 2026-03-09T16:14:23.394 INFO:tasks.workunit.client.0.vm03.stdout:6/211: dwrite d9/f1f [0,4194304] 0 2026-03-09T16:14:23.402 INFO:tasks.workunit.client.0.vm03.stdout:1/166: mknod d4/d6/d1d/d20/d23/c3a 0 2026-03-09T16:14:23.405 INFO:tasks.workunit.client.0.vm03.stdout:1/167: dread f1 [0,4194304] 0 2026-03-09T16:14:23.411 INFO:tasks.workunit.client.0.vm03.stdout:8/239: link da/d15/f1b da/db/f44 0 2026-03-09T16:14:23.414 INFO:tasks.workunit.client.0.vm03.stdout:8/240: dwrite f8 [0,4194304] 0 2026-03-09T16:14:23.422 INFO:tasks.workunit.client.0.vm03.stdout:0/247: creat d0/d42/f51 x:0 0 0 2026-03-09T16:14:23.428 INFO:tasks.workunit.client.0.vm03.stdout:2/238: symlink db/d12/l5b 0 2026-03-09T16:14:23.434 
INFO:tasks.workunit.client.0.vm03.stdout:7/181: rename d4/da/d18/f29 to d4/da/d18/d22/d24/f3d 0 2026-03-09T16:14:23.435 INFO:tasks.workunit.client.0.vm03.stdout:7/182: readlink d4/da/d18/d22/d24/d16/l1d 0 2026-03-09T16:14:23.440 INFO:tasks.workunit.client.0.vm03.stdout:9/273: unlink d2/d4/d11/d29/d2a/d38/l3b 0 2026-03-09T16:14:23.441 INFO:tasks.workunit.client.0.vm03.stdout:9/274: write d2/d4/d11/d12/d28/f2c [2305924,118815] 0 2026-03-09T16:14:23.441 INFO:tasks.workunit.client.0.vm03.stdout:9/275: readlink d2/d4/l19 0 2026-03-09T16:14:23.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.449+0000 7fbd917fa640 1 -- 192.168.123.103:0/2673014302 --> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fbd9c061ea0 con 0x7fbd8403db30 2026-03-09T16:14:23.460 INFO:tasks.workunit.client.0.vm03.stdout:0/248: rmdir d0/d7/d48/d32 39 2026-03-09T16:14:23.460 INFO:tasks.workunit.client.0.vm03.stdout:0/249: write d0/d7/d48/f2e [5794412,93360] 0 2026-03-09T16:14:23.467 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:23 vm03.local ceph-mon[51019]: Deploying cephadm binary to vm05 2026-03-09T16:14:23.467 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:23 vm03.local ceph-mon[51019]: Deploying cephadm binary to vm03 2026-03-09T16:14:23.467 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:23 vm03.local ceph-mon[51019]: mgrmap e21: vm05.dygxfv(active, since 1.57614s) 2026-03-09T16:14:23.467 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:23 vm03.local ceph-mon[51019]: pgmap v3: 65 pgs: 65 active+clean; 541 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:14:23.470 INFO:tasks.workunit.client.0.vm03.stdout:6/212: write d9/d22/f1c [1603604,16180] 0 2026-03-09T16:14:23.475 INFO:tasks.workunit.client.0.vm03.stdout:2/239: creat db/d1e/f5c x:0 0 0 2026-03-09T16:14:23.475 INFO:tasks.workunit.client.0.vm03.stdout:4/234: creat d5/dd/f41 x:0 0 0 2026-03-09T16:14:23.476 INFO:tasks.workunit.client.0.vm03.stdout:4/235: write d5/d17/f18 [1795822,82760] 0 2026-03-09T16:14:23.481 INFO:tasks.workunit.client.0.vm03.stdout:5/280: creat d2/d7/d8/d16/d5c/f64 x:0 0 0 2026-03-09T16:14:23.482 INFO:tasks.workunit.client.0.vm03.stdout:5/281: dread - d2/d7/d8/d16/d5c/f64 zero size 2026-03-09T16:14:23.483 INFO:tasks.workunit.client.0.vm03.stdout:4/236: dread d5/db/f28 [0,4194304] 0 2026-03-09T16:14:23.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.483+0000 7fbd937fe640 1 -- 192.168.123.103:0/2673014302 <== mgr.24357 v2:192.168.123.105:6828/2751989419 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+310 (secure 0 0 0) 0x7fbd9c061ea0 con 0x7fbd8403db30 2026-03-09T16:14:23.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.485+0000 7fbd917fa640 1 -- 192.168.123.103:0/2673014302 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7fbd8403db30 msgr2=0x7fbd8403fff0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:23.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.486+0000 7fbd917fa640 1 --2- 192.168.123.103:0/2673014302 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7fbd8403db30 0x7fbd8403fff0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fbd94002790 tx=0x7fbd940061f0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.486+0000 7fbd917fa640 1 -- 192.168.123.103:0/2673014302 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd9c071a70 msgr2=0x7fbd9c084110 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:23.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.486+0000 7fbd917fa640 1 --2- 192.168.123.103:0/2673014302 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd9c071a70 0x7fbd9c084110 secure :-1 s=READY pgs=285 cs=0 l=1 rev1=1 crypto rx=0x7fbd98009870 tx=0x7fbd98009d40 comp rx=0 tx=0).stop 2026-03-09T16:14:23.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.486+0000 7fbd917fa640 1 -- 192.168.123.103:0/2673014302 shutdown_connections 2026-03-09T16:14:23.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.486+0000 7fbd917fa640 1 --2- 192.168.123.103:0/2673014302 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7fbd8403db30 0x7fbd8403fff0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.486+0000 7fbd917fa640 1 --2- 192.168.123.103:0/2673014302 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd9c082760 0x7fbd9c082be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.486+0000 7fbd917fa640 1 --2- 192.168.123.103:0/2673014302 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd9c071a70 0x7fbd9c084110 unknown :-1 s=CLOSED pgs=285 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.486+0000 7fbd917fa640 1 -- 192.168.123.103:0/2673014302 >> 192.168.123.103:0/2673014302 conn(0x7fbd9c06d4f0 msgr2=0x7fbd9c073150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:23.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.486+0000 7fbd917fa640 1 -- 192.168.123.103:0/2673014302 shutdown_connections 2026-03-09T16:14:23.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.486+0000 7fbd917fa640 1 -- 192.168.123.103:0/2673014302 wait complete. 
2026-03-09T16:14:23.516 INFO:tasks.workunit.client.0.vm03.stdout:8/241: mkdir da/d45 0 2026-03-09T16:14:23.516 INFO:tasks.workunit.client.0.vm03.stdout:4/237: truncate f1 656993 0 2026-03-09T16:14:23.516 INFO:tasks.workunit.client.0.vm03.stdout:7/183: dread d4/da/d18/d22/d24/f30 [0,4194304] 0 2026-03-09T16:14:23.516 INFO:tasks.workunit.client.0.vm03.stdout:8/242: chown c5 19691469 1 2026-03-09T16:14:23.517 INFO:tasks.workunit.client.0.vm03.stdout:8/243: chown da/d1d 108284517 1 2026-03-09T16:14:23.517 INFO:tasks.workunit.client.0.vm03.stdout:7/184: fsync d4/da/d18/d22/d24/d16/f39 0 2026-03-09T16:14:23.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:23 vm05.local ceph-mon[58702]: Deploying cephadm binary to vm05 2026-03-09T16:14:23.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:23 vm05.local ceph-mon[58702]: Deploying cephadm binary to vm03 2026-03-09T16:14:23.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:23 vm05.local ceph-mon[58702]: mgrmap e21: vm05.dygxfv(active, since 1.57614s) 2026-03-09T16:14:23.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:23 vm05.local ceph-mon[58702]: pgmap v3: 65 pgs: 65 active+clean; 541 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:14:23.529 INFO:tasks.workunit.client.0.vm03.stdout:3/211: rename d5/l17 to d5/d2e/l3b 0 2026-03-09T16:14:23.530 INFO:tasks.workunit.client.0.vm03.stdout:3/212: dread - d5/d13/f2c zero size 2026-03-09T16:14:23.554 INFO:tasks.workunit.client.0.vm03.stdout:5/282: link d2/d7/d8/d24/l51 d2/d7/de/d11/d19/l65 0 2026-03-09T16:14:23.562 INFO:tasks.workunit.client.0.vm03.stdout:4/238: mknod d5/c42 0 2026-03-09T16:14:23.562 INFO:tasks.workunit.client.0.vm03.stdout:4/239: chown d5 5 1 2026-03-09T16:14:23.562 INFO:tasks.workunit.client.0.vm03.stdout:4/240: truncate d5/f3c 994400 0 2026-03-09T16:14:23.575 INFO:tasks.workunit.client.0.vm03.stdout:9/276: dwrite d2/f33 [0,4194304] 0 2026-03-09T16:14:23.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.575+0000 7f510e17b640 1 -- 192.168.123.103:0/3948804829 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5108103a20 msgr2=0x7f5108105e10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:23.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.575+0000 7f510e17b640 1 --2- 192.168.123.103:0/3948804829 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5108103a20 0x7f5108105e10 secure :-1 s=READY pgs=286 cs=0 l=1 rev1=1 crypto rx=0x7f50fc0098e0 tx=0x7f50fc02f1b0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.578 INFO:tasks.workunit.client.0.vm03.stdout:5/283: dread d2/d7/de/d11/f44 [0,4194304] 0 2026-03-09T16:14:23.578 INFO:tasks.workunit.client.0.vm03.stdout:1/168: rename d4/d6/d1d/d20/d23/d33 to d4/d6/d3b 0 2026-03-09T16:14:23.582 INFO:tasks.workunit.client.0.vm03.stdout:6/213: rename d9/d22/l38 to d9/d14/l39 0 2026-03-09T16:14:23.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.582+0000 7f510e17b640 1 -- 192.168.123.103:0/3948804829 shutdown_connections 2026-03-09T16:14:23.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.582+0000 7f510e17b640 1 --2- 192.168.123.103:0/3948804829 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5108103a20 0x7f5108105e10 unknown :-1 s=CLOSED pgs=286 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.582+0000 7f510e17b640 1 --2- 192.168.123.103:0/3948804829 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51081010f0 0x7f51081034e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.582+0000 7f510e17b640 1 -- 192.168.123.103:0/3948804829 >> 192.168.123.103:0/3948804829 conn(0x7f51080fac90 msgr2=0x7f51080fd0b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:23.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.586+0000 7f510e17b640 1 -- 192.168.123.103:0/3948804829 shutdown_connections 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.586+0000 7f510e17b640 1 -- 192.168.123.103:0/3948804829 wait complete. 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.586+0000 7f510e17b640 1 Processor -- start 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.586+0000 7f510e17b640 1 -- start start 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.586+0000 7f510e17b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51081010f0 0x7f51081a2c90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.586+0000 7f510e17b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5108103a20 0x7f51081a31d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.586+0000 7f510e17b640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f51081a37d0 con 0x7f5108103a20 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.586+0000 7f510e17b640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f51081a3940 con 0x7f51081010f0 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.587+0000 7f510c978640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5108103a20 0x7f51081a31d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.587+0000 7f510c978640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5108103a20 0x7f51081a31d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:50366/0 (socket says 192.168.123.103:50366) 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.587+0000 7f510c978640 1 -- 192.168.123.103:0/1696919128 learned_addr learned my addr 192.168.123.103:0/1696919128 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.587+0000 7f510c978640 1 -- 192.168.123.103:0/1696919128 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51081010f0 msgr2=0x7f51081a2c90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.587+0000 7f510c978640 1 --2- 192.168.123.103:0/1696919128 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f51081010f0 0x7f51081a2c90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.587+0000 7f510c978640 1 -- 192.168.123.103:0/1696919128 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f50fc009590 con 0x7f5108103a20 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.588+0000 7f510c978640 1 --2- 192.168.123.103:0/1696919128 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5108103a20 0x7f51081a31d0 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7f50fc005e00 tx=0x7f50fc031c20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:23.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.588+0000 7f50f67fc640 1 -- 192.168.123.103:0/1696919128 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f50fc03d070 con 0x7f5108103a20 2026-03-09T16:14:23.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.588+0000 7f50f67fc640 1 -- 192.168.123.103:0/1696919128 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f50fc02fe90 con 0x7f5108103a20 2026-03-09T16:14:23.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.589+0000 7f510e17b640 1 -- 192.168.123.103:0/1696919128 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f51081a8370 con 0x7f5108103a20 2026-03-09T16:14:23.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.589+0000 7f510e17b640 1 -- 192.168.123.103:0/1696919128 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f51081a8830 con 0x7f5108103a20 2026-03-09T16:14:23.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.590+0000 7f510e17b640 1 -- 192.168.123.103:0/1696919128 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f50d0005350 con 0x7f5108103a20 2026-03-09T16:14:23.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.590+0000 7f50f67fc640 1 -- 192.168.123.103:0/1696919128 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f50fc004550 con 0x7f5108103a20 2026-03-09T16:14:23.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.590+0000 7f50f67fc640 1 -- 192.168.123.103:0/1696919128 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 22) v1 ==== 50339+0+0 (secure 0 0 0) 0x7f50fc0316e0 con 0x7f5108103a20 2026-03-09T16:14:23.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.591+0000 7f50f67fc640 1 --2- 192.168.123.103:0/1696919128 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f50e403dc00 0x7f50e40400c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:23.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.591+0000 7f50f67fc640 1 -- 192.168.123.103:0/1696919128 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f50fc0377e0 con 0x7f5108103a20 2026-03-09T16:14:23.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.592+0000 7f510d179640 1 --2- 192.168.123.103:0/1696919128 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] 
conn(0x7f50e403dc00 0x7f50e40400c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T16:14:23.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.592+0000 7f510d179640 1 --2- 192.168.123.103:0/1696919128 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f50e403dc00 0x7f50e40400c0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f50f80059c0 tx=0x7f50f8005950 comp rx=0 tx=0).ready entity=mgr.24357 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T16:14:23.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.594+0000 7f50f67fc640 1 -- 192.168.123.103:0/1696919128 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f50fc049dd0 con 0x7f5108103a20
2026-03-09T16:14:23.601 INFO:tasks.workunit.client.0.vm03.stdout:1/169: mknod d4/c3c 0
2026-03-09T16:14:23.609 INFO:tasks.workunit.client.0.vm03.stdout:1/170: readlink d4/l14 0
2026-03-09T16:14:23.609 INFO:tasks.workunit.client.0.vm03.stdout:2/240: rename db/fd to db/d1e/f5d 0
2026-03-09T16:14:23.609 INFO:tasks.workunit.client.0.vm03.stdout:2/241: fdatasync f7 0
2026-03-09T16:14:23.609 INFO:tasks.workunit.client.0.vm03.stdout:1/171: fdatasync d4/d6/d1d/d20/d23/f30 0
2026-03-09T16:14:23.609 INFO:tasks.workunit.client.0.vm03.stdout:1/172: write d4/db/f2e [143152,35345] 0
2026-03-09T16:14:23.609 INFO:tasks.workunit.client.0.vm03.stdout:1/173: readlink d4/l14 0
2026-03-09T16:14:23.615 INFO:tasks.workunit.client.0.vm03.stdout:7/185: sync
2026-03-09T16:14:23.628 INFO:tasks.workunit.client.0.vm03.stdout:0/250: rename d0/d42 to d0/d7/d48/d32/d35/d52 0
2026-03-09T16:14:23.633 INFO:tasks.workunit.client.0.vm03.stdout:0/251: dwrite d0/d7/d48/f2e [4194304,4194304] 0
2026-03-09T16:14:23.642 INFO:tasks.workunit.client.0.vm03.stdout:9/277: dwrite d2/df/f42 [0,4194304] 0
2026-03-09T16:14:23.650 INFO:tasks.workunit.client.0.vm03.stdout:6/214: link d9/d14/f1d d9/d14/f3a 0
2026-03-09T16:14:23.659 INFO:tasks.workunit.client.0.vm03.stdout:6/215: dwrite d9/d22/f24 [4194304,4194304] 0
2026-03-09T16:14:23.668 INFO:tasks.workunit.client.0.vm03.stdout:0/252: dread d0/d7/d48/f13 [0,4194304] 0
2026-03-09T16:14:23.711 INFO:tasks.workunit.client.0.vm03.stdout:7/186: mkdir d4/da/d18/d22/d24/d16/d3e 0
2026-03-09T16:14:23.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.770+0000 7f510e17b640 1 -- 192.168.123.103:0/1696919128 --> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f50d0002bf0 con 0x7f50e403dc00
2026-03-09T16:14:23.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.804+0000 7f50f67fc640 1 -- 192.168.123.103:0/1696919128 <== mgr.24357 v2:192.168.123.105:6828/2751989419 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f50d0002bf0 con 0x7f50e403dc00
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (3m) 2m ago 4m 24.7M - 0.25.0 c8568f914cd2 062551060e4c
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (4m) 2m ago 4m 8300k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (3m) 6s ago 3m 8715k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (4m) 2m ago 4m 7625k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 05e9be717970
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (3m) 6s ago 3m 7608k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 32f80ccecaa9
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (3m) 2m ago 4m 82.6M - 9.4.7 954c08fa6188 9b9ef5226e00
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (2m) 2m ago 2m 14.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8e7e3eb06891
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (2m) 2m ago 2m 18.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 f23b1415c23e
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (2m) 6s ago 2m 14.1M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 fbf69f4859f1
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (2m) 6s ago 2m 15.9M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 e7155e6e0a47
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:9283,8765,8443 running (4m) 2m ago 4m 540M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 55454b4aaab2
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (9s) 6s ago 3m 66.9M - 19.2.3-678-ge911bdeb 654f31e6858e b47787a071c8
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (4m) 2m ago 4m 54.8M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 b86752d320b6
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (3m) 6s ago 3m 51.0M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 90242efb0978
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (4m) 2m ago 4m 14.1M - 1.5.0 0da6a335fe13 8c7f00e55632
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (3m) 6s ago 3m 15.0M - 1.5.0 0da6a335fe13 4c3ab3bdf8cf
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (3m) 2m ago 3m 48.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2ea78f0d62f8
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (3m) 2m ago 3m 68.3M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6169f9824413
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (2m) 2m ago 2m 47.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 31188175e77b
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (2m) 6s ago 2m 148M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d95aab347c9f
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (2m) 6s ago 2m 123M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 5076005b452d
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (2m) 6s ago 2m 108M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 56fb3849b087
2026-03-09T16:14:23.806 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (3m) 2m ago 3m 37.2M - 2.43.0 a07b618ecd1d 89a8f084cd57
2026-03-09T16:14:23.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.807+0000 7f50d7fff640 1 --
192.168.123.103:0/1696919128 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f50e403dc00 msgr2=0x7f50e40400c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:23.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.808+0000 7f50d7fff640 1 --2- 192.168.123.103:0/1696919128 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f50e403dc00 0x7f50e40400c0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f50f80059c0 tx=0x7f50f8005950 comp rx=0 tx=0).stop 2026-03-09T16:14:23.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.808+0000 7f50d7fff640 1 -- 192.168.123.103:0/1696919128 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5108103a20 msgr2=0x7f51081a31d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:23.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.808+0000 7f50d7fff640 1 --2- 192.168.123.103:0/1696919128 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5108103a20 0x7f51081a31d0 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7f50fc005e00 tx=0x7f50fc031c20 comp rx=0 tx=0).stop 2026-03-09T16:14:23.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.808+0000 7f50d7fff640 1 -- 192.168.123.103:0/1696919128 shutdown_connections 2026-03-09T16:14:23.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.808+0000 7f50d7fff640 1 --2- 192.168.123.103:0/1696919128 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f50e403dc00 0x7f50e40400c0 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.808+0000 7f50d7fff640 1 --2- 192.168.123.103:0/1696919128 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5108103a20 0x7f51081a31d0 unknown :-1 s=CLOSED pgs=287 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.808+0000 7f50d7fff640 1 --2- 192.168.123.103:0/1696919128 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51081010f0 0x7f51081a2c90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.808+0000 7f50d7fff640 1 -- 192.168.123.103:0/1696919128 >> 192.168.123.103:0/1696919128 conn(0x7f51080fac90 msgr2=0x7f51080ff1c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:23.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.810+0000 7f50d7fff640 1 -- 192.168.123.103:0/1696919128 shutdown_connections 2026-03-09T16:14:23.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.810+0000 7f50d7fff640 1 -- 192.168.123.103:0/1696919128 wait complete. 
2026-03-09T16:14:23.824 INFO:tasks.workunit.client.0.vm03.stdout:9/278: creat d2/d4/d11/d12/f50 x:0 0 0 2026-03-09T16:14:23.830 INFO:tasks.workunit.client.0.vm03.stdout:1/174: truncate d4/d6/f15 412087 0 2026-03-09T16:14:23.835 INFO:tasks.workunit.client.0.vm03.stdout:7/187: write d4/da/d18/d22/d24/f3d [600413,35255] 0 2026-03-09T16:14:23.837 INFO:tasks.workunit.client.0.vm03.stdout:7/188: write d4/da/d18/d22/f33 [1846325,46397] 0 2026-03-09T16:14:23.843 INFO:tasks.workunit.client.0.vm03.stdout:7/189: dwrite d4/f8 [0,4194304] 0 2026-03-09T16:14:23.860 INFO:tasks.workunit.client.0.vm03.stdout:9/279: read d2/df/f22 [4616958,19438] 0 2026-03-09T16:14:23.862 INFO:tasks.workunit.client.0.vm03.stdout:9/280: dread - d2/d4/d11/d12/f35 zero size 2026-03-09T16:14:23.872 INFO:tasks.workunit.client.0.vm03.stdout:7/190: chown d4/da/d18/d22/d24/l2c 62 1 2026-03-09T16:14:23.873 INFO:tasks.workunit.client.0.vm03.stdout:8/244: rename da/db/d30/l3a to da/l46 0 2026-03-09T16:14:23.895 INFO:tasks.workunit.client.0.vm03.stdout:6/216: creat d9/f3b x:0 0 0 2026-03-09T16:14:23.903 INFO:tasks.workunit.client.0.vm03.stdout:9/281: chown d2/de/c36 0 1 2026-03-09T16:14:23.907 INFO:tasks.workunit.client.0.vm03.stdout:9/282: dread d2/d4/d11/f41 [0,4194304] 0 2026-03-09T16:14:23.912 INFO:tasks.workunit.client.0.vm03.stdout:6/217: mknod d9/d22/c3c 0 2026-03-09T16:14:23.914 INFO:tasks.workunit.client.0.vm03.stdout:9/283: dwrite d2/d4/d11/d12/f50 [0,4194304] 0 2026-03-09T16:14:23.922 INFO:tasks.workunit.client.0.vm03.stdout:8/245: dread da/d10/d28/f2c [0,4194304] 0 2026-03-09T16:14:23.937 INFO:tasks.workunit.client.0.vm03.stdout:8/246: dwrite da/d10/f23 [0,4194304] 0 2026-03-09T16:14:23.940 INFO:tasks.workunit.client.0.vm03.stdout:3/213: rename d5/d27 to d5/d13/d3a/d3c 0 2026-03-09T16:14:23.948 INFO:tasks.workunit.client.0.vm03.stdout:8/247: truncate da/db/d30/f36 767289 0 2026-03-09T16:14:23.949 INFO:tasks.workunit.client.0.vm03.stdout:8/248: stat c4 0 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.957+0000 7f356226a640 1 -- 192.168.123.103:0/1122525309 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f355c072420 msgr2=0x7f355c077190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.957+0000 7f356226a640 1 --2- 192.168.123.103:0/1122525309 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f355c072420 0x7f355c077190 secure :-1 s=READY pgs=288 cs=0 l=1 rev1=1 crypto rx=0x7f3558008880 tx=0x7f355802eed0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.957+0000 7f356226a640 1 -- 192.168.123.103:0/1122525309 shutdown_connections 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.957+0000 7f356226a640 1 --2- 192.168.123.103:0/1122525309 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f355c072420 0x7f355c077190 unknown :-1 s=CLOSED pgs=288 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.957+0000 7f356226a640 1 --2- 192.168.123.103:0/1122525309 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f355c071a50 0x7f355c071e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.957+0000 7f356226a640 1 -- 192.168.123.103:0/1122525309 >> 
192.168.123.103:0/1122525309 conn(0x7f355c06d4f0 msgr2=0x7f355c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.957+0000 7f356226a640 1 -- 192.168.123.103:0/1122525309 shutdown_connections 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.957+0000 7f356226a640 1 -- 192.168.123.103:0/1122525309 wait complete. 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.958+0000 7f356226a640 1 Processor -- start 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.958+0000 7f356226a640 1 -- start start 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.958+0000 7f356226a640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f355c071a50 0x7f355c084100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.959+0000 7f356226a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f355c082750 0x7f355c082bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.959+0000 7f356226a640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f355c084640 con 0x7f355c082750 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.959+0000 7f356226a640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f355c083110 con 0x7f355c071a50 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.959+0000 7f3561268640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f355c071a50 0x7f355c084100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.959+0000 7f3561268640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f355c071a50 0x7f355c084100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:38238/0 (socket says 192.168.123.103:38238) 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.959+0000 7f3561268640 1 -- 192.168.123.103:0/2578160899 learned_addr learned my addr 192.168.123.103:0/2578160899 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.959+0000 7f3560a67640 1 --2- 192.168.123.103:0/2578160899 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f355c082750 0x7f355c082bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.959+0000 7f3561268640 1 -- 192.168.123.103:0/2578160899 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f355c082750 msgr2=0x7f355c082bd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.959+0000 7f3561268640 1 --2- 192.168.123.103:0/2578160899 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f355c082750 0x7f355c082bd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.959+0000 7f3561268640 1 -- 192.168.123.103:0/2578160899 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3558008530 con 0x7f355c071a50 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.960+0000 7f3561268640 1 --2- 192.168.123.103:0/2578160899 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f355c071a50 0x7f355c084100 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f355400e390 tx=0x7f355400e860 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:23.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.960+0000 7f35527fc640 1 -- 192.168.123.103:0/2578160899 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3554010040 con 0x7f355c071a50 2026-03-09T16:14:23.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.960+0000 7f356226a640 1 -- 192.168.123.103:0/2578160899 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f355c083390 con 0x7f355c071a50 2026-03-09T16:14:23.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.960+0000 7f356226a640 1 -- 192.168.123.103:0/2578160899 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f355c12ef70 con 0x7f355c071a50 2026-03-09T16:14:23.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.963+0000 7f35527fc640 1 -- 192.168.123.103:0/2578160899 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f355400b490 con 0x7f355c071a50 2026-03-09T16:14:23.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.963+0000 7f35527fc640 1 -- 192.168.123.103:0/2578160899 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f35540042e0 con 0x7f355c071a50 2026-03-09T16:14:23.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.964+0000 7f352bfff640 1 -- 192.168.123.103:0/2578160899 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3524005350 con 0x7f355c071a50 2026-03-09T16:14:23.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.966+0000 7f35527fc640 1 -- 192.168.123.103:0/2578160899 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 22) v1 ==== 50339+0+0 (secure 0 0 0) 0x7f3554004b40 con 0x7f355c071a50 2026-03-09T16:14:23.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.966+0000 7f35527fc640 1 --2- 192.168.123.103:0/2578160899 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f353c03de60 0x7f353c040320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:23.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.967+0000 7f35527fc640 1 -- 192.168.123.103:0/2578160899 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f3554052f00 con 0x7f355c071a50 2026-03-09T16:14:23.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.968+0000 7f3560a67640 1 --2- 192.168.123.103:0/2578160899 >> 
[v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f353c03de60 0x7f353c040320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:23.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.970+0000 7f3560a67640 1 --2- 192.168.123.103:0/2578160899 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f353c03de60 0x7f353c040320 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f3558002410 tx=0x7f355803a040 comp rx=0 tx=0).ready entity=mgr.24357 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:23.976 INFO:tasks.workunit.client.0.vm03.stdout:4/241: rename d5/db/l3d to d5/db/d25/l43 0 2026-03-09T16:14:23.977 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:23.971+0000 7f35527fc640 1 -- 192.168.123.103:0/2578160899 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f3554002a60 con 0x7f355c071a50 2026-03-09T16:14:23.977 INFO:tasks.workunit.client.0.vm03.stdout:6/218: dwrite d9/f20 [4194304,4194304] 0 2026-03-09T16:14:23.984 INFO:tasks.workunit.client.0.vm03.stdout:6/219: dread - d9/f35 zero size 2026-03-09T16:14:23.987 INFO:tasks.workunit.client.0.vm03.stdout:3/214: mknod d5/c3d 0 2026-03-09T16:14:23.989 INFO:tasks.workunit.client.0.vm03.stdout:6/220: write d9/d22/f2d [373361,37503] 0 2026-03-09T16:14:23.990 INFO:tasks.workunit.client.0.vm03.stdout:3/215: dread d5/d13/f29 [0,4194304] 0 2026-03-09T16:14:23.990 INFO:tasks.workunit.client.0.vm03.stdout:3/216: truncate d5/f11 759853 0 2026-03-09T16:14:24.005 INFO:tasks.workunit.client.0.vm03.stdout:8/249: symlink da/d45/l47 0 2026-03-09T16:14:24.007 INFO:tasks.workunit.client.0.vm03.stdout:8/250: chown da/db/d30/f36 384647739 1 2026-03-09T16:14:24.011 INFO:tasks.workunit.client.0.vm03.stdout:6/221: unlink d9/d14/f3a 0 2026-03-09T16:14:24.014 INFO:tasks.workunit.client.0.vm03.stdout:6/222: fsync d9/f20 0 2026-03-09T16:14:24.014 INFO:tasks.workunit.client.0.vm03.stdout:8/251: dread da/d10/f33 [0,4194304] 0 2026-03-09T16:14:24.015 INFO:tasks.workunit.client.0.vm03.stdout:3/217: mknod d5/d1e/c3e 0 2026-03-09T16:14:24.019 INFO:tasks.workunit.client.0.vm03.stdout:8/252: write da/f35 [96319,102691] 0 2026-03-09T16:14:24.032 INFO:tasks.workunit.client.0.vm03.stdout:2/242: truncate f7 2113545 0 2026-03-09T16:14:24.032 INFO:tasks.workunit.client.0.vm03.stdout:0/253: write d0/d7/f8 [641857,26287] 0 2026-03-09T16:14:24.034 INFO:tasks.workunit.client.0.vm03.stdout:6/223: dwrite d9/f20 [4194304,4194304] 0 2026-03-09T16:14:24.073 INFO:tasks.workunit.client.0.vm03.stdout:0/254: mknod d0/da/d1b/c53 0 2026-03-09T16:14:24.073 INFO:tasks.workunit.client.0.vm03.stdout:0/255: write d0/d7/f3d [549388,32419] 0 2026-03-09T16:14:24.082 INFO:tasks.workunit.client.0.vm03.stdout:6/224: creat d9/d14/f3d x:0 0 0 2026-03-09T16:14:24.083 INFO:tasks.workunit.client.0.vm03.stdout:5/284: rename d2/d7/d8/d24/d27/d43/l4e to d2/l66 0 2026-03-09T16:14:24.099 INFO:tasks.workunit.client.0.vm03.stdout:4/242: rmdir d5/d17 39 2026-03-09T16:14:24.106 INFO:tasks.workunit.client.0.vm03.stdout:7/191: rename d4/da/d18/d22/d24/l2c to d4/d2d/l3f 0 2026-03-09T16:14:24.107 INFO:tasks.workunit.client.0.vm03.stdout:5/285: creat d2/d7/d1a/d1c/d3f/f67 x:0 0 0 2026-03-09T16:14:24.107 INFO:tasks.workunit.client.0.vm03.stdout:1/175: write d4/f1b [197728,6904] 0 2026-03-09T16:14:24.110 
INFO:tasks.workunit.client.0.vm03.stdout:1/176: dwrite d4/d6/f9 [0,4194304] 0
2026-03-09T16:14:24.117 INFO:tasks.workunit.client.0.vm03.stdout:5/286: write d2/d7/de/d11/f32 [3899536,42176] 0
2026-03-09T16:14:24.117 INFO:tasks.workunit.client.0.vm03.stdout:6/225: rmdir d9/d22 39
2026-03-09T16:14:24.118 INFO:tasks.workunit.client.0.vm03.stdout:7/192: write d4/da/d18/d22/d24/f2f [5434807,16167] 0
2026-03-09T16:14:24.124 INFO:tasks.workunit.client.0.vm03.stdout:5/287: stat d2/d7/d8/d24/d27 0
2026-03-09T16:14:24.124 INFO:tasks.workunit.client.0.vm03.stdout:7/193: dwrite d4/f3b [0,4194304] 0
2026-03-09T16:14:24.130 INFO:tasks.workunit.client.0.vm03.stdout:5/288: chown d2/d7/d8/d24/d27/d43/d4b 98517218 1
2026-03-09T16:14:24.133 INFO:tasks.workunit.client.0.vm03.stdout:7/194: dwrite d4/f3b [0,4194304] 0
2026-03-09T16:14:24.135 INFO:tasks.workunit.client.0.vm03.stdout:8/253: rename da/c16 to da/db/d30/c48 0
2026-03-09T16:14:24.147 INFO:tasks.workunit.client.0.vm03.stdout:8/254: write f8 [4882264,84055] 0
2026-03-09T16:14:24.153 INFO:tasks.workunit.client.0.vm03.stdout:4/243: mkdir d5/d17/d44 0
2026-03-09T16:14:24.164 INFO:tasks.workunit.client.0.vm03.stdout:9/284: write d2/df/f22 [5214922,69539] 0
2026-03-09T16:14:24.172 INFO:tasks.workunit.client.0.vm03.stdout:6/226: chown d9/d22/l26 4334029 1
2026-03-09T16:14:24.172 INFO:tasks.workunit.client.0.vm03.stdout:5/289: dread d2/d7/de/d11/f26 [0,4194304] 0
2026-03-09T16:14:24.172 INFO:tasks.workunit.client.0.vm03.stdout:2/243: rename db/d1e/f2f to db/d12/d2a/f5e 0
2026-03-09T16:14:24.172 INFO:tasks.workunit.client.0.vm03.stdout:0/256: creat d0/f54 x:0 0 0
2026-03-09T16:14:24.173 INFO:tasks.workunit.client.0.vm03.stdout:7/195: symlink d4/da/d19/l40 0
2026-03-09T16:14:24.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.192+0000 7f352bfff640 1 -- 192.168.123.103:0/2578160899 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f3524005e10 con 0x7f355c071a50
2026-03-09T16:14:24.194 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T16:14:24.194 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T16:14:24.194 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2
2026-03-09T16:14:24.194 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:14:24.194 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T16:14:24.194 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-09T16:14:24.194 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:14:24.194 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T16:14:24.194 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6
2026-03-09T16:14:24.195 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:14:24.195 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T16:14:24.195 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4
2026-03-09T16:14:24.195 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:14:24.195 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T16:14:24.195 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 12,
2026-03-09T16:14:24.195 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-09T16:14:24.195 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T16:14:24.195 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:14:24.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.193+0000 7f35527fc640 1 -- 192.168.123.103:0/2578160899 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+739 (secure 0 0 0) 0x7f3554019070 con 0x7f355c071a50 2026-03-09T16:14:24.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.196+0000 7f356226a640 1 -- 192.168.123.103:0/2578160899 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f353c03de60 msgr2=0x7f353c040320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:24.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.196+0000 7f356226a640 1 --2- 192.168.123.103:0/2578160899 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f353c03de60 0x7f353c040320 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f3558002410 tx=0x7f355803a040 comp rx=0 tx=0).stop 2026-03-09T16:14:24.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.196+0000 7f356226a640 1 -- 192.168.123.103:0/2578160899 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f355c071a50 msgr2=0x7f355c084100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:24.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.196+0000 7f356226a640 1 --2- 192.168.123.103:0/2578160899 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f355c071a50 0x7f355c084100 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f355400e390 tx=0x7f355400e860 comp rx=0 tx=0).stop 2026-03-09T16:14:24.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.197+0000 7f356226a640 1 -- 192.168.123.103:0/2578160899 shutdown_connections 2026-03-09T16:14:24.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.197+0000 7f356226a640 1 --2- 192.168.123.103:0/2578160899 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f353c03de60 0x7f353c040320 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:24.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.197+0000 7f356226a640 1 --2- 192.168.123.103:0/2578160899 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f355c082750 0x7f355c082bd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:24.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.197+0000 7f356226a640 1 --2- 192.168.123.103:0/2578160899 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f355c071a50 0x7f355c084100 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:24.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.197+0000 7f356226a640 1 -- 192.168.123.103:0/2578160899 >> 192.168.123.103:0/2578160899 conn(0x7f355c06d4f0 msgr2=0x7f355c073130 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:24.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.197+0000 7f356226a640 1 -- 192.168.123.103:0/2578160899 shutdown_connections 2026-03-09T16:14:24.202 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.197+0000 7f356226a640 1 -- 192.168.123.103:0/2578160899 wait complete. 2026-03-09T16:14:24.202 INFO:tasks.workunit.client.0.vm03.stdout:9/285: creat d2/d4/d1f/f51 x:0 0 0 2026-03-09T16:14:24.210 INFO:tasks.workunit.client.0.vm03.stdout:6/227: creat d9/d22/f3e x:0 0 0 2026-03-09T16:14:24.212 INFO:tasks.workunit.client.0.vm03.stdout:5/290: dwrite d2/d7/d3c/d3d/f56 [0,4194304] 0 2026-03-09T16:14:24.213 INFO:tasks.workunit.client.0.vm03.stdout:2/244: creat db/d12/d2a/f5f x:0 0 0 2026-03-09T16:14:24.213 INFO:tasks.workunit.client.0.vm03.stdout:5/291: read - d2/d7/d1a/d1c/d3f/f67 zero size 2026-03-09T16:14:24.224 INFO:tasks.workunit.client.0.vm03.stdout:8/255: mknod da/c49 0 2026-03-09T16:14:24.231 INFO:tasks.workunit.client.0.vm03.stdout:4/244: mknod d5/c45 0 2026-03-09T16:14:24.248 INFO:tasks.workunit.client.0.vm03.stdout:9/286: symlink d2/d4/d11/d29/d2a/d46/l52 0 2026-03-09T16:14:24.251 INFO:tasks.workunit.client.0.vm03.stdout:9/287: dwrite d2/f7 [0,4194304] 0 2026-03-09T16:14:24.256 INFO:tasks.workunit.client.0.vm03.stdout:9/288: fsync d2/f15 0 2026-03-09T16:14:24.257 INFO:tasks.workunit.client.0.vm03.stdout:2/245: creat db/d12/d2a/f60 x:0 0 0 2026-03-09T16:14:24.257 INFO:tasks.workunit.client.0.vm03.stdout:6/228: creat d9/d22/f3f x:0 0 0 2026-03-09T16:14:24.267 INFO:tasks.workunit.client.0.vm03.stdout:5/292: unlink d2/d7/de/d11/f44 0 2026-03-09T16:14:24.271 INFO:tasks.workunit.client.0.vm03.stdout:3/218: truncate d5/d1e/f31 3731922 0 2026-03-09T16:14:24.271 INFO:tasks.workunit.client.0.vm03.stdout:6/229: chown d9/d22/c2b 5007 1 2026-03-09T16:14:24.276 INFO:tasks.workunit.client.0.vm03.stdout:5/293: dread d2/d7/de/d11/d38/f57 [0,4194304] 0 2026-03-09T16:14:24.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.294+0000 7f8cadf46640 1 -- 192.168.123.103:0/2063664653 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca8072440 msgr2=0x7f8ca80771b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.294+0000 7f8cadf46640 1 --2- 192.168.123.103:0/2063664653 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca8072440 0x7f8ca80771b0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f8ca0009f90 tx=0x7f8ca00304e0 comp rx=0 tx=0).stop 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.294+0000 7f8cadf46640 1 -- 192.168.123.103:0/2063664653 shutdown_connections 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.294+0000 7f8cadf46640 1 --2- 192.168.123.103:0/2063664653 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca8072440 0x7f8ca80771b0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.294+0000 7f8cadf46640 1 --2- 192.168.123.103:0/2063664653 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ca8071a70 0x7f8ca8071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.294+0000 7f8cadf46640 1 -- 192.168.123.103:0/2063664653 >> 192.168.123.103:0/2063664653 conn(0x7f8ca806d4f0 msgr2=0x7f8ca806f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.294+0000 7f8cadf46640 1 -- 
192.168.123.103:0/2063664653 shutdown_connections 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.294+0000 7f8cadf46640 1 -- 192.168.123.103:0/2063664653 wait complete. 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.294+0000 7f8cadf46640 1 Processor -- start 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.294+0000 7f8cadf46640 1 -- start start 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.294+0000 7f8cadf46640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ca8071a70 0x7f8ca8084070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.294+0000 7f8cadf46640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca80826c0 0x7f8ca8082b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.294+0000 7f8cadf46640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ca80845b0 con 0x7f8ca8071a70 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.294+0000 7f8cadf46640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ca8083080 con 0x7f8ca80826c0 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.295+0000 7f8ca6ffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca80826c0 0x7f8ca8082b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.295+0000 7f8ca6ffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca80826c0 0x7f8ca8082b40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:38246/0 (socket says 192.168.123.103:38246) 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.295+0000 7f8ca6ffd640 1 -- 192.168.123.103:0/3647472222 learned_addr learned my addr 192.168.123.103:0/3647472222 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.295+0000 7f8ca77fe640 1 --2- 192.168.123.103:0/3647472222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ca8071a70 0x7f8ca8084070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.295+0000 7f8ca77fe640 1 -- 192.168.123.103:0/3647472222 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca80826c0 msgr2=0x7f8ca8082b40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.295+0000 7f8ca77fe640 1 --2- 192.168.123.103:0/3647472222 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca80826c0 0x7f8ca8082b40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:24.301 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.295+0000 7f8ca77fe640 1 -- 192.168.123.103:0/3647472222 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8ca0009c40 con 0x7f8ca8071a70 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.296+0000 7f8ca77fe640 1 --2- 192.168.123.103:0/3647472222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ca8071a70 0x7f8ca8084070 secure :-1 s=READY pgs=289 cs=0 l=1 rev1=1 crypto rx=0x7f8c9800e970 tx=0x7f8c9800ee40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.296+0000 7f8ca4ff9640 1 -- 192.168.123.103:0/3647472222 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8c9800ccb0 con 0x7f8ca8071a70 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.296+0000 7f8cadf46640 1 -- 192.168.123.103:0/3647472222 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8ca8083360 con 0x7f8ca8071a70 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.297+0000 7f8cadf46640 1 -- 192.168.123.103:0/3647472222 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8ca81b5bc0 con 0x7f8ca8071a70 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.297+0000 7f8ca4ff9640 1 -- 192.168.123.103:0/3647472222 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8c98004590 con 0x7f8ca8071a70 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.297+0000 7f8ca4ff9640 1 -- 192.168.123.103:0/3647472222 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8c98010640 con 0x7f8ca8071a70 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.298+0000 7f8ca4ff9640 1 -- 192.168.123.103:0/3647472222 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 22) v1 ==== 50339+0+0 (secure 0 0 0) 0x7f8c980107a0 con 0x7f8ca8071a70 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.298+0000 7f8ca4ff9640 1 --2- 192.168.123.103:0/3647472222 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f8c9403de90 0x7f8c94040350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.299+0000 7f8ca4ff9640 1 -- 192.168.123.103:0/3647472222 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f8c98014070 con 0x7f8ca8071a70 2026-03-09T16:14:24.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.299+0000 7f8ca6ffd640 1 --2- 192.168.123.103:0/3647472222 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f8c9403de90 0x7f8c94040350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:24.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.300+0000 7f8cadf46640 1 -- 192.168.123.103:0/3647472222 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f8c74005350 con 0x7f8ca8071a70 2026-03-09T16:14:24.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.303+0000 7f8ca6ffd640 1 --2- 192.168.123.103:0/3647472222 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f8c9403de90 0x7f8c94040350 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f8ca0002790 tx=0x7f8ca003a040 comp rx=0 tx=0).ready entity=mgr.24357 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:24.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.308+0000 7f8ca4ff9640 1 -- 192.168.123.103:0/3647472222 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f8c98016b70 con 0x7f8ca8071a70 2026-03-09T16:14:24.327 INFO:tasks.workunit.client.0.vm03.stdout:7/196: creat d4/da/d18/d22/d24/f41 x:0 0 0 2026-03-09T16:14:24.327 INFO:tasks.workunit.client.0.vm03.stdout:7/197: chown d4/da/d18/d22/d24/d15/l27 2048152 1 2026-03-09T16:14:24.340 INFO:tasks.workunit.client.0.vm03.stdout:3/219: symlink d5/d1e/l3f 0 2026-03-09T16:14:24.352 INFO:tasks.workunit.client.0.vm03.stdout:1/177: dwrite d4/d6/f15 [0,4194304] 0 2026-03-09T16:14:24.364 INFO:tasks.workunit.client.0.vm03.stdout:4/245: mkdir d5/d40/d46 0 2026-03-09T16:14:24.364 INFO:tasks.workunit.client.0.vm03.stdout:5/294: unlink d2/d7/d8/d16/c2d 0 2026-03-09T16:14:24.366 INFO:tasks.workunit.client.0.vm03.stdout:0/257: dwrite d0/d7/d48/f43 [0,4194304] 0 2026-03-09T16:14:24.380 INFO:tasks.workunit.client.0.vm03.stdout:0/258: dwrite d0/d7/d48/d32/d35/f38 [0,4194304] 0 2026-03-09T16:14:24.386 INFO:tasks.workunit.client.0.vm03.stdout:0/259: stat d0/c22 0 2026-03-09T16:14:24.389 INFO:tasks.workunit.client.0.vm03.stdout:0/260: write d0/d7/d48/d32/d37/f31 [480104,14024] 0 2026-03-09T16:14:24.413 INFO:tasks.workunit.client.0.vm03.stdout:9/289: creat d2/d4/d11/d29/d2a/d4d/f53 x:0 0 0 2026-03-09T16:14:24.434 INFO:tasks.workunit.client.0.vm03.stdout:3/220: symlink d5/d13/d3a/l40 0 2026-03-09T16:14:24.436 INFO:tasks.workunit.client.0.vm03.stdout:2/246: write db/f14 [2502619,63442] 0 2026-03-09T16:14:24.452 INFO:tasks.workunit.client.0.vm03.stdout:8/256: dwrite da/d15/f2f [0,4194304] 0 2026-03-09T16:14:24.460 INFO:tasks.workunit.client.0.vm03.stdout:8/257: readlink da/db/l31 0 2026-03-09T16:14:24.467 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:24 vm03.local ceph-mon[51019]: pgmap v4: 65 pgs: 65 active+clean; 541 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:14:24.467 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:24 vm03.local ceph-mon[51019]: from='client.24381 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:24.467 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:24 vm03.local ceph-mon[51019]: mgrmap e22: vm05.dygxfv(active, since 2s) 2026-03-09T16:14:24.467 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:24 vm03.local ceph-mon[51019]: from='client.14610 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:24.467 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:24 vm03.local ceph-mon[51019]: [09/Mar/2026:16:14:23] ENGINE Bus STARTING 2026-03-09T16:14:24.467 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:24 vm03.local ceph-mon[51019]: from='client.? 
192.168.123.103:0/2578160899' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:14:24.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.463+0000 7f8cadf46640 1 -- 192.168.123.103:0/3647472222 --> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8c74002bf0 con 0x7f8c9403de90 2026-03-09T16:14:24.470 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:14:24.470 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T16:14:24.470 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T16:14:24.470 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-09T16:14:24.470 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [], 2026-03-09T16:14:24.470 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "1/2 daemons upgraded", 2026-03-09T16:14:24.470 INFO:teuthology.orchestra.run.vm03.stdout: "message": "", 2026-03-09T16:14:24.470 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T16:14:24.470 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:14:24.470 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.467+0000 7f8ca4ff9640 1 -- 192.168.123.103:0/3647472222 <== mgr.24357 v2:192.168.123.105:6828/2751989419 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+310 (secure 0 0 0) 0x7f8c74002bf0 con 0x7f8c9403de90 2026-03-09T16:14:24.473 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.471+0000 7f8c867fc640 1 -- 192.168.123.103:0/3647472222 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f8c9403de90 msgr2=0x7f8c94040350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:24.474 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.471+0000 7f8c867fc640 1 --2- 192.168.123.103:0/3647472222 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f8c9403de90 0x7f8c94040350 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f8ca0002790 tx=0x7f8ca003a040 comp rx=0 tx=0).stop 2026-03-09T16:14:24.474 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.471+0000 7f8c867fc640 1 -- 192.168.123.103:0/3647472222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ca8071a70 msgr2=0x7f8ca8084070 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:24.474 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.471+0000 7f8c867fc640 1 --2- 192.168.123.103:0/3647472222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ca8071a70 0x7f8ca8084070 secure :-1 s=READY pgs=289 cs=0 l=1 rev1=1 crypto rx=0x7f8c9800e970 tx=0x7f8c9800ee40 comp rx=0 tx=0).stop 2026-03-09T16:14:24.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.473+0000 7f8c867fc640 1 -- 192.168.123.103:0/3647472222 shutdown_connections 2026-03-09T16:14:24.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.473+0000 7f8c867fc640 1 --2- 192.168.123.103:0/3647472222 >> [v2:192.168.123.105:6828/2751989419,v1:192.168.123.105:6829/2751989419] conn(0x7f8c9403de90 0x7f8c94040350 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:24.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.473+0000 7f8c867fc640 1 --2- 
192.168.123.103:0/3647472222 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca80826c0 0x7f8ca8082b40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:24.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.473+0000 7f8c867fc640 1 --2- 192.168.123.103:0/3647472222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ca8071a70 0x7f8ca8084070 unknown :-1 s=CLOSED pgs=289 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:24.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.473+0000 7f8c867fc640 1 -- 192.168.123.103:0/3647472222 >> 192.168.123.103:0/3647472222 conn(0x7f8ca806d4f0 msgr2=0x7f8ca8070440 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:24.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.473+0000 7f8c867fc640 1 -- 192.168.123.103:0/3647472222 shutdown_connections 2026-03-09T16:14:24.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:24.473+0000 7f8c867fc640 1 -- 192.168.123.103:0/3647472222 wait complete. 2026-03-09T16:14:24.487 INFO:tasks.workunit.client.0.vm03.stdout:5/295: chown d2/d7/d8/d16/cc 144981 1 2026-03-09T16:14:24.496 INFO:tasks.workunit.client.0.vm03.stdout:4/246: write d5/fa [2282156,82859] 0 2026-03-09T16:14:24.503 INFO:tasks.workunit.client.0.vm03.stdout:0/261: unlink d0/d7/d48/d32/d35/d52/f51 0 2026-03-09T16:14:24.522 INFO:tasks.workunit.client.0.vm03.stdout:7/198: unlink d4/fe 0 2026-03-09T16:14:24.523 INFO:tasks.workunit.client.0.vm03.stdout:7/199: write d4/f3b [3923396,11128] 0 2026-03-09T16:14:24.531 INFO:tasks.workunit.client.0.vm03.stdout:2/247: fsync f9 0 2026-03-09T16:14:24.532 INFO:tasks.workunit.client.0.vm03.stdout:6/230: creat d9/f40 x:0 0 0 2026-03-09T16:14:24.532 INFO:tasks.workunit.client.0.vm03.stdout:6/231: truncate d9/f33 264796 0 2026-03-09T16:14:24.533 INFO:tasks.workunit.client.0.vm03.stdout:6/232: fsync d9/f1f 0 2026-03-09T16:14:24.536 INFO:tasks.workunit.client.0.vm03.stdout:8/258: rmdir da/db 39 2026-03-09T16:14:24.538 INFO:tasks.workunit.client.0.vm03.stdout:1/178: mkdir d4/d6/d1d/d3d 0 2026-03-09T16:14:24.553 INFO:tasks.workunit.client.0.vm03.stdout:0/262: sync 2026-03-09T16:14:24.553 INFO:tasks.workunit.client.0.vm03.stdout:2/248: sync 2026-03-09T16:14:24.553 INFO:tasks.workunit.client.0.vm03.stdout:2/249: write db/d1e/f4c [436326,3770] 0 2026-03-09T16:14:24.561 INFO:tasks.workunit.client.0.vm03.stdout:9/290: truncate d2/de/f1c 154064 0 2026-03-09T16:14:24.561 INFO:tasks.workunit.client.0.vm03.stdout:5/296: write d2/d7/de/f48 [601065,115666] 0 2026-03-09T16:14:24.565 INFO:tasks.workunit.client.0.vm03.stdout:5/297: dread d2/d7/de/f48 [0,4194304] 0 2026-03-09T16:14:24.580 INFO:tasks.workunit.client.0.vm03.stdout:8/259: creat da/d1d/f4a x:0 0 0 2026-03-09T16:14:24.580 INFO:tasks.workunit.client.0.vm03.stdout:1/179: rmdir d4/d6/d1d/d20 39 2026-03-09T16:14:24.584 INFO:tasks.workunit.client.0.vm03.stdout:4/247: link d5/d17/f21 d5/db/d25/d31/f47 0 2026-03-09T16:14:24.585 INFO:tasks.workunit.client.0.vm03.stdout:1/180: sync 2026-03-09T16:14:24.590 INFO:tasks.workunit.client.0.vm03.stdout:0/263: mknod d0/d7/d3e/c55 0 2026-03-09T16:14:24.590 INFO:tasks.workunit.client.0.vm03.stdout:9/291: mkdir d2/d54 0 2026-03-09T16:14:24.591 INFO:tasks.workunit.client.0.vm03.stdout:0/264: stat d0/d7/d48/d32/d35/d52/c50 0 2026-03-09T16:14:24.599 INFO:tasks.workunit.client.0.vm03.stdout:6/233: symlink d9/l41 0 2026-03-09T16:14:24.599 INFO:tasks.workunit.client.0.vm03.stdout:1/181: read d4/db/f2e 
[120598,52847] 0 2026-03-09T16:14:24.603 INFO:tasks.workunit.client.0.vm03.stdout:4/248: creat d5/dd/d1f/f48 x:0 0 0 2026-03-09T16:14:24.604 INFO:tasks.workunit.client.0.vm03.stdout:6/234: write d9/d22/f3f [503582,75669] 0 2026-03-09T16:14:24.612 INFO:tasks.workunit.client.0.vm03.stdout:7/200: rename d4/da/d18/d22/d24/f3d to d4/da/f42 0 2026-03-09T16:14:24.612 INFO:tasks.workunit.client.0.vm03.stdout:3/221: getdents d5/d2e 0 2026-03-09T16:14:24.612 INFO:tasks.workunit.client.0.vm03.stdout:7/201: chown d4/d2d 61507080 1 2026-03-09T16:14:24.616 INFO:tasks.workunit.client.0.vm03.stdout:9/292: mknod d2/de/c55 0 2026-03-09T16:14:24.628 INFO:tasks.workunit.client.0.vm03.stdout:5/298: truncate d2/d7/de/d11/f26 6423725 0 2026-03-09T16:14:24.641 INFO:tasks.workunit.client.0.vm03.stdout:8/260: creat da/d1d/d3b/f4b x:0 0 0 2026-03-09T16:14:24.642 INFO:tasks.workunit.client.0.vm03.stdout:2/250: rename db/d1e to db/d12/d2a/d61 0 2026-03-09T16:14:24.646 INFO:tasks.workunit.client.0.vm03.stdout:1/182: dread d4/fa [0,4194304] 0 2026-03-09T16:14:24.655 INFO:tasks.workunit.client.0.vm03.stdout:4/249: dread d5/f8 [0,4194304] 0 2026-03-09T16:14:24.655 INFO:tasks.workunit.client.0.vm03.stdout:4/250: chown d5/c15 60 1 2026-03-09T16:14:24.657 INFO:tasks.workunit.client.0.vm03.stdout:9/293: creat d2/d4/d11/d29/d2a/d4d/f56 x:0 0 0 2026-03-09T16:14:24.658 INFO:tasks.workunit.client.0.vm03.stdout:9/294: write d2/d4/d11/d29/d2a/d46/f47 [116256,84252] 0 2026-03-09T16:14:24.659 INFO:tasks.workunit.client.0.vm03.stdout:9/295: write d2/d4/d11/d29/d2a/d4d/f53 [353829,79410] 0 2026-03-09T16:14:24.661 INFO:tasks.workunit.client.0.vm03.stdout:9/296: fdatasync d2/d4/d11/d12/f35 0 2026-03-09T16:14:24.664 INFO:tasks.workunit.client.0.vm03.stdout:9/297: dread d2/d4/d1f/f44 [0,4194304] 0 2026-03-09T16:14:24.671 INFO:tasks.workunit.client.0.vm03.stdout:6/235: mkdir d9/d42 0 2026-03-09T16:14:24.683 INFO:tasks.workunit.client.0.vm03.stdout:0/265: creat d0/d7/f56 x:0 0 0 2026-03-09T16:14:24.698 INFO:tasks.workunit.client.0.vm03.stdout:9/298: symlink d2/d4/d1f/l57 0 2026-03-09T16:14:24.701 INFO:tasks.workunit.client.0.vm03.stdout:2/251: truncate f7 539446 0 2026-03-09T16:14:24.701 INFO:tasks.workunit.client.0.vm03.stdout:2/252: fdatasync db/d12/f4b 0 2026-03-09T16:14:24.706 INFO:tasks.workunit.client.0.vm03.stdout:1/183: mkdir d4/d6/d1d/d20/d23/d3e 0 2026-03-09T16:14:24.715 INFO:tasks.workunit.client.0.vm03.stdout:0/266: mkdir d0/d7/d3e/d57 0 2026-03-09T16:14:24.715 INFO:tasks.workunit.client.0.vm03.stdout:0/267: chown d0/d7/d48/f18 127759007 1 2026-03-09T16:14:24.715 INFO:tasks.workunit.client.0.vm03.stdout:1/184: dread d4/f1b [0,4194304] 0 2026-03-09T16:14:24.717 INFO:tasks.workunit.client.0.vm03.stdout:0/268: dread d0/da/d1b/f46 [0,4194304] 0 2026-03-09T16:14:24.720 INFO:tasks.workunit.client.0.vm03.stdout:4/251: link d5/dd/f16 d5/d40/d46/f49 0 2026-03-09T16:14:24.721 INFO:tasks.workunit.client.0.vm03.stdout:4/252: read d5/f9 [113563,82985] 0 2026-03-09T16:14:24.730 INFO:tasks.workunit.client.0.vm03.stdout:6/236: getdents d9/d42 0 2026-03-09T16:14:24.731 INFO:tasks.workunit.client.0.vm03.stdout:2/253: creat db/d12/f62 x:0 0 0 2026-03-09T16:14:24.738 INFO:tasks.workunit.client.0.vm03.stdout:8/261: rename da/d10/f3e to da/f4c 0 2026-03-09T16:14:24.746 INFO:tasks.workunit.client.0.vm03.stdout:3/222: getdents d5/d13/d3a/d3c 0 2026-03-09T16:14:24.749 INFO:tasks.workunit.client.0.vm03.stdout:3/223: stat d5/d1e/l3f 0 2026-03-09T16:14:24.752 INFO:tasks.workunit.client.0.vm03.stdout:5/299: getdents d2/d7/de/d11/d38 0 2026-03-09T16:14:24.758 
INFO:tasks.workunit.client.0.vm03.stdout:2/254: creat db/d12/f63 x:0 0 0 2026-03-09T16:14:24.759 INFO:tasks.workunit.client.0.vm03.stdout:2/255: dread - db/d12/d2a/f5f zero size 2026-03-09T16:14:24.773 INFO:tasks.workunit.client.0.vm03.stdout:1/185: mkdir d4/d6/d1d/d20/d23/d3e/d3f 0 2026-03-09T16:14:24.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:24 vm05.local ceph-mon[58702]: pgmap v4: 65 pgs: 65 active+clean; 541 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:14:24.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:24 vm05.local ceph-mon[58702]: from='client.24381 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:24.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:24 vm05.local ceph-mon[58702]: mgrmap e22: vm05.dygxfv(active, since 2s) 2026-03-09T16:14:24.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:24 vm05.local ceph-mon[58702]: from='client.14610 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:24.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:24 vm05.local ceph-mon[58702]: [09/Mar/2026:16:14:23] ENGINE Bus STARTING 2026-03-09T16:14:24.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:24 vm05.local ceph-mon[58702]: from='client.? 192.168.123.103:0/2578160899' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:14:24.782 INFO:tasks.workunit.client.0.vm03.stdout:2/256: mkdir db/d59/d64 0 2026-03-09T16:14:24.786 INFO:tasks.workunit.client.0.vm03.stdout:2/257: dwrite db/d12/d2a/d61/f4c [0,4194304] 0 2026-03-09T16:14:24.799 INFO:tasks.workunit.client.0.vm03.stdout:8/262: dwrite da/db/fe [0,4194304] 0 2026-03-09T16:14:24.801 INFO:tasks.workunit.client.0.vm03.stdout:8/263: truncate da/d1d/f4a 460749 0 2026-03-09T16:14:24.809 INFO:tasks.workunit.client.0.vm03.stdout:8/264: read da/db/f1c [2347343,103405] 0 2026-03-09T16:14:24.817 INFO:tasks.workunit.client.0.vm03.stdout:1/186: fsync d4/db/f2e 0 2026-03-09T16:14:24.821 INFO:tasks.workunit.client.0.vm03.stdout:5/300: creat d2/d7/de/d11/d38/d3b/f68 x:0 0 0 2026-03-09T16:14:24.824 INFO:tasks.workunit.client.0.vm03.stdout:9/299: link d2/de/f1c d2/d4/d11/d29/d2a/f58 0 2026-03-09T16:14:24.824 INFO:tasks.workunit.client.0.vm03.stdout:5/301: dwrite d2/d7/d1a/d1c/d3f/f67 [0,4194304] 0 2026-03-09T16:14:24.840 INFO:tasks.workunit.client.0.vm03.stdout:3/224: truncate d5/d13/f20 255537 0 2026-03-09T16:14:24.841 INFO:tasks.workunit.client.0.vm03.stdout:2/258: creat db/d12/d2a/d61/f65 x:0 0 0 2026-03-09T16:14:24.844 INFO:tasks.workunit.client.0.vm03.stdout:7/202: rename d4/c14 to d4/da/c43 0 2026-03-09T16:14:24.848 INFO:tasks.workunit.client.0.vm03.stdout:7/203: chown d4/da/d19/l40 2531 1 2026-03-09T16:14:24.878 INFO:tasks.workunit.client.0.vm03.stdout:8/265: write da/d10/f23 [5032520,82060] 0 2026-03-09T16:14:24.883 INFO:tasks.workunit.client.0.vm03.stdout:1/187: rmdir d4/d6/d3b 39 2026-03-09T16:14:24.886 INFO:tasks.workunit.client.0.vm03.stdout:4/253: getdents d5/dd 0 2026-03-09T16:14:24.887 INFO:tasks.workunit.client.0.vm03.stdout:4/254: readlink d5/l29 0 2026-03-09T16:14:24.893 INFO:tasks.workunit.client.0.vm03.stdout:5/302: mknod d2/d7/d8/d24/d27/d43/c69 0 2026-03-09T16:14:24.894 INFO:tasks.workunit.client.0.vm03.stdout:5/303: readlink d2/d7/d1a/d1c/l5b 0 2026-03-09T16:14:24.899 INFO:tasks.workunit.client.0.vm03.stdout:3/225: chown d5/d2e/l36 478301 1 2026-03-09T16:14:24.908 
INFO:tasks.workunit.client.0.vm03.stdout:7/204: rename d4/da/d18/f1b to d4/da/d18/f44 0 2026-03-09T16:14:24.919 INFO:tasks.workunit.client.0.vm03.stdout:7/205: dwrite d4/f8 [0,4194304] 0 2026-03-09T16:14:24.939 INFO:tasks.workunit.client.0.vm03.stdout:1/188: fdatasync d4/d6/d1d/d20/d23/f30 0 2026-03-09T16:14:24.963 INFO:tasks.workunit.client.0.vm03.stdout:2/259: creat db/d59/d64/f66 x:0 0 0 2026-03-09T16:14:24.968 INFO:tasks.workunit.client.0.vm03.stdout:0/269: rename d0/d7/d48/d32/d35/l40 to d0/d7/d3e/d57/l58 0 2026-03-09T16:14:24.978 INFO:tasks.workunit.client.0.vm03.stdout:4/255: creat d5/d17/d44/f4a x:0 0 0 2026-03-09T16:14:24.978 INFO:tasks.workunit.client.0.vm03.stdout:4/256: stat f2 0 2026-03-09T16:14:24.978 INFO:tasks.workunit.client.0.vm03.stdout:4/257: readlink d5/l29 0 2026-03-09T16:14:24.988 INFO:tasks.workunit.client.0.vm03.stdout:5/304: symlink d2/d7/d8/d24/d27/l6a 0 2026-03-09T16:14:24.991 INFO:tasks.workunit.client.0.vm03.stdout:2/260: symlink db/d12/d2a/l67 0 2026-03-09T16:14:24.992 INFO:tasks.workunit.client.0.vm03.stdout:2/261: fsync db/f2e 0 2026-03-09T16:14:24.994 INFO:tasks.workunit.client.0.vm03.stdout:3/226: symlink d5/d13/d37/l41 0 2026-03-09T16:14:24.999 INFO:tasks.workunit.client.0.vm03.stdout:6/237: rename d9/f1f to d9/d22/f43 0 2026-03-09T16:14:25.002 INFO:tasks.workunit.client.0.vm03.stdout:0/270: creat d0/d7/d48/d32/f59 x:0 0 0 2026-03-09T16:14:25.003 INFO:tasks.workunit.client.0.vm03.stdout:8/266: link da/db/f1c da/d32/f4d 0 2026-03-09T16:14:25.013 INFO:tasks.workunit.client.0.vm03.stdout:4/258: creat d5/dd/f4b x:0 0 0 2026-03-09T16:14:25.015 INFO:tasks.workunit.client.0.vm03.stdout:9/300: getdents d2 0 2026-03-09T16:14:25.029 INFO:tasks.workunit.client.0.vm03.stdout:2/262: mkdir db/d59/d64/d68 0 2026-03-09T16:14:25.030 INFO:tasks.workunit.client.0.vm03.stdout:5/305: dread d2/d7/d1a/f4d [0,4194304] 0 2026-03-09T16:14:25.030 INFO:tasks.workunit.client.0.vm03.stdout:8/267: unlink da/db/d30/l42 0 2026-03-09T16:14:25.031 INFO:tasks.workunit.client.0.vm03.stdout:9/301: rename d2/d4/d1f/l57 to d2/d4/d1f/l59 0 2026-03-09T16:14:25.033 INFO:tasks.workunit.client.0.vm03.stdout:6/238: sync 2026-03-09T16:14:25.036 INFO:tasks.workunit.client.0.vm03.stdout:1/189: creat d4/d6/d1d/f40 x:0 0 0 2026-03-09T16:14:25.038 INFO:tasks.workunit.client.0.vm03.stdout:9/302: dread d2/d4/d11/d12/d28/f2f [4194304,4194304] 0 2026-03-09T16:14:25.039 INFO:tasks.workunit.client.0.vm03.stdout:2/263: creat db/d12/f69 x:0 0 0 2026-03-09T16:14:25.039 INFO:tasks.workunit.client.0.vm03.stdout:9/303: fsync d2/df/f42 0 2026-03-09T16:14:25.042 INFO:tasks.workunit.client.0.vm03.stdout:2/264: dwrite db/d59/f53 [0,4194304] 0 2026-03-09T16:14:25.050 INFO:tasks.workunit.client.0.vm03.stdout:2/265: dread db/d12/f39 [0,4194304] 0 2026-03-09T16:14:25.054 INFO:tasks.workunit.client.0.vm03.stdout:5/306: dread d2/d7/de/f48 [0,4194304] 0 2026-03-09T16:14:25.077 INFO:tasks.workunit.client.0.vm03.stdout:8/268: rmdir da/d15 39 2026-03-09T16:14:25.077 INFO:tasks.workunit.client.0.vm03.stdout:7/206: getdents d4/da/d18/d22 0 2026-03-09T16:14:25.077 INFO:tasks.workunit.client.0.vm03.stdout:0/271: write d0/d7/d48/d32/d37/f33 [3263827,81877] 0 2026-03-09T16:14:25.080 INFO:tasks.workunit.client.0.vm03.stdout:8/269: fsync da/db/fe 0 2026-03-09T16:14:25.080 INFO:tasks.workunit.client.0.vm03.stdout:7/207: chown d4/ld 2257188 1 2026-03-09T16:14:25.081 INFO:tasks.workunit.client.0.vm03.stdout:8/270: readlink da/d45/l47 0 2026-03-09T16:14:25.083 INFO:tasks.workunit.client.0.vm03.stdout:3/227: dwrite d5/d1e/f26 [0,4194304] 0 
2026-03-09T16:14:25.099 INFO:tasks.workunit.client.0.vm03.stdout:3/228: dread d5/d13/f29 [0,4194304] 0 2026-03-09T16:14:25.100 INFO:tasks.workunit.client.0.vm03.stdout:0/272: dwrite d0/f29 [0,4194304] 0 2026-03-09T16:14:25.103 INFO:tasks.workunit.client.0.vm03.stdout:0/273: stat d0/da/d1b/l44 0 2026-03-09T16:14:25.107 INFO:tasks.workunit.client.0.vm03.stdout:1/190: symlink d4/d6/d1d/d20/d23/d3e/l41 0 2026-03-09T16:14:25.107 INFO:tasks.workunit.client.0.vm03.stdout:6/239: creat d9/d14/f44 x:0 0 0 2026-03-09T16:14:25.115 INFO:tasks.workunit.client.0.vm03.stdout:3/229: dread d5/f11 [0,4194304] 0 2026-03-09T16:14:25.133 INFO:tasks.workunit.client.0.vm03.stdout:2/266: dread fa [0,4194304] 0 2026-03-09T16:14:25.133 INFO:tasks.workunit.client.0.vm03.stdout:2/267: read - db/d12/d2a/f38 zero size 2026-03-09T16:14:25.134 INFO:tasks.workunit.client.0.vm03.stdout:2/268: fdatasync db/f55 0 2026-03-09T16:14:25.141 INFO:tasks.workunit.client.0.vm03.stdout:4/259: chown d5/dd/c37 0 1 2026-03-09T16:14:25.141 INFO:tasks.workunit.client.0.vm03.stdout:4/260: chown d5/db/d25/d31 6108527 1 2026-03-09T16:14:25.141 INFO:tasks.workunit.client.0.vm03.stdout:4/261: fdatasync d5/d17/f18 0 2026-03-09T16:14:25.143 INFO:tasks.workunit.client.0.vm03.stdout:2/269: dread f9 [0,4194304] 0 2026-03-09T16:14:25.144 INFO:tasks.workunit.client.0.vm03.stdout:2/270: fdatasync db/f2d 0 2026-03-09T16:14:25.163 INFO:tasks.workunit.client.0.vm03.stdout:6/240: unlink d9/d22/c34 0 2026-03-09T16:14:25.163 INFO:tasks.workunit.client.0.vm03.stdout:9/304: getdents d2/d54 0 2026-03-09T16:14:25.167 INFO:tasks.workunit.client.0.vm03.stdout:1/191: creat d4/d6/d1d/d24/f42 x:0 0 0 2026-03-09T16:14:25.179 INFO:tasks.workunit.client.0.vm03.stdout:8/271: dread da/db/f1c [0,4194304] 0 2026-03-09T16:14:25.183 INFO:tasks.workunit.client.0.vm03.stdout:5/307: rename d2/d7/d3c/c62 to d2/d7/d8/d16/d5c/c6b 0 2026-03-09T16:14:25.194 INFO:tasks.workunit.client.0.vm03.stdout:8/272: dread da/d10/f1f [0,4194304] 0 2026-03-09T16:14:25.197 INFO:tasks.workunit.client.0.vm03.stdout:8/273: dread da/d32/f4d [0,4194304] 0 2026-03-09T16:14:25.205 INFO:tasks.workunit.client.0.vm03.stdout:4/262: readlink d5/db/d25/l43 0 2026-03-09T16:14:25.210 INFO:tasks.workunit.client.0.vm03.stdout:7/208: mkdir d4/da/d45 0 2026-03-09T16:14:25.211 INFO:tasks.workunit.client.0.vm03.stdout:7/209: dread d4/da/d18/d22/d24/f30 [0,4194304] 0 2026-03-09T16:14:25.212 INFO:tasks.workunit.client.0.vm03.stdout:9/305: creat d2/f5a x:0 0 0 2026-03-09T16:14:25.219 INFO:tasks.workunit.client.0.vm03.stdout:1/192: symlink d4/d6/d1d/d20/d23/l43 0 2026-03-09T16:14:25.236 INFO:tasks.workunit.client.0.vm03.stdout:9/306: dread d2/f15 [4194304,4194304] 0 2026-03-09T16:14:25.243 INFO:tasks.workunit.client.0.vm03.stdout:0/274: rename d0/d7/d48/d32/d35 to d0/d7/d3e/d57/d5a 0 2026-03-09T16:14:25.249 INFO:tasks.workunit.client.0.vm03.stdout:5/308: mkdir d2/d7/d1a/d1c/d6c 0 2026-03-09T16:14:25.251 INFO:tasks.workunit.client.0.vm03.stdout:0/275: dread d0/d7/f8 [0,4194304] 0 2026-03-09T16:14:25.252 INFO:tasks.workunit.client.0.vm03.stdout:0/276: stat d0/d7/d3e/c55 0 2026-03-09T16:14:25.252 INFO:tasks.workunit.client.0.vm03.stdout:0/277: stat d0/da/d1b/c53 0 2026-03-09T16:14:25.261 INFO:tasks.workunit.client.0.vm03.stdout:4/263: read - d5/dd/d1f/f48 zero size 2026-03-09T16:14:25.271 INFO:tasks.workunit.client.0.vm03.stdout:7/210: fsync d4/da/d18/d22/d24/d15/f34 0 2026-03-09T16:14:25.279 INFO:tasks.workunit.client.0.vm03.stdout:6/241: mkdir d9/d42/d45 0 2026-03-09T16:14:25.279 
INFO:tasks.workunit.client.0.vm03.stdout:1/193: rmdir d4/d6 39 2026-03-09T16:14:25.280 INFO:tasks.workunit.client.0.vm03.stdout:1/194: stat d4/d31/f38 0 2026-03-09T16:14:25.280 INFO:tasks.workunit.client.0.vm03.stdout:6/242: dread d9/d14/f28 [0,4194304] 0 2026-03-09T16:14:25.282 INFO:tasks.workunit.client.0.vm03.stdout:8/274: dwrite da/d10/d28/f2c [0,4194304] 0 2026-03-09T16:14:25.290 INFO:tasks.workunit.client.0.vm03.stdout:3/230: rename d5/d13 to d5/d1e/d42 0 2026-03-09T16:14:25.294 INFO:tasks.workunit.client.0.vm03.stdout:4/264: creat d5/dd/d1f/f4c x:0 0 0 2026-03-09T16:14:25.300 INFO:tasks.workunit.client.0.vm03.stdout:8/275: dwrite da/d10/d28/f29 [0,4194304] 0 2026-03-09T16:14:25.309 INFO:tasks.workunit.client.0.vm03.stdout:7/211: symlink d4/da/d18/d22/d24/d16/l46 0 2026-03-09T16:14:25.309 INFO:tasks.workunit.client.0.vm03.stdout:9/307: dwrite d2/d4/d1f/f44 [0,4194304] 0 2026-03-09T16:14:25.314 INFO:tasks.workunit.client.0.vm03.stdout:9/308: fsync d2/d4/d11/d29/d2a/d4d/f56 0 2026-03-09T16:14:25.315 INFO:tasks.workunit.client.0.vm03.stdout:8/276: dread da/d32/f4d [0,4194304] 0 2026-03-09T16:14:25.318 INFO:tasks.workunit.client.0.vm03.stdout:5/309: creat d2/d7/d1a/d1c/d6c/f6d x:0 0 0 2026-03-09T16:14:25.323 INFO:tasks.workunit.client.0.vm03.stdout:2/271: getdents db 0 2026-03-09T16:14:25.325 INFO:tasks.workunit.client.0.vm03.stdout:8/277: dread da/db/d30/f36 [0,4194304] 0 2026-03-09T16:14:25.335 INFO:tasks.workunit.client.0.vm03.stdout:0/278: symlink d0/d7/l5b 0 2026-03-09T16:14:25.336 INFO:tasks.workunit.client.0.vm03.stdout:9/309: creat d2/d4/d1f/f5b x:0 0 0 2026-03-09T16:14:25.336 INFO:tasks.workunit.client.0.vm03.stdout:4/265: chown d5/dd/f23 946 1 2026-03-09T16:14:25.345 INFO:tasks.workunit.client.0.vm03.stdout:9/310: dwrite d2/d4/d1f/f44 [0,4194304] 0 2026-03-09T16:14:25.347 INFO:tasks.workunit.client.0.vm03.stdout:6/243: symlink d9/l46 0 2026-03-09T16:14:25.347 INFO:tasks.workunit.client.0.vm03.stdout:5/310: creat d2/d7/d1a/f6e x:0 0 0 2026-03-09T16:14:25.351 INFO:tasks.workunit.client.0.vm03.stdout:8/278: dread da/db/f44 [0,4194304] 0 2026-03-09T16:14:25.355 INFO:tasks.workunit.client.0.vm03.stdout:1/195: rename d4/d6/l22 to d4/d6/d1d/d20/d23/d3e/d3f/l44 0 2026-03-09T16:14:25.356 INFO:tasks.workunit.client.0.vm03.stdout:2/272: mknod db/d12/c6a 0 2026-03-09T16:14:25.362 INFO:tasks.workunit.client.0.vm03.stdout:6/244: dread d9/d22/f2d [0,4194304] 0 2026-03-09T16:14:25.363 INFO:tasks.workunit.client.0.vm03.stdout:6/245: chown d9/d22/f43 39858 1 2026-03-09T16:14:25.364 INFO:tasks.workunit.client.0.vm03.stdout:6/246: fdatasync d9/d22/f24 0 2026-03-09T16:14:25.366 INFO:tasks.workunit.client.0.vm03.stdout:2/273: dwrite db/d12/f63 [0,4194304] 0 2026-03-09T16:14:25.392 INFO:tasks.workunit.client.0.vm03.stdout:0/279: dread d0/d7/f8 [0,4194304] 0 2026-03-09T16:14:25.392 INFO:tasks.workunit.client.0.vm03.stdout:0/280: chown d0/da/d1b/l44 622 1 2026-03-09T16:14:25.393 INFO:tasks.workunit.client.0.vm03.stdout:0/281: dread - d0/f4d zero size 2026-03-09T16:14:25.393 INFO:tasks.workunit.client.0.vm03.stdout:0/282: fsync d0/f4d 0 2026-03-09T16:14:25.395 INFO:tasks.workunit.client.0.vm03.stdout:4/266: mkdir d5/db/d25/d31/d4d 0 2026-03-09T16:14:25.400 INFO:tasks.workunit.client.0.vm03.stdout:9/311: mknod d2/d4/d11/d29/d2a/d46/c5c 0 2026-03-09T16:14:25.400 INFO:tasks.workunit.client.0.vm03.stdout:2/274: rename db/f2e to db/d59/d64/d68/f6b 0 2026-03-09T16:14:25.406 INFO:tasks.workunit.client.0.vm03.stdout:8/279: mknod da/db/d43/c4e 0 2026-03-09T16:14:25.406 
INFO:tasks.workunit.client.0.vm03.stdout:8/280: chown c5 1911751856 1 2026-03-09T16:14:25.406 INFO:tasks.workunit.client.0.vm03.stdout:0/283: fdatasync d0/da/d1b/f46 0 2026-03-09T16:14:25.419 INFO:tasks.workunit.client.0.vm03.stdout:3/231: dwrite d5/d1e/d42/f1d [0,4194304] 0 2026-03-09T16:14:25.435 INFO:tasks.workunit.client.0.vm03.stdout:1/196: creat d4/d6/d1d/d3d/f45 x:0 0 0 2026-03-09T16:14:25.438 INFO:tasks.workunit.client.0.vm03.stdout:6/247: mkdir d9/d42/d45/d47 0 2026-03-09T16:14:25.439 INFO:tasks.workunit.client.0.vm03.stdout:4/267: rename d5/db/f13 to d5/db/d25/f4e 0 2026-03-09T16:14:25.445 INFO:tasks.workunit.client.0.vm03.stdout:7/212: getdents d4/da/d19 0 2026-03-09T16:14:25.445 INFO:tasks.workunit.client.0.vm03.stdout:4/268: dwrite d5/d17/d44/f4a [0,4194304] 0 2026-03-09T16:14:25.451 INFO:tasks.workunit.client.0.vm03.stdout:5/311: dwrite d2/d7/de/d11/d19/d31/f42 [0,4194304] 0 2026-03-09T16:14:25.451 INFO:tasks.workunit.client.0.vm03.stdout:0/284: rmdir d0/da 39 2026-03-09T16:14:25.455 INFO:tasks.workunit.client.0.vm03.stdout:0/285: stat d0/d7/d48/d32/d37 0 2026-03-09T16:14:25.455 INFO:tasks.workunit.client.0.vm03.stdout:1/197: symlink d4/d6/d1d/d20/d23/l46 0 2026-03-09T16:14:25.456 INFO:tasks.workunit.client.0.vm03.stdout:0/286: dread - d0/d7/d48/d32/f59 zero size 2026-03-09T16:14:25.456 INFO:tasks.workunit.client.0.vm03.stdout:0/287: fsync d0/f4e 0 2026-03-09T16:14:25.459 INFO:tasks.workunit.client.0.vm03.stdout:5/312: stat d2/d7/d1a 0 2026-03-09T16:14:25.471 INFO:tasks.workunit.client.0.vm03.stdout:5/313: write d2/d7/de/d11/f32 [130955,22003] 0 2026-03-09T16:14:25.478 INFO:tasks.workunit.client.0.vm03.stdout:4/269: rename f2 to d5/d40/d46/f4f 0 2026-03-09T16:14:25.478 INFO:tasks.workunit.client.0.vm03.stdout:8/281: rmdir da/d15 39 2026-03-09T16:14:25.481 INFO:tasks.workunit.client.0.vm03.stdout:4/270: write d5/dd/d1f/f2d [1092220,28492] 0 2026-03-09T16:14:25.485 INFO:tasks.workunit.client.0.vm03.stdout:6/248: mkdir d9/d42/d45/d47/d48 0 2026-03-09T16:14:25.489 INFO:tasks.workunit.client.0.vm03.stdout:8/282: dwrite da/d1d/f4a [0,4194304] 0 2026-03-09T16:14:25.489 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:25 vm05.local ceph-mon[58702]: from='client.14614 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:25.489 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:25 vm05.local ceph-mon[58702]: [09/Mar/2026:16:14:23] ENGINE Serving on http://192.168.123.105:8765 2026-03-09T16:14:25.489 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:25 vm05.local ceph-mon[58702]: [09/Mar/2026:16:14:23] ENGINE Serving on https://192.168.123.105:7150 2026-03-09T16:14:25.489 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:25 vm05.local ceph-mon[58702]: [09/Mar/2026:16:14:23] ENGINE Bus STARTED 2026-03-09T16:14:25.489 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:25 vm05.local ceph-mon[58702]: [09/Mar/2026:16:14:23] ENGINE Client ('192.168.123.105', 33516) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T16:14:25.489 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:25 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:25.489 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:25 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:25.489 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:25 vm05.local ceph-mon[58702]: 
from='client.14622 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:25.489 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:25 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:25.489 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:25 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:25.493 INFO:tasks.workunit.client.0.vm03.stdout:9/312: link d2/d4/d11/d12/d28/f2f d2/d4/d11/d29/f5d 0 2026-03-09T16:14:25.494 INFO:tasks.workunit.client.0.vm03.stdout:3/232: creat d5/f43 x:0 0 0 2026-03-09T16:14:25.494 INFO:tasks.workunit.client.0.vm03.stdout:5/314: dwrite d2/d7/de/d11/d19/d31/f42 [0,4194304] 0 2026-03-09T16:14:25.496 INFO:tasks.workunit.client.0.vm03.stdout:6/249: mknod d9/d14/c49 0 2026-03-09T16:14:25.511 INFO:tasks.workunit.client.0.vm03.stdout:0/288: rename d0/d7/d48/d32/d37 to d0/da/d5c 0 2026-03-09T16:14:25.512 INFO:tasks.workunit.client.0.vm03.stdout:0/289: fdatasync d0/f54 0 2026-03-09T16:14:25.516 INFO:tasks.workunit.client.0.vm03.stdout:0/290: dread d0/d7/d48/f2e [4194304,4194304] 0 2026-03-09T16:14:25.520 INFO:tasks.workunit.client.0.vm03.stdout:8/283: mkdir da/d10/d28/d4f 0 2026-03-09T16:14:25.520 INFO:tasks.workunit.client.0.vm03.stdout:5/315: mknod d2/d7/de/d11/d19/d31/d35/c6f 0 2026-03-09T16:14:25.522 INFO:tasks.workunit.client.0.vm03.stdout:6/250: dwrite d9/d22/f37 [0,4194304] 0 2026-03-09T16:14:25.524 INFO:tasks.workunit.client.0.vm03.stdout:4/271: dread d5/db/d25/d31/f47 [0,4194304] 0 2026-03-09T16:14:25.530 INFO:tasks.workunit.client.0.vm03.stdout:4/272: dwrite d5/d17/d44/f4a [0,4194304] 0 2026-03-09T16:14:25.534 INFO:tasks.workunit.client.0.vm03.stdout:0/291: mkdir d0/d7/d3e/d5d 0 2026-03-09T16:14:25.535 INFO:tasks.workunit.client.0.vm03.stdout:4/273: write d5/d17/f39 [65248,80402] 0 2026-03-09T16:14:25.546 INFO:tasks.workunit.client.0.vm03.stdout:8/284: symlink da/d10/d28/l50 0 2026-03-09T16:14:25.546 INFO:tasks.workunit.client.0.vm03.stdout:3/233: truncate d5/f10 1114656 0 2026-03-09T16:14:25.546 INFO:tasks.workunit.client.0.vm03.stdout:6/251: creat d9/d42/d45/f4a x:0 0 0 2026-03-09T16:14:25.547 INFO:tasks.workunit.client.0.vm03.stdout:4/274: rmdir d5 39 2026-03-09T16:14:25.570 INFO:tasks.workunit.client.0.vm03.stdout:4/275: rmdir d5/d17 39 2026-03-09T16:14:25.571 INFO:tasks.workunit.client.0.vm03.stdout:4/276: readlink d5/dd/d1f/l27 0 2026-03-09T16:14:25.573 INFO:tasks.workunit.client.0.vm03.stdout:5/316: getdents d2 0 2026-03-09T16:14:25.577 INFO:tasks.workunit.client.0.vm03.stdout:2/275: write db/ff [727060,4336] 0 2026-03-09T16:14:25.580 INFO:tasks.workunit.client.0.vm03.stdout:1/198: dwrite d4/db/f21 [0,4194304] 0 2026-03-09T16:14:25.599 INFO:tasks.workunit.client.0.vm03.stdout:7/213: dwrite d4/dc/f1a [0,4194304] 0 2026-03-09T16:14:25.600 INFO:tasks.workunit.client.0.vm03.stdout:8/285: link da/c49 da/d1d/c51 0 2026-03-09T16:14:25.600 INFO:tasks.workunit.client.0.vm03.stdout:8/286: stat da/db/f44 0 2026-03-09T16:14:25.600 INFO:tasks.workunit.client.0.vm03.stdout:0/292: getdents d0/d7/d3e/d57 0 2026-03-09T16:14:25.600 INFO:tasks.workunit.client.0.vm03.stdout:5/317: symlink d2/d7/d8/d16/l70 0 2026-03-09T16:14:25.600 INFO:tasks.workunit.client.0.vm03.stdout:1/199: creat d4/db/f47 x:0 0 0 2026-03-09T16:14:25.605 INFO:tasks.workunit.client.0.vm03.stdout:6/252: sync 2026-03-09T16:14:25.605 INFO:tasks.workunit.client.0.vm03.stdout:1/200: dwrite d4/d6/d1d/d20/d23/f28 [0,4194304] 0 2026-03-09T16:14:25.631 
INFO:tasks.workunit.client.0.vm03.stdout:3/234: rename d5/d1e/d42/d3a to d5/d44 0 2026-03-09T16:14:25.631 INFO:tasks.workunit.client.0.vm03.stdout:3/235: chown d5/d1e/d42 3474178 1 2026-03-09T16:14:25.635 INFO:tasks.workunit.client.0.vm03.stdout:3/236: dread d5/d1e/d42/f1d [0,4194304] 0 2026-03-09T16:14:25.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:25 vm03.local ceph-mon[51019]: from='client.14614 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:25.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:25 vm03.local ceph-mon[51019]: [09/Mar/2026:16:14:23] ENGINE Serving on http://192.168.123.105:8765 2026-03-09T16:14:25.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:25 vm03.local ceph-mon[51019]: [09/Mar/2026:16:14:23] ENGINE Serving on https://192.168.123.105:7150 2026-03-09T16:14:25.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:25 vm03.local ceph-mon[51019]: [09/Mar/2026:16:14:23] ENGINE Bus STARTED 2026-03-09T16:14:25.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:25 vm03.local ceph-mon[51019]: [09/Mar/2026:16:14:23] ENGINE Client ('192.168.123.105', 33516) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T16:14:25.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:25 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:25.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:25 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:25.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:25 vm03.local ceph-mon[51019]: from='client.14622 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:25.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:25 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:25.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:25 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:25.650 INFO:tasks.workunit.client.0.vm03.stdout:9/313: dwrite d2/de/f1c [0,4194304] 0 2026-03-09T16:14:25.657 INFO:tasks.workunit.client.0.vm03.stdout:4/277: symlink d5/d17/d44/l50 0 2026-03-09T16:14:25.668 INFO:tasks.workunit.client.0.vm03.stdout:5/318: creat d2/d7/d8/d16/d5c/f71 x:0 0 0 2026-03-09T16:14:25.668 INFO:tasks.workunit.client.0.vm03.stdout:2/276: readlink db/d12/d2a/l4f 0 2026-03-09T16:14:25.679 INFO:tasks.workunit.client.0.vm03.stdout:7/214: mknod d4/da/d18/d22/d24/d16/d3e/c47 0 2026-03-09T16:14:25.687 INFO:tasks.workunit.client.0.vm03.stdout:3/237: rmdir d5/d44/d3c 39 2026-03-09T16:14:25.689 INFO:tasks.workunit.client.0.vm03.stdout:3/238: write d5/f16 [1669913,41822] 0 2026-03-09T16:14:25.689 INFO:tasks.workunit.client.0.vm03.stdout:3/239: fdatasync d5/f33 0 2026-03-09T16:14:25.689 INFO:tasks.workunit.client.0.vm03.stdout:3/240: chown d5/c2f 386 1 2026-03-09T16:14:25.695 INFO:tasks.workunit.client.0.vm03.stdout:7/215: dread d4/da/f42 [0,4194304] 0 2026-03-09T16:14:25.704 INFO:tasks.workunit.client.0.vm03.stdout:1/201: write d4/fa [546849,14294] 0 2026-03-09T16:14:25.707 INFO:tasks.workunit.client.0.vm03.stdout:4/278: mkdir d5/db/d25/d31/d51 0 2026-03-09T16:14:25.713 INFO:tasks.workunit.client.0.vm03.stdout:0/293: creat d0/d7/d3e/d45/f5e x:0 0 0 2026-03-09T16:14:25.713 INFO:tasks.workunit.client.0.vm03.stdout:0/294: chown d0/d7/d3e/f4f 
94 1 2026-03-09T16:14:25.714 INFO:tasks.workunit.client.0.vm03.stdout:0/295: write d0/d7/d3e/d57/d5a/f38 [838999,12092] 0 2026-03-09T16:14:25.736 INFO:tasks.workunit.client.0.vm03.stdout:3/241: symlink d5/d1e/l45 0 2026-03-09T16:14:25.737 INFO:tasks.workunit.client.0.vm03.stdout:3/242: read - d5/f43 zero size 2026-03-09T16:14:25.737 INFO:tasks.workunit.client.0.vm03.stdout:3/243: chown d5/d1e/d42/c32 368 1 2026-03-09T16:14:25.738 INFO:tasks.workunit.client.0.vm03.stdout:3/244: chown d5/f43 374187 1 2026-03-09T16:14:25.740 INFO:tasks.workunit.client.0.vm03.stdout:3/245: chown d5/d1e/d42/c1c 982 1 2026-03-09T16:14:25.742 INFO:tasks.workunit.client.0.vm03.stdout:9/314: creat d2/d54/f5e x:0 0 0 2026-03-09T16:14:25.745 INFO:tasks.workunit.client.0.vm03.stdout:7/216: fdatasync d4/da/f42 0 2026-03-09T16:14:25.745 INFO:tasks.workunit.client.0.vm03.stdout:7/217: write d4/da/d18/d22/d24/d15/f2a [1337702,45551] 0 2026-03-09T16:14:25.755 INFO:tasks.workunit.client.0.vm03.stdout:1/202: dread d4/db/f2e [0,4194304] 0 2026-03-09T16:14:25.756 INFO:tasks.workunit.client.0.vm03.stdout:4/279: rename d5/f3c to d5/d40/d46/f52 0 2026-03-09T16:14:25.762 INFO:tasks.workunit.client.0.vm03.stdout:8/287: creat da/f52 x:0 0 0 2026-03-09T16:14:25.765 INFO:tasks.workunit.client.0.vm03.stdout:8/288: write da/d1d/f4a [4093854,58029] 0 2026-03-09T16:14:25.778 INFO:tasks.workunit.client.0.vm03.stdout:0/296: mkdir d0/d7/d3e/d57/d5a/d5f 0 2026-03-09T16:14:25.789 INFO:tasks.workunit.client.0.vm03.stdout:0/297: sync 2026-03-09T16:14:25.799 INFO:tasks.workunit.client.0.vm03.stdout:5/319: rename d2/d7/d8/d16/d5c/c6b to d2/d7/d8/d16/d5c/c72 0 2026-03-09T16:14:25.800 INFO:tasks.workunit.client.0.vm03.stdout:6/253: rmdir d9/d42/d45/d47/d48 0 2026-03-09T16:14:25.800 INFO:tasks.workunit.client.0.vm03.stdout:6/254: chown d9/d14/f31 7592 1 2026-03-09T16:14:25.806 INFO:tasks.workunit.client.0.vm03.stdout:3/246: mknod d5/d1e/c46 0 2026-03-09T16:14:25.832 INFO:tasks.workunit.client.0.vm03.stdout:9/315: write d2/d4/d11/f41 [1819758,48747] 0 2026-03-09T16:14:25.846 INFO:tasks.workunit.client.0.vm03.stdout:9/316: dwrite d2/f7 [0,4194304] 0 2026-03-09T16:14:25.846 INFO:tasks.workunit.client.0.vm03.stdout:1/203: creat d4/d6/d1d/d20/d23/d3e/d3f/f48 x:0 0 0 2026-03-09T16:14:25.846 INFO:tasks.workunit.client.0.vm03.stdout:9/317: read - d2/f5a zero size 2026-03-09T16:14:25.849 INFO:tasks.workunit.client.0.vm03.stdout:4/280: fsync d5/d40/d46/f49 0 2026-03-09T16:14:25.858 INFO:tasks.workunit.client.0.vm03.stdout:8/289: rmdir da/db 39 2026-03-09T16:14:25.858 INFO:tasks.workunit.client.0.vm03.stdout:2/277: mknod db/d59/d52/c6c 0 2026-03-09T16:14:25.865 INFO:tasks.workunit.client.0.vm03.stdout:6/255: truncate f7 446320 0 2026-03-09T16:14:25.865 INFO:tasks.workunit.client.0.vm03.stdout:6/256: dread - d9/d42/d45/f4a zero size 2026-03-09T16:14:25.865 INFO:tasks.workunit.client.0.vm03.stdout:6/257: readlink d9/d22/l26 0 2026-03-09T16:14:25.889 INFO:tasks.workunit.client.0.vm03.stdout:9/318: unlink d2/d4/d11/d29/d2a/d38/f49 0 2026-03-09T16:14:25.889 INFO:tasks.workunit.client.0.vm03.stdout:4/281: unlink d5/dd/f4b 0 2026-03-09T16:14:25.890 INFO:tasks.workunit.client.0.vm03.stdout:7/218: truncate d4/dc/f1a 3622466 0 2026-03-09T16:14:25.891 INFO:tasks.workunit.client.0.vm03.stdout:7/219: write d4/d2d/f32 [740707,119408] 0 2026-03-09T16:14:25.898 INFO:tasks.workunit.client.0.vm03.stdout:9/319: sync 2026-03-09T16:14:25.899 INFO:tasks.workunit.client.0.vm03.stdout:9/320: sync 2026-03-09T16:14:25.901 INFO:tasks.workunit.client.0.vm03.stdout:6/258: rename d9/ce to 
d9/d22/c4b 0 2026-03-09T16:14:25.901 INFO:tasks.workunit.client.0.vm03.stdout:3/247: truncate d5/d1e/d42/f20 796116 0 2026-03-09T16:14:25.902 INFO:tasks.workunit.client.0.vm03.stdout:9/321: stat d2/d4/d11/d29/d2a/f4a 0 2026-03-09T16:14:25.905 INFO:tasks.workunit.client.0.vm03.stdout:2/278: dwrite db/d59/d64/d68/f6b [0,4194304] 0 2026-03-09T16:14:25.914 INFO:tasks.workunit.client.0.vm03.stdout:3/248: dwrite d5/d1e/f26 [0,4194304] 0 2026-03-09T16:14:25.916 INFO:tasks.workunit.client.0.vm03.stdout:3/249: write d5/d1e/d42/f25 [1139908,50912] 0 2026-03-09T16:14:25.916 INFO:tasks.workunit.client.0.vm03.stdout:2/279: dread db/d59/d64/d68/f6b [0,4194304] 0 2026-03-09T16:14:25.919 INFO:tasks.workunit.client.0.vm03.stdout:2/280: read - db/d12/f62 zero size 2026-03-09T16:14:25.927 INFO:tasks.workunit.client.0.vm03.stdout:2/281: dwrite db/d12/f62 [0,4194304] 0 2026-03-09T16:14:25.928 INFO:tasks.workunit.client.0.vm03.stdout:1/204: unlink f2 0 2026-03-09T16:14:25.950 INFO:tasks.workunit.client.0.vm03.stdout:4/282: mknod d5/d17/c53 0 2026-03-09T16:14:25.954 INFO:tasks.workunit.client.0.vm03.stdout:7/220: creat d4/da/d18/d22/f48 x:0 0 0 2026-03-09T16:14:25.955 INFO:tasks.workunit.client.0.vm03.stdout:7/221: dread - d4/da/d18/d22/d24/f41 zero size 2026-03-09T16:14:25.955 INFO:tasks.workunit.client.0.vm03.stdout:7/222: write d4/da/d18/d22/d24/f41 [495499,57966] 0 2026-03-09T16:14:25.956 INFO:tasks.workunit.client.0.vm03.stdout:0/298: creat d0/f60 x:0 0 0 2026-03-09T16:14:25.958 INFO:tasks.workunit.client.0.vm03.stdout:7/223: read d4/da/d18/d22/f33 [825354,44313] 0 2026-03-09T16:14:25.963 INFO:tasks.workunit.client.0.vm03.stdout:9/322: mkdir d2/df/d5f 0 2026-03-09T16:14:25.977 INFO:tasks.workunit.client.0.vm03.stdout:2/282: mkdir db/d12/d2a/d61/d6d 0 2026-03-09T16:14:25.980 INFO:tasks.workunit.client.0.vm03.stdout:4/283: sync 2026-03-09T16:14:25.981 INFO:tasks.workunit.client.0.vm03.stdout:5/320: getdents d2/d7/de/d11/d19 0 2026-03-09T16:14:25.982 INFO:tasks.workunit.client.0.vm03.stdout:5/321: readlink d2/l14 0 2026-03-09T16:14:25.984 INFO:tasks.workunit.client.0.vm03.stdout:4/284: dread d5/d17/d44/f4a [0,4194304] 0 2026-03-09T16:14:25.987 INFO:tasks.workunit.client.0.vm03.stdout:9/323: symlink d2/d4/d11/d12/d28/l60 0 2026-03-09T16:14:25.987 INFO:tasks.workunit.client.0.vm03.stdout:9/324: readlink d2/df/l39 0 2026-03-09T16:14:25.988 INFO:tasks.workunit.client.0.vm03.stdout:9/325: fsync d2/d4/d1f/f5b 0 2026-03-09T16:14:25.989 INFO:tasks.workunit.client.0.vm03.stdout:3/250: fsync d5/d1e/d42/f25 0 2026-03-09T16:14:25.992 INFO:tasks.workunit.client.0.vm03.stdout:2/283: dread db/f34 [0,4194304] 0 2026-03-09T16:14:25.998 INFO:tasks.workunit.client.0.vm03.stdout:1/205: unlink d4/ce 0 2026-03-09T16:14:25.998 INFO:tasks.workunit.client.0.vm03.stdout:1/206: fsync d4/d6/f15 0 2026-03-09T16:14:26.002 INFO:tasks.workunit.client.0.vm03.stdout:1/207: dwrite d4/d6/d1d/f40 [0,4194304] 0 2026-03-09T16:14:26.023 INFO:tasks.workunit.client.0.vm03.stdout:0/299: creat d0/d7/d3e/d5d/f61 x:0 0 0 2026-03-09T16:14:26.025 INFO:tasks.workunit.client.0.vm03.stdout:7/224: symlink d4/da/d45/l49 0 2026-03-09T16:14:26.028 INFO:tasks.workunit.client.0.vm03.stdout:4/285: unlink d5/db/c2a 0 2026-03-09T16:14:26.029 INFO:tasks.workunit.client.0.vm03.stdout:3/251: symlink d5/d1e/d42/d34/l47 0 2026-03-09T16:14:26.029 INFO:tasks.workunit.client.0.vm03.stdout:3/252: stat d5/d1e/d42/f1d 0 2026-03-09T16:14:26.036 INFO:tasks.workunit.client.0.vm03.stdout:0/300: sync 2026-03-09T16:14:26.037 INFO:tasks.workunit.client.0.vm03.stdout:0/301: fsync 
d0/d7/d3e/d57/d5a/f38 0 2026-03-09T16:14:26.037 INFO:tasks.workunit.client.0.vm03.stdout:0/302: write d0/da/d5c/f41 [685104,115412] 0 2026-03-09T16:14:26.044 INFO:tasks.workunit.client.0.vm03.stdout:9/326: dread d2/df/f10 [0,4194304] 0 2026-03-09T16:14:26.053 INFO:tasks.workunit.client.0.vm03.stdout:1/208: creat d4/d6/d1d/d3d/f49 x:0 0 0 2026-03-09T16:14:26.065 INFO:tasks.workunit.client.0.vm03.stdout:8/290: rename da/d15/f40 to da/db/f53 0 2026-03-09T16:14:26.071 INFO:tasks.workunit.client.0.vm03.stdout:6/259: getdents d9/d42 0 2026-03-09T16:14:26.086 INFO:tasks.workunit.client.0.vm03.stdout:3/253: dwrite d5/d1e/d42/f1d [0,4194304] 0 2026-03-09T16:14:26.090 INFO:tasks.workunit.client.0.vm03.stdout:0/303: creat d0/d7/d48/d32/f62 x:0 0 0 2026-03-09T16:14:26.090 INFO:tasks.workunit.client.0.vm03.stdout:0/304: fsync d0/f4e 0 2026-03-09T16:14:26.105 INFO:tasks.workunit.client.0.vm03.stdout:9/327: rmdir d2 39 2026-03-09T16:14:26.107 INFO:tasks.workunit.client.0.vm03.stdout:5/322: rename d2/d7/d8/d24/l60 to d2/d7/de/d11/d38/d52/l73 0 2026-03-09T16:14:26.109 INFO:tasks.workunit.client.0.vm03.stdout:8/291: symlink da/d10/l54 0 2026-03-09T16:14:26.111 INFO:tasks.workunit.client.0.vm03.stdout:7/225: symlink d4/da/l4a 0 2026-03-09T16:14:26.112 INFO:tasks.workunit.client.0.vm03.stdout:7/226: write d4/da/d18/d22/d24/d16/f39 [266274,67396] 0 2026-03-09T16:14:26.113 INFO:tasks.workunit.client.0.vm03.stdout:7/227: write d4/da/d18/d22/d24/d16/f39 [133300,127042] 0 2026-03-09T16:14:26.115 INFO:tasks.workunit.client.0.vm03.stdout:7/228: read d4/da/f20 [3769531,78635] 0 2026-03-09T16:14:26.115 INFO:tasks.workunit.client.0.vm03.stdout:7/229: write d4/f3b [4761692,83460] 0 2026-03-09T16:14:26.128 INFO:tasks.workunit.client.0.vm03.stdout:5/323: fdatasync d2/d7/d1a/f4d 0 2026-03-09T16:14:26.132 INFO:tasks.workunit.client.0.vm03.stdout:8/292: fdatasync da/db/d30/f36 0 2026-03-09T16:14:26.136 INFO:tasks.workunit.client.0.vm03.stdout:6/260: mknod d9/c4c 0 2026-03-09T16:14:26.145 INFO:tasks.workunit.client.0.vm03.stdout:7/230: mkdir d4/d2d/d4b 0 2026-03-09T16:14:26.148 INFO:tasks.workunit.client.0.vm03.stdout:3/254: mknod d5/d44/c48 0 2026-03-09T16:14:26.149 INFO:tasks.workunit.client.0.vm03.stdout:2/284: link db/d12/d2a/d61/c28 db/d12/d2a/c6e 0 2026-03-09T16:14:26.156 INFO:tasks.workunit.client.0.vm03.stdout:9/328: mknod d2/d4/d11/d29/d2a/d38/c61 0 2026-03-09T16:14:26.157 INFO:tasks.workunit.client.0.vm03.stdout:9/329: write d2/d4/d1f/f23 [3620600,75122] 0 2026-03-09T16:14:26.157 INFO:tasks.workunit.client.0.vm03.stdout:9/330: chown d2/d4/d11/d12/f3d 1943534 1 2026-03-09T16:14:26.161 INFO:tasks.workunit.client.0.vm03.stdout:2/285: sync 2026-03-09T16:14:26.161 INFO:tasks.workunit.client.0.vm03.stdout:2/286: chown db/d12/d2a/f5f 49826 1 2026-03-09T16:14:26.165 INFO:tasks.workunit.client.0.vm03.stdout:2/287: dwrite db/d12/d2a/d61/f47 [0,4194304] 0 2026-03-09T16:14:26.176 INFO:tasks.workunit.client.0.vm03.stdout:8/293: rename da/db/c19 to da/db/d43/c55 0 2026-03-09T16:14:26.185 INFO:tasks.workunit.client.0.vm03.stdout:4/286: link d5/dd/d1f/f2d d5/f54 0 2026-03-09T16:14:26.185 INFO:tasks.workunit.client.0.vm03.stdout:3/255: unlink d5/d1e/d42/l1a 0 2026-03-09T16:14:26.188 INFO:tasks.workunit.client.0.vm03.stdout:0/305: creat d0/d7/d48/f63 x:0 0 0 2026-03-09T16:14:26.192 INFO:tasks.workunit.client.0.vm03.stdout:0/306: dread d0/d7/f3d [0,4194304] 0 2026-03-09T16:14:26.197 INFO:tasks.workunit.client.0.vm03.stdout:2/288: symlink db/d12/d2a/d61/l6f 0 2026-03-09T16:14:26.201 INFO:tasks.workunit.client.0.vm03.stdout:1/209: mkdir 
d4/d6/d3b/d4a 0 2026-03-09T16:14:26.210 INFO:tasks.workunit.client.0.vm03.stdout:8/294: creat da/d32/f56 x:0 0 0 2026-03-09T16:14:26.210 INFO:tasks.workunit.client.0.vm03.stdout:8/295: chown da/d1d 4028004 1 2026-03-09T16:14:26.216 INFO:tasks.workunit.client.0.vm03.stdout:5/324: dwrite d2/d7/de/d11/f26 [0,4194304] 0 2026-03-09T16:14:26.220 INFO:tasks.workunit.client.0.vm03.stdout:7/231: link d4/da/d18/d22/f48 d4/d2d/d4b/f4c 0 2026-03-09T16:14:26.223 INFO:tasks.workunit.client.0.vm03.stdout:7/232: dread d4/f3b [0,4194304] 0 2026-03-09T16:14:26.226 INFO:tasks.workunit.client.0.vm03.stdout:6/261: truncate d9/d22/f3f 386509 0 2026-03-09T16:14:26.232 INFO:tasks.workunit.client.0.vm03.stdout:4/287: rename d5/db/d25/d31/d51 to d5/db/d25/d31/d33/d55 0 2026-03-09T16:14:26.245 INFO:tasks.workunit.client.0.vm03.stdout:2/289: dwrite db/d12/d2a/d61/f45 [0,4194304] 0 2026-03-09T16:14:26.246 INFO:tasks.workunit.client.0.vm03.stdout:2/290: dread - db/d12/d2a/f58 zero size 2026-03-09T16:14:26.272 INFO:tasks.workunit.client.0.vm03.stdout:3/256: dwrite d5/d1e/d42/f29 [0,4194304] 0 2026-03-09T16:14:26.276 INFO:tasks.workunit.client.0.vm03.stdout:3/257: dwrite d5/fb [0,4194304] 0 2026-03-09T16:14:26.283 INFO:tasks.workunit.client.0.vm03.stdout:8/296: creat da/d10/d28/f57 x:0 0 0 2026-03-09T16:14:26.296 INFO:tasks.workunit.client.0.vm03.stdout:7/233: truncate d4/da/d18/d22/f33 1279864 0 2026-03-09T16:14:26.301 INFO:tasks.workunit.client.0.vm03.stdout:6/262: rmdir d9/d42/d45 39 2026-03-09T16:14:26.317 INFO:tasks.workunit.client.0.vm03.stdout:9/331: link d2/df/c3f d2/d4/d1f/c62 0 2026-03-09T16:14:26.320 INFO:tasks.workunit.client.0.vm03.stdout:0/307: truncate d0/da/d5c/f39 655046 0 2026-03-09T16:14:26.327 INFO:tasks.workunit.client.0.vm03.stdout:2/291: unlink db/ff 0 2026-03-09T16:14:26.333 INFO:tasks.workunit.client.0.vm03.stdout:1/210: symlink d4/db/l4b 0 2026-03-09T16:14:26.339 INFO:tasks.workunit.client.0.vm03.stdout:8/297: creat da/d15/f58 x:0 0 0 2026-03-09T16:14:26.356 INFO:tasks.workunit.client.0.vm03.stdout:5/325: truncate d2/d7/de/d11/f32 984831 0 2026-03-09T16:14:26.356 INFO:tasks.workunit.client.0.vm03.stdout:5/326: fdatasync d2/d7/d3c/d3d/f56 0 2026-03-09T16:14:26.360 INFO:tasks.workunit.client.0.vm03.stdout:7/234: rename d4/l11 to d4/da/d18/d22/d24/d16/d3e/l4d 0 2026-03-09T16:14:26.365 INFO:tasks.workunit.client.0.vm03.stdout:6/263: truncate d9/d14/f29 4726599 0 2026-03-09T16:14:26.372 INFO:tasks.workunit.client.0.vm03.stdout:9/332: mkdir d2/d4/d11/d29/d63 0 2026-03-09T16:14:26.372 INFO:tasks.workunit.client.0.vm03.stdout:9/333: read - d2/d4/d1f/f5b zero size 2026-03-09T16:14:26.378 INFO:tasks.workunit.client.0.vm03.stdout:4/288: dwrite f1 [0,4194304] 0 2026-03-09T16:14:26.382 INFO:tasks.workunit.client.0.vm03.stdout:6/264: sync 2026-03-09T16:14:26.388 INFO:tasks.workunit.client.0.vm03.stdout:0/308: mknod d0/da/d5c/c64 0 2026-03-09T16:14:26.389 INFO:tasks.workunit.client.0.vm03.stdout:0/309: dread d0/d7/f8 [0,4194304] 0 2026-03-09T16:14:26.397 INFO:tasks.workunit.client.0.vm03.stdout:2/292: rmdir db/d12/d2a 39 2026-03-09T16:14:26.416 INFO:tasks.workunit.client.0.vm03.stdout:1/211: symlink d4/db/l4c 0 2026-03-09T16:14:26.416 INFO:tasks.workunit.client.0.vm03.stdout:1/212: truncate d4/d6/d1d/d3d/f49 314985 0 2026-03-09T16:14:26.419 INFO:tasks.workunit.client.0.vm03.stdout:3/258: symlink d5/l49 0 2026-03-09T16:14:26.422 INFO:tasks.workunit.client.0.vm03.stdout:3/259: dread d5/d1e/d42/f25 [0,4194304] 0 2026-03-09T16:14:26.422 INFO:tasks.workunit.client.0.vm03.stdout:3/260: chown d5/f43 3 1 
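Note: the tasks.workunit.client.0 lines above appear to be fsstress-style operation traces, where each entry reads "proc/op: operation path [args] status"; a trailing 0 means the call succeeded and a non-zero value looks like the Linux errno it returned (e.g. "rmdir da/db 39" = ENOTEMPTY, the "chown ... 1" entries = EPERM when run unprivileged, and the later "rename ... 22" entries = EINVAL). A minimal, purely illustrative Python sketch for decoding those trailing codes with the standard errno module (not part of the test itself):

    import errno, os

    # Trailing status codes seen in the workunit trace (0 = success,
    # otherwise assumed to be a standard Linux errno number).
    for code in (1, 22, 39):
        print(code, errno.errorcode.get(code, "UNKNOWN"), os.strerror(code))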
2026-03-09T16:14:26.426 INFO:tasks.workunit.client.0.vm03.stdout:8/298: fsync da/d10/f14 0 2026-03-09T16:14:26.427 INFO:tasks.workunit.client.0.vm03.stdout:8/299: write da/d10/d28/f2c [3278248,19039] 0 2026-03-09T16:14:26.442 INFO:tasks.workunit.client.0.vm03.stdout:5/327: symlink d2/d7/d8/d24/d27/d43/l74 0 2026-03-09T16:14:26.447 INFO:tasks.workunit.client.0.vm03.stdout:7/235: dwrite d4/da/d18/f44 [4194304,4194304] 0 2026-03-09T16:14:26.455 INFO:tasks.workunit.client.0.vm03.stdout:7/236: dwrite d4/da/d18/d22/d24/d16/f39 [0,4194304] 0 2026-03-09T16:14:26.477 INFO:tasks.workunit.client.0.vm03.stdout:0/310: symlink d0/d7/d3e/d45/l65 0 2026-03-09T16:14:26.480 INFO:tasks.workunit.client.0.vm03.stdout:2/293: symlink db/d59/l70 0 2026-03-09T16:14:26.481 INFO:tasks.workunit.client.0.vm03.stdout:2/294: dread db/d59/f53 [0,4194304] 0 2026-03-09T16:14:26.485 INFO:tasks.workunit.client.0.vm03.stdout:1/213: mknod d4/d6/d1d/d20/d23/d3e/c4d 0 2026-03-09T16:14:26.486 INFO:tasks.workunit.client.0.vm03.stdout:1/214: write d4/d6/d1d/d3d/f45 [515785,35243] 0 2026-03-09T16:14:26.488 INFO:tasks.workunit.client.0.vm03.stdout:3/261: dread d5/f11 [0,4194304] 0 2026-03-09T16:14:26.489 INFO:tasks.workunit.client.0.vm03.stdout:3/262: fsync d5/d1e/d42/f2c 0 2026-03-09T16:14:26.501 INFO:tasks.workunit.client.0.vm03.stdout:5/328: mkdir d2/d75 0 2026-03-09T16:14:26.508 INFO:tasks.workunit.client.0.vm03.stdout:7/237: write d4/f3b [4522478,71708] 0 2026-03-09T16:14:26.508 INFO:tasks.workunit.client.0.vm03.stdout:7/238: fsync d4/da/f42 0 2026-03-09T16:14:26.518 INFO:tasks.workunit.client.0.vm03.stdout:9/334: dwrite d2/df/f22 [4194304,4194304] 0 2026-03-09T16:14:26.522 INFO:tasks.workunit.client.0.vm03.stdout:4/289: mkdir d5/d56 0 2026-03-09T16:14:26.532 INFO:tasks.workunit.client.0.vm03.stdout:6/265: creat d9/d42/d45/f4d x:0 0 0 2026-03-09T16:14:26.539 INFO:tasks.workunit.client.0.vm03.stdout:6/266: readlink d9/d22/l19 0 2026-03-09T16:14:26.542 INFO:tasks.workunit.client.0.vm03.stdout:0/311: creat d0/da/d5c/f66 x:0 0 0 2026-03-09T16:14:26.542 INFO:tasks.workunit.client.0.vm03.stdout:6/267: write d9/d22/f1c [2598332,38600] 0 2026-03-09T16:14:26.545 INFO:tasks.workunit.client.0.vm03.stdout:0/312: truncate d0/d7/d3e/d57/d5a/f4b 570288 0 2026-03-09T16:14:26.547 INFO:tasks.workunit.client.0.vm03.stdout:9/335: dread d2/d4/d11/d29/d2a/d4d/f53 [0,4194304] 0 2026-03-09T16:14:26.548 INFO:tasks.workunit.client.0.vm03.stdout:9/336: read - d2/d4/d1f/f51 zero size 2026-03-09T16:14:26.550 INFO:tasks.workunit.client.0.vm03.stdout:9/337: write d2/d4/d11/d29/f5d [3090154,21090] 0 2026-03-09T16:14:26.582 INFO:tasks.workunit.client.0.vm03.stdout:5/329: symlink d2/d7/de/d54/l76 0 2026-03-09T16:14:26.583 INFO:tasks.workunit.client.0.vm03.stdout:5/330: fsync d2/d7/d8/d16/d5c/f64 0 2026-03-09T16:14:26.583 INFO:tasks.workunit.client.0.vm03.stdout:5/331: readlink d2/d7/de/d54/l76 0 2026-03-09T16:14:26.588 INFO:tasks.workunit.client.0.vm03.stdout:7/239: creat d4/da/d45/f4e x:0 0 0 2026-03-09T16:14:26.594 INFO:tasks.workunit.client.0.vm03.stdout:4/290: unlink d5/d40/d46/f52 0 2026-03-09T16:14:26.602 INFO:tasks.workunit.client.0.vm03.stdout:6/268: rename d9/d14/f28 to d9/d22/f4e 0 2026-03-09T16:14:26.604 INFO:tasks.workunit.client.0.vm03.stdout:0/313: symlink d0/d7/d3e/d45/l67 0 2026-03-09T16:14:26.610 INFO:tasks.workunit.client.0.vm03.stdout:4/291: dread d5/f7 [0,4194304] 0 2026-03-09T16:14:26.610 INFO:tasks.workunit.client.0.vm03.stdout:4/292: chown d5/db/d25/d31/d33 438094053 1 2026-03-09T16:14:26.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 
09 16:14:26 vm03.local ceph-mon[51019]: pgmap v5: 65 pgs: 65 active+clean; 541 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:14:26.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:26 vm03.local ceph-mon[51019]: mgrmap e23: vm05.dygxfv(active, since 4s) 2026-03-09T16:14:26.623 INFO:tasks.workunit.client.0.vm03.stdout:9/338: creat d2/df/f64 x:0 0 0 2026-03-09T16:14:26.634 INFO:tasks.workunit.client.0.vm03.stdout:3/263: fdatasync d5/f10 0 2026-03-09T16:14:26.637 INFO:tasks.workunit.client.0.vm03.stdout:3/264: dwrite d5/d1e/d42/f25 [0,4194304] 0 2026-03-09T16:14:26.639 INFO:tasks.workunit.client.0.vm03.stdout:3/265: dread - d5/d1e/d42/f2c zero size 2026-03-09T16:14:26.656 INFO:tasks.workunit.client.0.vm03.stdout:5/332: creat d2/d7/de/d11/d19/d29/f77 x:0 0 0 2026-03-09T16:14:26.658 INFO:tasks.workunit.client.0.vm03.stdout:7/240: mknod d4/da/d18/d22/d24/d16/d2b/c4f 0 2026-03-09T16:14:26.659 INFO:tasks.workunit.client.0.vm03.stdout:7/241: write d4/f3b [3265017,92945] 0 2026-03-09T16:14:26.661 INFO:tasks.workunit.client.0.vm03.stdout:5/333: readlink d2/d7/d8/d24/d27/d43/l74 0 2026-03-09T16:14:26.669 INFO:tasks.workunit.client.0.vm03.stdout:7/242: dwrite d4/da/d18/d22/d24/f2f [4194304,4194304] 0 2026-03-09T16:14:26.673 INFO:tasks.workunit.client.0.vm03.stdout:5/334: write d2/d7/d1a/d1c/f5e [701823,118913] 0 2026-03-09T16:14:26.681 INFO:tasks.workunit.client.0.vm03.stdout:7/243: dwrite d4/f8 [0,4194304] 0 2026-03-09T16:14:26.694 INFO:tasks.workunit.client.0.vm03.stdout:5/335: truncate d2/d7/de/d11/d19/d31/f42 6568647 0 2026-03-09T16:14:26.704 INFO:tasks.workunit.client.0.vm03.stdout:4/293: read d5/db/d25/f26 [648087,3215] 0 2026-03-09T16:14:26.706 INFO:tasks.workunit.client.0.vm03.stdout:9/339: mknod d2/d4/d11/d29/d2a/d4d/c65 0 2026-03-09T16:14:26.714 INFO:tasks.workunit.client.0.vm03.stdout:3/266: mknod d5/d2e/c4a 0 2026-03-09T16:14:26.714 INFO:tasks.workunit.client.0.vm03.stdout:8/300: link da/db/d30/c48 da/d15/c59 0 2026-03-09T16:14:26.720 INFO:tasks.workunit.client.0.vm03.stdout:0/314: write d0/da/d1b/f46 [683734,13601] 0 2026-03-09T16:14:26.733 INFO:tasks.workunit.client.0.vm03.stdout:7/244: creat d4/da/d19/f50 x:0 0 0 2026-03-09T16:14:26.744 INFO:tasks.workunit.client.0.vm03.stdout:2/295: link db/l33 db/d59/d64/l71 0 2026-03-09T16:14:26.749 INFO:tasks.workunit.client.0.vm03.stdout:6/269: dwrite d9/ff [4194304,4194304] 0 2026-03-09T16:14:26.753 INFO:tasks.workunit.client.0.vm03.stdout:4/294: dwrite d5/db/f34 [0,4194304] 0 2026-03-09T16:14:26.756 INFO:tasks.workunit.client.0.vm03.stdout:9/340: rmdir d2 39 2026-03-09T16:14:26.764 INFO:tasks.workunit.client.0.vm03.stdout:1/215: link d4/d6/f9 d4/d6/d1d/d24/d25/f4e 0 2026-03-09T16:14:26.765 INFO:tasks.workunit.client.0.vm03.stdout:1/216: read d4/d6/d1d/d20/f2a [666521,48850] 0 2026-03-09T16:14:26.766 INFO:tasks.workunit.client.0.vm03.stdout:1/217: write d4/db/f47 [223995,4157] 0 2026-03-09T16:14:26.775 INFO:tasks.workunit.client.0.vm03.stdout:3/267: mknod d5/d2e/c4b 0 2026-03-09T16:14:26.794 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:26 vm05.local ceph-mon[58702]: pgmap v5: 65 pgs: 65 active+clean; 541 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:14:26.794 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:26 vm05.local ceph-mon[58702]: mgrmap e23: vm05.dygxfv(active, since 4s) 2026-03-09T16:14:26.794 INFO:tasks.workunit.client.0.vm03.stdout:3/268: readlink d5/l21 0 2026-03-09T16:14:26.794 INFO:tasks.workunit.client.0.vm03.stdout:8/301: truncate da/d10/f1f 3969666 0 2026-03-09T16:14:26.794 
INFO:tasks.workunit.client.0.vm03.stdout:7/245: rename d4/da/d19 to d4/da/d45/d51 0 2026-03-09T16:14:26.794 INFO:tasks.workunit.client.0.vm03.stdout:7/246: dwrite d4/da/d18/f37 [0,4194304] 0 2026-03-09T16:14:26.794 INFO:tasks.workunit.client.0.vm03.stdout:2/296: write db/d12/d2a/f38 [225342,114573] 0 2026-03-09T16:14:26.801 INFO:tasks.workunit.client.0.vm03.stdout:1/218: fdatasync d4/d6/d1d/d24/d25/f4e 0 2026-03-09T16:14:26.804 INFO:tasks.workunit.client.0.vm03.stdout:3/269: mkdir d5/d1e/d42/d4c 0 2026-03-09T16:14:26.804 INFO:tasks.workunit.client.0.vm03.stdout:8/302: mknod da/d1d/d3b/c5a 0 2026-03-09T16:14:26.809 INFO:tasks.workunit.client.0.vm03.stdout:6/270: symlink d9/d42/l4f 0 2026-03-09T16:14:26.809 INFO:tasks.workunit.client.0.vm03.stdout:6/271: fdatasync d9/f20 0 2026-03-09T16:14:26.812 INFO:tasks.workunit.client.0.vm03.stdout:3/270: mknod d5/d2e/c4d 0 2026-03-09T16:14:26.813 INFO:tasks.workunit.client.0.vm03.stdout:3/271: write d5/f33 [793365,94767] 0 2026-03-09T16:14:26.813 INFO:tasks.workunit.client.0.vm03.stdout:3/272: stat d5/f11 0 2026-03-09T16:14:26.815 INFO:tasks.workunit.client.0.vm03.stdout:0/315: link d0/d7/f3d d0/d7/d3e/d57/d5a/d52/f68 0 2026-03-09T16:14:26.816 INFO:tasks.workunit.client.0.vm03.stdout:0/316: chown d0/da/d5c/c64 131824286 1 2026-03-09T16:14:26.817 INFO:tasks.workunit.client.0.vm03.stdout:5/336: getdents d2/d7 0 2026-03-09T16:14:26.817 INFO:tasks.workunit.client.0.vm03.stdout:5/337: fsync d2/d7/d8/d16/d5c/f64 0 2026-03-09T16:14:26.822 INFO:tasks.workunit.client.0.vm03.stdout:6/272: mkdir d9/d42/d45/d50 0 2026-03-09T16:14:26.822 INFO:tasks.workunit.client.0.vm03.stdout:6/273: write d9/f35 [595930,87805] 0 2026-03-09T16:14:26.837 INFO:tasks.workunit.client.0.vm03.stdout:8/303: rmdir da/d1d 39 2026-03-09T16:14:26.839 INFO:tasks.workunit.client.0.vm03.stdout:4/295: link d5/d17/c53 d5/db/c57 0 2026-03-09T16:14:26.841 INFO:tasks.workunit.client.0.vm03.stdout:1/219: creat d4/d31/f4f x:0 0 0 2026-03-09T16:14:26.847 INFO:tasks.workunit.client.0.vm03.stdout:6/274: rmdir d9/d14 39 2026-03-09T16:14:26.850 INFO:tasks.workunit.client.0.vm03.stdout:6/275: dwrite d9/f33 [0,4194304] 0 2026-03-09T16:14:26.853 INFO:tasks.workunit.client.0.vm03.stdout:9/341: creat d2/d4/d11/f66 x:0 0 0 2026-03-09T16:14:26.854 INFO:tasks.workunit.client.0.vm03.stdout:9/342: chown d2/df/f14 498 1 2026-03-09T16:14:26.854 INFO:tasks.workunit.client.0.vm03.stdout:9/343: fsync d2/df/f22 0 2026-03-09T16:14:26.864 INFO:tasks.workunit.client.0.vm03.stdout:2/297: truncate db/f55 831395 0 2026-03-09T16:14:26.870 INFO:tasks.workunit.client.0.vm03.stdout:9/344: mknod d2/d4/c67 0 2026-03-09T16:14:26.879 INFO:tasks.workunit.client.0.vm03.stdout:0/317: link d0/da/ff d0/d7/d48/d32/f69 0 2026-03-09T16:14:26.879 INFO:tasks.workunit.client.0.vm03.stdout:9/345: creat d2/d4/d11/d12/f68 x:0 0 0 2026-03-09T16:14:26.879 INFO:tasks.workunit.client.0.vm03.stdout:8/304: link da/d10/d28/l50 da/d10/d28/l5b 0 2026-03-09T16:14:26.879 INFO:tasks.workunit.client.0.vm03.stdout:3/273: getdents d5/d1e/d42/d34 0 2026-03-09T16:14:26.880 INFO:tasks.workunit.client.0.vm03.stdout:0/318: mknod d0/d7/d3e/d57/d5a/d52/c6a 0 2026-03-09T16:14:26.880 INFO:tasks.workunit.client.0.vm03.stdout:0/319: dread - d0/d7/f56 zero size 2026-03-09T16:14:26.881 INFO:tasks.workunit.client.0.vm03.stdout:6/276: fsync d9/d14/f1d 0 2026-03-09T16:14:26.881 INFO:tasks.workunit.client.0.vm03.stdout:6/277: fsync d9/d22/f37 0 2026-03-09T16:14:26.888 INFO:tasks.workunit.client.0.vm03.stdout:9/346: unlink d2/d4/c67 0 2026-03-09T16:14:26.889 
INFO:tasks.workunit.client.0.vm03.stdout:9/347: dread - d2/d54/f5e zero size 2026-03-09T16:14:26.892 INFO:tasks.workunit.client.0.vm03.stdout:1/220: sync 2026-03-09T16:14:26.893 INFO:tasks.workunit.client.0.vm03.stdout:1/221: dread - d4/d31/f4f zero size 2026-03-09T16:14:26.896 INFO:tasks.workunit.client.0.vm03.stdout:9/348: dread d2/d4/d11/f41 [0,4194304] 0 2026-03-09T16:14:26.903 INFO:tasks.workunit.client.0.vm03.stdout:7/247: truncate d4/f8 1099441 0 2026-03-09T16:14:26.904 INFO:tasks.workunit.client.0.vm03.stdout:5/338: write d2/d7/de/f48 [417580,115038] 0 2026-03-09T16:14:26.912 INFO:tasks.workunit.client.0.vm03.stdout:8/305: dwrite da/d1d/f4a [0,4194304] 0 2026-03-09T16:14:26.913 INFO:tasks.workunit.client.0.vm03.stdout:8/306: fsync da/d15/f58 0 2026-03-09T16:14:26.935 INFO:tasks.workunit.client.0.vm03.stdout:4/296: link d5/dd/f1e d5/dd/d1f/f58 0 2026-03-09T16:14:26.936 INFO:tasks.workunit.client.0.vm03.stdout:2/298: creat db/d12/f72 x:0 0 0 2026-03-09T16:14:26.936 INFO:tasks.workunit.client.0.vm03.stdout:3/274: readlink d5/d44/d3c/l28 0 2026-03-09T16:14:26.941 INFO:tasks.workunit.client.0.vm03.stdout:3/275: dwrite d5/d1e/d42/f29 [0,4194304] 0 2026-03-09T16:14:26.942 INFO:tasks.workunit.client.0.vm03.stdout:3/276: chown d5/f16 596409 1 2026-03-09T16:14:26.966 INFO:tasks.workunit.client.0.vm03.stdout:1/222: mkdir d4/d6/d1d/d24/d25/d50 0 2026-03-09T16:14:26.968 INFO:tasks.workunit.client.0.vm03.stdout:1/223: dread d4/d6/d1d/f40 [0,4194304] 0 2026-03-09T16:14:26.985 INFO:tasks.workunit.client.0.vm03.stdout:7/248: rmdir d4 39 2026-03-09T16:14:26.999 INFO:tasks.workunit.client.0.vm03.stdout:5/339: creat d2/d7/de/f78 x:0 0 0 2026-03-09T16:14:27.001 INFO:tasks.workunit.client.0.vm03.stdout:8/307: creat da/d10/d28/f5c x:0 0 0 2026-03-09T16:14:27.003 INFO:tasks.workunit.client.0.vm03.stdout:8/308: chown da/d10/d28/f57 6245 1 2026-03-09T16:14:27.009 INFO:tasks.workunit.client.0.vm03.stdout:8/309: dwrite da/d15/f2f [0,4194304] 0 2026-03-09T16:14:27.024 INFO:tasks.workunit.client.0.vm03.stdout:0/320: symlink d0/l6b 0 2026-03-09T16:14:27.034 INFO:tasks.workunit.client.0.vm03.stdout:4/297: fdatasync d5/dd/f23 0 2026-03-09T16:14:27.035 INFO:tasks.workunit.client.0.vm03.stdout:6/278: creat d9/d42/d45/d50/f51 x:0 0 0 2026-03-09T16:14:27.036 INFO:tasks.workunit.client.0.vm03.stdout:6/279: write d9/d22/f27 [89316,83034] 0 2026-03-09T16:14:27.048 INFO:tasks.workunit.client.0.vm03.stdout:6/280: dread d9/ff [4194304,4194304] 0 2026-03-09T16:14:27.051 INFO:tasks.workunit.client.0.vm03.stdout:1/224: dwrite d4/d6/d1d/d20/d23/f30 [0,4194304] 0 2026-03-09T16:14:27.068 INFO:tasks.workunit.client.0.vm03.stdout:8/310: mknod da/d1d/d3b/c5d 0 2026-03-09T16:14:27.069 INFO:tasks.workunit.client.0.vm03.stdout:5/340: dread d2/d7/d1a/f4d [0,4194304] 0 2026-03-09T16:14:27.078 INFO:tasks.workunit.client.0.vm03.stdout:5/341: dread d2/d7/d1a/f4d [0,4194304] 0 2026-03-09T16:14:27.083 INFO:tasks.workunit.client.0.vm03.stdout:0/321: write d0/d7/d48/f18 [724621,71698] 0 2026-03-09T16:14:27.083 INFO:tasks.workunit.client.0.vm03.stdout:2/299: write f7 [1085757,126256] 0 2026-03-09T16:14:27.085 INFO:tasks.workunit.client.0.vm03.stdout:2/300: chown db/d12/l32 62000 1 2026-03-09T16:14:27.087 INFO:tasks.workunit.client.0.vm03.stdout:0/322: truncate d0/d7/d3e/d45/f5e 104171 0 2026-03-09T16:14:27.087 INFO:tasks.workunit.client.0.vm03.stdout:5/342: dread d2/d7/de/d11/d19/d31/f42 [0,4194304] 0 2026-03-09T16:14:27.087 INFO:tasks.workunit.client.0.vm03.stdout:0/323: fdatasync d0/da/f2c 0 2026-03-09T16:14:27.100 
INFO:tasks.workunit.client.0.vm03.stdout:5/343: read d2/d7/d1a/d1c/d3f/f67 [2604164,128286] 0 2026-03-09T16:14:27.109 INFO:tasks.workunit.client.0.vm03.stdout:6/281: chown f7 204024751 1 2026-03-09T16:14:27.110 INFO:tasks.workunit.client.0.vm03.stdout:3/277: getdents d5/d44/d3c 0 2026-03-09T16:14:27.112 INFO:tasks.workunit.client.0.vm03.stdout:9/349: link d2/d4/d11/d29/d2a/d4d/f53 d2/d54/f69 0 2026-03-09T16:14:27.112 INFO:tasks.workunit.client.0.vm03.stdout:7/249: rmdir d4/da/d18/d22/d24/d16/d2b 39 2026-03-09T16:14:27.116 INFO:tasks.workunit.client.0.vm03.stdout:6/282: dread - d9/f40 zero size 2026-03-09T16:14:27.119 INFO:tasks.workunit.client.0.vm03.stdout:5/344: dwrite d2/d7/d8/d16/d5c/f71 [0,4194304] 0 2026-03-09T16:14:27.119 INFO:tasks.workunit.client.0.vm03.stdout:8/311: creat da/d1d/f5e x:0 0 0 2026-03-09T16:14:27.143 INFO:tasks.workunit.client.0.vm03.stdout:4/298: rename d5/dd/f41 to d5/dd/d1f/f59 0 2026-03-09T16:14:27.145 INFO:tasks.workunit.client.0.vm03.stdout:2/301: rename db/d59/d64 to db/d59/d64/d68/d73 22 2026-03-09T16:14:27.145 INFO:tasks.workunit.client.0.vm03.stdout:2/302: readlink db/d12/l16 0 2026-03-09T16:14:27.172 INFO:tasks.workunit.client.0.vm03.stdout:7/250: creat d4/d2d/f52 x:0 0 0 2026-03-09T16:14:27.184 INFO:tasks.workunit.client.0.vm03.stdout:5/345: creat d2/d7/d1a/d1c/d6c/f79 x:0 0 0 2026-03-09T16:14:27.187 INFO:tasks.workunit.client.0.vm03.stdout:6/283: dread d9/d22/f3f [0,4194304] 0 2026-03-09T16:14:27.191 INFO:tasks.workunit.client.0.vm03.stdout:5/346: dread d2/d7/d1a/f4d [0,4194304] 0 2026-03-09T16:14:27.191 INFO:tasks.workunit.client.0.vm03.stdout:0/324: symlink d0/d7/d3e/d57/d5a/d5f/l6c 0 2026-03-09T16:14:27.192 INFO:tasks.workunit.client.0.vm03.stdout:5/347: stat d2/c49 0 2026-03-09T16:14:27.192 INFO:tasks.workunit.client.0.vm03.stdout:5/348: write d2/d7/d3c/d3d/f56 [2382461,40263] 0 2026-03-09T16:14:27.215 INFO:tasks.workunit.client.0.vm03.stdout:1/225: symlink d4/d39/l51 0 2026-03-09T16:14:27.221 INFO:tasks.workunit.client.0.vm03.stdout:9/350: creat d2/df/d5f/f6a x:0 0 0 2026-03-09T16:14:27.222 INFO:tasks.workunit.client.0.vm03.stdout:9/351: truncate d2/f33 4745396 0 2026-03-09T16:14:27.222 INFO:tasks.workunit.client.0.vm03.stdout:9/352: fsync d2/f7 0 2026-03-09T16:14:27.224 INFO:tasks.workunit.client.0.vm03.stdout:2/303: dwrite db/f23 [0,4194304] 0 2026-03-09T16:14:27.225 INFO:tasks.workunit.client.0.vm03.stdout:8/312: symlink da/db/l5f 0 2026-03-09T16:14:27.227 INFO:tasks.workunit.client.0.vm03.stdout:7/251: dwrite d4/f26 [4194304,4194304] 0 2026-03-09T16:14:27.244 INFO:tasks.workunit.client.0.vm03.stdout:6/284: truncate d9/d22/f2d 1307509 0 2026-03-09T16:14:27.253 INFO:tasks.workunit.client.0.vm03.stdout:6/285: dwrite d9/d22/f3e [0,4194304] 0 2026-03-09T16:14:27.257 INFO:tasks.workunit.client.0.vm03.stdout:5/349: creat d2/d7/d8/f7a x:0 0 0 2026-03-09T16:14:27.257 INFO:tasks.workunit.client.0.vm03.stdout:5/350: fsync d2/f5a 0 2026-03-09T16:14:27.258 INFO:tasks.workunit.client.0.vm03.stdout:0/325: dread d0/d7/d48/f2e [4194304,4194304] 0 2026-03-09T16:14:27.258 INFO:tasks.workunit.client.0.vm03.stdout:4/299: dwrite d5/db/d25/f26 [0,4194304] 0 2026-03-09T16:14:27.259 INFO:tasks.workunit.client.0.vm03.stdout:0/326: chown d0/da/c30 105934599 1 2026-03-09T16:14:27.263 INFO:tasks.workunit.client.0.vm03.stdout:4/300: chown d5/db/d25/c3e 15561625 1 2026-03-09T16:14:27.265 INFO:tasks.workunit.client.0.vm03.stdout:7/252: mkdir d4/dc/d53 0 2026-03-09T16:14:27.265 INFO:tasks.workunit.client.0.vm03.stdout:6/286: mknod d9/d42/d45/d47/c52 0 2026-03-09T16:14:27.265 
INFO:tasks.workunit.client.0.vm03.stdout:1/226: write d4/fa [899076,38973] 0 2026-03-09T16:14:27.272 INFO:tasks.workunit.client.0.vm03.stdout:0/327: chown d0/f29 32508206 1 2026-03-09T16:14:27.281 INFO:tasks.workunit.client.0.vm03.stdout:0/328: stat d0/d7/d3e/d57/d5a/d52 0 2026-03-09T16:14:27.281 INFO:tasks.workunit.client.0.vm03.stdout:7/253: dwrite d4/d2d/f32 [0,4194304] 0 2026-03-09T16:14:27.281 INFO:tasks.workunit.client.0.vm03.stdout:0/329: dread d0/d7/f8 [0,4194304] 0 2026-03-09T16:14:27.282 INFO:tasks.workunit.client.0.vm03.stdout:0/330: fsync d0/da/d5c/f66 0 2026-03-09T16:14:27.291 INFO:tasks.workunit.client.0.vm03.stdout:8/313: dwrite da/d10/f23 [4194304,4194304] 0 2026-03-09T16:14:27.296 INFO:tasks.workunit.client.0.vm03.stdout:3/278: getdents d5/d1e/d42 0 2026-03-09T16:14:27.297 INFO:tasks.workunit.client.0.vm03.stdout:8/314: write f8 [1661940,2637] 0 2026-03-09T16:14:27.297 INFO:tasks.workunit.client.0.vm03.stdout:3/279: write d5/d1e/d42/f25 [548891,62257] 0 2026-03-09T16:14:27.297 INFO:tasks.workunit.client.0.vm03.stdout:0/331: dwrite d0/f4e [0,4194304] 0 2026-03-09T16:14:27.300 INFO:tasks.workunit.client.0.vm03.stdout:5/351: mknod d2/d7/de/d11/d19/d31/c7b 0 2026-03-09T16:14:27.301 INFO:tasks.workunit.client.0.vm03.stdout:5/352: truncate d2/d7/d1a/d1c/d6c/f6d 136258 0 2026-03-09T16:14:27.310 INFO:tasks.workunit.client.0.vm03.stdout:3/280: dread d5/d1e/d42/f25 [0,4194304] 0 2026-03-09T16:14:27.314 INFO:tasks.workunit.client.0.vm03.stdout:4/301: sync 2026-03-09T16:14:27.321 INFO:tasks.workunit.client.0.vm03.stdout:4/302: dread d5/dd/f1e [0,4194304] 0 2026-03-09T16:14:27.345 INFO:tasks.workunit.client.0.vm03.stdout:8/315: mknod da/db/d30/c60 0 2026-03-09T16:14:27.353 INFO:tasks.workunit.client.0.vm03.stdout:8/316: dread da/d10/d28/f29 [0,4194304] 0 2026-03-09T16:14:27.370 INFO:tasks.workunit.client.0.vm03.stdout:5/353: mknod d2/d7/de/d11/d19/d29/c7c 0 2026-03-09T16:14:27.374 INFO:tasks.workunit.client.0.vm03.stdout:4/303: dwrite d5/dd/f22 [0,4194304] 0 2026-03-09T16:14:27.378 INFO:tasks.workunit.client.0.vm03.stdout:4/304: write d5/db/f3a [747062,25401] 0 2026-03-09T16:14:27.395 INFO:tasks.workunit.client.0.vm03.stdout:8/317: creat da/d32/f61 x:0 0 0 2026-03-09T16:14:27.395 INFO:tasks.workunit.client.0.vm03.stdout:9/353: getdents d2/d4/d11/d12 0 2026-03-09T16:14:27.396 INFO:tasks.workunit.client.0.vm03.stdout:2/304: dwrite db/d12/d2a/d61/f5d [0,4194304] 0 2026-03-09T16:14:27.398 INFO:tasks.workunit.client.0.vm03.stdout:8/318: write da/db/fe [4097517,12621] 0 2026-03-09T16:14:27.425 INFO:tasks.workunit.client.0.vm03.stdout:1/227: link d4/d39/l51 d4/d31/l52 0 2026-03-09T16:14:27.435 INFO:tasks.workunit.client.0.vm03.stdout:5/354: creat d2/d7/de/d11/d38/d52/f7d x:0 0 0 2026-03-09T16:14:27.435 INFO:tasks.workunit.client.0.vm03.stdout:5/355: chown d2/d7/d1a/d1c/l53 899717 1 2026-03-09T16:14:27.435 INFO:tasks.workunit.client.0.vm03.stdout:5/356: stat d2/d7/d8/f36 0 2026-03-09T16:14:27.436 INFO:tasks.workunit.client.0.vm03.stdout:4/305: symlink d5/d40/l5a 0 2026-03-09T16:14:27.439 INFO:tasks.workunit.client.0.vm03.stdout:9/354: mknod d2/d4/d11/d29/d2a/d46/c6b 0 2026-03-09T16:14:27.443 INFO:tasks.workunit.client.0.vm03.stdout:7/254: truncate d4/da/d18/f37 688462 0 2026-03-09T16:14:27.444 INFO:tasks.workunit.client.0.vm03.stdout:2/305: creat db/d12/d2a/d61/f74 x:0 0 0 2026-03-09T16:14:27.445 INFO:tasks.workunit.client.0.vm03.stdout:8/319: creat da/db/d30/f62 x:0 0 0 2026-03-09T16:14:27.445 INFO:tasks.workunit.client.0.vm03.stdout:8/320: write da/db/fe [2271459,130405] 0 
2026-03-09T16:14:27.448 INFO:tasks.workunit.client.0.vm03.stdout:6/287: getdents d9/d42/d45 0 2026-03-09T16:14:27.449 INFO:tasks.workunit.client.0.vm03.stdout:5/357: creat d2/d7/de/d11/d19/d31/f7e x:0 0 0 2026-03-09T16:14:27.453 INFO:tasks.workunit.client.0.vm03.stdout:5/358: dwrite d2/d7/de/d11/f26 [4194304,4194304] 0 2026-03-09T16:14:27.454 INFO:tasks.workunit.client.0.vm03.stdout:7/255: mknod d4/dc/c54 0 2026-03-09T16:14:27.454 INFO:tasks.workunit.client.0.vm03.stdout:7/256: dread - d4/da/d45/f4e zero size 2026-03-09T16:14:27.458 INFO:tasks.workunit.client.0.vm03.stdout:5/359: dread d2/d7/d1a/f4d [0,4194304] 0 2026-03-09T16:14:27.460 INFO:tasks.workunit.client.0.vm03.stdout:7/257: dwrite d4/d2d/f52 [0,4194304] 0 2026-03-09T16:14:27.467 INFO:tasks.workunit.client.0.vm03.stdout:5/360: dwrite d2/d7/d1a/d1c/d6c/f79 [0,4194304] 0 2026-03-09T16:14:27.484 INFO:tasks.workunit.client.0.vm03.stdout:1/228: mknod d4/c53 0 2026-03-09T16:14:27.486 INFO:tasks.workunit.client.0.vm03.stdout:4/306: mkdir d5/db/d25/d31/d4d/d5b 0 2026-03-09T16:14:27.487 INFO:tasks.workunit.client.0.vm03.stdout:2/306: symlink db/l75 0 2026-03-09T16:14:27.487 INFO:tasks.workunit.client.0.vm03.stdout:7/258: mknod d4/da/d45/d51/c55 0 2026-03-09T16:14:27.488 INFO:tasks.workunit.client.0.vm03.stdout:6/288: dread d9/d14/f29 [0,4194304] 0 2026-03-09T16:14:27.488 INFO:tasks.workunit.client.0.vm03.stdout:2/307: write db/d12/f4b [474900,84329] 0 2026-03-09T16:14:27.490 INFO:tasks.workunit.client.0.vm03.stdout:2/308: truncate db/d12/f69 50002 0 2026-03-09T16:14:27.491 INFO:tasks.workunit.client.0.vm03.stdout:4/307: dread d5/db/f34 [0,4194304] 0 2026-03-09T16:14:27.495 INFO:tasks.workunit.client.0.vm03.stdout:9/355: creat d2/d4/d11/f6c x:0 0 0 2026-03-09T16:14:27.498 INFO:tasks.workunit.client.0.vm03.stdout:9/356: dwrite d2/d4/d11/f6c [0,4194304] 0 2026-03-09T16:14:27.506 INFO:tasks.workunit.client.0.vm03.stdout:7/259: dread d4/dc/f1a [0,4194304] 0 2026-03-09T16:14:27.507 INFO:tasks.workunit.client.0.vm03.stdout:6/289: mknod d9/d42/d45/c53 0 2026-03-09T16:14:27.524 INFO:tasks.workunit.client.0.vm03.stdout:9/357: mkdir d2/d54/d6d 0 2026-03-09T16:14:27.525 INFO:tasks.workunit.client.0.vm03.stdout:6/290: rmdir d9/d42 39 2026-03-09T16:14:27.525 INFO:tasks.workunit.client.0.vm03.stdout:9/358: dwrite d2/d4/d1f/f5b [0,4194304] 0 2026-03-09T16:14:27.532 INFO:tasks.workunit.client.0.vm03.stdout:7/260: creat d4/da/d18/d22/d24/d16/d2b/f56 x:0 0 0 2026-03-09T16:14:27.538 INFO:tasks.workunit.client.0.vm03.stdout:9/359: symlink d2/d54/l6e 0 2026-03-09T16:14:27.542 INFO:tasks.workunit.client.0.vm03.stdout:9/360: readlink d2/d4/d11/d12/l43 0 2026-03-09T16:14:27.542 INFO:tasks.workunit.client.0.vm03.stdout:1/229: link d4/d6/d1d/d24/c2c d4/d6/d1d/d20/d23/d3e/d3f/c54 0 2026-03-09T16:14:27.542 INFO:tasks.workunit.client.0.vm03.stdout:4/308: getdents d5/dd 0 2026-03-09T16:14:27.547 INFO:tasks.workunit.client.0.vm03.stdout:1/230: symlink d4/db/l55 0 2026-03-09T16:14:27.553 INFO:tasks.workunit.client.0.vm03.stdout:4/309: symlink d5/d17/d44/l5c 0 2026-03-09T16:14:27.554 INFO:tasks.workunit.client.0.vm03.stdout:2/309: sync 2026-03-09T16:14:27.555 INFO:tasks.workunit.client.0.vm03.stdout:6/291: sync 2026-03-09T16:14:27.555 INFO:tasks.workunit.client.0.vm03.stdout:6/292: readlink d9/l2a 0 2026-03-09T16:14:27.555 INFO:tasks.workunit.client.0.vm03.stdout:1/231: dread d4/d6/f9 [0,4194304] 0 2026-03-09T16:14:27.556 INFO:tasks.workunit.client.0.vm03.stdout:1/232: write d4/d6/f9 [3055660,3741] 0 2026-03-09T16:14:27.560 INFO:tasks.workunit.client.0.vm03.stdout:4/310: 
dwrite d5/db/f3a [0,4194304] 0 2026-03-09T16:14:27.562 INFO:tasks.workunit.client.0.vm03.stdout:4/311: read d5/dd/d1f/f58 [223776,7339] 0 2026-03-09T16:14:27.570 INFO:tasks.workunit.client.0.vm03.stdout:9/361: creat d2/d4/d11/d29/d63/f6f x:0 0 0 2026-03-09T16:14:27.571 INFO:tasks.workunit.client.0.vm03.stdout:0/332: write d0/da/d1b/fd [239804,17604] 0 2026-03-09T16:14:27.573 INFO:tasks.workunit.client.0.vm03.stdout:3/281: truncate d5/d1e/d42/f29 3560942 0 2026-03-09T16:14:27.574 INFO:tasks.workunit.client.0.vm03.stdout:3/282: write d5/d1e/d42/f2c [124034,105463] 0 2026-03-09T16:14:27.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:27 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:27.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:27 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:27.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:27 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:14:27.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:27 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:14:27.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:27 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:27.589 INFO:tasks.workunit.client.0.vm03.stdout:1/233: mknod d4/d6/d1d/d20/d23/d3e/d3f/c56 0 2026-03-09T16:14:27.591 INFO:tasks.workunit.client.0.vm03.stdout:4/312: unlink d5/f8 0 2026-03-09T16:14:27.595 INFO:tasks.workunit.client.0.vm03.stdout:4/313: read d5/fa [4125756,44109] 0 2026-03-09T16:14:27.599 INFO:tasks.workunit.client.0.vm03.stdout:7/261: rename d4/da/d18/d22/d24/d16/d2b/c3c to d4/da/d18/d22/d24/c57 0 2026-03-09T16:14:27.619 INFO:tasks.workunit.client.0.vm03.stdout:8/321: dwrite da/d32/f4d [0,4194304] 0 2026-03-09T16:14:27.637 INFO:tasks.workunit.client.0.vm03.stdout:5/361: dwrite d2/d7/d1a/f4d [4194304,4194304] 0 2026-03-09T16:14:27.641 INFO:tasks.workunit.client.0.vm03.stdout:0/333: dread d0/d7/f3d [0,4194304] 0 2026-03-09T16:14:27.658 INFO:tasks.workunit.client.0.vm03.stdout:9/362: dwrite d2/d4/f3e [0,4194304] 0 2026-03-09T16:14:27.665 INFO:tasks.workunit.client.0.vm03.stdout:3/283: mknod d5/d1e/d42/d37/c4e 0 2026-03-09T16:14:27.665 INFO:tasks.workunit.client.0.vm03.stdout:2/310: fsync db/f14 0 2026-03-09T16:14:27.666 INFO:tasks.workunit.client.0.vm03.stdout:2/311: write f0 [4101114,55351] 0 2026-03-09T16:14:27.670 INFO:tasks.workunit.client.0.vm03.stdout:7/262: creat d4/da/d45/f58 x:0 0 0 2026-03-09T16:14:27.673 INFO:tasks.workunit.client.0.vm03.stdout:6/293: rename d9/d42/l4f to d9/d42/d45/d47/l54 0 2026-03-09T16:14:27.673 INFO:tasks.workunit.client.0.vm03.stdout:6/294: write d9/d42/d45/f4d [299574,113180] 0 2026-03-09T16:14:27.674 INFO:tasks.workunit.client.0.vm03.stdout:8/322: mkdir da/d10/d63 0 2026-03-09T16:14:27.675 INFO:tasks.workunit.client.0.vm03.stdout:5/362: creat d2/f7f x:0 0 0 2026-03-09T16:14:27.681 INFO:tasks.workunit.client.0.vm03.stdout:0/334: mknod d0/d7/d3e/d5d/c6d 0 2026-03-09T16:14:27.685 INFO:tasks.workunit.client.0.vm03.stdout:0/335: dwrite d0/f4e [0,4194304] 0 2026-03-09T16:14:27.689 INFO:tasks.workunit.client.0.vm03.stdout:6/295: sync 2026-03-09T16:14:27.697 INFO:tasks.workunit.client.0.vm03.stdout:9/363: creat 
d2/d4/d11/d29/f70 x:0 0 0 2026-03-09T16:14:27.722 INFO:tasks.workunit.client.0.vm03.stdout:3/284: unlink d5/f10 0 2026-03-09T16:14:27.722 INFO:tasks.workunit.client.0.vm03.stdout:3/285: chown d5/l21 32815377 1 2026-03-09T16:14:27.726 INFO:tasks.workunit.client.0.vm03.stdout:2/312: dread - db/d12/f49 zero size 2026-03-09T16:14:27.743 INFO:tasks.workunit.client.0.vm03.stdout:8/323: mkdir da/d10/d28/d64 0 2026-03-09T16:14:27.746 INFO:tasks.workunit.client.0.vm03.stdout:8/324: dwrite da/d1d/f4a [0,4194304] 0 2026-03-09T16:14:27.756 INFO:tasks.workunit.client.0.vm03.stdout:5/363: creat d2/d7/de/d11/f80 x:0 0 0 2026-03-09T16:14:27.767 INFO:tasks.workunit.client.0.vm03.stdout:0/336: creat d0/da/d1b/f6e x:0 0 0 2026-03-09T16:14:27.769 INFO:tasks.workunit.client.0.vm03.stdout:0/337: readlink d0/d7/l5b 0 2026-03-09T16:14:27.770 INFO:tasks.workunit.client.0.vm03.stdout:0/338: dwrite d0/d7/d48/d32/f59 [0,4194304] 0 2026-03-09T16:14:27.781 INFO:tasks.workunit.client.0.vm03.stdout:0/339: dwrite d0/f3 [0,4194304] 0 2026-03-09T16:14:27.790 INFO:tasks.workunit.client.0.vm03.stdout:6/296: symlink d9/d14/l55 0 2026-03-09T16:14:27.796 INFO:tasks.workunit.client.0.vm03.stdout:6/297: chown d9/d42/d45/d50 78182 1 2026-03-09T16:14:27.801 INFO:tasks.workunit.client.0.vm03.stdout:9/364: mknod d2/d4/d11/d29/d2a/d38/c71 0 2026-03-09T16:14:27.827 INFO:tasks.workunit.client.0.vm03.stdout:2/313: rmdir db/d12/d2a/d61 39 2026-03-09T16:14:27.841 INFO:tasks.workunit.client.0.vm03.stdout:4/314: link d5/fa d5/db/f5d 0 2026-03-09T16:14:27.867 INFO:tasks.workunit.client.0.vm03.stdout:5/364: rename d2/d7/d8/d16/l70 to d2/d7/d1a/l81 0 2026-03-09T16:14:27.875 INFO:tasks.workunit.client.0.vm03.stdout:0/340: mkdir d0/d7/d48/d32/d6f 0 2026-03-09T16:14:27.875 INFO:tasks.workunit.client.0.vm03.stdout:1/234: getdents d4/d6/d1d/d20/d23/d3e/d3f 0 2026-03-09T16:14:27.884 INFO:tasks.workunit.client.0.vm03.stdout:4/315: truncate d5/db/f34 2246707 0 2026-03-09T16:14:27.886 INFO:tasks.workunit.client.0.vm03.stdout:7/263: creat d4/da/d18/d22/d24/f59 x:0 0 0 2026-03-09T16:14:27.895 INFO:tasks.workunit.client.0.vm03.stdout:9/365: rename d2/d4/d1f/f5b to d2/d4/d11/d29/d2a/d38/f72 0 2026-03-09T16:14:27.896 INFO:tasks.workunit.client.0.vm03.stdout:9/366: write d2/d4/d11/d29/f5d [8442371,930] 0 2026-03-09T16:14:27.900 INFO:tasks.workunit.client.0.vm03.stdout:5/365: creat d2/d7/d8/d24/d27/d43/f82 x:0 0 0 2026-03-09T16:14:27.905 INFO:tasks.workunit.client.0.vm03.stdout:5/366: read d2/d7/de/f48 [232930,117488] 0 2026-03-09T16:14:27.915 INFO:tasks.workunit.client.0.vm03.stdout:2/314: mknod db/c76 0 2026-03-09T16:14:27.919 INFO:tasks.workunit.client.0.vm03.stdout:4/316: creat d5/dd/d1f/f5e x:0 0 0 2026-03-09T16:14:27.919 INFO:tasks.workunit.client.0.vm03.stdout:4/317: fdatasync d5/dd/d1f/f48 0 2026-03-09T16:14:27.922 INFO:tasks.workunit.client.0.vm03.stdout:7/264: creat d4/da/d18/d22/d24/d16/d2b/f5a x:0 0 0 2026-03-09T16:14:27.925 INFO:tasks.workunit.client.0.vm03.stdout:8/325: link da/db/d43/c4e da/db/d30/c65 0 2026-03-09T16:14:27.929 INFO:tasks.workunit.client.0.vm03.stdout:3/286: dwrite d5/d1e/d42/f29 [0,4194304] 0 2026-03-09T16:14:27.930 INFO:tasks.workunit.client.0.vm03.stdout:3/287: chown d5/d2e/l3b 47073275 1 2026-03-09T16:14:27.931 INFO:tasks.workunit.client.0.vm03.stdout:3/288: chown d5/l12 3622 1 2026-03-09T16:14:27.935 INFO:tasks.workunit.client.0.vm03.stdout:8/326: dwrite da/f52 [0,4194304] 0 2026-03-09T16:14:27.937 INFO:tasks.workunit.client.0.vm03.stdout:8/327: write da/d1d/f5e [1047334,106325] 0 2026-03-09T16:14:27.961 
INFO:tasks.workunit.client.0.vm03.stdout:9/367: mknod d2/d4/d1f/c73 0 2026-03-09T16:14:27.961 INFO:tasks.workunit.client.0.vm03.stdout:9/368: fsync d2/d4/d11/d12/f50 0 2026-03-09T16:14:27.966 INFO:tasks.workunit.client.0.vm03.stdout:9/369: truncate d2/d4/d11/d12/f3d 827631 0 2026-03-09T16:14:27.968 INFO:tasks.workunit.client.0.vm03.stdout:9/370: chown d2/de 0 1 2026-03-09T16:14:27.970 INFO:tasks.workunit.client.0.vm03.stdout:5/367: symlink d2/d7/d3c/l83 0 2026-03-09T16:14:27.973 INFO:tasks.workunit.client.0.vm03.stdout:5/368: write d2/d7/de/d54/f4a [1433814,56981] 0 2026-03-09T16:14:27.976 INFO:tasks.workunit.client.0.vm03.stdout:6/298: dwrite d9/d22/f4e [0,4194304] 0 2026-03-09T16:14:27.979 INFO:tasks.workunit.client.0.vm03.stdout:0/341: truncate d0/d7/d48/d32/f69 4917150 0 2026-03-09T16:14:27.982 INFO:tasks.workunit.client.0.vm03.stdout:6/299: read d9/d14/f29 [6003967,26774] 0 2026-03-09T16:14:27.990 INFO:tasks.workunit.client.0.vm03.stdout:4/318: truncate d5/db/f5d 5133188 0 2026-03-09T16:14:28.005 INFO:tasks.workunit.client.0.vm03.stdout:1/235: creat d4/d6/d3b/d4a/f57 x:0 0 0 2026-03-09T16:14:28.005 INFO:tasks.workunit.client.0.vm03.stdout:7/265: creat d4/da/d45/d51/f5b x:0 0 0 2026-03-09T16:14:28.005 INFO:tasks.workunit.client.0.vm03.stdout:6/300: dread d9/ff [4194304,4194304] 0 2026-03-09T16:14:28.005 INFO:tasks.workunit.client.0.vm03.stdout:6/301: dread - d9/f3b zero size 2026-03-09T16:14:28.005 INFO:tasks.workunit.client.0.vm03.stdout:8/328: creat da/d32/f66 x:0 0 0 2026-03-09T16:14:28.005 INFO:tasks.workunit.client.0.vm03.stdout:8/329: dread - da/d10/d28/f57 zero size 2026-03-09T16:14:28.005 INFO:tasks.workunit.client.0.vm03.stdout:3/289: symlink d5/d1e/d42/d37/l4f 0 2026-03-09T16:14:28.005 INFO:tasks.workunit.client.0.vm03.stdout:4/319: mkdir d5/dd/d1f/d5f 0 2026-03-09T16:14:28.005 INFO:tasks.workunit.client.0.vm03.stdout:8/330: chown da/db/d30/c48 130312357 1 2026-03-09T16:14:28.006 INFO:tasks.workunit.client.0.vm03.stdout:6/302: symlink d9/d42/d45/l56 0 2026-03-09T16:14:28.006 INFO:tasks.workunit.client.0.vm03.stdout:9/371: sync 2026-03-09T16:14:28.007 INFO:tasks.workunit.client.0.vm03.stdout:4/320: creat d5/dd/d1f/f60 x:0 0 0 2026-03-09T16:14:28.008 INFO:tasks.workunit.client.0.vm03.stdout:4/321: read d5/db/f3a [1071155,86475] 0 2026-03-09T16:14:28.009 INFO:tasks.workunit.client.0.vm03.stdout:8/331: unlink da/d1d/d3b/c5d 0 2026-03-09T16:14:28.011 INFO:tasks.workunit.client.0.vm03.stdout:4/322: fsync d5/dd/f23 0 2026-03-09T16:14:28.013 INFO:tasks.workunit.client.0.vm03.stdout:6/303: dwrite d9/f40 [0,4194304] 0 2026-03-09T16:14:28.013 INFO:tasks.workunit.client.0.vm03.stdout:2/315: getdents db/d59 0 2026-03-09T16:14:28.018 INFO:tasks.workunit.client.0.vm03.stdout:2/316: write f0 [2582582,61147] 0 2026-03-09T16:14:28.018 INFO:tasks.workunit.client.0.vm03.stdout:7/266: rename d4/da/d18/d22/d24/d16/l1d to d4/da/d18/d22/d24/d16/l5c 0 2026-03-09T16:14:28.019 INFO:tasks.workunit.client.0.vm03.stdout:3/290: dwrite d5/f33 [0,4194304] 0 2026-03-09T16:14:28.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:27 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:28.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:27 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:28.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:27 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": 
"osd_memory_target"}]: dispatch 2026-03-09T16:14:28.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:27 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:14:28.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:27 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:28.027 INFO:tasks.workunit.client.0.vm03.stdout:7/267: truncate d4/d2d/f52 5055916 0 2026-03-09T16:14:28.034 INFO:tasks.workunit.client.0.vm03.stdout:0/342: dread d0/f1e [0,4194304] 0 2026-03-09T16:14:28.037 INFO:tasks.workunit.client.0.vm03.stdout:8/332: symlink da/d32/l67 0 2026-03-09T16:14:28.047 INFO:tasks.workunit.client.0.vm03.stdout:5/369: getdents d2/d7/de/d11/d19 0 2026-03-09T16:14:28.047 INFO:tasks.workunit.client.0.vm03.stdout:4/323: fdatasync d5/d17/d44/f4a 0 2026-03-09T16:14:28.047 INFO:tasks.workunit.client.0.vm03.stdout:1/236: getdents d4/d6/d3b 0 2026-03-09T16:14:28.047 INFO:tasks.workunit.client.0.vm03.stdout:1/237: chown f1 477 1 2026-03-09T16:14:28.047 INFO:tasks.workunit.client.0.vm03.stdout:4/324: write d5/d17/f18 [1429593,9284] 0 2026-03-09T16:14:28.053 INFO:tasks.workunit.client.0.vm03.stdout:6/304: fsync d9/d22/f3f 0 2026-03-09T16:14:28.053 INFO:tasks.workunit.client.0.vm03.stdout:3/291: rename d5/d1e/d42/d34/l39 to d5/d1e/d42/d34/l50 0 2026-03-09T16:14:28.053 INFO:tasks.workunit.client.0.vm03.stdout:5/370: dwrite d2/d7/d1a/f6e [0,4194304] 0 2026-03-09T16:14:28.054 INFO:tasks.workunit.client.0.vm03.stdout:2/317: write db/d12/d2a/d61/f45 [3353597,35857] 0 2026-03-09T16:14:28.066 INFO:tasks.workunit.client.0.vm03.stdout:0/343: rename d0/da/l14 to d0/d7/d48/d32/d6f/l70 0 2026-03-09T16:14:28.074 INFO:tasks.workunit.client.0.vm03.stdout:8/333: truncate da/f35 389159 0 2026-03-09T16:14:28.086 INFO:tasks.workunit.client.0.vm03.stdout:9/372: write d2/df/f10 [4759420,58523] 0 2026-03-09T16:14:28.088 INFO:tasks.workunit.client.0.vm03.stdout:9/373: read d2/df/f42 [810548,41848] 0 2026-03-09T16:14:28.089 INFO:tasks.workunit.client.0.vm03.stdout:9/374: readlink d2/d54/l6e 0 2026-03-09T16:14:28.089 INFO:tasks.workunit.client.0.vm03.stdout:9/375: stat d2/d4/d11/d12/l2e 0 2026-03-09T16:14:28.090 INFO:tasks.workunit.client.0.vm03.stdout:9/376: dread d2/d54/f69 [0,4194304] 0 2026-03-09T16:14:28.092 INFO:tasks.workunit.client.0.vm03.stdout:1/238: symlink d4/d31/l58 0 2026-03-09T16:14:28.092 INFO:tasks.workunit.client.0.vm03.stdout:1/239: write d4/d6/d1d/d3d/f45 [742835,68101] 0 2026-03-09T16:14:28.097 INFO:tasks.workunit.client.0.vm03.stdout:4/325: creat d5/d17/d44/f61 x:0 0 0 2026-03-09T16:14:28.097 INFO:tasks.workunit.client.0.vm03.stdout:3/292: rmdir d5/d2e 39 2026-03-09T16:14:28.123 INFO:tasks.workunit.client.0.vm03.stdout:2/318: rmdir db/d59/d52 39 2026-03-09T16:14:28.124 INFO:tasks.workunit.client.0.vm03.stdout:7/268: write d4/da/d18/d22/d24/f30 [555355,87451] 0 2026-03-09T16:14:28.124 INFO:tasks.workunit.client.0.vm03.stdout:7/269: write d4/f26 [5928378,47604] 0 2026-03-09T16:14:28.128 INFO:tasks.workunit.client.0.vm03.stdout:6/305: write d9/d22/f3f [1060613,123099] 0 2026-03-09T16:14:28.130 INFO:tasks.workunit.client.0.vm03.stdout:6/306: write d9/f36 [351500,127949] 0 2026-03-09T16:14:28.131 INFO:tasks.workunit.client.0.vm03.stdout:5/371: symlink d2/d7/d1a/d1c/l84 0 2026-03-09T16:14:28.150 INFO:tasks.workunit.client.0.vm03.stdout:0/344: creat d0/d7/d3e/d57/d5a/d5f/f71 x:0 0 0 2026-03-09T16:14:28.158 
INFO:tasks.workunit.client.0.vm03.stdout:9/377: creat d2/d4/d11/d29/d2a/d38/f74 x:0 0 0 2026-03-09T16:14:28.160 INFO:tasks.workunit.client.0.vm03.stdout:1/240: mkdir d4/db/d59 0 2026-03-09T16:14:28.170 INFO:tasks.workunit.client.0.vm03.stdout:2/319: dread - db/d12/d2a/d61/f20 zero size 2026-03-09T16:14:28.171 INFO:tasks.workunit.client.0.vm03.stdout:5/372: rename d2/d7/d1a/d1c/l84 to d2/d7/d1a/d1c/l85 0 2026-03-09T16:14:28.172 INFO:tasks.workunit.client.0.vm03.stdout:2/320: chown db/d12/d2a/d61/f20 23 1 2026-03-09T16:14:28.172 INFO:tasks.workunit.client.0.vm03.stdout:5/373: chown d2/d7/de/d11/c61 1658676 1 2026-03-09T16:14:28.178 INFO:tasks.workunit.client.0.vm03.stdout:8/334: mkdir da/d10/d28/d4f/d68 0 2026-03-09T16:14:28.185 INFO:tasks.workunit.client.0.vm03.stdout:9/378: unlink d2/df/c3c 0 2026-03-09T16:14:28.186 INFO:tasks.workunit.client.0.vm03.stdout:4/326: rename d5/dd/d1f/f2d to d5/db/d25/d31/d33/d55/f62 0 2026-03-09T16:14:28.194 INFO:tasks.workunit.client.0.vm03.stdout:3/293: dwrite d5/f11 [0,4194304] 0 2026-03-09T16:14:28.194 INFO:tasks.workunit.client.0.vm03.stdout:7/270: dwrite d4/d2d/d4b/f4c [0,4194304] 0 2026-03-09T16:14:28.198 INFO:tasks.workunit.client.0.vm03.stdout:3/294: write d5/f43 [851735,89662] 0 2026-03-09T16:14:28.203 INFO:tasks.workunit.client.0.vm03.stdout:3/295: read d5/f16 [2427054,107985] 0 2026-03-09T16:14:28.205 INFO:tasks.workunit.client.0.vm03.stdout:7/271: dwrite d4/da/d18/d22/d24/f2f [0,4194304] 0 2026-03-09T16:14:28.220 INFO:tasks.workunit.client.0.vm03.stdout:0/345: link d0/da/d1b/f6e d0/d7/d3e/f72 0 2026-03-09T16:14:28.223 INFO:tasks.workunit.client.0.vm03.stdout:5/374: readlink d2/d7/d8/d24/l51 0 2026-03-09T16:14:28.226 INFO:tasks.workunit.client.0.vm03.stdout:9/379: rmdir d2/d4/d11/d29/d2a/d38 39 2026-03-09T16:14:28.227 INFO:tasks.workunit.client.0.vm03.stdout:9/380: stat d2/d4/d11/d12 0 2026-03-09T16:14:28.236 INFO:tasks.workunit.client.0.vm03.stdout:6/307: rename d9/f33 to d9/d42/d45/f57 0 2026-03-09T16:14:28.248 INFO:tasks.workunit.client.0.vm03.stdout:5/375: creat d2/d7/d8/f86 x:0 0 0 2026-03-09T16:14:28.250 INFO:tasks.workunit.client.0.vm03.stdout:9/381: creat d2/d4/d11/d29/d63/f75 x:0 0 0 2026-03-09T16:14:28.250 INFO:tasks.workunit.client.0.vm03.stdout:5/376: truncate d2/d7/de/d11/d19/d29/f77 266322 0 2026-03-09T16:14:28.254 INFO:tasks.workunit.client.0.vm03.stdout:3/296: dwrite d5/d1e/d42/f25 [0,4194304] 0 2026-03-09T16:14:28.257 INFO:tasks.workunit.client.0.vm03.stdout:1/241: rename d4/d6/d1d/f40 to d4/d39/f5a 0 2026-03-09T16:14:28.261 INFO:tasks.workunit.client.0.vm03.stdout:0/346: sync 2026-03-09T16:14:28.261 INFO:tasks.workunit.client.0.vm03.stdout:4/327: mknod d5/dd/d1f/d5f/c63 0 2026-03-09T16:14:28.267 INFO:tasks.workunit.client.0.vm03.stdout:2/321: creat db/d12/f77 x:0 0 0 2026-03-09T16:14:28.267 INFO:tasks.workunit.client.0.vm03.stdout:5/377: dread d2/d7/d1a/f4d [4194304,4194304] 0 2026-03-09T16:14:28.271 INFO:tasks.workunit.client.0.vm03.stdout:2/322: dread db/f14 [0,4194304] 0 2026-03-09T16:14:28.275 INFO:tasks.workunit.client.0.vm03.stdout:5/378: write d2/d7/de/d11/d38/d3b/f68 [995804,93896] 0 2026-03-09T16:14:28.276 INFO:tasks.workunit.client.0.vm03.stdout:6/308: mknod d9/d42/d45/d50/c58 0 2026-03-09T16:14:28.277 INFO:tasks.workunit.client.0.vm03.stdout:5/379: chown d2/c41 1407 1 2026-03-09T16:14:28.277 INFO:tasks.workunit.client.0.vm03.stdout:2/323: write db/d12/f77 [166902,86008] 0 2026-03-09T16:14:28.284 INFO:tasks.workunit.client.0.vm03.stdout:7/272: mkdir d4/da/d5d 0 2026-03-09T16:14:28.289 
INFO:tasks.workunit.client.0.vm03.stdout:9/382: creat d2/df/f76 x:0 0 0 2026-03-09T16:14:28.289 INFO:tasks.workunit.client.0.vm03.stdout:9/383: write d2/d4/d11/d29/f70 [151731,115153] 0 2026-03-09T16:14:28.290 INFO:tasks.workunit.client.0.vm03.stdout:9/384: fsync d2/df/d5f/f6a 0 2026-03-09T16:14:28.294 INFO:tasks.workunit.client.0.vm03.stdout:0/347: fdatasync d0/d7/f3d 0 2026-03-09T16:14:28.298 INFO:tasks.workunit.client.0.vm03.stdout:6/309: mknod d9/d42/d45/d47/c59 0 2026-03-09T16:14:28.298 INFO:tasks.workunit.client.0.vm03.stdout:0/348: fdatasync d0/da/d1b/fd 0 2026-03-09T16:14:28.311 INFO:tasks.workunit.client.0.vm03.stdout:0/349: dwrite d0/d7/f56 [0,4194304] 0 2026-03-09T16:14:28.321 INFO:tasks.workunit.client.0.vm03.stdout:8/335: truncate da/d10/f23 2838058 0 2026-03-09T16:14:28.322 INFO:tasks.workunit.client.0.vm03.stdout:8/336: write da/d32/f61 [636408,117364] 0 2026-03-09T16:14:28.326 INFO:tasks.workunit.client.0.vm03.stdout:5/380: mkdir d2/d7/de/d11/d19/d31/d35/d87 0 2026-03-09T16:14:28.332 INFO:tasks.workunit.client.0.vm03.stdout:7/273: fdatasync d4/da/d18/d22/d24/d15/f34 0 2026-03-09T16:14:28.336 INFO:tasks.workunit.client.0.vm03.stdout:4/328: dwrite d5/db/f2f [0,4194304] 0 2026-03-09T16:14:28.346 INFO:tasks.workunit.client.0.vm03.stdout:7/274: sync 2026-03-09T16:14:28.347 INFO:tasks.workunit.client.0.vm03.stdout:9/385: rename d2/de/l2b to d2/d4/d11/d29/l77 0 2026-03-09T16:14:28.347 INFO:tasks.workunit.client.0.vm03.stdout:9/386: write d2/df/f64 [586949,90916] 0 2026-03-09T16:14:28.349 INFO:tasks.workunit.client.0.vm03.stdout:9/387: truncate d2/d4/d11/d29/d63/f75 835328 0 2026-03-09T16:14:28.349 INFO:tasks.workunit.client.0.vm03.stdout:9/388: write d2/d4/d11/d12/f68 [49032,4331] 0 2026-03-09T16:14:28.354 INFO:tasks.workunit.client.0.vm03.stdout:3/297: creat d5/d1e/d42/d4c/f51 x:0 0 0 2026-03-09T16:14:28.362 INFO:tasks.workunit.client.0.vm03.stdout:2/324: mknod db/c78 0 2026-03-09T16:14:28.369 INFO:tasks.workunit.client.0.vm03.stdout:6/310: fsync d9/d22/f43 0 2026-03-09T16:14:28.380 INFO:tasks.workunit.client.0.vm03.stdout:0/350: creat d0/da/d1b/f73 x:0 0 0 2026-03-09T16:14:28.380 INFO:tasks.workunit.client.0.vm03.stdout:8/337: mknod da/d10/d28/c69 0 2026-03-09T16:14:28.382 INFO:tasks.workunit.client.0.vm03.stdout:5/381: creat d2/d7/de/d11/d38/d3b/f88 x:0 0 0 2026-03-09T16:14:28.383 INFO:tasks.workunit.client.0.vm03.stdout:5/382: readlink d2/d7/de/d54/l40 0 2026-03-09T16:14:28.408 INFO:tasks.workunit.client.0.vm03.stdout:4/329: rmdir d5/db/d25/d31/d33 39 2026-03-09T16:14:28.420 INFO:tasks.workunit.client.0.vm03.stdout:3/298: readlink d5/d2e/l36 0 2026-03-09T16:14:28.422 INFO:tasks.workunit.client.0.vm03.stdout:6/311: symlink d9/d42/d45/d47/l5a 0 2026-03-09T16:14:28.430 INFO:tasks.workunit.client.0.vm03.stdout:0/351: rmdir d0/da/d5c 39 2026-03-09T16:14:28.430 INFO:tasks.workunit.client.0.vm03.stdout:0/352: chown d0/d7/d3e/d57/l58 32493530 1 2026-03-09T16:14:28.435 INFO:tasks.workunit.client.0.vm03.stdout:2/325: dwrite db/d12/f37 [0,4194304] 0 2026-03-09T16:14:28.436 INFO:tasks.workunit.client.0.vm03.stdout:2/326: chown db 88666 1 2026-03-09T16:14:28.454 INFO:tasks.workunit.client.0.vm03.stdout:4/330: creat d5/d17/d44/f64 x:0 0 0 2026-03-09T16:14:28.459 INFO:tasks.workunit.client.0.vm03.stdout:7/275: rename d4/da/d18/d22/d24/c1e to d4/dc/d53/c5e 0 2026-03-09T16:14:28.462 INFO:tasks.workunit.client.0.vm03.stdout:7/276: dwrite d4/da/d18/d22/d24/f2f [4194304,4194304] 0 2026-03-09T16:14:28.479 INFO:tasks.workunit.client.0.vm03.stdout:3/299: symlink d5/d2e/l52 0 2026-03-09T16:14:28.481 
INFO:tasks.workunit.client.0.vm03.stdout:1/242: getdents d4/d6 0 2026-03-09T16:14:28.482 INFO:tasks.workunit.client.0.vm03.stdout:6/312: unlink d9/ff 0 2026-03-09T16:14:28.486 INFO:tasks.workunit.client.0.vm03.stdout:6/313: write d9/f36 [793976,1640] 0 2026-03-09T16:14:28.496 INFO:tasks.workunit.client.0.vm03.stdout:2/327: mkdir db/d12/d2a/d61/d79 0 2026-03-09T16:14:28.497 INFO:tasks.workunit.client.0.vm03.stdout:2/328: write db/d12/f69 [560955,22048] 0 2026-03-09T16:14:28.507 INFO:tasks.workunit.client.0.vm03.stdout:4/331: mknod d5/d40/c65 0 2026-03-09T16:14:28.507 INFO:tasks.workunit.client.0.vm03.stdout:4/332: read d5/dd/f22 [415218,8949] 0 2026-03-09T16:14:28.510 INFO:tasks.workunit.client.0.vm03.stdout:4/333: dwrite d5/dd/f23 [0,4194304] 0 2026-03-09T16:14:28.517 INFO:tasks.workunit.client.0.vm03.stdout:9/389: rename d2/d4/d11/d12/l43 to d2/d4/d11/d29/d63/l78 0 2026-03-09T16:14:28.530 INFO:tasks.workunit.client.0.vm03.stdout:8/338: creat da/db/f6a x:0 0 0 2026-03-09T16:14:28.540 INFO:tasks.workunit.client.0.vm03.stdout:0/353: rename d0/d7/d48/d32/d6f to d0/d7/d3e/d57/d5a/d74 0 2026-03-09T16:14:28.544 INFO:tasks.workunit.client.0.vm03.stdout:9/390: truncate d2/d4/d11/d12/f1e 3211300 0 2026-03-09T16:14:28.550 INFO:tasks.workunit.client.0.vm03.stdout:3/300: mkdir d5/d53 0 2026-03-09T16:14:28.560 INFO:tasks.workunit.client.0.vm03.stdout:1/243: symlink d4/db/d59/l5b 0 2026-03-09T16:14:28.569 INFO:tasks.workunit.client.0.vm03.stdout:6/314: symlink d9/l5b 0 2026-03-09T16:14:28.574 INFO:tasks.workunit.client.0.vm03.stdout:6/315: dwrite d9/d42/d45/f4d [0,4194304] 0 2026-03-09T16:14:28.581 INFO:tasks.workunit.client.0.vm03.stdout:8/339: mknod da/d45/c6b 0 2026-03-09T16:14:28.582 INFO:tasks.workunit.client.0.vm03.stdout:5/383: getdents d2/d7/d3c 0 2026-03-09T16:14:28.582 INFO:tasks.workunit.client.0.vm03.stdout:4/334: mkdir d5/db/d25/d31/d66 0 2026-03-09T16:14:28.582 INFO:tasks.workunit.client.0.vm03.stdout:5/384: chown d2/d7/d1a/d1c 123 1 2026-03-09T16:14:28.582 INFO:tasks.workunit.client.0.vm03.stdout:4/335: dread - d5/dd/d1f/f4c zero size 2026-03-09T16:14:28.583 INFO:tasks.workunit.client.0.vm03.stdout:2/329: rename db/d12 to db/d12/d7a 22 2026-03-09T16:14:28.583 INFO:tasks.workunit.client.0.vm03.stdout:7/277: creat d4/da/f5f x:0 0 0 2026-03-09T16:14:28.583 INFO:tasks.workunit.client.0.vm03.stdout:5/385: write d2/d7/de/d11/d38/d3b/f88 [393283,100046] 0 2026-03-09T16:14:28.586 INFO:tasks.workunit.client.0.vm03.stdout:1/244: mkdir d4/d31/d5c 0 2026-03-09T16:14:28.590 INFO:tasks.workunit.client.0.vm03.stdout:6/316: fsync d9/f15 0 2026-03-09T16:14:28.598 INFO:tasks.workunit.client.0.vm03.stdout:7/278: mknod d4/da/d18/c60 0 2026-03-09T16:14:28.600 INFO:tasks.workunit.client.0.vm03.stdout:7/279: dread - d4/da/d18/d22/d24/f59 zero size 2026-03-09T16:14:28.600 INFO:tasks.workunit.client.0.vm03.stdout:5/386: symlink d2/d7/l89 0 2026-03-09T16:14:28.608 INFO:tasks.workunit.client.0.vm03.stdout:1/245: symlink d4/d6/l5d 0 2026-03-09T16:14:28.616 INFO:tasks.workunit.client.0.vm03.stdout:8/340: mkdir da/d6c 0 2026-03-09T16:14:28.629 INFO:tasks.workunit.client.0.vm03.stdout:3/301: creat d5/d44/f54 x:0 0 0 2026-03-09T16:14:28.631 INFO:tasks.workunit.client.0.vm03.stdout:9/391: truncate d2/d4/d11/d12/f3d 297663 0 2026-03-09T16:14:28.631 INFO:tasks.workunit.client.0.vm03.stdout:4/336: write d5/db/f28 [568553,70412] 0 2026-03-09T16:14:28.631 INFO:tasks.workunit.client.0.vm03.stdout:2/330: write db/d59/f3f [593980,20965] 0 2026-03-09T16:14:28.632 INFO:tasks.workunit.client.0.vm03.stdout:2/331: truncate db/d12/f69 
790454 0 2026-03-09T16:14:28.633 INFO:tasks.workunit.client.0.vm03.stdout:4/337: write d5/db/f2f [3919878,80622] 0 2026-03-09T16:14:28.636 INFO:tasks.workunit.client.0.vm03.stdout:4/338: dwrite d5/d17/d44/f64 [0,4194304] 0 2026-03-09T16:14:28.641 INFO:tasks.workunit.client.0.vm03.stdout:7/280: rmdir d4/da/d18/d22 39 2026-03-09T16:14:28.648 INFO:tasks.workunit.client.0.vm03.stdout:5/387: truncate d2/d7/d8/f36 363191 0 2026-03-09T16:14:28.652 INFO:tasks.workunit.client.0.vm03.stdout:0/354: rename d0/d7/d48/d32 to d0/d7/d75 0 2026-03-09T16:14:28.653 INFO:tasks.workunit.client.0.vm03.stdout:0/355: chown d0/l6b 3762927 1 2026-03-09T16:14:28.666 INFO:tasks.workunit.client.0.vm03.stdout:3/302: mkdir d5/d1e/d42/d55 0 2026-03-09T16:14:28.673 INFO:tasks.workunit.client.0.vm03.stdout:4/339: chown d5/db/f5d 3 1 2026-03-09T16:14:28.677 INFO:tasks.workunit.client.0.vm03.stdout:1/246: mknod d4/c5e 0 2026-03-09T16:14:28.681 INFO:tasks.workunit.client.0.vm03.stdout:7/281: dwrite d4/da/d18/d22/d24/d16/f39 [0,4194304] 0 2026-03-09T16:14:28.695 INFO:tasks.workunit.client.0.vm03.stdout:6/317: creat d9/f5c x:0 0 0 2026-03-09T16:14:28.707 INFO:tasks.workunit.client.0.vm03.stdout:0/356: dread d0/d7/d3e/d57/d5a/d52/f68 [0,4194304] 0 2026-03-09T16:14:28.730 INFO:tasks.workunit.client.0.vm03.stdout:6/318: sync 2026-03-09T16:14:28.731 INFO:tasks.workunit.client.0.vm03.stdout:6/319: fdatasync d9/f20 0 2026-03-09T16:14:28.731 INFO:tasks.workunit.client.0.vm03.stdout:6/320: dread - d9/d42/d45/d50/f51 zero size 2026-03-09T16:14:28.732 INFO:tasks.workunit.client.0.vm03.stdout:6/321: truncate d9/d42/d45/f4a 1000328 0 2026-03-09T16:14:28.735 INFO:tasks.workunit.client.0.vm03.stdout:6/322: write d9/f3b [947067,25349] 0 2026-03-09T16:14:28.745 INFO:tasks.workunit.client.0.vm03.stdout:6/323: dwrite d9/f5c [0,4194304] 0 2026-03-09T16:14:28.746 INFO:tasks.workunit.client.0.vm03.stdout:6/324: readlink d9/l46 0 2026-03-09T16:14:28.784 INFO:tasks.workunit.client.0.vm03.stdout:9/392: truncate d2/d4/f3e 344353 0 2026-03-09T16:14:28.799 INFO:tasks.workunit.client.0.vm03.stdout:7/282: rename d4/dc/d53 to d4/dc/d61 0 2026-03-09T16:14:28.802 INFO:tasks.workunit.client.0.vm03.stdout:8/341: link da/db/d30/c48 da/d10/d28/c6d 0 2026-03-09T16:14:28.804 INFO:tasks.workunit.client.0.vm03.stdout:0/357: creat d0/d7/d3e/d45/f76 x:0 0 0 2026-03-09T16:14:28.810 INFO:tasks.workunit.client.0.vm03.stdout:6/325: symlink d9/d42/d45/l5d 0 2026-03-09T16:14:28.811 INFO:tasks.workunit.client.0.vm03.stdout:6/326: chown d9/d22/f27 27119723 1 2026-03-09T16:14:28.819 INFO:tasks.workunit.client.0.vm03.stdout:4/340: symlink d5/dd/d1f/l67 0 2026-03-09T16:14:28.820 INFO:tasks.workunit.client.0.vm03.stdout:4/341: chown d5/dd/d1f/f5e 44641931 1 2026-03-09T16:14:28.820 INFO:tasks.workunit.client.0.vm03.stdout:4/342: chown d5/fa 12109442 1 2026-03-09T16:14:28.826 INFO:tasks.workunit.client.0.vm03.stdout:4/343: dread d5/db/d25/f26 [0,4194304] 0 2026-03-09T16:14:28.843 INFO:tasks.workunit.client.0.vm03.stdout:2/332: creat db/d59/f7b x:0 0 0 2026-03-09T16:14:28.845 INFO:tasks.workunit.client.0.vm03.stdout:1/247: chown d4/d6/d3b/d4a/f57 1720782 1 2026-03-09T16:14:28.851 INFO:tasks.workunit.client.0.vm03.stdout:5/388: link d2/d7/d8/d24/d27/l6a d2/d75/l8a 0 2026-03-09T16:14:28.858 INFO:tasks.workunit.client.0.vm03.stdout:8/342: creat da/d10/f6e x:0 0 0 2026-03-09T16:14:28.860 INFO:tasks.workunit.client.0.vm03.stdout:0/358: creat d0/d7/d3e/d45/f77 x:0 0 0 2026-03-09T16:14:28.868 INFO:tasks.workunit.client.0.vm03.stdout:3/303: creat d5/d44/f56 x:0 0 0 2026-03-09T16:14:28.869 
INFO:tasks.workunit.client.0.vm03.stdout:6/327: rmdir d9/d42/d45 39 2026-03-09T16:14:28.870 INFO:tasks.workunit.client.0.vm03.stdout:6/328: write d9/d22/f24 [5440320,73559] 0 2026-03-09T16:14:28.879 INFO:tasks.workunit.client.0.vm03.stdout:7/283: unlink d4/da/d18/d22/d24/f2f 0 2026-03-09T16:14:28.879 INFO:tasks.workunit.client.0.vm03.stdout:7/284: dread - d4/da/d18/d22/d24/d16/d2b/f5a zero size 2026-03-09T16:14:28.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:28 vm03.local ceph-mon[51019]: pgmap v6: 65 pgs: 65 active+clean; 541 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:14:28.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:28 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:28.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:28 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:28.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:28 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:28.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:28 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:14:28.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:28 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:14:28.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:28 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:14:28.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:28 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:14:28.894 INFO:tasks.workunit.client.0.vm03.stdout:7/285: dread d4/da/f20 [0,4194304] 0 2026-03-09T16:14:28.897 INFO:tasks.workunit.client.0.vm03.stdout:8/343: readlink da/d10/d28/l50 0 2026-03-09T16:14:28.899 INFO:tasks.workunit.client.0.vm03.stdout:7/286: dwrite d4/da/d45/d51/f5b [0,4194304] 0 2026-03-09T16:14:28.903 INFO:tasks.workunit.client.0.vm03.stdout:9/393: write d2/d4/d11/d12/f1e [3945076,76449] 0 2026-03-09T16:14:28.912 INFO:tasks.workunit.client.0.vm03.stdout:0/359: truncate d0/da/f1c 415665 0 2026-03-09T16:14:28.919 INFO:tasks.workunit.client.0.vm03.stdout:2/333: mknod db/d12/d2a/d61/d6d/c7c 0 2026-03-09T16:14:28.927 INFO:tasks.workunit.client.0.vm03.stdout:1/248: rename d4/d6/d3b/d4a to d4/d6/d1d/d20/d5f 0 2026-03-09T16:14:28.927 INFO:tasks.workunit.client.0.vm03.stdout:1/249: stat d4/d6/d1d/d20 0 2026-03-09T16:14:28.933 INFO:tasks.workunit.client.0.vm03.stdout:3/304: write d5/f16 [298896,122671] 0 2026-03-09T16:14:28.942 INFO:tasks.workunit.client.0.vm03.stdout:5/389: link d2/d7/d8/f86 d2/d7/de/d33/f8b 0 2026-03-09T16:14:28.960 INFO:tasks.workunit.client.0.vm03.stdout:9/394: symlink d2/de/l79 0 2026-03-09T16:14:28.961 INFO:tasks.workunit.client.0.vm03.stdout:6/329: write d9/d14/f3d [372239,2531] 0 2026-03-09T16:14:28.962 INFO:tasks.workunit.client.0.vm03.stdout:2/334: dread db/f34 [0,4194304] 0 2026-03-09T16:14:28.968 INFO:tasks.workunit.client.0.vm03.stdout:1/250: creat d4/db/f60 x:0 0 0 2026-03-09T16:14:28.968 
INFO:tasks.workunit.client.0.vm03.stdout:2/335: dread db/d59/f3f [0,4194304] 0 2026-03-09T16:14:28.972 INFO:tasks.workunit.client.0.vm03.stdout:9/395: mkdir d2/d4/d11/d29/d63/d7a 0 2026-03-09T16:14:28.977 INFO:tasks.workunit.client.0.vm03.stdout:4/344: getdents d5/db/d25 0 2026-03-09T16:14:28.977 INFO:tasks.workunit.client.0.vm03.stdout:4/345: fsync d5/d17/d44/f64 0 2026-03-09T16:14:28.978 INFO:tasks.workunit.client.0.vm03.stdout:2/336: dwrite db/d12/d2a/d61/f5c [0,4194304] 0 2026-03-09T16:14:28.988 INFO:tasks.workunit.client.0.vm03.stdout:6/330: chown d9/d42/d45/d47/c52 0 1 2026-03-09T16:14:28.989 INFO:tasks.workunit.client.0.vm03.stdout:6/331: dread - d9/d42/d45/d50/f51 zero size 2026-03-09T16:14:28.990 INFO:tasks.workunit.client.0.vm03.stdout:2/337: dread db/d12/f4b [0,4194304] 0 2026-03-09T16:14:28.990 INFO:tasks.workunit.client.0.vm03.stdout:6/332: fdatasync d9/d22/f27 0 2026-03-09T16:14:28.997 INFO:tasks.workunit.client.0.vm03.stdout:3/305: creat d5/d1e/d42/d55/f57 x:0 0 0 2026-03-09T16:14:29.000 INFO:tasks.workunit.client.0.vm03.stdout:2/338: dwrite db/d59/f3f [0,4194304] 0 2026-03-09T16:14:29.000 INFO:tasks.workunit.client.0.vm03.stdout:8/344: link da/db/d43/c4e da/d10/d28/d64/c6f 0 2026-03-09T16:14:29.007 INFO:tasks.workunit.client.0.vm03.stdout:7/287: rename d4/da/d18/d22/d24/d15/l27 to d4/da/d18/l62 0 2026-03-09T16:14:29.008 INFO:tasks.workunit.client.0.vm03.stdout:6/333: mknod d9/d42/d45/c5e 0 2026-03-09T16:14:29.010 INFO:tasks.workunit.client.0.vm03.stdout:9/396: mknod d2/d54/d6d/c7b 0 2026-03-09T16:14:29.011 INFO:tasks.workunit.client.0.vm03.stdout:6/334: chown d9/d22/f3f 39510 1 2026-03-09T16:14:29.016 INFO:tasks.workunit.client.0.vm03.stdout:1/251: mknod d4/d6/c61 0 2026-03-09T16:14:29.021 INFO:tasks.workunit.client.0.vm03.stdout:9/397: dwrite d2/d4/d11/d12/f50 [0,4194304] 0 2026-03-09T16:14:29.021 INFO:tasks.workunit.client.0.vm03.stdout:8/345: symlink da/db/d30/l70 0 2026-03-09T16:14:29.021 INFO:tasks.workunit.client.0.vm03.stdout:2/339: symlink db/d12/d2a/d61/d6d/l7d 0 2026-03-09T16:14:29.025 INFO:tasks.workunit.client.0.vm03.stdout:0/360: getdents d0/d7/d3e/d5d 0 2026-03-09T16:14:29.025 INFO:tasks.workunit.client.0.vm03.stdout:4/346: symlink d5/l68 0 2026-03-09T16:14:29.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:28 vm05.local ceph-mon[58702]: pgmap v6: 65 pgs: 65 active+clean; 541 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:14:29.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:28 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:29.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:28 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:29.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:28 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:29.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:28 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:14:29.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:28 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:14:29.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:28 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' 
entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:14:29.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:28 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:14:29.036 INFO:tasks.workunit.client.0.vm03.stdout:9/398: truncate d2/d4/d11/f41 527172 0 2026-03-09T16:14:29.036 INFO:tasks.workunit.client.0.vm03.stdout:1/252: write d4/d39/f5a [1359586,117847] 0 2026-03-09T16:14:29.038 INFO:tasks.workunit.client.0.vm03.stdout:2/340: fdatasync db/d12/f57 0 2026-03-09T16:14:29.043 INFO:tasks.workunit.client.0.vm03.stdout:0/361: mkdir d0/d7/d75/d78 0 2026-03-09T16:14:29.045 INFO:tasks.workunit.client.0.vm03.stdout:4/347: unlink d5/db/f3a 0 2026-03-09T16:14:29.046 INFO:tasks.workunit.client.0.vm03.stdout:6/335: mknod d9/d42/c5f 0 2026-03-09T16:14:29.048 INFO:tasks.workunit.client.0.vm03.stdout:6/336: write d9/d22/f37 [175525,99946] 0 2026-03-09T16:14:29.049 INFO:tasks.workunit.client.0.vm03.stdout:6/337: fsync d9/d22/f43 0 2026-03-09T16:14:29.052 INFO:tasks.workunit.client.0.vm03.stdout:8/346: mknod da/d10/d28/d64/c71 0 2026-03-09T16:14:29.058 INFO:tasks.workunit.client.0.vm03.stdout:5/390: write d2/d7/de/d11/f32 [746132,76792] 0 2026-03-09T16:14:29.068 INFO:tasks.workunit.client.0.vm03.stdout:7/288: write d4/da/d18/d22/f33 [502133,130831] 0 2026-03-09T16:14:29.069 INFO:tasks.workunit.client.0.vm03.stdout:7/289: fdatasync d4/da/d18/d22/d24/d16/d2b/f56 0 2026-03-09T16:14:29.069 INFO:tasks.workunit.client.0.vm03.stdout:7/290: readlink d4/dc/l17 0 2026-03-09T16:14:29.069 INFO:tasks.workunit.client.0.vm03.stdout:7/291: dread - d4/da/f5f zero size 2026-03-09T16:14:29.080 INFO:tasks.workunit.client.0.vm03.stdout:9/399: dwrite d2/d4/d11/d12/f45 [0,4194304] 0 2026-03-09T16:14:29.083 INFO:tasks.workunit.client.0.vm03.stdout:3/306: rename d5/d44/d3c to d5/d58 0 2026-03-09T16:14:29.084 INFO:tasks.workunit.client.0.vm03.stdout:0/362: mkdir d0/da/d1b/d79 0 2026-03-09T16:14:29.084 INFO:tasks.workunit.client.0.vm03.stdout:1/253: unlink d4/d6/c37 0 2026-03-09T16:14:29.085 INFO:tasks.workunit.client.0.vm03.stdout:3/307: dread - d5/d44/f54 zero size 2026-03-09T16:14:29.091 INFO:tasks.workunit.client.0.vm03.stdout:8/347: mknod da/d32/c72 0 2026-03-09T16:14:29.093 INFO:tasks.workunit.client.0.vm03.stdout:7/292: creat d4/da/d45/f63 x:0 0 0 2026-03-09T16:14:29.100 INFO:tasks.workunit.client.0.vm03.stdout:1/254: chown d4/d6/f19 83 1 2026-03-09T16:14:29.105 INFO:tasks.workunit.client.0.vm03.stdout:3/308: symlink d5/d1e/d42/d34/l59 0 2026-03-09T16:14:29.105 INFO:tasks.workunit.client.0.vm03.stdout:8/348: chown da/f35 1545 1 2026-03-09T16:14:29.110 INFO:tasks.workunit.client.0.vm03.stdout:4/348: getdents d5/db/d25 0 2026-03-09T16:14:29.112 INFO:tasks.workunit.client.0.vm03.stdout:1/255: read d4/f1b [1872771,65385] 0 2026-03-09T16:14:29.112 INFO:tasks.workunit.client.0.vm03.stdout:1/256: dread - d4/d6/d1d/d20/d5f/f57 zero size 2026-03-09T16:14:29.120 INFO:tasks.workunit.client.0.vm03.stdout:0/363: mkdir d0/da/d7a 0 2026-03-09T16:14:29.122 INFO:tasks.workunit.client.0.vm03.stdout:9/400: truncate d2/d4/d11/d12/f50 743527 0 2026-03-09T16:14:29.126 INFO:tasks.workunit.client.0.vm03.stdout:3/309: mkdir d5/d58/d5a 0 2026-03-09T16:14:29.129 INFO:tasks.workunit.client.0.vm03.stdout:3/310: dread d5/f33 [0,4194304] 0 2026-03-09T16:14:29.130 INFO:tasks.workunit.client.0.vm03.stdout:5/391: getdents d2/d7/d3c 0 2026-03-09T16:14:29.132 
INFO:tasks.workunit.client.0.vm03.stdout:5/392: truncate d2/f7f 1001664 0 2026-03-09T16:14:29.136 INFO:tasks.workunit.client.0.vm03.stdout:2/341: rename db/d12/f72 to db/d59/f7e 0 2026-03-09T16:14:29.140 INFO:tasks.workunit.client.0.vm03.stdout:8/349: creat da/d10/d63/f73 x:0 0 0 2026-03-09T16:14:29.148 INFO:tasks.workunit.client.0.vm03.stdout:5/393: dwrite d2/d7/d1a/d1c/f5e [0,4194304] 0 2026-03-09T16:14:29.156 INFO:tasks.workunit.client.0.vm03.stdout:0/364: creat d0/d7/d3e/d57/d5a/d47/f7b x:0 0 0 2026-03-09T16:14:29.156 INFO:tasks.workunit.client.0.vm03.stdout:8/350: creat da/d1d/f74 x:0 0 0 2026-03-09T16:14:29.156 INFO:tasks.workunit.client.0.vm03.stdout:3/311: truncate d5/d1e/f31 1691159 0 2026-03-09T16:14:29.156 INFO:tasks.workunit.client.0.vm03.stdout:3/312: chown d5/d44/f54 3116487 1 2026-03-09T16:14:29.156 INFO:tasks.workunit.client.0.vm03.stdout:1/257: link d4/f1b d4/d6/d1d/d20/d23/f62 0 2026-03-09T16:14:29.156 INFO:tasks.workunit.client.0.vm03.stdout:1/258: write d4/d6/d1d/d24/d25/f4e [2478690,50942] 0 2026-03-09T16:14:29.159 INFO:tasks.workunit.client.0.vm03.stdout:9/401: sync 2026-03-09T16:14:29.159 INFO:tasks.workunit.client.0.vm03.stdout:2/342: sync 2026-03-09T16:14:29.159 INFO:tasks.workunit.client.0.vm03.stdout:2/343: stat f5 0 2026-03-09T16:14:29.169 INFO:tasks.workunit.client.0.vm03.stdout:5/394: dwrite d2/d7/de/f78 [0,4194304] 0 2026-03-09T16:14:29.182 INFO:tasks.workunit.client.0.vm03.stdout:2/344: fsync db/d12/f21 0 2026-03-09T16:14:29.182 INFO:tasks.workunit.client.0.vm03.stdout:1/259: stat d4/fd 0 2026-03-09T16:14:29.182 INFO:tasks.workunit.client.0.vm03.stdout:2/345: fsync db/d12/f69 0 2026-03-09T16:14:29.183 INFO:tasks.workunit.client.0.vm03.stdout:0/365: symlink d0/d7/d3e/d57/d5a/d52/l7c 0 2026-03-09T16:14:29.189 INFO:tasks.workunit.client.0.vm03.stdout:5/395: mknod d2/d7/d3c/c8c 0 2026-03-09T16:14:29.189 INFO:tasks.workunit.client.0.vm03.stdout:6/338: rename d9/d14/l55 to d9/d42/l60 0 2026-03-09T16:14:29.200 INFO:tasks.workunit.client.0.vm03.stdout:0/366: truncate d0/d7/f3d 1329084 0 2026-03-09T16:14:29.200 INFO:tasks.workunit.client.0.vm03.stdout:2/346: sync 2026-03-09T16:14:29.201 INFO:tasks.workunit.client.0.vm03.stdout:6/339: dwrite d9/f40 [0,4194304] 0 2026-03-09T16:14:29.202 INFO:tasks.workunit.client.0.vm03.stdout:3/313: rename d5/c3d to d5/d1e/d42/d34/c5b 0 2026-03-09T16:14:29.207 INFO:tasks.workunit.client.0.vm03.stdout:8/351: dread da/d10/d28/f2c [0,4194304] 0 2026-03-09T16:14:29.209 INFO:tasks.workunit.client.0.vm03.stdout:2/347: rmdir db/d59/d64/d68 39 2026-03-09T16:14:29.210 INFO:tasks.workunit.client.0.vm03.stdout:8/352: write f8 [3276183,116586] 0 2026-03-09T16:14:29.212 INFO:tasks.workunit.client.0.vm03.stdout:8/353: stat da/d32/c41 0 2026-03-09T16:14:29.212 INFO:tasks.workunit.client.0.vm03.stdout:2/348: read db/d59/f53 [922058,36354] 0 2026-03-09T16:14:29.215 INFO:tasks.workunit.client.0.vm03.stdout:8/354: sync 2026-03-09T16:14:29.215 INFO:tasks.workunit.client.0.vm03.stdout:3/314: dread d5/d1e/d42/f25 [0,4194304] 0 2026-03-09T16:14:29.217 INFO:tasks.workunit.client.0.vm03.stdout:6/340: rmdir d9/d22 39 2026-03-09T16:14:29.218 INFO:tasks.workunit.client.0.vm03.stdout:5/396: rename d2/d7/d8/d16/d5c/f71 to d2/d7/de/d11/d19/d31/d35/d87/f8d 0 2026-03-09T16:14:29.218 INFO:tasks.workunit.client.0.vm03.stdout:8/355: write da/db/fe [775667,3281] 0 2026-03-09T16:14:29.222 INFO:tasks.workunit.client.0.vm03.stdout:1/260: dread f1 [0,4194304] 0 2026-03-09T16:14:29.229 INFO:tasks.workunit.client.0.vm03.stdout:5/397: sync 2026-03-09T16:14:29.229 
INFO:tasks.workunit.client.0.vm03.stdout:1/261: dread d4/db/f47 [0,4194304] 0 2026-03-09T16:14:29.232 INFO:tasks.workunit.client.0.vm03.stdout:5/398: write d2/d7/de/d54/f4a [730474,25365] 0 2026-03-09T16:14:29.233 INFO:tasks.workunit.client.0.vm03.stdout:5/399: chown d2/d7/de/d11/c6 269848331 1 2026-03-09T16:14:29.258 INFO:tasks.workunit.client.0.vm03.stdout:1/262: mkdir d4/d6/d3b/d63 0 2026-03-09T16:14:29.260 INFO:tasks.workunit.client.0.vm03.stdout:4/349: write d5/d17/d44/f4a [3625180,37917] 0 2026-03-09T16:14:29.260 INFO:tasks.workunit.client.0.vm03.stdout:7/293: dwrite d4/da/d18/f44 [4194304,4194304] 0 2026-03-09T16:14:29.266 INFO:tasks.workunit.client.0.vm03.stdout:4/350: chown d5/dd/f23 2 1 2026-03-09T16:14:29.271 INFO:tasks.workunit.client.0.vm03.stdout:4/351: write d5/dd/f22 [4908777,89072] 0 2026-03-09T16:14:29.272 INFO:tasks.workunit.client.0.vm03.stdout:4/352: chown d5/dd/d1f/f60 975933777 1 2026-03-09T16:14:29.285 INFO:tasks.workunit.client.0.vm03.stdout:3/315: rename d5/d1e/d42/d34/l50 to d5/d1e/d42/d4c/l5c 0 2026-03-09T16:14:29.285 INFO:tasks.workunit.client.0.vm03.stdout:3/316: readlink d5/d58/l28 0 2026-03-09T16:14:29.288 INFO:tasks.workunit.client.0.vm03.stdout:2/349: creat db/d12/d2a/d61/d79/f7f x:0 0 0 2026-03-09T16:14:29.290 INFO:tasks.workunit.client.0.vm03.stdout:2/350: write db/d59/f7b [803462,90707] 0 2026-03-09T16:14:29.313 INFO:tasks.workunit.client.0.vm03.stdout:0/367: rename d0/d7/l5b to d0/d7/d3e/d57/d5a/d52/l7d 0 2026-03-09T16:14:29.318 INFO:tasks.workunit.client.0.vm03.stdout:0/368: dwrite d0/da/d1b/fd [0,4194304] 0 2026-03-09T16:14:29.328 INFO:tasks.workunit.client.0.vm03.stdout:9/402: write d2/d4/f17 [74276,12977] 0 2026-03-09T16:14:29.331 INFO:tasks.workunit.client.0.vm03.stdout:9/403: truncate d2/d4/d11/d29/d63/f6f 528805 0 2026-03-09T16:14:29.342 INFO:tasks.workunit.client.0.vm03.stdout:7/294: mknod d4/da/c64 0 2026-03-09T16:14:29.358 INFO:tasks.workunit.client.0.vm03.stdout:8/356: write da/d15/f1b [3125243,46200] 0 2026-03-09T16:14:29.359 INFO:tasks.workunit.client.0.vm03.stdout:8/357: write da/d10/d63/f73 [696998,9857] 0 2026-03-09T16:14:29.362 INFO:tasks.workunit.client.0.vm03.stdout:8/358: stat da/d32/f61 0 2026-03-09T16:14:29.363 INFO:tasks.workunit.client.0.vm03.stdout:8/359: readlink da/d32/l37 0 2026-03-09T16:14:29.364 INFO:tasks.workunit.client.0.vm03.stdout:8/360: write da/db/d30/f62 [411789,78441] 0 2026-03-09T16:14:29.366 INFO:tasks.workunit.client.0.vm03.stdout:8/361: write da/d1d/f4a [2718189,27048] 0 2026-03-09T16:14:29.369 INFO:tasks.workunit.client.0.vm03.stdout:8/362: fsync da/db/f44 0 2026-03-09T16:14:29.377 INFO:tasks.workunit.client.0.vm03.stdout:5/400: write d2/d7/de/d33/f8b [766386,128253] 0 2026-03-09T16:14:29.388 INFO:tasks.workunit.client.0.vm03.stdout:5/401: dread d2/d7/d1a/f6e [0,4194304] 0 2026-03-09T16:14:29.389 INFO:tasks.workunit.client.0.vm03.stdout:7/295: symlink d4/da/d18/d22/l65 0 2026-03-09T16:14:29.390 INFO:tasks.workunit.client.0.vm03.stdout:5/402: read d2/d7/de/d11/d38/d3b/f88 [209894,130425] 0 2026-03-09T16:14:29.400 INFO:tasks.workunit.client.0.vm03.stdout:4/353: write d5/db/d25/f4e [2517248,73483] 0 2026-03-09T16:14:29.404 INFO:tasks.workunit.client.0.vm03.stdout:9/404: symlink d2/d4/d11/l7c 0 2026-03-09T16:14:29.405 INFO:tasks.workunit.client.0.vm03.stdout:1/263: dwrite f1 [0,4194304] 0 2026-03-09T16:14:29.412 INFO:tasks.workunit.client.0.vm03.stdout:1/264: chown d4/d6/d1d/d20/d23/f28 2819 1 2026-03-09T16:14:29.417 INFO:tasks.workunit.client.0.vm03.stdout:8/363: fsync da/db/d30/f36 0 2026-03-09T16:14:29.421 
INFO:tasks.workunit.client.0.vm03.stdout:8/364: dread da/d1d/f5e [0,4194304] 0 2026-03-09T16:14:29.423 INFO:tasks.workunit.client.0.vm03.stdout:3/317: creat d5/d44/f5d x:0 0 0 2026-03-09T16:14:29.423 INFO:tasks.workunit.client.0.vm03.stdout:2/351: creat db/d59/d64/f80 x:0 0 0 2026-03-09T16:14:29.423 INFO:tasks.workunit.client.0.vm03.stdout:2/352: stat db/d59/d64/f66 0 2026-03-09T16:14:29.426 INFO:tasks.workunit.client.0.vm03.stdout:7/296: unlink d4/da/d18/d22/d24/d16/f39 0 2026-03-09T16:14:29.427 INFO:tasks.workunit.client.0.vm03.stdout:5/403: creat d2/d7/de/d11/d19/f8e x:0 0 0 2026-03-09T16:14:29.427 INFO:tasks.workunit.client.0.vm03.stdout:7/297: write d4/da/d45/f58 [995490,53688] 0 2026-03-09T16:14:29.432 INFO:tasks.workunit.client.0.vm03.stdout:5/404: dwrite d2/d7/de/d54/f4a [0,4194304] 0 2026-03-09T16:14:29.439 INFO:tasks.workunit.client.0.vm03.stdout:5/405: chown d2/d7/de/d54/l76 0 1 2026-03-09T16:14:29.440 INFO:tasks.workunit.client.0.vm03.stdout:7/298: dwrite d4/da/d45/d51/f5b [0,4194304] 0 2026-03-09T16:14:29.443 INFO:tasks.workunit.client.0.vm03.stdout:3/318: sync 2026-03-09T16:14:29.445 INFO:tasks.workunit.client.0.vm03.stdout:8/365: sync 2026-03-09T16:14:29.450 INFO:tasks.workunit.client.0.vm03.stdout:3/319: write d5/d1e/d42/f2c [70748,52262] 0 2026-03-09T16:14:29.453 INFO:tasks.workunit.client.0.vm03.stdout:5/406: dwrite d2/d7/d1a/d1c/d6c/f6d [0,4194304] 0 2026-03-09T16:14:29.454 INFO:tasks.workunit.client.0.vm03.stdout:6/341: rename d9/d42/d45/l5d to d9/d14/l61 0 2026-03-09T16:14:29.467 INFO:tasks.workunit.client.0.vm03.stdout:9/405: mkdir d2/d54/d7d 0 2026-03-09T16:14:29.469 INFO:tasks.workunit.client.0.vm03.stdout:1/265: mknod d4/db/c64 0 2026-03-09T16:14:29.473 INFO:tasks.workunit.client.0.vm03.stdout:8/366: sync 2026-03-09T16:14:29.484 INFO:tasks.workunit.client.0.vm03.stdout:2/353: creat db/d12/d2a/d61/d6d/f81 x:0 0 0 2026-03-09T16:14:29.484 INFO:tasks.workunit.client.0.vm03.stdout:3/320: dread d5/fb [0,4194304] 0 2026-03-09T16:14:29.493 INFO:tasks.workunit.client.0.vm03.stdout:7/299: dread d4/f26 [0,4194304] 0 2026-03-09T16:14:29.495 INFO:tasks.workunit.client.0.vm03.stdout:7/300: read - d4/da/d45/f4e zero size 2026-03-09T16:14:29.496 INFO:tasks.workunit.client.0.vm03.stdout:0/369: dwrite d0/d7/d75/f69 [0,4194304] 0 2026-03-09T16:14:29.497 INFO:tasks.workunit.client.0.vm03.stdout:0/370: write d0/da/d1b/f46 [1599087,88100] 0 2026-03-09T16:14:29.509 INFO:tasks.workunit.client.0.vm03.stdout:5/407: unlink d2/d7/de/d54/f4a 0 2026-03-09T16:14:29.510 INFO:tasks.workunit.client.0.vm03.stdout:5/408: fsync d2/d7/d3c/d3d/f56 0 2026-03-09T16:14:29.513 INFO:tasks.workunit.client.0.vm03.stdout:4/354: rename d5/db/d25/d31/f47 to d5/db/d25/d31/d33/f69 0 2026-03-09T16:14:29.516 INFO:tasks.workunit.client.0.vm03.stdout:1/266: write d4/d6/f15 [2077324,128273] 0 2026-03-09T16:14:29.519 INFO:tasks.workunit.client.0.vm03.stdout:1/267: dwrite d4/fa [0,4194304] 0 2026-03-09T16:14:29.526 INFO:tasks.workunit.client.0.vm03.stdout:5/409: sync 2026-03-09T16:14:29.526 INFO:tasks.workunit.client.0.vm03.stdout:5/410: readlink d2/d7/d8/l46 0 2026-03-09T16:14:29.528 INFO:tasks.workunit.client.0.vm03.stdout:5/411: read d2/d7/d1a/f4d [3775493,67086] 0 2026-03-09T16:14:29.535 INFO:tasks.workunit.client.0.vm03.stdout:3/321: rename d5/d58/l2d to d5/d58/l5e 0 2026-03-09T16:14:29.544 INFO:tasks.workunit.client.0.vm03.stdout:7/301: chown d4/dc/d61/c5e 58505402 1 2026-03-09T16:14:29.544 INFO:tasks.workunit.client.0.vm03.stdout:4/355: write d5/db/d25/f26 [3043449,2003] 0 2026-03-09T16:14:29.544 
INFO:tasks.workunit.client.0.vm03.stdout:6/342: creat d9/d22/f62 x:0 0 0 2026-03-09T16:14:29.544 INFO:tasks.workunit.client.0.vm03.stdout:5/412: mknod d2/d7/d8/d16/c8f 0 2026-03-09T16:14:29.547 INFO:tasks.workunit.client.0.vm03.stdout:2/354: symlink db/d59/d64/l82 0 2026-03-09T16:14:29.548 INFO:tasks.workunit.client.0.vm03.stdout:3/322: read d5/d1e/f26 [413721,43774] 0 2026-03-09T16:14:29.552 INFO:tasks.workunit.client.0.vm03.stdout:7/302: mkdir d4/da/d45/d51/d36/d66 0 2026-03-09T16:14:29.557 INFO:tasks.workunit.client.0.vm03.stdout:6/343: mkdir d9/d42/d45/d63 0 2026-03-09T16:14:29.562 INFO:tasks.workunit.client.0.vm03.stdout:5/413: mkdir d2/d7/de/d11/d19/d29/d90 0 2026-03-09T16:14:29.564 INFO:tasks.workunit.client.0.vm03.stdout:6/344: sync 2026-03-09T16:14:29.566 INFO:tasks.workunit.client.0.vm03.stdout:8/367: creat da/db/f75 x:0 0 0 2026-03-09T16:14:29.577 INFO:tasks.workunit.client.0.vm03.stdout:2/355: dread db/f2d [0,4194304] 0 2026-03-09T16:14:29.578 INFO:tasks.workunit.client.0.vm03.stdout:2/356: readlink db/d12/d2a/l56 0 2026-03-09T16:14:29.579 INFO:tasks.workunit.client.0.vm03.stdout:0/371: write d0/d7/d48/f13 [404247,34502] 0 2026-03-09T16:14:29.580 INFO:tasks.workunit.client.0.vm03.stdout:0/372: stat d0/d7/d3e/d45/f76 0 2026-03-09T16:14:29.580 INFO:tasks.workunit.client.0.vm03.stdout:0/373: write d0/da/d1b/f73 [613634,36533] 0 2026-03-09T16:14:29.584 INFO:tasks.workunit.client.0.vm03.stdout:3/323: creat d5/d1e/d42/d34/f5f x:0 0 0 2026-03-09T16:14:29.585 INFO:tasks.workunit.client.0.vm03.stdout:9/406: truncate d2/d4/f17 425509 0 2026-03-09T16:14:29.591 INFO:tasks.workunit.client.0.vm03.stdout:0/374: creat d0/d7/d3e/d57/d5a/d74/f7e x:0 0 0 2026-03-09T16:14:29.593 INFO:tasks.workunit.client.0.vm03.stdout:3/324: fdatasync d5/f2b 0 2026-03-09T16:14:29.593 INFO:tasks.workunit.client.0.vm03.stdout:3/325: chown d5/lf 331923015 1 2026-03-09T16:14:29.595 INFO:tasks.workunit.client.0.vm03.stdout:9/407: fdatasync d2/d4/d11/d29/f4e 0 2026-03-09T16:14:29.596 INFO:tasks.workunit.client.0.vm03.stdout:9/408: truncate d2/d4/d11/d12/f35 66661 0 2026-03-09T16:14:29.597 INFO:tasks.workunit.client.0.vm03.stdout:1/268: getdents d4/db 0 2026-03-09T16:14:29.598 INFO:tasks.workunit.client.0.vm03.stdout:5/414: mknod d2/d7/de/d11/d19/d29/d90/c91 0 2026-03-09T16:14:29.601 INFO:tasks.workunit.client.0.vm03.stdout:6/345: creat d9/d42/d45/d63/f64 x:0 0 0 2026-03-09T16:14:29.602 INFO:tasks.workunit.client.0.vm03.stdout:9/409: sync 2026-03-09T16:14:29.603 INFO:tasks.workunit.client.0.vm03.stdout:9/410: write d2/df/f42 [1743379,67998] 0 2026-03-09T16:14:29.609 INFO:tasks.workunit.client.0.vm03.stdout:4/356: write d5/db/f34 [620256,109616] 0 2026-03-09T16:14:29.610 INFO:tasks.workunit.client.0.vm03.stdout:8/368: write da/db/f34 [247877,29186] 0 2026-03-09T16:14:29.612 INFO:tasks.workunit.client.0.vm03.stdout:4/357: dread d5/d17/f39 [0,4194304] 0 2026-03-09T16:14:29.612 INFO:tasks.workunit.client.0.vm03.stdout:2/357: rmdir db/d59/d52 39 2026-03-09T16:14:29.615 INFO:tasks.workunit.client.0.vm03.stdout:2/358: dread db/d12/f69 [0,4194304] 0 2026-03-09T16:14:29.637 INFO:tasks.workunit.client.0.vm03.stdout:7/303: creat d4/da/d18/d22/d24/d16/f67 x:0 0 0 2026-03-09T16:14:29.637 INFO:tasks.workunit.client.0.vm03.stdout:7/304: readlink d4/da/d45/l49 0 2026-03-09T16:14:29.638 INFO:tasks.workunit.client.0.vm03.stdout:7/305: write d4/da/d18/d22/d24/d15/f2a [3509575,19151] 0 2026-03-09T16:14:29.639 INFO:tasks.workunit.client.0.vm03.stdout:7/306: fsync d4/da/d45/f63 0 2026-03-09T16:14:29.641 
INFO:tasks.workunit.client.0.vm03.stdout:7/307: dwrite d4/da/d18/d22/f33 [0,4194304] 0 2026-03-09T16:14:29.650 INFO:tasks.workunit.client.0.vm03.stdout:5/415: unlink d2/l22 0 2026-03-09T16:14:29.651 INFO:tasks.workunit.client.0.vm03.stdout:6/346: mkdir d9/d42/d45/d65 0 2026-03-09T16:14:29.653 INFO:tasks.workunit.client.0.vm03.stdout:5/416: dwrite d2/d7/d1a/d1c/d6c/f6d [4194304,4194304] 0 2026-03-09T16:14:29.653 INFO:tasks.workunit.client.0.vm03.stdout:5/417: write d2/d7/d8/f7a [529224,75057] 0 2026-03-09T16:14:29.654 INFO:tasks.workunit.client.0.vm03.stdout:5/418: chown d2/d7/de/d11/d19/d29/f77 91 1 2026-03-09T16:14:29.658 INFO:tasks.workunit.client.0.vm03.stdout:8/369: rename da/d10/f14 to da/db/d30/f76 0 2026-03-09T16:14:29.659 INFO:tasks.workunit.client.0.vm03.stdout:8/370: chown da/d10/d28/f2c 3182958 1 2026-03-09T16:14:29.660 INFO:tasks.workunit.client.0.vm03.stdout:8/371: chown da/f52 3 1 2026-03-09T16:14:29.667 INFO:tasks.workunit.client.0.vm03.stdout:6/347: rmdir d9/d14 39 2026-03-09T16:14:29.669 INFO:tasks.workunit.client.0.vm03.stdout:5/419: creat d2/d7/d1a/d1c/d3f/f92 x:0 0 0 2026-03-09T16:14:29.670 INFO:tasks.workunit.client.0.vm03.stdout:5/420: chown d2/d7/de/d11/d19/d31/d35/d87 806 1 2026-03-09T16:14:29.671 INFO:tasks.workunit.client.0.vm03.stdout:5/421: write d2/f7f [139837,127145] 0 2026-03-09T16:14:29.674 INFO:tasks.workunit.client.0.vm03.stdout:2/359: rename db/d59 to db/d12/d2a/d61/d79/d83 0 2026-03-09T16:14:29.675 INFO:tasks.workunit.client.0.vm03.stdout:8/372: mknod da/d15/c77 0 2026-03-09T16:14:29.677 INFO:tasks.workunit.client.0.vm03.stdout:8/373: dread da/db/f1c [0,4194304] 0 2026-03-09T16:14:29.679 INFO:tasks.workunit.client.0.vm03.stdout:4/358: creat d5/d56/f6a x:0 0 0 2026-03-09T16:14:29.680 INFO:tasks.workunit.client.0.vm03.stdout:0/375: rmdir d0/d7/d75/d78 0 2026-03-09T16:14:29.681 INFO:tasks.workunit.client.0.vm03.stdout:3/326: link d5/l21 d5/d53/l60 0 2026-03-09T16:14:29.682 INFO:tasks.workunit.client.0.vm03.stdout:6/348: mkdir d9/d42/d45/d63/d66 0 2026-03-09T16:14:29.683 INFO:tasks.workunit.client.0.vm03.stdout:5/422: creat d2/d7/d3c/d3d/f93 x:0 0 0 2026-03-09T16:14:29.685 INFO:tasks.workunit.client.0.vm03.stdout:8/374: chown da/d10/f23 0 1 2026-03-09T16:14:29.686 INFO:tasks.workunit.client.0.vm03.stdout:0/376: unlink d0/da/d1b/f73 0 2026-03-09T16:14:29.691 INFO:tasks.workunit.client.0.vm03.stdout:7/308: getdents d4/da/d18/d22/d24/d16/d2b 0 2026-03-09T16:14:29.692 INFO:tasks.workunit.client.0.vm03.stdout:5/423: truncate d2/d7/d8/f36 929917 0 2026-03-09T16:14:29.692 INFO:tasks.workunit.client.0.vm03.stdout:5/424: readlink d2/d7/l89 0 2026-03-09T16:14:29.695 INFO:tasks.workunit.client.0.vm03.stdout:4/359: sync 2026-03-09T16:14:29.695 INFO:tasks.workunit.client.0.vm03.stdout:8/375: sync 2026-03-09T16:14:29.696 INFO:tasks.workunit.client.0.vm03.stdout:4/360: write d5/d17/d44/f64 [5106376,55100] 0 2026-03-09T16:14:29.698 INFO:tasks.workunit.client.0.vm03.stdout:2/360: rename db/d12/f21 to db/d12/f84 0 2026-03-09T16:14:29.699 INFO:tasks.workunit.client.0.vm03.stdout:4/361: sync 2026-03-09T16:14:29.700 INFO:tasks.workunit.client.0.vm03.stdout:0/377: mknod d0/da/d7a/c7f 0 2026-03-09T16:14:29.700 INFO:tasks.workunit.client.0.vm03.stdout:4/362: write d5/dd/d1f/f60 [549554,62812] 0 2026-03-09T16:14:29.701 INFO:tasks.workunit.client.0.vm03.stdout:0/378: dread d0/f1e [0,4194304] 0 2026-03-09T16:14:29.709 INFO:tasks.workunit.client.0.vm03.stdout:1/269: write d4/d6/d1d/d20/d23/f62 [2874535,34123] 0 2026-03-09T16:14:29.710 INFO:tasks.workunit.client.0.vm03.stdout:1/270: 
readlink d4/db/l55 0 2026-03-09T16:14:29.714 INFO:tasks.workunit.client.0.vm03.stdout:3/327: write d5/f33 [2012002,88989] 0 2026-03-09T16:14:29.718 INFO:tasks.workunit.client.0.vm03.stdout:7/309: symlink d4/da/d18/d22/d24/d16/d2b/l68 0 2026-03-09T16:14:29.721 INFO:tasks.workunit.client.0.vm03.stdout:7/310: dwrite d4/d2d/d4b/f4c [0,4194304] 0 2026-03-09T16:14:29.728 INFO:tasks.workunit.client.0.vm03.stdout:6/349: symlink d9/d14/l67 0 2026-03-09T16:14:29.729 INFO:tasks.workunit.client.0.vm03.stdout:6/350: stat d9/cd 0 2026-03-09T16:14:29.729 INFO:tasks.workunit.client.0.vm03.stdout:9/411: write d2/d4/f17 [1441673,8558] 0 2026-03-09T16:14:29.730 INFO:tasks.workunit.client.0.vm03.stdout:9/412: fdatasync d2/df/f42 0 2026-03-09T16:14:29.730 INFO:tasks.workunit.client.0.vm03.stdout:9/413: readlink d2/df/l16 0 2026-03-09T16:14:29.733 INFO:tasks.workunit.client.0.vm03.stdout:5/425: creat d2/d7/d8/d16/d5c/f94 x:0 0 0 2026-03-09T16:14:29.740 INFO:tasks.workunit.client.0.vm03.stdout:6/351: symlink d9/d22/l68 0 2026-03-09T16:14:29.745 INFO:tasks.workunit.client.0.vm03.stdout:9/414: rmdir d2/d4 39 2026-03-09T16:14:29.749 INFO:tasks.workunit.client.0.vm03.stdout:3/328: mkdir d5/d44/d61 0 2026-03-09T16:14:29.753 INFO:tasks.workunit.client.0.vm03.stdout:3/329: dwrite d5/d1e/d42/f29 [0,4194304] 0 2026-03-09T16:14:29.760 INFO:tasks.workunit.client.0.vm03.stdout:3/330: dwrite d5/f16 [0,4194304] 0 2026-03-09T16:14:29.763 INFO:tasks.workunit.client.0.vm03.stdout:3/331: stat d5/d58 0 2026-03-09T16:14:29.765 INFO:tasks.workunit.client.0.vm03.stdout:8/376: write da/d10/f33 [898372,128829] 0 2026-03-09T16:14:29.770 INFO:tasks.workunit.client.0.vm03.stdout:0/379: dwrite d0/da/d1b/f6e [0,4194304] 0 2026-03-09T16:14:29.773 INFO:tasks.workunit.client.0.vm03.stdout:4/363: dwrite d5/d17/f21 [0,4194304] 0 2026-03-09T16:14:29.773 INFO:tasks.workunit.client.0.vm03.stdout:0/380: chown d0/d7/d3e/f4f 3852 1 2026-03-09T16:14:29.777 INFO:tasks.workunit.client.0.vm03.stdout:4/364: chown d5/db/d25/f4e 125 1 2026-03-09T16:14:29.784 INFO:tasks.workunit.client.0.vm03.stdout:7/311: mkdir d4/da/d18/d22/d24/d16/d69 0 2026-03-09T16:14:29.786 INFO:tasks.workunit.client.0.vm03.stdout:9/415: creat d2/d54/d6d/f7e x:0 0 0 2026-03-09T16:14:29.788 INFO:tasks.workunit.client.0.vm03.stdout:5/426: mknod d2/d75/c95 0 2026-03-09T16:14:29.790 INFO:tasks.workunit.client.0.vm03.stdout:2/361: rename f5 to db/d12/f85 0 2026-03-09T16:14:29.792 INFO:tasks.workunit.client.0.vm03.stdout:3/332: unlink d5/d44/c48 0 2026-03-09T16:14:29.794 INFO:tasks.workunit.client.0.vm03.stdout:8/377: mknod da/d10/d28/d64/c78 0 2026-03-09T16:14:29.796 INFO:tasks.workunit.client.0.vm03.stdout:0/381: rmdir d0/d7/d48 39 2026-03-09T16:14:29.799 INFO:tasks.workunit.client.0.vm03.stdout:6/352: symlink d9/d42/d45/d65/l69 0 2026-03-09T16:14:29.807 INFO:tasks.workunit.client.0.vm03.stdout:6/353: chown d9/l5b 425 1 2026-03-09T16:14:29.807 INFO:tasks.workunit.client.0.vm03.stdout:5/427: symlink d2/d7/d1a/d1c/d6c/l96 0 2026-03-09T16:14:29.807 INFO:tasks.workunit.client.0.vm03.stdout:9/416: dread d2/f7 [0,4194304] 0 2026-03-09T16:14:29.807 INFO:tasks.workunit.client.0.vm03.stdout:9/417: chown d2/df 51 1 2026-03-09T16:14:29.807 INFO:tasks.workunit.client.0.vm03.stdout:3/333: symlink d5/d58/l62 0 2026-03-09T16:14:29.807 INFO:tasks.workunit.client.0.vm03.stdout:0/382: truncate d0/da/d5c/f31 1532597 0 2026-03-09T16:14:29.808 INFO:tasks.workunit.client.0.vm03.stdout:8/378: dread da/d32/f4d [0,4194304] 0 2026-03-09T16:14:29.809 INFO:tasks.workunit.client.0.vm03.stdout:8/379: fdatasync da/db/f34 
0 2026-03-09T16:14:29.809 INFO:tasks.workunit.client.0.vm03.stdout:0/383: dread d0/d7/d75/f69 [0,4194304] 0 2026-03-09T16:14:29.810 INFO:tasks.workunit.client.0.vm03.stdout:6/354: creat d9/d42/d45/d47/f6a x:0 0 0 2026-03-09T16:14:29.813 INFO:tasks.workunit.client.0.vm03.stdout:1/271: rename d4/d6/l10 to d4/d6/d1d/d20/d23/d3e/l65 0 2026-03-09T16:14:29.814 INFO:tasks.workunit.client.0.vm03.stdout:2/362: creat db/d12/d2a/d61/d79/d83/d52/f86 x:0 0 0 2026-03-09T16:14:29.814 INFO:tasks.workunit.client.0.vm03.stdout:2/363: write f0 [1041537,99945] 0 2026-03-09T16:14:29.817 INFO:tasks.workunit.client.0.vm03.stdout:7/312: creat d4/da/d18/f6a x:0 0 0 2026-03-09T16:14:29.831 INFO:tasks.workunit.client.0.vm03.stdout:9/418: stat d2/d4/d11/d29/d63/l78 0 2026-03-09T16:14:29.835 INFO:tasks.workunit.client.0.vm03.stdout:9/419: truncate d2/f5a 604032 0 2026-03-09T16:14:29.837 INFO:tasks.workunit.client.0.vm03.stdout:9/420: write d2/d4/f17 [138721,51222] 0 2026-03-09T16:14:29.853 INFO:tasks.workunit.client.0.vm03.stdout:8/380: rmdir da 39 2026-03-09T16:14:29.864 INFO:tasks.workunit.client.0.vm03.stdout:0/384: mknod d0/da/d1b/c80 0 2026-03-09T16:14:29.866 INFO:tasks.workunit.client.0.vm03.stdout:1/272: fsync d4/db/f2e 0 2026-03-09T16:14:29.870 INFO:tasks.workunit.client.0.vm03.stdout:3/334: mkdir d5/d58/d5a/d63 0 2026-03-09T16:14:29.870 INFO:tasks.workunit.client.0.vm03.stdout:2/364: chown db/d12/d2a/d61/c4e 206 1 2026-03-09T16:14:29.872 INFO:tasks.workunit.client.0.vm03.stdout:4/365: getdents d5 0 2026-03-09T16:14:29.872 INFO:tasks.workunit.client.0.vm03.stdout:4/366: chown d5/f9 39789 1 2026-03-09T16:14:29.873 INFO:tasks.workunit.client.0.vm03.stdout:4/367: fdatasync d5/dd/f22 0 2026-03-09T16:14:29.880 INFO:tasks.workunit.client.0.vm03.stdout:3/335: dwrite d5/d1e/d42/f29 [4194304,4194304] 0 2026-03-09T16:14:29.884 INFO:tasks.workunit.client.0.vm03.stdout:3/336: stat d5/f2b 0 2026-03-09T16:14:29.884 INFO:tasks.workunit.client.0.vm03.stdout:2/365: dwrite db/d12/d2a/f58 [0,4194304] 0 2026-03-09T16:14:29.887 INFO:tasks.workunit.client.0.vm03.stdout:2/366: dread db/f55 [0,4194304] 0 2026-03-09T16:14:29.899 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:29 vm03.local ceph-mon[51019]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T16:14:29.899 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:29 vm03.local ceph-mon[51019]: Updating vm05:/etc/ceph/ceph.conf 2026-03-09T16:14:29.899 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:29 vm03.local ceph-mon[51019]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:14:29.899 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:29 vm03.local ceph-mon[51019]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:14:29.899 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:29 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:29.899 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:29 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:29.899 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:29 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:29.899 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:29 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:29.900 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:29 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 
2026-03-09T16:14:29.909 INFO:tasks.workunit.client.0.vm03.stdout:5/428: link d2/d7/d8/d24/d27/d43/d4b/c4c d2/d7/de/d11/c97 0 2026-03-09T16:14:29.911 INFO:tasks.workunit.client.0.vm03.stdout:7/313: write d4/dc/f1a [2415700,101704] 0 2026-03-09T16:14:29.911 INFO:tasks.workunit.client.0.vm03.stdout:5/429: readlink d2/d7/d1a/d1c/d6c/l96 0 2026-03-09T16:14:29.911 INFO:tasks.workunit.client.0.vm03.stdout:5/430: write d2/d7/d3c/d3d/f93 [24248,67493] 0 2026-03-09T16:14:29.911 INFO:tasks.workunit.client.0.vm03.stdout:5/431: read d2/f7f [868183,8437] 0 2026-03-09T16:14:29.930 INFO:tasks.workunit.client.0.vm03.stdout:8/381: write da/db/f75 [952619,40791] 0 2026-03-09T16:14:29.930 INFO:tasks.workunit.client.0.vm03.stdout:0/385: dread d0/f29 [0,4194304] 0 2026-03-09T16:14:29.943 INFO:tasks.workunit.client.0.vm03.stdout:6/355: mknod d9/d42/d45/d63/d66/c6b 0 2026-03-09T16:14:29.943 INFO:tasks.workunit.client.0.vm03.stdout:1/273: rmdir d4/d6 39 2026-03-09T16:14:29.952 INFO:tasks.workunit.client.0.vm03.stdout:4/368: symlink d5/l6b 0 2026-03-09T16:14:29.972 INFO:tasks.workunit.client.0.vm03.stdout:5/432: symlink d2/d7/de/d11/d38/d52/l98 0 2026-03-09T16:14:29.973 INFO:tasks.workunit.client.0.vm03.stdout:6/356: dread d9/d22/f27 [0,4194304] 0 2026-03-09T16:14:29.973 INFO:tasks.workunit.client.0.vm03.stdout:8/382: mkdir da/d32/d79 0 2026-03-09T16:14:29.973 INFO:tasks.workunit.client.0.vm03.stdout:0/386: mknod d0/d7/d3e/d5d/c81 0 2026-03-09T16:14:29.973 INFO:tasks.workunit.client.0.vm03.stdout:8/383: chown c4 128 1 2026-03-09T16:14:29.974 INFO:tasks.workunit.client.0.vm03.stdout:9/421: truncate d2/de/f1c 1143676 0 2026-03-09T16:14:29.975 INFO:tasks.workunit.client.0.vm03.stdout:8/384: truncate da/d10/f6e 845571 0 2026-03-09T16:14:29.982 INFO:tasks.workunit.client.0.vm03.stdout:3/337: mknod d5/d53/c64 0 2026-03-09T16:14:29.982 INFO:tasks.workunit.client.0.vm03.stdout:5/433: read d2/d7/de/d11/f32 [729921,49970] 0 2026-03-09T16:14:29.990 INFO:tasks.workunit.client.0.vm03.stdout:7/314: creat d4/d2d/d4b/f6b x:0 0 0 2026-03-09T16:14:29.991 INFO:tasks.workunit.client.0.vm03.stdout:7/315: write d4/da/d18/d22/d24/d16/f67 [893585,117150] 0 2026-03-09T16:14:29.991 INFO:tasks.workunit.client.0.vm03.stdout:7/316: dread - d4/da/d18/d22/d24/f59 zero size 2026-03-09T16:14:29.992 INFO:tasks.workunit.client.0.vm03.stdout:7/317: write d4/da/d45/f4e [106876,66173] 0 2026-03-09T16:14:29.995 INFO:tasks.workunit.client.0.vm03.stdout:9/422: dread d2/df/f42 [0,4194304] 0 2026-03-09T16:14:30.000 INFO:tasks.workunit.client.0.vm03.stdout:9/423: readlink d2/d4/d11/d12/l1b 0 2026-03-09T16:14:30.014 INFO:tasks.workunit.client.0.vm03.stdout:0/387: mkdir d0/d7/d3e/d57/d5a/d82 0 2026-03-09T16:14:30.018 INFO:tasks.workunit.client.0.vm03.stdout:0/388: dwrite d0/d7/d3e/d57/d5a/d5f/f71 [0,4194304] 0 2026-03-09T16:14:30.019 INFO:tasks.workunit.client.0.vm03.stdout:0/389: chown d0/da/d1b/c80 6 1 2026-03-09T16:14:30.022 INFO:tasks.workunit.client.0.vm03.stdout:0/390: write d0/da/d5c/f31 [1686287,58678] 0 2026-03-09T16:14:30.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:29 vm05.local ceph-mon[58702]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T16:14:30.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:29 vm05.local ceph-mon[58702]: Updating vm05:/etc/ceph/ceph.conf 2026-03-09T16:14:30.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:29 vm05.local ceph-mon[58702]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:14:30.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 
16:14:29 vm05.local ceph-mon[58702]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:14:30.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:29 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:30.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:29 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:30.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:29 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:30.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:29 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:30.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:29 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:30.030 INFO:tasks.workunit.client.0.vm03.stdout:6/357: dwrite d9/f15 [0,4194304] 0 2026-03-09T16:14:30.035 INFO:tasks.workunit.client.0.vm03.stdout:2/367: creat db/d12/d2a/d61/d79/d83/f87 x:0 0 0 2026-03-09T16:14:30.038 INFO:tasks.workunit.client.0.vm03.stdout:6/358: write d9/f35 [327274,111540] 0 2026-03-09T16:14:30.046 INFO:tasks.workunit.client.0.vm03.stdout:6/359: write d9/d22/f37 [3252884,100300] 0 2026-03-09T16:14:30.046 INFO:tasks.workunit.client.0.vm03.stdout:6/360: dread d9/d42/d45/f4d [0,4194304] 0 2026-03-09T16:14:30.047 INFO:tasks.workunit.client.0.vm03.stdout:3/338: fsync d5/f2b 0 2026-03-09T16:14:30.050 INFO:tasks.workunit.client.0.vm03.stdout:3/339: dwrite d5/d1e/d42/d4c/f51 [0,4194304] 0 2026-03-09T16:14:30.050 INFO:tasks.workunit.client.0.vm03.stdout:3/340: write d5/f33 [4556817,71148] 0 2026-03-09T16:14:30.058 INFO:tasks.workunit.client.0.vm03.stdout:5/434: rmdir d2/d7/de/d11/d19/d31/d35 39 2026-03-09T16:14:30.067 INFO:tasks.workunit.client.0.vm03.stdout:9/424: truncate d2/d4/d11/d29/f4e 430775 0 2026-03-09T16:14:30.067 INFO:tasks.workunit.client.0.vm03.stdout:8/385: mkdir da/d6c/d7a 0 2026-03-09T16:14:30.067 INFO:tasks.workunit.client.0.vm03.stdout:4/369: symlink d5/d40/d46/l6c 0 2026-03-09T16:14:30.068 INFO:tasks.workunit.client.0.vm03.stdout:0/391: mknod d0/da/d5c/c83 0 2026-03-09T16:14:30.072 INFO:tasks.workunit.client.0.vm03.stdout:8/386: dwrite da/d10/d28/f5c [0,4194304] 0 2026-03-09T16:14:30.075 INFO:tasks.workunit.client.0.vm03.stdout:8/387: write da/d32/f66 [610597,48029] 0 2026-03-09T16:14:30.075 INFO:tasks.workunit.client.0.vm03.stdout:8/388: fsync da/db/f6a 0 2026-03-09T16:14:30.080 INFO:tasks.workunit.client.0.vm03.stdout:8/389: dwrite f8 [0,4194304] 0 2026-03-09T16:14:30.089 INFO:tasks.workunit.client.0.vm03.stdout:1/274: rename f1 to d4/d6/d1d/f66 0 2026-03-09T16:14:30.092 INFO:tasks.workunit.client.0.vm03.stdout:9/425: symlink d2/d4/d11/d12/l7f 0 2026-03-09T16:14:30.096 INFO:tasks.workunit.client.0.vm03.stdout:3/341: unlink d5/d58/l28 0 2026-03-09T16:14:30.096 INFO:tasks.workunit.client.0.vm03.stdout:7/318: creat d4/da/d18/d22/d24/d16/f6c x:0 0 0 2026-03-09T16:14:30.096 INFO:tasks.workunit.client.0.vm03.stdout:7/319: fdatasync d4/da/d18/d22/d24/f59 0 2026-03-09T16:14:30.097 INFO:tasks.workunit.client.0.vm03.stdout:6/361: symlink d9/l6c 0 2026-03-09T16:14:30.099 INFO:tasks.workunit.client.0.vm03.stdout:7/320: dread d4/da/d45/f4e [0,4194304] 0 2026-03-09T16:14:30.100 INFO:tasks.workunit.client.0.vm03.stdout:7/321: read d4/da/d18/d22/f33 [4158888,25978] 0 2026-03-09T16:14:30.103 INFO:tasks.workunit.client.0.vm03.stdout:3/342: creat d5/d2e/f65 x:0 0 0 2026-03-09T16:14:30.110 
INFO:tasks.workunit.client.0.vm03.stdout:6/362: symlink d9/d22/l6d 0 2026-03-09T16:14:30.113 INFO:tasks.workunit.client.0.vm03.stdout:2/368: getdents db/d12/d2a/d61/d79 0 2026-03-09T16:14:30.114 INFO:tasks.workunit.client.0.vm03.stdout:7/322: write d4/da/d18/d22/d24/d16/d2b/f5a [740820,100283] 0 2026-03-09T16:14:30.118 INFO:tasks.workunit.client.0.vm03.stdout:1/275: dread d4/fd [0,4194304] 0 2026-03-09T16:14:30.124 INFO:tasks.workunit.client.0.vm03.stdout:1/276: write d4/d6/d1d/d3d/f45 [941460,48152] 0 2026-03-09T16:14:30.124 INFO:tasks.workunit.client.0.vm03.stdout:1/277: fdatasync d4/d6/d1d/d3d/f49 0 2026-03-09T16:14:30.124 INFO:tasks.workunit.client.0.vm03.stdout:5/435: rename d2/d7/de/d11/d19/d29/f77 to d2/d7/de/d11/d19/d31/f99 0 2026-03-09T16:14:30.132 INFO:tasks.workunit.client.0.vm03.stdout:1/278: symlink d4/d6/d1d/d20/d23/d3e/d3f/l67 0 2026-03-09T16:14:30.132 INFO:tasks.workunit.client.0.vm03.stdout:1/279: fdatasync d4/d31/f4f 0 2026-03-09T16:14:30.136 INFO:tasks.workunit.client.0.vm03.stdout:5/436: creat d2/d7/d3c/f9a x:0 0 0 2026-03-09T16:14:30.139 INFO:tasks.workunit.client.0.vm03.stdout:4/370: link d5/c15 d5/dd/d1f/c6d 0 2026-03-09T16:14:30.140 INFO:tasks.workunit.client.0.vm03.stdout:7/323: unlink d4/da/c43 0 2026-03-09T16:14:30.141 INFO:tasks.workunit.client.0.vm03.stdout:1/280: readlink d4/d6/d1d/d20/d23/d3e/l65 0 2026-03-09T16:14:30.143 INFO:tasks.workunit.client.0.vm03.stdout:5/437: mknod d2/d7/d3c/d3d/c9b 0 2026-03-09T16:14:30.145 INFO:tasks.workunit.client.0.vm03.stdout:9/426: sync 2026-03-09T16:14:30.151 INFO:tasks.workunit.client.0.vm03.stdout:7/324: dwrite d4/f26 [4194304,4194304] 0 2026-03-09T16:14:30.151 INFO:tasks.workunit.client.0.vm03.stdout:2/369: rename db/f31 to db/d12/d2a/f88 0 2026-03-09T16:14:30.151 INFO:tasks.workunit.client.0.vm03.stdout:2/370: chown db/d12/f77 7864172 1 2026-03-09T16:14:30.155 INFO:tasks.workunit.client.0.vm03.stdout:1/281: read - d4/d6/d3b/f35 zero size 2026-03-09T16:14:30.155 INFO:tasks.workunit.client.0.vm03.stdout:2/371: dread db/f55 [0,4194304] 0 2026-03-09T16:14:30.156 INFO:tasks.workunit.client.0.vm03.stdout:4/371: creat d5/db/f6e x:0 0 0 2026-03-09T16:14:30.157 INFO:tasks.workunit.client.0.vm03.stdout:4/372: stat d5/db/d25 0 2026-03-09T16:14:30.161 INFO:tasks.workunit.client.0.vm03.stdout:7/325: dread d4/da/d18/d22/f48 [0,4194304] 0 2026-03-09T16:14:30.165 INFO:tasks.workunit.client.0.vm03.stdout:7/326: dread d4/d2d/d4b/f4c [0,4194304] 0 2026-03-09T16:14:30.171 INFO:tasks.workunit.client.0.vm03.stdout:7/327: chown d4/d2d/f32 22507937 1 2026-03-09T16:14:30.180 INFO:tasks.workunit.client.0.vm03.stdout:9/427: chown d2/d4/d11/d29/l77 2002892085 1 2026-03-09T16:14:30.194 INFO:tasks.workunit.client.0.vm03.stdout:7/328: dread d4/d2d/f52 [0,4194304] 0 2026-03-09T16:14:30.195 INFO:tasks.workunit.client.0.vm03.stdout:4/373: creat d5/dd/f6f x:0 0 0 2026-03-09T16:14:30.206 INFO:tasks.workunit.client.0.vm03.stdout:2/372: fsync db/d12/f85 0 2026-03-09T16:14:30.216 INFO:tasks.workunit.client.0.vm03.stdout:8/390: write da/db/f1c [4176150,63589] 0 2026-03-09T16:14:30.222 INFO:tasks.workunit.client.0.vm03.stdout:3/343: write d5/d1e/d42/f25 [2324749,66309] 0 2026-03-09T16:14:30.222 INFO:tasks.workunit.client.0.vm03.stdout:2/373: sync 2026-03-09T16:14:30.223 INFO:tasks.workunit.client.0.vm03.stdout:3/344: chown d5/d1e/d42/f20 11032 1 2026-03-09T16:14:30.223 INFO:tasks.workunit.client.0.vm03.stdout:2/374: readlink db/d12/d2a/l56 0 2026-03-09T16:14:30.223 INFO:tasks.workunit.client.0.vm03.stdout:2/375: readlink db/d12/l17 0 2026-03-09T16:14:30.224 
INFO:tasks.workunit.client.0.vm03.stdout:0/392: dwrite d0/d7/d3e/d57/d5a/d52/f68 [0,4194304] 0 2026-03-09T16:14:30.227 INFO:tasks.workunit.client.0.vm03.stdout:2/376: fsync db/f2d 0 2026-03-09T16:14:30.229 INFO:tasks.workunit.client.0.vm03.stdout:2/377: fdatasync db/d12/d2a/f58 0 2026-03-09T16:14:30.235 INFO:tasks.workunit.client.0.vm03.stdout:7/329: symlink d4/d2d/l6d 0 2026-03-09T16:14:30.249 INFO:tasks.workunit.client.0.vm03.stdout:6/363: rename d9/d42/d45/d47/c59 to d9/d42/c6e 0 2026-03-09T16:14:30.252 INFO:tasks.workunit.client.0.vm03.stdout:4/374: dwrite d5/d17/d44/f4a [4194304,4194304] 0 2026-03-09T16:14:30.256 INFO:tasks.workunit.client.0.vm03.stdout:6/364: sync 2026-03-09T16:14:30.256 INFO:tasks.workunit.client.0.vm03.stdout:6/365: read d9/d22/f27 [3792,35486] 0 2026-03-09T16:14:30.268 INFO:tasks.workunit.client.0.vm03.stdout:6/366: dwrite d9/f35 [0,4194304] 0 2026-03-09T16:14:30.274 INFO:tasks.workunit.client.0.vm03.stdout:1/282: link d4/d6/d1d/d20/d23/d3e/l65 d4/db/d59/l68 0 2026-03-09T16:14:30.277 INFO:tasks.workunit.client.0.vm03.stdout:1/283: write d4/d6/d1d/d24/d25/f4e [1706272,121601] 0 2026-03-09T16:14:30.284 INFO:tasks.workunit.client.0.vm03.stdout:3/345: creat d5/d1e/f66 x:0 0 0 2026-03-09T16:14:30.297 INFO:tasks.workunit.client.0.vm03.stdout:3/346: dread d5/f11 [0,4194304] 0 2026-03-09T16:14:30.303 INFO:tasks.workunit.client.0.vm03.stdout:0/393: creat d0/d7/d3e/d57/d5a/d5f/f84 x:0 0 0 2026-03-09T16:14:30.317 INFO:tasks.workunit.client.0.vm03.stdout:5/438: rename d2/d7/d8/d16/c10 to d2/d7/de/d54/c9c 0 2026-03-09T16:14:30.321 INFO:tasks.workunit.client.0.vm03.stdout:2/378: dwrite fa [0,4194304] 0 2026-03-09T16:14:30.322 INFO:tasks.workunit.client.0.vm03.stdout:4/375: write d5/d40/d46/f4f [973109,130395] 0 2026-03-09T16:14:30.323 INFO:tasks.workunit.client.0.vm03.stdout:4/376: chown d5/d17/d44/f4a 199 1 2026-03-09T16:14:30.343 INFO:tasks.workunit.client.0.vm03.stdout:9/428: link d2/de/c37 d2/d54/d6d/c80 0 2026-03-09T16:14:30.349 INFO:tasks.workunit.client.0.vm03.stdout:8/391: mknod da/db/c7b 0 2026-03-09T16:14:30.367 INFO:tasks.workunit.client.0.vm03.stdout:7/330: mkdir d4/da/d18/d22/d24/d16/d6e 0 2026-03-09T16:14:30.368 INFO:tasks.workunit.client.0.vm03.stdout:7/331: fsync d4/da/d18/d22/d24/d16/d2b/f5a 0 2026-03-09T16:14:30.373 INFO:tasks.workunit.client.0.vm03.stdout:5/439: write d2/d7/de/d11/d19/d31/f99 [348069,102923] 0 2026-03-09T16:14:30.375 INFO:tasks.workunit.client.0.vm03.stdout:7/332: dwrite d4/f3b [0,4194304] 0 2026-03-09T16:14:30.377 INFO:tasks.workunit.client.0.vm03.stdout:5/440: truncate d2/d7/de/d11/f80 241518 0 2026-03-09T16:14:30.381 INFO:tasks.workunit.client.0.vm03.stdout:7/333: read d4/da/d45/f58 [8986,47543] 0 2026-03-09T16:14:30.382 INFO:tasks.workunit.client.0.vm03.stdout:7/334: truncate d4/da/d18/f44 9076675 0 2026-03-09T16:14:30.385 INFO:tasks.workunit.client.0.vm03.stdout:7/335: chown d4/d2d/d4b/f6b 17010994 1 2026-03-09T16:14:30.386 INFO:tasks.workunit.client.0.vm03.stdout:4/377: chown d5/c15 1 1 2026-03-09T16:14:30.386 INFO:tasks.workunit.client.0.vm03.stdout:4/378: chown d5/dd/d1f/f59 0 1 2026-03-09T16:14:30.389 INFO:tasks.workunit.client.0.vm03.stdout:4/379: fsync d5/db/f6e 0 2026-03-09T16:14:30.389 INFO:tasks.workunit.client.0.vm03.stdout:2/379: symlink db/d12/d2a/d61/d79/d83/d52/l89 0 2026-03-09T16:14:30.398 INFO:tasks.workunit.client.0.vm03.stdout:4/380: dwrite d5/dd/d1f/f48 [0,4194304] 0 2026-03-09T16:14:30.401 INFO:tasks.workunit.client.0.vm03.stdout:5/441: sync 2026-03-09T16:14:30.432 INFO:tasks.workunit.client.0.vm03.stdout:1/284: mkdir 
d4/d6/d1d/d69 0 2026-03-09T16:14:30.439 INFO:tasks.workunit.client.0.vm03.stdout:8/392: rename da/d45/c6b to da/d10/d28/c7c 0 2026-03-09T16:14:30.452 INFO:tasks.workunit.client.0.vm03.stdout:0/394: truncate d0/d7/d3e/f72 1815741 0 2026-03-09T16:14:30.461 INFO:tasks.workunit.client.0.vm03.stdout:2/380: creat db/d12/d2a/d61/d6d/f8a x:0 0 0 2026-03-09T16:14:30.461 INFO:tasks.workunit.client.0.vm03.stdout:4/381: creat d5/dd/d1f/f70 x:0 0 0 2026-03-09T16:14:30.462 INFO:tasks.workunit.client.0.vm03.stdout:7/336: dread - d4/da/d18/d22/d24/d16/d2b/f56 zero size 2026-03-09T16:14:30.462 INFO:tasks.workunit.client.0.vm03.stdout:9/429: dread d2/d4/d11/d29/d2a/d38/f72 [0,4194304] 0 2026-03-09T16:14:30.473 INFO:tasks.workunit.client.0.vm03.stdout:5/442: read d2/d7/de/d11/d19/d31/f42 [6344424,28667] 0 2026-03-09T16:14:30.476 INFO:tasks.workunit.client.0.vm03.stdout:8/393: mknod da/d1d/d3b/c7d 0 2026-03-09T16:14:30.478 INFO:tasks.workunit.client.0.vm03.stdout:0/395: chown d0/d7/d48/f4a 724093926 1 2026-03-09T16:14:30.482 INFO:tasks.workunit.client.0.vm03.stdout:4/382: creat d5/dd/d1f/d5f/f71 x:0 0 0 2026-03-09T16:14:30.484 INFO:tasks.workunit.client.0.vm03.stdout:7/337: creat d4/da/d45/d51/d36/f6f x:0 0 0 2026-03-09T16:14:30.493 INFO:tasks.workunit.client.0.vm03.stdout:9/430: creat d2/d4/d11/d29/d2a/d46/f81 x:0 0 0 2026-03-09T16:14:30.493 INFO:tasks.workunit.client.0.vm03.stdout:6/367: getdents d9/d42 0 2026-03-09T16:14:30.497 INFO:tasks.workunit.client.0.vm03.stdout:1/285: symlink d4/d6/l6a 0 2026-03-09T16:14:30.498 INFO:tasks.workunit.client.0.vm03.stdout:1/286: dread d4/d6/d1d/d3d/f49 [0,4194304] 0 2026-03-09T16:14:30.502 INFO:tasks.workunit.client.0.vm03.stdout:3/347: creat d5/d58/f67 x:0 0 0 2026-03-09T16:14:30.503 INFO:tasks.workunit.client.0.vm03.stdout:8/394: rmdir da/d10 39 2026-03-09T16:14:30.503 INFO:tasks.workunit.client.0.vm03.stdout:8/395: chown da/d32/f61 2674770 1 2026-03-09T16:14:30.504 INFO:tasks.workunit.client.0.vm03.stdout:8/396: write da/db/f44 [4778444,96446] 0 2026-03-09T16:14:30.504 INFO:tasks.workunit.client.0.vm03.stdout:0/396: mknod d0/d7/d3e/d57/d5a/d74/c85 0 2026-03-09T16:14:30.510 INFO:tasks.workunit.client.0.vm03.stdout:7/338: symlink d4/da/d45/d51/d36/l70 0 2026-03-09T16:14:30.510 INFO:tasks.workunit.client.0.vm03.stdout:3/348: sync 2026-03-09T16:14:30.516 INFO:tasks.workunit.client.0.vm03.stdout:1/287: rename d4/d6/d1d/d24 to d4/d6/d3b/d6b 0 2026-03-09T16:14:30.519 INFO:tasks.workunit.client.0.vm03.stdout:8/397: mknod da/d15/c7e 0 2026-03-09T16:14:30.521 INFO:tasks.workunit.client.0.vm03.stdout:4/383: mkdir d5/db/d25/d31/d4d/d5b/d72 0 2026-03-09T16:14:30.522 INFO:tasks.workunit.client.0.vm03.stdout:4/384: truncate d5/dd/d1f/f5e 889861 0 2026-03-09T16:14:30.527 INFO:tasks.workunit.client.0.vm03.stdout:6/368: mknod d9/c6f 0 2026-03-09T16:14:30.527 INFO:tasks.workunit.client.0.vm03.stdout:2/381: rename db/d12/d2a/d61/d79/d83/d64/l82 to db/d12/d2a/d61/d79/d83/d64/l8b 0 2026-03-09T16:14:30.535 INFO:tasks.workunit.client.0.vm03.stdout:9/431: fdatasync d2/d4/d11/d12/f3d 0 2026-03-09T16:14:30.536 INFO:tasks.workunit.client.0.vm03.stdout:9/432: chown d2/d4/d11/d29/d63/f75 748062 1 2026-03-09T16:14:30.546 INFO:tasks.workunit.client.0.vm03.stdout:3/349: dwrite d5/fb [0,4194304] 0 2026-03-09T16:14:30.550 INFO:tasks.workunit.client.0.vm03.stdout:0/397: rename d0/d7/d48/c3b to d0/da/d1b/c86 0 2026-03-09T16:14:30.550 INFO:tasks.workunit.client.0.vm03.stdout:1/288: truncate d4/d6/d3b/f36 663202 0 2026-03-09T16:14:30.551 INFO:tasks.workunit.client.0.vm03.stdout:0/398: fsync 
d0/d7/d3e/d57/d5a/f38 0 2026-03-09T16:14:30.558 INFO:tasks.workunit.client.0.vm03.stdout:3/350: dwrite d5/f33 [4194304,4194304] 0 2026-03-09T16:14:30.574 INFO:tasks.workunit.client.0.vm03.stdout:2/382: mkdir db/d12/d2a/d61/d6d/d8c 0 2026-03-09T16:14:30.584 INFO:tasks.workunit.client.0.vm03.stdout:2/383: dread f9 [0,4194304] 0 2026-03-09T16:14:30.585 INFO:tasks.workunit.client.0.vm03.stdout:5/443: getdents d2/d75 0 2026-03-09T16:14:30.587 INFO:tasks.workunit.client.0.vm03.stdout:5/444: dread d2/d7/d3c/d3d/f93 [0,4194304] 0 2026-03-09T16:14:30.592 INFO:tasks.workunit.client.0.vm03.stdout:4/385: link d5/d17/d44/f4a d5/dd/f73 0 2026-03-09T16:14:30.596 INFO:tasks.workunit.client.0.vm03.stdout:8/398: rename da/d1d/f74 to da/d6c/d7a/f7f 0 2026-03-09T16:14:30.599 INFO:tasks.workunit.client.0.vm03.stdout:0/399: unlink d0/da/f2c 0 2026-03-09T16:14:30.607 INFO:tasks.workunit.client.0.vm03.stdout:3/351: dread - d5/d58/f67 zero size 2026-03-09T16:14:30.610 INFO:tasks.workunit.client.0.vm03.stdout:4/386: sync 2026-03-09T16:14:30.621 INFO:tasks.workunit.client.0.vm03.stdout:7/339: write d4/da/d18/f37 [1535993,12569] 0 2026-03-09T16:14:30.623 INFO:tasks.workunit.client.0.vm03.stdout:7/340: dread - d4/da/d18/d22/d24/d16/f6c zero size 2026-03-09T16:14:30.636 INFO:tasks.workunit.client.0.vm03.stdout:9/433: mknod d2/d4/d11/d29/d2a/c82 0 2026-03-09T16:14:30.638 INFO:tasks.workunit.client.0.vm03.stdout:2/384: write db/d12/d2a/d61/d79/d83/f7e [438667,35285] 0 2026-03-09T16:14:30.642 INFO:tasks.workunit.client.0.vm03.stdout:2/385: dread db/d12/f77 [0,4194304] 0 2026-03-09T16:14:30.644 INFO:tasks.workunit.client.0.vm03.stdout:5/445: symlink d2/d75/l9d 0 2026-03-09T16:14:30.645 INFO:tasks.workunit.client.0.vm03.stdout:5/446: chown d2/d7/de/d11/d19/d31/c7b 873 1 2026-03-09T16:14:30.646 INFO:tasks.workunit.client.0.vm03.stdout:5/447: readlink d2/d7/d1a/l1f 0 2026-03-09T16:14:30.659 INFO:tasks.workunit.client.0.vm03.stdout:0/400: creat d0/d7/d3e/d5d/f87 x:0 0 0 2026-03-09T16:14:30.664 INFO:tasks.workunit.client.0.vm03.stdout:3/352: symlink d5/d1e/d42/d37/l68 0 2026-03-09T16:14:30.670 INFO:tasks.workunit.client.0.vm03.stdout:4/387: chown d5/f7 2 1 2026-03-09T16:14:30.671 INFO:tasks.workunit.client.0.vm03.stdout:7/341: mkdir d4/da/d18/d22/d24/d15/d71 0 2026-03-09T16:14:30.671 INFO:tasks.workunit.client.0.vm03.stdout:4/388: truncate d5/d17/f21 4647767 0 2026-03-09T16:14:30.675 INFO:tasks.workunit.client.0.vm03.stdout:4/389: stat d5/db/d25/f4e 0 2026-03-09T16:14:30.680 INFO:tasks.workunit.client.0.vm03.stdout:9/434: mkdir d2/d4/d1f/d83 0 2026-03-09T16:14:30.680 INFO:tasks.workunit.client.0.vm03.stdout:9/435: fsync d2/d4/d11/f66 0 2026-03-09T16:14:30.686 INFO:tasks.workunit.client.0.vm03.stdout:2/386: unlink db/d12/d2a/f5e 0 2026-03-09T16:14:30.699 INFO:tasks.workunit.client.0.vm03.stdout:9/436: dread d2/d4/d11/d12/d28/f2f [0,4194304] 0 2026-03-09T16:14:30.712 INFO:tasks.workunit.client.0.vm03.stdout:0/401: creat d0/d7/d3e/d57/d5a/d47/f88 x:0 0 0 2026-03-09T16:14:30.713 INFO:tasks.workunit.client.0.vm03.stdout:0/402: dread - d0/f4d zero size 2026-03-09T16:14:30.714 INFO:tasks.workunit.client.0.vm03.stdout:1/289: link d4/d6/c61 d4/d6/d3b/d6b/d25/c6c 0 2026-03-09T16:14:30.719 INFO:tasks.workunit.client.0.vm03.stdout:1/290: dwrite d4/db/f21 [4194304,4194304] 0 2026-03-09T16:14:30.731 INFO:tasks.workunit.client.0.vm03.stdout:7/342: dread d4/da/d45/f58 [0,4194304] 0 2026-03-09T16:14:30.733 INFO:tasks.workunit.client.0.vm03.stdout:7/343: truncate d4/da/d45/f63 145008 0 2026-03-09T16:14:30.733 
INFO:tasks.workunit.client.0.vm03.stdout:7/344: chown d4/da/d18 5861704 1 2026-03-09T16:14:30.735 INFO:tasks.workunit.client.0.vm03.stdout:7/345: write d4/da/d45/d51/f50 [370457,85233] 0 2026-03-09T16:14:30.752 INFO:tasks.workunit.client.0.vm03.stdout:9/437: mkdir d2/df/d84 0 2026-03-09T16:14:30.752 INFO:tasks.workunit.client.0.vm03.stdout:5/448: creat d2/d7/de/d33/f9e x:0 0 0 2026-03-09T16:14:30.756 INFO:tasks.workunit.client.0.vm03.stdout:6/369: rename d9/d42/d45/l56 to d9/d14/l70 0 2026-03-09T16:14:30.757 INFO:tasks.workunit.client.0.vm03.stdout:7/346: sync 2026-03-09T16:14:30.764 INFO:tasks.workunit.client.0.vm03.stdout:5/449: dwrite d2/d7/d8/d24/d27/d43/f82 [0,4194304] 0 2026-03-09T16:14:30.767 INFO:tasks.workunit.client.0.vm03.stdout:9/438: dwrite d2/d4/d11/d12/f35 [0,4194304] 0 2026-03-09T16:14:30.774 INFO:tasks.workunit.client.0.vm03.stdout:9/439: dwrite d2/df/f10 [4194304,4194304] 0 2026-03-09T16:14:30.795 INFO:tasks.workunit.client.0.vm03.stdout:1/291: dread - d4/d6/d3b/d6b/f42 zero size 2026-03-09T16:14:30.811 INFO:tasks.workunit.client.0.vm03.stdout:0/403: mkdir d0/d7/d3e/d57/d5a/d82/d89 0 2026-03-09T16:14:30.811 INFO:tasks.workunit.client.0.vm03.stdout:0/404: stat d0/d7/d3e/d57/d5a/d52/l7c 0 2026-03-09T16:14:30.814 INFO:tasks.workunit.client.0.vm03.stdout:8/399: rename da/d15 to da/d10/d28/d4f/d68/d80 0 2026-03-09T16:14:30.830 INFO:tasks.workunit.client.0.vm03.stdout:5/450: fsync d2/d7/d1a/f6e 0 2026-03-09T16:14:30.848 INFO:tasks.workunit.client.0.vm03.stdout:9/440: dread d2/d4/d11/d12/f50 [0,4194304] 0 2026-03-09T16:14:30.848 INFO:tasks.workunit.client.0.vm03.stdout:9/441: dread - d2/d4/d11/d29/d2a/d38/f74 zero size 2026-03-09T16:14:30.858 INFO:tasks.workunit.client.0.vm03.stdout:4/390: creat d5/f74 x:0 0 0 2026-03-09T16:14:30.861 INFO:tasks.workunit.client.0.vm03.stdout:3/353: truncate d5/d1e/d42/f29 7166225 0 2026-03-09T16:14:30.863 INFO:tasks.workunit.client.0.vm03.stdout:3/354: dread d5/f11 [0,4194304] 0 2026-03-09T16:14:30.864 INFO:tasks.workunit.client.0.vm03.stdout:3/355: fdatasync d5/f33 0 2026-03-09T16:14:30.866 INFO:tasks.workunit.client.0.vm03.stdout:2/387: creat db/d12/d2a/f8d x:0 0 0 2026-03-09T16:14:30.867 INFO:tasks.workunit.client.0.vm03.stdout:0/405: truncate d0/f1e 1011591 0 2026-03-09T16:14:30.868 INFO:tasks.workunit.client.0.vm03.stdout:0/406: write d0/d7/d3e/d45/f5e [236481,91386] 0 2026-03-09T16:14:30.884 INFO:tasks.workunit.client.0.vm03.stdout:6/370: mkdir d9/d14/d71 0 2026-03-09T16:14:30.884 INFO:tasks.workunit.client.0.vm03.stdout:6/371: chown d9/l2a 467464911 1 2026-03-09T16:14:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:30 vm03.local ceph-mon[51019]: pgmap v7: 65 pgs: 65 active+clean; 675 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 14 MiB/s rd, 57 MiB/s wr, 167 op/s 2026-03-09T16:14:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:30 vm03.local ceph-mon[51019]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:14:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:30 vm03.local ceph-mon[51019]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:14:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:30 vm03.local ceph-mon[51019]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:14:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:30 vm03.local ceph-mon[51019]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:14:30.890 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:30 vm03.local ceph-mon[51019]: Reconfiguring prometheus.vm03 (dependencies changed)... 2026-03-09T16:14:30.892 INFO:tasks.workunit.client.0.vm03.stdout:7/347: mknod d4/da/d18/d22/d24/d16/d2b/c72 0 2026-03-09T16:14:30.895 INFO:tasks.workunit.client.0.vm03.stdout:5/451: stat d2/d75/l8a 0 2026-03-09T16:14:30.895 INFO:tasks.workunit.client.0.vm03.stdout:5/452: chown d2/l14 143593 1 2026-03-09T16:14:30.898 INFO:tasks.workunit.client.0.vm03.stdout:9/442: creat d2/de/f85 x:0 0 0 2026-03-09T16:14:30.922 INFO:tasks.workunit.client.0.vm03.stdout:2/388: mknod db/d12/d2a/d61/d79/c8e 0 2026-03-09T16:14:30.922 INFO:tasks.workunit.client.0.vm03.stdout:2/389: write db/d12/d2a/f38 [1057365,3200] 0 2026-03-09T16:14:30.940 INFO:tasks.workunit.client.0.vm03.stdout:3/356: dwrite d5/d1e/d42/f20 [0,4194304] 0 2026-03-09T16:14:30.942 INFO:tasks.workunit.client.0.vm03.stdout:3/357: chown d5/d1e/f26 345411962 1 2026-03-09T16:14:30.946 INFO:tasks.workunit.client.0.vm03.stdout:0/407: creat d0/d7/d3e/d57/d5a/d82/f8a x:0 0 0 2026-03-09T16:14:30.949 INFO:tasks.workunit.client.0.vm03.stdout:8/400: mknod da/db/c81 0 2026-03-09T16:14:30.955 INFO:tasks.workunit.client.0.vm03.stdout:7/348: rename d4/da/d18/d22/d24/f30 to d4/da/d18/d22/d24/d16/d6e/f73 0 2026-03-09T16:14:30.957 INFO:tasks.workunit.client.0.vm03.stdout:9/443: creat d2/d4/d11/d29/d63/f86 x:0 0 0 2026-03-09T16:14:30.957 INFO:tasks.workunit.client.0.vm03.stdout:1/292: creat d4/f6d x:0 0 0 2026-03-09T16:14:30.959 INFO:tasks.workunit.client.0.vm03.stdout:4/391: creat d5/db/d25/d31/d4d/d5b/d72/f75 x:0 0 0 2026-03-09T16:14:30.965 INFO:tasks.workunit.client.0.vm03.stdout:1/293: write d4/d6/d3b/d6b/d25/f4e [4316031,78721] 0 2026-03-09T16:14:30.965 INFO:tasks.workunit.client.0.vm03.stdout:1/294: write d4/d6/d1d/d20/d23/f28 [402039,37067] 0 2026-03-09T16:14:30.965 INFO:tasks.workunit.client.0.vm03.stdout:2/390: truncate db/d12/d2a/f60 563603 0 2026-03-09T16:14:30.965 INFO:tasks.workunit.client.0.vm03.stdout:9/444: dwrite d2/d4/d11/d12/f45 [0,4194304] 0 2026-03-09T16:14:30.969 INFO:tasks.workunit.client.0.vm03.stdout:9/445: dwrite d2/d4/d11/d29/d2a/d38/f74 [0,4194304] 0 2026-03-09T16:14:30.971 INFO:tasks.workunit.client.0.vm03.stdout:9/446: chown d2/d4/d11/d12/d28/f2c 0 1 2026-03-09T16:14:30.985 INFO:tasks.workunit.client.0.vm03.stdout:8/401: mknod da/d6c/c82 0 2026-03-09T16:14:30.987 INFO:tasks.workunit.client.0.vm03.stdout:5/453: rename d2/d7/d1a/d1c/d6c/f6d to d2/d7/d1a/f9f 0 2026-03-09T16:14:30.993 INFO:tasks.workunit.client.0.vm03.stdout:8/402: dread da/db/f53 [0,4194304] 0 2026-03-09T16:14:30.997 INFO:tasks.workunit.client.0.vm03.stdout:8/403: dwrite da/d10/d28/d4f/d68/d80/f1b [4194304,4194304] 0 2026-03-09T16:14:30.999 INFO:tasks.workunit.client.0.vm03.stdout:7/349: mknod d4/dc/c74 0 2026-03-09T16:14:31.003 INFO:tasks.workunit.client.0.vm03.stdout:4/392: unlink d5/dd/f6f 0 2026-03-09T16:14:31.004 INFO:tasks.workunit.client.0.vm03.stdout:4/393: dread - d5/dd/d1f/f59 zero size 2026-03-09T16:14:31.009 INFO:tasks.workunit.client.0.vm03.stdout:2/391: creat db/d12/d2a/d61/d6d/f8f x:0 0 0 2026-03-09T16:14:31.010 INFO:tasks.workunit.client.0.vm03.stdout:2/392: chown db/d12/d2a/d61/f47 1 1 2026-03-09T16:14:31.013 INFO:tasks.workunit.client.0.vm03.stdout:2/393: dwrite db/f23 [0,4194304] 0 2026-03-09T16:14:31.020 INFO:tasks.workunit.client.0.vm03.stdout:2/394: dwrite db/d12/d2a/f58 [4194304,4194304] 0 2026-03-09T16:14:31.021 INFO:tasks.workunit.client.0.vm03.stdout:1/295: rmdir d4/d6/d1d/d20/d23/d3e 39 
2026-03-09T16:14:31.026 INFO:tasks.workunit.client.0.vm03.stdout:9/447: fsync d2/df/f22 0 2026-03-09T16:14:31.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:30 vm05.local ceph-mon[58702]: pgmap v7: 65 pgs: 65 active+clean; 675 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 14 MiB/s rd, 57 MiB/s wr, 167 op/s 2026-03-09T16:14:31.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:30 vm05.local ceph-mon[58702]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:14:31.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:30 vm05.local ceph-mon[58702]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:14:31.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:30 vm05.local ceph-mon[58702]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:14:31.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:30 vm05.local ceph-mon[58702]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:14:31.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:30 vm05.local ceph-mon[58702]: Reconfiguring prometheus.vm03 (dependencies changed)... 2026-03-09T16:14:31.027 INFO:tasks.workunit.client.0.vm03.stdout:3/358: mknod d5/d58/c69 0 2026-03-09T16:14:31.029 INFO:tasks.workunit.client.0.vm03.stdout:1/296: dwrite d4/f6d [0,4194304] 0 2026-03-09T16:14:31.043 INFO:tasks.workunit.client.0.vm03.stdout:5/454: mkdir d2/d7/d8/da0 0 2026-03-09T16:14:31.046 INFO:tasks.workunit.client.0.vm03.stdout:5/455: dread d2/d7/d8/f7a [0,4194304] 0 2026-03-09T16:14:31.047 INFO:tasks.workunit.client.0.vm03.stdout:5/456: write d2/d7/de/d11/d38/d52/f7d [529663,22908] 0 2026-03-09T16:14:31.048 INFO:tasks.workunit.client.0.vm03.stdout:5/457: truncate d2/d7/de/d11/d38/d3b/f68 1148852 0 2026-03-09T16:14:31.053 INFO:tasks.workunit.client.0.vm03.stdout:8/404: symlink da/db/d43/l83 0 2026-03-09T16:14:31.058 INFO:tasks.workunit.client.0.vm03.stdout:5/458: dread d2/d7/de/d33/f8b [0,4194304] 0 2026-03-09T16:14:31.063 INFO:tasks.workunit.client.0.vm03.stdout:9/448: creat d2/de/f87 x:0 0 0 2026-03-09T16:14:31.069 INFO:tasks.workunit.client.0.vm03.stdout:9/449: chown d2/d4/d11/d29/d2a/d46/f81 23 1 2026-03-09T16:14:31.071 INFO:tasks.workunit.client.0.vm03.stdout:0/408: creat d0/da/f8b x:0 0 0 2026-03-09T16:14:31.075 INFO:tasks.workunit.client.0.vm03.stdout:6/372: getdents d9/d42/d45/d63/d66 0 2026-03-09T16:14:31.082 INFO:tasks.workunit.client.0.vm03.stdout:5/459: creat d2/d7/de/d11/d19/d29/fa1 x:0 0 0 2026-03-09T16:14:31.083 INFO:tasks.workunit.client.0.vm03.stdout:5/460: readlink d2/d7/l89 0 2026-03-09T16:14:31.084 INFO:tasks.workunit.client.0.vm03.stdout:3/359: mkdir d5/d58/d6a 0 2026-03-09T16:14:31.086 INFO:tasks.workunit.client.0.vm03.stdout:8/405: creat da/d32/d79/f84 x:0 0 0 2026-03-09T16:14:31.089 INFO:tasks.workunit.client.0.vm03.stdout:8/406: dwrite da/d32/f4d [0,4194304] 0 2026-03-09T16:14:31.092 INFO:tasks.workunit.client.0.vm03.stdout:8/407: dread da/db/f53 [0,4194304] 0 2026-03-09T16:14:31.098 INFO:tasks.workunit.client.0.vm03.stdout:0/409: sync 2026-03-09T16:14:31.106 INFO:tasks.workunit.client.0.vm03.stdout:4/394: rmdir d5/db/d25/d31/d66 0 2026-03-09T16:14:31.110 INFO:tasks.workunit.client.0.vm03.stdout:7/350: truncate d4/da/d18/d22/d24/d15/f2a 1039431 0 2026-03-09T16:14:31.112 INFO:tasks.workunit.client.0.vm03.stdout:5/461: creat d2/d7/de/d11/d38/d3b/fa2 x:0 0 0 2026-03-09T16:14:31.115 INFO:tasks.workunit.client.0.vm03.stdout:7/351: dread d4/da/d18/f37 
[0,4194304] 0 2026-03-09T16:14:31.116 INFO:tasks.workunit.client.0.vm03.stdout:7/352: write d4/da/d18/d22/d24/f59 [804621,66040] 0 2026-03-09T16:14:31.120 INFO:tasks.workunit.client.0.vm03.stdout:6/373: getdents d9/d14/d71 0 2026-03-09T16:14:31.131 INFO:tasks.workunit.client.0.vm03.stdout:8/408: stat da/db/d30/c65 0 2026-03-09T16:14:31.138 INFO:tasks.workunit.client.0.vm03.stdout:2/395: getdents db/d12/d2a/d61/d79/d83 0 2026-03-09T16:14:31.140 INFO:tasks.workunit.client.0.vm03.stdout:7/353: creat d4/da/d18/d22/d24/d16/d3e/f75 x:0 0 0 2026-03-09T16:14:31.145 INFO:tasks.workunit.client.0.vm03.stdout:1/297: rename d4/c3c to d4/d6/d3b/d6b/c6e 0 2026-03-09T16:14:31.147 INFO:tasks.workunit.client.0.vm03.stdout:2/396: mknod db/d12/d2a/d61/d79/d83/d52/c90 0 2026-03-09T16:14:31.148 INFO:tasks.workunit.client.0.vm03.stdout:2/397: chown db/d12/d2a 363959 1 2026-03-09T16:14:31.150 INFO:tasks.workunit.client.0.vm03.stdout:1/298: dwrite d4/fa [0,4194304] 0 2026-03-09T16:14:31.153 INFO:tasks.workunit.client.0.vm03.stdout:2/398: dwrite db/d12/d2a/d61/d79/d83/f3f [0,4194304] 0 2026-03-09T16:14:31.161 INFO:tasks.workunit.client.0.vm03.stdout:5/462: dwrite d2/d7/d1a/d1c/f5e [4194304,4194304] 0 2026-03-09T16:14:31.176 INFO:tasks.workunit.client.0.vm03.stdout:9/450: rename d2/d4/d11/d29/d63 to d2/de/d88 0 2026-03-09T16:14:31.179 INFO:tasks.workunit.client.0.vm03.stdout:8/409: truncate da/d10/f1f 2045056 0 2026-03-09T16:14:31.182 INFO:tasks.workunit.client.0.vm03.stdout:5/463: rmdir d2/d7/de/d54 39 2026-03-09T16:14:31.185 INFO:tasks.workunit.client.0.vm03.stdout:4/395: rename d5/d40/d46/l6c to d5/db/l76 0 2026-03-09T16:14:31.187 INFO:tasks.workunit.client.0.vm03.stdout:0/410: link d0/d7/d3e/l2d d0/d7/d3e/d57/d5a/d52/l8c 0 2026-03-09T16:14:31.188 INFO:tasks.workunit.client.0.vm03.stdout:8/410: mkdir da/d10/d28/d4f/d85 0 2026-03-09T16:14:31.189 INFO:tasks.workunit.client.0.vm03.stdout:1/299: creat d4/d6/d1d/d20/d23/d3e/d3f/f6f x:0 0 0 2026-03-09T16:14:31.190 INFO:tasks.workunit.client.0.vm03.stdout:3/360: getdents d5/d58/d5a 0 2026-03-09T16:14:31.190 INFO:tasks.workunit.client.0.vm03.stdout:1/300: write d4/d6/d1d/d3d/f49 [522259,13223] 0 2026-03-09T16:14:31.193 INFO:tasks.workunit.client.0.vm03.stdout:3/361: write d5/d1e/f26 [4411628,58354] 0 2026-03-09T16:14:31.194 INFO:tasks.workunit.client.0.vm03.stdout:1/301: chown d4/d6/f15 3877 1 2026-03-09T16:14:31.194 INFO:tasks.workunit.client.0.vm03.stdout:3/362: write d5/d1e/d42/d55/f57 [41755,82843] 0 2026-03-09T16:14:31.197 INFO:tasks.workunit.client.0.vm03.stdout:5/464: creat d2/d7/d3c/d3d/fa3 x:0 0 0 2026-03-09T16:14:31.200 INFO:tasks.workunit.client.0.vm03.stdout:2/399: rename db/d12/d2a/d61/d79/d83/f3f to db/d12/d2a/d61/d6d/f91 0 2026-03-09T16:14:31.203 INFO:tasks.workunit.client.0.vm03.stdout:4/396: mkdir d5/db/d25/d31/d4d/d5b/d72/d77 0 2026-03-09T16:14:31.203 INFO:tasks.workunit.client.0.vm03.stdout:4/397: read d5/dd/f73 [7271311,72173] 0 2026-03-09T16:14:31.203 INFO:tasks.workunit.client.0.vm03.stdout:4/398: write d5/dd/d1f/f48 [4840157,78205] 0 2026-03-09T16:14:31.209 INFO:tasks.workunit.client.0.vm03.stdout:0/411: unlink d0/da/d7a/c7f 0 2026-03-09T16:14:31.210 INFO:tasks.workunit.client.0.vm03.stdout:0/412: write d0/d7/d3e/d57/d5a/d74/f7e [563057,36319] 0 2026-03-09T16:14:31.218 INFO:tasks.workunit.client.0.vm03.stdout:5/465: sync 2026-03-09T16:14:31.224 INFO:tasks.workunit.client.0.vm03.stdout:1/302: dread d4/db/f21 [0,4194304] 0 2026-03-09T16:14:31.232 INFO:tasks.workunit.client.0.vm03.stdout:9/451: rename d2/d54/d6d to d2/df/d89 0 2026-03-09T16:14:31.235 
INFO:tasks.workunit.client.0.vm03.stdout:9/452: dread d2/d4/d11/d12/d28/f2f [0,4194304] 0 2026-03-09T16:14:31.236 INFO:tasks.workunit.client.0.vm03.stdout:9/453: chown d2/d4/d11/d29/d2a/d46/f81 2 1 2026-03-09T16:14:31.240 INFO:tasks.workunit.client.0.vm03.stdout:7/354: dwrite d4/da/d45/f58 [0,4194304] 0 2026-03-09T16:14:31.241 INFO:tasks.workunit.client.0.vm03.stdout:6/374: dwrite d9/d14/f31 [0,4194304] 0 2026-03-09T16:14:31.244 INFO:tasks.workunit.client.0.vm03.stdout:7/355: chown d4/da/d18 3536025 1 2026-03-09T16:14:31.245 INFO:tasks.workunit.client.0.vm03.stdout:6/375: chown d9/c4c 26 1 2026-03-09T16:14:31.245 INFO:tasks.workunit.client.0.vm03.stdout:9/454: dwrite d2/d4/f17 [0,4194304] 0 2026-03-09T16:14:31.269 INFO:tasks.workunit.client.0.vm03.stdout:2/400: write db/d12/d2a/d61/f65 [272623,118156] 0 2026-03-09T16:14:31.274 INFO:tasks.workunit.client.0.vm03.stdout:0/413: symlink d0/d7/d3e/d57/d5a/d52/l8d 0 2026-03-09T16:14:31.277 INFO:tasks.workunit.client.0.vm03.stdout:0/414: dwrite d0/d7/d3e/d5d/f61 [0,4194304] 0 2026-03-09T16:14:31.280 INFO:tasks.workunit.client.0.vm03.stdout:8/411: unlink da/db/d43/c55 0 2026-03-09T16:14:31.280 INFO:tasks.workunit.client.0.vm03.stdout:8/412: chown da/d10/d28/d4f/d68 195903242 1 2026-03-09T16:14:31.286 INFO:tasks.workunit.client.0.vm03.stdout:5/466: mkdir d2/d7/d8/d16/da4 0 2026-03-09T16:14:31.287 INFO:tasks.workunit.client.0.vm03.stdout:5/467: write d2/d7/d3c/d3d/f56 [2745417,43150] 0 2026-03-09T16:14:31.308 INFO:tasks.workunit.client.0.vm03.stdout:9/455: mkdir d2/df/d84/d8a 0 2026-03-09T16:14:31.311 INFO:tasks.workunit.client.0.vm03.stdout:2/401: rename db/d12/d2a/d61/c28 to db/c92 0 2026-03-09T16:14:31.312 INFO:tasks.workunit.client.0.vm03.stdout:2/402: chown db/d12/d2a/d61/f45 96 1 2026-03-09T16:14:31.315 INFO:tasks.workunit.client.0.vm03.stdout:6/376: getdents d9/d42/d45/d63 0 2026-03-09T16:14:31.315 INFO:tasks.workunit.client.0.vm03.stdout:9/456: dread d2/d4/d11/d29/d2a/d46/f47 [0,4194304] 0 2026-03-09T16:14:31.317 INFO:tasks.workunit.client.0.vm03.stdout:9/457: read d2/d4/d11/d29/f70 [66365,77601] 0 2026-03-09T16:14:31.328 INFO:tasks.workunit.client.0.vm03.stdout:3/363: write d5/d1e/f31 [1301653,108033] 0 2026-03-09T16:14:31.329 INFO:tasks.workunit.client.0.vm03.stdout:3/364: fsync d5/d1e/d42/d55/f57 0 2026-03-09T16:14:31.331 INFO:tasks.workunit.client.0.vm03.stdout:1/303: dwrite d4/d6/d3b/f35 [0,4194304] 0 2026-03-09T16:14:31.334 INFO:tasks.workunit.client.0.vm03.stdout:4/399: truncate d5/db/d25/f26 1667498 0 2026-03-09T16:14:31.335 INFO:tasks.workunit.client.0.vm03.stdout:7/356: write d4/da/d18/f37 [726195,109278] 0 2026-03-09T16:14:31.337 INFO:tasks.workunit.client.0.vm03.stdout:5/468: write d2/d7/de/d11/f32 [1529824,41052] 0 2026-03-09T16:14:31.337 INFO:tasks.workunit.client.0.vm03.stdout:7/357: dread - d4/da/d18/f6a zero size 2026-03-09T16:14:31.340 INFO:tasks.workunit.client.0.vm03.stdout:5/469: write d2/d7/de/d11/d19/f8e [1020920,27185] 0 2026-03-09T16:14:31.344 INFO:tasks.workunit.client.0.vm03.stdout:0/415: dwrite d0/f1e [0,4194304] 0 2026-03-09T16:14:31.347 INFO:tasks.workunit.client.0.vm03.stdout:8/413: dwrite da/db/d30/f36 [0,4194304] 0 2026-03-09T16:14:31.348 INFO:tasks.workunit.client.0.vm03.stdout:8/414: fdatasync da/db/f6a 0 2026-03-09T16:14:31.348 INFO:tasks.workunit.client.0.vm03.stdout:8/415: write da/d10/d63/f73 [1686822,40056] 0 2026-03-09T16:14:31.348 INFO:tasks.workunit.client.0.vm03.stdout:8/416: chown da/db/c7b 4039 1 2026-03-09T16:14:31.355 INFO:tasks.workunit.client.0.vm03.stdout:2/403: dwrite db/d12/f85 [0,4194304] 
0 2026-03-09T16:14:31.376 INFO:tasks.workunit.client.0.vm03.stdout:6/377: rename d9/d14/l70 to d9/d42/d45/d50/l72 0 2026-03-09T16:14:31.378 INFO:tasks.workunit.client.0.vm03.stdout:1/304: rmdir d4/db/d59 39 2026-03-09T16:14:31.420 INFO:tasks.workunit.client.0.vm03.stdout:4/400: creat d5/db/d25/f78 x:0 0 0 2026-03-09T16:14:31.425 INFO:tasks.workunit.client.0.vm03.stdout:0/416: mkdir d0/d7/d3e/d45/d8e 0 2026-03-09T16:14:31.428 INFO:tasks.workunit.client.0.vm03.stdout:0/417: dwrite d0/da/d5c/f39 [0,4194304] 0 2026-03-09T16:14:31.433 INFO:tasks.workunit.client.0.vm03.stdout:8/417: mknod da/d1d/d3b/c86 0 2026-03-09T16:14:31.439 INFO:tasks.workunit.client.0.vm03.stdout:5/470: rename d2/d7/d8/d24/d27/d43/f82 to d2/d7/d8/fa5 0 2026-03-09T16:14:31.448 INFO:tasks.workunit.client.0.vm03.stdout:1/305: mkdir d4/d39/d70 0 2026-03-09T16:14:31.452 INFO:tasks.workunit.client.0.vm03.stdout:7/358: symlink d4/da/l76 0 2026-03-09T16:14:31.452 INFO:tasks.workunit.client.0.vm03.stdout:7/359: chown d4/da/d18/d22/d24/d16/f6c 192 1 2026-03-09T16:14:31.452 INFO:tasks.workunit.client.0.vm03.stdout:7/360: dread - d4/da/d18/d22/d24/d16/f6c zero size 2026-03-09T16:14:31.452 INFO:tasks.workunit.client.0.vm03.stdout:7/361: readlink d4/da/l21 0 2026-03-09T16:14:31.455 INFO:tasks.workunit.client.0.vm03.stdout:8/418: stat da/c49 0 2026-03-09T16:14:31.455 INFO:tasks.workunit.client.0.vm03.stdout:7/362: dread d4/da/f42 [0,4194304] 0 2026-03-09T16:14:31.456 INFO:tasks.workunit.client.0.vm03.stdout:8/419: write da/db/f34 [227255,9140] 0 2026-03-09T16:14:31.456 INFO:tasks.workunit.client.0.vm03.stdout:8/420: chown c5 213 1 2026-03-09T16:14:31.459 INFO:tasks.workunit.client.0.vm03.stdout:3/365: truncate d5/d1e/d42/d4c/f51 3298662 0 2026-03-09T16:14:31.463 INFO:tasks.workunit.client.0.vm03.stdout:9/458: creat d2/d4/d11/d29/d2a/f8b x:0 0 0 2026-03-09T16:14:31.469 INFO:tasks.workunit.client.0.vm03.stdout:4/401: rename d5/d40/d46 to d5/db/d25/d31/d33/d79 0 2026-03-09T16:14:31.471 INFO:tasks.workunit.client.0.vm03.stdout:5/471: write d2/d7/de/d11/d38/f57 [3938697,103629] 0 2026-03-09T16:14:31.480 INFO:tasks.workunit.client.0.vm03.stdout:7/363: mkdir d4/da/d18/d22/d24/d16/d3e/d77 0 2026-03-09T16:14:31.481 INFO:tasks.workunit.client.0.vm03.stdout:6/378: truncate d9/d22/f4e 838733 0 2026-03-09T16:14:31.482 INFO:tasks.workunit.client.0.vm03.stdout:7/364: write d4/da/d18/d22/d24/d16/d2b/f5a [136825,68115] 0 2026-03-09T16:14:31.482 INFO:tasks.workunit.client.0.vm03.stdout:8/421: symlink da/d32/d79/l87 0 2026-03-09T16:14:31.482 INFO:tasks.workunit.client.0.vm03.stdout:7/365: write d4/da/d45/d51/f5b [2219305,71171] 0 2026-03-09T16:14:31.485 INFO:tasks.workunit.client.0.vm03.stdout:7/366: dread d4/da/d45/d51/f50 [0,4194304] 0 2026-03-09T16:14:31.485 INFO:tasks.workunit.client.0.vm03.stdout:7/367: write d4/da/f42 [151944,46989] 0 2026-03-09T16:14:31.486 INFO:tasks.workunit.client.0.vm03.stdout:3/366: mknod d5/d2e/c6b 0 2026-03-09T16:14:31.487 INFO:tasks.workunit.client.0.vm03.stdout:3/367: fsync d5/d1e/d42/f25 0 2026-03-09T16:14:31.491 INFO:tasks.workunit.client.0.vm03.stdout:2/404: creat db/f93 x:0 0 0 2026-03-09T16:14:31.494 INFO:tasks.workunit.client.0.vm03.stdout:4/402: stat d5/db/l76 0 2026-03-09T16:14:31.495 INFO:tasks.workunit.client.0.vm03.stdout:4/403: write d5/dd/f23 [5489479,69073] 0 2026-03-09T16:14:31.498 INFO:tasks.workunit.client.0.vm03.stdout:4/404: dwrite d5/dd/d1f/f70 [0,4194304] 0 2026-03-09T16:14:31.513 INFO:tasks.workunit.client.0.vm03.stdout:5/472: mknod d2/ca6 0 2026-03-09T16:14:31.528 
INFO:tasks.workunit.client.0.vm03.stdout:6/379: fsync d9/d42/d45/f4d 0 2026-03-09T16:14:31.533 INFO:tasks.workunit.client.0.vm03.stdout:8/422: symlink da/d6c/d7a/l88 0 2026-03-09T16:14:31.541 INFO:tasks.workunit.client.0.vm03.stdout:8/423: fsync da/d32/f66 0 2026-03-09T16:14:31.541 INFO:tasks.workunit.client.0.vm03.stdout:7/368: write d4/da/d18/d22/d24/d16/d6e/f73 [812415,101578] 0 2026-03-09T16:14:31.543 INFO:tasks.workunit.client.0.vm03.stdout:4/405: sync 2026-03-09T16:14:31.543 INFO:tasks.workunit.client.0.vm03.stdout:5/473: sync 2026-03-09T16:14:31.551 INFO:tasks.workunit.client.0.vm03.stdout:0/418: dwrite d0/d7/d48/f2e [4194304,4194304] 0 2026-03-09T16:14:31.557 INFO:tasks.workunit.client.0.vm03.stdout:0/419: dwrite d0/d7/d3e/d45/f76 [0,4194304] 0 2026-03-09T16:14:31.559 INFO:tasks.workunit.client.0.vm03.stdout:2/405: write db/d12/d2a/f40 [60510,43292] 0 2026-03-09T16:14:31.564 INFO:tasks.workunit.client.0.vm03.stdout:1/306: creat d4/d6/d3b/d6b/f71 x:0 0 0 2026-03-09T16:14:31.564 INFO:tasks.workunit.client.0.vm03.stdout:1/307: chown d4/d6/d3b/d6b/d25 948998912 1 2026-03-09T16:14:31.565 INFO:tasks.workunit.client.0.vm03.stdout:1/308: write d4/d6/d3b/d6b/d25/f4e [745270,34623] 0 2026-03-09T16:14:31.580 INFO:tasks.workunit.client.0.vm03.stdout:5/474: unlink d2/l14 0 2026-03-09T16:14:31.611 INFO:tasks.workunit.client.0.vm03.stdout:0/420: dwrite d0/d7/d3e/f4f [0,4194304] 0 2026-03-09T16:14:31.616 INFO:tasks.workunit.client.0.vm03.stdout:0/421: dwrite d0/d7/d3e/d5d/f87 [0,4194304] 0 2026-03-09T16:14:31.632 INFO:tasks.workunit.client.0.vm03.stdout:1/309: dread d4/d6/f9 [0,4194304] 0 2026-03-09T16:14:31.643 INFO:tasks.workunit.client.0.vm03.stdout:1/310: write d4/d39/f5a [1811942,110666] 0 2026-03-09T16:14:31.644 INFO:tasks.workunit.client.0.vm03.stdout:1/311: fdatasync d4/db/f60 0 2026-03-09T16:14:31.644 INFO:tasks.workunit.client.0.vm03.stdout:8/424: mknod da/c89 0 2026-03-09T16:14:31.644 INFO:tasks.workunit.client.0.vm03.stdout:8/425: chown da/db 170 1 2026-03-09T16:14:31.648 INFO:tasks.workunit.client.0.vm03.stdout:7/369: symlink d4/da/d18/d22/d24/d15/d71/l78 0 2026-03-09T16:14:31.649 INFO:tasks.workunit.client.0.vm03.stdout:7/370: write d4/da/d18/d22/d24/d16/f6c [490947,27766] 0 2026-03-09T16:14:31.649 INFO:tasks.workunit.client.0.vm03.stdout:7/371: chown d4/dc/c74 489095 1 2026-03-09T16:14:31.673 INFO:tasks.workunit.client.0.vm03.stdout:5/475: rename d2/d7/d8/l46 to d2/d7/de/d11/d19/d29/la7 0 2026-03-09T16:14:31.679 INFO:tasks.workunit.client.0.vm03.stdout:9/459: getdents d2/de 0 2026-03-09T16:14:31.685 INFO:tasks.workunit.client.0.vm03.stdout:0/422: fdatasync d0/d7/f8 0 2026-03-09T16:14:31.686 INFO:tasks.workunit.client.0.vm03.stdout:2/406: write db/d12/d2a/d61/d79/d83/f7e [921484,27105] 0 2026-03-09T16:14:31.686 INFO:tasks.workunit.client.0.vm03.stdout:2/407: dread - db/d12/d2a/d61/f20 zero size 2026-03-09T16:14:31.689 INFO:tasks.workunit.client.0.vm03.stdout:2/408: dwrite db/d12/f85 [0,4194304] 0 2026-03-09T16:14:31.699 INFO:tasks.workunit.client.0.vm03.stdout:6/380: creat d9/f73 x:0 0 0 2026-03-09T16:14:31.702 INFO:tasks.workunit.client.0.vm03.stdout:7/372: mkdir d4/da/d18/d22/d24/d15/d71/d79 0 2026-03-09T16:14:31.702 INFO:tasks.workunit.client.0.vm03.stdout:1/312: creat d4/d6/d1d/d20/f72 x:0 0 0 2026-03-09T16:14:31.702 INFO:tasks.workunit.client.0.vm03.stdout:3/368: getdents d5/d1e/d42/d34 0 2026-03-09T16:14:31.712 INFO:tasks.workunit.client.0.vm03.stdout:1/313: write d4/d6/d3b/f35 [4780563,119516] 0 2026-03-09T16:14:31.712 INFO:tasks.workunit.client.0.vm03.stdout:5/476: unlink 
d2/f7f 0 2026-03-09T16:14:31.712 INFO:tasks.workunit.client.0.vm03.stdout:1/314: dread d4/d6/d1d/d20/d23/f28 [0,4194304] 0 2026-03-09T16:14:31.718 INFO:tasks.workunit.client.0.vm03.stdout:8/426: rename da/d1d/f5e to da/d10/d28/d4f/d68/f8a 0 2026-03-09T16:14:31.720 INFO:tasks.workunit.client.0.vm03.stdout:3/369: dread d5/d1e/f26 [0,4194304] 0 2026-03-09T16:14:31.725 INFO:tasks.workunit.client.0.vm03.stdout:9/460: mknod d2/df/d89/c8c 0 2026-03-09T16:14:31.735 INFO:tasks.workunit.client.0.vm03.stdout:6/381: creat d9/d42/f74 x:0 0 0 2026-03-09T16:14:31.735 INFO:tasks.workunit.client.0.vm03.stdout:6/382: readlink d9/d22/l26 0 2026-03-09T16:14:31.744 INFO:tasks.workunit.client.0.vm03.stdout:6/383: dread d9/d22/f3e [0,4194304] 0 2026-03-09T16:14:31.746 INFO:tasks.workunit.client.0.vm03.stdout:1/315: symlink d4/d31/l73 0 2026-03-09T16:14:31.753 INFO:tasks.workunit.client.0.vm03.stdout:4/406: getdents d5/db/d25/d31 0 2026-03-09T16:14:31.753 INFO:tasks.workunit.client.0.vm03.stdout:4/407: write d5/f7 [7244842,45660] 0 2026-03-09T16:14:31.762 INFO:tasks.workunit.client.0.vm03.stdout:2/409: dwrite db/d12/d2a/d61/d79/d83/f53 [0,4194304] 0 2026-03-09T16:14:31.763 INFO:tasks.workunit.client.0.vm03.stdout:2/410: fsync db/d12/f85 0 2026-03-09T16:14:31.783 INFO:tasks.workunit.client.0.vm03.stdout:4/408: mknod d5/d17/d44/c7a 0 2026-03-09T16:14:31.784 INFO:tasks.workunit.client.0.vm03.stdout:4/409: dread d5/dd/d1f/f58 [0,4194304] 0 2026-03-09T16:14:31.785 INFO:tasks.workunit.client.0.vm03.stdout:4/410: chown d5/db 109501241 1 2026-03-09T16:14:31.789 INFO:tasks.workunit.client.0.vm03.stdout:7/373: dwrite d4/da/d18/d22/d24/d15/f2a [0,4194304] 0 2026-03-09T16:14:31.790 INFO:tasks.workunit.client.0.vm03.stdout:7/374: chown d4/da/d18/d22/d24/d15/d71 794412163 1 2026-03-09T16:14:31.791 INFO:tasks.workunit.client.0.vm03.stdout:7/375: truncate d4/da/f20 4919608 0 2026-03-09T16:14:31.800 INFO:tasks.workunit.client.0.vm03.stdout:9/461: mknod d2/df/d84/d8a/c8d 0 2026-03-09T16:14:31.806 INFO:tasks.workunit.client.0.vm03.stdout:0/423: link d0/da/d1b/c80 d0/d7/d3e/c8f 0 2026-03-09T16:14:31.810 INFO:tasks.workunit.client.0.vm03.stdout:6/384: mknod d9/c75 0 2026-03-09T16:14:31.826 INFO:tasks.workunit.client.0.vm03.stdout:7/376: dwrite d4/da/f20 [0,4194304] 0 2026-03-09T16:14:31.831 INFO:tasks.workunit.client.0.vm03.stdout:9/462: mknod d2/d4/d11/d12/d28/c8e 0 2026-03-09T16:14:31.838 INFO:tasks.workunit.client.0.vm03.stdout:2/411: mkdir db/d12/d2a/d61/d6d/d8c/d94 0 2026-03-09T16:14:31.838 INFO:tasks.workunit.client.0.vm03.stdout:0/424: creat d0/d7/d3e/d57/f90 x:0 0 0 2026-03-09T16:14:31.840 INFO:tasks.workunit.client.0.vm03.stdout:5/477: link d2/d7/d3c/d3d/c9b d2/d7/d8/d24/d27/ca8 0 2026-03-09T16:14:31.841 INFO:tasks.workunit.client.0.vm03.stdout:5/478: write d2/d7/de/d11/d19/d31/f99 [114056,113373] 0 2026-03-09T16:14:31.846 INFO:tasks.workunit.client.0.vm03.stdout:1/316: link d4/d6/d1d/d20/d23/f30 d4/d6/d1d/d20/d23/f74 0 2026-03-09T16:14:31.867 INFO:tasks.workunit.client.0.vm03.stdout:8/427: getdents da/d32 0 2026-03-09T16:14:31.867 INFO:tasks.workunit.client.0.vm03.stdout:8/428: chown da/db/d43/l83 66030462 1 2026-03-09T16:14:31.867 INFO:tasks.workunit.client.0.vm03.stdout:3/370: getdents d5/d53 0 2026-03-09T16:14:31.867 INFO:tasks.workunit.client.0.vm03.stdout:2/412: creat db/d12/d2a/d61/d79/f95 x:0 0 0 2026-03-09T16:14:31.867 INFO:tasks.workunit.client.0.vm03.stdout:0/425: symlink d0/d7/d3e/d57/d5a/d74/l91 0 2026-03-09T16:14:31.871 INFO:tasks.workunit.client.0.vm03.stdout:3/371: rename d5/d1e/d42/d37 to d5/d53/d6c 0 
2026-03-09T16:14:31.875 INFO:tasks.workunit.client.0.vm03.stdout:9/463: mkdir d2/d54/d7d/d8f 0 2026-03-09T16:14:31.884 INFO:tasks.workunit.client.0.vm03.stdout:0/426: symlink d0/da/d7a/l92 0 2026-03-09T16:14:31.884 INFO:tasks.workunit.client.0.vm03.stdout:0/427: chown d0/d7/d3e 1 1 2026-03-09T16:14:31.885 INFO:tasks.workunit.client.0.vm03.stdout:6/385: symlink d9/d42/d45/l76 0 2026-03-09T16:14:31.889 INFO:tasks.workunit.client.0.vm03.stdout:7/377: dwrite d4/d2d/f52 [0,4194304] 0 2026-03-09T16:14:31.891 INFO:tasks.workunit.client.0.vm03.stdout:4/411: getdents d5/db/d25/d31/d33 0 2026-03-09T16:14:31.894 INFO:tasks.workunit.client.0.vm03.stdout:8/429: sync 2026-03-09T16:14:31.902 INFO:tasks.workunit.client.0.vm03.stdout:9/464: rename d2/d4/d11/d29/d2a/d46/f47 to d2/d54/f90 0 2026-03-09T16:14:31.902 INFO:tasks.workunit.client.0.vm03.stdout:2/413: truncate db/d12/f84 705126 0 2026-03-09T16:14:31.902 INFO:tasks.workunit.client.0.vm03.stdout:5/479: rmdir d2/d7/d8/d16/da4 0 2026-03-09T16:14:31.902 INFO:tasks.workunit.client.0.vm03.stdout:1/317: link d4/d6/d3b/d6b/f42 d4/d31/d5c/f75 0 2026-03-09T16:14:31.903 INFO:tasks.workunit.client.0.vm03.stdout:1/318: dread - d4/db/f60 zero size 2026-03-09T16:14:31.903 INFO:tasks.workunit.client.0.vm03.stdout:4/412: fdatasync d5/dd/f16 0 2026-03-09T16:14:31.903 INFO:tasks.workunit.client.0.vm03.stdout:2/414: chown f0 63472 1 2026-03-09T16:14:31.912 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:31 vm03.local ceph-mon[51019]: Reconfiguring daemon prometheus.vm03 on vm03 2026-03-09T16:14:31.912 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:31 vm03.local ceph-mon[51019]: Standby manager daemon vm03.gbgzmu started 2026-03-09T16:14:31.912 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:31 vm03.local ceph-mon[51019]: from='mgr.? 192.168.123.103:0/4074088957' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.gbgzmu/crt"}]: dispatch 2026-03-09T16:14:31.912 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:31 vm03.local ceph-mon[51019]: from='mgr.? 192.168.123.103:0/4074088957' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T16:14:31.912 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:31 vm03.local ceph-mon[51019]: from='mgr.? 192.168.123.103:0/4074088957' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.gbgzmu/key"}]: dispatch 2026-03-09T16:14:31.912 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:31 vm03.local ceph-mon[51019]: from='mgr.? 
192.168.123.103:0/4074088957' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T16:14:31.912 INFO:tasks.workunit.client.0.vm03.stdout:0/428: dread d0/da/ff [0,4194304] 0 2026-03-09T16:14:31.914 INFO:tasks.workunit.client.0.vm03.stdout:8/430: creat da/d10/d28/f8b x:0 0 0 2026-03-09T16:14:31.914 INFO:tasks.workunit.client.0.vm03.stdout:6/386: mkdir d9/d14/d77 0 2026-03-09T16:14:31.915 INFO:tasks.workunit.client.0.vm03.stdout:8/431: chown da/db/f44 7029036 1 2026-03-09T16:14:31.917 INFO:tasks.workunit.client.0.vm03.stdout:7/378: symlink d4/da/d18/d22/d24/d16/d3e/d77/l7a 0 2026-03-09T16:14:31.919 INFO:tasks.workunit.client.0.vm03.stdout:9/465: creat d2/df/d84/d8a/f91 x:0 0 0 2026-03-09T16:14:31.919 INFO:tasks.workunit.client.0.vm03.stdout:9/466: dread - d2/df/d84/d8a/f91 zero size 2026-03-09T16:14:31.920 INFO:tasks.workunit.client.0.vm03.stdout:2/415: unlink db/d12/d2a/d61/d79/c8e 0 2026-03-09T16:14:31.922 INFO:tasks.workunit.client.0.vm03.stdout:7/379: dwrite d4/f3b [0,4194304] 0 2026-03-09T16:14:31.928 INFO:tasks.workunit.client.0.vm03.stdout:4/413: rename d5/db/f2f to d5/db/d25/d31/d33/d79/f7b 0 2026-03-09T16:14:31.937 INFO:tasks.workunit.client.0.vm03.stdout:3/372: rename d5/d58 to d5/d6d 0 2026-03-09T16:14:31.945 INFO:tasks.workunit.client.0.vm03.stdout:0/429: creat d0/d7/d3e/d5d/f93 x:0 0 0 2026-03-09T16:14:31.945 INFO:tasks.workunit.client.0.vm03.stdout:8/432: unlink da/d10/f6e 0 2026-03-09T16:14:31.945 INFO:tasks.workunit.client.0.vm03.stdout:5/480: mkdir d2/d7/de/da9 0 2026-03-09T16:14:31.948 INFO:tasks.workunit.client.0.vm03.stdout:5/481: fdatasync d2/d7/d3c/d3d/f56 0 2026-03-09T16:14:31.949 INFO:tasks.workunit.client.0.vm03.stdout:4/414: sync 2026-03-09T16:14:31.950 INFO:tasks.workunit.client.0.vm03.stdout:5/482: write d2/d7/d3c/f9a [262773,67863] 0 2026-03-09T16:14:31.968 INFO:tasks.workunit.client.0.vm03.stdout:1/319: link d4/d6/d1d/d20/d23/f74 d4/d6/d1d/d69/f76 0 2026-03-09T16:14:31.968 INFO:tasks.workunit.client.0.vm03.stdout:8/433: rmdir da/db 39 2026-03-09T16:14:31.968 INFO:tasks.workunit.client.0.vm03.stdout:6/387: creat d9/d42/f78 x:0 0 0 2026-03-09T16:14:31.969 INFO:tasks.workunit.client.0.vm03.stdout:4/415: write d5/db/d25/d31/d33/d79/f49 [2213998,53925] 0 2026-03-09T16:14:31.982 INFO:tasks.workunit.client.0.vm03.stdout:3/373: write d5/f2b [814534,10160] 0 2026-03-09T16:14:31.985 INFO:tasks.workunit.client.0.vm03.stdout:2/416: dwrite db/d12/d2a/d61/f47 [0,4194304] 0 2026-03-09T16:14:31.988 INFO:tasks.workunit.client.0.vm03.stdout:6/388: fdatasync d9/d14/f1d 0 2026-03-09T16:14:31.988 INFO:tasks.workunit.client.0.vm03.stdout:2/417: fdatasync db/d12/d2a/f58 0 2026-03-09T16:14:31.989 INFO:tasks.workunit.client.0.vm03.stdout:2/418: read db/f2d [45089,114851] 0 2026-03-09T16:14:31.991 INFO:tasks.workunit.client.0.vm03.stdout:3/374: dwrite d5/f2b [0,4194304] 0 2026-03-09T16:14:31.991 INFO:tasks.workunit.client.0.vm03.stdout:2/419: write f0 [4974614,88593] 0 2026-03-09T16:14:31.994 INFO:tasks.workunit.client.0.vm03.stdout:2/420: chown db/d12/d2a/c35 4038801 1 2026-03-09T16:14:32.000 INFO:tasks.workunit.client.0.vm03.stdout:8/434: rename da/d10/d28/d4f/d68/d80/f1b to da/d10/d28/f8c 0 2026-03-09T16:14:32.005 INFO:tasks.workunit.client.0.vm03.stdout:9/467: getdents d2/df/d84 0 2026-03-09T16:14:32.010 INFO:tasks.workunit.client.0.vm03.stdout:7/380: getdents d4/da 0 2026-03-09T16:14:32.012 INFO:tasks.workunit.client.0.vm03.stdout:0/430: getdents d0/d7 0 2026-03-09T16:14:32.012 INFO:tasks.workunit.client.0.vm03.stdout:5/483: 
creat d2/d7/de/faa x:0 0 0 2026-03-09T16:14:32.013 INFO:tasks.workunit.client.0.vm03.stdout:0/431: chown d0/d7/d3e/d5d 124753709 1 2026-03-09T16:14:32.014 INFO:tasks.workunit.client.0.vm03.stdout:0/432: read d0/da/d5c/f33 [653279,50982] 0 2026-03-09T16:14:32.015 INFO:tasks.workunit.client.0.vm03.stdout:0/433: chown d0/d7/d3e/d57/d5a/d52/l8d 3139834 1 2026-03-09T16:14:32.019 INFO:tasks.workunit.client.0.vm03.stdout:0/434: dwrite d0/d7/d3e/d57/d5a/d47/f88 [0,4194304] 0 2026-03-09T16:14:32.025 INFO:tasks.workunit.client.0.vm03.stdout:0/435: dwrite d0/d7/d3e/f4f [4194304,4194304] 0 2026-03-09T16:14:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:31 vm05.local ceph-mon[58702]: Reconfiguring daemon prometheus.vm03 on vm03 2026-03-09T16:14:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:31 vm05.local ceph-mon[58702]: Standby manager daemon vm03.gbgzmu started 2026-03-09T16:14:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:31 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.103:0/4074088957' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.gbgzmu/crt"}]: dispatch 2026-03-09T16:14:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:31 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.103:0/4074088957' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T16:14:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:31 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.103:0/4074088957' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.gbgzmu/key"}]: dispatch 2026-03-09T16:14:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:31 vm05.local ceph-mon[58702]: from='mgr.? 
192.168.123.103:0/4074088957' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T16:14:32.050 INFO:tasks.workunit.client.0.vm03.stdout:2/421: mknod db/d12/d2a/d61/c96 0 2026-03-09T16:14:32.050 INFO:tasks.workunit.client.0.vm03.stdout:1/320: creat d4/d6/d3b/d63/f77 x:0 0 0 2026-03-09T16:14:32.052 INFO:tasks.workunit.client.0.vm03.stdout:4/416: rename d5/dd/f16 to d5/dd/d1f/d5f/f7c 0 2026-03-09T16:14:32.052 INFO:tasks.workunit.client.0.vm03.stdout:4/417: fdatasync d5/dd/d1f/f70 0 2026-03-09T16:14:32.053 INFO:tasks.workunit.client.0.vm03.stdout:8/435: mknod da/d10/d28/d4f/d68/c8d 0 2026-03-09T16:14:32.060 INFO:tasks.workunit.client.0.vm03.stdout:5/484: rmdir d2/d7/de/d11 39 2026-03-09T16:14:32.067 INFO:tasks.workunit.client.0.vm03.stdout:3/375: mknod d5/d44/d61/c6e 0 2026-03-09T16:14:32.075 INFO:tasks.workunit.client.0.vm03.stdout:2/422: dread db/d12/f77 [0,4194304] 0 2026-03-09T16:14:32.075 INFO:tasks.workunit.client.0.vm03.stdout:2/423: write db/d12/d2a/d61/f65 [913851,6135] 0 2026-03-09T16:14:32.076 INFO:tasks.workunit.client.0.vm03.stdout:3/376: dread d5/d1e/f26 [0,4194304] 0 2026-03-09T16:14:32.077 INFO:tasks.workunit.client.0.vm03.stdout:9/468: rename d2/df/d5f to d2/d4/d11/d29/d92 0 2026-03-09T16:14:32.078 INFO:tasks.workunit.client.0.vm03.stdout:9/469: stat d2/d4/d11/d29/d2a 0 2026-03-09T16:14:32.084 INFO:tasks.workunit.client.0.vm03.stdout:8/436: read da/db/f75 [213780,13699] 0 2026-03-09T16:14:32.092 INFO:tasks.workunit.client.0.vm03.stdout:0/436: creat d0/d7/d3e/d45/d8e/f94 x:0 0 0 2026-03-09T16:14:32.095 INFO:tasks.workunit.client.0.vm03.stdout:6/389: creat d9/d42/f79 x:0 0 0 2026-03-09T16:14:32.098 INFO:tasks.workunit.client.0.vm03.stdout:3/377: unlink d5/f16 0 2026-03-09T16:14:32.098 INFO:tasks.workunit.client.0.vm03.stdout:3/378: chown d5/d2e 1779 1 2026-03-09T16:14:32.099 INFO:tasks.workunit.client.0.vm03.stdout:3/379: write d5/fb [2982739,106926] 0 2026-03-09T16:14:32.099 INFO:tasks.workunit.client.0.vm03.stdout:3/380: fdatasync d5/d1e/f66 0 2026-03-09T16:14:32.100 INFO:tasks.workunit.client.0.vm03.stdout:3/381: fsync d5/d2e/f65 0 2026-03-09T16:14:32.100 INFO:tasks.workunit.client.0.vm03.stdout:3/382: dread - d5/d44/f54 zero size 2026-03-09T16:14:32.112 INFO:tasks.workunit.client.0.vm03.stdout:7/381: write d4/da/d45/f4e [670518,115779] 0 2026-03-09T16:14:32.118 INFO:tasks.workunit.client.0.vm03.stdout:0/437: mkdir d0/d7/d3e/d95 0 2026-03-09T16:14:32.121 INFO:tasks.workunit.client.0.vm03.stdout:6/390: chown d9/d42/d45/d63/d66 0 1 2026-03-09T16:14:32.121 INFO:tasks.workunit.client.0.vm03.stdout:6/391: write d9/d42/d45/f4a [1559356,73365] 0 2026-03-09T16:14:32.135 INFO:tasks.workunit.client.0.vm03.stdout:6/392: write d9/d42/d45/d47/f6a [184,6508] 0 2026-03-09T16:14:32.136 INFO:tasks.workunit.client.0.vm03.stdout:6/393: write d9/f73 [678759,68044] 0 2026-03-09T16:14:32.140 INFO:tasks.workunit.client.0.vm03.stdout:6/394: dread d9/d22/f27 [0,4194304] 0 2026-03-09T16:14:32.145 INFO:tasks.workunit.client.0.vm03.stdout:3/383: symlink d5/d1e/d42/d34/l6f 0 2026-03-09T16:14:32.146 INFO:tasks.workunit.client.0.vm03.stdout:3/384: write d5/d1e/f26 [1074815,40203] 0 2026-03-09T16:14:32.149 INFO:tasks.workunit.client.0.vm03.stdout:3/385: write d5/d1e/f66 [410667,125392] 0 2026-03-09T16:14:32.155 INFO:tasks.workunit.client.0.vm03.stdout:3/386: dwrite d5/d44/f56 [0,4194304] 0 2026-03-09T16:14:32.161 INFO:tasks.workunit.client.0.vm03.stdout:3/387: write d5/d44/f56 [1579706,79556] 0 2026-03-09T16:14:32.165 
INFO:tasks.workunit.client.0.vm03.stdout:5/485: dwrite d2/d7/d8/f7a [0,4194304] 0 2026-03-09T16:14:32.167 INFO:tasks.workunit.client.0.vm03.stdout:3/388: dwrite d5/d44/f5d [0,4194304] 0 2026-03-09T16:14:32.180 INFO:tasks.workunit.client.0.vm03.stdout:1/321: write d4/d31/d5c/f75 [438981,49673] 0 2026-03-09T16:14:32.183 INFO:tasks.workunit.client.0.vm03.stdout:7/382: creat d4/da/d18/d22/d24/d16/d6e/f7b x:0 0 0 2026-03-09T16:14:32.187 INFO:tasks.workunit.client.0.vm03.stdout:7/383: dwrite d4/da/d18/d22/d24/d16/d2b/f56 [0,4194304] 0 2026-03-09T16:14:32.192 INFO:tasks.workunit.client.0.vm03.stdout:8/437: unlink da/d10/d28/d4f/d68/d80/l27 0 2026-03-09T16:14:32.193 INFO:tasks.workunit.client.0.vm03.stdout:0/438: unlink d0/f1e 0 2026-03-09T16:14:32.194 INFO:tasks.workunit.client.0.vm03.stdout:6/395: mknod d9/d42/d45/d63/c7a 0 2026-03-09T16:14:32.203 INFO:tasks.workunit.client.0.vm03.stdout:7/384: dread d4/da/d18/d22/d24/d15/f2a [0,4194304] 0 2026-03-09T16:14:32.206 INFO:tasks.workunit.client.0.vm03.stdout:3/389: mkdir d5/d1e/d42/d34/d70 0 2026-03-09T16:14:32.207 INFO:tasks.workunit.client.0.vm03.stdout:3/390: stat d5/d44/d61 0 2026-03-09T16:14:32.208 INFO:tasks.workunit.client.0.vm03.stdout:3/391: chown d5/f11 298 1 2026-03-09T16:14:32.209 INFO:tasks.workunit.client.0.vm03.stdout:4/418: getdents d5 0 2026-03-09T16:14:32.210 INFO:tasks.workunit.client.0.vm03.stdout:3/392: dread d5/d1e/d42/d55/f57 [0,4194304] 0 2026-03-09T16:14:32.211 INFO:tasks.workunit.client.0.vm03.stdout:4/419: stat d5/d17/d44/l50 0 2026-03-09T16:14:32.213 INFO:tasks.workunit.client.0.vm03.stdout:4/420: fsync d5/db/d25/f4e 0 2026-03-09T16:14:32.213 INFO:tasks.workunit.client.0.vm03.stdout:6/396: dwrite d9/d42/d45/d50/f51 [0,4194304] 0 2026-03-09T16:14:32.231 INFO:tasks.workunit.client.0.vm03.stdout:9/470: getdents d2/d4 0 2026-03-09T16:14:32.231 INFO:tasks.workunit.client.0.vm03.stdout:2/424: getdents db/d12/d2a/d61/d6d 0 2026-03-09T16:14:32.231 INFO:tasks.workunit.client.0.vm03.stdout:4/421: fsync d5/dd/d1f/f59 0 2026-03-09T16:14:32.232 INFO:tasks.workunit.client.0.vm03.stdout:9/471: readlink d2/d4/d11/d29/d2a/l3a 0 2026-03-09T16:14:32.233 INFO:tasks.workunit.client.0.vm03.stdout:1/322: dread d4/db/f2e [0,4194304] 0 2026-03-09T16:14:32.238 INFO:tasks.workunit.client.0.vm03.stdout:0/439: mknod d0/d7/d3e/d57/d5a/d82/d89/c96 0 2026-03-09T16:14:32.240 INFO:tasks.workunit.client.0.vm03.stdout:0/440: write d0/d7/f3d [3686948,30497] 0 2026-03-09T16:14:32.242 INFO:tasks.workunit.client.0.vm03.stdout:2/425: dread db/d12/f39 [0,4194304] 0 2026-03-09T16:14:32.247 INFO:tasks.workunit.client.0.vm03.stdout:9/472: readlink d2/d4/d1f/l59 0 2026-03-09T16:14:32.247 INFO:tasks.workunit.client.0.vm03.stdout:0/441: fdatasync d0/f29 0 2026-03-09T16:14:32.252 INFO:tasks.workunit.client.0.vm03.stdout:2/426: creat db/d12/d2a/d61/d6d/d8c/f97 x:0 0 0 2026-03-09T16:14:32.253 INFO:tasks.workunit.client.0.vm03.stdout:1/323: dwrite d4/d6/d1d/d20/f72 [0,4194304] 0 2026-03-09T16:14:32.254 INFO:tasks.workunit.client.0.vm03.stdout:1/324: chown d4/d6/d1d 14767295 1 2026-03-09T16:14:32.255 INFO:tasks.workunit.client.0.vm03.stdout:2/427: truncate db/d12/d2a/d61/d79/f7f 330948 0 2026-03-09T16:14:32.255 INFO:tasks.workunit.client.0.vm03.stdout:1/325: chown d4/d31 423 1 2026-03-09T16:14:32.265 INFO:tasks.workunit.client.0.vm03.stdout:9/473: sync 2026-03-09T16:14:32.265 INFO:tasks.workunit.client.0.vm03.stdout:7/385: sync 2026-03-09T16:14:32.265 INFO:tasks.workunit.client.0.vm03.stdout:3/393: sync 2026-03-09T16:14:32.266 INFO:tasks.workunit.client.0.vm03.stdout:3/394: 
read - d5/d44/f54 zero size 2026-03-09T16:14:32.267 INFO:tasks.workunit.client.0.vm03.stdout:7/386: truncate d4/da/d18/f6a 429466 0 2026-03-09T16:14:32.269 INFO:tasks.workunit.client.0.vm03.stdout:6/397: rmdir d9/d14/d77 0 2026-03-09T16:14:32.277 INFO:tasks.workunit.client.0.vm03.stdout:7/387: dwrite d4/d2d/f32 [0,4194304] 0 2026-03-09T16:14:32.280 INFO:tasks.workunit.client.0.vm03.stdout:0/442: creat d0/d7/d3e/d57/d5a/d52/f97 x:0 0 0 2026-03-09T16:14:32.280 INFO:tasks.workunit.client.0.vm03.stdout:9/474: dread d2/d4/d11/d29/d2a/d38/f72 [0,4194304] 0 2026-03-09T16:14:32.282 INFO:tasks.workunit.client.0.vm03.stdout:0/443: chown d0/f4d 2 1 2026-03-09T16:14:32.287 INFO:tasks.workunit.client.0.vm03.stdout:7/388: dwrite d4/da/f5f [0,4194304] 0 2026-03-09T16:14:32.293 INFO:tasks.workunit.client.0.vm03.stdout:9/475: dwrite d2/d4/f17 [0,4194304] 0 2026-03-09T16:14:32.311 INFO:tasks.workunit.client.0.vm03.stdout:7/389: dwrite d4/da/d18/d22/d24/d16/d2b/f56 [0,4194304] 0 2026-03-09T16:14:32.312 INFO:tasks.workunit.client.0.vm03.stdout:3/395: chown d5/l8 170 1 2026-03-09T16:14:32.313 INFO:tasks.workunit.client.0.vm03.stdout:7/390: chown d4/da/d18/d22/d24/d16/f6c 6506316 1 2026-03-09T16:14:32.313 INFO:tasks.workunit.client.0.vm03.stdout:7/391: fsync d4/da/d45/f63 0 2026-03-09T16:14:32.332 INFO:tasks.workunit.client.0.vm03.stdout:2/428: mknod db/d12/d2a/d61/d6d/d8c/d94/c98 0 2026-03-09T16:14:32.339 INFO:tasks.workunit.client.0.vm03.stdout:5/486: write d2/d7/de/f48 [1546739,117609] 0 2026-03-09T16:14:32.342 INFO:tasks.workunit.client.0.vm03.stdout:0/444: dread - d0/f54 zero size 2026-03-09T16:14:32.348 INFO:tasks.workunit.client.0.vm03.stdout:8/438: truncate da/db/d30/f36 817435 0 2026-03-09T16:14:32.348 INFO:tasks.workunit.client.0.vm03.stdout:9/476: mknod d2/d4/d11/d29/d92/c93 0 2026-03-09T16:14:32.348 INFO:tasks.workunit.client.0.vm03.stdout:7/392: symlink d4/dc/l7c 0 2026-03-09T16:14:32.350 INFO:tasks.workunit.client.0.vm03.stdout:9/477: dread - d2/df/d89/f7e zero size 2026-03-09T16:14:32.350 INFO:tasks.workunit.client.0.vm03.stdout:3/396: rename d5/l8 to d5/d1e/d42/l71 0 2026-03-09T16:14:32.356 INFO:tasks.workunit.client.0.vm03.stdout:8/439: dread da/db/f1c [0,4194304] 0 2026-03-09T16:14:32.360 INFO:tasks.workunit.client.0.vm03.stdout:4/422: truncate d5/d17/d44/f64 345549 0 2026-03-09T16:14:32.361 INFO:tasks.workunit.client.0.vm03.stdout:6/398: write d9/f35 [4092052,128796] 0 2026-03-09T16:14:32.379 INFO:tasks.workunit.client.0.vm03.stdout:1/326: dwrite d4/d6/d1d/d20/d23/d3e/d3f/f48 [0,4194304] 0 2026-03-09T16:14:32.394 INFO:tasks.workunit.client.0.vm03.stdout:9/478: dread d2/d4/d11/d12/f45 [0,4194304] 0 2026-03-09T16:14:32.398 INFO:tasks.workunit.client.0.vm03.stdout:9/479: stat d2/de/d88/f86 0 2026-03-09T16:14:32.402 INFO:tasks.workunit.client.0.vm03.stdout:9/480: dread d2/df/f22 [0,4194304] 0 2026-03-09T16:14:32.403 INFO:tasks.workunit.client.0.vm03.stdout:9/481: dread d2/de/d88/f75 [0,4194304] 0 2026-03-09T16:14:32.409 INFO:tasks.workunit.client.0.vm03.stdout:5/487: truncate d2/d7/d8/d16/d5c/f64 709071 0 2026-03-09T16:14:32.411 INFO:tasks.workunit.client.0.vm03.stdout:3/397: creat d5/d1e/f72 x:0 0 0 2026-03-09T16:14:32.420 INFO:tasks.workunit.client.0.vm03.stdout:7/393: rename d4/da/d18/d22/d24/d16/f67 to d4/da/d18/d22/d24/d15/d71/f7d 0 2026-03-09T16:14:32.422 INFO:tasks.workunit.client.0.vm03.stdout:6/399: dread d9/d22/f24 [4194304,4194304] 0 2026-03-09T16:14:32.422 INFO:tasks.workunit.client.0.vm03.stdout:5/488: dwrite d2/d7/d3c/d3d/fa3 [0,4194304] 0 2026-03-09T16:14:32.423 
INFO:tasks.workunit.client.0.vm03.stdout:6/400: truncate d9/d22/f3f 1739859 0 2026-03-09T16:14:32.428 INFO:tasks.workunit.client.0.vm03.stdout:6/401: dwrite d9/d42/d45/d47/f6a [0,4194304] 0 2026-03-09T16:14:32.432 INFO:tasks.workunit.client.0.vm03.stdout:8/440: creat da/d6c/f8e x:0 0 0 2026-03-09T16:14:32.441 INFO:tasks.workunit.client.0.vm03.stdout:4/423: mkdir d5/db/d25/d31/d4d/d5b/d7d 0 2026-03-09T16:14:32.441 INFO:tasks.workunit.client.0.vm03.stdout:4/424: stat d5/db/d25/c3e 0 2026-03-09T16:14:32.452 INFO:tasks.workunit.client.0.vm03.stdout:2/429: dread db/d12/d2a/d61/d79/d83/d64/d68/f6b [0,4194304] 0 2026-03-09T16:14:32.459 INFO:tasks.workunit.client.0.vm03.stdout:2/430: dread - db/d12/d2a/d61/d6d/f8a zero size 2026-03-09T16:14:32.476 INFO:tasks.workunit.client.0.vm03.stdout:2/431: dwrite db/d12/d2a/d61/f5d [0,4194304] 0 2026-03-09T16:14:32.476 INFO:tasks.workunit.client.0.vm03.stdout:1/327: creat d4/d6/d3b/d63/f78 x:0 0 0 2026-03-09T16:14:32.478 INFO:tasks.workunit.client.0.vm03.stdout:9/482: rmdir d2/d4 39 2026-03-09T16:14:32.484 INFO:tasks.workunit.client.0.vm03.stdout:3/398: dread d5/f33 [4194304,4194304] 0 2026-03-09T16:14:32.491 INFO:tasks.workunit.client.0.vm03.stdout:0/445: truncate d0/da/d5c/f39 694513 0 2026-03-09T16:14:32.494 INFO:tasks.workunit.client.0.vm03.stdout:6/402: rename d9/l6c to d9/d42/d45/d65/l7b 0 2026-03-09T16:14:32.494 INFO:tasks.workunit.client.0.vm03.stdout:7/394: mkdir d4/da/d18/d22/d24/d16/d6e/d7e 0 2026-03-09T16:14:32.494 INFO:tasks.workunit.client.0.vm03.stdout:6/403: write d9/d42/f78 [943775,57047] 0 2026-03-09T16:14:32.497 INFO:tasks.workunit.client.0.vm03.stdout:8/441: creat da/d10/d28/d4f/d68/f8f x:0 0 0 2026-03-09T16:14:32.497 INFO:tasks.workunit.client.0.vm03.stdout:4/425: mknod d5/db/d25/d31/c7e 0 2026-03-09T16:14:32.497 INFO:tasks.workunit.client.0.vm03.stdout:7/395: chown d4/da/d18/d22/d24/f41 804 1 2026-03-09T16:14:32.499 INFO:tasks.workunit.client.0.vm03.stdout:8/442: dread - da/d10/d28/f57 zero size 2026-03-09T16:14:32.503 INFO:tasks.workunit.client.0.vm03.stdout:7/396: dwrite d4/dc/f1a [0,4194304] 0 2026-03-09T16:14:32.509 INFO:tasks.workunit.client.0.vm03.stdout:7/397: write d4/dc/f1a [429729,11802] 0 2026-03-09T16:14:32.512 INFO:tasks.workunit.client.0.vm03.stdout:1/328: creat d4/d31/f79 x:0 0 0 2026-03-09T16:14:32.518 INFO:tasks.workunit.client.0.vm03.stdout:7/398: dwrite d4/da/d18/f6a [0,4194304] 0 2026-03-09T16:14:32.526 INFO:tasks.workunit.client.0.vm03.stdout:3/399: creat d5/d1e/d42/d34/f73 x:0 0 0 2026-03-09T16:14:32.528 INFO:tasks.workunit.client.0.vm03.stdout:3/400: fdatasync d5/fb 0 2026-03-09T16:14:32.528 INFO:tasks.workunit.client.0.vm03.stdout:0/446: mkdir d0/da/d7a/d98 0 2026-03-09T16:14:32.528 INFO:tasks.workunit.client.0.vm03.stdout:3/401: write d5/d1e/f72 [968399,100202] 0 2026-03-09T16:14:32.528 INFO:tasks.workunit.client.0.vm03.stdout:0/447: dread - d0/d7/d3e/d57/d5a/d5f/f84 zero size 2026-03-09T16:14:32.529 INFO:tasks.workunit.client.0.vm03.stdout:0/448: chown d0/c22 1 1 2026-03-09T16:14:32.535 INFO:tasks.workunit.client.0.vm03.stdout:5/489: rmdir d2/d7/de/d11/d38 39 2026-03-09T16:14:32.536 INFO:tasks.workunit.client.0.vm03.stdout:6/404: chown d9/d14/c2e 45553311 1 2026-03-09T16:14:32.537 INFO:tasks.workunit.client.0.vm03.stdout:5/490: read d2/d7/de/f78 [765245,43177] 0 2026-03-09T16:14:32.542 INFO:tasks.workunit.client.0.vm03.stdout:7/399: dread d4/da/d18/d22/d24/d16/d6e/f73 [0,4194304] 0 2026-03-09T16:14:32.552 INFO:tasks.workunit.client.0.vm03.stdout:4/426: creat d5/db/d25/d31/d33/d55/f7f x:0 0 0 
2026-03-09T16:14:32.560 INFO:tasks.workunit.client.0.vm03.stdout:2/432: mkdir db/d12/d2a/d99 0 2026-03-09T16:14:32.564 INFO:tasks.workunit.client.0.vm03.stdout:1/329: read d4/d6/d3b/f36 [205983,126035] 0 2026-03-09T16:14:32.572 INFO:tasks.workunit.client.0.vm03.stdout:9/483: symlink d2/d4/d11/d12/l94 0 2026-03-09T16:14:32.573 INFO:tasks.workunit.client.0.vm03.stdout:0/449: unlink d0/d7/d3e/c8f 0 2026-03-09T16:14:32.574 INFO:tasks.workunit.client.0.vm03.stdout:3/402: creat d5/d1e/d42/f74 x:0 0 0 2026-03-09T16:14:32.583 INFO:tasks.workunit.client.0.vm03.stdout:6/405: symlink d9/d42/d45/d47/l7c 0 2026-03-09T16:14:32.595 INFO:tasks.workunit.client.0.vm03.stdout:8/443: truncate da/db/d30/f36 770401 0 2026-03-09T16:14:32.601 INFO:tasks.workunit.client.0.vm03.stdout:9/484: creat d2/d4/d11/d29/f95 x:0 0 0 2026-03-09T16:14:32.601 INFO:tasks.workunit.client.0.vm03.stdout:9/485: chown d2/de 878 1 2026-03-09T16:14:32.602 INFO:tasks.workunit.client.0.vm03.stdout:3/403: rename d5/d1e/l3f to d5/d1e/l75 0 2026-03-09T16:14:32.604 INFO:tasks.workunit.client.0.vm03.stdout:3/404: rename d5/d1e/d42 to d5/d1e/d42/d76 22 2026-03-09T16:14:32.605 INFO:tasks.workunit.client.0.vm03.stdout:4/427: write d5/fa [2842248,63696] 0 2026-03-09T16:14:32.608 INFO:tasks.workunit.client.0.vm03.stdout:2/433: write db/d12/d2a/f5f [319891,91168] 0 2026-03-09T16:14:32.613 INFO:tasks.workunit.client.0.vm03.stdout:6/406: creat d9/d42/d45/d63/d66/f7d x:0 0 0 2026-03-09T16:14:32.621 INFO:tasks.workunit.client.0.vm03.stdout:7/400: mknod d4/da/d18/d22/d24/d16/d6e/d7e/c7f 0 2026-03-09T16:14:32.624 INFO:tasks.workunit.client.0.vm03.stdout:5/491: mknod d2/d7/de/da9/cab 0 2026-03-09T16:14:32.624 INFO:tasks.workunit.client.0.vm03.stdout:8/444: creat da/d32/d79/f90 x:0 0 0 2026-03-09T16:14:32.625 INFO:tasks.workunit.client.0.vm03.stdout:1/330: mknod d4/d6/d3b/d6b/d25/d50/c7a 0 2026-03-09T16:14:32.626 INFO:tasks.workunit.client.0.vm03.stdout:8/445: stat da/d1d/d3b/f4b 0 2026-03-09T16:14:32.627 INFO:tasks.workunit.client.0.vm03.stdout:1/331: stat d4/d6/d1d/d20/d23/d3e/c4d 0 2026-03-09T16:14:32.627 INFO:tasks.workunit.client.0.vm03.stdout:1/332: fdatasync d4/f6d 0 2026-03-09T16:14:32.633 INFO:tasks.workunit.client.0.vm03.stdout:9/486: rename d2/de/c4b to d2/d4/d11/d29/d2a/d38/c96 0 2026-03-09T16:14:32.633 INFO:tasks.workunit.client.0.vm03.stdout:1/333: stat d4/d6/d1d/d20 0 2026-03-09T16:14:32.633 INFO:tasks.workunit.client.0.vm03.stdout:4/428: unlink d5/dd/c37 0 2026-03-09T16:14:32.634 INFO:tasks.workunit.client.0.vm03.stdout:2/434: creat db/d12/d2a/d61/d79/f9a x:0 0 0 2026-03-09T16:14:32.634 INFO:tasks.workunit.client.0.vm03.stdout:4/429: readlink d5/l68 0 2026-03-09T16:14:32.636 INFO:tasks.workunit.client.0.vm03.stdout:8/446: creat da/d6c/d7a/f91 x:0 0 0 2026-03-09T16:14:32.637 INFO:tasks.workunit.client.0.vm03.stdout:5/492: dread - d2/d7/de/d11/d38/d3b/fa2 zero size 2026-03-09T16:14:32.637 INFO:tasks.workunit.client.0.vm03.stdout:8/447: write f8 [3284628,53222] 0 2026-03-09T16:14:32.643 INFO:tasks.workunit.client.0.vm03.stdout:3/405: rename d5/l49 to d5/d6d/d5a/d63/l77 0 2026-03-09T16:14:32.644 INFO:tasks.workunit.client.0.vm03.stdout:7/401: dwrite d4/da/d18/d22/d24/d16/d6e/f7b [0,4194304] 0 2026-03-09T16:14:32.651 INFO:tasks.workunit.client.0.vm03.stdout:5/493: creat d2/d7/de/d11/d19/d29/d90/fac x:0 0 0 2026-03-09T16:14:32.658 INFO:tasks.workunit.client.0.vm03.stdout:2/435: read db/d12/f62 [2914084,60259] 0 2026-03-09T16:14:32.660 INFO:tasks.workunit.client.0.vm03.stdout:0/450: getdents d0/da/d7a 0 2026-03-09T16:14:32.661 
INFO:tasks.workunit.client.0.vm03.stdout:1/334: unlink d4/c29 0 2026-03-09T16:14:32.661 INFO:tasks.workunit.client.0.vm03.stdout:3/406: creat d5/d6d/d5a/f78 x:0 0 0 2026-03-09T16:14:32.661 INFO:tasks.workunit.client.0.vm03.stdout:8/448: chown da/d10/f1f 1 1 2026-03-09T16:14:32.663 INFO:tasks.workunit.client.0.vm03.stdout:9/487: mknod d2/d4/d1f/d83/c97 0 2026-03-09T16:14:32.663 INFO:tasks.workunit.client.0.vm03.stdout:7/402: dread d4/da/d18/d22/d24/d15/d71/f7d [0,4194304] 0 2026-03-09T16:14:32.665 INFO:tasks.workunit.client.0.vm03.stdout:5/494: mknod d2/d7/d1a/d1c/d6c/cad 0 2026-03-09T16:14:32.666 INFO:tasks.workunit.client.0.vm03.stdout:4/430: dwrite d5/d17/d44/f4a [8388608,4194304] 0 2026-03-09T16:14:32.666 INFO:tasks.workunit.client.0.vm03.stdout:4/431: chown d5/db/l76 0 1 2026-03-09T16:14:32.672 INFO:tasks.workunit.client.0.vm03.stdout:9/488: fsync d2/d4/d11/d29/d92/f6a 0 2026-03-09T16:14:32.673 INFO:tasks.workunit.client.0.vm03.stdout:6/407: getdents d9/d42/d45/d47 0 2026-03-09T16:14:32.677 INFO:tasks.workunit.client.0.vm03.stdout:1/335: rmdir d4/d6 39 2026-03-09T16:14:32.677 INFO:tasks.workunit.client.0.vm03.stdout:6/408: write d9/d42/d45/d63/d66/f7d [131501,27391] 0 2026-03-09T16:14:32.677 INFO:tasks.workunit.client.0.vm03.stdout:4/432: creat d5/d17/f80 x:0 0 0 2026-03-09T16:14:32.684 INFO:tasks.workunit.client.0.vm03.stdout:5/495: rename d2/d7/d1a/d1c/l5b to d2/d7/de/da9/lae 0 2026-03-09T16:14:32.687 INFO:tasks.workunit.client.0.vm03.stdout:5/496: write d2/d7/de/f48 [131112,5279] 0 2026-03-09T16:14:32.688 INFO:tasks.workunit.client.0.vm03.stdout:5/497: fsync d2/d7/d8/f7a 0 2026-03-09T16:14:32.691 INFO:tasks.workunit.client.0.vm03.stdout:3/407: dwrite d5/fb [0,4194304] 0 2026-03-09T16:14:32.692 INFO:tasks.workunit.client.0.vm03.stdout:2/436: dwrite db/d12/f63 [0,4194304] 0 2026-03-09T16:14:32.692 INFO:tasks.workunit.client.0.vm03.stdout:2/437: fsync db/d12/d2a/d61/d79/f95 0 2026-03-09T16:14:32.695 INFO:tasks.workunit.client.0.vm03.stdout:7/403: dwrite d4/f3b [0,4194304] 0 2026-03-09T16:14:32.698 INFO:tasks.workunit.client.0.vm03.stdout:2/438: write db/d12/f63 [3720377,63600] 0 2026-03-09T16:14:32.705 INFO:tasks.workunit.client.0.vm03.stdout:0/451: write d0/d7/d48/f4a [567637,50103] 0 2026-03-09T16:14:32.711 INFO:tasks.workunit.client.0.vm03.stdout:6/409: rename d9/l41 to d9/d42/d45/d63/d66/l7e 0 2026-03-09T16:14:32.712 INFO:tasks.workunit.client.0.vm03.stdout:1/336: dwrite d4/d6/d3b/d6b/f42 [0,4194304] 0 2026-03-09T16:14:32.712 INFO:tasks.workunit.client.0.vm03.stdout:8/449: link da/d10/d28/c2d da/db/d43/c92 0 2026-03-09T16:14:32.726 INFO:tasks.workunit.client.0.vm03.stdout:7/404: readlink d4/d2d/l3f 0 2026-03-09T16:14:32.727 INFO:tasks.workunit.client.0.vm03.stdout:8/450: symlink da/d10/d63/l93 0 2026-03-09T16:14:32.733 INFO:tasks.workunit.client.0.vm03.stdout:2/439: dwrite db/d12/d2a/d61/d79/f9a [0,4194304] 0 2026-03-09T16:14:32.733 INFO:tasks.workunit.client.0.vm03.stdout:3/408: unlink d5/d1e/d42/d4c/f51 0 2026-03-09T16:14:32.733 INFO:tasks.workunit.client.0.vm03.stdout:8/451: read - da/d32/f56 zero size 2026-03-09T16:14:32.737 INFO:tasks.workunit.client.0.vm03.stdout:5/498: symlink d2/d7/de/d11/d19/d31/d35/d87/laf 0 2026-03-09T16:14:32.750 INFO:tasks.workunit.client.0.vm03.stdout:1/337: mkdir d4/d7b 0 2026-03-09T16:14:32.756 INFO:tasks.workunit.client.0.vm03.stdout:3/409: mkdir d5/d53/d6c/d79 0 2026-03-09T16:14:32.757 INFO:tasks.workunit.client.0.vm03.stdout:1/338: read d4/d6/d3b/f35 [4444205,87743] 0 2026-03-09T16:14:32.757 INFO:tasks.workunit.client.0.vm03.stdout:9/489: dread 
d2/d4/d1f/f25 [0,4194304] 0 2026-03-09T16:14:32.757 INFO:tasks.workunit.client.0.vm03.stdout:3/410: chown d5/d1e/f26 18 1 2026-03-09T16:14:32.757 INFO:tasks.workunit.client.0.vm03.stdout:5/499: fsync d2/d7/de/d11/d19/d31/f42 0 2026-03-09T16:14:32.757 INFO:tasks.workunit.client.0.vm03.stdout:1/339: truncate d4/d6/d1d/d3d/f49 1387834 0 2026-03-09T16:14:32.757 INFO:tasks.workunit.client.0.vm03.stdout:1/340: chown d4 72566 1 2026-03-09T16:14:32.763 INFO:tasks.workunit.client.0.vm03.stdout:2/440: getdents db/d12/d2a/d99 0 2026-03-09T16:14:32.766 INFO:tasks.workunit.client.0.vm03.stdout:1/341: readlink d4/d6/l32 0 2026-03-09T16:14:32.769 INFO:tasks.workunit.client.0.vm03.stdout:5/500: creat d2/d7/de/d11/d19/d31/d35/fb0 x:0 0 0 2026-03-09T16:14:32.771 INFO:tasks.workunit.client.0.vm03.stdout:5/501: chown d2/d7/de/d11/d19/f8e 526409457 1 2026-03-09T16:14:32.780 INFO:tasks.workunit.client.0.vm03.stdout:2/441: stat db/c92 0 2026-03-09T16:14:32.780 INFO:tasks.workunit.client.0.vm03.stdout:3/411: creat d5/d6d/f7a x:0 0 0 2026-03-09T16:14:32.780 INFO:tasks.workunit.client.0.vm03.stdout:3/412: write d5/f2b [2066710,31187] 0 2026-03-09T16:14:32.780 INFO:tasks.workunit.client.0.vm03.stdout:5/502: creat d2/d7/d8/d24/fb1 x:0 0 0 2026-03-09T16:14:32.780 INFO:tasks.workunit.client.0.vm03.stdout:2/442: write db/d12/f85 [3161211,53100] 0 2026-03-09T16:14:32.780 INFO:tasks.workunit.client.0.vm03.stdout:5/503: dread - d2/d7/d8/d16/d5c/f94 zero size 2026-03-09T16:14:32.780 INFO:tasks.workunit.client.0.vm03.stdout:3/413: symlink d5/d6d/d5a/l7b 0 2026-03-09T16:14:32.781 INFO:tasks.workunit.client.0.vm03.stdout:3/414: fdatasync d5/d1e/f26 0 2026-03-09T16:14:32.781 INFO:tasks.workunit.client.0.vm03.stdout:3/415: fdatasync d5/d44/f54 0 2026-03-09T16:14:32.782 INFO:tasks.workunit.client.0.vm03.stdout:5/504: creat d2/d7/d3c/fb2 x:0 0 0 2026-03-09T16:14:32.782 INFO:tasks.workunit.client.0.vm03.stdout:3/416: chown d5/d1e/d42/f20 1792149 1 2026-03-09T16:14:32.783 INFO:tasks.workunit.client.0.vm03.stdout:2/443: chown db/d12/d2a/d61/l6f 118 1 2026-03-09T16:14:32.783 INFO:tasks.workunit.client.0.vm03.stdout:1/342: link d4/d6/d3b/d6b/c2c d4/d6/d1d/d20/c7c 0 2026-03-09T16:14:32.784 INFO:tasks.workunit.client.0.vm03.stdout:1/343: dread - d4/db/f60 zero size 2026-03-09T16:14:32.785 INFO:tasks.workunit.client.0.vm03.stdout:3/417: creat d5/d6d/d5a/f7c x:0 0 0 2026-03-09T16:14:32.786 INFO:tasks.workunit.client.0.vm03.stdout:1/344: chown d4/d6/d3b/d63/f78 26 1 2026-03-09T16:14:32.787 INFO:tasks.workunit.client.0.vm03.stdout:2/444: fsync db/d12/d2a/f5f 0 2026-03-09T16:14:32.788 INFO:tasks.workunit.client.0.vm03.stdout:7/405: sync 2026-03-09T16:14:32.789 INFO:tasks.workunit.client.0.vm03.stdout:5/505: truncate d2/d7/d1a/f6e 1147115 0 2026-03-09T16:14:32.789 INFO:tasks.workunit.client.0.vm03.stdout:3/418: write d5/d1e/d42/f20 [1852639,93538] 0 2026-03-09T16:14:32.791 INFO:tasks.workunit.client.0.vm03.stdout:7/406: creat d4/dc/d61/f80 x:0 0 0 2026-03-09T16:14:32.794 INFO:tasks.workunit.client.0.vm03.stdout:1/345: chown d4/d6/d3b/d63/f78 56 1 2026-03-09T16:14:32.794 INFO:tasks.workunit.client.0.vm03.stdout:3/419: write d5/d44/f56 [1214798,19052] 0 2026-03-09T16:14:32.795 INFO:tasks.workunit.client.0.vm03.stdout:5/506: chown d2/d7/de/d11/d19/d31/d35 228 1 2026-03-09T16:14:32.811 INFO:tasks.workunit.client.0.vm03.stdout:8/452: write da/db/d30/f76 [1936217,41485] 0 2026-03-09T16:14:32.816 INFO:tasks.workunit.client.0.vm03.stdout:8/453: write da/d10/d28/d4f/d68/f8f [1033490,109447] 0 2026-03-09T16:14:32.820 
INFO:tasks.workunit.client.0.vm03.stdout:1/346: dread d4/d6/d1d/d3d/f45 [0,4194304] 0 2026-03-09T16:14:32.822 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:32 vm03.local ceph-mon[51019]: pgmap v8: 65 pgs: 65 active+clean; 675 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 11 MiB/s rd, 44 MiB/s wr, 128 op/s 2026-03-09T16:14:32.823 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:32 vm03.local ceph-mon[51019]: mgrmap e24: vm05.dygxfv(active, since 11s), standbys: vm03.gbgzmu 2026-03-09T16:14:32.823 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:32 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mgr metadata", "who": "vm03.gbgzmu", "id": "vm03.gbgzmu"}]: dispatch 2026-03-09T16:14:32.823 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:32 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:32.823 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:32 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:32.823 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:32 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:14:32.823 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:32 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:32.838 INFO:tasks.workunit.client.0.vm03.stdout:3/420: creat d5/d1e/d42/d4c/f7d x:0 0 0 2026-03-09T16:14:32.841 INFO:tasks.workunit.client.0.vm03.stdout:6/410: dwrite d9/d14/f44 [0,4194304] 0 2026-03-09T16:14:32.843 INFO:tasks.workunit.client.0.vm03.stdout:8/454: dwrite da/d32/d79/f84 [0,4194304] 0 2026-03-09T16:14:32.844 INFO:tasks.workunit.client.0.vm03.stdout:6/411: readlink d9/l5b 0 2026-03-09T16:14:32.844 INFO:tasks.workunit.client.0.vm03.stdout:2/445: dwrite db/d12/f77 [0,4194304] 0 2026-03-09T16:14:32.851 INFO:tasks.workunit.client.0.vm03.stdout:9/490: dwrite d2/d4/d1f/f44 [0,4194304] 0 2026-03-09T16:14:32.861 INFO:tasks.workunit.client.0.vm03.stdout:5/507: dwrite d2/d7/de/f78 [0,4194304] 0 2026-03-09T16:14:32.887 INFO:tasks.workunit.client.0.vm03.stdout:3/421: rmdir d5/d53/d6c 39 2026-03-09T16:14:32.887 INFO:tasks.workunit.client.0.vm03.stdout:3/422: dread - d5/d6d/f67 zero size 2026-03-09T16:14:32.892 INFO:tasks.workunit.client.0.vm03.stdout:6/412: creat d9/d42/d45/d65/f7f x:0 0 0 2026-03-09T16:14:32.893 INFO:tasks.workunit.client.0.vm03.stdout:6/413: truncate d9/d42/d45/f4a 1692916 0 2026-03-09T16:14:32.893 INFO:tasks.workunit.client.0.vm03.stdout:6/414: dread - d9/d22/f62 zero size 2026-03-09T16:14:32.894 INFO:tasks.workunit.client.0.vm03.stdout:3/423: dread d5/d1e/d42/f25 [0,4194304] 0 2026-03-09T16:14:32.895 INFO:tasks.workunit.client.0.vm03.stdout:9/491: read d2/d4/d11/d29/f4e [265288,63251] 0 2026-03-09T16:14:32.895 INFO:tasks.workunit.client.0.vm03.stdout:3/424: readlink d5/d1e/l45 0 2026-03-09T16:14:32.898 INFO:tasks.workunit.client.0.vm03.stdout:8/455: creat da/db/d30/f94 x:0 0 0 2026-03-09T16:14:32.900 INFO:tasks.workunit.client.0.vm03.stdout:8/456: read da/db/f44 [7419281,22857] 0 2026-03-09T16:14:32.907 INFO:tasks.workunit.client.0.vm03.stdout:2/446: creat db/d12/d2a/d61/f9b x:0 0 0 2026-03-09T16:14:32.910 INFO:tasks.workunit.client.0.vm03.stdout:4/433: dread d5/db/d25/f4e [0,4194304] 0 2026-03-09T16:14:32.915 
INFO:tasks.workunit.client.0.vm03.stdout:0/452: dread d0/d7/d3e/d57/d5a/d5f/f71 [0,4194304] 0 2026-03-09T16:14:32.915 INFO:tasks.workunit.client.0.vm03.stdout:6/415: mkdir d9/d42/d45/d50/d80 0 2026-03-09T16:14:32.916 INFO:tasks.workunit.client.0.vm03.stdout:8/457: dwrite f8 [4194304,4194304] 0 2026-03-09T16:14:32.916 INFO:tasks.workunit.client.0.vm03.stdout:0/453: write d0/d7/d3e/d57/d5a/f38 [4493434,116765] 0 2026-03-09T16:14:32.926 INFO:tasks.workunit.client.0.vm03.stdout:5/508: dwrite d2/d7/de/d11/d19/d31/d35/d87/f8d [0,4194304] 0 2026-03-09T16:14:32.926 INFO:tasks.workunit.client.0.vm03.stdout:4/434: readlink d5/db/l76 0 2026-03-09T16:14:32.928 INFO:tasks.workunit.client.0.vm03.stdout:5/509: dwrite d2/d7/de/f78 [0,4194304] 0 2026-03-09T16:14:32.937 INFO:tasks.workunit.client.0.vm03.stdout:3/425: chown d5/d6d/l5e 0 1 2026-03-09T16:14:32.937 INFO:tasks.workunit.client.0.vm03.stdout:5/510: truncate d2/d7/d3c/f9a 1207572 0 2026-03-09T16:14:32.937 INFO:tasks.workunit.client.0.vm03.stdout:5/511: chown d2/d7/d3c/f9a 651394097 1 2026-03-09T16:14:32.937 INFO:tasks.workunit.client.0.vm03.stdout:5/512: chown d2/d7/d3c/l83 0 1 2026-03-09T16:14:32.937 INFO:tasks.workunit.client.0.vm03.stdout:4/435: write d5/db/d25/d31/d33/f69 [3072784,75273] 0 2026-03-09T16:14:32.938 INFO:tasks.workunit.client.0.vm03.stdout:9/492: mknod d2/d4/d11/d12/d28/c98 0 2026-03-09T16:14:32.939 INFO:tasks.workunit.client.0.vm03.stdout:1/347: creat d4/db/f7d x:0 0 0 2026-03-09T16:14:32.946 INFO:tasks.workunit.client.0.vm03.stdout:5/513: truncate d2/d7/de/d11/d19/d29/d90/fac 579394 0 2026-03-09T16:14:32.952 INFO:tasks.workunit.client.0.vm03.stdout:3/426: read d5/d1e/d42/f20 [2901805,82517] 0 2026-03-09T16:14:32.957 INFO:tasks.workunit.client.0.vm03.stdout:6/416: creat d9/d42/d45/d63/d66/f81 x:0 0 0 2026-03-09T16:14:32.963 INFO:tasks.workunit.client.0.vm03.stdout:1/348: creat d4/d6/d3b/d63/f7e x:0 0 0 2026-03-09T16:14:32.963 INFO:tasks.workunit.client.0.vm03.stdout:2/447: creat db/d12/d2a/d61/d79/d83/d52/f9c x:0 0 0 2026-03-09T16:14:32.963 INFO:tasks.workunit.client.0.vm03.stdout:9/493: dread d2/d4/d11/d12/f45 [0,4194304] 0 2026-03-09T16:14:32.964 INFO:tasks.workunit.client.0.vm03.stdout:6/417: truncate d9/d42/f74 891114 0 2026-03-09T16:14:32.974 INFO:tasks.workunit.client.0.vm03.stdout:2/448: write db/d12/d2a/d61/d6d/d8c/f97 [1011657,74557] 0 2026-03-09T16:14:32.975 INFO:tasks.workunit.client.0.vm03.stdout:5/514: dwrite d2/d7/d1a/d1c/d3f/f67 [0,4194304] 0 2026-03-09T16:14:32.975 INFO:tasks.workunit.client.0.vm03.stdout:9/494: dread - d2/d54/f5e zero size 2026-03-09T16:14:32.975 INFO:tasks.workunit.client.0.vm03.stdout:8/458: getdents da/d1d 0 2026-03-09T16:14:32.980 INFO:tasks.workunit.client.0.vm03.stdout:8/459: chown da/f4c 11 1 2026-03-09T16:14:32.984 INFO:tasks.workunit.client.0.vm03.stdout:8/460: dread - da/d32/f56 zero size 2026-03-09T16:14:32.991 INFO:tasks.workunit.client.0.vm03.stdout:6/418: dread d9/d42/d45/d50/f51 [0,4194304] 0 2026-03-09T16:14:33.003 INFO:tasks.workunit.client.0.vm03.stdout:8/461: write da/f35 [1503325,85335] 0 2026-03-09T16:14:33.003 INFO:tasks.workunit.client.0.vm03.stdout:2/449: rename db/d12/d2a/d61/f45 to db/d12/d2a/d61/f9d 0 2026-03-09T16:14:33.004 INFO:tasks.workunit.client.0.vm03.stdout:5/515: creat d2/d7/de/d11/d38/d3b/fb3 x:0 0 0 2026-03-09T16:14:33.005 INFO:tasks.workunit.client.0.vm03.stdout:6/419: dwrite d9/d42/d45/f4a [0,4194304] 0 2026-03-09T16:14:33.010 INFO:tasks.workunit.client.0.vm03.stdout:8/462: write da/d6c/d7a/f91 [765858,93252] 0 2026-03-09T16:14:33.012 
INFO:tasks.workunit.client.0.vm03.stdout:9/495: symlink d2/d4/d11/d29/l99 0 2026-03-09T16:14:33.020 INFO:tasks.workunit.client.0.vm03.stdout:8/463: fsync da/db/f6a 0 2026-03-09T16:14:33.023 INFO:tasks.workunit.client.0.vm03.stdout:8/464: readlink da/d10/d28/l50 0 2026-03-09T16:14:33.024 INFO:tasks.workunit.client.0.vm03.stdout:8/465: stat da/d32/c41 0 2026-03-09T16:14:33.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:32 vm05.local ceph-mon[58702]: pgmap v8: 65 pgs: 65 active+clean; 675 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 11 MiB/s rd, 44 MiB/s wr, 128 op/s 2026-03-09T16:14:33.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:32 vm05.local ceph-mon[58702]: mgrmap e24: vm05.dygxfv(active, since 11s), standbys: vm03.gbgzmu 2026-03-09T16:14:33.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:32 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mgr metadata", "who": "vm03.gbgzmu", "id": "vm03.gbgzmu"}]: dispatch 2026-03-09T16:14:33.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:32 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:33.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:32 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:33.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:32 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:14:33.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:32 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:33.033 INFO:tasks.workunit.client.0.vm03.stdout:6/420: symlink d9/d42/d45/d47/l82 0 2026-03-09T16:14:33.037 INFO:tasks.workunit.client.0.vm03.stdout:4/436: dread d5/db/d25/d31/d33/d79/f4f [0,4194304] 0 2026-03-09T16:14:33.040 INFO:tasks.workunit.client.0.vm03.stdout:6/421: dwrite d9/d22/f27 [0,4194304] 0 2026-03-09T16:14:33.047 INFO:tasks.workunit.client.0.vm03.stdout:6/422: write d9/d42/d45/d63/d66/f7d [807399,78531] 0 2026-03-09T16:14:33.049 INFO:tasks.workunit.client.0.vm03.stdout:1/349: dread d4/d6/d3b/d6b/d25/f4e [0,4194304] 0 2026-03-09T16:14:33.049 INFO:tasks.workunit.client.0.vm03.stdout:1/350: chown d4/c5e 459187280 1 2026-03-09T16:14:33.055 INFO:tasks.workunit.client.0.vm03.stdout:8/466: mkdir da/d32/d79/d95 0 2026-03-09T16:14:33.058 INFO:tasks.workunit.client.0.vm03.stdout:6/423: dwrite d9/d42/d45/d63/d66/f7d [0,4194304] 0 2026-03-09T16:14:33.063 INFO:tasks.workunit.client.0.vm03.stdout:7/407: truncate d4/da/d18/d22/d24/d15/f34 4082645 0 2026-03-09T16:14:33.064 INFO:tasks.workunit.client.0.vm03.stdout:7/408: stat d4/da/d18/f37 0 2026-03-09T16:14:33.071 INFO:tasks.workunit.client.0.vm03.stdout:4/437: rmdir d5/dd 39 2026-03-09T16:14:33.075 INFO:tasks.workunit.client.0.vm03.stdout:1/351: mkdir d4/d39/d7f 0 2026-03-09T16:14:33.076 INFO:tasks.workunit.client.0.vm03.stdout:8/467: symlink da/d10/d63/l96 0 2026-03-09T16:14:33.078 INFO:tasks.workunit.client.0.vm03.stdout:8/468: chown da/db/f53 1197670628 1 2026-03-09T16:14:33.079 INFO:tasks.workunit.client.0.vm03.stdout:5/516: rename d2/d7/d1a/c20 to d2/d7/de/cb4 0 2026-03-09T16:14:33.080 INFO:tasks.workunit.client.0.vm03.stdout:0/454: dwrite d0/f60 [0,4194304] 0 2026-03-09T16:14:33.083 INFO:tasks.workunit.client.0.vm03.stdout:7/409: 
fdatasync d4/da/d18/d22/d24/d16/d6e/f73 0 2026-03-09T16:14:33.090 INFO:tasks.workunit.client.0.vm03.stdout:7/410: dread d4/da/d18/d22/d24/d16/f6c [0,4194304] 0 2026-03-09T16:14:33.093 INFO:tasks.workunit.client.0.vm03.stdout:8/469: rename da/d10/d28/f2c to da/d6c/d7a/f97 0 2026-03-09T16:14:33.100 INFO:tasks.workunit.client.0.vm03.stdout:6/424: sync 2026-03-09T16:14:33.106 INFO:tasks.workunit.client.0.vm03.stdout:6/425: dwrite d9/f3b [0,4194304] 0 2026-03-09T16:14:33.106 INFO:tasks.workunit.client.0.vm03.stdout:8/470: dread da/db/f53 [0,4194304] 0 2026-03-09T16:14:33.110 INFO:tasks.workunit.client.0.vm03.stdout:3/427: truncate d5/d1e/f72 803788 0 2026-03-09T16:14:33.121 INFO:tasks.workunit.client.0.vm03.stdout:4/438: dread d5/db/f28 [0,4194304] 0 2026-03-09T16:14:33.122 INFO:tasks.workunit.client.0.vm03.stdout:7/411: mknod d4/da/d18/d22/d24/d16/d6e/d7e/c81 0 2026-03-09T16:14:33.129 INFO:tasks.workunit.client.0.vm03.stdout:6/426: rename d9/f36 to d9/d22/f83 0 2026-03-09T16:14:33.137 INFO:tasks.workunit.client.0.vm03.stdout:4/439: mkdir d5/db/d25/d31/d33/d55/d81 0 2026-03-09T16:14:33.138 INFO:tasks.workunit.client.0.vm03.stdout:3/428: link d5/d1e/d42/d4c/f7d d5/d1e/d42/d55/f7e 0 2026-03-09T16:14:33.140 INFO:tasks.workunit.client.0.vm03.stdout:8/471: getdents da/d1d 0 2026-03-09T16:14:33.141 INFO:tasks.workunit.client.0.vm03.stdout:4/440: dread d5/dd/f1e [0,4194304] 0 2026-03-09T16:14:33.142 INFO:tasks.workunit.client.0.vm03.stdout:3/429: mknod d5/d6d/d5a/d63/c7f 0 2026-03-09T16:14:33.151 INFO:tasks.workunit.client.0.vm03.stdout:3/430: symlink d5/d1e/l80 0 2026-03-09T16:14:33.151 INFO:tasks.workunit.client.0.vm03.stdout:3/431: mknod d5/d2e/c81 0 2026-03-09T16:14:33.151 INFO:tasks.workunit.client.0.vm03.stdout:6/427: getdents d9/d42/d45/d50 0 2026-03-09T16:14:33.151 INFO:tasks.workunit.client.0.vm03.stdout:4/441: dwrite d5/db/d25/d31/d33/d55/f62 [4194304,4194304] 0 2026-03-09T16:14:33.151 INFO:tasks.workunit.client.0.vm03.stdout:7/412: sync 2026-03-09T16:14:33.163 INFO:tasks.workunit.client.0.vm03.stdout:3/432: creat d5/d44/f82 x:0 0 0 2026-03-09T16:14:33.163 INFO:tasks.workunit.client.0.vm03.stdout:4/442: write d5/db/d25/d31/d33/d79/f7b [2300179,108696] 0 2026-03-09T16:14:33.170 INFO:tasks.workunit.client.0.vm03.stdout:4/443: dread d5/f7 [4194304,4194304] 0 2026-03-09T16:14:33.176 INFO:tasks.workunit.client.0.vm03.stdout:6/428: dread d9/d22/f3e [0,4194304] 0 2026-03-09T16:14:33.182 INFO:tasks.workunit.client.0.vm03.stdout:4/444: mkdir d5/db/d25/d31/d4d/d5b/d72/d82 0 2026-03-09T16:14:33.189 INFO:tasks.workunit.client.0.vm03.stdout:3/433: dwrite d5/d1e/f66 [0,4194304] 0 2026-03-09T16:14:33.189 INFO:tasks.workunit.client.0.vm03.stdout:7/413: dwrite d4/da/d45/f63 [0,4194304] 0 2026-03-09T16:14:33.192 INFO:tasks.workunit.client.0.vm03.stdout:4/445: dwrite d5/db/d25/f78 [0,4194304] 0 2026-03-09T16:14:33.199 INFO:tasks.workunit.client.0.vm03.stdout:7/414: readlink d4/da/d18/d22/d24/d15/d71/l78 0 2026-03-09T16:14:33.208 INFO:tasks.workunit.client.0.vm03.stdout:2/450: write db/d12/d2a/d61/f54 [160443,114138] 0 2026-03-09T16:14:33.209 INFO:tasks.workunit.client.0.vm03.stdout:3/434: rename d5/d2e/c4a to d5/d1e/d42/d4c/c83 0 2026-03-09T16:14:33.212 INFO:tasks.workunit.client.0.vm03.stdout:7/415: rmdir d4/da/d18/d22/d24/d16/d3e 39 2026-03-09T16:14:33.218 INFO:tasks.workunit.client.0.vm03.stdout:8/472: dread da/f35 [0,4194304] 0 2026-03-09T16:14:33.223 INFO:tasks.workunit.client.0.vm03.stdout:6/429: mkdir d9/d84 0 2026-03-09T16:14:33.228 INFO:tasks.workunit.client.0.vm03.stdout:4/446: creat d5/d17/f83 x:0 0 
0 2026-03-09T16:14:33.233 INFO:tasks.workunit.client.0.vm03.stdout:2/451: dread db/d12/d2a/d61/f65 [0,4194304] 0 2026-03-09T16:14:33.237 INFO:tasks.workunit.client.0.vm03.stdout:3/435: creat d5/d1e/d42/f84 x:0 0 0 2026-03-09T16:14:33.238 INFO:tasks.workunit.client.0.vm03.stdout:7/416: mknod d4/d2d/d4b/c82 0 2026-03-09T16:14:33.240 INFO:tasks.workunit.client.0.vm03.stdout:3/436: dwrite d5/d1e/d42/f25 [0,4194304] 0 2026-03-09T16:14:33.246 INFO:tasks.workunit.client.0.vm03.stdout:6/430: rmdir d9/d42 39 2026-03-09T16:14:33.250 INFO:tasks.workunit.client.0.vm03.stdout:9/496: dwrite d2/d54/f90 [0,4194304] 0 2026-03-09T16:14:33.254 INFO:tasks.workunit.client.0.vm03.stdout:1/352: write d4/d6/d3b/f35 [3195317,7892] 0 2026-03-09T16:14:33.257 INFO:tasks.workunit.client.0.vm03.stdout:5/517: write d2/d7/d3c/d3d/f93 [565109,126931] 0 2026-03-09T16:14:33.263 INFO:tasks.workunit.client.0.vm03.stdout:1/353: dwrite d4/d6/f15 [0,4194304] 0 2026-03-09T16:14:33.274 INFO:tasks.workunit.client.0.vm03.stdout:4/447: truncate f1 1445990 0 2026-03-09T16:14:33.276 INFO:tasks.workunit.client.0.vm03.stdout:0/455: truncate d0/d7/d3e/d5d/f61 138752 0 2026-03-09T16:14:33.277 INFO:tasks.workunit.client.0.vm03.stdout:0/456: chown d0/d7/d3e 798851522 1 2026-03-09T16:14:33.285 INFO:tasks.workunit.client.0.vm03.stdout:2/452: creat db/d12/d2a/d61/d79/d83/d64/d68/f9e x:0 0 0 2026-03-09T16:14:33.285 INFO:tasks.workunit.client.0.vm03.stdout:2/453: fdatasync db/f2d 0 2026-03-09T16:14:33.288 INFO:tasks.workunit.client.0.vm03.stdout:7/417: truncate d4/da/d18/d22/d24/d16/d3e/f75 650063 0 2026-03-09T16:14:33.295 INFO:tasks.workunit.client.0.vm03.stdout:8/473: symlink da/db/l98 0 2026-03-09T16:14:33.305 INFO:tasks.workunit.client.0.vm03.stdout:6/431: chown d9/d42/d45/d50/f51 13888 1 2026-03-09T16:14:33.317 INFO:tasks.workunit.client.0.vm03.stdout:9/497: creat d2/d4/d11/d12/f9a x:0 0 0 2026-03-09T16:14:33.318 INFO:tasks.workunit.client.0.vm03.stdout:9/498: chown d2/df/c3f 291 1 2026-03-09T16:14:33.318 INFO:tasks.workunit.client.0.vm03.stdout:9/499: stat d2/d4/d11/d29/d92 0 2026-03-09T16:14:33.325 INFO:tasks.workunit.client.0.vm03.stdout:9/500: dread d2/d4/d1f/f44 [0,4194304] 0 2026-03-09T16:14:33.326 INFO:tasks.workunit.client.0.vm03.stdout:5/518: rename d2/d7/de/d11/d38/f57 to d2/d7/d8/d16/d5c/fb5 0 2026-03-09T16:14:33.328 INFO:tasks.workunit.client.0.vm03.stdout:1/354: rmdir d4/d6/d3b/d6b/d25/d50 39 2026-03-09T16:14:33.331 INFO:tasks.workunit.client.0.vm03.stdout:1/355: dread d4/d6/d3b/d6b/d25/f4e [0,4194304] 0 2026-03-09T16:14:33.332 INFO:tasks.workunit.client.0.vm03.stdout:1/356: write d4/d6/d1d/d20/d5f/f57 [793594,105950] 0 2026-03-09T16:14:33.333 INFO:tasks.workunit.client.0.vm03.stdout:4/448: creat d5/d17/d44/f84 x:0 0 0 2026-03-09T16:14:33.336 INFO:tasks.workunit.client.0.vm03.stdout:8/474: creat da/d1d/f99 x:0 0 0 2026-03-09T16:14:33.337 INFO:tasks.workunit.client.0.vm03.stdout:7/418: dread d4/da/d18/d22/d24/d16/f6c [0,4194304] 0 2026-03-09T16:14:33.341 INFO:tasks.workunit.client.0.vm03.stdout:3/437: mkdir d5/d6d/d6a/d85 0 2026-03-09T16:14:33.341 INFO:tasks.workunit.client.0.vm03.stdout:3/438: dread - d5/d1e/d42/f84 zero size 2026-03-09T16:14:33.345 INFO:tasks.workunit.client.0.vm03.stdout:3/439: dwrite d5/d6d/d5a/f78 [0,4194304] 0 2026-03-09T16:14:33.352 INFO:tasks.workunit.client.0.vm03.stdout:3/440: dwrite d5/d44/f54 [0,4194304] 0 2026-03-09T16:14:33.359 INFO:tasks.workunit.client.0.vm03.stdout:9/501: creat d2/df/d89/f9b x:0 0 0 2026-03-09T16:14:33.362 INFO:tasks.workunit.client.0.vm03.stdout:0/457: creat d0/d7/d3e/d95/f99 
x:0 0 0 2026-03-09T16:14:33.366 INFO:tasks.workunit.client.0.vm03.stdout:5/519: dwrite d2/d7/d1a/f4d [4194304,4194304] 0 2026-03-09T16:14:33.373 INFO:tasks.workunit.client.0.vm03.stdout:0/458: dwrite d0/d7/d48/f4a [0,4194304] 0 2026-03-09T16:14:33.374 INFO:tasks.workunit.client.0.vm03.stdout:0/459: fdatasync d0/f4d 0 2026-03-09T16:14:33.380 INFO:tasks.workunit.client.0.vm03.stdout:0/460: dwrite d0/da/d5c/f31 [0,4194304] 0 2026-03-09T16:14:33.388 INFO:tasks.workunit.client.0.vm03.stdout:2/454: unlink db/d12/d2a/d61/d79/d83/c5a 0 2026-03-09T16:14:33.388 INFO:tasks.workunit.client.0.vm03.stdout:0/461: write d0/d7/d3e/d57/f90 [921638,51666] 0 2026-03-09T16:14:33.388 INFO:tasks.workunit.client.0.vm03.stdout:8/475: chown da/l46 7197 1 2026-03-09T16:14:33.388 INFO:tasks.workunit.client.0.vm03.stdout:2/455: chown db/d12/d2a/c35 3206 1 2026-03-09T16:14:33.392 INFO:tasks.workunit.client.0.vm03.stdout:8/476: dwrite da/db/f6a [0,4194304] 0 2026-03-09T16:14:33.410 INFO:tasks.workunit.client.0.vm03.stdout:6/432: symlink d9/d14/d71/l85 0 2026-03-09T16:14:33.410 INFO:tasks.workunit.client.0.vm03.stdout:6/433: chown d9/d14/f44 7 1 2026-03-09T16:14:33.419 INFO:tasks.workunit.client.0.vm03.stdout:3/441: chown d5/d53/d6c/d79 4961781 1 2026-03-09T16:14:33.420 INFO:tasks.workunit.client.0.vm03.stdout:3/442: write d5/f33 [45940,23223] 0 2026-03-09T16:14:33.421 INFO:tasks.workunit.client.0.vm03.stdout:9/502: truncate d2/d4/d11/d12/f50 294645 0 2026-03-09T16:14:33.423 INFO:tasks.workunit.client.0.vm03.stdout:4/449: link d5/db/d25/d31/d33/d79/f49 d5/db/d25/d31/d4d/f85 0 2026-03-09T16:14:33.424 INFO:tasks.workunit.client.0.vm03.stdout:1/357: mknod d4/d6/d3b/d6b/c80 0 2026-03-09T16:14:33.430 INFO:tasks.workunit.client.0.vm03.stdout:0/462: creat d0/d7/d3e/d57/d5a/d52/f9a x:0 0 0 2026-03-09T16:14:33.434 INFO:tasks.workunit.client.0.vm03.stdout:9/503: dwrite d2/de/d88/f6f [0,4194304] 0 2026-03-09T16:14:33.438 INFO:tasks.workunit.client.0.vm03.stdout:8/477: dread da/d32/f61 [0,4194304] 0 2026-03-09T16:14:33.443 INFO:tasks.workunit.client.0.vm03.stdout:6/434: unlink d9/d42/d45/d63/c7a 0 2026-03-09T16:14:33.447 INFO:tasks.workunit.client.0.vm03.stdout:4/450: mkdir d5/d17/d44/d86 0 2026-03-09T16:14:33.447 INFO:tasks.workunit.client.0.vm03.stdout:1/358: read - d4/d31/f38 zero size 2026-03-09T16:14:33.448 INFO:tasks.workunit.client.0.vm03.stdout:2/456: mknod db/d12/d2a/d99/c9f 0 2026-03-09T16:14:33.449 INFO:tasks.workunit.client.0.vm03.stdout:1/359: readlink d4/d6/d1d/d20/d23/d3e/l65 0 2026-03-09T16:14:33.451 INFO:tasks.workunit.client.0.vm03.stdout:8/478: symlink da/d10/d28/l9a 0 2026-03-09T16:14:33.458 INFO:tasks.workunit.client.0.vm03.stdout:9/504: dread d2/df/f22 [0,4194304] 0 2026-03-09T16:14:33.458 INFO:tasks.workunit.client.0.vm03.stdout:4/451: dwrite d5/dd/f23 [0,4194304] 0 2026-03-09T16:14:33.471 INFO:tasks.workunit.client.0.vm03.stdout:1/360: creat d4/d31/f81 x:0 0 0 2026-03-09T16:14:33.471 INFO:tasks.workunit.client.0.vm03.stdout:6/435: mknod d9/d84/c86 0 2026-03-09T16:14:33.475 INFO:tasks.workunit.client.0.vm03.stdout:9/505: mknod d2/d54/d7d/c9c 0 2026-03-09T16:14:33.476 INFO:tasks.workunit.client.0.vm03.stdout:9/506: chown d2/de/f85 1630901646 1 2026-03-09T16:14:33.478 INFO:tasks.workunit.client.0.vm03.stdout:4/452: mkdir d5/d17/d44/d87 0 2026-03-09T16:14:33.482 INFO:tasks.workunit.client.0.vm03.stdout:3/443: getdents d5/d6d/d6a 0 2026-03-09T16:14:33.491 INFO:tasks.workunit.client.0.vm03.stdout:8/479: rmdir da/d10/d28/d4f/d68/d80 39 2026-03-09T16:14:33.491 INFO:tasks.workunit.client.0.vm03.stdout:6/436: truncate 
d9/d42/d45/f4d 4142100 0 2026-03-09T16:14:33.491 INFO:tasks.workunit.client.0.vm03.stdout:0/463: getdents d0/da/d5c 0 2026-03-09T16:14:33.494 INFO:tasks.workunit.client.0.vm03.stdout:1/361: rmdir d4/d6/d3b/d6b/d25/d50 39 2026-03-09T16:14:33.494 INFO:tasks.workunit.client.0.vm03.stdout:1/362: chown d4/fd 465 1 2026-03-09T16:14:33.498 INFO:tasks.workunit.client.0.vm03.stdout:6/437: mknod d9/d42/d45/c87 0 2026-03-09T16:14:33.506 INFO:tasks.workunit.client.0.vm03.stdout:4/453: truncate d5/d17/d44/f64 926078 0 2026-03-09T16:14:33.511 INFO:tasks.workunit.client.0.vm03.stdout:8/480: dread da/d10/d28/f8c [4194304,4194304] 0 2026-03-09T16:14:33.516 INFO:tasks.workunit.client.0.vm03.stdout:1/363: creat d4/d6/d3b/d63/f82 x:0 0 0 2026-03-09T16:14:33.516 INFO:tasks.workunit.client.0.vm03.stdout:8/481: write da/db/fe [5155229,28439] 0 2026-03-09T16:14:33.516 INFO:tasks.workunit.client.0.vm03.stdout:8/482: write da/d10/d28/f57 [770704,113204] 0 2026-03-09T16:14:33.518 INFO:tasks.workunit.client.0.vm03.stdout:0/464: sync 2026-03-09T16:14:33.522 INFO:tasks.workunit.client.0.vm03.stdout:6/438: write d9/d42/d45/d50/f51 [4759630,85909] 0 2026-03-09T16:14:33.525 INFO:tasks.workunit.client.0.vm03.stdout:6/439: write d9/f40 [4566178,76834] 0 2026-03-09T16:14:33.531 INFO:tasks.workunit.client.0.vm03.stdout:5/520: dwrite d2/d7/de/d33/f8b [0,4194304] 0 2026-03-09T16:14:33.533 INFO:tasks.workunit.client.0.vm03.stdout:5/521: stat d2/d7/d8/d24/d27/l6a 0 2026-03-09T16:14:33.534 INFO:tasks.workunit.client.0.vm03.stdout:5/522: readlink d2/d7/d8/d16/l5f 0 2026-03-09T16:14:33.539 INFO:tasks.workunit.client.0.vm03.stdout:8/483: write da/f4c [755064,42798] 0 2026-03-09T16:14:33.545 INFO:tasks.workunit.client.0.vm03.stdout:1/364: symlink d4/d6/d1d/d20/l83 0 2026-03-09T16:14:33.554 INFO:tasks.workunit.client.0.vm03.stdout:7/419: truncate d4/da/d18/f6a 3942908 0 2026-03-09T16:14:33.557 INFO:tasks.workunit.client.0.vm03.stdout:4/454: symlink d5/l88 0 2026-03-09T16:14:33.570 INFO:tasks.workunit.client.0.vm03.stdout:2/457: dwrite db/f14 [0,4194304] 0 2026-03-09T16:14:33.572 INFO:tasks.workunit.client.0.vm03.stdout:2/458: chown db/d12/c19 299 1 2026-03-09T16:14:33.574 INFO:tasks.workunit.client.0.vm03.stdout:5/523: mkdir d2/d7/de/d11/d19/d29/d90/db6 0 2026-03-09T16:14:33.584 INFO:tasks.workunit.client.0.vm03.stdout:8/484: mknod da/d6c/c9b 0 2026-03-09T16:14:33.588 INFO:tasks.workunit.client.0.vm03.stdout:9/507: dwrite d2/d4/d11/d12/d28/f2f [0,4194304] 0 2026-03-09T16:14:33.594 INFO:tasks.workunit.client.0.vm03.stdout:7/420: symlink d4/dc/l83 0 2026-03-09T16:14:33.596 INFO:tasks.workunit.client.0.vm03.stdout:3/444: dwrite d5/d1e/d42/f20 [4194304,4194304] 0 2026-03-09T16:14:33.604 INFO:tasks.workunit.client.0.vm03.stdout:7/421: dwrite d4/da/f42 [0,4194304] 0 2026-03-09T16:14:33.620 INFO:tasks.workunit.client.0.vm03.stdout:4/455: creat d5/db/d25/d31/d33/d79/f89 x:0 0 0 2026-03-09T16:14:33.640 INFO:tasks.workunit.client.0.vm03.stdout:0/465: write d0/d7/d3e/d57/d5a/f4b [1283261,75601] 0 2026-03-09T16:14:33.643 INFO:tasks.workunit.client.0.vm03.stdout:2/459: mkdir db/d12/d2a/d61/d79/d83/d64/d68/da0 0 2026-03-09T16:14:33.643 INFO:tasks.workunit.client.0.vm03.stdout:2/460: chown db/d12/f57 7241583 1 2026-03-09T16:14:33.647 INFO:tasks.workunit.client.0.vm03.stdout:5/524: symlink d2/lb7 0 2026-03-09T16:14:33.653 INFO:tasks.workunit.client.0.vm03.stdout:9/508: fsync d2/df/f22 0 2026-03-09T16:14:33.661 INFO:tasks.workunit.client.0.vm03.stdout:6/440: dwrite d9/f1e [0,4194304] 0 2026-03-09T16:14:33.672 
INFO:tasks.workunit.client.0.vm03.stdout:3/445: mkdir d5/d1e/d42/d55/d86 0 2026-03-09T16:14:33.675 INFO:tasks.workunit.client.0.vm03.stdout:3/446: truncate d5/d1e/d42/d34/f5f 955696 0 2026-03-09T16:14:33.679 INFO:tasks.workunit.client.0.vm03.stdout:3/447: dread d5/d1e/d42/f25 [0,4194304] 0 2026-03-09T16:14:33.682 INFO:tasks.workunit.client.0.vm03.stdout:3/448: chown d5/d2e/l3b 63908 1 2026-03-09T16:14:33.689 INFO:tasks.workunit.client.0.vm03.stdout:2/461: symlink db/d12/d2a/d61/la1 0 2026-03-09T16:14:33.691 INFO:tasks.workunit.client.0.vm03.stdout:8/485: mkdir da/d10/d28/d4f/d85/d9c 0 2026-03-09T16:14:33.693 INFO:tasks.workunit.client.0.vm03.stdout:8/486: read da/d6c/d7a/f91 [347897,60574] 0 2026-03-09T16:14:33.696 INFO:tasks.workunit.client.0.vm03.stdout:9/509: mknod d2/d4/d11/d29/c9d 0 2026-03-09T16:14:33.703 INFO:tasks.workunit.client.0.vm03.stdout:1/365: link d4/db/f60 d4/d6/d3b/d6b/d25/f84 0 2026-03-09T16:14:33.703 INFO:tasks.workunit.client.0.vm03.stdout:1/366: read - d4/d31/f38 zero size 2026-03-09T16:14:33.714 INFO:tasks.workunit.client.0.vm03.stdout:7/422: rename d4/dc/f1a to d4/dc/d61/f84 0 2026-03-09T16:14:33.742 INFO:tasks.workunit.client.0.vm03.stdout:2/462: symlink db/d12/d2a/d61/la2 0 2026-03-09T16:14:33.745 INFO:tasks.workunit.client.0.vm03.stdout:8/487: mknod da/d6c/d7a/c9d 0 2026-03-09T16:14:33.746 INFO:tasks.workunit.client.0.vm03.stdout:9/510: creat d2/d4/d11/d29/d2a/d46/f9e x:0 0 0 2026-03-09T16:14:33.752 INFO:tasks.workunit.client.0.vm03.stdout:7/423: mknod d4/da/d45/d51/d36/c85 0 2026-03-09T16:14:33.755 INFO:tasks.workunit.client.0.vm03.stdout:2/463: creat db/d12/d2a/d61/d6d/d8c/fa3 x:0 0 0 2026-03-09T16:14:33.773 INFO:tasks.workunit.client.0.vm03.stdout:3/449: link d5/d6d/c69 d5/d1e/d42/d34/d70/c87 0 2026-03-09T16:14:33.773 INFO:tasks.workunit.client.0.vm03.stdout:7/424: symlink d4/d2d/d4b/l86 0 2026-03-09T16:14:33.773 INFO:tasks.workunit.client.0.vm03.stdout:3/450: dread - d5/d2e/f65 zero size 2026-03-09T16:14:33.781 INFO:tasks.workunit.client.0.vm03.stdout:2/464: mkdir db/d12/d2a/d61/d6d/d8c/d94/da4 0 2026-03-09T16:14:33.781 INFO:tasks.workunit.client.0.vm03.stdout:8/488: creat da/db/f9e x:0 0 0 2026-03-09T16:14:33.800 INFO:tasks.workunit.client.0.vm03.stdout:4/456: dwrite f1 [0,4194304] 0 2026-03-09T16:14:33.802 INFO:tasks.workunit.client.0.vm03.stdout:4/457: fsync d5/dd/d1f/f70 0 2026-03-09T16:14:33.807 INFO:tasks.workunit.client.0.vm03.stdout:1/367: dwrite d4/d6/d3b/d6b/d25/f84 [0,4194304] 0 2026-03-09T16:14:33.816 INFO:tasks.workunit.client.0.vm03.stdout:1/368: dwrite d4/d6/d3b/d63/f78 [0,4194304] 0 2026-03-09T16:14:33.818 INFO:tasks.workunit.client.0.vm03.stdout:1/369: write d4/d6/d1d/d20/d23/d3e/d3f/f48 [552216,97239] 0 2026-03-09T16:14:33.820 INFO:tasks.workunit.client.0.vm03.stdout:2/465: sync 2026-03-09T16:14:33.826 INFO:tasks.workunit.client.0.vm03.stdout:3/451: dwrite d5/d44/f82 [0,4194304] 0 2026-03-09T16:14:33.826 INFO:tasks.workunit.client.0.vm03.stdout:7/425: dwrite d4/d2d/d4b/f6b [0,4194304] 0 2026-03-09T16:14:33.848 INFO:tasks.workunit.client.0.vm03.stdout:6/441: dwrite d9/d42/d45/f4d [0,4194304] 0 2026-03-09T16:14:33.848 INFO:tasks.workunit.client.0.vm03.stdout:9/511: dwrite d2/f7 [4194304,4194304] 0 2026-03-09T16:14:33.848 INFO:tasks.workunit.client.0.vm03.stdout:6/442: stat d9/d22/l26 0 2026-03-09T16:14:33.854 INFO:tasks.workunit.client.0.vm03.stdout:2/466: fdatasync db/d12/d2a/d61/d79/d83/d64/d68/f6b 0 2026-03-09T16:14:33.854 INFO:tasks.workunit.client.0.vm03.stdout:6/443: read d9/f3b [2427976,94312] 0 2026-03-09T16:14:33.866 
INFO:tasks.workunit.client.0.vm03.stdout:3/452: mkdir d5/d53/d88 0 2026-03-09T16:14:33.866 INFO:tasks.workunit.client.0.vm03.stdout:9/512: creat d2/df/f9f x:0 0 0 2026-03-09T16:14:33.869 INFO:tasks.workunit.client.0.vm03.stdout:3/453: read d5/d44/f56 [1375984,20140] 0 2026-03-09T16:14:33.871 INFO:tasks.workunit.client.0.vm03.stdout:6/444: dwrite d9/f5c [0,4194304] 0 2026-03-09T16:14:33.875 INFO:tasks.workunit.client.0.vm03.stdout:2/467: mkdir db/d12/da5 0 2026-03-09T16:14:33.876 INFO:tasks.workunit.client.0.vm03.stdout:2/468: chown db/d12/d2a/d61/d79/d83/d52/f86 896208 1 2026-03-09T16:14:33.884 INFO:tasks.workunit.client.0.vm03.stdout:6/445: mknod d9/d42/d45/d50/c88 0 2026-03-09T16:14:33.888 INFO:tasks.workunit.client.0.vm03.stdout:6/446: symlink d9/d14/d71/l89 0 2026-03-09T16:14:33.891 INFO:tasks.workunit.client.0.vm03.stdout:6/447: mkdir d9/d42/d45/d50/d80/d8a 0 2026-03-09T16:14:33.948 INFO:tasks.workunit.client.0.vm03.stdout:6/448: rmdir d9/d42/d45/d65 39 2026-03-09T16:14:33.950 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:33 vm03.local ceph-mon[51019]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:14:33.950 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:33 vm03.local ceph-mon[51019]: Upgrade: Need to upgrade myself (mgr.vm05.dygxfv) 2026-03-09T16:14:33.950 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:33 vm03.local ceph-mon[51019]: Upgrade: Need to upgrade myself (mgr.vm05.dygxfv) 2026-03-09T16:14:33.950 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:33 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:33.950 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:33 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.gbgzmu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:14:33.950 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:33 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.gbgzmu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:14:33.950 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:33 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:14:33.950 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:33 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:14:33.956 INFO:tasks.workunit.client.0.vm03.stdout:6/449: chown d9/d42/l60 1660951 1 2026-03-09T16:14:33.960 INFO:tasks.workunit.client.0.vm03.stdout:8/489: dwrite da/d10/d28/d4f/d68/f8a [0,4194304] 0 2026-03-09T16:14:33.967 INFO:tasks.workunit.client.0.vm03.stdout:6/450: dread d9/d42/d45/f4a [0,4194304] 0 2026-03-09T16:14:33.971 INFO:tasks.workunit.client.0.vm03.stdout:8/490: symlink da/d32/d79/l9f 0 2026-03-09T16:14:33.973 INFO:tasks.workunit.client.0.vm03.stdout:8/491: stat da/db/l2a 0 2026-03-09T16:14:33.975 INFO:tasks.workunit.client.0.vm03.stdout:8/492: chown da/d1d/l22 32787 1 2026-03-09T16:14:33.976 INFO:tasks.workunit.client.0.vm03.stdout:4/458: dwrite d5/d17/f2b [0,4194304] 0 2026-03-09T16:14:33.980 INFO:tasks.workunit.client.0.vm03.stdout:0/466: rename d0/d7/d3e/d5d to 
d0/da/d1b/d9b 0 2026-03-09T16:14:33.985 INFO:tasks.workunit.client.0.vm03.stdout:9/513: write d2/d4/d1f/f51 [1014175,92902] 0 2026-03-09T16:14:33.985 INFO:tasks.workunit.client.0.vm03.stdout:7/426: write d4/da/d45/d51/f50 [323748,17355] 0 2026-03-09T16:14:33.985 INFO:tasks.workunit.client.0.vm03.stdout:7/427: readlink d4/d2d/d4b/l86 0 2026-03-09T16:14:33.985 INFO:tasks.workunit.client.0.vm03.stdout:4/459: creat d5/d17/f8a x:0 0 0 2026-03-09T16:14:33.985 INFO:tasks.workunit.client.0.vm03.stdout:1/370: dwrite d4/d6/f19 [0,4194304] 0 2026-03-09T16:14:33.986 INFO:tasks.workunit.client.0.vm03.stdout:8/493: symlink da/d6c/d7a/la0 0 2026-03-09T16:14:33.992 INFO:tasks.workunit.client.0.vm03.stdout:9/514: write d2/d4/d11/d29/d2a/d46/f81 [245983,52234] 0 2026-03-09T16:14:33.999 INFO:tasks.workunit.client.0.vm03.stdout:5/525: rename d2/d7/d8/d16/c8f to d2/d7/de/d11/d38/cb8 0 2026-03-09T16:14:34.002 INFO:tasks.workunit.client.0.vm03.stdout:0/467: chown d0/d7/d48/f43 1 1 2026-03-09T16:14:34.002 INFO:tasks.workunit.client.0.vm03.stdout:3/454: dwrite d5/d1e/f72 [0,4194304] 0 2026-03-09T16:14:34.011 INFO:tasks.workunit.client.0.vm03.stdout:3/455: write d5/d1e/d42/f84 [852101,67358] 0 2026-03-09T16:14:34.011 INFO:tasks.workunit.client.0.vm03.stdout:8/494: dwrite da/d1d/f4a [4194304,4194304] 0 2026-03-09T16:14:34.011 INFO:tasks.workunit.client.0.vm03.stdout:1/371: rmdir d4/d31/d5c 39 2026-03-09T16:14:34.011 INFO:tasks.workunit.client.0.vm03.stdout:7/428: mknod d4/da/d45/c87 0 2026-03-09T16:14:34.021 INFO:tasks.workunit.client.0.vm03.stdout:2/469: rename db/d12/l16 to db/d12/d2a/d61/la6 0 2026-03-09T16:14:34.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:33 vm05.local ceph-mon[58702]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:14:34.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:33 vm05.local ceph-mon[58702]: Upgrade: Need to upgrade myself (mgr.vm05.dygxfv) 2026-03-09T16:14:34.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:33 vm05.local ceph-mon[58702]: Upgrade: Need to upgrade myself (mgr.vm05.dygxfv) 2026-03-09T16:14:34.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:33 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:34.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:33 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.gbgzmu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:14:34.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:33 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.gbgzmu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:14:34.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:33 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:14:34.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:33 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:14:34.027 INFO:tasks.workunit.client.0.vm03.stdout:2/470: chown db/c78 1 1 2026-03-09T16:14:34.027 INFO:tasks.workunit.client.0.vm03.stdout:5/526: creat d2/fb9 x:0 0 0 
2026-03-09T16:14:34.027 INFO:tasks.workunit.client.0.vm03.stdout:5/527: write d2/d7/de/d11/d38/d3b/fa2 [273439,106200] 0 2026-03-09T16:14:34.028 INFO:tasks.workunit.client.0.vm03.stdout:3/456: dread d5/d44/f5d [0,4194304] 0 2026-03-09T16:14:34.033 INFO:tasks.workunit.client.0.vm03.stdout:0/468: mkdir d0/d7/d3e/d57/d5a/d74/d9c 0 2026-03-09T16:14:34.038 INFO:tasks.workunit.client.0.vm03.stdout:3/457: dwrite d5/d2e/f65 [0,4194304] 0 2026-03-09T16:14:34.038 INFO:tasks.workunit.client.0.vm03.stdout:3/458: write d5/f2b [740997,129080] 0 2026-03-09T16:14:34.040 INFO:tasks.workunit.client.0.vm03.stdout:7/429: dread d4/d2d/f52 [0,4194304] 0 2026-03-09T16:14:34.041 INFO:tasks.workunit.client.0.vm03.stdout:7/430: write d4/da/d45/d51/d36/f6f [733329,15975] 0 2026-03-09T16:14:34.042 INFO:tasks.workunit.client.0.vm03.stdout:5/528: dread d2/d7/de/d11/d19/d31/f99 [0,4194304] 0 2026-03-09T16:14:34.043 INFO:tasks.workunit.client.0.vm03.stdout:6/451: sync 2026-03-09T16:14:34.044 INFO:tasks.workunit.client.0.vm03.stdout:5/529: write d2/fb9 [380557,6030] 0 2026-03-09T16:14:34.046 INFO:tasks.workunit.client.0.vm03.stdout:5/530: chown d2/d7/d3c/l83 1016 1 2026-03-09T16:14:34.047 INFO:tasks.workunit.client.0.vm03.stdout:6/452: dread d9/d42/d45/d63/d66/f7d [0,4194304] 0 2026-03-09T16:14:34.049 INFO:tasks.workunit.client.0.vm03.stdout:1/372: chown d4/d39/l51 13235 1 2026-03-09T16:14:34.051 INFO:tasks.workunit.client.0.vm03.stdout:4/460: rename d5/d17/d44/d86 to d5/db/d25/d8b 0 2026-03-09T16:14:34.054 INFO:tasks.workunit.client.0.vm03.stdout:8/495: sync 2026-03-09T16:14:34.057 INFO:tasks.workunit.client.0.vm03.stdout:0/469: rmdir d0/d7/d3e/d95 39 2026-03-09T16:14:34.065 INFO:tasks.workunit.client.0.vm03.stdout:7/431: rmdir d4/da/d18/d22/d24/d16/d2b 39 2026-03-09T16:14:34.068 INFO:tasks.workunit.client.0.vm03.stdout:6/453: rmdir d9/d14/d71 39 2026-03-09T16:14:34.072 INFO:tasks.workunit.client.0.vm03.stdout:5/531: rename d2/d7/d3c/l63 to d2/d7/d8/d24/d27/d43/d4b/lba 0 2026-03-09T16:14:34.073 INFO:tasks.workunit.client.0.vm03.stdout:8/496: truncate da/db/f75 175232 0 2026-03-09T16:14:34.075 INFO:tasks.workunit.client.0.vm03.stdout:6/454: symlink d9/d42/d45/d63/l8b 0 2026-03-09T16:14:34.081 INFO:tasks.workunit.client.0.vm03.stdout:6/455: chown d9/d42/f78 12061 1 2026-03-09T16:14:34.081 INFO:tasks.workunit.client.0.vm03.stdout:1/373: rename d4/d31/f38 to d4/d6/d1d/d20/d23/d3e/d3f/f85 0 2026-03-09T16:14:34.082 INFO:tasks.workunit.client.0.vm03.stdout:2/471: creat db/d12/d2a/d61/d79/d83/fa7 x:0 0 0 2026-03-09T16:14:34.082 INFO:tasks.workunit.client.0.vm03.stdout:8/497: creat da/d10/d28/d4f/d85/fa1 x:0 0 0 2026-03-09T16:14:34.082 INFO:tasks.workunit.client.0.vm03.stdout:1/374: write d4/d6/d3b/d6b/f71 [663793,27312] 0 2026-03-09T16:14:34.082 INFO:tasks.workunit.client.0.vm03.stdout:6/456: rename d9/d42/f79 to d9/d42/d45/d47/f8c 0 2026-03-09T16:14:34.082 INFO:tasks.workunit.client.0.vm03.stdout:1/375: mknod d4/d6/d1d/d20/d23/d3e/d3f/c86 0 2026-03-09T16:14:34.082 INFO:tasks.workunit.client.0.vm03.stdout:8/498: rename da/c49 to da/d10/ca2 0 2026-03-09T16:14:34.084 INFO:tasks.workunit.client.0.vm03.stdout:2/472: rename db/d12/f4b to db/d12/d2a/d61/d6d/fa8 0 2026-03-09T16:14:34.084 INFO:tasks.workunit.client.0.vm03.stdout:8/499: mknod da/d10/d63/ca3 0 2026-03-09T16:14:34.087 INFO:tasks.workunit.client.0.vm03.stdout:7/432: sync 2026-03-09T16:14:34.087 INFO:tasks.workunit.client.0.vm03.stdout:5/532: sync 2026-03-09T16:14:34.090 INFO:tasks.workunit.client.0.vm03.stdout:6/457: dwrite d9/f40 [0,4194304] 0 2026-03-09T16:14:34.107 
INFO:tasks.workunit.client.0.vm03.stdout:8/500: creat da/d10/fa4 x:0 0 0 2026-03-09T16:14:34.109 INFO:tasks.workunit.client.0.vm03.stdout:9/515: dwrite d2/d4/d11/d12/f3d [0,4194304] 0 2026-03-09T16:14:34.115 INFO:tasks.workunit.client.0.vm03.stdout:3/459: write d5/d1e/d42/f1d [1960017,18484] 0 2026-03-09T16:14:34.125 INFO:tasks.workunit.client.0.vm03.stdout:4/461: dwrite d5/dd/d1f/f58 [0,4194304] 0 2026-03-09T16:14:34.126 INFO:tasks.workunit.client.0.vm03.stdout:0/470: dwrite d0/da/f1c [0,4194304] 0 2026-03-09T16:14:34.126 INFO:tasks.workunit.client.0.vm03.stdout:3/460: dwrite d5/d1e/d42/f74 [0,4194304] 0 2026-03-09T16:14:34.130 INFO:tasks.workunit.client.0.vm03.stdout:3/461: readlink d5/d1e/d42/d34/l59 0 2026-03-09T16:14:34.136 INFO:tasks.workunit.client.0.vm03.stdout:1/376: truncate d4/db/f21 1088583 0 2026-03-09T16:14:34.140 INFO:tasks.workunit.client.0.vm03.stdout:5/533: unlink d2/d7/d8/fa5 0 2026-03-09T16:14:34.143 INFO:tasks.workunit.client.0.vm03.stdout:5/534: dread - d2/d7/d3c/fb2 zero size 2026-03-09T16:14:34.148 INFO:tasks.workunit.client.0.vm03.stdout:6/458: stat d9/d42/d45/d65/f7f 0 2026-03-09T16:14:34.157 INFO:tasks.workunit.client.0.vm03.stdout:5/535: mkdir d2/d7/de/d11/d19/dbb 0 2026-03-09T16:14:34.160 INFO:tasks.workunit.client.0.vm03.stdout:8/501: symlink da/la5 0 2026-03-09T16:14:34.161 INFO:tasks.workunit.client.0.vm03.stdout:3/462: symlink d5/d53/d88/l89 0 2026-03-09T16:14:34.163 INFO:tasks.workunit.client.0.vm03.stdout:2/473: rename db/d12/d2a/d61/d79/d83/d52/c90 to db/ca9 0 2026-03-09T16:14:34.174 INFO:tasks.workunit.client.0.vm03.stdout:4/462: rmdir d5/d17/d44/d87 0 2026-03-09T16:14:34.181 INFO:tasks.workunit.client.0.vm03.stdout:2/474: symlink db/laa 0 2026-03-09T16:14:34.184 INFO:tasks.workunit.client.0.vm03.stdout:4/463: creat d5/db/d25/d31/d33/d55/d81/f8c x:0 0 0 2026-03-09T16:14:34.185 INFO:tasks.workunit.client.0.vm03.stdout:4/464: truncate d5/d17/f83 256375 0 2026-03-09T16:14:34.187 INFO:tasks.workunit.client.0.vm03.stdout:5/536: getdents d2/d7/d3c/d3d 0 2026-03-09T16:14:34.190 INFO:tasks.workunit.client.0.vm03.stdout:5/537: dread d2/d7/de/d11/d38/d3b/fa2 [0,4194304] 0 2026-03-09T16:14:34.196 INFO:tasks.workunit.client.0.vm03.stdout:2/475: mknod db/d12/d2a/d61/d79/d83/cab 0 2026-03-09T16:14:34.211 INFO:tasks.workunit.client.0.vm03.stdout:5/538: mkdir d2/d7/d8/d24/d27/d43/d4b/dbc 0 2026-03-09T16:14:34.233 INFO:tasks.workunit.client.0.vm03.stdout:7/433: dwrite d4/da/d18/d22/d24/d16/d6e/f73 [0,4194304] 0 2026-03-09T16:14:34.248 INFO:tasks.workunit.client.0.vm03.stdout:2/476: symlink db/d12/d2a/d61/d79/d83/d64/d68/da0/lac 0 2026-03-09T16:14:34.250 INFO:tasks.workunit.client.0.vm03.stdout:9/516: dwrite d2/f33 [0,4194304] 0 2026-03-09T16:14:34.252 INFO:tasks.workunit.client.0.vm03.stdout:5/539: symlink d2/d7/d8/d24/d27/d43/d4b/dbc/lbd 0 2026-03-09T16:14:34.252 INFO:tasks.workunit.client.0.vm03.stdout:7/434: symlink d4/da/d45/d51/d36/l88 0 2026-03-09T16:14:34.255 INFO:tasks.workunit.client.0.vm03.stdout:7/435: fsync d4/da/d18/d22/d24/d16/d6e/f7b 0 2026-03-09T16:14:34.255 INFO:tasks.workunit.client.0.vm03.stdout:0/471: dwrite d0/d7/d48/f43 [0,4194304] 0 2026-03-09T16:14:34.268 INFO:tasks.workunit.client.0.vm03.stdout:9/517: creat d2/df/d84/fa0 x:0 0 0 2026-03-09T16:14:34.272 INFO:tasks.workunit.client.0.vm03.stdout:1/377: truncate d4/d6/d3b/f35 4447253 0 2026-03-09T16:14:34.277 INFO:tasks.workunit.client.0.vm03.stdout:6/459: truncate d9/d22/f3f 1301295 0 2026-03-09T16:14:34.277 INFO:tasks.workunit.client.0.vm03.stdout:6/460: chown d9/d22/l6d 258133184 1 
2026-03-09T16:14:34.277 INFO:tasks.workunit.client.0.vm03.stdout:8/502: dwrite da/f52 [0,4194304] 0 2026-03-09T16:14:34.285 INFO:tasks.workunit.client.0.vm03.stdout:3/463: dwrite d5/d1e/d42/f29 [4194304,4194304] 0 2026-03-09T16:14:34.288 INFO:tasks.workunit.client.0.vm03.stdout:3/464: write d5/d1e/d42/d55/f57 [759505,122739] 0 2026-03-09T16:14:34.302 INFO:tasks.workunit.client.0.vm03.stdout:7/436: fdatasync d4/da/d18/d22/d24/d15/d71/f7d 0 2026-03-09T16:14:34.304 INFO:tasks.workunit.client.0.vm03.stdout:4/465: dwrite d5/db/d25/d31/d4d/f85 [0,4194304] 0 2026-03-09T16:14:34.305 INFO:tasks.workunit.client.0.vm03.stdout:4/466: chown d5/db/d25 1174804 1 2026-03-09T16:14:34.308 INFO:tasks.workunit.client.0.vm03.stdout:6/461: mknod d9/d42/d45/d63/d66/c8d 0 2026-03-09T16:14:34.312 INFO:tasks.workunit.client.0.vm03.stdout:7/437: rmdir d4/dc/d61 39 2026-03-09T16:14:34.316 INFO:tasks.workunit.client.0.vm03.stdout:9/518: read d2/d4/d11/f41 [132540,89050] 0 2026-03-09T16:14:34.319 INFO:tasks.workunit.client.0.vm03.stdout:4/467: creat d5/d17/f8d x:0 0 0 2026-03-09T16:14:34.319 INFO:tasks.workunit.client.0.vm03.stdout:4/468: write d5/dd/d1f/f48 [5377331,123583] 0 2026-03-09T16:14:34.322 INFO:tasks.workunit.client.0.vm03.stdout:0/472: creat d0/da/d7a/d98/f9d x:0 0 0 2026-03-09T16:14:34.326 INFO:tasks.workunit.client.0.vm03.stdout:7/438: unlink d4/da/d18/d22/d24/d15/d71/l78 0 2026-03-09T16:14:34.327 INFO:tasks.workunit.client.0.vm03.stdout:7/439: chown d4/dc/c54 31 1 2026-03-09T16:14:34.330 INFO:tasks.workunit.client.0.vm03.stdout:9/519: mknod d2/d4/d11/d29/ca1 0 2026-03-09T16:14:34.345 INFO:tasks.workunit.client.0.vm03.stdout:0/473: write d0/d7/d3e/d95/f99 [512734,86824] 0 2026-03-09T16:14:34.345 INFO:tasks.workunit.client.0.vm03.stdout:1/378: rename d4/d6/d1d/d20/d23/f28 to d4/d6/d3b/d6b/d25/f87 0 2026-03-09T16:14:34.352 INFO:tasks.workunit.client.0.vm03.stdout:1/379: dread d4/d6/d1d/d20/d23/f62 [0,4194304] 0 2026-03-09T16:14:34.356 INFO:tasks.workunit.client.0.vm03.stdout:2/477: truncate db/f23 3533812 0 2026-03-09T16:14:34.358 INFO:tasks.workunit.client.0.vm03.stdout:9/520: sync 2026-03-09T16:14:34.362 INFO:tasks.workunit.client.0.vm03.stdout:9/521: dwrite d2/df/d89/f7e [0,4194304] 0 2026-03-09T16:14:34.365 INFO:tasks.workunit.client.0.vm03.stdout:4/469: creat d5/db/d25/d31/d4d/d5b/d72/d82/f8e x:0 0 0 2026-03-09T16:14:34.367 INFO:tasks.workunit.client.0.vm03.stdout:4/470: fsync d5/db/d25/d31/d33/d55/d81/f8c 0 2026-03-09T16:14:34.374 INFO:tasks.workunit.client.0.vm03.stdout:5/540: truncate d2/d7/de/d11/d19/d31/d35/d87/f8d 3165199 0 2026-03-09T16:14:34.381 INFO:tasks.workunit.client.0.vm03.stdout:5/541: dwrite d2/d7/de/faa [0,4194304] 0 2026-03-09T16:14:34.397 INFO:tasks.workunit.client.0.vm03.stdout:3/465: rename d5/d1e/d42/d34/f5f to d5/d6d/d5a/d63/f8a 0 2026-03-09T16:14:34.401 INFO:tasks.workunit.client.0.vm03.stdout:8/503: getdents da 0 2026-03-09T16:14:34.403 INFO:tasks.workunit.client.0.vm03.stdout:2/478: mkdir db/d12/d2a/d61/d6d/d8c/d94/dad 0 2026-03-09T16:14:34.404 INFO:tasks.workunit.client.0.vm03.stdout:2/479: chown db/d12/d2a/d61/d79/d83/d64/d68 360230029 1 2026-03-09T16:14:34.408 INFO:tasks.workunit.client.0.vm03.stdout:7/440: creat d4/da/d18/d22/d24/d16/d69/f89 x:0 0 0 2026-03-09T16:14:34.410 INFO:tasks.workunit.client.0.vm03.stdout:3/466: dread d5/d1e/d42/f1d [0,4194304] 0 2026-03-09T16:14:34.422 INFO:tasks.workunit.client.0.vm03.stdout:9/522: rmdir d2/df/d89 39 2026-03-09T16:14:34.423 INFO:tasks.workunit.client.0.vm03.stdout:9/523: write d2/d4/d11/d29/d2a/d46/f81 [555601,110232] 0 
2026-03-09T16:14:34.429 INFO:tasks.workunit.client.0.vm03.stdout:4/471: mknod d5/db/d25/d31/d4d/d5b/d72/d82/c8f 0 2026-03-09T16:14:34.429 INFO:tasks.workunit.client.0.vm03.stdout:4/472: write d5/db/d25/f78 [4518106,52630] 0 2026-03-09T16:14:34.433 INFO:tasks.workunit.client.0.vm03.stdout:4/473: dwrite d5/db/f5d [8388608,4194304] 0 2026-03-09T16:14:34.439 INFO:tasks.workunit.client.0.vm03.stdout:5/542: mkdir d2/d7/de/d11/d19/d29/d90/dbe 0 2026-03-09T16:14:34.440 INFO:tasks.workunit.client.0.vm03.stdout:4/474: chown d5/d17/d44/f61 440944 1 2026-03-09T16:14:34.451 INFO:tasks.workunit.client.0.vm03.stdout:0/474: rename d0/d7/d3e/d45/d8e/f94 to d0/d7/d3e/d57/d5a/d74/f9e 0 2026-03-09T16:14:34.451 INFO:tasks.workunit.client.0.vm03.stdout:0/475: chown d0/d7/c1d 356012 1 2026-03-09T16:14:34.452 INFO:tasks.workunit.client.0.vm03.stdout:0/476: fdatasync d0/d7/d48/f43 0 2026-03-09T16:14:34.457 INFO:tasks.workunit.client.0.vm03.stdout:1/380: creat d4/d39/d7f/f88 x:0 0 0 2026-03-09T16:14:34.457 INFO:tasks.workunit.client.0.vm03.stdout:8/504: creat da/d32/fa6 x:0 0 0 2026-03-09T16:14:34.458 INFO:tasks.workunit.client.0.vm03.stdout:8/505: chown da/d6c/d7a/la0 59210886 1 2026-03-09T16:14:34.458 INFO:tasks.workunit.client.0.vm03.stdout:1/381: chown d4/d6/d1d/d20/d23/d3e/d3f/f6f 2161417 1 2026-03-09T16:14:34.458 INFO:tasks.workunit.client.0.vm03.stdout:1/382: stat d4/db 0 2026-03-09T16:14:34.461 INFO:tasks.workunit.client.0.vm03.stdout:6/462: getdents d9 0 2026-03-09T16:14:34.470 INFO:tasks.workunit.client.0.vm03.stdout:3/467: mkdir d5/d1e/d42/d8b 0 2026-03-09T16:14:34.473 INFO:tasks.workunit.client.0.vm03.stdout:3/468: fsync d5/d2e/f65 0 2026-03-09T16:14:34.486 INFO:tasks.workunit.client.0.vm03.stdout:9/524: mknod d2/d4/d11/d29/d2a/d46/ca2 0 2026-03-09T16:14:34.492 INFO:tasks.workunit.client.0.vm03.stdout:7/441: dread d4/da/d18/d22/d24/d15/f34 [0,4194304] 0 2026-03-09T16:14:34.496 INFO:tasks.workunit.client.0.vm03.stdout:1/383: creat d4/d6/d3b/d63/f89 x:0 0 0 2026-03-09T16:14:34.498 INFO:tasks.workunit.client.0.vm03.stdout:2/480: mknod db/d12/d2a/d61/d79/d83/cae 0 2026-03-09T16:14:34.507 INFO:tasks.workunit.client.0.vm03.stdout:3/469: fsync d5/d1e/d42/d55/f7e 0 2026-03-09T16:14:34.507 INFO:tasks.workunit.client.0.vm03.stdout:3/470: stat d5/d1e/f72 0 2026-03-09T16:14:34.509 INFO:tasks.workunit.client.0.vm03.stdout:5/543: mkdir d2/d7/de/d11/dbf 0 2026-03-09T16:14:34.512 INFO:tasks.workunit.client.0.vm03.stdout:5/544: dwrite d2/d7/de/d11/f80 [0,4194304] 0 2026-03-09T16:14:34.513 INFO:tasks.workunit.client.0.vm03.stdout:4/475: unlink d5/d40/l5a 0 2026-03-09T16:14:34.520 INFO:tasks.workunit.client.0.vm03.stdout:4/476: dwrite d5/dd/d1f/f5e [0,4194304] 0 2026-03-09T16:14:34.526 INFO:tasks.workunit.client.0.vm03.stdout:9/525: symlink d2/d4/la3 0 2026-03-09T16:14:34.533 INFO:tasks.workunit.client.0.vm03.stdout:0/477: write d0/da/d5c/f31 [4515450,93010] 0 2026-03-09T16:14:34.542 INFO:tasks.workunit.client.0.vm03.stdout:0/478: chown d0/da 9 1 2026-03-09T16:14:34.543 INFO:tasks.workunit.client.0.vm03.stdout:3/471: stat d5/d6d/d5a/d63/l77 0 2026-03-09T16:14:34.544 INFO:tasks.workunit.client.0.vm03.stdout:6/463: write d9/f3b [2100758,115960] 0 2026-03-09T16:14:34.546 INFO:tasks.workunit.client.0.vm03.stdout:1/384: dread d4/db/f21 [0,4194304] 0 2026-03-09T16:14:34.547 INFO:tasks.workunit.client.0.vm03.stdout:1/385: dread - d4/d39/d7f/f88 zero size 2026-03-09T16:14:34.547 INFO:tasks.workunit.client.0.vm03.stdout:1/386: write d4/d31/f4f [394559,105914] 0 2026-03-09T16:14:34.548 
INFO:tasks.workunit.client.0.vm03.stdout:6/464: dwrite d9/d42/d45/d50/f51 [0,4194304] 0 2026-03-09T16:14:34.567 INFO:tasks.workunit.client.0.vm03.stdout:4/477: creat d5/d17/d44/f90 x:0 0 0 2026-03-09T16:14:34.575 INFO:tasks.workunit.client.0.vm03.stdout:8/506: creat da/d10/d28/d4f/d68/fa7 x:0 0 0 2026-03-09T16:14:34.575 INFO:tasks.workunit.client.0.vm03.stdout:7/442: symlink d4/dc/d61/l8a 0 2026-03-09T16:14:34.579 INFO:tasks.workunit.client.0.vm03.stdout:8/507: dwrite da/db/f6a [4194304,4194304] 0 2026-03-09T16:14:34.579 INFO:tasks.workunit.client.0.vm03.stdout:0/479: mkdir d0/d7/d3e/d57/d5a/d52/d9f 0 2026-03-09T16:14:34.581 INFO:tasks.workunit.client.0.vm03.stdout:6/465: dread d9/d42/d45/d47/f6a [0,4194304] 0 2026-03-09T16:14:34.585 INFO:tasks.workunit.client.0.vm03.stdout:2/481: mkdir db/d12/da5/daf 0 2026-03-09T16:14:34.585 INFO:tasks.workunit.client.0.vm03.stdout:3/472: mkdir d5/d2e/d8c 0 2026-03-09T16:14:34.585 INFO:tasks.workunit.client.0.vm03.stdout:3/473: rename d5/d1e to d5/d1e/d42/d8b/d8d 22 2026-03-09T16:14:34.587 INFO:tasks.workunit.client.0.vm03.stdout:5/545: symlink d2/d7/d8/d24/d27/lc0 0 2026-03-09T16:14:34.589 INFO:tasks.workunit.client.0.vm03.stdout:7/443: read d4/da/d18/d22/d24/d15/f2a [2496261,69682] 0 2026-03-09T16:14:34.594 INFO:tasks.workunit.client.0.vm03.stdout:6/466: dread d9/d42/d45/d63/d66/f7d [0,4194304] 0 2026-03-09T16:14:34.594 INFO:tasks.workunit.client.0.vm03.stdout:7/444: dwrite d4/da/d18/d22/d24/f59 [0,4194304] 0 2026-03-09T16:14:34.594 INFO:tasks.workunit.client.0.vm03.stdout:2/482: creat db/d12/d2a/d61/d6d/fb0 x:0 0 0 2026-03-09T16:14:34.595 INFO:tasks.workunit.client.0.vm03.stdout:2/483: readlink db/d12/d2a/d61/l6f 0 2026-03-09T16:14:34.595 INFO:tasks.workunit.client.0.vm03.stdout:6/467: chown d9/d42/d45/d63/f64 7704597 1 2026-03-09T16:14:34.606 INFO:tasks.workunit.client.0.vm03.stdout:1/387: mknod d4/d6/d3b/d6b/c8a 0 2026-03-09T16:14:34.615 INFO:tasks.workunit.client.0.vm03.stdout:9/526: dwrite d2/d4/d11/d12/f1e [0,4194304] 0 2026-03-09T16:14:34.616 INFO:tasks.workunit.client.0.vm03.stdout:9/527: write d2/df/d84/fa0 [238753,31107] 0 2026-03-09T16:14:34.625 INFO:tasks.workunit.client.0.vm03.stdout:4/478: creat d5/db/d25/d31/d4d/d5b/d72/d77/f91 x:0 0 0 2026-03-09T16:14:34.625 INFO:tasks.workunit.client.0.vm03.stdout:4/479: write d5/dd/d1f/f48 [2871440,66429] 0 2026-03-09T16:14:34.626 INFO:tasks.workunit.client.0.vm03.stdout:5/546: symlink d2/d7/d1a/d1c/d6c/lc1 0 2026-03-09T16:14:34.626 INFO:tasks.workunit.client.0.vm03.stdout:0/480: dwrite d0/d7/d75/f69 [4194304,4194304] 0 2026-03-09T16:14:34.628 INFO:tasks.workunit.client.0.vm03.stdout:4/480: chown d5/db/d25/d31/d33/d79 3286 1 2026-03-09T16:14:34.632 INFO:tasks.workunit.client.0.vm03.stdout:4/481: write d5/db/d25/d31/d33/f69 [4840718,33644] 0 2026-03-09T16:14:34.633 INFO:tasks.workunit.client.0.vm03.stdout:8/508: mkdir da/db/da8 0 2026-03-09T16:14:34.634 INFO:tasks.workunit.client.0.vm03.stdout:8/509: stat da/d10/d28/d4f/d68/f8f 0 2026-03-09T16:14:34.656 INFO:tasks.workunit.client.0.vm03.stdout:7/445: fsync d4/da/d18/d22/d24/d16/f6c 0 2026-03-09T16:14:34.656 INFO:tasks.workunit.client.0.vm03.stdout:6/468: readlink d9/d14/d71/l85 0 2026-03-09T16:14:34.657 INFO:tasks.workunit.client.0.vm03.stdout:6/469: chown d9/f15 124012 1 2026-03-09T16:14:34.658 INFO:tasks.workunit.client.0.vm03.stdout:6/470: fdatasync d9/d42/d45/d63/d66/f81 0 2026-03-09T16:14:34.658 INFO:tasks.workunit.client.0.vm03.stdout:7/446: write d4/da/d18/d22/d24/d16/d3e/f75 [304854,77178] 0 2026-03-09T16:14:34.668 
INFO:tasks.workunit.client.0.vm03.stdout:0/481: creat d0/d7/d3e/d57/d5a/d82/fa0 x:0 0 0 2026-03-09T16:14:34.681 INFO:tasks.workunit.client.0.vm03.stdout:4/482: unlink d5/db/d25/d31/c7e 0 2026-03-09T16:14:34.693 INFO:tasks.workunit.client.0.vm03.stdout:6/471: mkdir d9/d8e 0 2026-03-09T16:14:34.696 INFO:tasks.workunit.client.0.vm03.stdout:1/388: mkdir d4/db/d8b 0 2026-03-09T16:14:34.699 INFO:tasks.workunit.client.0.vm03.stdout:0/482: mknod d0/d7/d75/ca1 0 2026-03-09T16:14:34.703 INFO:tasks.workunit.client.0.vm03.stdout:5/547: mknod d2/d7/de/d11/dbf/cc2 0 2026-03-09T16:14:34.711 INFO:tasks.workunit.client.0.vm03.stdout:9/528: dwrite d2/d4/d11/d29/d2a/f58 [0,4194304] 0 2026-03-09T16:14:34.714 INFO:tasks.workunit.client.0.vm03.stdout:4/483: rmdir d5/db/d25 39 2026-03-09T16:14:34.723 INFO:tasks.workunit.client.0.vm03.stdout:3/474: getdents d5/d44/d61 0 2026-03-09T16:14:34.726 INFO:tasks.workunit.client.0.vm03.stdout:3/475: dwrite d5/d1e/f66 [0,4194304] 0 2026-03-09T16:14:34.737 INFO:tasks.workunit.client.0.vm03.stdout:3/476: truncate d5/d6d/d5a/d63/f8a 1977980 0 2026-03-09T16:14:34.738 INFO:tasks.workunit.client.0.vm03.stdout:1/389: unlink d4/d6/d1d/d20/d23/d3e/d3f/c56 0 2026-03-09T16:14:34.740 INFO:tasks.workunit.client.0.vm03.stdout:0/483: creat d0/d7/d3e/d45/fa2 x:0 0 0 2026-03-09T16:14:34.745 INFO:tasks.workunit.client.0.vm03.stdout:1/390: dread d4/d6/d1d/f66 [0,4194304] 0 2026-03-09T16:14:34.753 INFO:tasks.workunit.client.0.vm03.stdout:9/529: unlink d2/d4/d11/d29/d2a/d46/l4f 0 2026-03-09T16:14:34.756 INFO:tasks.workunit.client.0.vm03.stdout:4/484: dread d5/d17/f39 [0,4194304] 0 2026-03-09T16:14:34.760 INFO:tasks.workunit.client.0.vm03.stdout:4/485: dwrite d5/db/f6e [0,4194304] 0 2026-03-09T16:14:34.761 INFO:tasks.workunit.client.0.vm03.stdout:3/477: read d5/f43 [744685,46923] 0 2026-03-09T16:14:34.764 INFO:tasks.workunit.client.0.vm03.stdout:3/478: dwrite d5/d1e/f72 [0,4194304] 0 2026-03-09T16:14:34.773 INFO:tasks.workunit.client.0.vm03.stdout:6/472: fsync d9/d42/d45/d65/f7f 0 2026-03-09T16:14:34.782 INFO:tasks.workunit.client.0.vm03.stdout:7/447: link d4/da/d18/d22/f48 d4/dc/d61/f8b 0 2026-03-09T16:14:34.782 INFO:tasks.workunit.client.0.vm03.stdout:9/530: fdatasync d2/df/f76 0 2026-03-09T16:14:34.782 INFO:tasks.workunit.client.0.vm03.stdout:8/510: getdents da/d1d 0 2026-03-09T16:14:34.782 INFO:tasks.workunit.client.0.vm03.stdout:2/484: link db/d12/d2a/f88 db/d12/fb1 0 2026-03-09T16:14:34.782 INFO:tasks.workunit.client.0.vm03.stdout:2/485: fdatasync db/d12/d2a/d61/d79/f95 0 2026-03-09T16:14:34.782 INFO:tasks.workunit.client.0.vm03.stdout:9/531: dwrite d2/d4/d11/f66 [0,4194304] 0 2026-03-09T16:14:34.785 INFO:tasks.workunit.client.0.vm03.stdout:6/473: write d9/f73 [1217050,80591] 0 2026-03-09T16:14:34.786 INFO:tasks.workunit.client.0.vm03.stdout:5/548: rename d2/d7/de/d11/d19/d31/d35/fb0 to d2/d7/d8/d24/d27/fc3 0 2026-03-09T16:14:34.790 INFO:tasks.workunit.client.0.vm03.stdout:9/532: dread d2/d4/d11/d29/d2a/f58 [0,4194304] 0 2026-03-09T16:14:34.797 INFO:tasks.workunit.client.0.vm03.stdout:7/448: rename d4/da/d18/d22/d24/d16/d69/f89 to d4/d2d/f8c 0 2026-03-09T16:14:34.803 INFO:tasks.workunit.client.0.vm03.stdout:5/549: dread - d2/d7/de/d11/d19/d31/f7e zero size 2026-03-09T16:14:34.803 INFO:tasks.workunit.client.0.vm03.stdout:8/511: creat da/d10/d28/d4f/d68/fa9 x:0 0 0 2026-03-09T16:14:34.803 INFO:tasks.workunit.client.0.vm03.stdout:6/474: dread d9/f20 [4194304,4194304] 0 2026-03-09T16:14:34.804 INFO:tasks.workunit.client.0.vm03.stdout:2/486: rename db/ca9 to db/d12/d2a/d61/d79/cb2 0 
2026-03-09T16:14:34.805 INFO:tasks.workunit.client.0.vm03.stdout:2/487: write db/d12/d2a/f5f [1030155,108046] 0 2026-03-09T16:14:34.807 INFO:tasks.workunit.client.0.vm03.stdout:8/512: read da/db/f75 [122137,59968] 0 2026-03-09T16:14:34.809 INFO:tasks.workunit.client.0.vm03.stdout:7/449: truncate d4/da/d18/d22/d24/d16/d6e/f7b 1887743 0 2026-03-09T16:14:34.809 INFO:tasks.workunit.client.0.vm03.stdout:5/550: symlink d2/d7/de/d11/d19/d29/d90/dbe/lc4 0 2026-03-09T16:14:34.811 INFO:tasks.workunit.client.0.vm03.stdout:2/488: fsync db/d12/f39 0 2026-03-09T16:14:34.811 INFO:tasks.workunit.client.0.vm03.stdout:0/484: sync 2026-03-09T16:14:34.811 INFO:tasks.workunit.client.0.vm03.stdout:2/489: chown db/d12/d2a/d61/d6d/f81 494111250 1 2026-03-09T16:14:34.815 INFO:tasks.workunit.client.0.vm03.stdout:2/490: dwrite db/f55 [0,4194304] 0 2026-03-09T16:14:34.817 INFO:tasks.workunit.client.0.vm03.stdout:2/491: chown db/d12/d2a/l67 0 1 2026-03-09T16:14:34.819 INFO:tasks.workunit.client.0.vm03.stdout:5/551: creat d2/d7/de/d11/dbf/fc5 x:0 0 0 2026-03-09T16:14:34.821 INFO:tasks.workunit.client.0.vm03.stdout:8/513: fdatasync da/db/d30/f36 0 2026-03-09T16:14:34.822 INFO:tasks.workunit.client.0.vm03.stdout:6/475: dread d9/d14/f44 [0,4194304] 0 2026-03-09T16:14:34.822 INFO:tasks.workunit.client.0.vm03.stdout:2/492: rmdir db/d12/d2a 39 2026-03-09T16:14:34.823 INFO:tasks.workunit.client.0.vm03.stdout:8/514: rename da/d32/f56 to da/d45/faa 0 2026-03-09T16:14:34.824 INFO:tasks.workunit.client.0.vm03.stdout:5/552: creat d2/d7/de/d11/d38/d52/fc6 x:0 0 0 2026-03-09T16:14:34.825 INFO:tasks.workunit.client.0.vm03.stdout:7/450: creat d4/f8d x:0 0 0 2026-03-09T16:14:34.825 INFO:tasks.workunit.client.0.vm03.stdout:4/486: sync 2026-03-09T16:14:34.832 INFO:tasks.workunit.client.0.vm03.stdout:2/493: dwrite db/d12/d2a/d61/d6d/f8f [0,4194304] 0 2026-03-09T16:14:34.835 INFO:tasks.workunit.client.0.vm03.stdout:2/494: chown db/d12/d2a/d61/d79 22 1 2026-03-09T16:14:34.837 INFO:tasks.workunit.client.0.vm03.stdout:4/487: dwrite d5/d17/f2b [4194304,4194304] 0 2026-03-09T16:14:34.843 INFO:tasks.workunit.client.0.vm03.stdout:6/476: dwrite d9/d42/d45/f4a [0,4194304] 0 2026-03-09T16:14:34.849 INFO:tasks.workunit.client.0.vm03.stdout:6/477: chown d9/d42/d45/l76 1000201 1 2026-03-09T16:14:34.872 INFO:tasks.workunit.client.0.vm03.stdout:4/488: write d5/f7 [5973323,7003] 0 2026-03-09T16:14:34.880 INFO:tasks.workunit.client.0.vm03.stdout:5/553: link d2/d7/c47 d2/d7/d3c/cc7 0 2026-03-09T16:14:34.880 INFO:tasks.workunit.client.0.vm03.stdout:5/554: write d2/d7/de/faa [2555353,106247] 0 2026-03-09T16:14:34.888 INFO:tasks.workunit.client.0.vm03.stdout:4/489: chown d5/db/d25/d31/d33/d55/f62 6 1 2026-03-09T16:14:34.888 INFO:tasks.workunit.client.0.vm03.stdout:4/490: write d5/dd/d1f/d5f/f7c [1098890,105810] 0 2026-03-09T16:14:34.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:34 vm03.local ceph-mon[51019]: pgmap v9: 65 pgs: 65 active+clean; 951 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 20 MiB/s rd, 83 MiB/s wr, 215 op/s 2026-03-09T16:14:34.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:34 vm03.local ceph-mon[51019]: Upgrade: Updating mgr.vm03.gbgzmu 2026-03-09T16:14:34.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:34 vm03.local ceph-mon[51019]: Deploying daemon mgr.vm03.gbgzmu on vm03 2026-03-09T16:14:34.891 INFO:tasks.workunit.client.0.vm03.stdout:0/485: dread d0/d7/d48/f18 [0,4194304] 0 2026-03-09T16:14:34.896 INFO:tasks.workunit.client.0.vm03.stdout:4/491: dwrite d5/db/d25/d31/d4d/d5b/d72/f75 [0,4194304] 0 
2026-03-09T16:14:34.905 INFO:tasks.workunit.client.0.vm03.stdout:4/492: fdatasync d5/d17/d44/f90 0 2026-03-09T16:14:34.905 INFO:tasks.workunit.client.0.vm03.stdout:0/486: dwrite d0/f4d [0,4194304] 0 2026-03-09T16:14:34.905 INFO:tasks.workunit.client.0.vm03.stdout:0/487: stat d0/d7/d3e/c55 0 2026-03-09T16:14:34.910 INFO:tasks.workunit.client.0.vm03.stdout:6/478: mknod d9/d42/d45/d50/d80/d8a/c8f 0 2026-03-09T16:14:34.914 INFO:tasks.workunit.client.0.vm03.stdout:6/479: dread d9/d42/d45/d63/d66/f7d [0,4194304] 0 2026-03-09T16:14:34.919 INFO:tasks.workunit.client.0.vm03.stdout:2/495: dread db/d12/d2a/d61/f5c [0,4194304] 0 2026-03-09T16:14:34.930 INFO:tasks.workunit.client.0.vm03.stdout:0/488: dread d0/d7/d3e/d57/d5a/d52/f68 [0,4194304] 0 2026-03-09T16:14:34.930 INFO:tasks.workunit.client.0.vm03.stdout:5/555: mknod d2/d7/de/d11/d19/d29/d90/cc8 0 2026-03-09T16:14:34.930 INFO:tasks.workunit.client.0.vm03.stdout:4/493: creat d5/db/d25/d31/d33/f92 x:0 0 0 2026-03-09T16:14:34.934 INFO:tasks.workunit.client.0.vm03.stdout:7/451: getdents d4/da/d18/d22/d24/d16 0 2026-03-09T16:14:34.937 INFO:tasks.workunit.client.0.vm03.stdout:6/480: chown d9/d22/f3e 692 1 2026-03-09T16:14:34.941 INFO:tasks.workunit.client.0.vm03.stdout:1/391: dwrite d4/d6/d1d/d20/f2a [0,4194304] 0 2026-03-09T16:14:34.943 INFO:tasks.workunit.client.0.vm03.stdout:2/496: chown db/c24 1013 1 2026-03-09T16:14:34.951 INFO:tasks.workunit.client.0.vm03.stdout:0/489: mknod d0/d7/ca3 0 2026-03-09T16:14:34.951 INFO:tasks.workunit.client.0.vm03.stdout:0/490: dread - d0/da/f8b zero size 2026-03-09T16:14:34.954 INFO:tasks.workunit.client.0.vm03.stdout:5/556: mknod d2/d7/de/d11/d19/d29/d90/dbe/cc9 0 2026-03-09T16:14:34.954 INFO:tasks.workunit.client.0.vm03.stdout:5/557: chown d2/d7/de/d11/d19/d31 18868 1 2026-03-09T16:14:34.955 INFO:tasks.workunit.client.0.vm03.stdout:5/558: fdatasync d2/d7/d1a/d1c/f5e 0 2026-03-09T16:14:34.958 INFO:tasks.workunit.client.0.vm03.stdout:5/559: dwrite d2/d7/d3c/fb2 [0,4194304] 0 2026-03-09T16:14:34.960 INFO:tasks.workunit.client.0.vm03.stdout:5/560: fdatasync d2/d7/d8/f86 0 2026-03-09T16:14:34.964 INFO:tasks.workunit.client.0.vm03.stdout:4/494: symlink d5/db/d25/d31/d4d/d5b/l93 0 2026-03-09T16:14:34.969 INFO:tasks.workunit.client.0.vm03.stdout:7/452: symlink d4/da/d45/l8e 0 2026-03-09T16:14:34.969 INFO:tasks.workunit.client.0.vm03.stdout:3/479: write d5/d44/f56 [73650,49893] 0 2026-03-09T16:14:34.969 INFO:tasks.workunit.client.0.vm03.stdout:7/453: read d4/da/d18/d22/d24/d15/d71/f7d [417844,19155] 0 2026-03-09T16:14:34.974 INFO:tasks.workunit.client.0.vm03.stdout:6/481: rename d9/d42/d45/d63 to d9/d42/d45/d50/d80/d90 0 2026-03-09T16:14:34.975 INFO:tasks.workunit.client.0.vm03.stdout:9/533: truncate d2/d4/d11/d12/f3d 1327426 0 2026-03-09T16:14:34.980 INFO:tasks.workunit.client.0.vm03.stdout:1/392: unlink d4/d6/d3b/d6b/f71 0 2026-03-09T16:14:34.980 INFO:tasks.workunit.client.0.vm03.stdout:1/393: fdatasync d4/d31/f81 0 2026-03-09T16:14:34.980 INFO:tasks.workunit.client.0.vm03.stdout:1/394: chown d4/db/d8b 7 1 2026-03-09T16:14:34.988 INFO:tasks.workunit.client.0.vm03.stdout:8/515: truncate da/d1d/f4a 7222282 0 2026-03-09T16:14:34.988 INFO:tasks.workunit.client.0.vm03.stdout:0/491: mknod d0/da/d7a/d98/ca4 0 2026-03-09T16:14:35.000 INFO:tasks.workunit.client.0.vm03.stdout:3/480: creat d5/d6d/d6a/f8e x:0 0 0 2026-03-09T16:14:35.000 INFO:tasks.workunit.client.0.vm03.stdout:4/495: creat d5/db/d25/d31/d4d/d5b/d72/f94 x:0 0 0 2026-03-09T16:14:35.001 INFO:tasks.workunit.client.0.vm03.stdout:3/481: chown d5/d53/d6c/d79 21 1 
2026-03-09T16:14:35.007 INFO:tasks.workunit.client.0.vm03.stdout:6/482: readlink d9/d14/l25 0 2026-03-09T16:14:35.010 INFO:tasks.workunit.client.0.vm03.stdout:9/534: creat d2/d54/d7d/fa4 x:0 0 0 2026-03-09T16:14:35.016 INFO:tasks.workunit.client.0.vm03.stdout:2/497: mkdir db/d12/d2a/d61/d6d/d8c/d94/dad/db3 0 2026-03-09T16:14:35.019 INFO:tasks.workunit.client.0.vm03.stdout:8/516: creat da/d10/d28/d64/fab x:0 0 0 2026-03-09T16:14:35.022 INFO:tasks.workunit.client.0.vm03.stdout:8/517: dwrite da/d10/d28/f8b [0,4194304] 0 2026-03-09T16:14:35.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:34 vm05.local ceph-mon[58702]: pgmap v9: 65 pgs: 65 active+clean; 951 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 20 MiB/s rd, 83 MiB/s wr, 215 op/s 2026-03-09T16:14:35.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:34 vm05.local ceph-mon[58702]: Upgrade: Updating mgr.vm03.gbgzmu 2026-03-09T16:14:35.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:34 vm05.local ceph-mon[58702]: Deploying daemon mgr.vm03.gbgzmu on vm03 2026-03-09T16:14:35.032 INFO:tasks.workunit.client.0.vm03.stdout:4/496: mkdir d5/dd/d1f/d95 0 2026-03-09T16:14:35.041 INFO:tasks.workunit.client.0.vm03.stdout:7/454: creat d4/f8f x:0 0 0 2026-03-09T16:14:35.051 INFO:tasks.workunit.client.0.vm03.stdout:4/497: symlink d5/db/d25/d31/d33/d79/l96 0 2026-03-09T16:14:35.051 INFO:tasks.workunit.client.0.vm03.stdout:4/498: write d5/db/d25/d31/d33/f92 [873736,74656] 0 2026-03-09T16:14:35.051 INFO:tasks.workunit.client.0.vm03.stdout:0/492: readlink d0/d7/d3e/d57/d5a/d52/l7d 0 2026-03-09T16:14:35.051 INFO:tasks.workunit.client.0.vm03.stdout:5/561: link d2/c15 d2/d7/de/d11/d19/d29/d90/db6/cca 0 2026-03-09T16:14:35.051 INFO:tasks.workunit.client.0.vm03.stdout:5/562: chown d2/d7/de/d11/d38/d3b/f68 116414011 1 2026-03-09T16:14:35.051 INFO:tasks.workunit.client.0.vm03.stdout:7/455: truncate d4/da/d18/d22/d24/d15/f2a 34742 0 2026-03-09T16:14:35.051 INFO:tasks.workunit.client.0.vm03.stdout:4/499: symlink d5/d17/l97 0 2026-03-09T16:14:35.052 INFO:tasks.workunit.client.0.vm03.stdout:2/498: creat db/d12/d2a/d61/d79/d83/fb4 x:0 0 0 2026-03-09T16:14:35.052 INFO:tasks.workunit.client.0.vm03.stdout:0/493: creat d0/d7/d3e/d45/fa5 x:0 0 0 2026-03-09T16:14:35.054 INFO:tasks.workunit.client.0.vm03.stdout:6/483: getdents d9/d42/d45 0 2026-03-09T16:14:35.054 INFO:tasks.workunit.client.0.vm03.stdout:9/535: getdents d2/df/d84 0 2026-03-09T16:14:35.055 INFO:tasks.workunit.client.0.vm03.stdout:7/456: creat d4/d2d/f90 x:0 0 0 2026-03-09T16:14:35.057 INFO:tasks.workunit.client.0.vm03.stdout:4/500: creat d5/dd/d1f/d5f/f98 x:0 0 0 2026-03-09T16:14:35.058 INFO:tasks.workunit.client.0.vm03.stdout:8/518: getdents da/d6c/d7a 0 2026-03-09T16:14:35.059 INFO:tasks.workunit.client.0.vm03.stdout:8/519: truncate da/d32/d79/f90 697026 0 2026-03-09T16:14:35.065 INFO:tasks.workunit.client.0.vm03.stdout:0/494: symlink d0/d7/d3e/d57/d5a/d74/la6 0 2026-03-09T16:14:35.065 INFO:tasks.workunit.client.0.vm03.stdout:0/495: write d0/da/d7a/d98/f9d [760757,8953] 0 2026-03-09T16:14:35.065 INFO:tasks.workunit.client.0.vm03.stdout:3/482: sync 2026-03-09T16:14:35.065 INFO:tasks.workunit.client.0.vm03.stdout:8/520: dread da/d10/f23 [0,4194304] 0 2026-03-09T16:14:35.069 INFO:tasks.workunit.client.0.vm03.stdout:8/521: write da/d10/d28/d4f/d85/fa1 [61805,24893] 0 2026-03-09T16:14:35.070 INFO:tasks.workunit.client.0.vm03.stdout:2/499: link db/d12/f69 db/d12/d2a/d61/d6d/d8c/d94/dad/fb5 0 2026-03-09T16:14:35.071 INFO:tasks.workunit.client.0.vm03.stdout:3/483: dwrite d5/d6d/d5a/f78 
[0,4194304] 0 2026-03-09T16:14:35.076 INFO:tasks.workunit.client.0.vm03.stdout:6/484: creat d9/d84/f91 x:0 0 0 2026-03-09T16:14:35.079 INFO:tasks.workunit.client.0.vm03.stdout:3/484: dwrite d5/d1e/d42/f84 [0,4194304] 0 2026-03-09T16:14:35.093 INFO:tasks.workunit.client.0.vm03.stdout:0/496: truncate d0/f4e 2917045 0 2026-03-09T16:14:35.096 INFO:tasks.workunit.client.0.vm03.stdout:1/395: write d4/d6/d1d/d3d/f45 [1681335,22663] 0 2026-03-09T16:14:35.097 INFO:tasks.workunit.client.0.vm03.stdout:1/396: dread - d4/d31/f81 zero size 2026-03-09T16:14:35.097 INFO:tasks.workunit.client.0.vm03.stdout:1/397: chown d4/d6/d3b 43 1 2026-03-09T16:14:35.098 INFO:tasks.workunit.client.0.vm03.stdout:1/398: fsync d4/d6/d3b/d6b/d25/f84 0 2026-03-09T16:14:35.110 INFO:tasks.workunit.client.0.vm03.stdout:2/500: mkdir db/d12/d2a/d61/d79/d83/d64/d68/da0/db6 0 2026-03-09T16:14:35.114 INFO:tasks.workunit.client.0.vm03.stdout:6/485: dwrite d9/d22/f3e [0,4194304] 0 2026-03-09T16:14:35.118 INFO:tasks.workunit.client.0.vm03.stdout:0/497: read d0/da/d1b/f6e [1481077,96424] 0 2026-03-09T16:14:35.127 INFO:tasks.workunit.client.0.vm03.stdout:1/399: creat d4/d6/d1d/d20/d23/d3e/d3f/f8c x:0 0 0 2026-03-09T16:14:35.127 INFO:tasks.workunit.client.0.vm03.stdout:1/400: readlink d4/db/l4c 0 2026-03-09T16:14:35.129 INFO:tasks.workunit.client.0.vm03.stdout:5/563: write d2/d7/de/d11/d19/d31/f42 [7562393,50005] 0 2026-03-09T16:14:35.132 INFO:tasks.workunit.client.0.vm03.stdout:2/501: dread fa [0,4194304] 0 2026-03-09T16:14:35.133 INFO:tasks.workunit.client.0.vm03.stdout:2/502: dread - db/d12/d2a/d61/d79/d83/fa7 zero size 2026-03-09T16:14:35.133 INFO:tasks.workunit.client.0.vm03.stdout:2/503: write db/d12/d2a/d61/d79/d83/fa7 [817311,98801] 0 2026-03-09T16:14:35.142 INFO:tasks.workunit.client.0.vm03.stdout:9/536: write d2/d4/d11/d29/d2a/d4d/f53 [78419,125950] 0 2026-03-09T16:14:35.142 INFO:tasks.workunit.client.0.vm03.stdout:9/537: readlink d2/d4/d11/d12/l1b 0 2026-03-09T16:14:35.151 INFO:tasks.workunit.client.0.vm03.stdout:6/486: truncate d9/f15 4250488 0 2026-03-09T16:14:35.157 INFO:tasks.workunit.client.0.vm03.stdout:0/498: symlink d0/d7/d3e/d57/d5a/d82/la7 0 2026-03-09T16:14:35.163 INFO:tasks.workunit.client.0.vm03.stdout:1/401: dread d4/f1b [0,4194304] 0 2026-03-09T16:14:35.165 INFO:tasks.workunit.client.0.vm03.stdout:4/501: write d5/f9 [119298,3969] 0 2026-03-09T16:14:35.165 INFO:tasks.workunit.client.0.vm03.stdout:4/502: readlink d5/l88 0 2026-03-09T16:14:35.169 INFO:tasks.workunit.client.0.vm03.stdout:2/504: creat db/d12/d2a/d61/d79/fb7 x:0 0 0 2026-03-09T16:14:35.172 INFO:tasks.workunit.client.0.vm03.stdout:8/522: dwrite da/d10/f23 [0,4194304] 0 2026-03-09T16:14:35.175 INFO:tasks.workunit.client.0.vm03.stdout:0/499: sync 2026-03-09T16:14:35.184 INFO:tasks.workunit.client.0.vm03.stdout:0/500: dread d0/d7/d3e/f72 [0,4194304] 0 2026-03-09T16:14:35.184 INFO:tasks.workunit.client.0.vm03.stdout:7/457: link d4/da/d18/f6a d4/da/d45/d51/f91 0 2026-03-09T16:14:35.187 INFO:tasks.workunit.client.0.vm03.stdout:9/538: creat d2/de/d88/fa5 x:0 0 0 2026-03-09T16:14:35.188 INFO:tasks.workunit.client.0.vm03.stdout:6/487: rmdir d9/d14/d71 39 2026-03-09T16:14:35.190 INFO:tasks.workunit.client.0.vm03.stdout:3/485: link d5/d1e/l80 d5/d1e/d42/d55/d86/l8f 0 2026-03-09T16:14:35.191 INFO:tasks.workunit.client.0.vm03.stdout:1/402: fdatasync d4/f1b 0 2026-03-09T16:14:35.198 INFO:tasks.workunit.client.0.vm03.stdout:0/501: sync 2026-03-09T16:14:35.198 INFO:tasks.workunit.client.0.vm03.stdout:0/502: stat d0/da/d1b/d9b/f93 0 2026-03-09T16:14:35.199 
INFO:tasks.workunit.client.0.vm03.stdout:9/539: rmdir d2/d54/d7d 39 2026-03-09T16:14:35.204 INFO:tasks.workunit.client.0.vm03.stdout:6/488: mkdir d9/d42/d45/d50/d80/d90/d66/d92 0 2026-03-09T16:14:35.209 INFO:tasks.workunit.client.0.vm03.stdout:2/505: dwrite db/d12/f69 [0,4194304] 0 2026-03-09T16:14:35.211 INFO:tasks.workunit.client.0.vm03.stdout:7/458: dwrite d4/da/d18/d22/f33 [0,4194304] 0 2026-03-09T16:14:35.216 INFO:tasks.workunit.client.0.vm03.stdout:5/564: creat d2/d7/de/d11/d19/d31/fcb x:0 0 0 2026-03-09T16:14:35.228 INFO:tasks.workunit.client.0.vm03.stdout:8/523: mknod da/d10/d28/d4f/d68/cac 0 2026-03-09T16:14:35.228 INFO:tasks.workunit.client.0.vm03.stdout:8/524: fdatasync da/d1d/f99 0 2026-03-09T16:14:35.228 INFO:tasks.workunit.client.0.vm03.stdout:8/525: chown da/d6c/d7a/f97 4007 1 2026-03-09T16:14:35.232 INFO:tasks.workunit.client.0.vm03.stdout:0/503: creat d0/d7/d3e/d57/fa8 x:0 0 0 2026-03-09T16:14:35.246 INFO:tasks.workunit.client.0.vm03.stdout:9/540: rmdir d2/d4/d11/d29/d2a/d38 39 2026-03-09T16:14:35.246 INFO:tasks.workunit.client.0.vm03.stdout:6/489: fdatasync d9/d42/d45/d47/f8c 0 2026-03-09T16:14:35.246 INFO:tasks.workunit.client.0.vm03.stdout:3/486: mknod d5/d6d/c90 0 2026-03-09T16:14:35.246 INFO:tasks.workunit.client.0.vm03.stdout:7/459: mknod d4/da/d18/d22/d24/d15/c92 0 2026-03-09T16:14:35.246 INFO:tasks.workunit.client.0.vm03.stdout:2/506: creat db/d12/d2a/d99/fb8 x:0 0 0 2026-03-09T16:14:35.246 INFO:tasks.workunit.client.0.vm03.stdout:2/507: fdatasync db/d12/d2a/d61/d79/f95 0 2026-03-09T16:14:35.246 INFO:tasks.workunit.client.0.vm03.stdout:2/508: write db/d12/d2a/d61/f54 [366900,24766] 0 2026-03-09T16:14:35.251 INFO:tasks.workunit.client.0.vm03.stdout:5/565: chown d2/d7/de/cb4 1725 1 2026-03-09T16:14:35.257 INFO:tasks.workunit.client.0.vm03.stdout:5/566: dwrite d2/d7/d8/d16/d5c/f94 [0,4194304] 0 2026-03-09T16:14:35.265 INFO:tasks.workunit.client.0.vm03.stdout:5/567: sync 2026-03-09T16:14:35.266 INFO:tasks.workunit.client.0.vm03.stdout:8/526: mkdir da/d32/dad 0 2026-03-09T16:14:35.268 INFO:tasks.workunit.client.0.vm03.stdout:0/504: unlink d0/d7/d3e/d57/d5a/d52/f9a 0 2026-03-09T16:14:35.271 INFO:tasks.workunit.client.0.vm03.stdout:0/505: readlink d0/d7/d3e/d57/d5a/d52/l7d 0 2026-03-09T16:14:35.280 INFO:tasks.workunit.client.0.vm03.stdout:6/490: creat d9/d42/d45/d47/f93 x:0 0 0 2026-03-09T16:14:35.303 INFO:tasks.workunit.client.0.vm03.stdout:2/509: rename db/d12/d2a/d61/d79/d83/d52/f9c to db/d12/d2a/d61/d6d/d8c/d94/da4/fb9 0 2026-03-09T16:14:35.305 INFO:tasks.workunit.client.0.vm03.stdout:2/510: truncate db/d12/d2a/f38 1717979 0 2026-03-09T16:14:35.307 INFO:tasks.workunit.client.0.vm03.stdout:2/511: fsync db/d12/d2a/d61/d6d/f8f 0 2026-03-09T16:14:35.312 INFO:tasks.workunit.client.0.vm03.stdout:4/503: getdents d5/db 0 2026-03-09T16:14:35.313 INFO:tasks.workunit.client.0.vm03.stdout:3/487: dwrite d5/f43 [0,4194304] 0 2026-03-09T16:14:35.317 INFO:tasks.workunit.client.0.vm03.stdout:4/504: truncate d5/dd/d1f/d5f/f71 431968 0 2026-03-09T16:14:35.331 INFO:tasks.workunit.client.0.vm03.stdout:5/568: unlink d2/d7/d3c/fb2 0 2026-03-09T16:14:35.331 INFO:tasks.workunit.client.0.vm03.stdout:8/527: creat da/d6c/fae x:0 0 0 2026-03-09T16:14:35.331 INFO:tasks.workunit.client.0.vm03.stdout:5/569: dread - d2/d7/de/d33/f9e zero size 2026-03-09T16:14:35.335 INFO:tasks.workunit.client.0.vm03.stdout:8/528: read f8 [3528961,105731] 0 2026-03-09T16:14:35.339 INFO:tasks.workunit.client.0.vm03.stdout:6/491: fsync d9/f1e 0 2026-03-09T16:14:35.339 INFO:tasks.workunit.client.0.vm03.stdout:8/529: stat 
da/d10/d63/f73 0 2026-03-09T16:14:35.339 INFO:tasks.workunit.client.0.vm03.stdout:9/541: symlink d2/d4/d11/d29/d2a/d38/la6 0 2026-03-09T16:14:35.339 INFO:tasks.workunit.client.0.vm03.stdout:6/492: chown d9/d42/d45/d47/l5a 0 1 2026-03-09T16:14:35.340 INFO:tasks.workunit.client.0.vm03.stdout:7/460: truncate d4/da/d18/d22/f48 412361 0 2026-03-09T16:14:35.354 INFO:tasks.workunit.client.0.vm03.stdout:2/512: rmdir db/d12 39 2026-03-09T16:14:35.354 INFO:tasks.workunit.client.0.vm03.stdout:1/403: link d4/d31/l52 d4/d6/d3b/d6b/d25/l8d 0 2026-03-09T16:14:35.354 INFO:tasks.workunit.client.0.vm03.stdout:3/488: truncate d5/d1e/d42/d55/f7e 184557 0 2026-03-09T16:14:35.354 INFO:tasks.workunit.client.0.vm03.stdout:4/505: read - d5/f74 zero size 2026-03-09T16:14:35.355 INFO:tasks.workunit.client.0.vm03.stdout:3/489: truncate d5/d6d/f7a 943401 0 2026-03-09T16:14:35.356 INFO:tasks.workunit.client.0.vm03.stdout:4/506: dread - d5/f74 zero size 2026-03-09T16:14:35.357 INFO:tasks.workunit.client.0.vm03.stdout:5/570: read d2/d7/d1a/d1c/d6c/f79 [2306677,88215] 0 2026-03-09T16:14:35.359 INFO:tasks.workunit.client.0.vm03.stdout:0/506: dwrite d0/d7/d48/f13 [0,4194304] 0 2026-03-09T16:14:35.373 INFO:tasks.workunit.client.0.vm03.stdout:8/530: mkdir da/d10/d28/d4f/daf 0 2026-03-09T16:14:35.374 INFO:tasks.workunit.client.0.vm03.stdout:9/542: creat d2/d4/d11/d29/d2a/d38/fa7 x:0 0 0 2026-03-09T16:14:35.382 INFO:tasks.workunit.client.0.vm03.stdout:7/461: mknod d4/da/c93 0 2026-03-09T16:14:35.391 INFO:tasks.workunit.client.0.vm03.stdout:7/462: dwrite d4/f8f [0,4194304] 0 2026-03-09T16:14:35.408 INFO:tasks.workunit.client.0.vm03.stdout:6/493: dwrite d9/d42/d45/d47/f6a [0,4194304] 0 2026-03-09T16:14:35.443 INFO:tasks.workunit.client.0.vm03.stdout:9/543: dread d2/d4/d11/d12/d28/f2f [0,4194304] 0 2026-03-09T16:14:35.444 INFO:tasks.workunit.client.0.vm03.stdout:9/544: write d2/d4/d11/d29/d2a/d46/f9e [1018455,129424] 0 2026-03-09T16:14:35.445 INFO:tasks.workunit.client.0.vm03.stdout:9/545: truncate d2/de/f87 936985 0 2026-03-09T16:14:35.446 INFO:tasks.workunit.client.0.vm03.stdout:9/546: write d2/d4/d11/d29/f95 [250630,54816] 0 2026-03-09T16:14:35.452 INFO:tasks.workunit.client.0.vm03.stdout:1/404: mkdir d4/d6/d3b/d8e 0 2026-03-09T16:14:35.454 INFO:tasks.workunit.client.0.vm03.stdout:4/507: rmdir d5/db/d25/d31/d33/d55/d81 39 2026-03-09T16:14:35.455 INFO:tasks.workunit.client.0.vm03.stdout:1/405: write d4/d6/d3b/d63/f77 [137656,103099] 0 2026-03-09T16:14:35.459 INFO:tasks.workunit.client.0.vm03.stdout:5/571: symlink d2/d7/de/d11/d38/d3b/lcc 0 2026-03-09T16:14:35.464 INFO:tasks.workunit.client.0.vm03.stdout:0/507: symlink d0/d7/d3e/d57/d5a/d82/d89/la9 0 2026-03-09T16:14:35.470 INFO:tasks.workunit.client.0.vm03.stdout:8/531: creat da/d10/d28/fb0 x:0 0 0 2026-03-09T16:14:35.488 INFO:tasks.workunit.client.0.vm03.stdout:3/490: mkdir d5/d53/d6c/d79/d91 0 2026-03-09T16:14:35.491 INFO:tasks.workunit.client.0.vm03.stdout:3/491: dwrite d5/d1e/d42/f74 [4194304,4194304] 0 2026-03-09T16:14:35.497 INFO:tasks.workunit.client.0.vm03.stdout:4/508: write d5/db/f34 [2817926,12434] 0 2026-03-09T16:14:35.502 INFO:tasks.workunit.client.0.vm03.stdout:1/406: mknod d4/d6/d1d/d3d/c8f 0 2026-03-09T16:14:35.507 INFO:tasks.workunit.client.0.vm03.stdout:5/572: rename d2/d7/de/d11/c1b to d2/d7/de/d11/d19/d29/d90/dbe/ccd 0 2026-03-09T16:14:35.515 INFO:tasks.workunit.client.0.vm03.stdout:8/532: unlink da/l12 0 2026-03-09T16:14:35.522 INFO:tasks.workunit.client.0.vm03.stdout:6/494: creat d9/d8e/f94 x:0 0 0 2026-03-09T16:14:35.525 
INFO:tasks.workunit.client.0.vm03.stdout:3/492: fdatasync d5/d44/f5d 0 2026-03-09T16:14:35.526 INFO:tasks.workunit.client.0.vm03.stdout:3/493: fdatasync d5/fb 0 2026-03-09T16:14:35.527 INFO:tasks.workunit.client.0.vm03.stdout:4/509: chown d5/c42 678787 1 2026-03-09T16:14:35.528 INFO:tasks.workunit.client.0.vm03.stdout:1/407: sync 2026-03-09T16:14:35.528 INFO:tasks.workunit.client.0.vm03.stdout:8/533: sync 2026-03-09T16:14:35.529 INFO:tasks.workunit.client.0.vm03.stdout:8/534: fsync da/d10/d28/d4f/d68/f8f 0 2026-03-09T16:14:35.530 INFO:tasks.workunit.client.0.vm03.stdout:4/510: read d5/db/d25/d31/d33/d55/f62 [1356529,125830] 0 2026-03-09T16:14:35.536 INFO:tasks.workunit.client.0.vm03.stdout:8/535: dwrite da/db/d30/f76 [0,4194304] 0 2026-03-09T16:14:35.539 INFO:tasks.workunit.client.0.vm03.stdout:4/511: dwrite d5/dd/f22 [0,4194304] 0 2026-03-09T16:14:35.549 INFO:tasks.workunit.client.0.vm03.stdout:0/508: dwrite d0/da/d1b/f6e [0,4194304] 0 2026-03-09T16:14:35.559 INFO:tasks.workunit.client.0.vm03.stdout:6/495: readlink d9/d42/d45/d65/l7b 0 2026-03-09T16:14:35.563 INFO:tasks.workunit.client.0.vm03.stdout:9/547: creat d2/d4/d11/fa8 x:0 0 0 2026-03-09T16:14:35.571 INFO:tasks.workunit.client.0.vm03.stdout:8/536: mkdir da/d10/d28/db1 0 2026-03-09T16:14:35.576 INFO:tasks.workunit.client.0.vm03.stdout:5/573: mkdir d2/d7/de/d54/dce 0 2026-03-09T16:14:35.576 INFO:tasks.workunit.client.0.vm03.stdout:0/509: creat d0/d7/d3e/d57/faa x:0 0 0 2026-03-09T16:14:35.577 INFO:tasks.workunit.client.0.vm03.stdout:5/574: fdatasync d2/d7/d1a/d1c/f5e 0 2026-03-09T16:14:35.578 INFO:tasks.workunit.client.0.vm03.stdout:5/575: dread - d2/d7/de/d11/d19/d31/fcb zero size 2026-03-09T16:14:35.581 INFO:tasks.workunit.client.0.vm03.stdout:2/513: getdents db/d12/d2a 0 2026-03-09T16:14:35.582 INFO:tasks.workunit.client.0.vm03.stdout:3/494: rename d5/d6d/l2a to d5/d2e/l92 0 2026-03-09T16:14:35.582 INFO:tasks.workunit.client.0.vm03.stdout:1/408: creat d4/d7b/f90 x:0 0 0 2026-03-09T16:14:35.584 INFO:tasks.workunit.client.0.vm03.stdout:9/548: read d2/d4/d11/d12/f50 [62789,113540] 0 2026-03-09T16:14:35.584 INFO:tasks.workunit.client.0.vm03.stdout:4/512: mknod d5/db/d25/d31/d33/d55/d81/c99 0 2026-03-09T16:14:35.584 INFO:tasks.workunit.client.0.vm03.stdout:9/549: chown d2/df/l32 31081172 1 2026-03-09T16:14:35.588 INFO:tasks.workunit.client.0.vm03.stdout:6/496: creat d9/d14/d71/f95 x:0 0 0 2026-03-09T16:14:35.588 INFO:tasks.workunit.client.0.vm03.stdout:5/576: chown d2/d7/de/d54/c9c 248 1 2026-03-09T16:14:35.588 INFO:tasks.workunit.client.0.vm03.stdout:6/497: chown d9/d8e 38 1 2026-03-09T16:14:35.590 INFO:tasks.workunit.client.0.vm03.stdout:9/550: mknod d2/d4/d11/d29/d2a/d46/ca9 0 2026-03-09T16:14:35.593 INFO:tasks.workunit.client.0.vm03.stdout:5/577: mkdir d2/d7/d8/d16/d5c/dcf 0 2026-03-09T16:14:35.593 INFO:tasks.workunit.client.0.vm03.stdout:6/498: symlink d9/d42/d45/d50/d80/d90/l96 0 2026-03-09T16:14:35.593 INFO:tasks.workunit.client.0.vm03.stdout:3/495: sync 2026-03-09T16:14:35.593 INFO:tasks.workunit.client.0.vm03.stdout:1/409: sync 2026-03-09T16:14:35.595 INFO:tasks.workunit.client.0.vm03.stdout:7/463: rename d4/da/d18/d22/d24/d16/d6e/f7b to d4/da/d45/d51/d36/f94 0 2026-03-09T16:14:35.598 INFO:tasks.workunit.client.0.vm03.stdout:2/514: mknod db/d12/d2a/d61/d6d/d8c/d94/dad/db3/cba 0 2026-03-09T16:14:35.598 INFO:tasks.workunit.client.0.vm03.stdout:4/513: getdents d5/db/d25/d8b 0 2026-03-09T16:14:35.599 INFO:tasks.workunit.client.0.vm03.stdout:2/515: readlink db/d12/d2a/l4f 0 2026-03-09T16:14:35.599 
INFO:tasks.workunit.client.0.vm03.stdout:4/514: readlink d5/db/d25/d31/d33/d79/l96 0 2026-03-09T16:14:35.602 INFO:tasks.workunit.client.0.vm03.stdout:0/510: truncate d0/da/ff 4335817 0 2026-03-09T16:14:35.603 INFO:tasks.workunit.client.0.vm03.stdout:0/511: readlink d0/d7/d3e/d57/d5a/d52/l7c 0 2026-03-09T16:14:35.605 INFO:tasks.workunit.client.0.vm03.stdout:9/551: creat d2/d4/d11/d29/d2a/d38/faa x:0 0 0 2026-03-09T16:14:35.605 INFO:tasks.workunit.client.0.vm03.stdout:9/552: dread - d2/de/d88/fa5 zero size 2026-03-09T16:14:35.606 INFO:tasks.workunit.client.0.vm03.stdout:9/553: readlink d2/d4/d11/d29/d2a/d46/l52 0 2026-03-09T16:14:35.606 INFO:tasks.workunit.client.0.vm03.stdout:0/512: sync 2026-03-09T16:14:35.610 INFO:tasks.workunit.client.0.vm03.stdout:7/464: creat d4/d2d/f95 x:0 0 0 2026-03-09T16:14:35.613 INFO:tasks.workunit.client.0.vm03.stdout:6/499: write d9/d22/f43 [550546,3096] 0 2026-03-09T16:14:35.617 INFO:tasks.workunit.client.0.vm03.stdout:6/500: sync 2026-03-09T16:14:35.625 INFO:tasks.workunit.client.0.vm03.stdout:2/516: dread db/d12/f84 [0,4194304] 0 2026-03-09T16:14:35.627 INFO:tasks.workunit.client.0.vm03.stdout:4/515: rename d5/d40 to d5/db/d25/d31/d4d/d5b/d9a 0 2026-03-09T16:14:35.632 INFO:tasks.workunit.client.0.vm03.stdout:5/578: dwrite d2/d7/d1a/f6e [0,4194304] 0 2026-03-09T16:14:35.639 INFO:tasks.workunit.client.0.vm03.stdout:5/579: dwrite d2/d7/d8/d16/d5c/f94 [0,4194304] 0 2026-03-09T16:14:35.647 INFO:tasks.workunit.client.0.vm03.stdout:5/580: dread d2/d7/de/d11/d38/d52/f7d [0,4194304] 0 2026-03-09T16:14:35.660 INFO:tasks.workunit.client.0.vm03.stdout:1/410: symlink d4/d6/d1d/d20/d23/l91 0 2026-03-09T16:14:35.662 INFO:tasks.workunit.client.0.vm03.stdout:8/537: getdents da/d10/d28/d4f/d68/d80 0 2026-03-09T16:14:35.666 INFO:tasks.workunit.client.0.vm03.stdout:6/501: mkdir d9/d42/d45/d47/d97 0 2026-03-09T16:14:35.668 INFO:tasks.workunit.client.0.vm03.stdout:2/517: mkdir db/d12/da5/dbb 0 2026-03-09T16:14:35.675 INFO:tasks.workunit.client.0.vm03.stdout:9/554: rename d2/de/f1c to d2/d4/d11/d29/d2a/d4d/fab 0 2026-03-09T16:14:35.686 INFO:tasks.workunit.client.0.vm03.stdout:0/513: dwrite d0/d7/f3d [0,4194304] 0 2026-03-09T16:14:35.689 INFO:tasks.workunit.client.0.vm03.stdout:5/581: chown d2/d7/d8/d16/d5c/c72 172 1 2026-03-09T16:14:35.699 INFO:tasks.workunit.client.0.vm03.stdout:7/465: symlink d4/da/d18/l96 0 2026-03-09T16:14:35.701 INFO:tasks.workunit.client.0.vm03.stdout:1/411: fdatasync d4/db/f21 0 2026-03-09T16:14:35.705 INFO:tasks.workunit.client.0.vm03.stdout:1/412: dread d4/d6/d1d/d3d/f45 [0,4194304] 0 2026-03-09T16:14:35.711 INFO:tasks.workunit.client.0.vm03.stdout:1/413: dwrite d4/d6/d1d/d20/f72 [0,4194304] 0 2026-03-09T16:14:35.721 INFO:tasks.workunit.client.0.vm03.stdout:6/502: symlink d9/d84/l98 0 2026-03-09T16:14:35.723 INFO:tasks.workunit.client.0.vm03.stdout:2/518: truncate fa 135222 0 2026-03-09T16:14:35.723 INFO:tasks.workunit.client.0.vm03.stdout:2/519: fdatasync db/d12/d2a/d61/f5d 0 2026-03-09T16:14:35.732 INFO:tasks.workunit.client.0.vm03.stdout:5/582: unlink d2/d7/l50 0 2026-03-09T16:14:35.732 INFO:tasks.workunit.client.0.vm03.stdout:3/496: link d5/d1e/d42/d34/c5b d5/c93 0 2026-03-09T16:14:35.737 INFO:tasks.workunit.client.0.vm03.stdout:8/538: write da/d32/f61 [510120,54431] 0 2026-03-09T16:14:35.737 INFO:tasks.workunit.client.0.vm03.stdout:8/539: dread - da/d6c/f8e zero size 2026-03-09T16:14:35.740 INFO:tasks.workunit.client.0.vm03.stdout:8/540: dwrite da/d10/f23 [4194304,4194304] 0 2026-03-09T16:14:35.744 INFO:tasks.workunit.client.0.vm03.stdout:7/466: 
unlink d4/da/d45/d51/d36/c85 0 2026-03-09T16:14:35.745 INFO:tasks.workunit.client.0.vm03.stdout:7/467: stat d4/da/d18/d22/f33 0 2026-03-09T16:14:35.748 INFO:tasks.workunit.client.0.vm03.stdout:8/541: dwrite da/d10/d28/d4f/d68/f8f [0,4194304] 0 2026-03-09T16:14:35.753 INFO:tasks.workunit.client.0.vm03.stdout:1/414: rename d4/d6/d3b/d6b/c80 to d4/d6/d3b/c92 0 2026-03-09T16:14:35.757 INFO:tasks.workunit.client.0.vm03.stdout:9/555: dread d2/d4/d11/d12/d28/f2c [0,4194304] 0 2026-03-09T16:14:35.759 INFO:tasks.workunit.client.0.vm03.stdout:9/556: dwrite d2/de/d88/f6f [0,4194304] 0 2026-03-09T16:14:35.762 INFO:tasks.workunit.client.0.vm03.stdout:9/557: read d2/d4/d11/f13 [7242354,98766] 0 2026-03-09T16:14:35.772 INFO:tasks.workunit.client.0.vm03.stdout:2/520: write db/d12/d2a/d61/f65 [848033,74514] 0 2026-03-09T16:14:35.773 INFO:tasks.workunit.client.0.vm03.stdout:2/521: write db/d12/d2a/d61/d6d/d8c/fa3 [103021,96462] 0 2026-03-09T16:14:35.774 INFO:tasks.workunit.client.0.vm03.stdout:2/522: write db/d12/d2a/f58 [1408926,80495] 0 2026-03-09T16:14:35.777 INFO:tasks.workunit.client.0.vm03.stdout:2/523: read db/d12/d2a/d61/f9d [2269567,1851] 0 2026-03-09T16:14:35.781 INFO:tasks.workunit.client.0.vm03.stdout:0/514: fsync d0/da/ff 0 2026-03-09T16:14:35.781 INFO:tasks.workunit.client.0.vm03.stdout:0/515: chown d0/da/d5c/c83 19440 1 2026-03-09T16:14:35.784 INFO:tasks.workunit.client.0.vm03.stdout:5/583: creat d2/d75/fd0 x:0 0 0 2026-03-09T16:14:35.794 INFO:tasks.workunit.client.0.vm03.stdout:3/497: rmdir d5/d6d 39 2026-03-09T16:14:35.794 INFO:tasks.workunit.client.0.vm03.stdout:7/468: rename d4/da/d18/d22/d24/d16/d2b/c72 to d4/da/d18/d22/c97 0 2026-03-09T16:14:35.794 INFO:tasks.workunit.client.0.vm03.stdout:2/524: mknod db/d12/d2a/d61/d79/d83/d64/d68/cbc 0 2026-03-09T16:14:35.794 INFO:tasks.workunit.client.0.vm03.stdout:4/516: link d5/cc d5/db/d25/c9b 0 2026-03-09T16:14:35.794 INFO:tasks.workunit.client.0.vm03.stdout:0/516: mkdir d0/d7/d3e/d45/dab 0 2026-03-09T16:14:35.797 INFO:tasks.workunit.client.0.vm03.stdout:5/584: dread d2/d7/de/d11/d38/d52/f7d [0,4194304] 0 2026-03-09T16:14:35.797 INFO:tasks.workunit.client.0.vm03.stdout:5/585: chown d2/d75/l9d 21 1 2026-03-09T16:14:35.799 INFO:tasks.workunit.client.0.vm03.stdout:3/498: write d5/d6d/d5a/f7c [601767,75640] 0 2026-03-09T16:14:35.802 INFO:tasks.workunit.client.0.vm03.stdout:8/542: rename da/db/d43/c4e to da/d6c/d7a/cb2 0 2026-03-09T16:14:35.806 INFO:tasks.workunit.client.0.vm03.stdout:7/469: chown d4/da/l76 56303 1 2026-03-09T16:14:35.806 INFO:tasks.workunit.client.0.vm03.stdout:9/558: mkdir d2/d4/d11/dac 0 2026-03-09T16:14:35.806 INFO:tasks.workunit.client.0.vm03.stdout:2/525: fdatasync db/d12/d2a/d61/d79/d83/d64/f80 0 2026-03-09T16:14:35.810 INFO:tasks.workunit.client.0.vm03.stdout:0/517: creat d0/da/d7a/fac x:0 0 0 2026-03-09T16:14:35.810 INFO:tasks.workunit.client.0.vm03.stdout:0/518: dread - d0/d7/d3e/d45/fa5 zero size 2026-03-09T16:14:35.811 INFO:tasks.workunit.client.0.vm03.stdout:0/519: fdatasync d0/d7/d3e/d45/f76 0 2026-03-09T16:14:35.815 INFO:tasks.workunit.client.0.vm03.stdout:0/520: dwrite d0/da/f8b [0,4194304] 0 2026-03-09T16:14:35.821 INFO:tasks.workunit.client.0.vm03.stdout:8/543: sync 2026-03-09T16:14:35.823 INFO:tasks.workunit.client.0.vm03.stdout:5/586: dread d2/d7/d8/f36 [0,4194304] 0 2026-03-09T16:14:35.824 INFO:tasks.workunit.client.0.vm03.stdout:1/415: rename d4/d6/d1d/d20/d23/d3e/d3f to d4/d6/d1d/d20/d93 0 2026-03-09T16:14:35.828 INFO:tasks.workunit.client.0.vm03.stdout:6/503: getdents d9/d42/d45/d50/d80 0 
2026-03-09T16:14:35.831 INFO:tasks.workunit.client.0.vm03.stdout:2/526: fsync db/d12/f57 0 2026-03-09T16:14:35.833 INFO:tasks.workunit.client.0.vm03.stdout:2/527: dwrite f0 [0,4194304] 0 2026-03-09T16:14:35.833 INFO:tasks.workunit.client.0.vm03.stdout:2/528: stat f7 0 2026-03-09T16:14:35.846 INFO:tasks.workunit.client.0.vm03.stdout:5/587: creat d2/d7/d8/d24/d27/d43/d4b/fd1 x:0 0 0 2026-03-09T16:14:35.853 INFO:tasks.workunit.client.0.vm03.stdout:1/416: dwrite d4/d6/d1d/f66 [0,4194304] 0 2026-03-09T16:14:35.856 INFO:tasks.workunit.client.0.vm03.stdout:7/470: write d4/dc/d61/f84 [3624898,65228] 0 2026-03-09T16:14:35.866 INFO:tasks.workunit.client.0.vm03.stdout:8/544: dwrite da/d10/d28/f5c [0,4194304] 0 2026-03-09T16:14:35.868 INFO:tasks.workunit.client.0.vm03.stdout:8/545: stat da/d10/d63/l93 0 2026-03-09T16:14:35.869 INFO:tasks.workunit.client.0.vm03.stdout:8/546: chown da/db/d30/f76 42954922 1 2026-03-09T16:14:35.872 INFO:tasks.workunit.client.0.vm03.stdout:8/547: dread da/d10/d28/f8b [0,4194304] 0 2026-03-09T16:14:35.881 INFO:tasks.workunit.client.0.vm03.stdout:3/499: link d5/d53/d88/l89 d5/d1e/d42/d8b/l94 0 2026-03-09T16:14:35.886 INFO:tasks.workunit.client.0.vm03.stdout:0/521: mknod d0/d7/d75/cad 0 2026-03-09T16:14:35.886 INFO:tasks.workunit.client.0.vm03.stdout:0/522: write d0/d7/d3e/d57/d5a/d47/f88 [3855077,63262] 0 2026-03-09T16:14:35.886 INFO:tasks.workunit.client.0.vm03.stdout:5/588: write d2/d7/d1a/f9f [2421555,14248] 0 2026-03-09T16:14:35.889 INFO:tasks.workunit.client.0.vm03.stdout:7/471: sync 2026-03-09T16:14:35.891 INFO:tasks.workunit.client.0.vm03.stdout:6/504: creat d9/d42/d45/d47/d97/f99 x:0 0 0 2026-03-09T16:14:35.891 INFO:tasks.workunit.client.0.vm03.stdout:6/505: chown d9/d42/l60 20 1 2026-03-09T16:14:35.894 INFO:tasks.workunit.client.0.vm03.stdout:9/559: mkdir d2/d54/d7d/d8f/dad 0 2026-03-09T16:14:35.894 INFO:tasks.workunit.client.0.vm03.stdout:6/506: dwrite d9/d42/d45/d47/d97/f99 [0,4194304] 0 2026-03-09T16:14:35.899 INFO:tasks.workunit.client.0.vm03.stdout:4/517: getdents d5/dd 0 2026-03-09T16:14:35.901 INFO:tasks.workunit.client.0.vm03.stdout:4/518: write d5/db/d25/d31/d33/d79/f89 [387777,108581] 0 2026-03-09T16:14:35.902 INFO:tasks.workunit.client.0.vm03.stdout:4/519: write d5/db/d25/d31/d33/d79/f7b [3522555,40572] 0 2026-03-09T16:14:35.914 INFO:tasks.workunit.client.0.vm03.stdout:1/417: dread d4/d6/d1d/d20/d23/f30 [0,4194304] 0 2026-03-09T16:14:35.923 INFO:tasks.workunit.client.0.vm03.stdout:9/560: truncate d2/d4/d1f/f44 4124831 0 2026-03-09T16:14:35.934 INFO:tasks.workunit.client.0.vm03.stdout:2/529: rename db/d12/d2a/d61/d79/d83/d64/d68 to db/d12/d2a/d61/d79/d83/d64/dbd 0 2026-03-09T16:14:35.947 INFO:tasks.workunit.client.0.vm03.stdout:3/500: truncate d5/d1e/d42/d4c/f7d 346096 0 2026-03-09T16:14:35.977 INFO:tasks.workunit.client.0.vm03.stdout:4/520: creat d5/d17/f9c x:0 0 0 2026-03-09T16:14:35.977 INFO:tasks.workunit.client.0.vm03.stdout:0/523: link d0/d7/d3e/d45/f5e d0/da/d5c/fae 0 2026-03-09T16:14:35.979 INFO:tasks.workunit.client.0.vm03.stdout:7/472: symlink d4/dc/l98 0 2026-03-09T16:14:35.981 INFO:tasks.workunit.client.0.vm03.stdout:7/473: fsync d4/da/d18/d22/d24/d16/d6e/f73 0 2026-03-09T16:14:35.981 INFO:tasks.workunit.client.0.vm03.stdout:4/521: write d5/dd/d1f/f58 [3972483,128189] 0 2026-03-09T16:14:35.982 INFO:tasks.workunit.client.0.vm03.stdout:5/589: dwrite d2/d7/de/d11/d19/d31/f7e [0,4194304] 0 2026-03-09T16:14:35.990 INFO:tasks.workunit.client.0.vm03.stdout:1/418: mknod d4/d6/d1d/d20/d93/c94 0 2026-03-09T16:14:35.992 
INFO:tasks.workunit.client.0.vm03.stdout:3/501: unlink d5/d44/f82 0 2026-03-09T16:14:35.993 INFO:tasks.workunit.client.0.vm03.stdout:8/548: dwrite da/d6c/d7a/f91 [0,4194304] 0 2026-03-09T16:14:35.993 INFO:tasks.workunit.client.0.vm03.stdout:5/590: read d2/d7/d8/d16/d5c/fb5 [1927640,96315] 0 2026-03-09T16:14:35.995 INFO:tasks.workunit.client.0.vm03.stdout:8/549: fdatasync da/d32/f61 0 2026-03-09T16:14:35.996 INFO:tasks.workunit.client.0.vm03.stdout:5/591: chown d2/d7/d1a/f6e 430291 1 2026-03-09T16:14:36.010 INFO:tasks.workunit.client.0.vm03.stdout:7/474: write d4/da/f42 [2122368,123376] 0 2026-03-09T16:14:36.011 INFO:tasks.workunit.client.0.vm03.stdout:7/475: stat d4/da/d18/d22/d24/d15/c92 0 2026-03-09T16:14:36.011 INFO:tasks.workunit.client.0.vm03.stdout:7/476: readlink d4/da/d18/d22/l65 0 2026-03-09T16:14:36.014 INFO:tasks.workunit.client.0.vm03.stdout:6/507: write d9/d22/f4e [505830,48316] 0 2026-03-09T16:14:36.015 INFO:tasks.workunit.client.0.vm03.stdout:0/524: read d0/f3 [3398963,115204] 0 2026-03-09T16:14:36.015 INFO:tasks.workunit.client.0.vm03.stdout:6/508: dread - d9/d42/d45/d47/f8c zero size 2026-03-09T16:14:36.019 INFO:tasks.workunit.client.0.vm03.stdout:2/530: dwrite db/d12/d2a/d61/d79/d83/d64/f80 [0,4194304] 0 2026-03-09T16:14:36.028 INFO:tasks.workunit.client.0.vm03.stdout:1/419: creat d4/d6/d3b/f95 x:0 0 0 2026-03-09T16:14:36.031 INFO:tasks.workunit.client.0.vm03.stdout:3/502: mknod d5/d1e/d42/d55/c95 0 2026-03-09T16:14:36.040 INFO:tasks.workunit.client.0.vm03.stdout:0/525: dread - d0/d7/d3e/d57/d5a/d47/f7b zero size 2026-03-09T16:14:36.041 INFO:tasks.workunit.client.0.vm03.stdout:0/526: readlink d0/d7/d3e/d57/d5a/d74/l91 0 2026-03-09T16:14:36.042 INFO:tasks.workunit.client.0.vm03.stdout:0/527: stat d0/d7/d3e/d57/d5a/d52/l7c 0 2026-03-09T16:14:36.043 INFO:tasks.workunit.client.0.vm03.stdout:4/522: creat d5/db/d25/d31/d4d/d5b/d7d/f9d x:0 0 0 2026-03-09T16:14:36.044 INFO:tasks.workunit.client.0.vm03.stdout:2/531: chown db/d12/d2a/d61/la6 370963 1 2026-03-09T16:14:36.044 INFO:tasks.workunit.client.0.vm03.stdout:2/532: stat db/d12/f85 0 2026-03-09T16:14:36.045 INFO:tasks.workunit.client.0.vm03.stdout:4/523: chown d5/db/d25/d31/d4d/d5b/d72/d82/f8e 3 1 2026-03-09T16:14:36.045 INFO:tasks.workunit.client.0.vm03.stdout:3/503: rmdir d5/d1e/d42/d8b 39 2026-03-09T16:14:36.049 INFO:tasks.workunit.client.0.vm03.stdout:7/477: dread d4/da/f5f [0,4194304] 0 2026-03-09T16:14:36.057 INFO:tasks.workunit.client.0.vm03.stdout:9/561: getdents d2/d4/d11/d29/d2a/d46 0 2026-03-09T16:14:36.057 INFO:tasks.workunit.client.0.vm03.stdout:1/420: symlink d4/d6/d1d/l96 0 2026-03-09T16:14:36.058 INFO:tasks.workunit.client.0.vm03.stdout:2/533: mkdir db/d12/d2a/d61/dbe 0 2026-03-09T16:14:36.059 INFO:tasks.workunit.client.0.vm03.stdout:4/524: mkdir d5/db/d25/d31/d4d/d5b/d72/d82/d9e 0 2026-03-09T16:14:36.060 INFO:tasks.workunit.client.0.vm03.stdout:4/525: write d5/d17/f83 [202341,128516] 0 2026-03-09T16:14:36.064 INFO:tasks.workunit.client.0.vm03.stdout:7/478: sync 2026-03-09T16:14:36.066 INFO:tasks.workunit.client.0.vm03.stdout:6/509: dread d9/d22/f37 [0,4194304] 0 2026-03-09T16:14:36.069 INFO:tasks.workunit.client.0.vm03.stdout:6/510: dread d9/d42/d45/d47/d97/f99 [0,4194304] 0 2026-03-09T16:14:36.076 INFO:tasks.workunit.client.0.vm03.stdout:8/550: rename da/d10/d28/d4f/d68/f8a to da/d10/d28/d4f/d68/fb3 0 2026-03-09T16:14:36.077 INFO:tasks.workunit.client.0.vm03.stdout:8/551: chown da/d32/f4d 202 1 2026-03-09T16:14:36.077 INFO:tasks.workunit.client.0.vm03.stdout:8/552: truncate da/d10/fa4 572697 0 
2026-03-09T16:14:36.078 INFO:tasks.workunit.client.0.vm03.stdout:5/592: link d2/d7/de/d54/c9c d2/d7/de/d54/dce/cd2 0 2026-03-09T16:14:36.089 INFO:tasks.workunit.client.0.vm03.stdout:5/593: dread d2/d7/de/f78 [0,4194304] 0 2026-03-09T16:14:36.093 INFO:tasks.workunit.client.0.vm03.stdout:0/528: mknod d0/d7/d3e/d57/d5a/d52/d9f/caf 0 2026-03-09T16:14:36.094 INFO:tasks.workunit.client.0.vm03.stdout:0/529: write d0/d7/d48/f2e [8067911,71441] 0 2026-03-09T16:14:36.103 INFO:tasks.workunit.client.0.vm03.stdout:9/562: truncate d2/d4/d1f/f23 5035259 0 2026-03-09T16:14:36.103 INFO:tasks.workunit.client.0.vm03.stdout:9/563: dread - d2/d4/d11/d12/f9a zero size 2026-03-09T16:14:36.111 INFO:tasks.workunit.client.0.vm03.stdout:0/530: dread d0/f3 [0,4194304] 0 2026-03-09T16:14:36.113 INFO:tasks.workunit.client.0.vm03.stdout:2/534: dread - db/d12/d2a/d61/d79/d83/d64/f66 zero size 2026-03-09T16:14:36.117 INFO:tasks.workunit.client.0.vm03.stdout:3/504: write d5/d1e/d42/f1d [2002020,27728] 0 2026-03-09T16:14:36.130 INFO:tasks.workunit.client.0.vm03.stdout:3/505: dread d5/d1e/d42/f29 [0,4194304] 0 2026-03-09T16:14:36.135 INFO:tasks.workunit.client.0.vm03.stdout:4/526: mkdir d5/db/d25/d9f 0 2026-03-09T16:14:36.139 INFO:tasks.workunit.client.0.vm03.stdout:4/527: dwrite d5/db/d25/d31/d33/d79/f4f [0,4194304] 0 2026-03-09T16:14:36.180 INFO:tasks.workunit.client.0.vm03.stdout:5/594: creat d2/d7/de/d11/d19/d31/d35/fd3 x:0 0 0 2026-03-09T16:14:36.186 INFO:tasks.workunit.client.0.vm03.stdout:6/511: write d9/d42/d45/d50/d80/d90/f64 [791566,55813] 0 2026-03-09T16:14:36.188 INFO:tasks.workunit.client.0.vm03.stdout:6/512: fdatasync d9/d22/f4e 0 2026-03-09T16:14:36.191 INFO:tasks.workunit.client.0.vm03.stdout:6/513: read d9/d22/f83 [34098,31878] 0 2026-03-09T16:14:36.191 INFO:tasks.workunit.client.0.vm03.stdout:1/421: write d4/d6/d3b/f35 [4040425,81198] 0 2026-03-09T16:14:36.194 INFO:tasks.workunit.client.0.vm03.stdout:5/595: dread d2/d7/d1a/f9f [4194304,4194304] 0 2026-03-09T16:14:36.194 INFO:tasks.workunit.client.0.vm03.stdout:5/596: stat d2/d7/de/d11/d19/d31 0 2026-03-09T16:14:36.198 INFO:tasks.workunit.client.0.vm03.stdout:9/564: fdatasync d2/df/f64 0 2026-03-09T16:14:36.202 INFO:tasks.workunit.client.0.vm03.stdout:9/565: dwrite d2/d54/f90 [0,4194304] 0 2026-03-09T16:14:36.220 INFO:tasks.workunit.client.0.vm03.stdout:3/506: stat d5/l21 0 2026-03-09T16:14:36.230 INFO:tasks.workunit.client.0.vm03.stdout:7/479: symlink d4/dc/l99 0 2026-03-09T16:14:36.234 INFO:tasks.workunit.client.0.vm03.stdout:4/528: mkdir d5/d17/da0 0 2026-03-09T16:14:36.235 INFO:tasks.workunit.client.0.vm03.stdout:4/529: chown d5/f9 31799276 1 2026-03-09T16:14:36.235 INFO:tasks.workunit.client.0.vm03.stdout:4/530: fdatasync d5/db/d25/d31/d33/d79/f4f 0 2026-03-09T16:14:36.264 INFO:tasks.workunit.client.0.vm03.stdout:5/597: creat d2/fd4 x:0 0 0 2026-03-09T16:14:36.264 INFO:tasks.workunit.client.0.vm03.stdout:6/514: dread d9/d14/f29 [4194304,4194304] 0 2026-03-09T16:14:36.280 INFO:tasks.workunit.client.0.vm03.stdout:2/535: getdents db/d12/da5/daf 0 2026-03-09T16:14:36.282 INFO:tasks.workunit.client.0.vm03.stdout:4/531: stat d5/db/d25/c9b 0 2026-03-09T16:14:36.282 INFO:tasks.workunit.client.0.vm03.stdout:8/553: link c4 da/d32/dad/cb4 0 2026-03-09T16:14:36.285 INFO:tasks.workunit.client.0.vm03.stdout:1/422: getdents d4/d39/d70 0 2026-03-09T16:14:36.288 INFO:tasks.workunit.client.0.vm03.stdout:5/598: read d2/d7/de/f78 [2979588,126066] 0 2026-03-09T16:14:36.292 INFO:tasks.workunit.client.0.vm03.stdout:1/423: dwrite d4/d7b/f90 [0,4194304] 0 
2026-03-09T16:14:36.301 INFO:tasks.workunit.client.0.vm03.stdout:6/515: rmdir d9/d42/d45/d50/d80 39 2026-03-09T16:14:36.326 INFO:tasks.workunit.client.0.vm03.stdout:0/531: link d0/d7/d3e/d57/d5a/d74/f7e d0/da/fb0 0 2026-03-09T16:14:36.326 INFO:tasks.workunit.client.0.vm03.stdout:9/566: creat d2/d54/d7d/d8f/dad/fae x:0 0 0 2026-03-09T16:14:36.334 INFO:tasks.workunit.client.0.vm03.stdout:2/536: mknod db/d12/d2a/d61/d6d/d8c/d94/da4/cbf 0 2026-03-09T16:14:36.334 INFO:tasks.workunit.client.0.vm03.stdout:2/537: write db/f2d [245633,23769] 0 2026-03-09T16:14:36.349 INFO:tasks.workunit.client.0.vm03.stdout:4/532: rename d5/d17/f18 to d5/db/d25/d31/fa1 0 2026-03-09T16:14:36.349 INFO:tasks.workunit.client.0.vm03.stdout:4/533: stat d5/d17/f80 0 2026-03-09T16:14:36.352 INFO:tasks.workunit.client.0.vm03.stdout:8/554: unlink da/d10/c39 0 2026-03-09T16:14:36.366 INFO:tasks.workunit.client.0.vm03.stdout:6/516: dwrite d9/d22/f83 [0,4194304] 0 2026-03-09T16:14:36.374 INFO:tasks.workunit.client.0.vm03.stdout:5/599: dread d2/d7/de/d11/d19/d31/d35/d87/f8d [0,4194304] 0 2026-03-09T16:14:36.374 INFO:tasks.workunit.client.0.vm03.stdout:5/600: write d2/d7/de/faa [2774997,2588] 0 2026-03-09T16:14:36.390 INFO:tasks.workunit.client.0.vm03.stdout:6/517: dread d9/d42/d45/d50/f51 [0,4194304] 0 2026-03-09T16:14:36.394 INFO:tasks.workunit.client.0.vm03.stdout:0/532: unlink d0/d7/d3e/d57/faa 0 2026-03-09T16:14:36.405 INFO:tasks.workunit.client.0.vm03.stdout:9/567: creat d2/d54/d7d/faf x:0 0 0 2026-03-09T16:14:36.407 INFO:tasks.workunit.client.0.vm03.stdout:9/568: dread d2/de/d88/f6f [0,4194304] 0 2026-03-09T16:14:36.424 INFO:tasks.workunit.client.0.vm03.stdout:2/538: write db/d12/d2a/d61/d6d/f81 [315739,32956] 0 2026-03-09T16:14:36.457 INFO:tasks.workunit.client.0.vm03.stdout:4/534: symlink d5/db/d25/d31/d4d/d5b/d72/la2 0 2026-03-09T16:14:36.462 INFO:tasks.workunit.client.0.vm03.stdout:1/424: creat d4/d39/d70/f97 x:0 0 0 2026-03-09T16:14:36.467 INFO:tasks.workunit.client.0.vm03.stdout:1/425: truncate d4/d6/d1d/d20/d5f/f57 992968 0 2026-03-09T16:14:36.468 INFO:tasks.workunit.client.0.vm03.stdout:1/426: stat d4/db/d8b 0 2026-03-09T16:14:36.481 INFO:tasks.workunit.client.0.vm03.stdout:3/507: getdents d5/d53/d6c/d79 0 2026-03-09T16:14:36.502 INFO:tasks.workunit.client.0.vm03.stdout:0/533: mknod d0/d7/d3e/d57/d5a/d82/cb1 0 2026-03-09T16:14:36.507 INFO:tasks.workunit.client.0.vm03.stdout:7/480: link d4/dc/c74 d4/da/d18/d22/d24/c9a 0 2026-03-09T16:14:36.509 INFO:tasks.workunit.client.0.vm03.stdout:5/601: dwrite d2/d7/de/d11/d38/d3b/fa2 [0,4194304] 0 2026-03-09T16:14:36.517 INFO:tasks.workunit.client.0.vm03.stdout:9/569: chown d2/d4/d1f/f44 1 1 2026-03-09T16:14:36.517 INFO:tasks.workunit.client.0.vm03.stdout:9/570: readlink d2/df/l16 0 2026-03-09T16:14:36.524 INFO:tasks.workunit.client.0.vm03.stdout:2/539: dwrite db/d12/d2a/d61/d6d/f91 [0,4194304] 0 2026-03-09T16:14:36.529 INFO:tasks.workunit.client.0.vm03.stdout:1/427: creat d4/d6/d3b/f98 x:0 0 0 2026-03-09T16:14:36.544 INFO:tasks.workunit.client.0.vm03.stdout:0/534: rename d0/d7/d3e/d45 to d0/d7/d3e/d57/d5a/d5f/db2 0 2026-03-09T16:14:36.553 INFO:tasks.workunit.client.0.vm03.stdout:5/602: sync 2026-03-09T16:14:36.553 INFO:tasks.workunit.client.0.vm03.stdout:1/428: mknod d4/d39/c99 0 2026-03-09T16:14:36.554 INFO:tasks.workunit.client.0.vm03.stdout:5/603: fsync d2/d7/d1a/d1c/f5e 0 2026-03-09T16:14:36.558 INFO:tasks.workunit.client.0.vm03.stdout:6/518: creat d9/d42/f9a x:0 0 0 2026-03-09T16:14:36.558 INFO:tasks.workunit.client.0.vm03.stdout:8/555: getdents da/db/d30 0 
2026-03-09T16:14:36.559 INFO:tasks.workunit.client.0.vm03.stdout:4/535: getdents d5/db 0 2026-03-09T16:14:36.560 INFO:tasks.workunit.client.0.vm03.stdout:4/536: write d5/db/d25/d31/d4d/d5b/d72/f94 [904977,10495] 0 2026-03-09T16:14:36.561 INFO:tasks.workunit.client.0.vm03.stdout:5/604: dread d2/d7/d1a/f6e [0,4194304] 0 2026-03-09T16:14:36.562 INFO:tasks.workunit.client.0.vm03.stdout:1/429: chown d4/db/d59/l5b 0 1 2026-03-09T16:14:36.565 INFO:tasks.workunit.client.0.vm03.stdout:3/508: creat d5/d53/f96 x:0 0 0 2026-03-09T16:14:36.568 INFO:tasks.workunit.client.0.vm03.stdout:7/481: creat d4/da/d5d/f9b x:0 0 0 2026-03-09T16:14:36.573 INFO:tasks.workunit.client.0.vm03.stdout:0/535: creat d0/d7/d3e/d57/d5a/d5f/db2/dab/fb3 x:0 0 0 2026-03-09T16:14:36.573 INFO:tasks.workunit.client.0.vm03.stdout:7/482: write d4/d2d/f32 [4906544,70668] 0 2026-03-09T16:14:36.574 INFO:tasks.workunit.client.0.vm03.stdout:0/536: readlink d0/d7/d3e/d57/d5a/d52/l7c 0 2026-03-09T16:14:36.575 INFO:tasks.workunit.client.0.vm03.stdout:6/519: chown d9/c30 61 1 2026-03-09T16:14:36.576 INFO:tasks.workunit.client.0.vm03.stdout:0/537: dread d0/da/d5c/fae [0,4194304] 0 2026-03-09T16:14:36.580 INFO:tasks.workunit.client.0.vm03.stdout:8/556: mkdir da/d32/db5 0 2026-03-09T16:14:36.583 INFO:tasks.workunit.client.0.vm03.stdout:8/557: sync 2026-03-09T16:14:36.592 INFO:tasks.workunit.client.0.vm03.stdout:4/537: creat d5/db/d25/d8b/fa3 x:0 0 0 2026-03-09T16:14:36.592 INFO:tasks.workunit.client.0.vm03.stdout:2/540: creat db/d12/d2a/d61/d79/d83/d64/fc0 x:0 0 0 2026-03-09T16:14:36.592 INFO:tasks.workunit.client.0.vm03.stdout:5/605: mknod d2/d7/cd5 0 2026-03-09T16:14:36.592 INFO:tasks.workunit.client.0.vm03.stdout:3/509: write d5/d1e/d42/f29 [764522,12704] 0 2026-03-09T16:14:36.593 INFO:tasks.workunit.client.0.vm03.stdout:5/606: sync 2026-03-09T16:14:36.596 INFO:tasks.workunit.client.0.vm03.stdout:9/571: dwrite d2/d4/d11/d12/f35 [4194304,4194304] 0 2026-03-09T16:14:36.602 INFO:tasks.workunit.client.0.vm03.stdout:6/520: mknod d9/d84/c9b 0 2026-03-09T16:14:36.602 INFO:tasks.workunit.client.0.vm03.stdout:1/430: dread d4/d6/d1d/d20/d23/f62 [0,4194304] 0 2026-03-09T16:14:36.603 INFO:tasks.workunit.client.0.vm03.stdout:1/431: fsync d4/d6/d1d/d20/f72 0 2026-03-09T16:14:36.608 INFO:tasks.workunit.client.0.vm03.stdout:0/538: symlink d0/d7/d3e/d57/d5a/d5f/db2/dab/lb4 0 2026-03-09T16:14:36.610 INFO:tasks.workunit.client.0.vm03.stdout:2/541: dwrite db/d12/d2a/f5f [0,4194304] 0 2026-03-09T16:14:36.632 INFO:tasks.workunit.client.0.vm03.stdout:8/558: rename da/db/d30/f62 to da/d32/d79/d95/fb6 0 2026-03-09T16:14:36.632 INFO:tasks.workunit.client.0.vm03.stdout:4/538: unlink d5/dd/d1f/f48 0 2026-03-09T16:14:36.632 INFO:tasks.workunit.client.0.vm03.stdout:7/483: unlink d4/da/d18/d22/d24/c57 0 2026-03-09T16:14:36.634 INFO:tasks.workunit.client.0.vm03.stdout:8/559: read da/db/f6a [6680754,42621] 0 2026-03-09T16:14:36.636 INFO:tasks.workunit.client.0.vm03.stdout:5/607: symlink d2/d7/de/d11/d38/ld6 0 2026-03-09T16:14:36.636 INFO:tasks.workunit.client.0.vm03.stdout:8/560: dread - da/d10/d28/d4f/d68/fa7 zero size 2026-03-09T16:14:36.639 INFO:tasks.workunit.client.0.vm03.stdout:8/561: chown da/d10/d28/d64/c78 5 1 2026-03-09T16:14:36.645 INFO:tasks.workunit.client.0.vm03.stdout:9/572: dread - d2/df/d89/f9b zero size 2026-03-09T16:14:36.646 INFO:tasks.workunit.client.0.vm03.stdout:9/573: truncate d2/d4/d11/f66 4611074 0 2026-03-09T16:14:36.650 INFO:tasks.workunit.client.0.vm03.stdout:1/432: rmdir d4 39 2026-03-09T16:14:36.651 
INFO:tasks.workunit.client.0.vm03.stdout:6/521: chown d9/d14/l39 390 1 2026-03-09T16:14:36.651 INFO:tasks.workunit.client.0.vm03.stdout:2/542: creat db/d12/d2a/d61/d79/fc1 x:0 0 0 2026-03-09T16:14:36.657 INFO:tasks.workunit.client.0.vm03.stdout:0/539: rename d0/d7/d3e/d57/d5a/d52/c6a to d0/da/d1b/d9b/cb5 0 2026-03-09T16:14:36.657 INFO:tasks.workunit.client.0.vm03.stdout:0/540: readlink d0/da/d5c/l49 0 2026-03-09T16:14:36.660 INFO:tasks.workunit.client.0.vm03.stdout:7/484: mknod d4/da/d45/d51/d36/c9c 0 2026-03-09T16:14:36.660 INFO:tasks.workunit.client.0.vm03.stdout:4/539: fdatasync d5/d17/d44/f61 0 2026-03-09T16:14:36.662 INFO:tasks.workunit.client.0.vm03.stdout:4/540: write d5/db/d25/d31/d4d/f85 [4044531,89815] 0 2026-03-09T16:14:36.663 INFO:tasks.workunit.client.0.vm03.stdout:6/522: dwrite d9/d22/f83 [0,4194304] 0 2026-03-09T16:14:36.679 INFO:tasks.workunit.client.0.vm03.stdout:3/510: write d5/d1e/d42/d55/f7e [152923,38950] 0 2026-03-09T16:14:36.682 INFO:tasks.workunit.client.0.vm03.stdout:9/574: mknod d2/d4/d11/d12/d28/cb0 0 2026-03-09T16:14:36.684 INFO:tasks.workunit.client.0.vm03.stdout:2/543: mkdir db/d12/da5/dc2 0 2026-03-09T16:14:36.684 INFO:tasks.workunit.client.0.vm03.stdout:1/433: write d4/d6/d1d/f66 [5004390,19012] 0 2026-03-09T16:14:36.699 INFO:tasks.workunit.client.0.vm03.stdout:1/434: dread d4/f6d [0,4194304] 0 2026-03-09T16:14:36.699 INFO:tasks.workunit.client.0.vm03.stdout:2/544: dread db/d12/f77 [0,4194304] 0 2026-03-09T16:14:36.707 INFO:tasks.workunit.client.0.vm03.stdout:4/541: rename d5/dd/d1f/f70 to d5/d17/d44/fa4 0 2026-03-09T16:14:36.712 INFO:tasks.workunit.client.0.vm03.stdout:8/562: truncate da/d1d/f4a 219873 0 2026-03-09T16:14:36.713 INFO:tasks.workunit.client.0.vm03.stdout:9/575: symlink d2/d4/d11/d12/lb1 0 2026-03-09T16:14:36.719 INFO:tasks.workunit.client.0.vm03.stdout:1/435: mknod d4/d31/d5c/c9a 0 2026-03-09T16:14:36.721 INFO:tasks.workunit.client.0.vm03.stdout:5/608: rmdir d2/d7/d8/da0 0 2026-03-09T16:14:36.730 INFO:tasks.workunit.client.0.vm03.stdout:1/436: dwrite d4/d6/d3b/d63/f82 [0,4194304] 0 2026-03-09T16:14:36.731 INFO:tasks.workunit.client.0.vm03.stdout:1/437: fsync d4/f6d 0 2026-03-09T16:14:36.742 INFO:tasks.workunit.client.0.vm03.stdout:7/485: fdatasync d4/da/d45/d51/f91 0 2026-03-09T16:14:36.742 INFO:tasks.workunit.client.0.vm03.stdout:7/486: stat d4/da/d45/d51/f50 0 2026-03-09T16:14:36.746 INFO:tasks.workunit.client.0.vm03.stdout:6/523: rename d9/d42/d45/d47 to d9/d42/d45/d50/d80/d8a/d9c 0 2026-03-09T16:14:36.749 INFO:tasks.workunit.client.0.vm03.stdout:6/524: dwrite d9/d14/d71/f95 [0,4194304] 0 2026-03-09T16:14:36.752 INFO:tasks.workunit.client.0.vm03.stdout:4/542: creat d5/db/d25/d31/d33/fa5 x:0 0 0 2026-03-09T16:14:36.753 INFO:tasks.workunit.client.0.vm03.stdout:4/543: write d5/dd/d1f/d5f/f71 [539232,83193] 0 2026-03-09T16:14:36.771 INFO:tasks.workunit.client.0.vm03.stdout:0/541: truncate d0/d7/d3e/d95/f99 443075 0 2026-03-09T16:14:36.789 INFO:tasks.workunit.client.0.vm03.stdout:1/438: creat d4/d6/d3b/f9b x:0 0 0 2026-03-09T16:14:36.793 INFO:tasks.workunit.client.0.vm03.stdout:2/545: mkdir db/d12/da5/dbb/dc3 0 2026-03-09T16:14:36.800 INFO:tasks.workunit.client.0.vm03.stdout:9/576: write d2/d4/f3e [1290265,26112] 0 2026-03-09T16:14:36.803 INFO:tasks.workunit.client.0.vm03.stdout:9/577: dwrite d2/df/f9f [0,4194304] 0 2026-03-09T16:14:36.817 INFO:tasks.workunit.client.0.vm03.stdout:4/544: creat d5/db/d25/d31/d33/d79/fa6 x:0 0 0 2026-03-09T16:14:36.829 INFO:tasks.workunit.client.0.vm03.stdout:0/542: rename d0/da/d1b/d79 to d0/da/d5c/db6 0 
2026-03-09T16:14:36.830 INFO:tasks.workunit.client.0.vm03.stdout:0/543: dread - d0/f54 zero size 2026-03-09T16:14:36.831 INFO:tasks.workunit.client.0.vm03.stdout:0/544: dread - d0/d7/d3e/d57/d5a/d47/f7b zero size 2026-03-09T16:14:36.831 INFO:tasks.workunit.client.0.vm03.stdout:0/545: stat d0/da/c1f 0 2026-03-09T16:14:36.839 INFO:tasks.workunit.client.0.vm03.stdout:8/563: link da/d10/d28/f8c da/d10/d28/d4f/daf/fb7 0 2026-03-09T16:14:36.851 INFO:tasks.workunit.client.0.vm03.stdout:1/439: symlink d4/d7b/l9c 0 2026-03-09T16:14:36.860 INFO:tasks.workunit.client.0.vm03.stdout:7/487: mkdir d4/dc/d9d 0 2026-03-09T16:14:36.868 INFO:tasks.workunit.client.0.vm03.stdout:4/545: rename d5/db/d25/d31/d33/d79/f49 to d5/db/d25/d31/d4d/d5b/d72/d82/fa7 0 2026-03-09T16:14:36.871 INFO:tasks.workunit.client.0.vm03.stdout:3/511: getdents d5/d1e/d42/d34/d70 0 2026-03-09T16:14:36.873 INFO:tasks.workunit.client.0.vm03.stdout:4/546: dwrite d5/dd/f23 [0,4194304] 0 2026-03-09T16:14:36.880 INFO:tasks.workunit.client.0.vm03.stdout:0/546: symlink d0/d7/d3e/d57/d5a/d74/lb7 0 2026-03-09T16:14:36.888 INFO:tasks.workunit.client.0.vm03.stdout:5/609: link d2/d7/de/d11/d19/d31/d35/d87/laf d2/d7/de/d11/d19/d31/d35/ld7 0 2026-03-09T16:14:36.891 INFO:tasks.workunit.client.0.vm03.stdout:1/440: dread - d4/d6/d1d/d20/d93/f85 zero size 2026-03-09T16:14:36.896 INFO:tasks.workunit.client.0.vm03.stdout:1/441: dwrite d4/d7b/f90 [0,4194304] 0 2026-03-09T16:14:36.897 INFO:tasks.workunit.client.0.vm03.stdout:9/578: dwrite d2/d4/d11/f6c [0,4194304] 0 2026-03-09T16:14:36.898 INFO:tasks.workunit.client.0.vm03.stdout:1/442: write d4/d39/d70/f97 [695581,84721] 0 2026-03-09T16:14:36.912 INFO:tasks.workunit.client.0.vm03.stdout:7/488: unlink d4/da/c93 0 2026-03-09T16:14:36.915 INFO:tasks.workunit.client.0.vm03.stdout:2/546: truncate db/d12/f85 4181724 0 2026-03-09T16:14:36.918 INFO:tasks.workunit.client.0.vm03.stdout:6/525: creat d9/d42/d45/d50/d80/d8a/d9c/d97/f9d x:0 0 0 2026-03-09T16:14:36.919 INFO:tasks.workunit.client.0.vm03.stdout:6/526: truncate d9/d42/d45/d50/d80/d8a/d9c/f93 163606 0 2026-03-09T16:14:36.920 INFO:tasks.workunit.client.0.vm03.stdout:6/527: fdatasync d9/d42/d45/d50/d80/d90/d66/f81 0 2026-03-09T16:14:36.922 INFO:tasks.workunit.client.0.vm03.stdout:3/512: read d5/d1e/d42/f25 [146041,123432] 0 2026-03-09T16:14:36.922 INFO:tasks.workunit.client.0.vm03.stdout:3/513: chown d5/c2f 3702 1 2026-03-09T16:14:36.923 INFO:tasks.workunit.client.0.vm03.stdout:4/547: rename d5/db/d25/d31/d33/d55 to d5/db/d25/d8b/da8 0 2026-03-09T16:14:36.924 INFO:tasks.workunit.client.0.vm03.stdout:4/548: chown d5/d17/d44/c7a 120 1 2026-03-09T16:14:36.924 INFO:tasks.workunit.client.0.vm03.stdout:8/564: mkdir da/db/da8/db8 0 2026-03-09T16:14:36.927 INFO:tasks.workunit.client.0.vm03.stdout:7/489: creat d4/da/d45/d51/f9e x:0 0 0 2026-03-09T16:14:36.936 INFO:tasks.workunit.client.0.vm03.stdout:2/547: symlink db/d12/d2a/lc4 0 2026-03-09T16:14:36.936 INFO:tasks.workunit.client.0.vm03.stdout:3/514: write d5/d6d/d5a/d63/f8a [613170,126755] 0 2026-03-09T16:14:36.936 INFO:tasks.workunit.client.0.vm03.stdout:4/549: mkdir d5/db/d25/d31/d4d/da9 0 2026-03-09T16:14:36.936 INFO:tasks.workunit.client.0.vm03.stdout:8/565: unlink da/d1d/l22 0 2026-03-09T16:14:36.936 INFO:tasks.workunit.client.0.vm03.stdout:8/566: chown da/db/d43/l83 891 1 2026-03-09T16:14:36.936 INFO:tasks.workunit.client.0.vm03.stdout:7/490: fsync d4/da/f20 0 2026-03-09T16:14:36.936 INFO:tasks.workunit.client.0.vm03.stdout:0/547: creat d0/d7/d48/fb8 x:0 0 0 2026-03-09T16:14:36.936 
INFO:tasks.workunit.client.0.vm03.stdout:0/548: readlink d0/d7/d3e/d57/d5a/d74/l91 0 2026-03-09T16:14:36.936 INFO:tasks.workunit.client.0.vm03.stdout:3/515: mknod d5/d1e/d42/d8b/c97 0 2026-03-09T16:14:36.938 INFO:tasks.workunit.client.0.vm03.stdout:5/610: creat d2/d7/de/fd8 x:0 0 0 2026-03-09T16:14:36.938 INFO:tasks.workunit.client.0.vm03.stdout:4/550: creat d5/d17/faa x:0 0 0 2026-03-09T16:14:36.941 INFO:tasks.workunit.client.0.vm03.stdout:7/491: rename d4/da/d18/d22/d24/d15/d71/f7d to d4/d2d/d4b/f9f 0 2026-03-09T16:14:36.942 INFO:tasks.workunit.client.0.vm03.stdout:4/551: dwrite d5/dd/d1f/f5e [4194304,4194304] 0 2026-03-09T16:14:36.943 INFO:tasks.workunit.client.0.vm03.stdout:0/549: symlink d0/d7/d3e/d57/d5a/d5f/db2/dab/lb9 0 2026-03-09T16:14:36.945 INFO:tasks.workunit.client.0.vm03.stdout:9/579: getdents d2/d54/d7d/d8f/dad 0 2026-03-09T16:14:36.946 INFO:tasks.workunit.client.0.vm03.stdout:5/611: rename d2/d7/d8/d24/d27/ca8 to d2/d7/de/d11/d19/dbb/cd9 0 2026-03-09T16:14:36.946 INFO:tasks.workunit.client.0.vm03.stdout:4/552: dread d5/f9 [0,4194304] 0 2026-03-09T16:14:36.948 INFO:tasks.workunit.client.0.vm03.stdout:5/612: chown d2/d7/de/d11/f26 260341 1 2026-03-09T16:14:36.948 INFO:tasks.workunit.client.0.vm03.stdout:1/443: sync 2026-03-09T16:14:36.949 INFO:tasks.workunit.client.0.vm03.stdout:4/553: chown d5/d17/d44 715 1 2026-03-09T16:14:36.957 INFO:tasks.workunit.client.0.vm03.stdout:0/550: mkdir d0/d7/d3e/d57/d5a/d5f/db2/d8e/dba 0 2026-03-09T16:14:36.958 INFO:tasks.workunit.client.0.vm03.stdout:0/551: chown d0/d7/d3e/d57/d5a/d5f 2087228 1 2026-03-09T16:14:36.964 INFO:tasks.workunit.client.0.vm03.stdout:9/580: mkdir d2/d4/d11/d12/db2 0 2026-03-09T16:14:36.970 INFO:tasks.workunit.client.0.vm03.stdout:7/492: mknod d4/da/d45/d51/d36/d66/ca0 0 2026-03-09T16:14:36.970 INFO:tasks.workunit.client.0.vm03.stdout:1/444: creat d4/db/d59/f9d x:0 0 0 2026-03-09T16:14:36.970 INFO:tasks.workunit.client.0.vm03.stdout:7/493: write d4/d2d/f95 [96701,63752] 0 2026-03-09T16:14:36.970 INFO:tasks.workunit.client.0.vm03.stdout:4/554: rename d5/dd/d1f/f60 to d5/db/d25/d31/d4d/da9/fab 0 2026-03-09T16:14:36.976 INFO:tasks.workunit.client.0.vm03.stdout:1/445: dwrite d4/d6/d1d/d20/d93/f48 [0,4194304] 0 2026-03-09T16:14:36.996 INFO:tasks.workunit.client.0.vm03.stdout:9/581: dread d2/df/d84/fa0 [0,4194304] 0 2026-03-09T16:14:36.997 INFO:tasks.workunit.client.0.vm03.stdout:0/552: dread d0/da/d7a/d98/f9d [0,4194304] 0 2026-03-09T16:14:37.007 INFO:tasks.workunit.client.0.vm03.stdout:4/555: dread d5/d17/d44/fa4 [0,4194304] 0 2026-03-09T16:14:37.015 INFO:tasks.workunit.client.0.vm03.stdout:6/528: dwrite d9/d14/f29 [4194304,4194304] 0 2026-03-09T16:14:37.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:36 vm05.local ceph-mon[58702]: pgmap v10: 65 pgs: 65 active+clean; 951 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 17 MiB/s rd, 72 MiB/s wr, 189 op/s 2026-03-09T16:14:37.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:36 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:37.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:36 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:37.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:36 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:37.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:36 vm05.local ceph-mon[58702]: 
from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:14:37.026 INFO:tasks.workunit.client.0.vm03.stdout:3/516: truncate d5/d1e/d42/f84 851671 0 2026-03-09T16:14:37.029 INFO:tasks.workunit.client.0.vm03.stdout:3/517: dwrite d5/d6d/d5a/d63/f8a [0,4194304] 0 2026-03-09T16:14:37.037 INFO:tasks.workunit.client.0.vm03.stdout:7/494: creat d4/da/d18/d22/d24/d16/d6e/fa1 x:0 0 0 2026-03-09T16:14:37.049 INFO:tasks.workunit.client.0.vm03.stdout:3/518: write d5/d1e/d42/f74 [1760309,49404] 0 2026-03-09T16:14:37.054 INFO:tasks.workunit.client.0.vm03.stdout:8/567: dwrite da/d10/d28/f8b [0,4194304] 0 2026-03-09T16:14:37.059 INFO:tasks.workunit.client.0.vm03.stdout:1/446: creat d4/d31/d5c/f9e x:0 0 0 2026-03-09T16:14:37.059 INFO:tasks.workunit.client.0.vm03.stdout:5/613: write d2/d7/de/d11/d38/d3b/f68 [652904,38978] 0 2026-03-09T16:14:37.059 INFO:tasks.workunit.client.0.vm03.stdout:8/568: fsync da/d10/d28/fb0 0 2026-03-09T16:14:37.063 INFO:tasks.workunit.client.0.vm03.stdout:4/556: creat d5/db/d25/d31/d4d/d5b/d9a/fac x:0 0 0 2026-03-09T16:14:37.063 INFO:tasks.workunit.client.0.vm03.stdout:4/557: truncate d5/f74 411020 0 2026-03-09T16:14:37.067 INFO:tasks.workunit.client.0.vm03.stdout:6/529: dread d9/d14/f29 [0,4194304] 0 2026-03-09T16:14:37.072 INFO:tasks.workunit.client.0.vm03.stdout:6/530: read d9/d14/f29 [430291,27723] 0 2026-03-09T16:14:37.072 INFO:tasks.workunit.client.0.vm03.stdout:4/558: dread d5/d17/f39 [0,4194304] 0 2026-03-09T16:14:37.072 INFO:tasks.workunit.client.0.vm03.stdout:7/495: chown d4/da/c31 3173964 1 2026-03-09T16:14:37.081 INFO:tasks.workunit.client.0.vm03.stdout:5/614: unlink d2/d7/d3c/d3d/fa3 0 2026-03-09T16:14:37.082 INFO:tasks.workunit.client.0.vm03.stdout:8/569: mkdir da/d32/d79/db9 0 2026-03-09T16:14:37.082 INFO:tasks.workunit.client.0.vm03.stdout:9/582: mkdir d2/d4/d11/d29/d2a/db3 0 2026-03-09T16:14:37.082 INFO:tasks.workunit.client.0.vm03.stdout:8/570: chown da/d6c/d7a/c9d 4223 1 2026-03-09T16:14:37.096 INFO:tasks.workunit.client.0.vm03.stdout:6/531: fdatasync d9/d42/d45/f4a 0 2026-03-09T16:14:37.096 INFO:tasks.workunit.client.0.vm03.stdout:6/532: write d9/d22/f43 [471894,105608] 0 2026-03-09T16:14:37.102 INFO:tasks.workunit.client.0.vm03.stdout:0/553: creat d0/d7/fbb x:0 0 0 2026-03-09T16:14:37.103 INFO:tasks.workunit.client.0.vm03.stdout:0/554: fdatasync d0/d7/d48/f43 0 2026-03-09T16:14:37.103 INFO:tasks.workunit.client.0.vm03.stdout:0/555: chown d0/da/d5c/fae 8 1 2026-03-09T16:14:37.114 INFO:tasks.workunit.client.0.vm03.stdout:3/519: rename d5/d6d/l62 to d5/d2e/l98 0 2026-03-09T16:14:37.122 INFO:tasks.workunit.client.0.vm03.stdout:3/520: dwrite d5/d1e/d42/d34/f73 [0,4194304] 0 2026-03-09T16:14:37.131 INFO:tasks.workunit.client.0.vm03.stdout:4/559: creat d5/dd/d1f/d95/fad x:0 0 0 2026-03-09T16:14:37.133 INFO:tasks.workunit.client.0.vm03.stdout:7/496: symlink d4/dc/la2 0 2026-03-09T16:14:37.135 INFO:tasks.workunit.client.0.vm03.stdout:1/447: creat d4/d6/d1d/d20/d23/f9f x:0 0 0 2026-03-09T16:14:37.136 INFO:tasks.workunit.client.0.vm03.stdout:5/615: creat d2/d7/d8/d16/d5c/dcf/fda x:0 0 0 2026-03-09T16:14:37.137 INFO:tasks.workunit.client.0.vm03.stdout:9/583: dwrite d2/d4/d11/d29/d2a/d4d/f56 [0,4194304] 0 2026-03-09T16:14:37.139 INFO:tasks.workunit.client.0.vm03.stdout:2/548: link db/f34 db/d12/d2a/d61/d6d/fc5 0 2026-03-09T16:14:37.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:36 vm03.local ceph-mon[51019]: pgmap v10: 65 pgs: 65 active+clean; 951 MiB data, 3.4 GiB used, 
117 GiB / 120 GiB avail; 17 MiB/s rd, 72 MiB/s wr, 189 op/s 2026-03-09T16:14:37.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:36 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:37.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:36 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:37.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:36 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:37.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:36 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:14:37.144 INFO:tasks.workunit.client.0.vm03.stdout:8/571: dwrite da/db/f75 [0,4194304] 0 2026-03-09T16:14:37.145 INFO:tasks.workunit.client.0.vm03.stdout:2/549: dread db/d12/d2a/f60 [0,4194304] 0 2026-03-09T16:14:37.158 INFO:tasks.workunit.client.0.vm03.stdout:2/550: dwrite db/d12/d2a/d61/d79/f7f [0,4194304] 0 2026-03-09T16:14:37.177 INFO:tasks.workunit.client.0.vm03.stdout:1/448: dread d4/d6/d3b/d6b/d25/f4e [0,4194304] 0 2026-03-09T16:14:37.187 INFO:tasks.workunit.client.0.vm03.stdout:4/560: dread d5/db/d25/d31/d4d/d5b/d72/d82/fa7 [0,4194304] 0 2026-03-09T16:14:37.191 INFO:tasks.workunit.client.0.vm03.stdout:6/533: getdents d9/d8e 0 2026-03-09T16:14:37.198 INFO:tasks.workunit.client.0.vm03.stdout:2/551: fdatasync db/d12/d2a/d61/f74 0 2026-03-09T16:14:37.198 INFO:tasks.workunit.client.0.vm03.stdout:7/497: creat d4/da/d18/d22/d24/fa3 x:0 0 0 2026-03-09T16:14:37.199 INFO:tasks.workunit.client.0.vm03.stdout:7/498: chown d4/dc 1986 1 2026-03-09T16:14:37.199 INFO:tasks.workunit.client.0.vm03.stdout:3/521: creat d5/d1e/d42/f99 x:0 0 0 2026-03-09T16:14:37.200 INFO:tasks.workunit.client.0.vm03.stdout:6/534: rmdir d9/d42/d45 39 2026-03-09T16:14:37.200 INFO:tasks.workunit.client.0.vm03.stdout:7/499: unlink d4/da/d45/f4e 0 2026-03-09T16:14:37.201 INFO:tasks.workunit.client.0.vm03.stdout:6/535: truncate d9/d84/f91 281051 0 2026-03-09T16:14:37.204 INFO:tasks.workunit.client.0.vm03.stdout:1/449: mknod d4/db/ca0 0 2026-03-09T16:14:37.207 INFO:tasks.workunit.client.0.vm03.stdout:6/536: readlink d9/d42/d45/d50/d80/d8a/d9c/l82 0 2026-03-09T16:14:37.208 INFO:tasks.workunit.client.0.vm03.stdout:1/450: mkdir d4/d6/d1d/d20/d23/d3e/da1 0 2026-03-09T16:14:37.209 INFO:tasks.workunit.client.0.vm03.stdout:1/451: dread - d4/d6/d1d/d20/d93/f8c zero size 2026-03-09T16:14:37.221 INFO:tasks.workunit.client.0.vm03.stdout:6/537: dwrite d9/d42/d45/f4d [0,4194304] 0 2026-03-09T16:14:37.223 INFO:tasks.workunit.client.0.vm03.stdout:4/561: sync 2026-03-09T16:14:37.229 INFO:tasks.workunit.client.0.vm03.stdout:0/556: write d0/d7/d75/f69 [2259290,89502] 0 2026-03-09T16:14:37.229 INFO:tasks.workunit.client.0.vm03.stdout:3/522: link d5/ca d5/d53/c9a 0 2026-03-09T16:14:37.230 INFO:tasks.workunit.client.0.vm03.stdout:9/584: write d2/de/f85 [561442,92983] 0 2026-03-09T16:14:37.235 INFO:tasks.workunit.client.0.vm03.stdout:5/616: dwrite d2/d7/de/d11/d19/d31/f99 [0,4194304] 0 2026-03-09T16:14:37.240 INFO:tasks.workunit.client.0.vm03.stdout:8/572: write da/d10/d28/f8c [8920403,55927] 0 2026-03-09T16:14:37.242 INFO:tasks.workunit.client.0.vm03.stdout:7/500: write d4/da/d18/f6a [2756743,45898] 0 2026-03-09T16:14:37.244 INFO:tasks.workunit.client.0.vm03.stdout:2/552: dwrite db/d12/d2a/d61/f5c 
[0,4194304] 0 2026-03-09T16:14:37.264 INFO:tasks.workunit.client.0.vm03.stdout:0/557: chown d0/da/d1b/fd 32230939 1 2026-03-09T16:14:37.265 INFO:tasks.workunit.client.0.vm03.stdout:3/523: rmdir d5/d1e/d42/d34/d70 39 2026-03-09T16:14:37.266 INFO:tasks.workunit.client.0.vm03.stdout:3/524: fsync d5/d44/f54 0 2026-03-09T16:14:37.271 INFO:tasks.workunit.client.0.vm03.stdout:9/585: creat d2/d4/d11/d29/d2a/d38/fb4 x:0 0 0 2026-03-09T16:14:37.272 INFO:tasks.workunit.client.0.vm03.stdout:9/586: fdatasync d2/d4/d11/fa8 0 2026-03-09T16:14:37.275 INFO:tasks.workunit.client.0.vm03.stdout:8/573: creat da/fba x:0 0 0 2026-03-09T16:14:37.276 INFO:tasks.workunit.client.0.vm03.stdout:7/501: creat d4/da/d45/fa4 x:0 0 0 2026-03-09T16:14:37.277 INFO:tasks.workunit.client.0.vm03.stdout:6/538: mknod d9/d42/d45/d65/c9e 0 2026-03-09T16:14:37.279 INFO:tasks.workunit.client.0.vm03.stdout:3/525: creat d5/d1e/f9b x:0 0 0 2026-03-09T16:14:37.279 INFO:tasks.workunit.client.0.vm03.stdout:3/526: fsync d5/d1e/d42/f29 0 2026-03-09T16:14:37.281 INFO:tasks.workunit.client.0.vm03.stdout:0/558: dread d0/d7/d3e/d57/d5a/f4b [0,4194304] 0 2026-03-09T16:14:37.283 INFO:tasks.workunit.client.0.vm03.stdout:3/527: dread d5/d6d/f7a [0,4194304] 0 2026-03-09T16:14:37.283 INFO:tasks.workunit.client.0.vm03.stdout:0/559: dread d0/da/d5c/fae [0,4194304] 0 2026-03-09T16:14:37.291 INFO:tasks.workunit.client.0.vm03.stdout:3/528: dwrite d5/d6d/f7a [0,4194304] 0 2026-03-09T16:14:37.293 INFO:tasks.workunit.client.0.vm03.stdout:9/587: creat d2/df/d84/d8a/fb5 x:0 0 0 2026-03-09T16:14:37.300 INFO:tasks.workunit.client.0.vm03.stdout:2/553: dread db/d12/d2a/f88 [0,4194304] 0 2026-03-09T16:14:37.301 INFO:tasks.workunit.client.0.vm03.stdout:2/554: write db/d12/d2a/d61/d79/fb7 [829027,96856] 0 2026-03-09T16:14:37.303 INFO:tasks.workunit.client.0.vm03.stdout:2/555: chown db/d12/d2a/d61/d6d/f81 458755667 1 2026-03-09T16:14:37.305 INFO:tasks.workunit.client.0.vm03.stdout:7/502: readlink d4/da/l4a 0 2026-03-09T16:14:37.309 INFO:tasks.workunit.client.0.vm03.stdout:6/539: chown d9/d42/c5f 1015 1 2026-03-09T16:14:37.316 INFO:tasks.workunit.client.0.vm03.stdout:1/452: dwrite d4/d6/d1d/d20/d23/f74 [0,4194304] 0 2026-03-09T16:14:37.316 INFO:tasks.workunit.client.0.vm03.stdout:4/562: creat d5/db/d25/d8b/da8/fae x:0 0 0 2026-03-09T16:14:37.318 INFO:tasks.workunit.client.0.vm03.stdout:5/617: dwrite d2/d7/de/d11/f32 [0,4194304] 0 2026-03-09T16:14:37.321 INFO:tasks.workunit.client.0.vm03.stdout:0/560: creat d0/da/d5c/db6/fbc x:0 0 0 2026-03-09T16:14:37.322 INFO:tasks.workunit.client.0.vm03.stdout:0/561: stat d0/d7/d3e/d57 0 2026-03-09T16:14:37.326 INFO:tasks.workunit.client.0.vm03.stdout:3/529: mkdir d5/d6d/d6a/d9c 0 2026-03-09T16:14:37.330 INFO:tasks.workunit.client.0.vm03.stdout:9/588: mkdir d2/d4/d11/d29/d2a/d38/db6 0 2026-03-09T16:14:37.333 INFO:tasks.workunit.client.0.vm03.stdout:9/589: dwrite d2/d4/f17 [0,4194304] 0 2026-03-09T16:14:37.337 INFO:tasks.workunit.client.0.vm03.stdout:9/590: stat d2/df/d89 0 2026-03-09T16:14:37.342 INFO:tasks.workunit.client.0.vm03.stdout:8/574: dwrite da/d1d/f4a [0,4194304] 0 2026-03-09T16:14:37.347 INFO:tasks.workunit.client.0.vm03.stdout:2/556: creat db/d12/d2a/d61/d6d/d8c/d94/fc6 x:0 0 0 2026-03-09T16:14:37.357 INFO:tasks.workunit.client.0.vm03.stdout:8/575: dwrite da/d10/d28/d4f/d68/fa7 [0,4194304] 0 2026-03-09T16:14:37.364 INFO:tasks.workunit.client.0.vm03.stdout:8/576: fsync da/d10/d28/f5c 0 2026-03-09T16:14:37.364 INFO:tasks.workunit.client.0.vm03.stdout:7/503: fsync d4/da/d18/f37 0 2026-03-09T16:14:37.370 
INFO:tasks.workunit.client.0.vm03.stdout:1/453: unlink d4/d31/d5c/f75 0 2026-03-09T16:14:37.375 INFO:tasks.workunit.client.0.vm03.stdout:2/557: sync 2026-03-09T16:14:37.378 INFO:tasks.workunit.client.0.vm03.stdout:4/563: read d5/db/d25/d31/d4d/da9/fab [559653,13878] 0 2026-03-09T16:14:37.381 INFO:tasks.workunit.client.0.vm03.stdout:5/618: creat d2/d7/d3c/fdb x:0 0 0 2026-03-09T16:14:37.387 INFO:tasks.workunit.client.0.vm03.stdout:0/562: rename d0/d7/d3e/d57/d5a/d74 to d0/d7/d3e/d57/d5a/d82/d89/dbd 0 2026-03-09T16:14:37.429 INFO:tasks.workunit.client.0.vm03.stdout:7/504: mknod d4/da/d18/d22/d24/d16/d3e/d77/ca5 0 2026-03-09T16:14:37.444 INFO:tasks.workunit.client.0.vm03.stdout:1/454: read d4/db/f21 [210004,39550] 0 2026-03-09T16:14:37.446 INFO:tasks.workunit.client.0.vm03.stdout:1/455: fsync d4/d39/d7f/f88 0 2026-03-09T16:14:37.447 INFO:tasks.workunit.client.0.vm03.stdout:1/456: stat d4/d6/d1d/d20/d5f/f57 0 2026-03-09T16:14:37.452 INFO:tasks.workunit.client.0.vm03.stdout:2/558: symlink db/d12/d2a/d61/d6d/d8c/lc7 0 2026-03-09T16:14:37.453 INFO:tasks.workunit.client.0.vm03.stdout:9/591: link d2/d4/d11/d12/f9a d2/d4/d11/d29/d2a/d38/fb7 0 2026-03-09T16:14:37.453 INFO:tasks.workunit.client.0.vm03.stdout:2/559: chown db/f2d 332027 1 2026-03-09T16:14:37.456 INFO:tasks.workunit.client.0.vm03.stdout:7/505: symlink d4/da/d18/d22/d24/d16/d3e/d77/la6 0 2026-03-09T16:14:37.459 INFO:tasks.workunit.client.0.vm03.stdout:8/577: creat da/d10/d28/db1/fbb x:0 0 0 2026-03-09T16:14:37.472 INFO:tasks.workunit.client.0.vm03.stdout:3/530: truncate d5/d1e/f72 3337001 0 2026-03-09T16:14:37.476 INFO:tasks.workunit.client.0.vm03.stdout:3/531: write d5/d1e/d42/d34/f73 [2826631,68995] 0 2026-03-09T16:14:37.480 INFO:tasks.workunit.client.0.vm03.stdout:6/540: truncate d9/f5c 372988 0 2026-03-09T16:14:37.487 INFO:tasks.workunit.client.0.vm03.stdout:4/564: dwrite d5/db/f28 [0,4194304] 0 2026-03-09T16:14:37.488 INFO:tasks.workunit.client.0.vm03.stdout:4/565: write d5/db/d25/d31/d33/f92 [162669,112947] 0 2026-03-09T16:14:37.488 INFO:tasks.workunit.client.0.vm03.stdout:4/566: write d5/db/f34 [1486720,24389] 0 2026-03-09T16:14:37.490 INFO:tasks.workunit.client.0.vm03.stdout:5/619: truncate d2/d7/d8/f7a 1723614 0 2026-03-09T16:14:37.490 INFO:tasks.workunit.client.0.vm03.stdout:4/567: dread - d5/d17/f8a zero size 2026-03-09T16:14:37.490 INFO:tasks.workunit.client.0.vm03.stdout:5/620: fsync d2/d7/de/d11/d38/d52/fc6 0 2026-03-09T16:14:37.492 INFO:tasks.workunit.client.0.vm03.stdout:2/560: symlink db/d12/d2a/d61/d6d/lc8 0 2026-03-09T16:14:37.492 INFO:tasks.workunit.client.0.vm03.stdout:8/578: symlink da/d10/d63/lbc 0 2026-03-09T16:14:37.502 INFO:tasks.workunit.client.0.vm03.stdout:3/532: dread d5/d1e/d42/f74 [0,4194304] 0 2026-03-09T16:14:37.504 INFO:tasks.workunit.client.0.vm03.stdout:1/457: mkdir d4/d6/da2 0 2026-03-09T16:14:37.508 INFO:tasks.workunit.client.0.vm03.stdout:3/533: chown d5/d1e/d42/d55/f7e 3125 1 2026-03-09T16:14:37.510 INFO:tasks.workunit.client.0.vm03.stdout:1/458: chown d4/d6/d3b/d63/f89 994009481 1 2026-03-09T16:14:37.513 INFO:tasks.workunit.client.0.vm03.stdout:1/459: dread d4/db/f47 [0,4194304] 0 2026-03-09T16:14:37.521 INFO:tasks.workunit.client.0.vm03.stdout:3/534: dwrite d5/d6d/d5a/f78 [4194304,4194304] 0 2026-03-09T16:14:37.521 INFO:tasks.workunit.client.0.vm03.stdout:3/535: stat d5/lf 0 2026-03-09T16:14:37.522 INFO:tasks.workunit.client.0.vm03.stdout:3/536: readlink d5/d1e/d42/d34/l47 0 2026-03-09T16:14:37.533 INFO:tasks.workunit.client.0.vm03.stdout:9/592: dread d2/df/f42 [0,4194304] 0 
2026-03-09T16:14:37.535 INFO:tasks.workunit.client.0.vm03.stdout:3/537: dwrite d5/d6d/d5a/f7c [0,4194304] 0 2026-03-09T16:14:37.551 INFO:tasks.workunit.client.0.vm03.stdout:4/568: mknod d5/db/d25/d8b/da8/d81/caf 0 2026-03-09T16:14:37.553 INFO:tasks.workunit.client.0.vm03.stdout:0/563: link d0/da/d1b/f46 d0/da/fbe 0 2026-03-09T16:14:37.554 INFO:tasks.workunit.client.0.vm03.stdout:8/579: write da/d6c/d7a/f7f [830805,62953] 0 2026-03-09T16:14:37.566 INFO:tasks.workunit.client.0.vm03.stdout:3/538: creat d5/d53/d6c/d79/f9d x:0 0 0 2026-03-09T16:14:37.570 INFO:tasks.workunit.client.0.vm03.stdout:2/561: mkdir db/d12/da5/dc2/dc9 0 2026-03-09T16:14:37.579 INFO:tasks.workunit.client.0.vm03.stdout:8/580: sync 2026-03-09T16:14:37.579 INFO:tasks.workunit.client.0.vm03.stdout:3/539: dread d5/d1e/d42/d55/f57 [0,4194304] 0 2026-03-09T16:14:37.581 INFO:tasks.workunit.client.0.vm03.stdout:3/540: stat d5/d1e/l38 0 2026-03-09T16:14:37.591 INFO:tasks.workunit.client.0.vm03.stdout:4/569: dread d5/db/d25/d8b/da8/f62 [8388608,4194304] 0 2026-03-09T16:14:37.591 INFO:tasks.workunit.client.0.vm03.stdout:4/570: dread - d5/db/d25/d31/d33/d79/fa6 zero size 2026-03-09T16:14:37.595 INFO:tasks.workunit.client.0.vm03.stdout:1/460: symlink d4/db/d8b/la3 0 2026-03-09T16:14:37.595 INFO:tasks.workunit.client.0.vm03.stdout:9/593: truncate d2/d4/d1f/f23 4243891 0 2026-03-09T16:14:37.596 INFO:tasks.workunit.client.0.vm03.stdout:2/562: mkdir db/d12/d2a/d61/dca 0 2026-03-09T16:14:37.598 INFO:tasks.workunit.client.0.vm03.stdout:9/594: truncate d2/d54/d7d/d8f/dad/fae 786 0 2026-03-09T16:14:37.598 INFO:tasks.workunit.client.0.vm03.stdout:1/461: dread - d4/d31/d5c/f9e zero size 2026-03-09T16:14:37.599 INFO:tasks.workunit.client.0.vm03.stdout:1/462: read d4/db/f47 [11877,11679] 0 2026-03-09T16:14:37.599 INFO:tasks.workunit.client.0.vm03.stdout:1/463: chown d4/d39/d70/f97 5 1 2026-03-09T16:14:37.600 INFO:tasks.workunit.client.0.vm03.stdout:6/541: link d9/d22/f3f d9/d8e/f9f 0 2026-03-09T16:14:37.602 INFO:tasks.workunit.client.0.vm03.stdout:3/541: dwrite d5/d1e/f9b [0,4194304] 0 2026-03-09T16:14:37.613 INFO:tasks.workunit.client.0.vm03.stdout:7/506: truncate d4/da/d18/d22/d24/d15/f34 2671156 0 2026-03-09T16:14:37.619 INFO:tasks.workunit.client.0.vm03.stdout:1/464: dread d4/d6/f9 [0,4194304] 0 2026-03-09T16:14:37.649 INFO:tasks.workunit.client.0.vm03.stdout:0/564: mknod d0/d7/cbf 0 2026-03-09T16:14:37.649 INFO:tasks.workunit.client.0.vm03.stdout:8/581: write da/d10/d63/f73 [1488859,5355] 0 2026-03-09T16:14:37.651 INFO:tasks.workunit.client.0.vm03.stdout:0/565: stat d0/d7/d75/cad 0 2026-03-09T16:14:37.657 INFO:tasks.workunit.client.0.vm03.stdout:0/566: fdatasync d0/f60 0 2026-03-09T16:14:37.664 INFO:tasks.workunit.client.0.vm03.stdout:2/563: creat db/d12/d2a/d61/d79/d83/d64/dbd/da0/fcb x:0 0 0 2026-03-09T16:14:37.665 INFO:tasks.workunit.client.0.vm03.stdout:8/582: dwrite da/d1d/f99 [0,4194304] 0 2026-03-09T16:14:37.670 INFO:tasks.workunit.client.0.vm03.stdout:9/595: rmdir d2/d54 39 2026-03-09T16:14:37.671 INFO:tasks.workunit.client.0.vm03.stdout:5/621: dwrite d2/d7/de/d11/d19/f8e [0,4194304] 0 2026-03-09T16:14:37.672 INFO:tasks.workunit.client.0.vm03.stdout:3/542: mknod d5/d6d/d5a/d63/c9e 0 2026-03-09T16:14:37.672 INFO:tasks.workunit.client.0.vm03.stdout:6/542: rmdir d9/d42/d45/d50/d80/d90 39 2026-03-09T16:14:37.674 INFO:tasks.workunit.client.0.vm03.stdout:5/622: write d2/d7/de/d11/f80 [2205722,63465] 0 2026-03-09T16:14:37.677 INFO:tasks.workunit.client.0.vm03.stdout:9/596: write d2/de/f85 [1001043,23715] 0 2026-03-09T16:14:37.687 
INFO:tasks.workunit.client.0.vm03.stdout:7/507: symlink d4/da/d18/d22/d24/d15/la7 0 2026-03-09T16:14:37.688 INFO:tasks.workunit.client.0.vm03.stdout:1/465: truncate d4/d6/f9 5207215 0 2026-03-09T16:14:37.693 INFO:tasks.workunit.client.0.vm03.stdout:1/466: truncate d4/db/f7d 1039781 0 2026-03-09T16:14:37.693 INFO:tasks.workunit.client.0.vm03.stdout:1/467: write d4/d6/d1d/d20/f72 [2288000,5279] 0 2026-03-09T16:14:37.696 INFO:tasks.workunit.client.0.vm03.stdout:4/571: link d5/f7 d5/db/d25/d31/d33/d79/fb0 0 2026-03-09T16:14:37.697 INFO:tasks.workunit.client.0.vm03.stdout:4/572: read - d5/db/d25/d8b/fa3 zero size 2026-03-09T16:14:37.715 INFO:tasks.workunit.client.0.vm03.stdout:0/567: mkdir d0/d7/d3e/d57/d5a/d82/d89/dc0 0 2026-03-09T16:14:37.737 INFO:tasks.workunit.client.0.vm03.stdout:9/597: mknod d2/d4/d11/d29/d2a/d46/cb8 0 2026-03-09T16:14:37.740 INFO:tasks.workunit.client.0.vm03.stdout:9/598: dread d2/de/f85 [0,4194304] 0 2026-03-09T16:14:37.745 INFO:tasks.workunit.client.0.vm03.stdout:9/599: fsync d2/df/d84/d8a/fb5 0 2026-03-09T16:14:37.747 INFO:tasks.workunit.client.0.vm03.stdout:4/573: rmdir d5/db/d25/d31/d33 39 2026-03-09T16:14:37.748 INFO:tasks.workunit.client.0.vm03.stdout:4/574: fdatasync d5/d17/f9c 0 2026-03-09T16:14:37.749 INFO:tasks.workunit.client.0.vm03.stdout:2/564: fsync db/f23 0 2026-03-09T16:14:37.750 INFO:tasks.workunit.client.0.vm03.stdout:4/575: stat d5/d17/c53 0 2026-03-09T16:14:37.752 INFO:tasks.workunit.client.0.vm03.stdout:0/568: read d0/da/fbe [1223037,113742] 0 2026-03-09T16:14:37.755 INFO:tasks.workunit.client.0.vm03.stdout:4/576: write d5/db/d25/d8b/da8/f7f [719695,34807] 0 2026-03-09T16:14:37.758 INFO:tasks.workunit.client.0.vm03.stdout:4/577: write d5/d17/d44/f90 [294851,65259] 0 2026-03-09T16:14:37.758 INFO:tasks.workunit.client.0.vm03.stdout:9/600: dread - d2/d54/d7d/faf zero size 2026-03-09T16:14:37.759 INFO:tasks.workunit.client.0.vm03.stdout:9/601: readlink d2/de/d88/l78 0 2026-03-09T16:14:37.762 INFO:tasks.workunit.client.0.vm03.stdout:1/468: dread d4/d6/d1d/d3d/f45 [0,4194304] 0 2026-03-09T16:14:37.765 INFO:tasks.workunit.client.0.vm03.stdout:3/543: rmdir d5/d6d/d6a/d85 0 2026-03-09T16:14:37.766 INFO:tasks.workunit.client.0.vm03.stdout:3/544: readlink d5/d2e/l36 0 2026-03-09T16:14:37.774 INFO:tasks.workunit.client.0.vm03.stdout:1/469: readlink d4/db/l55 0 2026-03-09T16:14:37.774 INFO:tasks.workunit.client.0.vm03.stdout:0/569: creat d0/d7/d3e/d57/d5a/fc1 x:0 0 0 2026-03-09T16:14:37.774 INFO:tasks.workunit.client.0.vm03.stdout:0/570: chown d0/d7/d75/f69 65653 1 2026-03-09T16:14:37.774 INFO:tasks.workunit.client.0.vm03.stdout:6/543: creat d9/d42/fa0 x:0 0 0 2026-03-09T16:14:37.774 INFO:tasks.workunit.client.0.vm03.stdout:2/565: dwrite db/d12/d2a/d61/d79/f9a [0,4194304] 0 2026-03-09T16:14:37.774 INFO:tasks.workunit.client.0.vm03.stdout:1/470: write d4/d39/d7f/f88 [1018097,69367] 0 2026-03-09T16:14:37.778 INFO:tasks.workunit.client.0.vm03.stdout:8/583: getdents da/d10/d28/d4f/d85 0 2026-03-09T16:14:37.779 INFO:tasks.workunit.client.0.vm03.stdout:9/602: creat d2/d4/d1f/d83/fb9 x:0 0 0 2026-03-09T16:14:37.780 INFO:tasks.workunit.client.0.vm03.stdout:9/603: fdatasync d2/de/f87 0 2026-03-09T16:14:37.781 INFO:tasks.workunit.client.0.vm03.stdout:0/571: dwrite d0/d7/d3e/d57/f90 [0,4194304] 0 2026-03-09T16:14:37.784 INFO:tasks.workunit.client.0.vm03.stdout:9/604: truncate d2/d4/d11/d29/f95 612245 0 2026-03-09T16:14:37.785 INFO:tasks.workunit.client.0.vm03.stdout:4/578: mknod d5/db/d25/d31/d4d/d5b/d7d/cb1 0 2026-03-09T16:14:37.785 
INFO:tasks.workunit.client.0.vm03.stdout:3/545: creat d5/d53/d6c/f9f x:0 0 0 2026-03-09T16:14:37.789 INFO:tasks.workunit.client.0.vm03.stdout:5/623: rename d2/d7/de/d11/d19/dbb/cd9 to d2/d7/de/d11/cdc 0 2026-03-09T16:14:37.792 INFO:tasks.workunit.client.0.vm03.stdout:5/624: stat d2/fd4 0 2026-03-09T16:14:37.793 INFO:tasks.workunit.client.0.vm03.stdout:9/605: dread d2/d54/d7d/d8f/dad/fae [0,4194304] 0 2026-03-09T16:14:37.796 INFO:tasks.workunit.client.0.vm03.stdout:9/606: write d2/d4/d11/f66 [3040564,32411] 0 2026-03-09T16:14:37.799 INFO:tasks.workunit.client.0.vm03.stdout:2/566: dwrite db/d12/d2a/d61/f9b [0,4194304] 0 2026-03-09T16:14:37.805 INFO:tasks.workunit.client.0.vm03.stdout:8/584: mkdir da/d32/d79/d95/dbd 0 2026-03-09T16:14:37.805 INFO:tasks.workunit.client.0.vm03.stdout:1/471: symlink d4/d6/d1d/d20/d23/d3e/la4 0 2026-03-09T16:14:37.805 INFO:tasks.workunit.client.0.vm03.stdout:1/472: stat d4/db/l4b 0 2026-03-09T16:14:37.822 INFO:tasks.workunit.client.0.vm03.stdout:5/625: dread d2/d7/d3c/d3d/f93 [0,4194304] 0 2026-03-09T16:14:37.832 INFO:tasks.workunit.client.0.vm03.stdout:5/626: dwrite d2/d7/de/d11/d19/d31/f42 [4194304,4194304] 0 2026-03-09T16:14:37.846 INFO:tasks.workunit.client.0.vm03.stdout:7/508: rename d4/dc to d4/da/d18/d22/d24/d16/d3e/d77/da8 0 2026-03-09T16:14:37.858 INFO:tasks.workunit.client.0.vm03.stdout:7/509: dwrite d4/d2d/f90 [0,4194304] 0 2026-03-09T16:14:37.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:37 vm03.local ceph-mon[51019]: pgmap v11: 65 pgs: 65 active+clean; 951 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 17 MiB/s rd, 72 MiB/s wr, 189 op/s 2026-03-09T16:14:37.885 INFO:tasks.workunit.client.0.vm03.stdout:0/572: mknod d0/d7/cc2 0 2026-03-09T16:14:37.885 INFO:tasks.workunit.client.0.vm03.stdout:6/544: creat d9/d42/d45/d50/d80/fa1 x:0 0 0 2026-03-09T16:14:37.895 INFO:tasks.workunit.client.0.vm03.stdout:6/545: read f7 [180347,91248] 0 2026-03-09T16:14:37.895 INFO:tasks.workunit.client.0.vm03.stdout:3/546: rename d5/d6d/f67 to d5/d1e/d42/d34/d70/fa0 0 2026-03-09T16:14:37.895 INFO:tasks.workunit.client.0.vm03.stdout:3/547: write d5/d53/d6c/d79/f9d [283632,127740] 0 2026-03-09T16:14:37.895 INFO:tasks.workunit.client.0.vm03.stdout:9/607: getdents d2/d4/d11/d29/d2a/d38/db6 0 2026-03-09T16:14:37.896 INFO:tasks.workunit.client.0.vm03.stdout:2/567: symlink db/d12/da5/dbb/dc3/lcc 0 2026-03-09T16:14:37.900 INFO:tasks.workunit.client.0.vm03.stdout:0/573: dwrite d0/d7/d3e/d57/d5a/d82/fa0 [0,4194304] 0 2026-03-09T16:14:37.900 INFO:tasks.workunit.client.0.vm03.stdout:2/568: dread - db/d12/d2a/d61/d79/d83/d64/f66 zero size 2026-03-09T16:14:37.912 INFO:tasks.workunit.client.0.vm03.stdout:2/569: write db/f14 [3165668,124645] 0 2026-03-09T16:14:37.913 INFO:tasks.workunit.client.0.vm03.stdout:1/473: mkdir d4/d6/d3b/d6b/da5 0 2026-03-09T16:14:37.914 INFO:tasks.workunit.client.0.vm03.stdout:1/474: fsync d4/db/f60 0 2026-03-09T16:14:37.915 INFO:tasks.workunit.client.0.vm03.stdout:6/546: dwrite d9/d42/d45/d50/d80/fa1 [0,4194304] 0 2026-03-09T16:14:37.926 INFO:tasks.workunit.client.0.vm03.stdout:4/579: truncate d5/db/d25/d31/d33/d79/f4f 2803094 0 2026-03-09T16:14:37.927 INFO:tasks.workunit.client.0.vm03.stdout:4/580: chown d5/d17/f39 9987 1 2026-03-09T16:14:37.930 INFO:tasks.workunit.client.0.vm03.stdout:0/574: dread d0/d7/d48/f18 [0,4194304] 0 2026-03-09T16:14:37.935 INFO:tasks.workunit.client.0.vm03.stdout:5/627: mknod d2/d7/de/d11/d19/d29/d90/db6/cdd 0 2026-03-09T16:14:37.938 INFO:tasks.workunit.client.0.vm03.stdout:1/475: dread d4/d6/d3b/d63/f77 [0,4194304] 0 
2026-03-09T16:14:37.938 INFO:tasks.workunit.client.0.vm03.stdout:1/476: rename d4 to d4/d6/d3b/da6 22 2026-03-09T16:14:37.948 INFO:tasks.workunit.client.0.vm03.stdout:3/548: mkdir d5/d1e/d42/d4c/da1 0 2026-03-09T16:14:37.951 INFO:tasks.workunit.client.0.vm03.stdout:8/585: dwrite da/d45/faa [0,4194304] 0 2026-03-09T16:14:37.953 INFO:tasks.workunit.client.0.vm03.stdout:4/581: dwrite d5/f74 [0,4194304] 0 2026-03-09T16:14:37.961 INFO:tasks.workunit.client.0.vm03.stdout:4/582: write d5/d17/f8d [600246,52853] 0 2026-03-09T16:14:37.968 INFO:tasks.workunit.client.0.vm03.stdout:7/510: dread d4/d2d/f52 [0,4194304] 0 2026-03-09T16:14:37.992 INFO:tasks.workunit.client.0.vm03.stdout:0/575: symlink d0/d7/d3e/d57/d5a/d5f/db2/d8e/lc3 0 2026-03-09T16:14:37.992 INFO:tasks.workunit.client.0.vm03.stdout:5/628: creat d2/d7/d1a/fde x:0 0 0 2026-03-09T16:14:37.992 INFO:tasks.workunit.client.0.vm03.stdout:5/629: readlink d2/d75/l8a 0 2026-03-09T16:14:37.993 INFO:tasks.workunit.client.0.vm03.stdout:4/583: creat d5/db/d25/d31/d4d/fb2 x:0 0 0 2026-03-09T16:14:37.994 INFO:tasks.workunit.client.0.vm03.stdout:2/570: creat db/d12/d2a/d99/fcd x:0 0 0 2026-03-09T16:14:37.995 INFO:tasks.workunit.client.0.vm03.stdout:1/477: symlink d4/d6/d3b/d6b/da5/la7 0 2026-03-09T16:14:37.995 INFO:tasks.workunit.client.0.vm03.stdout:0/576: creat d0/d7/d3e/d57/d5a/d5f/db2/dab/fc4 x:0 0 0 2026-03-09T16:14:37.996 INFO:tasks.workunit.client.0.vm03.stdout:1/478: chown d4/d6/d3b/d63/f82 35442 1 2026-03-09T16:14:37.998 INFO:tasks.workunit.client.0.vm03.stdout:6/547: dwrite d9/d42/d45/d50/d80/d90/d66/f81 [0,4194304] 0 2026-03-09T16:14:38.003 INFO:tasks.workunit.client.0.vm03.stdout:3/549: sync 2026-03-09T16:14:38.010 INFO:tasks.workunit.client.0.vm03.stdout:6/548: write d9/d42/d45/d65/f7f [49073,115939] 0 2026-03-09T16:14:38.013 INFO:tasks.workunit.client.0.vm03.stdout:3/550: dwrite d5/d6d/d6a/f8e [0,4194304] 0 2026-03-09T16:14:38.018 INFO:tasks.workunit.client.0.vm03.stdout:5/630: creat d2/fdf x:0 0 0 2026-03-09T16:14:38.025 INFO:tasks.workunit.client.0.vm03.stdout:5/631: fdatasync d2/d7/de/d11/d19/d31/fcb 0 2026-03-09T16:14:38.025 INFO:tasks.workunit.client.0.vm03.stdout:3/551: dwrite d5/d6d/d5a/f78 [0,4194304] 0 2026-03-09T16:14:38.025 INFO:tasks.workunit.client.0.vm03.stdout:5/632: dread - d2/d7/de/d11/d19/d31/fcb zero size 2026-03-09T16:14:38.025 INFO:tasks.workunit.client.0.vm03.stdout:5/633: readlink d2/d7/d1a/d1c/l53 0 2026-03-09T16:14:38.027 INFO:tasks.workunit.client.0.vm03.stdout:5/634: fdatasync d2/d7/de/d11/dbf/fc5 0 2026-03-09T16:14:38.036 INFO:tasks.workunit.client.0.vm03.stdout:9/608: link d2/d4/d11/d12/f35 d2/d54/fba 0 2026-03-09T16:14:38.037 INFO:tasks.workunit.client.0.vm03.stdout:7/511: mkdir d4/da/d18/d22/d24/d16/d3e/d77/da8/da9 0 2026-03-09T16:14:38.037 INFO:tasks.workunit.client.0.vm03.stdout:5/635: dwrite d2/d7/de/d11/d38/d3b/fb3 [0,4194304] 0 2026-03-09T16:14:38.041 INFO:tasks.workunit.client.0.vm03.stdout:4/584: creat d5/db/fb3 x:0 0 0 2026-03-09T16:14:38.053 INFO:tasks.workunit.client.0.vm03.stdout:5/636: dread d2/d7/de/d11/d38/d3b/f68 [0,4194304] 0 2026-03-09T16:14:38.057 INFO:tasks.workunit.client.0.vm03.stdout:2/571: symlink db/d12/d2a/d61/d6d/d8c/d94/da4/lce 0 2026-03-09T16:14:38.064 INFO:tasks.workunit.client.0.vm03.stdout:0/577: rename d0/d7/d3e/d57/d5a/d82/cb1 to d0/d7/d3e/d57/d5a/d52/cc5 0 2026-03-09T16:14:38.066 INFO:tasks.workunit.client.0.vm03.stdout:5/637: dwrite d2/d7/de/d11/f32 [4194304,4194304] 0 2026-03-09T16:14:38.070 INFO:tasks.workunit.client.0.vm03.stdout:6/549: symlink d9/d84/la2 0 
2026-03-09T16:14:38.077 INFO:tasks.workunit.client.0.vm03.stdout:8/586: link da/d32/c72 da/db/d30/cbe 0 2026-03-09T16:14:38.077 INFO:tasks.workunit.client.0.vm03.stdout:8/587: readlink da/d10/l2b 0 2026-03-09T16:14:38.084 INFO:tasks.workunit.client.0.vm03.stdout:3/552: unlink d5/d6d/c90 0 2026-03-09T16:14:38.086 INFO:tasks.workunit.client.0.vm03.stdout:9/609: fsync d2/d4/d11/d12/f9a 0 2026-03-09T16:14:38.097 INFO:tasks.workunit.client.0.vm03.stdout:5/638: unlink d2/c21 0 2026-03-09T16:14:38.097 INFO:tasks.workunit.client.0.vm03.stdout:1/479: rename d4/d6/d1d/d20/d23/d3e to d4/d31/d5c/da8 0 2026-03-09T16:14:38.099 INFO:tasks.workunit.client.0.vm03.stdout:6/550: creat d9/d22/fa3 x:0 0 0 2026-03-09T16:14:38.103 INFO:tasks.workunit.client.0.vm03.stdout:7/512: symlink d4/da/d18/d22/d24/laa 0 2026-03-09T16:14:38.105 INFO:tasks.workunit.client.0.vm03.stdout:8/588: creat da/d1d/d3b/fbf x:0 0 0 2026-03-09T16:14:38.108 INFO:tasks.workunit.client.0.vm03.stdout:1/480: dwrite d4/d6/d3b/f36 [0,4194304] 0 2026-03-09T16:14:38.110 INFO:tasks.workunit.client.0.vm03.stdout:1/481: write d4/d6/d1d/d20/f2a [4980769,30909] 0 2026-03-09T16:14:38.126 INFO:tasks.workunit.client.0.vm03.stdout:9/610: truncate d2/d4/d1f/f25 8509148 0 2026-03-09T16:14:38.126 INFO:tasks.workunit.client.0.vm03.stdout:2/572: write db/d12/f37 [4154052,60733] 0 2026-03-09T16:14:38.130 INFO:tasks.workunit.client.0.vm03.stdout:0/578: symlink d0/d7/d3e/d57/d5a/d82/d89/dc0/lc6 0 2026-03-09T16:14:38.133 INFO:tasks.workunit.client.0.vm03.stdout:7/513: write d4/d2d/f8c [710861,82010] 0 2026-03-09T16:14:38.135 INFO:tasks.workunit.client.0.vm03.stdout:8/589: sync 2026-03-09T16:14:38.136 INFO:tasks.workunit.client.0.vm03.stdout:1/482: sync 2026-03-09T16:14:38.139 INFO:tasks.workunit.client.0.vm03.stdout:1/483: fsync d4/d6/d3b/d6b/d25/f84 0 2026-03-09T16:14:38.139 INFO:tasks.workunit.client.0.vm03.stdout:8/590: sync 2026-03-09T16:14:38.149 INFO:tasks.workunit.client.0.vm03.stdout:9/611: creat d2/d54/d7d/d8f/fbb x:0 0 0 2026-03-09T16:14:38.151 INFO:tasks.workunit.client.0.vm03.stdout:5/639: truncate d2/d7/d8/f86 228809 0 2026-03-09T16:14:38.154 INFO:tasks.workunit.client.0.vm03.stdout:4/585: getdents d5/db/d25/d31/d4d/d5b/d72 0 2026-03-09T16:14:38.156 INFO:tasks.workunit.client.0.vm03.stdout:7/514: mknod d4/da/d18/d22/d24/d16/d6e/d7e/cab 0 2026-03-09T16:14:38.156 INFO:tasks.workunit.client.0.vm03.stdout:4/586: write d5/db/f34 [1055905,113920] 0 2026-03-09T16:14:38.160 INFO:tasks.workunit.client.0.vm03.stdout:7/515: truncate d4/da/d45/fa4 754343 0 2026-03-09T16:14:38.160 INFO:tasks.workunit.client.0.vm03.stdout:4/587: chown d5/db/d25/d31/d4d/f85 15366 1 2026-03-09T16:14:38.170 INFO:tasks.workunit.client.0.vm03.stdout:8/591: rmdir da/d6c 39 2026-03-09T16:14:38.171 INFO:tasks.workunit.client.0.vm03.stdout:3/553: link d5/d53/d6c/l68 d5/d1e/d42/la2 0 2026-03-09T16:14:38.174 INFO:tasks.workunit.client.0.vm03.stdout:5/640: symlink d2/le0 0 2026-03-09T16:14:38.178 INFO:tasks.workunit.client.0.vm03.stdout:3/554: sync 2026-03-09T16:14:38.184 INFO:tasks.workunit.client.0.vm03.stdout:1/484: dwrite d4/d6/d1d/d3d/f45 [0,4194304] 0 2026-03-09T16:14:38.184 INFO:tasks.workunit.client.0.vm03.stdout:6/551: link d9/d84/la2 d9/d42/d45/d50/d80/d90/d66/la4 0 2026-03-09T16:14:38.190 INFO:tasks.workunit.client.0.vm03.stdout:6/552: chown d9/d14/c2e 447433 1 2026-03-09T16:14:38.195 INFO:tasks.workunit.client.0.vm03.stdout:8/592: mknod da/d10/d28/d64/cc0 0 2026-03-09T16:14:38.203 INFO:tasks.workunit.client.0.vm03.stdout:9/612: dwrite d2/d54/fba [0,4194304] 0 
2026-03-09T16:14:38.203 INFO:tasks.workunit.client.0.vm03.stdout:6/553: dwrite d9/d42/f74 [0,4194304] 0 2026-03-09T16:14:38.203 INFO:tasks.workunit.client.0.vm03.stdout:9/613: write d2/d4/d1f/d83/fb9 [803019,110954] 0 2026-03-09T16:14:38.228 INFO:tasks.workunit.client.0.vm03.stdout:0/579: truncate d0/d7/d75/f69 1157722 0 2026-03-09T16:14:38.228 INFO:tasks.workunit.client.0.vm03.stdout:7/516: write d4/da/d45/d51/f5b [70307,27528] 0 2026-03-09T16:14:38.232 INFO:tasks.workunit.client.0.vm03.stdout:5/641: dread d2/d7/d1a/d1c/f5e [0,4194304] 0 2026-03-09T16:14:38.243 INFO:tasks.workunit.client.0.vm03.stdout:2/573: truncate db/d12/d2a/f58 2202284 0 2026-03-09T16:14:38.250 INFO:tasks.workunit.client.0.vm03.stdout:5/642: mknod d2/d7/d8/d24/d27/d43/d4b/dbc/ce1 0 2026-03-09T16:14:38.251 INFO:tasks.workunit.client.0.vm03.stdout:6/554: mkdir d9/d14/da5 0 2026-03-09T16:14:38.253 INFO:tasks.workunit.client.0.vm03.stdout:2/574: symlink db/d12/d2a/d61/d6d/d8c/lcf 0 2026-03-09T16:14:38.255 INFO:tasks.workunit.client.0.vm03.stdout:1/485: dwrite d4/f1b [0,4194304] 0 2026-03-09T16:14:38.257 INFO:tasks.workunit.client.0.vm03.stdout:7/517: creat d4/da/d18/d22/d24/d16/d3e/d77/da8/d9d/fac x:0 0 0 2026-03-09T16:14:38.257 INFO:tasks.workunit.client.0.vm03.stdout:4/588: getdents d5/d56 0 2026-03-09T16:14:38.264 INFO:tasks.workunit.client.0.vm03.stdout:5/643: dwrite d2/d7/d8/d16/d5c/f94 [0,4194304] 0 2026-03-09T16:14:38.264 INFO:tasks.workunit.client.0.vm03.stdout:5/644: chown d2/d7/d8/d24/d27/d43/d4b/dbc/lbd 9247039 1 2026-03-09T16:14:38.268 INFO:tasks.workunit.client.0.vm03.stdout:6/555: dread d9/d42/d45/d65/f7f [0,4194304] 0 2026-03-09T16:14:38.269 INFO:tasks.workunit.client.0.vm03.stdout:6/556: read d9/d22/f27 [2004241,31614] 0 2026-03-09T16:14:38.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:37 vm05.local ceph-mon[58702]: pgmap v11: 65 pgs: 65 active+clean; 951 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 17 MiB/s rd, 72 MiB/s wr, 189 op/s 2026-03-09T16:14:38.279 INFO:tasks.workunit.client.0.vm03.stdout:6/557: dwrite d9/d42/d45/d65/f7f [0,4194304] 0 2026-03-09T16:14:38.283 INFO:tasks.workunit.client.0.vm03.stdout:1/486: dwrite d4/d6/d1d/d3d/f49 [0,4194304] 0 2026-03-09T16:14:38.293 INFO:tasks.workunit.client.0.vm03.stdout:0/580: link d0/f54 d0/d7/d3e/d57/d5a/d82/d89/dbd/fc7 0 2026-03-09T16:14:38.305 INFO:tasks.workunit.client.0.vm03.stdout:1/487: dwrite d4/d31/f4f [0,4194304] 0 2026-03-09T16:14:38.317 INFO:tasks.workunit.client.0.vm03.stdout:8/593: creat da/d10/d28/d4f/d68/fc1 x:0 0 0 2026-03-09T16:14:38.319 INFO:tasks.workunit.client.0.vm03.stdout:8/594: read - da/d1d/d3b/fbf zero size 2026-03-09T16:14:38.321 INFO:tasks.workunit.client.0.vm03.stdout:3/555: truncate d5/d6d/d5a/d63/f8a 3263237 0 2026-03-09T16:14:38.325 INFO:tasks.workunit.client.0.vm03.stdout:3/556: dwrite d5/d1e/f9b [0,4194304] 0 2026-03-09T16:14:38.328 INFO:tasks.workunit.client.0.vm03.stdout:7/518: mknod d4/da/d18/d22/cad 0 2026-03-09T16:14:38.339 INFO:tasks.workunit.client.0.vm03.stdout:9/614: dwrite d2/df/f42 [4194304,4194304] 0 2026-03-09T16:14:38.367 INFO:tasks.workunit.client.0.vm03.stdout:0/581: mkdir d0/da/d1b/dc8 0 2026-03-09T16:14:38.374 INFO:tasks.workunit.client.0.vm03.stdout:8/595: dread da/f35 [0,4194304] 0 2026-03-09T16:14:38.376 INFO:tasks.workunit.client.0.vm03.stdout:8/596: dread da/d10/d28/d4f/d85/fa1 [0,4194304] 0 2026-03-09T16:14:38.390 INFO:tasks.workunit.client.0.vm03.stdout:8/597: dread da/d10/d63/f73 [0,4194304] 0 2026-03-09T16:14:38.391 INFO:tasks.workunit.client.0.vm03.stdout:8/598: readlink 
da/d32/d79/l9f 0 2026-03-09T16:14:38.395 INFO:tasks.workunit.client.0.vm03.stdout:7/519: symlink d4/da/d18/d22/lae 0 2026-03-09T16:14:38.401 INFO:tasks.workunit.client.0.vm03.stdout:5/645: symlink d2/d7/d8/d24/d27/le2 0 2026-03-09T16:14:38.402 INFO:tasks.workunit.client.0.vm03.stdout:5/646: chown d2/d7/de/d11/dbf 7545 1 2026-03-09T16:14:38.402 INFO:tasks.workunit.client.0.vm03.stdout:5/647: chown d2/d7/d3c/fdb 2563 1 2026-03-09T16:14:38.402 INFO:tasks.workunit.client.0.vm03.stdout:5/648: read - d2/d7/d8/d24/d27/d43/d4b/fd1 zero size 2026-03-09T16:14:38.404 INFO:tasks.workunit.client.0.vm03.stdout:5/649: write d2/d7/de/d11/f32 [7909636,123812] 0 2026-03-09T16:14:38.410 INFO:tasks.workunit.client.0.vm03.stdout:4/589: mknod d5/db/d25/d8b/da8/cb4 0 2026-03-09T16:14:38.411 INFO:tasks.workunit.client.0.vm03.stdout:9/615: symlink d2/df/d89/lbc 0 2026-03-09T16:14:38.413 INFO:tasks.workunit.client.0.vm03.stdout:5/650: dwrite d2/d7/d3c/fdb [0,4194304] 0 2026-03-09T16:14:38.441 INFO:tasks.workunit.client.0.vm03.stdout:4/590: sync 2026-03-09T16:14:38.527 INFO:tasks.workunit.client.0.vm03.stdout:1/488: creat d4/d6/d1d/d69/fa9 x:0 0 0 2026-03-09T16:14:38.541 INFO:tasks.workunit.client.0.vm03.stdout:1/489: dread d4/db/f60 [0,4194304] 0 2026-03-09T16:14:38.545 INFO:tasks.workunit.client.0.vm03.stdout:6/558: creat d9/d42/fa6 x:0 0 0 2026-03-09T16:14:38.546 INFO:tasks.workunit.client.0.vm03.stdout:4/591: mknod d5/db/cb5 0 2026-03-09T16:14:38.549 INFO:tasks.workunit.client.0.vm03.stdout:5/651: dwrite d2/d7/de/f48 [0,4194304] 0 2026-03-09T16:14:38.570 INFO:tasks.workunit.client.0.vm03.stdout:2/575: getdents db/d12/d2a/d61/d6d/d8c 0 2026-03-09T16:14:38.574 INFO:tasks.workunit.client.0.vm03.stdout:3/557: rename d5/d1e/d42/d8b/l94 to d5/d6d/la3 0 2026-03-09T16:14:38.584 INFO:tasks.workunit.client.0.vm03.stdout:8/599: getdents da/d32/d79/d95/dbd 0 2026-03-09T16:14:38.585 INFO:tasks.workunit.client.0.vm03.stdout:7/520: fsync d4/da/d18/d22/d24/d16/d3e/d77/da8/d61/f84 0 2026-03-09T16:14:38.586 INFO:tasks.workunit.client.0.vm03.stdout:8/600: fsync da/d10/d28/d4f/daf/fb7 0 2026-03-09T16:14:38.590 INFO:tasks.workunit.client.0.vm03.stdout:6/559: mknod d9/d8e/ca7 0 2026-03-09T16:14:38.597 INFO:tasks.workunit.client.0.vm03.stdout:2/576: rename db/d12/fb1 to db/d12/d2a/d61/d79/d83/d52/fd0 0 2026-03-09T16:14:38.620 INFO:tasks.workunit.client.0.vm03.stdout:1/490: mknod d4/d6/d3b/d8e/caa 0 2026-03-09T16:14:38.623 INFO:tasks.workunit.client.0.vm03.stdout:4/592: fsync d5/db/d25/d31/d33/d79/f4f 0 2026-03-09T16:14:38.632 INFO:tasks.workunit.client.0.vm03.stdout:6/560: rename d9/d42/d45/d50/d80/d90/d66 to d9/d42/d45/d50/d80/d8a/d9c/d97/da8 0 2026-03-09T16:14:38.634 INFO:tasks.workunit.client.0.vm03.stdout:0/582: write d0/da/fbe [876,11377] 0 2026-03-09T16:14:38.642 INFO:tasks.workunit.client.0.vm03.stdout:3/558: link d5/d6d/d6a/f8e d5/d1e/fa4 0 2026-03-09T16:14:38.643 INFO:tasks.workunit.client.0.vm03.stdout:3/559: fsync d5/d1e/d42/d55/f7e 0 2026-03-09T16:14:38.651 INFO:tasks.workunit.client.0.vm03.stdout:7/521: mknod d4/da/d18/d22/d24/d16/d3e/d77/da8/da9/caf 0 2026-03-09T16:14:38.665 INFO:tasks.workunit.client.0.vm03.stdout:0/583: mknod d0/d7/d3e/d57/d5a/d52/d9f/cc9 0 2026-03-09T16:14:38.667 INFO:tasks.workunit.client.0.vm03.stdout:9/616: write d2/d4/d1f/f25 [1726110,10194] 0 2026-03-09T16:14:38.676 INFO:tasks.workunit.client.0.vm03.stdout:5/652: dwrite d2/f5a [0,4194304] 0 2026-03-09T16:14:38.686 INFO:tasks.workunit.client.0.vm03.stdout:8/601: rmdir da/d32/d79/db9 0 2026-03-09T16:14:38.697 
INFO:tasks.workunit.client.0.vm03.stdout:1/491: symlink d4/d6/d3b/d6b/d25/lab 0 2026-03-09T16:14:38.698 INFO:tasks.workunit.client.0.vm03.stdout:1/492: truncate d4/d6/d3b/f95 863950 0 2026-03-09T16:14:38.699 INFO:tasks.workunit.client.0.vm03.stdout:1/493: write d4/d6/d3b/d63/f89 [861357,78535] 0 2026-03-09T16:14:38.711 INFO:tasks.workunit.client.0.vm03.stdout:0/584: rename d0/da/c23 to d0/d7/d3e/d57/d5a/d5f/db2/dab/cca 0 2026-03-09T16:14:38.713 INFO:tasks.workunit.client.0.vm03.stdout:2/577: write db/d12/d2a/d61/d79/d83/f7e [246547,86741] 0 2026-03-09T16:14:38.760 INFO:tasks.workunit.client.0.vm03.stdout:1/494: creat d4/d7b/fac x:0 0 0 2026-03-09T16:14:38.772 INFO:tasks.workunit.client.0.vm03.stdout:3/560: write d5/d44/f5d [4443015,14070] 0 2026-03-09T16:14:38.772 INFO:tasks.workunit.client.0.vm03.stdout:3/561: write d5/fb [8759505,18619] 0 2026-03-09T16:14:38.772 INFO:tasks.workunit.client.0.vm03.stdout:6/561: link d9/d42/d45/d50/f51 d9/d84/fa9 0 2026-03-09T16:14:38.782 INFO:tasks.workunit.client.0.vm03.stdout:2/578: chown db/d12/d2a/f88 1417400 1 2026-03-09T16:14:38.784 INFO:tasks.workunit.client.0.vm03.stdout:9/617: mknod d2/d4/d11/d29/d2a/d38/db6/cbd 0 2026-03-09T16:14:38.794 INFO:tasks.workunit.client.0.vm03.stdout:4/593: getdents d5/db/d25/d31/d4d/d5b/d9a 0 2026-03-09T16:14:38.796 INFO:tasks.workunit.client.0.vm03.stdout:3/562: creat d5/d1e/d42/d8b/fa5 x:0 0 0 2026-03-09T16:14:38.797 INFO:tasks.workunit.client.0.vm03.stdout:1/495: symlink d4/db/lad 0 2026-03-09T16:14:38.800 INFO:tasks.workunit.client.0.vm03.stdout:6/562: symlink d9/d14/da5/laa 0 2026-03-09T16:14:38.802 INFO:tasks.workunit.client.0.vm03.stdout:6/563: fsync d9/d42/f78 0 2026-03-09T16:14:38.802 INFO:tasks.workunit.client.0.vm03.stdout:6/564: chown d9/d42/d45/d50/d80 3371 1 2026-03-09T16:14:38.803 INFO:tasks.workunit.client.0.vm03.stdout:6/565: chown d9/d42/d45/d65/l7b 71 1 2026-03-09T16:14:38.805 INFO:tasks.workunit.client.0.vm03.stdout:6/566: write d9/d42/d45/d65/f7f [1379744,87885] 0 2026-03-09T16:14:38.806 INFO:tasks.workunit.client.0.vm03.stdout:6/567: truncate d9/d42/fa6 429502 0 2026-03-09T16:14:38.808 INFO:tasks.workunit.client.0.vm03.stdout:6/568: symlink d9/d42/lab 0 2026-03-09T16:14:38.808 INFO:tasks.workunit.client.0.vm03.stdout:6/569: fdatasync d9/d84/f91 0 2026-03-09T16:14:38.810 INFO:tasks.workunit.client.0.vm03.stdout:6/570: creat d9/d14/d71/fac x:0 0 0 2026-03-09T16:14:38.813 INFO:tasks.workunit.client.0.vm03.stdout:9/618: sync 2026-03-09T16:14:38.813 INFO:tasks.workunit.client.0.vm03.stdout:3/563: sync 2026-03-09T16:14:38.816 INFO:tasks.workunit.client.0.vm03.stdout:2/579: rename db/d12/d2a/c35 to db/d12/d2a/d61/d79/cd1 0 2026-03-09T16:14:38.816 INFO:tasks.workunit.client.0.vm03.stdout:2/580: chown db/f2d 3453 1 2026-03-09T16:14:38.827 INFO:tasks.workunit.client.0.vm03.stdout:3/564: unlink d5/f43 0 2026-03-09T16:14:38.833 INFO:tasks.workunit.client.0.vm03.stdout:2/581: creat db/d12/da5/fd2 x:0 0 0 2026-03-09T16:14:38.835 INFO:tasks.workunit.client.0.vm03.stdout:9/619: mkdir d2/d4/d11/d29/d2a/db3/dbe 0 2026-03-09T16:14:38.843 INFO:tasks.workunit.client.0.vm03.stdout:9/620: dread d2/df/d89/f7e [0,4194304] 0 2026-03-09T16:14:38.843 INFO:tasks.workunit.client.0.vm03.stdout:9/621: chown d2/f7 881996572 1 2026-03-09T16:14:38.847 INFO:tasks.workunit.client.0.vm03.stdout:3/565: mknod d5/d2e/d8c/ca6 0 2026-03-09T16:14:38.849 INFO:tasks.workunit.client.0.vm03.stdout:3/566: dread d5/d6d/d5a/f78 [0,4194304] 0 2026-03-09T16:14:38.850 INFO:tasks.workunit.client.0.vm03.stdout:5/653: write d2/d7/de/d11/d38/d52/f7d 
[1535260,117599] 0 2026-03-09T16:14:38.855 INFO:tasks.workunit.client.0.vm03.stdout:4/594: creat d5/db/d25/d31/d33/fb6 x:0 0 0 2026-03-09T16:14:38.856 INFO:tasks.workunit.client.0.vm03.stdout:7/522: write d4/da/d18/d22/f48 [506444,83255] 0 2026-03-09T16:14:38.859 INFO:tasks.workunit.client.0.vm03.stdout:5/654: dread d2/d7/de/faa [0,4194304] 0 2026-03-09T16:14:38.867 INFO:tasks.workunit.client.0.vm03.stdout:9/622: stat d2/d4/d1f/f23 0 2026-03-09T16:14:38.871 INFO:tasks.workunit.client.0.vm03.stdout:8/602: symlink da/d32/lc2 0 2026-03-09T16:14:38.880 INFO:tasks.workunit.client.0.vm03.stdout:8/603: dwrite da/d10/d28/d4f/daf/fb7 [4194304,4194304] 0 2026-03-09T16:14:38.884 INFO:tasks.workunit.client.0.vm03.stdout:4/595: mkdir d5/d17/db7 0 2026-03-09T16:14:38.888 INFO:tasks.workunit.client.0.vm03.stdout:0/585: dwrite d0/da/d5c/f39 [0,4194304] 0 2026-03-09T16:14:38.889 INFO:tasks.workunit.client.0.vm03.stdout:3/567: dread d5/d1e/f31 [0,4194304] 0 2026-03-09T16:14:38.889 INFO:tasks.workunit.client.0.vm03.stdout:0/586: read - d0/da/d5c/f66 zero size 2026-03-09T16:14:38.890 INFO:tasks.workunit.client.0.vm03.stdout:3/568: chown d5/d6d/d6a/f8e 133211 1 2026-03-09T16:14:38.891 INFO:tasks.workunit.client.0.vm03.stdout:3/569: fsync d5/d1e/d42/d4c/f7d 0 2026-03-09T16:14:38.895 INFO:tasks.workunit.client.0.vm03.stdout:7/523: rename d4/da/d18/d22/d24/d16/d3e/d77/da8 to d4/da/d5d/db0 0 2026-03-09T16:14:38.906 INFO:tasks.workunit.client.0.vm03.stdout:5/655: creat d2/d75/fe3 x:0 0 0 2026-03-09T16:14:38.906 INFO:tasks.workunit.client.0.vm03.stdout:1/496: creat d4/d6/d1d/fae x:0 0 0 2026-03-09T16:14:38.906 INFO:tasks.workunit.client.0.vm03.stdout:1/497: write d4/d6/d1d/d69/f76 [815457,63454] 0 2026-03-09T16:14:38.906 INFO:tasks.workunit.client.0.vm03.stdout:9/623: truncate d2/d4/d1f/f44 242447 0 2026-03-09T16:14:38.906 INFO:tasks.workunit.client.0.vm03.stdout:8/604: truncate da/d6c/d7a/f7f 1616660 0 2026-03-09T16:14:38.908 INFO:tasks.workunit.client.0.vm03.stdout:4/596: creat d5/db/d25/d31/d4d/fb8 x:0 0 0 2026-03-09T16:14:38.911 INFO:tasks.workunit.client.0.vm03.stdout:3/570: dread d5/d44/f54 [0,4194304] 0 2026-03-09T16:14:38.919 INFO:tasks.workunit.client.0.vm03.stdout:6/571: dwrite d9/d22/f1c [0,4194304] 0 2026-03-09T16:14:38.928 INFO:tasks.workunit.client.0.vm03.stdout:0/587: rename d0/da/d5c/c83 to d0/d7/d3e/d57/d5a/d82/d89/dbd/ccb 0 2026-03-09T16:14:38.931 INFO:tasks.workunit.client.0.vm03.stdout:0/588: dread d0/d7/d3e/d57/f90 [0,4194304] 0 2026-03-09T16:14:38.932 INFO:tasks.workunit.client.0.vm03.stdout:7/524: mkdir d4/da/d5d/db0/d9d/db1 0 2026-03-09T16:14:38.933 INFO:tasks.workunit.client.0.vm03.stdout:7/525: stat d4/da/d18/c60 0 2026-03-09T16:14:38.938 INFO:tasks.workunit.client.0.vm03.stdout:0/589: dread d0/da/fbe [0,4194304] 0 2026-03-09T16:14:38.938 INFO:tasks.workunit.client.0.vm03.stdout:0/590: chown d0/d7/d3e/d57/d5a/d82/d89/dbd/la6 204344 1 2026-03-09T16:14:38.939 INFO:tasks.workunit.client.0.vm03.stdout:0/591: chown d0/d7/d3e/d57/d5a/fc1 14917 1 2026-03-09T16:14:38.941 INFO:tasks.workunit.client.0.vm03.stdout:9/624: rmdir d2/d4 39 2026-03-09T16:14:38.942 INFO:tasks.workunit.client.0.vm03.stdout:8/605: symlink da/db/d43/lc3 0 2026-03-09T16:14:38.946 INFO:tasks.workunit.client.0.vm03.stdout:4/597: unlink d5/db/d25/f4e 0 2026-03-09T16:14:38.946 INFO:tasks.workunit.client.0.vm03.stdout:4/598: chown d5/db/d25/d31/d4d/d5b/d72/d77 7 1 2026-03-09T16:14:38.952 INFO:tasks.workunit.client.0.vm03.stdout:0/592: creat d0/d7/d3e/d57/d5a/d5f/db2/d8e/fcc x:0 0 0 2026-03-09T16:14:38.960 
INFO:tasks.workunit.client.0.vm03.stdout:8/606: mkdir da/d6c/dc4 0 2026-03-09T16:14:38.960 INFO:tasks.workunit.client.0.vm03.stdout:8/607: write da/d10/d28/d4f/d68/f8f [2144487,50938] 0 2026-03-09T16:14:38.960 INFO:tasks.workunit.client.0.vm03.stdout:8/608: readlink da/d32/d79/l87 0 2026-03-09T16:14:38.960 INFO:tasks.workunit.client.0.vm03.stdout:8/609: chown da/d32/d79/f84 83201299 1 2026-03-09T16:14:38.960 INFO:tasks.workunit.client.0.vm03.stdout:2/582: mknod db/d12/d2a/d99/cd3 0 2026-03-09T16:14:38.960 INFO:tasks.workunit.client.0.vm03.stdout:6/572: mkdir d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/dad 0 2026-03-09T16:14:38.965 INFO:tasks.workunit.client.0.vm03.stdout:7/526: fsync d4/f8 0 2026-03-09T16:14:38.972 INFO:tasks.workunit.client.0.vm03.stdout:8/610: creat da/db/da8/fc5 x:0 0 0 2026-03-09T16:14:38.973 INFO:tasks.workunit.client.0.vm03.stdout:8/611: dread - da/d10/d28/d4f/d68/fa9 zero size 2026-03-09T16:14:38.980 INFO:tasks.workunit.client.0.vm03.stdout:2/583: creat db/d12/d2a/d99/fd4 x:0 0 0 2026-03-09T16:14:38.984 INFO:tasks.workunit.client.0.vm03.stdout:8/612: stat da/d10/d28/d4f/d68/d80/c59 0 2026-03-09T16:14:38.985 INFO:tasks.workunit.client.0.vm03.stdout:4/599: creat d5/d17/da0/fb9 x:0 0 0 2026-03-09T16:14:38.995 INFO:tasks.workunit.client.0.vm03.stdout:1/498: truncate d4/d6/d1d/d20/f2a 1242783 0 2026-03-09T16:14:38.996 INFO:tasks.workunit.client.0.vm03.stdout:5/656: dwrite d2/d7/d8/d24/fb1 [0,4194304] 0 2026-03-09T16:14:38.997 INFO:tasks.workunit.client.0.vm03.stdout:1/499: write d4/d6/d3b/d63/f89 [934491,56235] 0 2026-03-09T16:14:38.999 INFO:tasks.workunit.client.0.vm03.stdout:2/584: sync 2026-03-09T16:14:39.007 INFO:tasks.workunit.client.0.vm03.stdout:3/571: dwrite d5/d1e/d42/f84 [0,4194304] 0 2026-03-09T16:14:39.007 INFO:tasks.workunit.client.0.vm03.stdout:8/613: dread da/d10/d28/d4f/d68/d80/f2f [0,4194304] 0 2026-03-09T16:14:39.012 INFO:tasks.workunit.client.0.vm03.stdout:6/573: write d9/d22/f24 [6442366,30608] 0 2026-03-09T16:14:39.015 INFO:tasks.workunit.client.0.vm03.stdout:0/593: symlink d0/d7/d3e/lcd 0 2026-03-09T16:14:39.019 INFO:tasks.workunit.client.0.vm03.stdout:6/574: write d9/d42/d45/d50/d80/d8a/d9c/f8c [1019525,111392] 0 2026-03-09T16:14:39.019 INFO:tasks.workunit.client.0.vm03.stdout:9/625: mkdir d2/d4/d11/d12/db2/dbf 0 2026-03-09T16:14:39.028 INFO:tasks.workunit.client.0.vm03.stdout:1/500: chown d4/d6/d1d/d20/d93/c54 35 1 2026-03-09T16:14:39.029 INFO:tasks.workunit.client.0.vm03.stdout:5/657: creat d2/d7/d8/d16/d5c/dcf/fe4 x:0 0 0 2026-03-09T16:14:39.033 INFO:tasks.workunit.client.0.vm03.stdout:3/572: rmdir d5/d1e 39 2026-03-09T16:14:39.045 INFO:tasks.workunit.client.0.vm03.stdout:9/626: creat d2/d4/d11/d29/fc0 x:0 0 0 2026-03-09T16:14:39.051 INFO:tasks.workunit.client.0.vm03.stdout:7/527: creat d4/da/d5d/db0/fb2 x:0 0 0 2026-03-09T16:14:39.054 INFO:tasks.workunit.client.0.vm03.stdout:3/573: sync 2026-03-09T16:14:39.055 INFO:tasks.workunit.client.0.vm03.stdout:1/501: creat d4/d31/d5c/faf x:0 0 0 2026-03-09T16:14:39.056 INFO:tasks.workunit.client.0.vm03.stdout:3/574: fdatasync d5/d53/d6c/f9f 0 2026-03-09T16:14:39.057 INFO:tasks.workunit.client.0.vm03.stdout:7/528: dread d4/da/d5d/db0/d61/f8b [0,4194304] 0 2026-03-09T16:14:39.064 INFO:tasks.workunit.client.0.vm03.stdout:6/575: dread d9/f35 [0,4194304] 0 2026-03-09T16:14:39.070 INFO:tasks.workunit.client.0.vm03.stdout:4/600: truncate d5/d17/f21 3618271 0 2026-03-09T16:14:39.071 INFO:tasks.workunit.client.0.vm03.stdout:4/601: fsync d5/d17/f2b 0 2026-03-09T16:14:39.077 
INFO:tasks.workunit.client.0.vm03.stdout:0/594: truncate d0/f4e 804902 0 2026-03-09T16:14:39.079 INFO:tasks.workunit.client.0.vm03.stdout:2/585: dwrite db/d12/f84 [0,4194304] 0 2026-03-09T16:14:39.081 INFO:tasks.workunit.client.0.vm03.stdout:8/614: dwrite da/d32/d79/f84 [0,4194304] 0 2026-03-09T16:14:39.083 INFO:tasks.workunit.client.0.vm03.stdout:2/586: stat db/d12/d2a/d99/fd4 0 2026-03-09T16:14:39.083 INFO:tasks.workunit.client.0.vm03.stdout:0/595: write d0/d7/d3e/d57/d5a/d5f/db2/dab/fc4 [614354,30081] 0 2026-03-09T16:14:39.098 INFO:tasks.workunit.client.0.vm03.stdout:9/627: dread d2/d4/d11/d29/d2a/d38/f74 [0,4194304] 0 2026-03-09T16:14:39.100 INFO:tasks.workunit.client.0.vm03.stdout:9/628: readlink d2/d4/d11/d12/l21 0 2026-03-09T16:14:39.100 INFO:tasks.workunit.client.0.vm03.stdout:5/658: truncate d2/d7/de/d11/d19/d31/f42 4312681 0 2026-03-09T16:14:39.104 INFO:tasks.workunit.client.0.vm03.stdout:7/529: mknod d4/da/d5d/db0/da9/cb3 0 2026-03-09T16:14:39.105 INFO:tasks.workunit.client.0.vm03.stdout:7/530: fdatasync d4/d2d/f90 0 2026-03-09T16:14:39.109 INFO:tasks.workunit.client.0.vm03.stdout:7/531: read d4/da/d18/d22/f33 [2536992,111578] 0 2026-03-09T16:14:39.111 INFO:tasks.workunit.client.0.vm03.stdout:4/602: mkdir d5/dd/dba 0 2026-03-09T16:14:39.118 INFO:tasks.workunit.client.0.vm03.stdout:2/587: chown db/d12/d2a/d61/d6d/f81 17 1 2026-03-09T16:14:39.120 INFO:tasks.workunit.client.0.vm03.stdout:8/615: write da/f35 [1592610,46732] 0 2026-03-09T16:14:39.128 INFO:tasks.workunit.client.0.vm03.stdout:6/576: write f7 [1167259,122067] 0 2026-03-09T16:14:39.134 INFO:tasks.workunit.client.0.vm03.stdout:0/596: rmdir d0/da/d5c 39 2026-03-09T16:14:39.138 INFO:tasks.workunit.client.0.vm03.stdout:3/575: symlink d5/d1e/d42/d4c/la7 0 2026-03-09T16:14:39.143 INFO:tasks.workunit.client.0.vm03.stdout:8/616: dread da/db/f75 [0,4194304] 0 2026-03-09T16:14:39.148 INFO:tasks.workunit.client.0.vm03.stdout:9/629: rmdir d2/d4/d11/d29/d2a/d46 39 2026-03-09T16:14:39.156 INFO:tasks.workunit.client.0.vm03.stdout:5/659: dwrite d2/d7/d8/f36 [0,4194304] 0 2026-03-09T16:14:39.157 INFO:tasks.workunit.client.0.vm03.stdout:1/502: truncate d4/d6/f19 1317273 0 2026-03-09T16:14:39.170 INFO:tasks.workunit.client.0.vm03.stdout:7/532: symlink d4/da/d18/d22/d24/d16/lb4 0 2026-03-09T16:14:39.171 INFO:tasks.workunit.client.0.vm03.stdout:7/533: chown d4/da/d18/d22/lae 288194542 1 2026-03-09T16:14:39.178 INFO:tasks.workunit.client.0.vm03.stdout:6/577: mkdir d9/d42/d45/d65/dae 0 2026-03-09T16:14:39.178 INFO:tasks.workunit.client.0.vm03.stdout:0/597: mkdir d0/d7/d3e/d57/d5a/d47/dce 0 2026-03-09T16:14:39.185 INFO:tasks.workunit.client.0.vm03.stdout:3/576: rename d5/d2e/d8c/ca6 to d5/d1e/d42/d4c/ca8 0 2026-03-09T16:14:39.209 INFO:tasks.workunit.client.0.vm03.stdout:2/588: mkdir db/d12/da5/daf/dd5 0 2026-03-09T16:14:39.213 INFO:tasks.workunit.client.0.vm03.stdout:4/603: write d5/d17/d44/f64 [575893,70860] 0 2026-03-09T16:14:39.219 INFO:tasks.workunit.client.0.vm03.stdout:0/598: mkdir d0/d7/d3e/d57/d5a/d5f/db2/dcf 0 2026-03-09T16:14:39.219 INFO:tasks.workunit.client.0.vm03.stdout:4/604: write d5/db/d25/d31/d4d/d5b/d7d/f9d [765827,40507] 0 2026-03-09T16:14:39.229 INFO:tasks.workunit.client.0.vm03.stdout:6/578: dwrite d9/d14/f29 [4194304,4194304] 0 2026-03-09T16:14:39.243 INFO:tasks.workunit.client.0.vm03.stdout:8/617: mknod da/db/da8/db8/cc6 0 2026-03-09T16:14:39.244 INFO:tasks.workunit.client.0.vm03.stdout:8/618: write da/db/f53 [8989736,101443] 0 2026-03-09T16:14:39.251 INFO:tasks.workunit.client.0.vm03.stdout:8/619: chown 
da/d10/d28/d4f/d68/d80/c59 419118 1 2026-03-09T16:14:39.255 INFO:tasks.workunit.client.0.vm03.stdout:1/503: unlink d4/d6/l27 0 2026-03-09T16:14:39.259 INFO:tasks.workunit.client.0.vm03.stdout:3/577: write d5/d1e/f31 [388748,29928] 0 2026-03-09T16:14:39.260 INFO:tasks.workunit.client.0.vm03.stdout:5/660: write d2/d7/de/d33/f8b [970840,87580] 0 2026-03-09T16:14:39.270 INFO:tasks.workunit.client.0.vm03.stdout:4/605: creat d5/dd/d1f/fbb x:0 0 0 2026-03-09T16:14:39.271 INFO:tasks.workunit.client.0.vm03.stdout:0/599: rmdir d0/d7/d3e/d57/d5a/d52/d9f 39 2026-03-09T16:14:39.278 INFO:tasks.workunit.client.0.vm03.stdout:6/579: creat d9/d42/d45/d50/d80/d8a/d9c/d97/faf x:0 0 0 2026-03-09T16:14:39.284 INFO:tasks.workunit.client.0.vm03.stdout:8/620: mkdir da/db/d30/dc7 0 2026-03-09T16:14:39.288 INFO:tasks.workunit.client.0.vm03.stdout:1/504: creat d4/db/d8b/fb0 x:0 0 0 2026-03-09T16:14:39.289 INFO:tasks.workunit.client.0.vm03.stdout:1/505: chown d4/db/ca0 0 1 2026-03-09T16:14:39.290 INFO:tasks.workunit.client.0.vm03.stdout:3/578: fsync d5/d6d/f7a 0 2026-03-09T16:14:39.293 INFO:tasks.workunit.client.0.vm03.stdout:5/661: creat d2/d7/d8/d16/fe5 x:0 0 0 2026-03-09T16:14:39.296 INFO:tasks.workunit.client.0.vm03.stdout:7/534: mkdir d4/da/d18/d22/d24/d16/d3e/db5 0 2026-03-09T16:14:39.301 INFO:tasks.workunit.client.0.vm03.stdout:2/589: link db/d12/d2a/d61/f9b db/d12/d2a/d61/d6d/d8c/d94/da4/fd6 0 2026-03-09T16:14:39.302 INFO:tasks.workunit.client.0.vm03.stdout:2/590: truncate db/d12/d2a/d99/fb8 842508 0 2026-03-09T16:14:39.306 INFO:tasks.workunit.client.0.vm03.stdout:4/606: dwrite d5/db/d25/d31/d4d/f85 [0,4194304] 0 2026-03-09T16:14:39.316 INFO:tasks.workunit.client.0.vm03.stdout:5/662: sync 2026-03-09T16:14:39.318 INFO:tasks.workunit.client.0.vm03.stdout:0/600: unlink d0/da/d1b/f6e 0 2026-03-09T16:14:39.321 INFO:tasks.workunit.client.0.vm03.stdout:9/630: link d2/d4/d11/d12/l2e d2/lc1 0 2026-03-09T16:14:39.325 INFO:tasks.workunit.client.0.vm03.stdout:8/621: rmdir da/d10/d63 39 2026-03-09T16:14:39.333 INFO:tasks.workunit.client.0.vm03.stdout:3/579: creat d5/d6d/d6a/fa9 x:0 0 0 2026-03-09T16:14:39.342 INFO:tasks.workunit.client.0.vm03.stdout:2/591: dwrite db/d12/d2a/f38 [0,4194304] 0 2026-03-09T16:14:39.346 INFO:tasks.workunit.client.0.vm03.stdout:7/535: write d4/da/d18/d22/f33 [978096,18480] 0 2026-03-09T16:14:39.347 INFO:tasks.workunit.client.0.vm03.stdout:4/607: unlink d5/db/f28 0 2026-03-09T16:14:39.351 INFO:tasks.workunit.client.0.vm03.stdout:4/608: dread d5/db/d25/d31/d4d/d5b/d72/f94 [0,4194304] 0 2026-03-09T16:14:39.353 INFO:tasks.workunit.client.0.vm03.stdout:4/609: chown d5/db/d25/d31/d4d/d5b/d72 10601541 1 2026-03-09T16:14:39.355 INFO:tasks.workunit.client.0.vm03.stdout:4/610: write d5/db/d25/d31/d4d/d5b/d72/d77/f91 [884061,86578] 0 2026-03-09T16:14:39.356 INFO:tasks.workunit.client.0.vm03.stdout:4/611: fdatasync d5/db/d25/d31/d4d/d5b/d72/f94 0 2026-03-09T16:14:39.357 INFO:tasks.workunit.client.0.vm03.stdout:4/612: chown d5/db/d25/d8b/da8/fae 1440 1 2026-03-09T16:14:39.358 INFO:tasks.workunit.client.0.vm03.stdout:4/613: write d5/f74 [4881762,72726] 0 2026-03-09T16:14:39.363 INFO:tasks.workunit.client.0.vm03.stdout:5/663: mkdir d2/d7/d8/d24/d27/d43/d4b/de6 0 2026-03-09T16:14:39.369 INFO:tasks.workunit.client.0.vm03.stdout:9/631: mkdir d2/d4/d11/d12/db2/dc2 0 2026-03-09T16:14:39.372 INFO:tasks.workunit.client.0.vm03.stdout:1/506: symlink d4/d6/d1d/d20/d23/lb1 0 2026-03-09T16:14:39.372 INFO:tasks.workunit.client.0.vm03.stdout:1/507: fdatasync d4/d7b/f90 0 2026-03-09T16:14:39.376 
INFO:tasks.workunit.client.0.vm03.stdout:2/592: creat db/d12/d2a/d61/d6d/d8c/d94/dad/db3/fd7 x:0 0 0 2026-03-09T16:14:39.376 INFO:tasks.workunit.client.0.vm03.stdout:2/593: stat db/d12/d2a/d99/fcd 0 2026-03-09T16:14:39.467 INFO:tasks.workunit.client.0.vm03.stdout:4/614: creat d5/db/d25/d31/d4d/d5b/fbc x:0 0 0 2026-03-09T16:14:39.483 INFO:tasks.workunit.client.0.vm03.stdout:5/664: symlink d2/d7/de/d11/d19/d29/d90/le7 0 2026-03-09T16:14:39.483 INFO:tasks.workunit.client.0.vm03.stdout:5/665: fsync d2/d7/de/d11/d19/f8e 0 2026-03-09T16:14:39.486 INFO:tasks.workunit.client.0.vm03.stdout:6/580: rename d9/d22/f24 to d9/d42/d45/d50/fb0 0 2026-03-09T16:14:39.488 INFO:tasks.workunit.client.0.vm03.stdout:6/581: dread - d9/d14/d71/fac zero size 2026-03-09T16:14:39.489 INFO:tasks.workunit.client.0.vm03.stdout:1/508: mkdir d4/db/d8b/db2 0 2026-03-09T16:14:39.493 INFO:tasks.workunit.client.0.vm03.stdout:9/632: write d2/f8 [1505441,93360] 0 2026-03-09T16:14:39.497 INFO:tasks.workunit.client.0.vm03.stdout:0/601: dwrite d0/f54 [0,4194304] 0 2026-03-09T16:14:39.508 INFO:tasks.workunit.client.0.vm03.stdout:2/594: write db/d12/f62 [2296083,73205] 0 2026-03-09T16:14:39.516 INFO:tasks.workunit.client.0.vm03.stdout:8/622: rename da/db/fe to da/d10/d28/d64/fc8 0 2026-03-09T16:14:39.521 INFO:tasks.workunit.client.0.vm03.stdout:1/509: truncate d4/fd 2844764 0 2026-03-09T16:14:39.525 INFO:tasks.workunit.client.0.vm03.stdout:8/623: dwrite da/d10/fa4 [0,4194304] 0 2026-03-09T16:14:39.541 INFO:tasks.workunit.client.0.vm03.stdout:4/615: truncate d5/db/d25/d31/d33/f69 687712 0 2026-03-09T16:14:39.553 INFO:tasks.workunit.client.0.vm03.stdout:2/595: dwrite db/d12/d2a/d61/d79/d83/d64/dbd/f6b [4194304,4194304] 0 2026-03-09T16:14:39.564 INFO:tasks.workunit.client.0.vm03.stdout:6/582: symlink d9/lb1 0 2026-03-09T16:14:39.571 INFO:tasks.workunit.client.0.vm03.stdout:7/536: getdents d4 0 2026-03-09T16:14:39.573 INFO:tasks.workunit.client.0.vm03.stdout:4/616: dread - d5/d17/d44/f84 zero size 2026-03-09T16:14:39.577 INFO:tasks.workunit.client.0.vm03.stdout:2/596: creat db/d12/d2a/d61/d79/d83/d64/dbd/fd8 x:0 0 0 2026-03-09T16:14:39.583 INFO:tasks.workunit.client.0.vm03.stdout:2/597: dwrite db/d12/d2a/d99/fb8 [0,4194304] 0 2026-03-09T16:14:39.583 INFO:tasks.workunit.client.0.vm03.stdout:9/633: truncate d2/f8 1291712 0 2026-03-09T16:14:39.585 INFO:tasks.workunit.client.0.vm03.stdout:9/634: chown d2/d4/d11/d29/f4e 3 1 2026-03-09T16:14:39.588 INFO:tasks.workunit.client.0.vm03.stdout:9/635: dread - d2/de/d88/fa5 zero size 2026-03-09T16:14:39.595 INFO:tasks.workunit.client.0.vm03.stdout:2/598: dwrite db/d12/d2a/d61/d79/f7f [4194304,4194304] 0 2026-03-09T16:14:39.601 INFO:tasks.workunit.client.0.vm03.stdout:1/510: truncate d4/db/f60 2850621 0 2026-03-09T16:14:39.602 INFO:tasks.workunit.client.0.vm03.stdout:1/511: truncate d4/db/f7d 1276818 0 2026-03-09T16:14:39.607 INFO:tasks.workunit.client.0.vm03.stdout:6/583: truncate d9/d22/f62 109418 0 2026-03-09T16:14:39.610 INFO:tasks.workunit.client.0.vm03.stdout:7/537: symlink d4/lb6 0 2026-03-09T16:14:39.616 INFO:tasks.workunit.client.0.vm03.stdout:5/666: getdents d2/d7/de/d11/d19/d31 0 2026-03-09T16:14:39.625 INFO:tasks.workunit.client.0.vm03.stdout:3/580: rename d5/d6d/d6a/d9c to d5/d1e/d42/daa 0 2026-03-09T16:14:39.629 INFO:tasks.workunit.client.0.vm03.stdout:1/512: dread d4/d7b/f90 [0,4194304] 0 2026-03-09T16:14:39.635 INFO:tasks.workunit.client.0.vm03.stdout:9/636: fsync d2/d4/d11/d12/d28/f2f 0 2026-03-09T16:14:39.637 INFO:tasks.workunit.client.0.vm03.stdout:9/637: chown d2/df/l32 27329752 1 
2026-03-09T16:14:39.637 INFO:tasks.workunit.client.0.vm03.stdout:9/638: chown d2/d4/d11/d29/d92/c93 732 1 2026-03-09T16:14:39.643 INFO:tasks.workunit.client.0.vm03.stdout:4/617: write d5/d17/d44/f61 [848034,22804] 0 2026-03-09T16:14:39.648 INFO:tasks.workunit.client.0.vm03.stdout:6/584: readlink d9/l12 0 2026-03-09T16:14:39.653 INFO:tasks.workunit.client.0.vm03.stdout:5/667: symlink d2/d7/de/d33/le8 0 2026-03-09T16:14:39.666 INFO:tasks.workunit.client.0.vm03.stdout:8/624: rename da/d6c/c82 to da/d32/cc9 0 2026-03-09T16:14:39.666 INFO:tasks.workunit.client.0.vm03.stdout:0/602: rename d0/d7/d3e/d57/d5a/d5f/db2/d8e to d0/d7/d3e/d57/d5a/d5f/db2/d8e/dd0 22 2026-03-09T16:14:39.667 INFO:tasks.workunit.client.0.vm03.stdout:5/668: dread d2/d7/de/d11/d19/f8e [0,4194304] 0 2026-03-09T16:14:39.668 INFO:tasks.workunit.client.0.vm03.stdout:0/603: write d0/d7/d48/f43 [204817,124849] 0 2026-03-09T16:14:39.685 INFO:tasks.workunit.client.0.vm03.stdout:1/513: fdatasync d4/db/f21 0 2026-03-09T16:14:39.686 INFO:tasks.workunit.client.0.vm03.stdout:0/604: dread d0/d7/d3e/d57/d5a/d82/fa0 [0,4194304] 0 2026-03-09T16:14:39.694 INFO:tasks.workunit.client.0.vm03.stdout:4/618: creat d5/db/d25/d31/d4d/d5b/d7d/fbd x:0 0 0 2026-03-09T16:14:39.697 INFO:tasks.workunit.client.0.vm03.stdout:2/599: symlink db/d12/da5/daf/dd5/ld9 0 2026-03-09T16:14:39.697 INFO:tasks.workunit.client.0.vm03.stdout:4/619: write d5/d17/f2b [4925441,110262] 0 2026-03-09T16:14:39.698 INFO:tasks.workunit.client.0.vm03.stdout:6/585: creat d9/d22/fb2 x:0 0 0 2026-03-09T16:14:39.699 INFO:tasks.workunit.client.0.vm03.stdout:2/600: dread - db/d12/d2a/d61/d79/d83/d64/dbd/da0/fcb zero size 2026-03-09T16:14:39.707 INFO:tasks.workunit.client.0.vm03.stdout:8/625: mknod da/db/d43/cca 0 2026-03-09T16:14:39.708 INFO:tasks.workunit.client.0.vm03.stdout:2/601: dwrite db/d12/d2a/d61/d79/fb7 [0,4194304] 0 2026-03-09T16:14:39.718 INFO:tasks.workunit.client.0.vm03.stdout:0/605: sync 2026-03-09T16:14:39.718 INFO:tasks.workunit.client.0.vm03.stdout:4/620: sync 2026-03-09T16:14:39.722 INFO:tasks.workunit.client.0.vm03.stdout:4/621: chown d5/dd/c38 71456914 1 2026-03-09T16:14:39.727 INFO:tasks.workunit.client.0.vm03.stdout:8/626: dread da/d10/d28/f8b [0,4194304] 0 2026-03-09T16:14:39.728 INFO:tasks.workunit.client.0.vm03.stdout:8/627: write da/db/f53 [7703435,56087] 0 2026-03-09T16:14:39.737 INFO:tasks.workunit.client.0.vm03.stdout:3/581: getdents d5/d1e/d42/d4c/da1 0 2026-03-09T16:14:39.740 INFO:tasks.workunit.client.0.vm03.stdout:5/669: dwrite d2/d7/d3c/d3d/f93 [0,4194304] 0 2026-03-09T16:14:39.743 INFO:tasks.workunit.client.0.vm03.stdout:4/622: sync 2026-03-09T16:14:39.755 INFO:tasks.workunit.client.0.vm03.stdout:5/670: dread d2/d7/de/d11/d38/d3b/fb3 [0,4194304] 0 2026-03-09T16:14:39.766 INFO:tasks.workunit.client.0.vm03.stdout:8/628: rename da/f35 to da/d1d/fcb 0 2026-03-09T16:14:39.773 INFO:tasks.workunit.client.0.vm03.stdout:3/582: creat d5/d44/d61/fab x:0 0 0 2026-03-09T16:14:39.773 INFO:tasks.workunit.client.0.vm03.stdout:3/583: dread - d5/d53/d6c/f9f zero size 2026-03-09T16:14:39.775 INFO:tasks.workunit.client.0.vm03.stdout:9/639: truncate d2/d4/d11/d29/d2a/d4d/f56 3500575 0 2026-03-09T16:14:39.777 INFO:tasks.workunit.client.0.vm03.stdout:7/538: write d4/f3b [4430846,15541] 0 2026-03-09T16:14:39.781 INFO:tasks.workunit.client.0.vm03.stdout:9/640: sync 2026-03-09T16:14:39.782 INFO:tasks.workunit.client.0.vm03.stdout:9/641: dread - d2/d4/d11/d29/fc0 zero size 2026-03-09T16:14:39.795 INFO:tasks.workunit.client.0.vm03.stdout:1/514: write d4/d6/f9 [4262969,50865] 0 
2026-03-09T16:14:39.798 INFO:tasks.workunit.client.0.vm03.stdout:1/515: read d4/d6/d3b/f95 [34859,18234] 0 2026-03-09T16:14:39.821 INFO:tasks.workunit.client.0.vm03.stdout:6/586: link d9/d22/f1c d9/d8e/fb3 0 2026-03-09T16:14:39.821 INFO:tasks.workunit.client.0.vm03.stdout:6/587: readlink d9/d14/da5/laa 0 2026-03-09T16:14:39.821 INFO:tasks.workunit.client.0.vm03.stdout:8/629: mknod da/db/da8/ccc 0 2026-03-09T16:14:39.822 INFO:tasks.workunit.client.0.vm03.stdout:8/630: dread - da/d10/d28/fb0 zero size 2026-03-09T16:14:39.822 INFO:tasks.workunit.client.0.vm03.stdout:3/584: truncate d5/d1e/f26 3641977 0 2026-03-09T16:14:39.822 INFO:tasks.workunit.client.0.vm03.stdout:6/588: dread - d9/d42/f9a zero size 2026-03-09T16:14:39.823 INFO:tasks.workunit.client.0.vm03.stdout:8/631: fdatasync da/d6c/f8e 0 2026-03-09T16:14:39.834 INFO:tasks.workunit.client.0.vm03.stdout:4/623: mkdir d5/db/d25/d8b/da8/dbe 0 2026-03-09T16:14:39.835 INFO:tasks.workunit.client.0.vm03.stdout:4/624: chown d5/d17/d44/f64 13749 1 2026-03-09T16:14:39.835 INFO:tasks.workunit.client.0.vm03.stdout:4/625: readlink d5/d17/l97 0 2026-03-09T16:14:39.841 INFO:tasks.workunit.client.0.vm03.stdout:2/602: getdents db/d12/d2a 0 2026-03-09T16:14:39.851 INFO:tasks.workunit.client.0.vm03.stdout:8/632: symlink da/d1d/d3b/lcd 0 2026-03-09T16:14:39.853 INFO:tasks.workunit.client.0.vm03.stdout:1/516: dwrite d4/d6/d3b/d63/f77 [0,4194304] 0 2026-03-09T16:14:39.855 INFO:tasks.workunit.client.0.vm03.stdout:1/517: read - d4/d31/d5c/f9e zero size 2026-03-09T16:14:39.856 INFO:tasks.workunit.client.0.vm03.stdout:1/518: write d4/d6/d3b/d63/f77 [2155565,52125] 0 2026-03-09T16:14:39.863 INFO:tasks.workunit.client.0.vm03.stdout:7/539: mkdir d4/da/d18/d22/d24/d15/d71/db7 0 2026-03-09T16:14:39.864 INFO:tasks.workunit.client.0.vm03.stdout:7/540: truncate d4/da/d5d/db0/fb2 96188 0 2026-03-09T16:14:39.868 INFO:tasks.workunit.client.0.vm03.stdout:9/642: symlink d2/d4/d11/lc3 0 2026-03-09T16:14:39.874 INFO:tasks.workunit.client.0.vm03.stdout:4/626: symlink d5/db/d25/d31/d4d/da9/lbf 0 2026-03-09T16:14:39.891 INFO:tasks.workunit.client.0.vm03.stdout:0/606: getdents d0/da/d1b/d9b 0 2026-03-09T16:14:39.895 INFO:tasks.workunit.client.0.vm03.stdout:3/585: mknod d5/d1e/d42/d34/cac 0 2026-03-09T16:14:39.905 INFO:tasks.workunit.client.0.vm03.stdout:2/603: mkdir db/d12/d2a/d61/d6d/d8c/d94/dad/db3/dda 0 2026-03-09T16:14:39.912 INFO:tasks.workunit.client.0.vm03.stdout:8/633: mkdir da/d10/d28/db1/dce 0 2026-03-09T16:14:39.912 INFO:tasks.workunit.client.0.vm03.stdout:8/634: chown da/db/c25 129210 1 2026-03-09T16:14:39.928 INFO:tasks.workunit.client.0.vm03.stdout:7/541: mkdir d4/da/d5d/db0/da9/db8 0 2026-03-09T16:14:39.938 INFO:tasks.workunit.client.0.vm03.stdout:9/643: dwrite d2/d4/d11/d29/d2a/d46/f9e [0,4194304] 0 2026-03-09T16:14:39.939 INFO:tasks.workunit.client.0.vm03.stdout:4/627: symlink d5/dd/d1f/d95/lc0 0 2026-03-09T16:14:39.956 INFO:tasks.workunit.client.0.vm03.stdout:1/519: write d4/fa [3414734,103740] 0 2026-03-09T16:14:39.956 INFO:tasks.workunit.client.0.vm03.stdout:5/671: getdents d2/d7/d8 0 2026-03-09T16:14:39.956 INFO:tasks.workunit.client.0.vm03.stdout:0/607: symlink d0/d7/d3e/d57/d5a/d5f/db2/dab/ld1 0 2026-03-09T16:14:39.957 INFO:tasks.workunit.client.0.vm03.stdout:1/520: stat d4/d6/d1d/d20/l83 0 2026-03-09T16:14:39.958 INFO:tasks.workunit.client.0.vm03.stdout:5/672: dread - d2/d7/d8/d16/d5c/dcf/fe4 zero size 2026-03-09T16:14:39.960 INFO:tasks.workunit.client.0.vm03.stdout:2/604: unlink db/d12/d2a/d61/f20 0 2026-03-09T16:14:39.962 
INFO:tasks.workunit.client.0.vm03.stdout:6/589: rename d9/l12 to d9/d42/d45/lb4 0 2026-03-09T16:14:39.963 INFO:tasks.workunit.client.0.vm03.stdout:8/635: rmdir da/db/da8 39 2026-03-09T16:14:39.963 INFO:tasks.workunit.client.0.vm03.stdout:7/542: creat d4/da/d45/fb9 x:0 0 0 2026-03-09T16:14:39.967 INFO:tasks.workunit.client.0.vm03.stdout:8/636: dread da/d1d/fcb [0,4194304] 0 2026-03-09T16:14:39.981 INFO:tasks.workunit.client.0.vm03.stdout:2/605: unlink db/d12/f57 0 2026-03-09T16:14:39.995 INFO:tasks.workunit.client.0.vm03.stdout:9/644: rename d2/d4/d11/d29/ca1 to d2/d4/d11/d12/db2/dc2/cc4 0 2026-03-09T16:14:40.004 INFO:tasks.workunit.client.0.vm03.stdout:7/543: truncate d4/da/d18/f44 5604670 0 2026-03-09T16:14:40.007 INFO:tasks.workunit.client.0.vm03.stdout:3/586: creat d5/d1e/d42/d34/fad x:0 0 0 2026-03-09T16:14:40.013 INFO:tasks.workunit.client.0.vm03.stdout:8/637: mknod da/d10/d28/d4f/d68/d80/ccf 0 2026-03-09T16:14:40.017 INFO:tasks.workunit.client.0.vm03.stdout:0/608: write d0/da/d1b/d9b/f93 [686248,91074] 0 2026-03-09T16:14:40.022 INFO:tasks.workunit.client.0.vm03.stdout:5/673: dwrite d2/d7/d1a/d1c/d3f/f67 [0,4194304] 0 2026-03-09T16:14:40.027 INFO:tasks.workunit.client.0.vm03.stdout:5/674: dwrite d2/d7/d8/f86 [0,4194304] 0 2026-03-09T16:14:40.036 INFO:tasks.workunit.client.0.vm03.stdout:5/675: dwrite d2/d7/de/d11/d19/d31/f99 [4194304,4194304] 0 2026-03-09T16:14:40.038 INFO:tasks.workunit.client.0.vm03.stdout:5/676: write d2/d7/d1a/d1c/f5e [8303919,92465] 0 2026-03-09T16:14:40.038 INFO:tasks.workunit.client.0.vm03.stdout:5/677: chown d2/d7/d1a/d1c/d6c 6 1 2026-03-09T16:14:40.040 INFO:tasks.workunit.client.0.vm03.stdout:5/678: write d2/d75/fe3 [389612,40649] 0 2026-03-09T16:14:40.045 INFO:tasks.workunit.client.0.vm03.stdout:6/590: write d9/d42/f78 [1287626,65470] 0 2026-03-09T16:14:40.061 INFO:tasks.workunit.client.0.vm03.stdout:3/587: read d5/d1e/d42/f2c [99390,112094] 0 2026-03-09T16:14:40.063 INFO:tasks.workunit.client.0.vm03.stdout:0/609: mkdir d0/d7/d3e/d57/d5a/d82/dd2 0 2026-03-09T16:14:40.065 INFO:tasks.workunit.client.0.vm03.stdout:0/610: read d0/d7/d3e/d57/d5a/d5f/db2/dab/fc4 [174692,102325] 0 2026-03-09T16:14:40.068 INFO:tasks.workunit.client.0.vm03.stdout:2/606: symlink db/d12/d2a/d61/dbe/ldb 0 2026-03-09T16:14:40.083 INFO:tasks.workunit.client.0.vm03.stdout:5/679: mkdir d2/d7/de9 0 2026-03-09T16:14:40.083 INFO:tasks.workunit.client.0.vm03.stdout:4/628: getdents d5/db/d25/d31/d33/d79 0 2026-03-09T16:14:40.090 INFO:tasks.workunit.client.0.vm03.stdout:5/680: dread d2/d7/de/d11/d19/d31/d35/d87/f8d [0,4194304] 0 2026-03-09T16:14:40.100 INFO:tasks.workunit.client.0.vm03.stdout:6/591: dread d9/d42/d45/d50/d80/d8a/d9c/d97/da8/f7d [0,4194304] 0 2026-03-09T16:14:40.101 INFO:tasks.workunit.client.0.vm03.stdout:6/592: chown d9/d14/l25 540862 1 2026-03-09T16:14:40.105 INFO:tasks.workunit.client.0.vm03.stdout:4/629: dread d5/f74 [0,4194304] 0 2026-03-09T16:14:40.118 INFO:tasks.workunit.client.0.vm03.stdout:4/630: dwrite d5/db/d25/d31/d4d/fb2 [0,4194304] 0 2026-03-09T16:14:40.124 INFO:tasks.workunit.client.0.vm03.stdout:9/645: write d2/d4/d11/f41 [825897,118037] 0 2026-03-09T16:14:40.133 INFO:tasks.workunit.client.0.vm03.stdout:2/607: creat db/d12/d2a/d61/d6d/d8c/d94/da4/fdc x:0 0 0 2026-03-09T16:14:40.134 INFO:tasks.workunit.client.0.vm03.stdout:7/544: dwrite d4/d2d/d4b/f6b [0,4194304] 0 2026-03-09T16:14:40.136 INFO:tasks.workunit.client.0.vm03.stdout:9/646: read d2/d4/d11/d12/f45 [2182396,53644] 0 2026-03-09T16:14:40.139 INFO:tasks.workunit.client.0.vm03.stdout:1/521: rename 
d4/d6/d1d/d20/l34 to d4/d6/d3b/d6b/d25/lb3 0 2026-03-09T16:14:40.139 INFO:tasks.workunit.client.0.vm03.stdout:5/681: rename d2/d7/de/d33 to d2/d7/de/d33/dea 22 2026-03-09T16:14:40.163 INFO:tasks.workunit.client.0.vm03.stdout:0/611: creat d0/d7/d3e/d57/d5a/d52/d9f/fd3 x:0 0 0 2026-03-09T16:14:40.163 INFO:tasks.workunit.client.0.vm03.stdout:6/593: creat d9/d42/d45/d65/fb5 x:0 0 0 2026-03-09T16:14:40.164 INFO:tasks.workunit.client.0.vm03.stdout:4/631: creat d5/db/d25/d31/d4d/da9/fc1 x:0 0 0 2026-03-09T16:14:40.164 INFO:tasks.workunit.client.0.vm03.stdout:2/608: fsync db/d12/f77 0 2026-03-09T16:14:40.167 INFO:tasks.workunit.client.0.vm03.stdout:5/682: stat d2/d7/d8/d24/d27/c34 0 2026-03-09T16:14:40.169 INFO:tasks.workunit.client.0.vm03.stdout:5/683: fdatasync d2/d7/de/d11/d19/d31/f7e 0 2026-03-09T16:14:40.172 INFO:tasks.workunit.client.0.vm03.stdout:6/594: dwrite d9/d42/d45/d50/d80/fa1 [0,4194304] 0 2026-03-09T16:14:40.185 INFO:tasks.workunit.client.0.vm03.stdout:8/638: getdents da/d45 0 2026-03-09T16:14:40.187 INFO:tasks.workunit.client.0.vm03.stdout:8/639: fsync da/d6c/fae 0 2026-03-09T16:14:40.187 INFO:tasks.workunit.client.0.vm03.stdout:3/588: rmdir d5/d1e/d42/daa 0 2026-03-09T16:14:40.187 INFO:tasks.workunit.client.0.vm03.stdout:3/589: write d5/d44/f56 [3128577,43336] 0 2026-03-09T16:14:40.187 INFO:tasks.workunit.client.0.vm03.stdout:3/590: chown d5/d6d/d5a/f7c 849 1 2026-03-09T16:14:40.187 INFO:tasks.workunit.client.0.vm03.stdout:3/591: fdatasync d5/d1e/f31 0 2026-03-09T16:14:40.200 INFO:tasks.workunit.client.0.vm03.stdout:4/632: readlink d5/l29 0 2026-03-09T16:14:40.203 INFO:tasks.workunit.client.0.vm03.stdout:8/640: dread da/d10/f23 [0,4194304] 0 2026-03-09T16:14:40.203 INFO:tasks.workunit.client.0.vm03.stdout:0/612: dwrite d0/da/d5c/db6/fbc [0,4194304] 0 2026-03-09T16:14:40.205 INFO:tasks.workunit.client.0.vm03.stdout:2/609: mknod db/d12/d2a/d61/d79/d83/d64/dbd/da0/cdd 0 2026-03-09T16:14:40.210 INFO:tasks.workunit.client.0.vm03.stdout:7/545: rename d4/ld to d4/da/d18/d22/d24/d16/d3e/lba 0 2026-03-09T16:14:40.220 INFO:tasks.workunit.client.0.vm03.stdout:6/595: write d9/f3b [2824188,47596] 0 2026-03-09T16:14:40.225 INFO:tasks.workunit.client.0.vm03.stdout:8/641: symlink da/d10/d28/db1/ld0 0 2026-03-09T16:14:40.232 INFO:tasks.workunit.client.0.vm03.stdout:0/613: unlink d0/d7/d3e/d57/d5a/d5f/db2/l65 0 2026-03-09T16:14:40.232 INFO:tasks.workunit.client.0.vm03.stdout:2/610: creat db/d12/d2a/d99/fde x:0 0 0 2026-03-09T16:14:40.232 INFO:tasks.workunit.client.0.vm03.stdout:9/647: rename d2/d4/d1f/f25 to d2/d4/d11/d12/d28/fc5 0 2026-03-09T16:14:40.232 INFO:tasks.workunit.client.0.vm03.stdout:2/611: read db/d12/d2a/d61/d6d/d8c/d94/dad/fb5 [1442053,55538] 0 2026-03-09T16:14:40.233 INFO:tasks.workunit.client.0.vm03.stdout:2/612: readlink db/d12/d2a/d61/d79/d83/d52/l89 0 2026-03-09T16:14:40.234 INFO:tasks.workunit.client.0.vm03.stdout:2/613: chown db/d12/d2a/d61/f74 2909551 1 2026-03-09T16:14:40.236 INFO:tasks.workunit.client.0.vm03.stdout:2/614: write db/d12/d2a/d99/fde [129304,66793] 0 2026-03-09T16:14:40.237 INFO:tasks.workunit.client.0.vm03.stdout:7/546: dread d4/d2d/d4b/f4c [0,4194304] 0 2026-03-09T16:14:40.246 INFO:tasks.workunit.client.0.vm03.stdout:6/596: symlink d9/d14/da5/lb6 0 2026-03-09T16:14:40.249 INFO:tasks.workunit.client.0.vm03.stdout:1/522: truncate d4/d6/d1d/d20/f72 1999871 0 2026-03-09T16:14:40.252 INFO:tasks.workunit.client.0.vm03.stdout:9/648: dread d2/d4/d11/f41 [0,4194304] 0 2026-03-09T16:14:40.253 INFO:tasks.workunit.client.0.vm03.stdout:6/597: dwrite d9/d22/f83 [0,4194304] 
0 2026-03-09T16:14:40.258 INFO:tasks.workunit.client.0.vm03.stdout:8/642: fsync f8 0 2026-03-09T16:14:40.260 INFO:tasks.workunit.client.0.vm03.stdout:5/684: dwrite d2/d7/de/f78 [0,4194304] 0 2026-03-09T16:14:40.265 INFO:tasks.workunit.client.0.vm03.stdout:1/523: dwrite d4/d6/d3b/d6b/d25/f4e [0,4194304] 0 2026-03-09T16:14:40.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:39 vm05.local ceph-mon[58702]: pgmap v12: 65 pgs: 65 active+clean; 1.3 GiB data, 4.4 GiB used, 116 GiB / 120 GiB avail; 35 MiB/s rd, 127 MiB/s wr, 309 op/s 2026-03-09T16:14:40.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:39 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:40.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:39 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:40.277 INFO:tasks.workunit.client.0.vm03.stdout:4/633: rename d5/d17/f9c to d5/db/d25/d31/d4d/d5b/fc2 0 2026-03-09T16:14:40.278 INFO:tasks.workunit.client.0.vm03.stdout:2/615: rename db/d12/d2a/d61/d6d/d8c/d94/dad to db/d12/d2a/d61/d6d/d8c/d94/dad/db3/ddf 22 2026-03-09T16:14:40.284 INFO:tasks.workunit.client.0.vm03.stdout:7/547: rmdir d4/da/d5d/db0/d61 39 2026-03-09T16:14:40.293 INFO:tasks.workunit.client.0.vm03.stdout:8/643: unlink da/d6c/f8e 0 2026-03-09T16:14:40.293 INFO:tasks.workunit.client.0.vm03.stdout:6/598: mkdir d9/d42/d45/d50/d80/d90/db7 0 2026-03-09T16:14:40.299 INFO:tasks.workunit.client.0.vm03.stdout:4/634: rmdir d5/d56 39 2026-03-09T16:14:40.310 INFO:tasks.workunit.client.0.vm03.stdout:2/616: rename db/c24 to db/d12/da5/dbb/dc3/ce0 0 2026-03-09T16:14:40.319 INFO:tasks.workunit.client.0.vm03.stdout:3/592: getdents d5/d6d/d5a 0 2026-03-09T16:14:40.319 INFO:tasks.workunit.client.0.vm03.stdout:1/524: mknod d4/d6/d3b/d6b/cb4 0 2026-03-09T16:14:40.319 INFO:tasks.workunit.client.0.vm03.stdout:1/525: chown d4/d6/d1d/d20/d23/f30 1014856 1 2026-03-09T16:14:40.319 INFO:tasks.workunit.client.0.vm03.stdout:0/614: creat d0/d7/d3e/fd4 x:0 0 0 2026-03-09T16:14:40.319 INFO:tasks.workunit.client.0.vm03.stdout:8/644: symlink da/d10/d28/d4f/d68/ld1 0 2026-03-09T16:14:40.319 INFO:tasks.workunit.client.0.vm03.stdout:6/599: mknod d9/d42/cb8 0 2026-03-09T16:14:40.319 INFO:tasks.workunit.client.0.vm03.stdout:5/685: rename d2/d7/d8/d24/d27/l2c to d2/d7/de9/leb 0 2026-03-09T16:14:40.319 INFO:tasks.workunit.client.0.vm03.stdout:1/526: dwrite d4/d6/d3b/d63/f82 [4194304,4194304] 0 2026-03-09T16:14:40.319 INFO:tasks.workunit.client.0.vm03.stdout:7/548: dwrite d4/da/d18/d22/d24/d16/d6e/f73 [0,4194304] 0 2026-03-09T16:14:40.321 INFO:tasks.workunit.client.0.vm03.stdout:3/593: mkdir d5/d1e/d42/d55/d86/dae 0 2026-03-09T16:14:40.323 INFO:tasks.workunit.client.0.vm03.stdout:5/686: fsync d2/d7/d8/d16/d5c/dcf/fe4 0 2026-03-09T16:14:40.327 INFO:tasks.workunit.client.0.vm03.stdout:6/600: mknod d9/d14/d71/cb9 0 2026-03-09T16:14:40.332 INFO:tasks.workunit.client.0.vm03.stdout:1/527: read - d4/d6/d1d/d20/d93/f6f zero size 2026-03-09T16:14:40.335 INFO:tasks.workunit.client.0.vm03.stdout:1/528: dread d4/d39/d70/f97 [0,4194304] 0 2026-03-09T16:14:40.363 INFO:tasks.workunit.client.0.vm03.stdout:5/687: creat d2/d75/fec x:0 0 0 2026-03-09T16:14:40.363 INFO:tasks.workunit.client.0.vm03.stdout:2/617: sync 2026-03-09T16:14:40.364 INFO:tasks.workunit.client.0.vm03.stdout:5/688: write d2/d75/fd0 [375618,901] 0 2026-03-09T16:14:40.379 INFO:tasks.workunit.client.0.vm03.stdout:0/615: getdents d0/da/d1b 0 2026-03-09T16:14:40.384 INFO:tasks.workunit.client.0.vm03.stdout:0/616: chown 
d0/d7/d3e/d57/d5a/d82/d89/dbd/la6 10885 1 2026-03-09T16:14:40.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:39 vm03.local ceph-mon[51019]: pgmap v12: 65 pgs: 65 active+clean; 1.3 GiB data, 4.4 GiB used, 116 GiB / 120 GiB avail; 35 MiB/s rd, 127 MiB/s wr, 309 op/s 2026-03-09T16:14:40.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:39 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:40.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:39 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:40.391 INFO:tasks.workunit.client.0.vm03.stdout:9/649: write d2/de/d88/f86 [334503,22328] 0 2026-03-09T16:14:40.402 INFO:tasks.workunit.client.0.vm03.stdout:8/645: write da/d10/d28/d4f/d85/fa1 [558820,28101] 0 2026-03-09T16:14:40.404 INFO:tasks.workunit.client.0.vm03.stdout:4/635: dwrite d5/db/d25/f26 [0,4194304] 0 2026-03-09T16:14:40.412 INFO:tasks.workunit.client.0.vm03.stdout:4/636: dwrite d5/d17/d44/f61 [0,4194304] 0 2026-03-09T16:14:40.423 INFO:tasks.workunit.client.0.vm03.stdout:6/601: dwrite d9/d22/f62 [0,4194304] 0 2026-03-09T16:14:40.425 INFO:tasks.workunit.client.0.vm03.stdout:6/602: dread - d9/d8e/f94 zero size 2026-03-09T16:14:40.449 INFO:tasks.workunit.client.0.vm03.stdout:8/646: dread da/d6c/d7a/f97 [0,4194304] 0 2026-03-09T16:14:40.467 INFO:tasks.workunit.client.0.vm03.stdout:3/594: link d5/d53/d6c/l4f d5/d6d/d5a/laf 0 2026-03-09T16:14:40.468 INFO:tasks.workunit.client.0.vm03.stdout:3/595: write d5/d1e/d42/d55/f7e [461732,88721] 0 2026-03-09T16:14:40.475 INFO:tasks.workunit.client.0.vm03.stdout:2/618: creat db/d12/d2a/d61/d6d/d8c/d94/dad/db3/fe1 x:0 0 0 2026-03-09T16:14:40.480 INFO:tasks.workunit.client.0.vm03.stdout:5/689: truncate d2/d7/de/d11/d19/d31/d35/d87/f8d 429567 0 2026-03-09T16:14:40.483 INFO:tasks.workunit.client.0.vm03.stdout:7/549: link d4/d2d/d4b/l86 d4/da/d45/lbb 0 2026-03-09T16:14:40.490 INFO:tasks.workunit.client.0.vm03.stdout:9/650: unlink d2/d4/d11/d29/d2a/d38/c71 0 2026-03-09T16:14:40.491 INFO:tasks.workunit.client.0.vm03.stdout:9/651: chown d2/d4/d11/d29 178 1 2026-03-09T16:14:40.501 INFO:tasks.workunit.client.0.vm03.stdout:8/647: symlink da/d1d/d3b/ld2 0 2026-03-09T16:14:40.502 INFO:tasks.workunit.client.0.vm03.stdout:2/619: rename db/d12/da5/daf to db/d12/da5/de2 0 2026-03-09T16:14:40.506 INFO:tasks.workunit.client.0.vm03.stdout:5/690: symlink d2/d7/de/d54/led 0 2026-03-09T16:14:40.518 INFO:tasks.workunit.client.0.vm03.stdout:3/596: write d5/d1e/d42/f2c [1094402,97402] 0 2026-03-09T16:14:40.519 INFO:tasks.workunit.client.0.vm03.stdout:7/550: write d4/da/d18/d22/d24/d15/f34 [1524783,70266] 0 2026-03-09T16:14:40.522 INFO:tasks.workunit.client.0.vm03.stdout:0/617: mknod d0/d7/d3e/d57/d5a/d47/dce/cd5 0 2026-03-09T16:14:40.524 INFO:tasks.workunit.client.0.vm03.stdout:4/637: rmdir d5/d56 39 2026-03-09T16:14:40.529 INFO:tasks.workunit.client.0.vm03.stdout:4/638: dwrite d5/d17/f8d [0,4194304] 0 2026-03-09T16:14:40.530 INFO:tasks.workunit.client.0.vm03.stdout:4/639: read - d5/db/d25/d31/d4d/d5b/d7d/fbd zero size 2026-03-09T16:14:40.532 INFO:tasks.workunit.client.0.vm03.stdout:4/640: read d5/db/d25/d31/d33/d79/f7b [324159,39259] 0 2026-03-09T16:14:40.545 INFO:tasks.workunit.client.0.vm03.stdout:2/620: mknod db/d12/da5/ce3 0 2026-03-09T16:14:40.547 INFO:tasks.workunit.client.0.vm03.stdout:5/691: fsync d2/d7/d1a/d1c/d6c/f79 0 2026-03-09T16:14:40.547 INFO:tasks.workunit.client.0.vm03.stdout:1/529: getdents d4/d39/d7f 0 2026-03-09T16:14:40.551 
INFO:tasks.workunit.client.0.vm03.stdout:1/530: dwrite d4/d6/d1d/f66 [0,4194304] 0 2026-03-09T16:14:40.553 INFO:tasks.workunit.client.0.vm03.stdout:2/621: dwrite db/d12/d2a/f38 [0,4194304] 0 2026-03-09T16:14:40.556 INFO:tasks.workunit.client.0.vm03.stdout:6/603: creat d9/d42/d45/d50/fba x:0 0 0 2026-03-09T16:14:40.567 INFO:tasks.workunit.client.0.vm03.stdout:2/622: dwrite db/d12/d2a/d99/fcd [0,4194304] 0 2026-03-09T16:14:40.569 INFO:tasks.workunit.client.0.vm03.stdout:2/623: write db/d12/d2a/d61/f65 [438784,21877] 0 2026-03-09T16:14:40.584 INFO:tasks.workunit.client.0.vm03.stdout:5/692: mkdir d2/d7/d8/d16/dee 0 2026-03-09T16:14:40.602 INFO:tasks.workunit.client.0.vm03.stdout:7/551: mkdir d4/da/d5d/db0/d61/dbc 0 2026-03-09T16:14:40.603 INFO:tasks.workunit.client.0.vm03.stdout:9/652: rename d2/d4/d11/d29/f5d to d2/fc6 0 2026-03-09T16:14:40.606 INFO:tasks.workunit.client.0.vm03.stdout:7/552: dwrite d4/d2d/f90 [0,4194304] 0 2026-03-09T16:14:40.619 INFO:tasks.workunit.client.0.vm03.stdout:2/624: mkdir db/d12/da5/de4 0 2026-03-09T16:14:40.654 INFO:tasks.workunit.client.0.vm03.stdout:8/648: rename da/d1d/d3b/f4b to da/d10/d28/fd3 0 2026-03-09T16:14:40.661 INFO:tasks.workunit.client.0.vm03.stdout:9/653: mkdir d2/d4/d11/d12/dc7 0 2026-03-09T16:14:40.666 INFO:tasks.workunit.client.0.vm03.stdout:1/531: mkdir d4/d6/d1d/db5 0 2026-03-09T16:14:40.682 INFO:tasks.workunit.client.0.vm03.stdout:3/597: write d5/d1e/d42/d55/f57 [1516203,15921] 0 2026-03-09T16:14:40.693 INFO:tasks.workunit.client.0.vm03.stdout:4/641: link d5/db/d25/d8b/da8/d81/f8c d5/db/d25/d31/d4d/d5b/d7d/fc3 0 2026-03-09T16:14:40.697 INFO:tasks.workunit.client.0.vm03.stdout:6/604: rename d9/d42/d45/d50/l72 to d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/lbb 0 2026-03-09T16:14:40.702 INFO:tasks.workunit.client.0.vm03.stdout:5/693: dwrite d2/d7/de/d11/d38/d3b/f68 [0,4194304] 0 2026-03-09T16:14:40.702 INFO:tasks.workunit.client.0.vm03.stdout:5/694: readlink d2/d7/d8/d16/l5f 0 2026-03-09T16:14:40.702 INFO:tasks.workunit.client.0.vm03.stdout:2/625: write db/d12/d2a/d61/d79/f95 [23698,22587] 0 2026-03-09T16:14:40.702 INFO:tasks.workunit.client.0.vm03.stdout:8/649: write da/d1d/fcb [704375,128640] 0 2026-03-09T16:14:40.705 INFO:tasks.workunit.client.0.vm03.stdout:8/650: chown da/d6c 1795 1 2026-03-09T16:14:40.712 INFO:tasks.workunit.client.0.vm03.stdout:5/695: read d2/d7/d8/d16/d5c/fb5 [2276449,34940] 0 2026-03-09T16:14:40.713 INFO:tasks.workunit.client.0.vm03.stdout:4/642: dwrite d5/db/f34 [0,4194304] 0 2026-03-09T16:14:40.717 INFO:tasks.workunit.client.0.vm03.stdout:7/553: symlink d4/da/d18/d22/d24/lbd 0 2026-03-09T16:14:40.718 INFO:tasks.workunit.client.0.vm03.stdout:0/618: getdents d0/d7/d3e/d95 0 2026-03-09T16:14:40.719 INFO:tasks.workunit.client.0.vm03.stdout:7/554: read d4/da/f42 [1959949,2101] 0 2026-03-09T16:14:40.728 INFO:tasks.workunit.client.0.vm03.stdout:2/626: readlink db/l3a 0 2026-03-09T16:14:40.730 INFO:tasks.workunit.client.0.vm03.stdout:9/654: creat d2/de/d88/d7a/fc8 x:0 0 0 2026-03-09T16:14:40.736 INFO:tasks.workunit.client.0.vm03.stdout:5/696: unlink d2/d7/d1a/l1f 0 2026-03-09T16:14:40.740 INFO:tasks.workunit.client.0.vm03.stdout:7/555: mknod d4/da/d45/cbe 0 2026-03-09T16:14:40.755 INFO:tasks.workunit.client.0.vm03.stdout:7/556: fdatasync d4/da/d18/d22/d24/d16/d6e/fa1 0 2026-03-09T16:14:40.755 INFO:tasks.workunit.client.0.vm03.stdout:2/627: creat db/d12/da5/dbb/dc3/fe5 x:0 0 0 2026-03-09T16:14:40.755 INFO:tasks.workunit.client.0.vm03.stdout:9/655: rmdir d2/d4/d11 39 2026-03-09T16:14:40.755 
INFO:tasks.workunit.client.0.vm03.stdout:8/651: dread - da/d10/d28/d4f/d68/d80/f58 zero size 2026-03-09T16:14:40.755 INFO:tasks.workunit.client.0.vm03.stdout:7/557: dread d4/da/f5f [0,4194304] 0 2026-03-09T16:14:40.755 INFO:tasks.workunit.client.0.vm03.stdout:5/697: creat d2/d7/d8/d24/fef x:0 0 0 2026-03-09T16:14:40.755 INFO:tasks.workunit.client.0.vm03.stdout:5/698: readlink d2/d7/d8/d24/l51 0 2026-03-09T16:14:40.755 INFO:tasks.workunit.client.0.vm03.stdout:2/628: creat db/d12/d2a/d61/dbe/fe6 x:0 0 0 2026-03-09T16:14:40.756 INFO:tasks.workunit.client.0.vm03.stdout:7/558: dwrite d4/f3b [0,4194304] 0 2026-03-09T16:14:40.759 INFO:tasks.workunit.client.0.vm03.stdout:9/656: rename d2/df/d89/f9b to d2/df/d84/fc9 0 2026-03-09T16:14:40.760 INFO:tasks.workunit.client.0.vm03.stdout:9/657: chown d2/f7 89532 1 2026-03-09T16:14:40.773 INFO:tasks.workunit.client.0.vm03.stdout:3/598: sync 2026-03-09T16:14:40.774 INFO:tasks.workunit.client.0.vm03.stdout:3/599: chown d5/d53/f96 28 1 2026-03-09T16:14:40.778 INFO:tasks.workunit.client.0.vm03.stdout:2/629: dread db/d12/d2a/d61/f54 [0,4194304] 0 2026-03-09T16:14:40.779 INFO:tasks.workunit.client.0.vm03.stdout:9/658: stat d2/d4/d11/d12/d28/c8e 0 2026-03-09T16:14:40.788 INFO:tasks.workunit.client.0.vm03.stdout:9/659: creat d2/d4/d11/d29/d2a/d38/fca x:0 0 0 2026-03-09T16:14:40.789 INFO:tasks.workunit.client.0.vm03.stdout:9/660: fsync d2/df/f42 0 2026-03-09T16:14:40.790 INFO:tasks.workunit.client.0.vm03.stdout:0/619: sync 2026-03-09T16:14:40.791 INFO:tasks.workunit.client.0.vm03.stdout:3/600: truncate d5/d1e/d42/f25 3635122 0 2026-03-09T16:14:40.794 INFO:tasks.workunit.client.0.vm03.stdout:9/661: fsync d2/d54/f69 0 2026-03-09T16:14:40.798 INFO:tasks.workunit.client.0.vm03.stdout:2/630: sync 2026-03-09T16:14:40.801 INFO:tasks.workunit.client.0.vm03.stdout:9/662: creat d2/d4/d11/d12/db2/fcb x:0 0 0 2026-03-09T16:14:40.802 INFO:tasks.workunit.client.0.vm03.stdout:9/663: chown d2/d4/d11/d12/db2/dbf 432 1 2026-03-09T16:14:40.803 INFO:tasks.workunit.client.0.vm03.stdout:9/664: read d2/d4/d11/d12/f50 [275620,118226] 0 2026-03-09T16:14:40.805 INFO:tasks.workunit.client.0.vm03.stdout:1/532: write d4/d31/f79 [738005,52644] 0 2026-03-09T16:14:40.807 INFO:tasks.workunit.client.0.vm03.stdout:6/605: dwrite d9/d42/d45/d50/d80/d8a/d9c/d97/da8/f7d [4194304,4194304] 0 2026-03-09T16:14:40.809 INFO:tasks.workunit.client.0.vm03.stdout:4/643: write d5/d17/d44/f84 [17119,19173] 0 2026-03-09T16:14:40.810 INFO:tasks.workunit.client.0.vm03.stdout:6/606: fdatasync d9/d42/d45/d50/d80/d90/f64 0 2026-03-09T16:14:40.810 INFO:tasks.workunit.client.0.vm03.stdout:5/699: rmdir d2/d7 39 2026-03-09T16:14:40.816 INFO:tasks.workunit.client.0.vm03.stdout:8/652: dwrite da/d32/d79/f90 [0,4194304] 0 2026-03-09T16:14:40.819 INFO:tasks.workunit.client.0.vm03.stdout:8/653: stat da/d1d/d3b/c5a 0 2026-03-09T16:14:40.820 INFO:tasks.workunit.client.0.vm03.stdout:0/620: symlink d0/d7/ld6 0 2026-03-09T16:14:40.828 INFO:tasks.workunit.client.0.vm03.stdout:7/559: dwrite d4/da/d18/d22/d24/d15/f2a [0,4194304] 0 2026-03-09T16:14:40.836 INFO:tasks.workunit.client.0.vm03.stdout:9/665: fsync d2/d4/d11/d12/f68 0 2026-03-09T16:14:40.841 INFO:tasks.workunit.client.0.vm03.stdout:4/644: rmdir d5/d17 39 2026-03-09T16:14:40.856 INFO:tasks.workunit.client.0.vm03.stdout:5/700: write d2/d7/d1a/f6e [2103935,1095] 0 2026-03-09T16:14:40.857 INFO:tasks.workunit.client.0.vm03.stdout:5/701: write d2/d7/de/d11/d19/d31/f7e [3226274,81741] 0 2026-03-09T16:14:40.866 INFO:tasks.workunit.client.0.vm03.stdout:3/601: link d5/d1e/d42/d8b/c97 
d5/d1e/d42/d34/cb0 0 2026-03-09T16:14:40.866 INFO:tasks.workunit.client.0.vm03.stdout:3/602: write d5/d1e/d42/d55/f57 [1334333,112099] 0 2026-03-09T16:14:40.868 INFO:tasks.workunit.client.0.vm03.stdout:8/654: symlink da/d10/d28/d4f/d68/ld4 0 2026-03-09T16:14:40.868 INFO:tasks.workunit.client.0.vm03.stdout:7/560: mkdir d4/da/dbf 0 2026-03-09T16:14:40.871 INFO:tasks.workunit.client.0.vm03.stdout:5/702: dread d2/d7/de/f48 [0,4194304] 0 2026-03-09T16:14:40.872 INFO:tasks.workunit.client.0.vm03.stdout:5/703: write d2/d7/de/d11/d19/d31/fcb [416604,31685] 0 2026-03-09T16:14:40.872 INFO:tasks.workunit.client.0.vm03.stdout:9/666: mkdir d2/d4/d11/d12/dc7/dcc 0 2026-03-09T16:14:40.873 INFO:tasks.workunit.client.0.vm03.stdout:9/667: fdatasync d2/d4/f17 0 2026-03-09T16:14:40.873 INFO:tasks.workunit.client.0.vm03.stdout:9/668: write d2/d54/d7d/d8f/fbb [322585,61579] 0 2026-03-09T16:14:40.874 INFO:tasks.workunit.client.0.vm03.stdout:9/669: chown d2/d4/d11/d12/f45 938803 1 2026-03-09T16:14:40.877 INFO:tasks.workunit.client.0.vm03.stdout:4/645: sync 2026-03-09T16:14:40.912 INFO:tasks.workunit.client.0.vm03.stdout:3/603: mknod d5/d1e/d42/d55/cb1 0 2026-03-09T16:14:40.928 INFO:tasks.workunit.client.0.vm03.stdout:7/561: symlink d4/da/d18/d22/d24/d15/d71/lc0 0 2026-03-09T16:14:40.928 INFO:tasks.workunit.client.0.vm03.stdout:2/631: write db/d12/f85 [2738095,3944] 0 2026-03-09T16:14:40.932 INFO:tasks.workunit.client.0.vm03.stdout:2/632: chown db/d12/d2a/d61/d6d/l7d 27587 1 2026-03-09T16:14:40.933 INFO:tasks.workunit.client.0.vm03.stdout:0/621: dwrite d0/d7/d48/f18 [0,4194304] 0 2026-03-09T16:14:40.936 INFO:tasks.workunit.client.0.vm03.stdout:5/704: creat d2/d7/d8/d16/ff0 x:0 0 0 2026-03-09T16:14:40.946 INFO:tasks.workunit.client.0.vm03.stdout:9/670: truncate d2/d4/d11/d12/f45 761210 0 2026-03-09T16:14:40.947 INFO:tasks.workunit.client.0.vm03.stdout:7/562: dwrite d4/da/d5d/f9b [0,4194304] 0 2026-03-09T16:14:40.948 INFO:tasks.workunit.client.0.vm03.stdout:1/533: link d4/cf d4/d6/d1d/d20/d93/cb6 0 2026-03-09T16:14:40.949 INFO:tasks.workunit.client.0.vm03.stdout:6/607: getdents d9/d42/d45/d50/d80/d8a/d9c/d97 0 2026-03-09T16:14:40.951 INFO:tasks.workunit.client.0.vm03.stdout:7/563: write d4/d2d/f8c [1174590,118870] 0 2026-03-09T16:14:40.958 INFO:tasks.workunit.client.0.vm03.stdout:2/633: read - db/d12/d2a/d61/d79/d83/d64/dbd/f9e zero size 2026-03-09T16:14:40.969 INFO:tasks.workunit.client.0.vm03.stdout:2/634: write db/d12/f77 [3265293,41866] 0 2026-03-09T16:14:40.969 INFO:tasks.workunit.client.0.vm03.stdout:5/705: rename d2/d7/d8/d16/ff0 to d2/d7/d8/d16/d5c/ff1 0 2026-03-09T16:14:40.977 INFO:tasks.workunit.client.0.vm03.stdout:7/564: unlink d4/da/d45/d51/d36/l88 0 2026-03-09T16:14:40.986 INFO:tasks.workunit.client.0.vm03.stdout:3/604: fsync d5/d1e/d42/f25 0 2026-03-09T16:14:41.005 INFO:tasks.workunit.client.0.vm03.stdout:0/622: creat d0/d7/d3e/d57/d5a/d5f/db2/dcf/fd7 x:0 0 0 2026-03-09T16:14:41.006 INFO:tasks.workunit.client.0.vm03.stdout:2/635: mkdir db/d12/d2a/d99/de7 0 2026-03-09T16:14:41.007 INFO:tasks.workunit.client.0.vm03.stdout:5/706: symlink d2/d7/de/d11/dbf/lf2 0 2026-03-09T16:14:41.011 INFO:tasks.workunit.client.0.vm03.stdout:9/671: dwrite d2/d54/d7d/fa4 [0,4194304] 0 2026-03-09T16:14:41.011 INFO:tasks.workunit.client.0.vm03.stdout:1/534: dwrite d4/d39/f5a [4194304,4194304] 0 2026-03-09T16:14:41.017 INFO:tasks.workunit.client.0.vm03.stdout:1/535: dread - d4/d6/d3b/f9b zero size 2026-03-09T16:14:41.017 INFO:tasks.workunit.client.0.vm03.stdout:4/646: dwrite d5/dd/d1f/f4c [0,4194304] 0 2026-03-09T16:14:41.017 
INFO:tasks.workunit.client.0.vm03.stdout:1/536: chown d4/d6/d3b 61043528 1 2026-03-09T16:14:41.019 INFO:tasks.workunit.client.0.vm03.stdout:4/647: fsync d5/dd/d1f/d5f/f7c 0 2026-03-09T16:14:41.019 INFO:tasks.workunit.client.0.vm03.stdout:1/537: dread - d4/d6/d1d/fae zero size 2026-03-09T16:14:41.022 INFO:tasks.workunit.client.0.vm03.stdout:0/623: dwrite d0/d7/f3d [0,4194304] 0 2026-03-09T16:14:41.031 INFO:tasks.workunit.client.0.vm03.stdout:4/648: dwrite d5/dd/f23 [0,4194304] 0 2026-03-09T16:14:41.052 INFO:tasks.workunit.client.0.vm03.stdout:1/538: dread d4/d31/f4f [0,4194304] 0 2026-03-09T16:14:41.078 INFO:tasks.workunit.client.0.vm03.stdout:5/707: truncate d2/d7/de/d11/f26 5575182 0 2026-03-09T16:14:41.089 INFO:tasks.workunit.client.0.vm03.stdout:9/672: mkdir d2/d4/d11/d29/d2a/d38/dcd 0 2026-03-09T16:14:41.126 INFO:tasks.workunit.client.0.vm03.stdout:7/565: fdatasync d4/da/d18/d22/d24/d16/f6c 0 2026-03-09T16:14:41.138 INFO:tasks.workunit.client.0.vm03.stdout:7/566: dwrite d4/d2d/f95 [0,4194304] 0 2026-03-09T16:14:41.151 INFO:tasks.workunit.client.0.vm03.stdout:4/649: dread d5/db/d25/f78 [0,4194304] 0 2026-03-09T16:14:41.151 INFO:tasks.workunit.client.0.vm03.stdout:3/605: rename d5/d1e/d42/d34/cb0 to d5/d1e/d42/d34/d70/cb2 0 2026-03-09T16:14:41.151 INFO:tasks.workunit.client.0.vm03.stdout:8/655: link da/d10/d63/l93 da/d10/ld5 0 2026-03-09T16:14:41.151 INFO:tasks.workunit.client.0.vm03.stdout:5/708: creat d2/d7/d8/d24/ff3 x:0 0 0 2026-03-09T16:14:41.151 INFO:tasks.workunit.client.0.vm03.stdout:0/624: mknod d0/da/d1b/dc8/cd8 0 2026-03-09T16:14:41.151 INFO:tasks.workunit.client.0.vm03.stdout:9/673: readlink d2/d4/d11/d12/l2e 0 2026-03-09T16:14:41.158 INFO:tasks.workunit.client.0.vm03.stdout:8/656: fdatasync da/d6c/d7a/f7f 0 2026-03-09T16:14:41.158 INFO:tasks.workunit.client.0.vm03.stdout:8/657: stat da 0 2026-03-09T16:14:41.158 INFO:tasks.workunit.client.0.vm03.stdout:8/658: truncate da/d10/d28/d4f/d68/fa9 987101 0 2026-03-09T16:14:41.159 INFO:tasks.workunit.client.0.vm03.stdout:8/659: readlink da/db/l98 0 2026-03-09T16:14:41.159 INFO:tasks.workunit.client.0.vm03.stdout:8/660: dread - da/d6c/fae zero size 2026-03-09T16:14:41.161 INFO:tasks.workunit.client.0.vm03.stdout:9/674: dwrite d2/d4/d11/d29/d2a/d38/fb4 [0,4194304] 0 2026-03-09T16:14:41.163 INFO:tasks.workunit.client.0.vm03.stdout:6/608: getdents d9/d14/da5 0 2026-03-09T16:14:41.171 INFO:tasks.workunit.client.0.vm03.stdout:7/567: mknod d4/d2d/d4b/cc1 0 2026-03-09T16:14:41.185 INFO:tasks.workunit.client.0.vm03.stdout:2/636: getdents db/d12/d2a/d61/d6d/d8c 0 2026-03-09T16:14:41.190 INFO:tasks.workunit.client.0.vm03.stdout:0/625: mkdir d0/da/d1b/d9b/dd9 0 2026-03-09T16:14:41.190 INFO:tasks.workunit.client.0.vm03.stdout:5/709: rmdir d2/d7/de/d11/d19/d29/d90 39 2026-03-09T16:14:41.193 INFO:tasks.workunit.client.0.vm03.stdout:5/710: dread - d2/d7/de/d11/d38/d52/fc6 zero size 2026-03-09T16:14:41.194 INFO:tasks.workunit.client.0.vm03.stdout:5/711: truncate d2/d7/de/d11/dbf/fc5 1035731 0 2026-03-09T16:14:41.202 INFO:tasks.workunit.client.0.vm03.stdout:7/568: creat d4/da/d5d/db0/d9d/fc2 x:0 0 0 2026-03-09T16:14:41.206 INFO:tasks.workunit.client.0.vm03.stdout:4/650: mkdir d5/db/d25/d8b/da8/dbe/dc4 0 2026-03-09T16:14:41.207 INFO:tasks.workunit.client.0.vm03.stdout:0/626: fdatasync d0/d7/d48/f4a 0 2026-03-09T16:14:41.208 INFO:tasks.workunit.client.0.vm03.stdout:4/651: fsync d5/db/d25/d31/d33/d79/fa6 0 2026-03-09T16:14:41.212 INFO:tasks.workunit.client.0.vm03.stdout:9/675: sync 2026-03-09T16:14:41.212 INFO:tasks.workunit.client.0.vm03.stdout:2/637: 
sync 2026-03-09T16:14:41.212 INFO:tasks.workunit.client.0.vm03.stdout:5/712: sync 2026-03-09T16:14:41.212 INFO:tasks.workunit.client.0.vm03.stdout:8/661: mknod da/d10/d28/d4f/daf/cd6 0 2026-03-09T16:14:41.212 INFO:tasks.workunit.client.0.vm03.stdout:8/662: chown da/d10/d28/d64/c78 507 1 2026-03-09T16:14:41.213 INFO:tasks.workunit.client.0.vm03.stdout:9/676: truncate d2/df/d84/d8a/fb5 660958 0 2026-03-09T16:14:41.213 INFO:tasks.workunit.client.0.vm03.stdout:9/677: fdatasync d2/d4/f17 0 2026-03-09T16:14:41.237 INFO:tasks.workunit.client.0.vm03.stdout:1/539: getdents d4/d6 0 2026-03-09T16:14:41.237 INFO:tasks.workunit.client.0.vm03.stdout:0/627: creat d0/d7/d3e/d57/d5a/d52/d9f/fda x:0 0 0 2026-03-09T16:14:41.237 INFO:tasks.workunit.client.0.vm03.stdout:4/652: creat d5/db/d25/d8b/da8/dbe/fc5 x:0 0 0 2026-03-09T16:14:41.237 INFO:tasks.workunit.client.0.vm03.stdout:3/606: creat d5/fb3 x:0 0 0 2026-03-09T16:14:41.237 INFO:tasks.workunit.client.0.vm03.stdout:5/713: fsync d2/d7/d1a/f9f 0 2026-03-09T16:14:41.239 INFO:tasks.workunit.client.0.vm03.stdout:3/607: chown d5/c2f 195 1 2026-03-09T16:14:41.239 INFO:tasks.workunit.client.0.vm03.stdout:3/608: readlink d5/d53/d6c/l41 0 2026-03-09T16:14:41.244 INFO:tasks.workunit.client.0.vm03.stdout:7/569: dread d4/da/d45/d51/d36/f6f [0,4194304] 0 2026-03-09T16:14:41.244 INFO:tasks.workunit.client.0.vm03.stdout:6/609: link d9/d22/l19 d9/d42/d45/d50/d80/d90/db7/lbc 0 2026-03-09T16:14:41.245 INFO:tasks.workunit.client.0.vm03.stdout:4/653: creat d5/db/d25/d8b/fc6 x:0 0 0 2026-03-09T16:14:41.245 INFO:tasks.workunit.client.0.vm03.stdout:0/628: creat d0/d7/d3e/d57/fdb x:0 0 0 2026-03-09T16:14:41.249 INFO:tasks.workunit.client.0.vm03.stdout:4/654: chown d5/db/d25/d31/d4d/d5b/d9a 0 1 2026-03-09T16:14:41.249 INFO:tasks.workunit.client.0.vm03.stdout:7/570: stat d4/da/d18/d22/d24/d16/d6e/fa1 0 2026-03-09T16:14:41.249 INFO:tasks.workunit.client.0.vm03.stdout:4/655: write d5/dd/f23 [3764768,76100] 0 2026-03-09T16:14:41.249 INFO:tasks.workunit.client.0.vm03.stdout:3/609: mknod d5/d2e/cb4 0 2026-03-09T16:14:41.250 INFO:tasks.workunit.client.0.vm03.stdout:7/571: chown d4/da/d18/d22/d24/d15/la7 17032028 1 2026-03-09T16:14:41.251 INFO:tasks.workunit.client.0.vm03.stdout:8/663: symlink da/d10/d28/d4f/d85/d9c/ld7 0 2026-03-09T16:14:41.252 INFO:tasks.workunit.client.0.vm03.stdout:2/638: creat db/d12/fe8 x:0 0 0 2026-03-09T16:14:41.252 INFO:tasks.workunit.client.0.vm03.stdout:5/714: rmdir d2/d7/de/d11/d19/d29/d90/db6 39 2026-03-09T16:14:41.252 INFO:tasks.workunit.client.0.vm03.stdout:1/540: truncate d4/d6/d1d/d20/f72 1518299 0 2026-03-09T16:14:41.252 INFO:tasks.workunit.client.0.vm03.stdout:6/610: stat d9/f40 0 2026-03-09T16:14:41.253 INFO:tasks.workunit.client.0.vm03.stdout:0/629: mknod d0/d7/d3e/d57/d5a/d47/dce/cdc 0 2026-03-09T16:14:41.258 INFO:tasks.workunit.client.0.vm03.stdout:3/610: truncate d5/d1e/d42/f99 1001764 0 2026-03-09T16:14:41.261 INFO:tasks.workunit.client.0.vm03.stdout:1/541: fsync d4/d39/f5a 0 2026-03-09T16:14:41.268 INFO:tasks.workunit.client.0.vm03.stdout:9/678: dwrite d2/d4/d11/d29/d2a/d4d/fab [0,4194304] 0 2026-03-09T16:14:41.280 INFO:tasks.workunit.client.0.vm03.stdout:1/542: symlink d4/d6/d1d/d20/d5f/lb7 0 2026-03-09T16:14:41.286 INFO:tasks.workunit.client.0.vm03.stdout:0/630: dwrite d0/da/d7a/d98/f9d [0,4194304] 0 2026-03-09T16:14:41.291 INFO:tasks.workunit.client.0.vm03.stdout:2/639: dwrite db/d12/d2a/d61/d79/d83/f53 [0,4194304] 0 2026-03-09T16:14:41.294 INFO:tasks.workunit.client.0.vm03.stdout:9/679: dwrite d2/d4/f17 [0,4194304] 0 2026-03-09T16:14:41.307 
INFO:tasks.workunit.client.0.vm03.stdout:7/572: dwrite d4/da/d18/f37 [0,4194304] 0 2026-03-09T16:14:41.311 INFO:tasks.workunit.client.0.vm03.stdout:9/680: stat d2/d4/d11/d29/d2a/d38/f74 0 2026-03-09T16:14:41.313 INFO:tasks.workunit.client.0.vm03.stdout:3/611: mknod d5/d53/d6c/cb5 0 2026-03-09T16:14:41.317 INFO:tasks.workunit.client.0.vm03.stdout:6/611: mkdir d9/d42/d45/d50/d80/d8a/d9c/d97/da8/dbd 0 2026-03-09T16:14:41.331 INFO:tasks.workunit.client.0.vm03.stdout:4/656: mkdir d5/db/d25/d9f/dc7 0 2026-03-09T16:14:41.352 INFO:tasks.workunit.client.0.vm03.stdout:9/681: unlink d2/de/d88/f6f 0 2026-03-09T16:14:41.367 INFO:tasks.workunit.client.0.vm03.stdout:8/664: rename da/d10/ca2 to da/d10/d28/d4f/d68/d80/cd8 0 2026-03-09T16:14:41.367 INFO:tasks.workunit.client.0.vm03.stdout:1/543: creat d4/d6/d3b/d6b/d25/fb8 x:0 0 0 2026-03-09T16:14:41.367 INFO:tasks.workunit.client.0.vm03.stdout:6/612: sync 2026-03-09T16:14:41.369 INFO:tasks.workunit.client.0.vm03.stdout:1/544: dread - d4/db/d59/f9d zero size 2026-03-09T16:14:41.369 INFO:tasks.workunit.client.0.vm03.stdout:9/682: sync 2026-03-09T16:14:41.372 INFO:tasks.workunit.client.0.vm03.stdout:7/573: rmdir d4/da/d5d/db0/d9d/db1 0 2026-03-09T16:14:41.372 INFO:tasks.workunit.client.0.vm03.stdout:3/612: dread d5/d6d/f7a [0,4194304] 0 2026-03-09T16:14:41.373 INFO:tasks.workunit.client.0.vm03.stdout:3/613: fsync d5/d6d/d6a/fa9 0 2026-03-09T16:14:41.373 INFO:tasks.workunit.client.0.vm03.stdout:3/614: readlink d5/d53/d6c/l41 0 2026-03-09T16:14:41.378 INFO:tasks.workunit.client.0.vm03.stdout:3/615: chown d5/d53/d6c/d79 31555 1 2026-03-09T16:14:41.379 INFO:tasks.workunit.client.0.vm03.stdout:3/616: write d5/fb [6367304,48796] 0 2026-03-09T16:14:41.386 INFO:tasks.workunit.client.0.vm03.stdout:3/617: dwrite d5/fb3 [0,4194304] 0 2026-03-09T16:14:41.419 INFO:tasks.workunit.client.0.vm03.stdout:6/613: dread d9/f20 [4194304,4194304] 0 2026-03-09T16:14:41.426 INFO:tasks.workunit.client.0.vm03.stdout:0/631: write d0/d7/d3e/d57/fa8 [1031125,92639] 0 2026-03-09T16:14:41.434 INFO:tasks.workunit.client.0.vm03.stdout:2/640: write db/d12/d2a/d61/f47 [4734314,96728] 0 2026-03-09T16:14:41.468 INFO:tasks.workunit.client.0.vm03.stdout:3/618: mkdir d5/d2e/db6 0 2026-03-09T16:14:41.474 INFO:tasks.workunit.client.0.vm03.stdout:3/619: dwrite d5/d1e/d42/f84 [0,4194304] 0 2026-03-09T16:14:41.475 INFO:tasks.workunit.client.0.vm03.stdout:6/614: rmdir d9/d14/d71 39 2026-03-09T16:14:41.478 INFO:tasks.workunit.client.0.vm03.stdout:6/615: write d9/d42/d45/d50/d80/d90/f64 [1629560,76652] 0 2026-03-09T16:14:41.479 INFO:tasks.workunit.client.0.vm03.stdout:3/620: write d5/d6d/f7a [3378400,7872] 0 2026-03-09T16:14:41.480 INFO:tasks.workunit.client.0.vm03.stdout:0/632: rmdir d0/da/d5c 39 2026-03-09T16:14:41.480 INFO:tasks.workunit.client.0.vm03.stdout:6/616: chown d9/d22/f4e 771926 1 2026-03-09T16:14:41.482 INFO:tasks.workunit.client.0.vm03.stdout:0/633: write d0/d7/d3e/d57/fa8 [478459,3713] 0 2026-03-09T16:14:41.494 INFO:tasks.workunit.client.0.vm03.stdout:8/665: creat da/d32/dad/fd9 x:0 0 0 2026-03-09T16:14:41.506 INFO:tasks.workunit.client.0.vm03.stdout:8/666: dwrite da/d10/d28/fb0 [0,4194304] 0 2026-03-09T16:14:41.507 INFO:tasks.workunit.client.0.vm03.stdout:1/545: dread d4/d6/d3b/d6b/d25/f87 [0,4194304] 0 2026-03-09T16:14:41.517 INFO:tasks.workunit.client.0.vm03.stdout:7/574: symlink d4/da/d18/lc3 0 2026-03-09T16:14:41.521 INFO:tasks.workunit.client.0.vm03.stdout:0/634: rmdir d0/d7/d3e/d57/d5a/d82/d89/dc0 39 2026-03-09T16:14:41.521 INFO:tasks.workunit.client.0.vm03.stdout:3/621: read 
d5/d1e/d42/f1d [2843263,120885] 0 2026-03-09T16:14:41.526 INFO:tasks.workunit.client.0.vm03.stdout:6/617: dread d9/d42/d45/f4d [0,4194304] 0 2026-03-09T16:14:41.535 INFO:tasks.workunit.client.0.vm03.stdout:9/683: dwrite d2/d4/d11/f41 [0,4194304] 0 2026-03-09T16:14:41.537 INFO:tasks.workunit.client.0.vm03.stdout:1/546: mknod d4/d6/cb9 0 2026-03-09T16:14:41.537 INFO:tasks.workunit.client.0.vm03.stdout:5/715: rename d2/d7/d8/d16/d5c/dcf/fda to d2/d7/de/d11/ff4 0 2026-03-09T16:14:41.537 INFO:tasks.workunit.client.0.vm03.stdout:4/657: dwrite d5/d17/f21 [0,4194304] 0 2026-03-09T16:14:41.540 INFO:tasks.workunit.client.0.vm03.stdout:2/641: creat db/d12/d2a/d61/d79/d83/fe9 x:0 0 0 2026-03-09T16:14:41.541 INFO:tasks.workunit.client.0.vm03.stdout:7/575: readlink d4/da/d18/d22/d24/d16/l46 0 2026-03-09T16:14:41.541 INFO:tasks.workunit.client.0.vm03.stdout:4/658: write d5/db/d25/d31/d33/f92 [1043228,68851] 0 2026-03-09T16:14:41.541 INFO:tasks.workunit.client.0.vm03.stdout:4/659: chown d5 187411 1 2026-03-09T16:14:41.546 INFO:tasks.workunit.client.0.vm03.stdout:5/716: stat d2/ca6 0 2026-03-09T16:14:41.546 INFO:tasks.workunit.client.0.vm03.stdout:2/642: chown db/d12/d2a/d61/d79/d83/cae 148 1 2026-03-09T16:14:41.550 INFO:tasks.workunit.client.0.vm03.stdout:5/717: chown d2/d7/d1a/d1c/d6c/cad 2374794 1 2026-03-09T16:14:41.554 INFO:tasks.workunit.client.0.vm03.stdout:5/718: write d2/d7/de/d11/d19/d31/f7e [2690929,119082] 0 2026-03-09T16:14:41.554 INFO:tasks.workunit.client.0.vm03.stdout:7/576: dread d4/da/d18/d22/d24/d16/d6e/f73 [0,4194304] 0 2026-03-09T16:14:41.566 INFO:tasks.workunit.client.0.vm03.stdout:1/547: dwrite d4/d31/f4f [0,4194304] 0 2026-03-09T16:14:41.572 INFO:tasks.workunit.client.0.vm03.stdout:4/660: mkdir d5/db/d25/dc8 0 2026-03-09T16:14:41.580 INFO:tasks.workunit.client.0.vm03.stdout:2/643: dread db/d12/f69 [0,4194304] 0 2026-03-09T16:14:41.582 INFO:tasks.workunit.client.0.vm03.stdout:9/684: dread d2/df/f10 [4194304,4194304] 0 2026-03-09T16:14:41.584 INFO:tasks.workunit.client.0.vm03.stdout:3/622: symlink d5/d53/lb7 0 2026-03-09T16:14:41.584 INFO:tasks.workunit.client.0.vm03.stdout:9/685: stat d2/d4/d11/d29/d2a/db3 0 2026-03-09T16:14:41.596 INFO:tasks.workunit.client.0.vm03.stdout:1/548: dwrite d4/fa [0,4194304] 0 2026-03-09T16:14:41.598 INFO:tasks.workunit.client.0.vm03.stdout:7/577: read d4/d2d/d4b/f9f [663162,91621] 0 2026-03-09T16:14:41.607 INFO:tasks.workunit.client.0.vm03.stdout:2/644: mknod db/d12/d2a/d61/d6d/d8c/d94/da4/cea 0 2026-03-09T16:14:41.607 INFO:tasks.workunit.client.0.vm03.stdout:2/645: stat db/d12/d2a/d61/d79/d83/d64/f80 0 2026-03-09T16:14:41.613 INFO:tasks.workunit.client.0.vm03.stdout:3/623: creat d5/d44/d61/fb8 x:0 0 0 2026-03-09T16:14:41.615 INFO:tasks.workunit.client.0.vm03.stdout:5/719: truncate d2/d7/de/d11/f26 4052671 0 2026-03-09T16:14:41.625 INFO:tasks.workunit.client.0.vm03.stdout:2/646: dread db/d12/d2a/d61/d79/d83/d52/fd0 [0,4194304] 0 2026-03-09T16:14:41.627 INFO:tasks.workunit.client.0.vm03.stdout:8/667: rename da/d10/d63/l93 to da/lda 0 2026-03-09T16:14:41.628 INFO:tasks.workunit.client.0.vm03.stdout:4/661: dread d5/d17/d44/fa4 [0,4194304] 0 2026-03-09T16:14:41.629 INFO:tasks.workunit.client.0.vm03.stdout:9/686: sync 2026-03-09T16:14:41.654 INFO:tasks.workunit.client.0.vm03.stdout:6/618: getdents d9 0 2026-03-09T16:14:41.654 INFO:tasks.workunit.client.0.vm03.stdout:6/619: dread d9/d42/d45/d50/d80/d8a/d9c/f93 [0,4194304] 0 2026-03-09T16:14:41.667 INFO:tasks.workunit.client.0.vm03.stdout:0/635: write d0/d7/d75/f59 [3434724,84799] 0 2026-03-09T16:14:41.677 
INFO:tasks.workunit.client.0.vm03.stdout:5/720: mkdir d2/d7/de/d11/d19/d29/d90/dbe/df5 0 2026-03-09T16:14:41.690 INFO:tasks.workunit.client.0.vm03.stdout:8/668: unlink da/d1d/d3b/c5a 0 2026-03-09T16:14:41.693 INFO:tasks.workunit.client.0.vm03.stdout:8/669: write da/d10/d28/d4f/d68/f8f [2564975,3978] 0 2026-03-09T16:14:41.706 INFO:tasks.workunit.client.0.vm03.stdout:9/687: truncate d2/df/f10 8078882 0 2026-03-09T16:14:41.719 INFO:tasks.workunit.client.0.vm03.stdout:6/620: mkdir d9/d42/d45/d50/d80/d8a/d9c/d97/dbe 0 2026-03-09T16:14:41.725 INFO:tasks.workunit.client.0.vm03.stdout:6/621: dwrite d9/d42/d45/d50/d80/d8a/d9c/d97/da8/f81 [0,4194304] 0 2026-03-09T16:14:41.729 INFO:tasks.workunit.client.0.vm03.stdout:6/622: readlink d9/d22/l19 0 2026-03-09T16:14:41.776 INFO:tasks.workunit.client.0.vm03.stdout:4/662: truncate d5/db/d25/f26 1036262 0 2026-03-09T16:14:41.779 INFO:tasks.workunit.client.0.vm03.stdout:2/647: dwrite fa [0,4194304] 0 2026-03-09T16:14:41.780 INFO:tasks.workunit.client.0.vm03.stdout:7/578: link d4/da/d18/d22/d24/d16/d6e/d7e/cab d4/da/d5d/db0/d9d/cc4 0 2026-03-09T16:14:41.781 INFO:tasks.workunit.client.0.vm03.stdout:5/721: dwrite d2/d7/d8/f7a [0,4194304] 0 2026-03-09T16:14:41.784 INFO:tasks.workunit.client.0.vm03.stdout:2/648: dread db/d12/d2a/f60 [0,4194304] 0 2026-03-09T16:14:41.798 INFO:tasks.workunit.client.0.vm03.stdout:9/688: dread d2/f8 [0,4194304] 0 2026-03-09T16:14:41.798 INFO:tasks.workunit.client.0.vm03.stdout:6/623: write d9/f73 [518365,118419] 0 2026-03-09T16:14:41.799 INFO:tasks.workunit.client.0.vm03.stdout:6/624: stat d9/d42/d45/d50 0 2026-03-09T16:14:41.800 INFO:tasks.workunit.client.0.vm03.stdout:9/689: write d2/d4/d11/d12/db2/fcb [757285,53730] 0 2026-03-09T16:14:41.810 INFO:tasks.workunit.client.0.vm03.stdout:3/624: rmdir d5/d2e/d8c 0 2026-03-09T16:14:41.815 INFO:tasks.workunit.client.0.vm03.stdout:8/670: mknod da/db/d30/dc7/cdb 0 2026-03-09T16:14:41.851 INFO:tasks.workunit.client.0.vm03.stdout:7/579: write d4/da/f5f [5214954,43205] 0 2026-03-09T16:14:41.860 INFO:tasks.workunit.client.0.vm03.stdout:7/580: fsync d4/da/f5f 0 2026-03-09T16:14:41.861 INFO:tasks.workunit.client.0.vm03.stdout:7/581: chown d4/da/d18/d22/d24/laa 25616 1 2026-03-09T16:14:41.864 INFO:tasks.workunit.client.0.vm03.stdout:7/582: dread d4/da/d18/d22/f48 [0,4194304] 0 2026-03-09T16:14:41.864 INFO:tasks.workunit.client.0.vm03.stdout:7/583: stat d4/da/d18/l96 0 2026-03-09T16:14:41.873 INFO:tasks.workunit.client.0.vm03.stdout:2/649: symlink db/d12/d2a/d61/d6d/d8c/d94/da4/leb 0 2026-03-09T16:14:41.892 INFO:tasks.workunit.client.0.vm03.stdout:9/690: mkdir d2/d4/d11/d12/db2/dce 0 2026-03-09T16:14:41.897 INFO:tasks.workunit.client.0.vm03.stdout:1/549: mknod d4/d6/d3b/d6b/d25/cba 0 2026-03-09T16:14:41.898 INFO:tasks.workunit.client.0.vm03.stdout:1/550: fdatasync d4/fa 0 2026-03-09T16:14:41.910 INFO:tasks.workunit.client.0.vm03.stdout:9/691: sync 2026-03-09T16:14:41.916 INFO:tasks.workunit.client.0.vm03.stdout:0/636: rmdir d0/da/d1b/d9b/dd9 0 2026-03-09T16:14:41.933 INFO:tasks.workunit.client.0.vm03.stdout:4/663: creat d5/db/d25/d9f/dc7/fc9 x:0 0 0 2026-03-09T16:14:41.942 INFO:tasks.workunit.client.0.vm03.stdout:3/625: dwrite d5/d1e/d42/f25 [0,4194304] 0 2026-03-09T16:14:41.944 INFO:tasks.workunit.client.0.vm03.stdout:8/671: dwrite da/db/d30/f94 [0,4194304] 0 2026-03-09T16:14:41.959 INFO:tasks.workunit.client.0.vm03.stdout:7/584: symlink d4/da/d45/d51/lc5 0 2026-03-09T16:14:41.973 INFO:tasks.workunit.client.0.vm03.stdout:2/650: mkdir db/d12/d2a/d61/d79/d83/d64/dbd/dec 0 2026-03-09T16:14:41.977 
INFO:tasks.workunit.client.0.vm03.stdout:2/651: dwrite db/d12/f85 [0,4194304] 0 2026-03-09T16:14:41.996 INFO:tasks.workunit.client.0.vm03.stdout:9/692: rmdir d2/d4/d11/d12 39 2026-03-09T16:14:41.996 INFO:tasks.workunit.client.0.vm03.stdout:0/637: symlink d0/d7/d3e/d57/d5a/d5f/db2/dcf/ldd 0 2026-03-09T16:14:41.997 INFO:tasks.workunit.client.0.vm03.stdout:8/672: chown da/d1d/c51 6074590 1 2026-03-09T16:14:42.000 INFO:tasks.workunit.client.0.vm03.stdout:8/673: dread da/db/d30/f94 [0,4194304] 0 2026-03-09T16:14:42.001 INFO:tasks.workunit.client.0.vm03.stdout:7/585: symlink d4/da/d18/d22/d24/d16/d6e/d7e/lc6 0 2026-03-09T16:14:42.009 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:41 vm03.local ceph-mon[51019]: pgmap v13: 65 pgs: 65 active+clean; 1.3 GiB data, 4.4 GiB used, 116 GiB / 120 GiB avail; 27 MiB/s rd, 96 MiB/s wr, 218 op/s 2026-03-09T16:14:42.009 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:41 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:42.009 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:41 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:42.018 INFO:tasks.workunit.client.0.vm03.stdout:9/693: fdatasync d2/df/d89/f7e 0 2026-03-09T16:14:42.018 INFO:tasks.workunit.client.0.vm03.stdout:9/694: read d2/de/f87 [873662,24259] 0 2026-03-09T16:14:42.026 INFO:tasks.workunit.client.0.vm03.stdout:5/722: link d2/d7/de/d11/d38/d52/l98 d2/d7/de/d11/d19/d29/lf6 0 2026-03-09T16:14:42.026 INFO:tasks.workunit.client.0.vm03.stdout:5/723: readlink d2/d7/de/d11/d19/l65 0 2026-03-09T16:14:42.026 INFO:tasks.workunit.client.0.vm03.stdout:9/695: dwrite d2/d4/d11/d29/d2a/d38/fca [0,4194304] 0 2026-03-09T16:14:42.036 INFO:tasks.workunit.client.0.vm03.stdout:1/551: creat d4/d6/da2/fbb x:0 0 0 2026-03-09T16:14:42.047 INFO:tasks.workunit.client.0.vm03.stdout:3/626: dread d5/d1e/d42/f20 [4194304,4194304] 0 2026-03-09T16:14:42.048 INFO:tasks.workunit.client.0.vm03.stdout:3/627: stat d5/d1e/d42/d8b 0 2026-03-09T16:14:42.052 INFO:tasks.workunit.client.0.vm03.stdout:3/628: dwrite d5/d53/f96 [0,4194304] 0 2026-03-09T16:14:42.053 INFO:tasks.workunit.client.0.vm03.stdout:6/625: truncate d9/d22/f83 2624064 0 2026-03-09T16:14:42.065 INFO:tasks.workunit.client.0.vm03.stdout:4/664: write d5/d17/d44/f4a [214412,75255] 0 2026-03-09T16:14:42.065 INFO:tasks.workunit.client.0.vm03.stdout:4/665: readlink d5/d17/d44/l50 0 2026-03-09T16:14:42.071 INFO:tasks.workunit.client.0.vm03.stdout:0/638: creat d0/fde x:0 0 0 2026-03-09T16:14:42.072 INFO:tasks.workunit.client.0.vm03.stdout:0/639: chown d0/d7/d3e/d57/d5a/d82/d89/dbd/fc7 43890 1 2026-03-09T16:14:42.074 INFO:tasks.workunit.client.0.vm03.stdout:8/674: rename da/d10/d63 to da/d10/d28/d4f/d68/ddc 0 2026-03-09T16:14:42.088 INFO:tasks.workunit.client.0.vm03.stdout:7/586: write d4/da/d18/d22/d24/d16/f6c [86611,64721] 0 2026-03-09T16:14:42.093 INFO:tasks.workunit.client.0.vm03.stdout:2/652: getdents db/d12/d2a/d61/d6d 0 2026-03-09T16:14:42.098 INFO:tasks.workunit.client.0.vm03.stdout:7/587: dwrite d4/da/d45/fa4 [0,4194304] 0 2026-03-09T16:14:42.112 INFO:tasks.workunit.client.0.vm03.stdout:0/640: truncate d0/d7/d3e/d57/f90 2771504 0 2026-03-09T16:14:42.126 INFO:tasks.workunit.client.0.vm03.stdout:3/629: mkdir d5/d6d/db9 0 2026-03-09T16:14:42.127 INFO:tasks.workunit.client.0.vm03.stdout:5/724: write d2/d7/d3c/f9a [487461,78546] 0 2026-03-09T16:14:42.134 INFO:tasks.workunit.client.0.vm03.stdout:9/696: creat d2/d54/fcf x:0 0 0 2026-03-09T16:14:42.138 
INFO:tasks.workunit.client.0.vm03.stdout:6/626: getdents d9/d42/d45/d65/dae 0 2026-03-09T16:14:42.178 INFO:tasks.workunit.client.0.vm03.stdout:7/588: write d4/d2d/f52 [207331,3126] 0 2026-03-09T16:14:42.179 INFO:tasks.workunit.client.0.vm03.stdout:5/725: dread - d2/d7/de/d11/d19/d31/d35/fd3 zero size 2026-03-09T16:14:42.182 INFO:tasks.workunit.client.0.vm03.stdout:1/552: dwrite d4/d6/d1d/d20/f2a [0,4194304] 0 2026-03-09T16:14:42.193 INFO:tasks.workunit.client.0.vm03.stdout:6/627: mkdir d9/d42/d45/d65/dbf 0 2026-03-09T16:14:42.194 INFO:tasks.workunit.client.0.vm03.stdout:2/653: mknod db/ced 0 2026-03-09T16:14:42.201 INFO:tasks.workunit.client.0.vm03.stdout:0/641: symlink d0/d7/ldf 0 2026-03-09T16:14:42.203 INFO:tasks.workunit.client.0.vm03.stdout:8/675: link da/db/d30/dc7/cdb da/d32/d79/d95/dbd/cdd 0 2026-03-09T16:14:42.206 INFO:tasks.workunit.client.0.vm03.stdout:7/589: dwrite d4/d2d/d4b/f9f [0,4194304] 0 2026-03-09T16:14:42.208 INFO:tasks.workunit.client.0.vm03.stdout:7/590: fdatasync d4/d2d/d4b/f6b 0 2026-03-09T16:14:42.214 INFO:tasks.workunit.client.0.vm03.stdout:7/591: dwrite d4/da/d5d/db0/fb2 [0,4194304] 0 2026-03-09T16:14:42.245 INFO:tasks.workunit.client.0.vm03.stdout:3/630: creat d5/d2e/db6/fba x:0 0 0 2026-03-09T16:14:42.247 INFO:tasks.workunit.client.0.vm03.stdout:5/726: creat d2/d7/d1a/d1c/d6c/ff7 x:0 0 0 2026-03-09T16:14:42.259 INFO:tasks.workunit.client.0.vm03.stdout:5/727: dwrite d2/d7/d3c/f9a [0,4194304] 0 2026-03-09T16:14:42.266 INFO:tasks.workunit.client.0.vm03.stdout:6/628: mknod d9/d42/d45/d65/cc0 0 2026-03-09T16:14:42.266 INFO:tasks.workunit.client.0.vm03.stdout:4/666: getdents d5/db/d25/d31/d4d/d5b 0 2026-03-09T16:14:42.267 INFO:tasks.workunit.client.0.vm03.stdout:4/667: chown d5/dd/d1f/l27 3 1 2026-03-09T16:14:42.269 INFO:tasks.workunit.client.0.vm03.stdout:0/642: mkdir d0/da/d1b/de0 0 2026-03-09T16:14:42.271 INFO:tasks.workunit.client.0.vm03.stdout:0/643: dread d0/d7/d48/f18 [0,4194304] 0 2026-03-09T16:14:42.271 INFO:tasks.workunit.client.0.vm03.stdout:8/676: creat da/d10/d28/fde x:0 0 0 2026-03-09T16:14:42.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:41 vm05.local ceph-mon[58702]: pgmap v13: 65 pgs: 65 active+clean; 1.3 GiB data, 4.4 GiB used, 116 GiB / 120 GiB avail; 27 MiB/s rd, 96 MiB/s wr, 218 op/s 2026-03-09T16:14:42.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:41 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:42.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:41 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:42.279 INFO:tasks.workunit.client.0.vm03.stdout:7/592: unlink d4/da/d18/d22/d24/d15/f2a 0 2026-03-09T16:14:42.280 INFO:tasks.workunit.client.0.vm03.stdout:7/593: write d4/da/d5d/f9b [770193,39805] 0 2026-03-09T16:14:42.281 INFO:tasks.workunit.client.0.vm03.stdout:3/631: symlink d5/d44/d61/lbb 0 2026-03-09T16:14:42.286 INFO:tasks.workunit.client.0.vm03.stdout:3/632: dread d5/d1e/d42/d55/f57 [0,4194304] 0 2026-03-09T16:14:42.291 INFO:tasks.workunit.client.0.vm03.stdout:3/633: dwrite d5/d1e/d42/f99 [0,4194304] 0 2026-03-09T16:14:42.297 INFO:tasks.workunit.client.0.vm03.stdout:9/697: creat d2/d4/d11/d12/dc7/dcc/fd0 x:0 0 0 2026-03-09T16:14:42.317 INFO:tasks.workunit.client.0.vm03.stdout:3/634: dread d5/d1e/d42/f74 [4194304,4194304] 0 2026-03-09T16:14:42.324 INFO:tasks.workunit.client.0.vm03.stdout:6/629: mkdir d9/d42/d45/d50/d80/d8a/dc1 0 2026-03-09T16:14:42.334 INFO:tasks.workunit.client.0.vm03.stdout:8/677: mknod da/db/d43/cdf 0 
2026-03-09T16:14:42.342 INFO:tasks.workunit.client.0.vm03.stdout:9/698: unlink d2/d54/fcf 0 2026-03-09T16:14:42.344 INFO:tasks.workunit.client.0.vm03.stdout:1/553: creat d4/fbc x:0 0 0 2026-03-09T16:14:42.344 INFO:tasks.workunit.client.0.vm03.stdout:1/554: dread - d4/d6/d3b/f98 zero size 2026-03-09T16:14:42.344 INFO:tasks.workunit.client.0.vm03.stdout:5/728: unlink d2/d7/de/d11/d19/d31/d35/ld7 0 2026-03-09T16:14:42.351 INFO:tasks.workunit.client.0.vm03.stdout:2/654: truncate db/d12/f62 3507300 0 2026-03-09T16:14:42.354 INFO:tasks.workunit.client.0.vm03.stdout:7/594: dwrite d4/da/d18/d22/d24/f41 [0,4194304] 0 2026-03-09T16:14:42.371 INFO:tasks.workunit.client.0.vm03.stdout:8/678: creat da/d45/fe0 x:0 0 0 2026-03-09T16:14:42.372 INFO:tasks.workunit.client.0.vm03.stdout:8/679: stat da/d32/f66 0 2026-03-09T16:14:42.373 INFO:tasks.workunit.client.0.vm03.stdout:8/680: truncate da/d10/d28/d4f/d85/fa1 1513751 0 2026-03-09T16:14:42.377 INFO:tasks.workunit.client.0.vm03.stdout:9/699: creat d2/d4/d11/d12/db2/fd1 x:0 0 0 2026-03-09T16:14:42.396 INFO:tasks.workunit.client.0.vm03.stdout:0/644: dwrite d0/da/d5c/f66 [0,4194304] 0 2026-03-09T16:14:42.413 INFO:tasks.workunit.client.0.vm03.stdout:1/555: rename d4/d31/f79 to d4/d7b/fbd 0 2026-03-09T16:14:42.440 INFO:tasks.workunit.client.0.vm03.stdout:7/595: symlink d4/d2d/lc7 0 2026-03-09T16:14:42.443 INFO:tasks.workunit.client.0.vm03.stdout:6/630: creat d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/fc2 x:0 0 0 2026-03-09T16:14:42.539 INFO:tasks.workunit.client.0.vm03.stdout:1/556: fdatasync d4/d6/d3b/d63/f7e 0 2026-03-09T16:14:42.539 INFO:tasks.workunit.client.0.vm03.stdout:1/557: dread - d4/db/d59/f9d zero size 2026-03-09T16:14:42.540 INFO:tasks.workunit.client.0.vm03.stdout:1/558: chown d4/d31/d5c/faf 2421685 1 2026-03-09T16:14:42.541 INFO:tasks.workunit.client.0.vm03.stdout:3/635: link d5/d1e/d42/d34/cac d5/d44/d61/cbc 0 2026-03-09T16:14:42.543 INFO:tasks.workunit.client.0.vm03.stdout:4/668: getdents d5/db/d25/d31/d4d/da9 0 2026-03-09T16:14:42.545 INFO:tasks.workunit.client.0.vm03.stdout:2/655: mknod db/d12/d2a/d61/d79/d83/cee 0 2026-03-09T16:14:42.547 INFO:tasks.workunit.client.0.vm03.stdout:7/596: creat d4/da/d45/d51/d36/fc8 x:0 0 0 2026-03-09T16:14:42.550 INFO:tasks.workunit.client.0.vm03.stdout:6/631: mknod d9/d42/d45/d50/d80/d8a/d9c/d97/da8/cc3 0 2026-03-09T16:14:42.551 INFO:tasks.workunit.client.0.vm03.stdout:6/632: write d9/d42/d45/d50/d80/d8a/d9c/f8c [1927309,74937] 0 2026-03-09T16:14:42.558 INFO:tasks.workunit.client.0.vm03.stdout:8/681: mknod da/d32/d79/d95/dbd/ce1 0 2026-03-09T16:14:42.558 INFO:tasks.workunit.client.0.vm03.stdout:8/682: chown da/d32/fa6 18382 1 2026-03-09T16:14:42.565 INFO:tasks.workunit.client.0.vm03.stdout:1/559: mkdir d4/d6/d3b/dbe 0 2026-03-09T16:14:42.571 INFO:tasks.workunit.client.0.vm03.stdout:2/656: unlink db/d12/d2a/d61/d79/fc1 0 2026-03-09T16:14:42.574 INFO:tasks.workunit.client.0.vm03.stdout:8/683: stat da/db/da8 0 2026-03-09T16:14:42.578 INFO:tasks.workunit.client.0.vm03.stdout:0/645: link d0/d7/d3e/d57/d5a/d5f/db2/d8e/fcc d0/d7/d3e/d57/d5a/d82/d89/dc0/fe1 0 2026-03-09T16:14:42.578 INFO:tasks.workunit.client.0.vm03.stdout:0/646: write d0/d7/d75/f59 [3493191,104879] 0 2026-03-09T16:14:42.585 INFO:tasks.workunit.client.0.vm03.stdout:7/597: sync 2026-03-09T16:14:42.585 INFO:tasks.workunit.client.0.vm03.stdout:6/633: sync 2026-03-09T16:14:42.591 INFO:tasks.workunit.client.0.vm03.stdout:7/598: dread d4/da/d18/d22/d24/d15/f34 [0,4194304] 0 2026-03-09T16:14:42.606 INFO:tasks.workunit.client.0.vm03.stdout:8/684: dread da/d10/d28/f29 
[0,4194304] 0 2026-03-09T16:14:42.631 INFO:tasks.workunit.client.0.vm03.stdout:9/700: dwrite d2/d4/d11/d29/d2a/d38/f72 [0,4194304] 0 2026-03-09T16:14:42.644 INFO:tasks.workunit.client.0.vm03.stdout:0/647: dread d0/d7/d48/f4a [0,4194304] 0 2026-03-09T16:14:42.647 INFO:tasks.workunit.client.0.vm03.stdout:5/729: getdents d2/d7/de/d11/d19/d29/d90/db6 0 2026-03-09T16:14:42.662 INFO:tasks.workunit.client.0.vm03.stdout:0/648: sync 2026-03-09T16:14:42.666 INFO:tasks.workunit.client.0.vm03.stdout:4/669: creat d5/db/d25/d8b/da8/fca x:0 0 0 2026-03-09T16:14:42.673 INFO:tasks.workunit.client.0.vm03.stdout:3/636: write d5/d1e/fa4 [4526682,103075] 0 2026-03-09T16:14:42.675 INFO:tasks.workunit.client.0.vm03.stdout:6/634: mknod d9/d42/d45/d50/d80/d8a/d9c/d97/da8/cc4 0 2026-03-09T16:14:42.678 INFO:tasks.workunit.client.0.vm03.stdout:2/657: rename db/c26 to db/d12/d2a/d61/d79/d83/d64/dbd/da0/cef 0 2026-03-09T16:14:42.679 INFO:tasks.workunit.client.0.vm03.stdout:2/658: write db/d12/fe8 [907554,63745] 0 2026-03-09T16:14:42.690 INFO:tasks.workunit.client.0.vm03.stdout:9/701: mknod d2/d54/d7d/d8f/cd2 0 2026-03-09T16:14:42.690 INFO:tasks.workunit.client.0.vm03.stdout:9/702: readlink d2/df/l24 0 2026-03-09T16:14:42.691 INFO:tasks.workunit.client.0.vm03.stdout:9/703: chown d2/d4/d11/d29/d92/f6a 442718448 1 2026-03-09T16:14:42.700 INFO:tasks.workunit.client.0.vm03.stdout:4/670: dread d5/f74 [0,4194304] 0 2026-03-09T16:14:42.712 INFO:tasks.workunit.client.0.vm03.stdout:8/685: dwrite da/d10/d28/d64/fab [0,4194304] 0 2026-03-09T16:14:42.721 INFO:tasks.workunit.client.0.vm03.stdout:5/730: dwrite d2/d7/d1a/d1c/d3f/f92 [0,4194304] 0 2026-03-09T16:14:42.732 INFO:tasks.workunit.client.0.vm03.stdout:3/637: mkdir d5/d6d/d6a/dbd 0 2026-03-09T16:14:42.733 INFO:tasks.workunit.client.0.vm03.stdout:3/638: stat d5/d6d/d5a/d63/l77 0 2026-03-09T16:14:42.734 INFO:tasks.workunit.client.0.vm03.stdout:3/639: dread d5/fb3 [0,4194304] 0 2026-03-09T16:14:42.745 INFO:tasks.workunit.client.0.vm03.stdout:6/635: rmdir d9/d8e 39 2026-03-09T16:14:42.748 INFO:tasks.workunit.client.0.vm03.stdout:0/649: rename d0/d7/d3e/d57/d5a/d5f/db2/f77 to d0/d7/d3e/d57/d5a/d82/d89/dbd/d9c/fe2 0 2026-03-09T16:14:42.752 INFO:tasks.workunit.client.0.vm03.stdout:2/659: unlink db/d12/d2a/d61/d79/f9a 0 2026-03-09T16:14:42.762 INFO:tasks.workunit.client.0.vm03.stdout:7/599: truncate d4/da/d18/d22/d24/d15/f34 430264 0 2026-03-09T16:14:42.763 INFO:tasks.workunit.client.0.vm03.stdout:7/600: write d4/d2d/f95 [2247925,12828] 0 2026-03-09T16:14:42.764 INFO:tasks.workunit.client.0.vm03.stdout:7/601: write d4/d2d/f95 [3149675,60651] 0 2026-03-09T16:14:42.768 INFO:tasks.workunit.client.0.vm03.stdout:9/704: dread d2/d54/d7d/d8f/dad/fae [0,4194304] 0 2026-03-09T16:14:42.769 INFO:tasks.workunit.client.0.vm03.stdout:1/560: getdents d4/d6/d1d/d3d 0 2026-03-09T16:14:42.773 INFO:tasks.workunit.client.0.vm03.stdout:4/671: mkdir d5/db/d25/d31/d4d/d5b/d72/dcb 0 2026-03-09T16:14:42.787 INFO:tasks.workunit.client.0.vm03.stdout:8/686: unlink da/d10/d28/fde 0 2026-03-09T16:14:42.798 INFO:tasks.workunit.client.0.vm03.stdout:5/731: mknod d2/d7/d8/d24/d27/d43/cf8 0 2026-03-09T16:14:42.820 INFO:tasks.workunit.client.0.vm03.stdout:0/650: truncate d0/d7/d48/f13 2391190 0 2026-03-09T16:14:42.828 INFO:tasks.workunit.client.0.vm03.stdout:2/660: dread db/f2d [0,4194304] 0 2026-03-09T16:14:42.831 INFO:tasks.workunit.client.0.vm03.stdout:2/661: dwrite db/d12/d2a/d61/d79/d83/d64/dbd/da0/fcb [0,4194304] 0 2026-03-09T16:14:42.846 INFO:tasks.workunit.client.0.vm03.stdout:1/561: truncate d4/d6/d3b/f35 
2218401 0 2026-03-09T16:14:42.863 INFO:tasks.workunit.client.0.vm03.stdout:4/672: rmdir d5/db/d25/d8b 39 2026-03-09T16:14:42.865 INFO:tasks.workunit.client.0.vm03.stdout:4/673: dread d5/d17/d44/f90 [0,4194304] 0 2026-03-09T16:14:42.866 INFO:tasks.workunit.client.0.vm03.stdout:8/687: symlink da/d32/le2 0 2026-03-09T16:14:42.870 INFO:tasks.workunit.client.0.vm03.stdout:3/640: dwrite d5/d1e/d42/d55/f57 [0,4194304] 0 2026-03-09T16:14:42.881 INFO:tasks.workunit.client.0.vm03.stdout:0/651: truncate d0/d7/d3e/d57/d5a/d5f/f84 749616 0 2026-03-09T16:14:42.900 INFO:tasks.workunit.client.0.vm03.stdout:2/662: rmdir db/d12 39 2026-03-09T16:14:42.910 INFO:tasks.workunit.client.0.vm03.stdout:8/688: mknod da/d6c/ce3 0 2026-03-09T16:14:42.934 INFO:tasks.workunit.client.0.vm03.stdout:9/705: write d2/d54/d7d/d8f/dad/fae [480039,94327] 0 2026-03-09T16:14:42.937 INFO:tasks.workunit.client.0.vm03.stdout:9/706: dwrite d2/d4/d11/d29/d2a/d4d/fab [0,4194304] 0 2026-03-09T16:14:42.955 INFO:tasks.workunit.client.0.vm03.stdout:6/636: creat d9/d42/d45/d50/d80/d8a/fc5 x:0 0 0 2026-03-09T16:14:42.955 INFO:tasks.workunit.client.0.vm03.stdout:0/652: creat d0/d7/d3e/d57/d5a/d52/d9f/fe3 x:0 0 0 2026-03-09T16:14:42.957 INFO:tasks.workunit.client.0.vm03.stdout:1/562: creat d4/d31/d5c/da8/da1/fbf x:0 0 0 2026-03-09T16:14:42.958 INFO:tasks.workunit.client.0.vm03.stdout:1/563: stat d4/d39/f5a 0 2026-03-09T16:14:42.958 INFO:tasks.workunit.client.0.vm03.stdout:4/674: mkdir d5/db/d25/d31/dcc 0 2026-03-09T16:14:42.958 INFO:tasks.workunit.client.0.vm03.stdout:1/564: fsync d4/d31/f4f 0 2026-03-09T16:14:42.958 INFO:tasks.workunit.client.0.vm03.stdout:8/689: truncate da/d32/f4d 2891828 0 2026-03-09T16:14:42.959 INFO:tasks.workunit.client.0.vm03.stdout:1/565: read - d4/d6/d1d/d20/d93/f6f zero size 2026-03-09T16:14:42.961 INFO:tasks.workunit.client.0.vm03.stdout:0/653: dread d0/da/d5c/f66 [0,4194304] 0 2026-03-09T16:14:42.968 INFO:tasks.workunit.client.0.vm03.stdout:6/637: dwrite d9/d42/d45/d50/d80/d8a/d9c/f8c [0,4194304] 0 2026-03-09T16:14:42.974 INFO:tasks.workunit.client.0.vm03.stdout:9/707: fsync d2/d4/d11/d29/d2a/f8b 0 2026-03-09T16:14:42.984 INFO:tasks.workunit.client.0.vm03.stdout:3/641: dread d5/f2b [0,4194304] 0 2026-03-09T16:14:43.002 INFO:tasks.workunit.client.0.vm03.stdout:5/732: getdents d2/d7/de/d54/dce 0 2026-03-09T16:14:43.004 INFO:tasks.workunit.client.0.vm03.stdout:8/690: mkdir da/d6c/d7a/de4 0 2026-03-09T16:14:43.006 INFO:tasks.workunit.client.0.vm03.stdout:1/566: mkdir d4/d6/d3b/d6b/da5/dc0 0 2026-03-09T16:14:43.011 INFO:tasks.workunit.client.0.vm03.stdout:0/654: fsync d0/d7/d3e/d57/d5a/d52/f97 0 2026-03-09T16:14:43.022 INFO:tasks.workunit.client.0.vm03.stdout:9/708: mkdir d2/d54/d7d/dd3 0 2026-03-09T16:14:43.026 INFO:tasks.workunit.client.0.vm03.stdout:3/642: write d5/fb3 [1240871,66011] 0 2026-03-09T16:14:43.027 INFO:tasks.workunit.client.0.vm03.stdout:3/643: dread - d5/d2e/db6/fba zero size 2026-03-09T16:14:43.027 INFO:tasks.workunit.client.0.vm03.stdout:3/644: fsync d5/d2e/db6/fba 0 2026-03-09T16:14:43.033 INFO:tasks.workunit.client.0.vm03.stdout:7/602: getdents d4/da/d18/d22/d24/d16/d69 0 2026-03-09T16:14:43.038 INFO:tasks.workunit.client.0.vm03.stdout:2/663: creat db/ff0 x:0 0 0 2026-03-09T16:14:43.041 INFO:tasks.workunit.client.0.vm03.stdout:5/733: rmdir d2/d7 39 2026-03-09T16:14:43.044 INFO:tasks.workunit.client.0.vm03.stdout:8/691: fsync da/d10/d28/d4f/d68/fa7 0 2026-03-09T16:14:43.044 INFO:tasks.workunit.client.0.vm03.stdout:1/567: creat d4/d6/d1d/d20/fc1 x:0 0 0 2026-03-09T16:14:43.050 
INFO:tasks.workunit.client.0.vm03.stdout:1/568: dwrite d4/f1b [0,4194304] 0 2026-03-09T16:14:43.051 INFO:tasks.workunit.client.0.vm03.stdout:1/569: chown d4/d6/d1d/d20/d23/l91 9787639 1 2026-03-09T16:14:43.067 INFO:tasks.workunit.client.0.vm03.stdout:4/675: dwrite d5/db/d25/d8b/da8/f62 [8388608,4194304] 0 2026-03-09T16:14:43.076 INFO:tasks.workunit.client.0.vm03.stdout:6/638: rename d9/d22 to d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6 0 2026-03-09T16:14:43.077 INFO:tasks.workunit.client.0.vm03.stdout:0/655: creat d0/d7/d3e/d57/fe4 x:0 0 0 2026-03-09T16:14:43.079 INFO:tasks.workunit.client.0.vm03.stdout:9/709: symlink d2/d4/d11/d29/d2a/d46/ld4 0 2026-03-09T16:14:43.081 INFO:tasks.workunit.client.0.vm03.stdout:3/645: mkdir d5/d1e/d42/d55/d86/dbe 0 2026-03-09T16:14:43.082 INFO:tasks.workunit.client.0.vm03.stdout:3/646: chown d5/c93 19908042 1 2026-03-09T16:14:43.083 INFO:tasks.workunit.client.0.vm03.stdout:3/647: chown d5/d53/d6c/f9f 92 1 2026-03-09T16:14:43.094 INFO:tasks.workunit.client.0.vm03.stdout:8/692: creat da/db/d43/fe5 x:0 0 0 2026-03-09T16:14:43.094 INFO:tasks.workunit.client.0.vm03.stdout:1/570: rename d4/d6/d3b/d63/f82 to d4/d6/d1d/d20/fc2 0 2026-03-09T16:14:43.094 INFO:tasks.workunit.client.0.vm03.stdout:7/603: truncate d4/da/d18/f44 4331924 0 2026-03-09T16:14:43.095 INFO:tasks.workunit.client.0.vm03.stdout:9/710: read d2/f7 [6710932,101389] 0 2026-03-09T16:14:43.095 INFO:tasks.workunit.client.0.vm03.stdout:7/604: chown d4/da/d18/d22/d24/d16/d3e/d77 6858 1 2026-03-09T16:14:43.098 INFO:tasks.workunit.client.0.vm03.stdout:7/605: dwrite d4/da/d45/d51/d36/fc8 [0,4194304] 0 2026-03-09T16:14:43.111 INFO:tasks.workunit.client.0.vm03.stdout:1/571: mkdir d4/d6/d3b/d6b/da5/dc3 0 2026-03-09T16:14:43.111 INFO:tasks.workunit.client.0.vm03.stdout:4/676: sync 2026-03-09T16:14:43.113 INFO:tasks.workunit.client.0.vm03.stdout:6/639: creat d9/d42/d45/d50/d80/d8a/dc1/fc7 x:0 0 0 2026-03-09T16:14:43.116 INFO:tasks.workunit.client.0.vm03.stdout:3/648: dread d5/d1e/f9b [0,4194304] 0 2026-03-09T16:14:43.118 INFO:tasks.workunit.client.0.vm03.stdout:6/640: dwrite d9/d42/fa0 [0,4194304] 0 2026-03-09T16:14:43.120 INFO:tasks.workunit.client.0.vm03.stdout:9/711: creat d2/d4/d11/d12/d28/fd5 x:0 0 0 2026-03-09T16:14:43.134 INFO:tasks.workunit.client.0.vm03.stdout:2/664: creat db/d12/da5/dc2/dc9/ff1 x:0 0 0 2026-03-09T16:14:43.135 INFO:tasks.workunit.client.0.vm03.stdout:8/693: creat da/d6c/dc4/fe6 x:0 0 0 2026-03-09T16:14:43.137 INFO:tasks.workunit.client.0.vm03.stdout:4/677: fdatasync d5/d17/d44/fa4 0 2026-03-09T16:14:43.137 INFO:tasks.workunit.client.0.vm03.stdout:4/678: dread - d5/db/d25/d31/d4d/da9/fc1 zero size 2026-03-09T16:14:43.138 INFO:tasks.workunit.client.0.vm03.stdout:0/656: link d0/d7/d3e/d57/d5a/d82/d89/dbd/c85 d0/d7/d3e/d57/ce5 0 2026-03-09T16:14:43.145 INFO:tasks.workunit.client.0.vm03.stdout:3/649: unlink d5/d2e/c35 0 2026-03-09T16:14:43.149 INFO:tasks.workunit.client.0.vm03.stdout:3/650: dread d5/d1e/d42/f2c [0,4194304] 0 2026-03-09T16:14:43.149 INFO:tasks.workunit.client.0.vm03.stdout:3/651: stat d5/d2e/c4b 0 2026-03-09T16:14:43.150 INFO:tasks.workunit.client.0.vm03.stdout:1/572: read d4/d6/d1d/d20/d93/f48 [2635300,39951] 0 2026-03-09T16:14:43.161 INFO:tasks.workunit.client.0.vm03.stdout:9/712: mkdir d2/d4/d11/d29/d2a/d46/dd6 0 2026-03-09T16:14:43.161 INFO:tasks.workunit.client.0.vm03.stdout:7/606: truncate d4/f8f 3814001 0 2026-03-09T16:14:43.162 INFO:tasks.workunit.client.0.vm03.stdout:8/694: unlink da/d1d/c26 0 2026-03-09T16:14:43.166 INFO:tasks.workunit.client.0.vm03.stdout:2/665: dread 
db/f23 [0,4194304] 0 2026-03-09T16:14:43.169 INFO:tasks.workunit.client.0.vm03.stdout:4/679: symlink d5/dd/d1f/lcd 0 2026-03-09T16:14:43.171 INFO:tasks.workunit.client.0.vm03.stdout:1/573: unlink d4/d7b/fac 0 2026-03-09T16:14:43.172 INFO:tasks.workunit.client.0.vm03.stdout:1/574: chown d4/d6/l5d 1045400566 1 2026-03-09T16:14:43.183 INFO:tasks.workunit.client.0.vm03.stdout:2/666: rmdir db/d12/d2a/d61/d6d/d8c/d94 39 2026-03-09T16:14:43.186 INFO:tasks.workunit.client.0.vm03.stdout:4/680: symlink d5/dd/d1f/d5f/lce 0 2026-03-09T16:14:43.189 INFO:tasks.workunit.client.0.vm03.stdout:4/681: dwrite d5/dd/f23 [0,4194304] 0 2026-03-09T16:14:43.210 INFO:tasks.workunit.client.0.vm03.stdout:5/734: dwrite d2/d7/d1a/d1c/d6c/f79 [0,4194304] 0 2026-03-09T16:14:43.210 INFO:tasks.workunit.client.0.vm03.stdout:5/735: write d2/d7/de/d11/d38/d52/f7d [669524,115483] 0 2026-03-09T16:14:43.211 INFO:tasks.workunit.client.0.vm03.stdout:5/736: write d2/d7/d8/d24/d27/d43/d4b/fd1 [712467,27841] 0 2026-03-09T16:14:43.215 INFO:tasks.workunit.client.0.vm03.stdout:5/737: dwrite d2/d7/d1a/d1c/d3f/f92 [0,4194304] 0 2026-03-09T16:14:43.238 INFO:tasks.workunit.client.0.vm03.stdout:6/641: mknod d9/d14/cc8 0 2026-03-09T16:14:43.238 INFO:tasks.workunit.client.0.vm03.stdout:6/642: stat d9/d42/d45/d50/d80/d8a/d9c/l7c 0 2026-03-09T16:14:43.245 INFO:tasks.workunit.client.0.vm03.stdout:8/695: dwrite da/fba [0,4194304] 0 2026-03-09T16:14:43.250 INFO:tasks.workunit.client.0.vm03.stdout:0/657: link d0/d7/d3e/d57/d5a/d52/l8d d0/da/d7a/le6 0 2026-03-09T16:14:43.263 INFO:tasks.workunit.client.0.vm03.stdout:1/575: dwrite d4/db/f21 [0,4194304] 0 2026-03-09T16:14:43.263 INFO:tasks.workunit.client.0.vm03.stdout:1/576: dread - d4/db/d8b/fb0 zero size 2026-03-09T16:14:43.272 INFO:tasks.workunit.client.0.vm03.stdout:9/713: creat d2/fd7 x:0 0 0 2026-03-09T16:14:43.278 INFO:tasks.workunit.client.0.vm03.stdout:9/714: fdatasync d2/d4/d11/d12/d28/fd5 0 2026-03-09T16:14:43.278 INFO:tasks.workunit.client.0.vm03.stdout:9/715: dwrite d2/d4/d11/d29/fc0 [0,4194304] 0 2026-03-09T16:14:43.282 INFO:tasks.workunit.client.0.vm03.stdout:9/716: truncate d2/d54/d7d/d8f/dad/fae 1170854 0 2026-03-09T16:14:43.282 INFO:tasks.workunit.client.0.vm03.stdout:9/717: chown d2/d4/f3e 25 1 2026-03-09T16:14:43.293 INFO:tasks.workunit.client.0.vm03.stdout:0/658: chown d0/d7/d3e/d57/d5a/d82/d89/dbd/ccb 13274 1 2026-03-09T16:14:43.297 INFO:tasks.workunit.client.0.vm03.stdout:3/652: getdents d5/d1e/d42/d34 0 2026-03-09T16:14:43.299 INFO:tasks.workunit.client.0.vm03.stdout:3/653: stat d5/d44/d61/fb8 0 2026-03-09T16:14:43.299 INFO:tasks.workunit.client.0.vm03.stdout:9/718: mknod d2/df/d89/cd8 0 2026-03-09T16:14:43.299 INFO:tasks.workunit.client.0.vm03.stdout:9/719: chown d2/d4/d11/d12/l21 1852611 1 2026-03-09T16:14:43.305 INFO:tasks.workunit.client.0.vm03.stdout:0/659: read d0/d7/d3e/d57/d5a/f38 [500511,112804] 0 2026-03-09T16:14:43.305 INFO:tasks.workunit.client.0.vm03.stdout:3/654: creat d5/d1e/d42/d4c/fbf x:0 0 0 2026-03-09T16:14:43.306 INFO:tasks.workunit.client.0.vm03.stdout:0/660: chown d0/d7/d3e/d57/d5a/d5f/db2/dab 109841 1 2026-03-09T16:14:43.307 INFO:tasks.workunit.client.0.vm03.stdout:7/607: dread d4/da/d18/f44 [0,4194304] 0 2026-03-09T16:14:43.313 INFO:tasks.workunit.client.0.vm03.stdout:1/577: mknod d4/d6/d1d/d20/d23/cc4 0 2026-03-09T16:14:43.322 INFO:tasks.workunit.client.0.vm03.stdout:5/738: truncate d2/d7/d8/d24/fb1 1120029 0 2026-03-09T16:14:43.324 INFO:tasks.workunit.client.0.vm03.stdout:4/682: dwrite d5/db/d25/d31/d4d/da9/fab [0,4194304] 0 2026-03-09T16:14:43.327 
INFO:tasks.workunit.client.0.vm03.stdout:8/696: dwrite da/db/f34 [0,4194304] 0 2026-03-09T16:14:43.328 INFO:tasks.workunit.client.0.vm03.stdout:6/643: dwrite d9/d14/f3d [0,4194304] 0 2026-03-09T16:14:43.334 INFO:tasks.workunit.client.0.vm03.stdout:6/644: write d9/d42/d45/d65/f7f [293316,64372] 0 2026-03-09T16:14:43.342 INFO:tasks.workunit.client.0.vm03.stdout:4/683: dwrite d5/db/d25/d8b/fc6 [0,4194304] 0 2026-03-09T16:14:43.344 INFO:tasks.workunit.client.0.vm03.stdout:8/697: dread f8 [4194304,4194304] 0 2026-03-09T16:14:43.353 INFO:tasks.workunit.client.0.vm03.stdout:2/667: getdents db/d12/da5/dc2/dc9 0 2026-03-09T16:14:43.359 INFO:tasks.workunit.client.0.vm03.stdout:3/655: rename d5/d1e/d42/d4c/fbf to d5/d2e/db6/fc0 0 2026-03-09T16:14:43.361 INFO:tasks.workunit.client.0.vm03.stdout:7/608: mkdir d4/da/d5d/db0/d9d/dc9 0 2026-03-09T16:14:43.362 INFO:tasks.workunit.client.0.vm03.stdout:1/578: mkdir d4/d6/d3b/dc5 0 2026-03-09T16:14:43.362 INFO:tasks.workunit.client.0.vm03.stdout:1/579: fsync d4/f1b 0 2026-03-09T16:14:43.376 INFO:tasks.workunit.client.0.vm03.stdout:4/684: write d5/f54 [2905000,24353] 0 2026-03-09T16:14:43.379 INFO:tasks.workunit.client.0.vm03.stdout:4/685: dread d5/d17/f39 [0,4194304] 0 2026-03-09T16:14:43.379 INFO:tasks.workunit.client.0.vm03.stdout:4/686: readlink d5/db/d25/d31/d33/d79/l96 0 2026-03-09T16:14:43.380 INFO:tasks.workunit.client.0.vm03.stdout:4/687: write d5/db/d25/d8b/da8/f62 [6787951,130638] 0 2026-03-09T16:14:43.384 INFO:tasks.workunit.client.0.vm03.stdout:8/698: symlink da/d45/le7 0 2026-03-09T16:14:43.385 INFO:tasks.workunit.client.0.vm03.stdout:9/720: mkdir d2/d4/d11/d29/d2a/d46/dd6/dd9 0 2026-03-09T16:14:43.386 INFO:tasks.workunit.client.0.vm03.stdout:9/721: write d2/d4/d11/d29/d2a/f58 [1476733,6766] 0 2026-03-09T16:14:43.392 INFO:tasks.workunit.client.0.vm03.stdout:3/656: rmdir d5/d1e/d42/d8b 39 2026-03-09T16:14:43.393 INFO:tasks.workunit.client.0.vm03.stdout:0/661: link d0/da/d1b/fd d0/da/d5c/fe7 0 2026-03-09T16:14:43.394 INFO:tasks.workunit.client.0.vm03.stdout:0/662: stat d0/d7/d3e/d57/d5a/d5f/db2/d8e/fcc 0 2026-03-09T16:14:43.397 INFO:tasks.workunit.client.0.vm03.stdout:1/580: unlink d4/d7b/fbd 0 2026-03-09T16:14:43.398 INFO:tasks.workunit.client.0.vm03.stdout:1/581: chown d4/d6/d1d/d20/d93/l67 4738214 1 2026-03-09T16:14:43.404 INFO:tasks.workunit.client.0.vm03.stdout:7/609: dread d4/d2d/f32 [0,4194304] 0 2026-03-09T16:14:43.414 INFO:tasks.workunit.client.0.vm03.stdout:6/645: dread d9/d14/f44 [0,4194304] 0 2026-03-09T16:14:43.454 INFO:tasks.workunit.client.0.vm03.stdout:9/722: rename d2/de/f87 to d2/d4/d11/d29/d2a/d4d/fda 0 2026-03-09T16:14:43.457 INFO:tasks.workunit.client.0.vm03.stdout:2/668: dwrite db/d12/d2a/d61/f9b [0,4194304] 0 2026-03-09T16:14:43.460 INFO:tasks.workunit.client.0.vm03.stdout:3/657: symlink d5/d1e/lc1 0 2026-03-09T16:14:43.462 INFO:tasks.workunit.client.0.vm03.stdout:3/658: dread d5/f2b [0,4194304] 0 2026-03-09T16:14:43.466 INFO:tasks.workunit.client.0.vm03.stdout:1/582: unlink d4/db/f2e 0 2026-03-09T16:14:43.467 INFO:tasks.workunit.client.0.vm03.stdout:1/583: read - d4/db/d59/f9d zero size 2026-03-09T16:14:43.474 INFO:tasks.workunit.client.0.vm03.stdout:6/646: write d9/d42/f74 [4757959,58417] 0 2026-03-09T16:14:43.477 INFO:tasks.workunit.client.0.vm03.stdout:6/647: dwrite d9/d42/d45/d65/fb5 [0,4194304] 0 2026-03-09T16:14:43.491 INFO:tasks.workunit.client.0.vm03.stdout:9/723: symlink d2/d4/d11/d29/d2a/d38/db6/ldb 0 2026-03-09T16:14:43.507 INFO:tasks.workunit.client.0.vm03.stdout:7/610: fdatasync d4/da/d18/d22/d24/d16/d2b/f5a 0 
2026-03-09T16:14:43.512 INFO:tasks.workunit.client.0.vm03.stdout:5/739: truncate d2/d7/d3c/d3d/f93 1544688 0 2026-03-09T16:14:43.517 INFO:tasks.workunit.client.0.vm03.stdout:8/699: dwrite da/d10/d28/d4f/d68/fa7 [0,4194304] 0 2026-03-09T16:14:43.527 INFO:tasks.workunit.client.0.vm03.stdout:2/669: symlink db/d12/d2a/d61/dca/lf2 0 2026-03-09T16:14:43.530 INFO:tasks.workunit.client.0.vm03.stdout:3/659: rename d5/d1e/d42/d4c/l5c to d5/d1e/d42/d55/d86/lc2 0 2026-03-09T16:14:43.533 INFO:tasks.workunit.client.0.vm03.stdout:1/584: dwrite d4/d6/d1d/d20/d93/f85 [0,4194304] 0 2026-03-09T16:14:43.542 INFO:tasks.workunit.client.0.vm03.stdout:5/740: mknod d2/d7/de9/cf9 0 2026-03-09T16:14:43.553 INFO:tasks.workunit.client.0.vm03.stdout:6/648: mkdir d9/d42/d45/d65/dbf/dc9 0 2026-03-09T16:14:43.555 INFO:tasks.workunit.client.0.vm03.stdout:4/688: getdents d5/d17 0 2026-03-09T16:14:43.558 INFO:tasks.workunit.client.0.vm03.stdout:4/689: dwrite d5/db/d25/d31/d4d/d5b/d72/d82/fa7 [0,4194304] 0 2026-03-09T16:14:43.568 INFO:tasks.workunit.client.0.vm03.stdout:7/611: dwrite d4/da/d18/d22/f48 [0,4194304] 0 2026-03-09T16:14:43.570 INFO:tasks.workunit.client.0.vm03.stdout:7/612: read d4/d2d/f32 [1820438,124811] 0 2026-03-09T16:14:43.580 INFO:tasks.workunit.client.0.vm03.stdout:0/663: getdents d0/d7/d3e/d57/d5a/d5f/db2/d8e 0 2026-03-09T16:14:43.581 INFO:tasks.workunit.client.0.vm03.stdout:0/664: stat d0/da/f8b 0 2026-03-09T16:14:43.591 INFO:tasks.workunit.client.0.vm03.stdout:7/613: mkdir d4/da/d5d/db0/d61/dca 0 2026-03-09T16:14:43.598 INFO:tasks.workunit.client.0.vm03.stdout:3/660: mknod d5/d44/cc3 0 2026-03-09T16:14:43.601 INFO:tasks.workunit.client.0.vm03.stdout:5/741: dread d2/d7/d1a/f9f [0,4194304] 0 2026-03-09T16:14:43.601 INFO:tasks.workunit.client.0.vm03.stdout:5/742: read - d2/d75/fec zero size 2026-03-09T16:14:43.602 INFO:tasks.workunit.client.0.vm03.stdout:2/670: write db/d12/d2a/d61/d6d/d8c/d94/da4/fb9 [507535,5254] 0 2026-03-09T16:14:43.613 INFO:tasks.workunit.client.0.vm03.stdout:4/690: symlink d5/db/d25/d31/lcf 0 2026-03-09T16:14:43.617 INFO:tasks.workunit.client.0.vm03.stdout:4/691: dwrite d5/db/d25/d31/d33/d79/fa6 [0,4194304] 0 2026-03-09T16:14:43.618 INFO:tasks.workunit.client.0.vm03.stdout:4/692: chown d5/db/cb5 598224 1 2026-03-09T16:14:43.629 INFO:tasks.workunit.client.0.vm03.stdout:7/614: symlink d4/da/d18/d22/d24/d15/d71/db7/lcb 0 2026-03-09T16:14:43.641 INFO:tasks.workunit.client.0.vm03.stdout:3/661: dread d5/d6d/d6a/f8e [0,4194304] 0 2026-03-09T16:14:43.647 INFO:tasks.workunit.client.0.vm03.stdout:9/724: truncate d2/d4/d11/d29/d2a/f58 3865123 0 2026-03-09T16:14:43.648 INFO:tasks.workunit.client.0.vm03.stdout:9/725: readlink d2/d4/d11/d29/d2a/d38/la6 0 2026-03-09T16:14:43.648 INFO:tasks.workunit.client.0.vm03.stdout:8/700: write da/d10/d28/d64/fc8 [4880121,98195] 0 2026-03-09T16:14:43.650 INFO:tasks.workunit.client.0.vm03.stdout:6/649: dwrite d9/d14/f31 [4194304,4194304] 0 2026-03-09T16:14:43.655 INFO:tasks.workunit.client.0.vm03.stdout:6/650: dread d9/d42/d45/d65/fb5 [0,4194304] 0 2026-03-09T16:14:43.663 INFO:tasks.workunit.client.0.vm03.stdout:1/585: dwrite d4/d6/f19 [0,4194304] 0 2026-03-09T16:14:43.666 INFO:tasks.workunit.client.0.vm03.stdout:0/665: dwrite d0/da/d1b/d9b/f61 [0,4194304] 0 2026-03-09T16:14:43.668 INFO:tasks.workunit.client.0.vm03.stdout:1/586: read - d4/d6/da2/fbb zero size 2026-03-09T16:14:43.680 INFO:tasks.workunit.client.0.vm03.stdout:4/693: creat d5/db/d25/d8b/da8/dbe/fd0 x:0 0 0 2026-03-09T16:14:43.680 INFO:tasks.workunit.client.0.vm03.stdout:4/694: readlink d5/l88 0 
2026-03-09T16:14:43.681 INFO:tasks.workunit.client.0.vm03.stdout:4/695: fdatasync d5/db/d25/d31/d4d/da9/fc1 0 2026-03-09T16:14:43.682 INFO:tasks.workunit.client.0.vm03.stdout:4/696: write d5/db/d25/d31/d4d/fb2 [1432172,35422] 0 2026-03-09T16:14:43.687 INFO:tasks.workunit.client.0.vm03.stdout:2/671: symlink db/d12/d2a/d61/d79/d83/d64/lf3 0 2026-03-09T16:14:43.687 INFO:tasks.workunit.client.0.vm03.stdout:2/672: dread - db/d12/d2a/f8d zero size 2026-03-09T16:14:43.688 INFO:tasks.workunit.client.0.vm03.stdout:2/673: write db/d12/d2a/d61/d79/d83/d64/dbd/fd8 [102237,124470] 0 2026-03-09T16:14:43.690 INFO:tasks.workunit.client.0.vm03.stdout:2/674: dread - db/d12/d2a/d61/f74 zero size 2026-03-09T16:14:43.701 INFO:tasks.workunit.client.0.vm03.stdout:2/675: dwrite db/d12/d2a/d61/d79/d83/fe9 [0,4194304] 0 2026-03-09T16:14:43.701 INFO:tasks.workunit.client.0.vm03.stdout:7/615: fsync d4/da/d18/f6a 0 2026-03-09T16:14:43.714 INFO:tasks.workunit.client.0.vm03.stdout:8/701: fdatasync da/d1d/f99 0 2026-03-09T16:14:43.715 INFO:tasks.workunit.client.0.vm03.stdout:9/726: dread d2/f33 [0,4194304] 0 2026-03-09T16:14:43.814 INFO:tasks.workunit.client.0.vm03.stdout:3/662: dwrite d5/d1e/d42/f1d [0,4194304] 0 2026-03-09T16:14:43.815 INFO:tasks.workunit.client.0.vm03.stdout:3/663: fsync d5/d44/f5d 0 2026-03-09T16:14:43.843 INFO:tasks.workunit.client.0.vm03.stdout:0/666: symlink d0/d7/d75/le8 0 2026-03-09T16:14:43.854 INFO:tasks.workunit.client.0.vm03.stdout:4/697: truncate d5/dd/d1f/d5f/f98 333080 0 2026-03-09T16:14:43.857 INFO:tasks.workunit.client.0.vm03.stdout:4/698: dwrite d5/d17/da0/fb9 [0,4194304] 0 2026-03-09T16:14:43.877 INFO:tasks.workunit.client.0.vm03.stdout:3/664: rename d5/d2e/l3b to d5/d1e/d42/d34/d70/lc4 0 2026-03-09T16:14:43.877 INFO:tasks.workunit.client.0.vm03.stdout:3/665: stat d5/d6d/d5a/f7c 0 2026-03-09T16:14:43.896 INFO:tasks.workunit.client.0.vm03.stdout:6/651: symlink d9/d42/d45/d65/dae/lca 0 2026-03-09T16:14:43.901 INFO:tasks.workunit.client.0.vm03.stdout:0/667: fsync d0/da/d5c/fae 0 2026-03-09T16:14:43.904 INFO:tasks.workunit.client.0.vm03.stdout:6/652: dread d9/d14/f29 [4194304,4194304] 0 2026-03-09T16:14:43.904 INFO:tasks.workunit.client.0.vm03.stdout:6/653: chown d9 2 1 2026-03-09T16:14:43.909 INFO:tasks.workunit.client.0.vm03.stdout:1/587: creat d4/db/d8b/db2/fc6 x:0 0 0 2026-03-09T16:14:43.921 INFO:tasks.workunit.client.0.vm03.stdout:5/743: link d2/d7/de/d11/d19/d29/d90/db6/cca d2/d7/de/d11/d19/d31/d35/d87/cfa 0 2026-03-09T16:14:43.921 INFO:tasks.workunit.client.0.vm03.stdout:5/744: chown d2/d75/fec 100610 1 2026-03-09T16:14:43.925 INFO:tasks.workunit.client.0.vm03.stdout:5/745: dwrite d2/d7/d1a/d1c/d6c/f79 [4194304,4194304] 0 2026-03-09T16:14:43.926 INFO:tasks.workunit.client.0.vm03.stdout:5/746: readlink d2/d7/d1a/d1c/d3f/l45 0 2026-03-09T16:14:43.940 INFO:tasks.workunit.client.0.vm03.stdout:7/616: creat d4/da/d5d/db0/d61/dca/fcc x:0 0 0 2026-03-09T16:14:43.940 INFO:tasks.workunit.client.0.vm03.stdout:7/617: fdatasync d4/da/d5d/f9b 0 2026-03-09T16:14:43.945 INFO:tasks.workunit.client.0.vm03.stdout:8/702: mkdir da/d10/d28/db1/dce/de8 0 2026-03-09T16:14:43.973 INFO:tasks.workunit.client.0.vm03.stdout:6/654: dwrite d9/d42/d45/d65/fb5 [0,4194304] 0 2026-03-09T16:14:43.975 INFO:tasks.workunit.client.0.vm03.stdout:6/655: write d9/d42/f9a [70327,73164] 0 2026-03-09T16:14:43.985 INFO:tasks.workunit.client.0.vm03.stdout:4/699: mkdir d5/db/d25/d8b/da8/dbe/dc4/dd1 0 2026-03-09T16:14:43.991 INFO:tasks.workunit.client.0.vm03.stdout:5/747: rmdir d2/d7/de/d11/d38 39 2026-03-09T16:14:43.992 
INFO:tasks.workunit.client.0.vm03.stdout:5/748: write d2/d7/d1a/f6e [4305159,46475] 0 2026-03-09T16:14:43.992 INFO:tasks.workunit.client.0.vm03.stdout:5/749: dread - d2/d7/d8/d24/ff3 zero size 2026-03-09T16:14:43.997 INFO:tasks.workunit.client.0.vm03.stdout:7/618: unlink d4/da/d18/d22/d24/d15/d71/db7/lcb 0 2026-03-09T16:14:43.998 INFO:tasks.workunit.client.0.vm03.stdout:4/700: dread d5/db/d25/d8b/da8/f62 [4194304,4194304] 0 2026-03-09T16:14:43.998 INFO:tasks.workunit.client.0.vm03.stdout:4/701: chown d5/d17/f8d 61570 1 2026-03-09T16:14:44.000 INFO:tasks.workunit.client.0.vm03.stdout:8/703: symlink da/db/d30/le9 0 2026-03-09T16:14:44.002 INFO:tasks.workunit.client.0.vm03.stdout:8/704: dread da/d10/d28/d64/fab [0,4194304] 0 2026-03-09T16:14:44.003 INFO:tasks.workunit.client.0.vm03.stdout:8/705: readlink da/d32/d79/l9f 0 2026-03-09T16:14:44.019 INFO:tasks.workunit.client.0.vm03.stdout:9/727: getdents d2/d4/d11/d12/db2 0 2026-03-09T16:14:44.021 INFO:tasks.workunit.client.0.vm03.stdout:6/656: creat d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/dad/fcb x:0 0 0 2026-03-09T16:14:44.023 INFO:tasks.workunit.client.0.vm03.stdout:1/588: rename d4/d6/d3b/f9b to d4/d6/d3b/d6b/d25/fc7 0 2026-03-09T16:14:44.023 INFO:tasks.workunit.client.0.vm03.stdout:7/619: rename d4/da/d18/d22 to d4/da/d18/d22/d24/d16/d69/dcd 22 2026-03-09T16:14:44.028 INFO:tasks.workunit.client.0.vm03.stdout:2/676: dwrite db/d12/f49 [0,4194304] 0 2026-03-09T16:14:44.039 INFO:tasks.workunit.client.0.vm03.stdout:5/750: symlink d2/d7/de/d11/d19/d29/d90/lfb 0 2026-03-09T16:14:44.049 INFO:tasks.workunit.client.0.vm03.stdout:1/589: dread d4/d39/f5a [0,4194304] 0 2026-03-09T16:14:44.050 INFO:tasks.workunit.client.0.vm03.stdout:1/590: readlink d4/d6/d1d/d20/d5f/lb7 0 2026-03-09T16:14:44.057 INFO:tasks.workunit.client.0.vm03.stdout:9/728: creat d2/d4/d11/d12/db2/dc2/fdc x:0 0 0 2026-03-09T16:14:44.073 INFO:tasks.workunit.client.0.vm03.stdout:5/751: mkdir d2/d7/d8/d16/d5c/dfc 0 2026-03-09T16:14:44.073 INFO:tasks.workunit.client.0.vm03.stdout:5/752: dwrite d2/d7/de/d33/f8b [0,4194304] 0 2026-03-09T16:14:44.073 INFO:tasks.workunit.client.0.vm03.stdout:7/620: read d4/da/d45/d51/f50 [205098,128344] 0 2026-03-09T16:14:44.089 INFO:tasks.workunit.client.0.vm03.stdout:1/591: dread d4/d6/d1d/d3d/f49 [0,4194304] 0 2026-03-09T16:14:44.095 INFO:tasks.workunit.client.0.vm03.stdout:7/621: unlink d4/da/d18/lc3 0 2026-03-09T16:14:44.099 INFO:tasks.workunit.client.0.vm03.stdout:0/668: unlink d0/d7/d3e/fd4 0 2026-03-09T16:14:44.117 INFO:tasks.workunit.client.0.vm03.stdout:0/669: truncate d0/f4d 4117980 0 2026-03-09T16:14:44.123 INFO:tasks.workunit.client.0.vm03.stdout:1/592: link d4/d6/d3b/d6b/d25/f84 d4/d6/d1d/d20/fc8 0 2026-03-09T16:14:44.137 INFO:tasks.workunit.client.0.vm03.stdout:3/666: link d5/d1e/f31 d5/d1e/fc5 0 2026-03-09T16:14:44.137 INFO:tasks.workunit.client.0.vm03.stdout:3/667: readlink d5/d6d/d5a/laf 0 2026-03-09T16:14:44.137 INFO:tasks.workunit.client.0.vm03.stdout:3/668: chown d5/d6d/d5a/d63 11225 1 2026-03-09T16:14:44.139 INFO:tasks.workunit.client.0.vm03.stdout:2/677: sync 2026-03-09T16:14:44.147 INFO:tasks.workunit.client.0.vm03.stdout:3/669: dread d5/d1e/d42/f99 [0,4194304] 0 2026-03-09T16:14:44.147 INFO:tasks.workunit.client.0.vm03.stdout:3/670: stat d5/d44/f5d 0 2026-03-09T16:14:44.152 INFO:tasks.workunit.client.0.vm03.stdout:4/702: rename d5/db/d25/d8b/da8/dbe/dc4 to d5/db/d25/dc8/dd2 0 2026-03-09T16:14:44.154 INFO:tasks.workunit.client.0.vm03.stdout:2/678: unlink db/d12/d2a/d61/d79/f95 0 2026-03-09T16:14:44.157 
INFO:tasks.workunit.client.0.vm03.stdout:2/679: dwrite db/d12/d2a/d61/dbe/fe6 [0,4194304] 0 2026-03-09T16:14:44.167 INFO:tasks.workunit.client.0.vm03.stdout:9/729: rename d2/d4/d11/d29/d2a/d38/f72 to d2/d4/d11/d12/db2/dc2/fdd 0 2026-03-09T16:14:44.168 INFO:tasks.workunit.client.0.vm03.stdout:5/753: creat d2/d7/de/ffd x:0 0 0 2026-03-09T16:14:44.170 INFO:tasks.workunit.client.0.vm03.stdout:4/703: rmdir d5/db/d25/d31/d4d/da9 39 2026-03-09T16:14:44.172 INFO:tasks.workunit.client.0.vm03.stdout:5/754: dwrite d2/d75/fe3 [0,4194304] 0 2026-03-09T16:14:44.179 INFO:tasks.workunit.client.0.vm03.stdout:8/706: write da/f52 [1066187,31332] 0 2026-03-09T16:14:44.186 INFO:tasks.workunit.client.0.vm03.stdout:3/671: rename d5/d1e/fa4 to d5/d6d/d6a/fc6 0 2026-03-09T16:14:44.187 INFO:tasks.workunit.client.0.vm03.stdout:3/672: write d5/d1e/d42/f25 [3482537,67664] 0 2026-03-09T16:14:44.193 INFO:tasks.workunit.client.0.vm03.stdout:6/657: truncate d9/d42/fa6 81306 0 2026-03-09T16:14:44.195 INFO:tasks.workunit.client.0.vm03.stdout:7/622: write d4/da/d18/d22/d24/f59 [2190564,64537] 0 2026-03-09T16:14:44.198 INFO:tasks.workunit.client.0.vm03.stdout:7/623: dwrite d4/da/d5d/f9b [4194304,4194304] 0 2026-03-09T16:14:44.210 INFO:tasks.workunit.client.0.vm03.stdout:4/704: unlink d5/cc 0 2026-03-09T16:14:44.213 INFO:tasks.workunit.client.0.vm03.stdout:5/755: creat d2/d7/de9/ffe x:0 0 0 2026-03-09T16:14:44.222 INFO:tasks.workunit.client.0.vm03.stdout:3/673: creat d5/d1e/d42/d34/d70/fc7 x:0 0 0 2026-03-09T16:14:44.222 INFO:tasks.workunit.client.0.vm03.stdout:0/670: dwrite d0/d7/f56 [0,4194304] 0 2026-03-09T16:14:44.224 INFO:tasks.workunit.client.0.vm03.stdout:1/593: dwrite d4/d6/f15 [0,4194304] 0 2026-03-09T16:14:44.228 INFO:tasks.workunit.client.0.vm03.stdout:1/594: dread - d4/d6/d1d/d20/fc1 zero size 2026-03-09T16:14:44.252 INFO:tasks.workunit.client.0.vm03.stdout:6/658: unlink d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/fc2 0 2026-03-09T16:14:44.256 INFO:tasks.workunit.client.0.vm03.stdout:7/624: mknod d4/da/d45/d51/d36/cce 0 2026-03-09T16:14:44.260 INFO:tasks.workunit.client.0.vm03.stdout:5/756: mknod d2/d7/d8/d24/d27/d43/d4b/cff 0 2026-03-09T16:14:44.261 INFO:tasks.workunit.client.0.vm03.stdout:5/757: write d2/d7/d8/f36 [4211131,14464] 0 2026-03-09T16:14:44.267 INFO:tasks.workunit.client.0.vm03.stdout:6/659: dread d9/d42/d45/d50/d80/fa1 [0,4194304] 0 2026-03-09T16:14:44.275 INFO:tasks.workunit.client.0.vm03.stdout:8/707: write da/d32/f66 [1134803,3320] 0 2026-03-09T16:14:44.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:44 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:44.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:44 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:44.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:44 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:14:44.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:44 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:14:44.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:44 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:44.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:44 vm05.local ceph-mon[58702]: from='mgr.24357 
192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:44.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:44 vm05.local ceph-mon[58702]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mgr fail", "who": "vm05.dygxfv"}]: dispatch 2026-03-09T16:14:44.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:44 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mgr fail", "who": "vm05.dygxfv"}]: dispatch 2026-03-09T16:14:44.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:44 vm05.local ceph-mon[58702]: Activating manager daemon vm03.gbgzmu 2026-03-09T16:14:44.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:44 vm05.local ceph-mon[58702]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' cmd='[{"prefix": "mgr fail", "who": "vm05.dygxfv"}]': finished 2026-03-09T16:14:44.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:44 vm05.local ceph-mon[58702]: osdmap e44: 6 total, 6 up, 6 in 2026-03-09T16:14:44.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:44 vm05.local ceph-mon[58702]: mgrmap e25: vm03.gbgzmu(active, starting, since 0.030576s) 2026-03-09T16:14:44.281 INFO:tasks.workunit.client.0.vm03.stdout:7/625: rename d4/da/d18/d22/d24/d16/d3e/f75 to d4/da/d18/d22/d24/d16/d3e/db5/fcf 0 2026-03-09T16:14:44.283 INFO:tasks.workunit.client.0.vm03.stdout:4/705: creat d5/db/d25/d31/d4d/d5b/d72/dcb/fd3 x:0 0 0 2026-03-09T16:14:44.283 INFO:tasks.workunit.client.0.vm03.stdout:4/706: fdatasync d5/d17/f2b 0 2026-03-09T16:14:44.287 INFO:tasks.workunit.client.0.vm03.stdout:5/758: sync 2026-03-09T16:14:44.288 INFO:tasks.workunit.client.0.vm03.stdout:7/626: sync 2026-03-09T16:14:44.289 INFO:tasks.workunit.client.0.vm03.stdout:7/627: chown d4/da/d18/d22/d24/d16/d3e/d77/ca5 3 1 2026-03-09T16:14:44.289 INFO:tasks.workunit.client.0.vm03.stdout:7/628: chown d4/da/d5d/db0/l7c 169966 1 2026-03-09T16:14:44.295 INFO:tasks.workunit.client.0.vm03.stdout:9/730: write d2/d4/d11/d12/d28/f2f [5090,45456] 0 2026-03-09T16:14:44.302 INFO:tasks.workunit.client.0.vm03.stdout:0/671: dwrite d0/d7/d3e/d57/d5a/d5f/db2/f5e [0,4194304] 0 2026-03-09T16:14:44.308 INFO:tasks.workunit.client.0.vm03.stdout:6/660: truncate d9/d42/d45/d50/d80/d8a/d9c/f6a 2001850 0 2026-03-09T16:14:44.308 INFO:tasks.workunit.client.0.vm03.stdout:8/708: rmdir da/d6c/dc4 39 2026-03-09T16:14:44.315 INFO:tasks.workunit.client.0.vm03.stdout:4/707: chown d5/dd/d1f/l3f 73 1 2026-03-09T16:14:44.319 INFO:tasks.workunit.client.0.vm03.stdout:4/708: dwrite d5/d17/d44/f84 [0,4194304] 0 2026-03-09T16:14:44.321 INFO:tasks.workunit.client.0.vm03.stdout:2/680: link db/cc db/d12/cf4 0 2026-03-09T16:14:44.328 INFO:tasks.workunit.client.0.vm03.stdout:0/672: mkdir d0/d7/d3e/d57/de9 0 2026-03-09T16:14:44.329 INFO:tasks.workunit.client.0.vm03.stdout:0/673: stat d0/da/d1b/dc8 0 2026-03-09T16:14:44.329 INFO:tasks.workunit.client.0.vm03.stdout:3/674: creat d5/fc8 x:0 0 0 2026-03-09T16:14:44.330 INFO:tasks.workunit.client.0.vm03.stdout:6/661: readlink d9/l46 0 2026-03-09T16:14:44.340 INFO:tasks.workunit.client.0.vm03.stdout:6/662: readlink d9/d14/l39 0 2026-03-09T16:14:44.340 INFO:tasks.workunit.client.0.vm03.stdout:9/731: creat d2/d4/d11/d12/db2/dce/fde x:0 0 0 2026-03-09T16:14:44.340 INFO:tasks.workunit.client.0.vm03.stdout:0/674: creat d0/da/d5c/db6/fea x:0 0 0 2026-03-09T16:14:44.340 INFO:tasks.workunit.client.0.vm03.stdout:9/732: dwrite d2/df/f22 [0,4194304] 0 2026-03-09T16:14:44.342 
INFO:tasks.workunit.client.0.vm03.stdout:1/595: getdents d4/d6/d3b 0 2026-03-09T16:14:44.343 INFO:tasks.workunit.client.0.vm03.stdout:1/596: dread - d4/d6/da2/fbb zero size 2026-03-09T16:14:44.344 INFO:tasks.workunit.client.0.vm03.stdout:8/709: getdents da/d10/d28/db1/dce/de8 0 2026-03-09T16:14:44.345 INFO:tasks.workunit.client.0.vm03.stdout:0/675: sync 2026-03-09T16:14:44.348 INFO:tasks.workunit.client.0.vm03.stdout:7/629: link d4/da/d5d/db0/d9d/fac d4/da/d5d/db0/d9d/dc9/fd0 0 2026-03-09T16:14:44.350 INFO:tasks.workunit.client.0.vm03.stdout:3/675: mkdir d5/d53/d6c/d79/d91/dc9 0 2026-03-09T16:14:44.359 INFO:tasks.workunit.client.0.vm03.stdout:5/759: getdents d2/d7/de 0 2026-03-09T16:14:44.393 INFO:tasks.workunit.client.0.vm03.stdout:2/681: dwrite db/d12/d2a/d61/d6d/f8f [0,4194304] 0 2026-03-09T16:14:44.395 INFO:tasks.workunit.client.0.vm03.stdout:6/663: write d9/d42/d45/d50/d80/d8a/d9c/d97/f9d [947107,50554] 0 2026-03-09T16:14:44.408 INFO:tasks.workunit.client.0.vm03.stdout:8/710: write da/d10/d28/f5c [4184044,64536] 0 2026-03-09T16:14:44.414 INFO:tasks.workunit.client.0.vm03.stdout:0/676: dwrite d0/da/d1b/d9b/f87 [0,4194304] 0 2026-03-09T16:14:44.420 INFO:tasks.workunit.client.0.vm03.stdout:0/677: dread d0/da/d5c/fae [0,4194304] 0 2026-03-09T16:14:44.427 INFO:tasks.workunit.client.0.vm03.stdout:7/630: dwrite d4/da/d45/d51/f91 [0,4194304] 0 2026-03-09T16:14:44.435 INFO:tasks.workunit.client.0.vm03.stdout:1/597: dwrite d4/d6/d3b/d6b/d25/f84 [0,4194304] 0 2026-03-09T16:14:44.441 INFO:tasks.workunit.client.0.vm03.stdout:1/598: read d4/d6/d1d/d20/d93/f48 [464614,10499] 0 2026-03-09T16:14:44.445 INFO:tasks.workunit.client.0.vm03.stdout:1/599: dwrite d4/d6/d1d/d20/f2a [0,4194304] 0 2026-03-09T16:14:44.448 INFO:tasks.workunit.client.0.vm03.stdout:1/600: chown d4/d6/d3b/d6b/da5 659 1 2026-03-09T16:14:44.480 INFO:tasks.workunit.client.0.vm03.stdout:9/733: rename d2/d4/d11/d12/lb1 to d2/d54/ldf 0 2026-03-09T16:14:44.485 INFO:tasks.workunit.client.0.vm03.stdout:3/676: symlink d5/d1e/d42/d8b/lca 0 2026-03-09T16:14:44.488 INFO:tasks.workunit.client.0.vm03.stdout:2/682: rmdir db/d12/d2a/d61/d79/d83/d64 39 2026-03-09T16:14:44.488 INFO:tasks.workunit.client.0.vm03.stdout:2/683: read db/d12/d2a/d61/dbe/fe6 [4097558,77884] 0 2026-03-09T16:14:44.490 INFO:tasks.workunit.client.0.vm03.stdout:6/664: readlink d9/d14/l61 0 2026-03-09T16:14:44.493 INFO:tasks.workunit.client.0.vm03.stdout:8/711: dread - da/d10/d28/d4f/d68/fc1 zero size 2026-03-09T16:14:44.507 INFO:tasks.workunit.client.0.vm03.stdout:0/678: dread d0/da/f8b [0,4194304] 0 2026-03-09T16:14:44.517 INFO:tasks.workunit.client.0.vm03.stdout:7/631: dwrite d4/da/d45/d51/d36/f6f [0,4194304] 0 2026-03-09T16:14:44.525 INFO:tasks.workunit.client.0.vm03.stdout:5/760: rename d2/d7/de9/ffe to d2/d7/d8/d24/d27/d43/f100 0 2026-03-09T16:14:44.525 INFO:tasks.workunit.client.0.vm03.stdout:5/761: chown d2/d7/de/d11/d19/d29/d90/dbe/df5 2644 1 2026-03-09T16:14:44.531 INFO:tasks.workunit.client.0.vm03.stdout:4/709: mkdir d5/db/d25/d8b/da8/d81/dd4 0 2026-03-09T16:14:44.538 INFO:tasks.workunit.client.0.vm03.stdout:2/684: readlink db/d12/d2a/d61/d79/d83/l3e 0 2026-03-09T16:14:44.538 INFO:tasks.workunit.client.0.vm03.stdout:2/685: chown db/d12/d2a/d61/d6d/d8c/d94 855943 1 2026-03-09T16:14:44.546 INFO:tasks.workunit.client.0.vm03.stdout:6/665: write d9/d14/f44 [1534122,72660] 0 2026-03-09T16:14:44.550 INFO:tasks.workunit.client.0.vm03.stdout:8/712: read da/d10/d28/d4f/daf/fb7 [6788712,122467] 0 2026-03-09T16:14:44.551 INFO:tasks.workunit.client.0.vm03.stdout:8/713: write 
da/d10/fa4 [5078276,59979] 0 2026-03-09T16:14:44.551 INFO:tasks.workunit.client.0.vm03.stdout:8/714: dread - da/d6c/fae zero size 2026-03-09T16:14:44.562 INFO:tasks.workunit.client.0.vm03.stdout:0/679: rmdir d0/d7/d3e/d57/d5a/d5f/db2/dcf 39 2026-03-09T16:14:44.566 INFO:tasks.workunit.client.0.vm03.stdout:0/680: dwrite d0/da/d1b/d9b/f87 [0,4194304] 0 2026-03-09T16:14:44.575 INFO:tasks.workunit.client.0.vm03.stdout:7/632: creat d4/da/d18/d22/d24/d15/d71/fd1 x:0 0 0 2026-03-09T16:14:44.576 INFO:tasks.workunit.client.0.vm03.stdout:7/633: dread - d4/da/d5d/db0/d61/dca/fcc zero size 2026-03-09T16:14:44.584 INFO:tasks.workunit.client.0.vm03.stdout:9/734: mkdir d2/d4/d11/d29/d2a/db3/dbe/de0 0 2026-03-09T16:14:44.595 INFO:tasks.workunit.client.0.vm03.stdout:4/710: dread d5/db/d25/d31/d33/d79/f4f [0,4194304] 0 2026-03-09T16:14:44.607 INFO:tasks.workunit.client.0.vm03.stdout:2/686: dwrite f7 [0,4194304] 0 2026-03-09T16:14:44.609 INFO:tasks.workunit.client.0.vm03.stdout:2/687: chown db/d12/d2a/d61/d79/d83/fe9 1037438097 1 2026-03-09T16:14:44.621 INFO:tasks.workunit.client.0.vm03.stdout:7/634: creat d4/da/d5d/db0/d61/fd2 x:0 0 0 2026-03-09T16:14:44.624 INFO:tasks.workunit.client.0.vm03.stdout:1/601: link d4/d31/d5c/da8/l65 d4/d39/d70/lc9 0 2026-03-09T16:14:44.630 INFO:tasks.workunit.client.0.vm03.stdout:1/602: dwrite d4/db/d59/f9d [0,4194304] 0 2026-03-09T16:14:44.634 INFO:tasks.workunit.client.0.vm03.stdout:5/762: mknod d2/d7/de/d11/d19/d29/d90/dbe/df5/c101 0 2026-03-09T16:14:44.638 INFO:tasks.workunit.client.0.vm03.stdout:9/735: creat d2/d4/d11/d29/d2a/db3/fe1 x:0 0 0 2026-03-09T16:14:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:44 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:44 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:44 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:14:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:44 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:14:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:44 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' 2026-03-09T16:14:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:44 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:44 vm03.local ceph-mon[51019]: from='mgr.24357 192.168.123.105:0/1754351883' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mgr fail", "who": "vm05.dygxfv"}]: dispatch 2026-03-09T16:14:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:44 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "mgr fail", "who": "vm05.dygxfv"}]: dispatch 2026-03-09T16:14:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:44 vm03.local ceph-mon[51019]: Activating manager daemon vm03.gbgzmu 2026-03-09T16:14:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:44 vm03.local ceph-mon[51019]: from='mgr.24357 ' entity='mgr.vm05.dygxfv' cmd='[{"prefix": "mgr 
fail", "who": "vm05.dygxfv"}]': finished 2026-03-09T16:14:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:44 vm03.local ceph-mon[51019]: osdmap e44: 6 total, 6 up, 6 in 2026-03-09T16:14:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:44 vm03.local ceph-mon[51019]: mgrmap e25: vm03.gbgzmu(active, starting, since 0.030576s) 2026-03-09T16:14:44.645 INFO:tasks.workunit.client.0.vm03.stdout:0/681: dwrite d0/da/d7a/fac [0,4194304] 0 2026-03-09T16:14:44.647 INFO:tasks.workunit.client.0.vm03.stdout:0/682: write d0/da/d1b/d9b/f61 [3717643,29631] 0 2026-03-09T16:14:44.655 INFO:tasks.workunit.client.0.vm03.stdout:2/688: fdatasync db/f23 0 2026-03-09T16:14:44.658 INFO:tasks.workunit.client.0.vm03.stdout:8/715: symlink da/d32/db5/lea 0 2026-03-09T16:14:44.671 INFO:tasks.workunit.client.0.vm03.stdout:7/635: dread d4/da/d45/d51/f5b [0,4194304] 0 2026-03-09T16:14:44.672 INFO:tasks.workunit.client.0.vm03.stdout:1/603: creat d4/db/d59/fca x:0 0 0 2026-03-09T16:14:44.673 INFO:tasks.workunit.client.0.vm03.stdout:1/604: fsync d4/db/f7d 0 2026-03-09T16:14:44.678 INFO:tasks.workunit.client.0.vm03.stdout:6/666: truncate d9/d14/f3d 181876 0 2026-03-09T16:14:44.688 INFO:tasks.workunit.client.0.vm03.stdout:3/677: getdents d5/d6d/d6a 0 2026-03-09T16:14:44.688 INFO:tasks.workunit.client.0.vm03.stdout:5/763: dwrite d2/d7/d8/d16/d5c/f94 [0,4194304] 0 2026-03-09T16:14:44.698 INFO:tasks.workunit.client.0.vm03.stdout:4/711: truncate d5/d56/f6a 237946 0 2026-03-09T16:14:44.707 INFO:tasks.workunit.client.0.vm03.stdout:8/716: symlink da/d10/d28/d4f/leb 0 2026-03-09T16:14:44.709 INFO:tasks.workunit.client.0.vm03.stdout:7/636: rmdir d4/da/d18/d22/d24/d16/d3e/db5 39 2026-03-09T16:14:44.715 INFO:tasks.workunit.client.0.vm03.stdout:6/667: symlink d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/lcc 0 2026-03-09T16:14:44.719 INFO:tasks.workunit.client.0.vm03.stdout:9/736: truncate d2/d4/d1f/f44 462076 0 2026-03-09T16:14:44.727 INFO:tasks.workunit.client.0.vm03.stdout:5/764: creat d2/d7/de/d11/dbf/f102 x:0 0 0 2026-03-09T16:14:44.732 INFO:tasks.workunit.client.0.vm03.stdout:3/678: dwrite d5/d1e/f31 [0,4194304] 0 2026-03-09T16:14:44.735 INFO:tasks.workunit.client.0.vm03.stdout:4/712: mkdir d5/dd/dd5 0 2026-03-09T16:14:44.736 INFO:tasks.workunit.client.0.vm03.stdout:4/713: chown d5/db/d25/d31/d33/c3b 5 1 2026-03-09T16:14:44.737 INFO:tasks.workunit.client.0.vm03.stdout:0/683: getdents d0/d7/d3e/d95 0 2026-03-09T16:14:44.740 INFO:tasks.workunit.client.0.vm03.stdout:0/684: dwrite d0/fde [0,4194304] 0 2026-03-09T16:14:44.747 INFO:tasks.workunit.client.0.vm03.stdout:0/685: dwrite d0/d7/d3e/d57/d5a/d52/d9f/fd3 [0,4194304] 0 2026-03-09T16:14:44.754 INFO:tasks.workunit.client.0.vm03.stdout:2/689: mkdir db/d12/d2a/d61/d79/d83/d64/dbd/df5 0 2026-03-09T16:14:44.758 INFO:tasks.workunit.client.0.vm03.stdout:2/690: fsync db/d12/da5/dc2/dc9/ff1 0 2026-03-09T16:14:44.760 INFO:tasks.workunit.client.0.vm03.stdout:8/717: rmdir da/db 39 2026-03-09T16:14:44.770 INFO:tasks.workunit.client.0.vm03.stdout:6/668: creat d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/fcd x:0 0 0 2026-03-09T16:14:44.773 INFO:tasks.workunit.client.0.vm03.stdout:9/737: read d2/d4/d11/d29/f70 [184783,49159] 0 2026-03-09T16:14:44.779 INFO:tasks.workunit.client.0.vm03.stdout:5/765: creat d2/d7/de9/f103 x:0 0 0 2026-03-09T16:14:44.783 INFO:tasks.workunit.client.0.vm03.stdout:3/679: truncate d5/d53/d6c/d79/f9d 104605 0 2026-03-09T16:14:44.784 INFO:tasks.workunit.client.0.vm03.stdout:3/680: write d5/d1e/d42/d34/fad [974453,95015] 0 2026-03-09T16:14:44.785 
INFO:tasks.workunit.client.0.vm03.stdout:3/681: chown d5/d1e/d42/c32 12176 1 2026-03-09T16:14:44.790 INFO:tasks.workunit.client.0.vm03.stdout:4/714: mkdir d5/db/d25/d8b/dd6 0 2026-03-09T16:14:44.799 INFO:tasks.workunit.client.0.vm03.stdout:1/605: truncate d4/d6/d1d/d20/fc8 2516637 0 2026-03-09T16:14:44.800 INFO:tasks.workunit.client.0.vm03.stdout:1/606: write d4/d31/d5c/f9e [807661,111324] 0 2026-03-09T16:14:44.813 INFO:tasks.workunit.client.0.vm03.stdout:0/686: read - d0/d7/d3e/d57/d5a/d82/f8a zero size 2026-03-09T16:14:44.821 INFO:tasks.workunit.client.0.vm03.stdout:5/766: symlink d2/d7/de/d33/l104 0 2026-03-09T16:14:44.825 INFO:tasks.workunit.client.0.vm03.stdout:4/715: truncate d5/db/d25/d8b/fa3 109308 0 2026-03-09T16:14:44.828 INFO:tasks.workunit.client.0.vm03.stdout:1/607: unlink d4/d6/d1d/l96 0 2026-03-09T16:14:44.836 INFO:tasks.workunit.client.0.vm03.stdout:9/738: dread d2/d4/d11/d12/f35 [4194304,4194304] 0 2026-03-09T16:14:44.837 INFO:tasks.workunit.client.0.vm03.stdout:8/718: creat da/db/d30/dc7/fec x:0 0 0 2026-03-09T16:14:44.837 INFO:tasks.workunit.client.0.vm03.stdout:8/719: stat da/d10/d28/d4f/d68/fa7 0 2026-03-09T16:14:44.846 INFO:tasks.workunit.client.0.vm03.stdout:5/767: dread - d2/d7/de/d11/d19/d29/fa1 zero size 2026-03-09T16:14:44.852 INFO:tasks.workunit.client.0.vm03.stdout:0/687: link d0/da/d5c/db6/fea d0/d7/d75/feb 0 2026-03-09T16:14:44.854 INFO:tasks.workunit.client.0.vm03.stdout:2/691: creat db/d12/d2a/d61/d79/d83/d64/ff6 x:0 0 0 2026-03-09T16:14:44.856 INFO:tasks.workunit.client.0.vm03.stdout:9/739: mknod d2/d54/d7d/ce2 0 2026-03-09T16:14:44.859 INFO:tasks.workunit.client.0.vm03.stdout:7/637: getdents d4/da/d5d/db0/d61 0 2026-03-09T16:14:44.863 INFO:tasks.workunit.client.0.vm03.stdout:3/682: rename d5/c2f to d5/d6d/ccb 0 2026-03-09T16:14:44.863 INFO:tasks.workunit.client.0.vm03.stdout:4/716: rename d5/d17 to d5/d17/d44/dd7 22 2026-03-09T16:14:44.864 INFO:tasks.workunit.client.0.vm03.stdout:4/717: chown d5/db/d25/d31/d4d/d5b/d72/d77/f91 160012 1 2026-03-09T16:14:44.868 INFO:tasks.workunit.client.0.vm03.stdout:2/692: mkdir db/d12/d2a/d61/d6d/d8c/d94/dad/db3/df7 0 2026-03-09T16:14:44.872 INFO:tasks.workunit.client.0.vm03.stdout:9/740: symlink d2/d4/d11/d12/db2/le3 0 2026-03-09T16:14:44.874 INFO:tasks.workunit.client.0.vm03.stdout:1/608: sync 2026-03-09T16:14:44.874 INFO:tasks.workunit.client.0.vm03.stdout:3/683: sync 2026-03-09T16:14:44.874 INFO:tasks.workunit.client.0.vm03.stdout:1/609: chown d4/d7b/l9c 10 1 2026-03-09T16:14:44.878 INFO:tasks.workunit.client.0.vm03.stdout:6/669: truncate d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f62 3568792 0 2026-03-09T16:14:44.903 INFO:tasks.workunit.client.0.vm03.stdout:9/741: truncate d2/d4/d11/d29/d2a/d38/f74 3976815 0 2026-03-09T16:14:44.907 INFO:tasks.workunit.client.0.vm03.stdout:2/693: rename db/d12/d2a/d61/d6d/d8c to db/d12/d2a/d61/d79/d83/d64/dbd/dec/df8 0 2026-03-09T16:14:44.910 INFO:tasks.workunit.client.0.vm03.stdout:2/694: dwrite db/d12/f49 [4194304,4194304] 0 2026-03-09T16:14:44.922 INFO:tasks.workunit.client.0.vm03.stdout:5/768: truncate d2/d75/fd0 232332 0 2026-03-09T16:14:44.924 INFO:tasks.workunit.client.0.vm03.stdout:0/688: write d0/d7/d3e/d57/f90 [3081893,15523] 0 2026-03-09T16:14:44.924 INFO:tasks.workunit.client.0.vm03.stdout:0/689: chown d0/d7/d3e/d57/d5a/d52/l7d 70906 1 2026-03-09T16:14:44.925 INFO:tasks.workunit.client.0.vm03.stdout:0/690: fdatasync d0/fde 0 2026-03-09T16:14:44.928 INFO:tasks.workunit.client.0.vm03.stdout:1/610: rmdir d4/db/d8b 39 2026-03-09T16:14:44.950 
INFO:tasks.workunit.client.0.vm03.stdout:1/611: dread d4/d6/d1d/d20/d93/f48 [0,4194304] 0 2026-03-09T16:14:44.952 INFO:tasks.workunit.client.0.vm03.stdout:7/638: write d4/da/d5d/db0/d61/f84 [1226138,83329] 0 2026-03-09T16:14:44.953 INFO:tasks.workunit.client.0.vm03.stdout:7/639: chown d4/d2d/d4b/cc1 58 1 2026-03-09T16:14:44.962 INFO:tasks.workunit.client.0.vm03.stdout:5/769: mknod d2/d7/de/d11/d19/d29/d90/db6/c105 0 2026-03-09T16:14:44.963 INFO:tasks.workunit.client.0.vm03.stdout:4/718: write d5/fa [3879766,88695] 0 2026-03-09T16:14:44.963 INFO:tasks.workunit.client.0.vm03.stdout:9/742: write d2/d4/d11/d29/d2a/d46/f9e [5142287,93576] 0 2026-03-09T16:14:44.965 INFO:tasks.workunit.client.0.vm03.stdout:9/743: read d2/d4/d11/f13 [4575061,45939] 0 2026-03-09T16:14:44.965 INFO:tasks.workunit.client.0.vm03.stdout:9/744: readlink d2/d4/d11/d29/d2a/d38/la6 0 2026-03-09T16:14:44.976 INFO:tasks.workunit.client.0.vm03.stdout:8/720: getdents da/d1d/d3b 0 2026-03-09T16:14:44.977 INFO:tasks.workunit.client.0.vm03.stdout:9/745: sync 2026-03-09T16:14:44.980 INFO:tasks.workunit.client.0.vm03.stdout:8/721: dwrite da/f52 [0,4194304] 0 2026-03-09T16:14:44.986 INFO:tasks.workunit.client.0.vm03.stdout:0/691: write d0/da/f1c [497279,52171] 0 2026-03-09T16:14:44.987 INFO:tasks.workunit.client.0.vm03.stdout:6/670: truncate d9/d42/d45/d50/d80/d8a/d9c/d97/f99 2645927 0 2026-03-09T16:14:44.997 INFO:tasks.workunit.client.0.vm03.stdout:5/770: unlink d2/d7/d1a/fde 0 2026-03-09T16:14:45.001 INFO:tasks.workunit.client.0.vm03.stdout:4/719: mknod d5/dd/d1f/d95/cd8 0 2026-03-09T16:14:45.003 INFO:tasks.workunit.client.0.vm03.stdout:3/684: creat d5/d53/fcc x:0 0 0 2026-03-09T16:14:45.008 INFO:tasks.workunit.client.0.vm03.stdout:2/695: write db/d12/d2a/d61/d79/d83/f87 [44190,25281] 0 2026-03-09T16:14:45.010 INFO:tasks.workunit.client.0.vm03.stdout:2/696: truncate db/d12/d2a/d61/d79/d83/d64/dbd/fd8 963594 0 2026-03-09T16:14:45.017 INFO:tasks.workunit.client.0.vm03.stdout:8/722: creat da/d10/d28/d64/fed x:0 0 0 2026-03-09T16:14:45.017 INFO:tasks.workunit.client.0.vm03.stdout:8/723: readlink da/db/d43/lc3 0 2026-03-09T16:14:45.020 INFO:tasks.workunit.client.0.vm03.stdout:6/671: read d9/d42/d45/d50/d80/fa1 [2847484,84615] 0 2026-03-09T16:14:45.020 INFO:tasks.workunit.client.0.vm03.stdout:1/612: getdents d4/d6/d1d/db5 0 2026-03-09T16:14:45.025 INFO:tasks.workunit.client.0.vm03.stdout:8/724: sync 2026-03-09T16:14:45.044 INFO:tasks.workunit.client.0.vm03.stdout:6/672: dread d9/d14/f1d [0,4194304] 0 2026-03-09T16:14:45.046 INFO:tasks.workunit.client.0.vm03.stdout:6/673: stat d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/fb2 0 2026-03-09T16:14:45.047 INFO:tasks.workunit.client.0.vm03.stdout:2/697: dwrite db/d12/d2a/d61/d79/d83/d64/dbd/dec/df8/d94/dad/fb5 [0,4194304] 0 2026-03-09T16:14:45.051 INFO:tasks.workunit.client.0.vm03.stdout:2/698: read - db/d12/d2a/d61/d79/d83/d64/dbd/dec/df8/d94/dad/db3/fe1 zero size 2026-03-09T16:14:45.052 INFO:tasks.workunit.client.0.vm03.stdout:5/771: truncate d2/d7/d1a/d1c/d3f/f92 1736547 0 2026-03-09T16:14:45.055 INFO:tasks.workunit.client.0.vm03.stdout:8/725: stat da/d32/cc9 0 2026-03-09T16:14:45.057 INFO:tasks.workunit.client.0.vm03.stdout:1/613: dwrite d4/d6/d3b/d63/f7e [0,4194304] 0 2026-03-09T16:14:45.058 INFO:tasks.workunit.client.0.vm03.stdout:9/746: link d2/de/f85 d2/d4/d11/d12/db2/fe4 0 2026-03-09T16:14:45.063 INFO:tasks.workunit.client.0.vm03.stdout:0/692: creat d0/fec x:0 0 0 2026-03-09T16:14:45.065 INFO:tasks.workunit.client.0.vm03.stdout:9/747: dwrite d2/de/d88/f86 [0,4194304] 0 
2026-03-09T16:14:45.074 INFO:tasks.workunit.client.0.vm03.stdout:7/640: link d4/f8 d4/da/d18/d22/d24/d15/fd3 0 2026-03-09T16:14:45.095 INFO:tasks.workunit.client.0.vm03.stdout:8/726: mkdir da/d10/d28/d4f/daf/dee 0 2026-03-09T16:14:45.101 INFO:tasks.workunit.client.0.vm03.stdout:1/614: rmdir d4/d6/d1d/d20 39 2026-03-09T16:14:45.101 INFO:tasks.workunit.client.0.vm03.stdout:1/615: stat d4/d6/d3b/d6b/c2c 0 2026-03-09T16:14:45.101 INFO:tasks.workunit.client.0.vm03.stdout:1/616: dwrite d4/db/f7d [0,4194304] 0 2026-03-09T16:14:45.110 INFO:tasks.workunit.client.0.vm03.stdout:5/772: dread d2/d7/de/d11/f32 [4194304,4194304] 0 2026-03-09T16:14:45.118 INFO:tasks.workunit.client.0.vm03.stdout:0/693: mknod d0/da/d5c/ced 0 2026-03-09T16:14:45.120 INFO:tasks.workunit.client.0.vm03.stdout:0/694: truncate d0/d7/d3e/d57/d5a/d52/d9f/fe3 21987 0 2026-03-09T16:14:45.126 INFO:tasks.workunit.client.0.vm03.stdout:7/641: chown d4/da/d18/d22/d24/d16/d2b/c2e 0 1 2026-03-09T16:14:45.127 INFO:tasks.workunit.client.0.vm03.stdout:4/720: getdents d5/dd/d1f 0 2026-03-09T16:14:45.129 INFO:tasks.workunit.client.0.vm03.stdout:3/685: getdents d5/d1e/d42/d55/d86 0 2026-03-09T16:14:45.129 INFO:tasks.workunit.client.0.vm03.stdout:3/686: write d5/d6d/d6a/fa9 [186787,75369] 0 2026-03-09T16:14:45.136 INFO:tasks.workunit.client.0.vm03.stdout:6/674: rename d9/f1e to d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/dad/fce 0 2026-03-09T16:14:45.138 INFO:tasks.workunit.client.0.vm03.stdout:9/748: write d2/df/d89/f7e [4870316,7997] 0 2026-03-09T16:14:45.149 INFO:tasks.workunit.client.0.vm03.stdout:2/699: dwrite db/d12/f62 [0,4194304] 0 2026-03-09T16:14:45.155 INFO:tasks.workunit.client.0.vm03.stdout:1/617: creat d4/d39/d7f/fcb x:0 0 0 2026-03-09T16:14:45.156 INFO:tasks.workunit.client.0.vm03.stdout:5/773: write d2/d7/d8/d24/d27/fc3 [343959,109269] 0 2026-03-09T16:14:45.156 INFO:tasks.workunit.client.0.vm03.stdout:5/774: readlink d2/d7/l89 0 2026-03-09T16:14:45.162 INFO:tasks.workunit.client.0.vm03.stdout:8/727: dwrite da/d32/d79/f84 [0,4194304] 0 2026-03-09T16:14:45.165 INFO:tasks.workunit.client.0.vm03.stdout:0/695: fdatasync d0/da/d1b/f46 0 2026-03-09T16:14:45.184 INFO:tasks.workunit.client.0.vm03.stdout:2/700: dread - db/d12/d2a/d61/d79/d83/d52/f86 zero size 2026-03-09T16:14:45.185 INFO:tasks.workunit.client.0.vm03.stdout:2/701: chown db/d12/d2a/d61/d79/d83/d64/dbd/df5 34211513 1 2026-03-09T16:14:45.187 INFO:tasks.workunit.client.0.vm03.stdout:5/775: rmdir d2/d7/de 39 2026-03-09T16:14:45.192 INFO:tasks.workunit.client.0.vm03.stdout:0/696: symlink d0/da/d1b/lee 0 2026-03-09T16:14:45.197 INFO:tasks.workunit.client.0.vm03.stdout:7/642: mkdir d4/da/d18/d22/d24/d16/d3e/db5/dd4 0 2026-03-09T16:14:45.198 INFO:tasks.workunit.client.0.vm03.stdout:9/749: dwrite d2/d4/d11/d12/f68 [0,4194304] 0 2026-03-09T16:14:45.198 INFO:tasks.workunit.client.0.vm03.stdout:7/643: write d4/da/d45/fa4 [1239833,128302] 0 2026-03-09T16:14:45.205 INFO:tasks.workunit.client.0.vm03.stdout:4/721: symlink d5/db/d25/d8b/da8/ld9 0 2026-03-09T16:14:45.212 INFO:tasks.workunit.client.0.vm03.stdout:6/675: truncate d9/f40 2163126 0 2026-03-09T16:14:45.218 INFO:tasks.workunit.client.0.vm03.stdout:2/702: rename db/d12/d2a/d61/d79/d83 to db/d12/d2a/d99/de7/df9 0 2026-03-09T16:14:45.222 INFO:tasks.workunit.client.0.vm03.stdout:5/776: write d2/d7/de/d11/dbf/fc5 [1086801,64203] 0 2026-03-09T16:14:45.232 INFO:tasks.workunit.client.0.vm03.stdout:3/687: dwrite d5/d1e/f72 [0,4194304] 0 2026-03-09T16:14:45.238 INFO:tasks.workunit.client.0.vm03.stdout:8/728: mkdir da/def 0 2026-03-09T16:14:45.239 
INFO:tasks.workunit.client.0.vm03.stdout:8/729: dread - da/d32/dad/fd9 zero size 2026-03-09T16:14:45.239 INFO:tasks.workunit.client.0.vm03.stdout:7/644: rmdir d4/da/d5d/db0/da9 39 2026-03-09T16:14:45.244 INFO:tasks.workunit.client.0.vm03.stdout:7/645: dwrite d4/d2d/f8c [0,4194304] 0 2026-03-09T16:14:45.251 INFO:tasks.workunit.client.0.vm03.stdout:4/722: dread d5/f9 [0,4194304] 0 2026-03-09T16:14:45.252 INFO:tasks.workunit.client.0.vm03.stdout:4/723: readlink d5/db/d25/d31/d4d/d5b/d72/la2 0 2026-03-09T16:14:45.252 INFO:tasks.workunit.client.0.vm03.stdout:4/724: read - d5/dd/d1f/fbb zero size 2026-03-09T16:14:45.253 INFO:tasks.workunit.client.0.vm03.stdout:4/725: chown d5/d17/da0 6047628 1 2026-03-09T16:14:45.256 INFO:tasks.workunit.client.0.vm03.stdout:6/676: dread d9/f15 [0,4194304] 0 2026-03-09T16:14:45.258 INFO:tasks.workunit.client.0.vm03.stdout:1/618: link d4/d6/d1d/d20/d93/f85 d4/d6/d1d/d20/fcc 0 2026-03-09T16:14:45.265 INFO:tasks.workunit.client.0.vm03.stdout:2/703: fdatasync db/d12/d2a/d61/d6d/f81 0 2026-03-09T16:14:45.273 INFO:tasks.workunit.client.0.vm03.stdout:0/697: mkdir d0/d7/d3e/d57/d5a/d82/d89/def 0 2026-03-09T16:14:45.274 INFO:tasks.workunit.client.0.vm03.stdout:0/698: fdatasync d0/d7/d3e/d57/fdb 0 2026-03-09T16:14:45.283 INFO:tasks.workunit.client.0.vm03.stdout:7/646: creat d4/d2d/fd5 x:0 0 0 2026-03-09T16:14:45.287 INFO:tasks.workunit.client.0.vm03.stdout:6/677: creat d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/fcf x:0 0 0 2026-03-09T16:14:45.287 INFO:tasks.workunit.client.0.vm03.stdout:2/704: mkdir db/d12/d2a/d99/de7/df9/d64/dbd/dec/dfa 0 2026-03-09T16:14:45.289 INFO:tasks.workunit.client.0.vm03.stdout:5/777: rename d2/d7/de/d11/d38 to d2/d7/d8/d16/d5c/dfc/d106 0 2026-03-09T16:14:45.291 INFO:tasks.workunit.client.0.vm03.stdout:6/678: dwrite d9/d42/d45/d50/d80/d8a/d9c/f8c [0,4194304] 0 2026-03-09T16:14:45.293 INFO:tasks.workunit.client.0.vm03.stdout:3/688: mknod d5/ccd 0 2026-03-09T16:14:45.296 INFO:tasks.workunit.client.0.vm03.stdout:8/730: symlink da/d10/d28/d4f/d68/lf0 0 2026-03-09T16:14:45.298 INFO:tasks.workunit.client.0.vm03.stdout:1/619: dwrite d4/d6/d1d/d20/fcc [0,4194304] 0 2026-03-09T16:14:45.299 INFO:tasks.workunit.client.0.vm03.stdout:1/620: stat d4/f1b 0 2026-03-09T16:14:45.304 INFO:tasks.workunit.client.0.vm03.stdout:0/699: read d0/d7/d3e/d57/d5a/d5f/f71 [3529377,86216] 0 2026-03-09T16:14:45.308 INFO:tasks.workunit.client.0.vm03.stdout:6/679: dread d9/d42/d45/d50/d80/d90/f64 [0,4194304] 0 2026-03-09T16:14:45.311 INFO:tasks.workunit.client.0.vm03.stdout:3/689: truncate d5/d1e/d42/f2c 288062 0 2026-03-09T16:14:45.311 INFO:tasks.workunit.client.0.vm03.stdout:8/731: mkdir da/d32/df1 0 2026-03-09T16:14:45.312 INFO:tasks.workunit.client.0.vm03.stdout:6/680: dread - d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/fcf zero size 2026-03-09T16:14:45.318 INFO:tasks.workunit.client.0.vm03.stdout:0/700: dwrite d0/da/d1b/d9b/f61 [0,4194304] 0 2026-03-09T16:14:45.319 INFO:tasks.workunit.client.0.vm03.stdout:0/701: truncate d0/d7/d75/feb 932193 0 2026-03-09T16:14:45.322 INFO:tasks.workunit.client.0.vm03.stdout:0/702: chown d0/d7/d3e/d57/d5a/d5f/db2/f76 17691224 1 2026-03-09T16:14:45.327 INFO:tasks.workunit.client.0.vm03.stdout:2/705: symlink db/d12/d2a/d99/de7/df9/lfb 0 2026-03-09T16:14:45.329 INFO:tasks.workunit.client.0.vm03.stdout:2/706: chown db/d12/da5/de2/dd5/ld9 2593770 1 2026-03-09T16:14:45.330 INFO:tasks.workunit.client.0.vm03.stdout:5/778: sync 2026-03-09T16:14:45.332 INFO:tasks.workunit.client.0.vm03.stdout:5/779: readlink d2/d7/d3c/l83 0 2026-03-09T16:14:45.335 
INFO:tasks.workunit.client.0.vm03.stdout:5/780: write d2/d7/d8/d16/d5c/dcf/fe4 [1046725,66385] 0 2026-03-09T16:14:45.335 INFO:tasks.workunit.client.0.vm03.stdout:6/681: symlink d9/d42/d45/d50/d80/d8a/d9c/d97/ld0 0 2026-03-09T16:14:45.336 INFO:tasks.workunit.client.0.vm03.stdout:2/707: dwrite db/d12/d2a/d99/de7/df9/d64/dbd/f6b [0,4194304] 0 2026-03-09T16:14:45.342 INFO:tasks.workunit.client.0.vm03.stdout:3/690: dwrite d5/d1e/f9b [0,4194304] 0 2026-03-09T16:14:45.344 INFO:tasks.workunit.client.0.vm03.stdout:0/703: mknod d0/da/d5c/db6/cf0 0 2026-03-09T16:14:45.347 INFO:tasks.workunit.client.0.vm03.stdout:8/732: mknod da/db/cf2 0 2026-03-09T16:14:45.362 INFO:tasks.workunit.client.0.vm03.stdout:9/750: dwrite d2/d54/fba [4194304,4194304] 0 2026-03-09T16:14:45.362 INFO:tasks.workunit.client.0.vm03.stdout:3/691: mkdir d5/d6d/d5a/d63/dce 0 2026-03-09T16:14:45.362 INFO:tasks.workunit.client.0.vm03.stdout:5/781: link d2/d7/d8/d24/d27/d43/d4b/fd1 d2/d75/f107 0 2026-03-09T16:14:45.363 INFO:tasks.workunit.client.0.vm03.stdout:3/692: dwrite d5/d1e/d42/d55/f7e [0,4194304] 0 2026-03-09T16:14:45.372 INFO:tasks.workunit.client.0.vm03.stdout:2/708: mknod db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/dda/cfc 0 2026-03-09T16:14:45.375 INFO:tasks.workunit.client.0.vm03.stdout:8/733: symlink da/d1d/lf3 0 2026-03-09T16:14:45.377 INFO:tasks.workunit.client.0.vm03.stdout:9/751: mknod d2/d4/d11/d29/ce5 0 2026-03-09T16:14:45.380 INFO:tasks.workunit.client.0.vm03.stdout:5/782: chown d2/d7/c47 0 1 2026-03-09T16:14:45.401 INFO:tasks.workunit.client.0.vm03.stdout:6/682: link d9/d14/da5/lb6 d9/d42/d45/d50/d80/ld1 0 2026-03-09T16:14:45.402 INFO:tasks.workunit.client.0.vm03.stdout:0/704: mknod d0/da/d1b/de0/cf1 0 2026-03-09T16:14:45.406 INFO:tasks.workunit.client.0.vm03.stdout:2/709: creat db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/dda/ffd x:0 0 0 2026-03-09T16:14:45.407 INFO:tasks.workunit.client.0.vm03.stdout:2/710: fsync db/d12/d2a/d99/de7/df9/fe9 0 2026-03-09T16:14:45.410 INFO:tasks.workunit.client.0.vm03.stdout:9/752: fsync d2/de/d88/f75 0 2026-03-09T16:14:45.411 INFO:tasks.workunit.client.0.vm03.stdout:8/734: dread da/d32/d79/d95/fb6 [0,4194304] 0 2026-03-09T16:14:45.417 INFO:tasks.workunit.client.0.vm03.stdout:5/783: dread - d2/fdf zero size 2026-03-09T16:14:45.422 INFO:tasks.workunit.client.0.vm03.stdout:5/784: dwrite d2/d7/d8/d16/d5c/dfc/d106/d52/fc6 [0,4194304] 0 2026-03-09T16:14:45.424 INFO:tasks.workunit.client.0.vm03.stdout:4/726: truncate d5/fa 12471427 0 2026-03-09T16:14:45.425 INFO:tasks.workunit.client.0.vm03.stdout:7/647: write d4/da/d45/f63 [2186885,70373] 0 2026-03-09T16:14:45.429 INFO:tasks.workunit.client.0.vm03.stdout:6/683: dread d9/d42/d45/d50/d80/d90/f64 [0,4194304] 0 2026-03-09T16:14:45.434 INFO:tasks.workunit.client.0.vm03.stdout:2/711: dwrite db/d12/d2a/d61/d79/fb7 [0,4194304] 0 2026-03-09T16:14:45.435 INFO:tasks.workunit.client.0.vm03.stdout:9/753: truncate d2/f5a 1205645 0 2026-03-09T16:14:45.436 INFO:tasks.workunit.client.0.vm03.stdout:9/754: write d2/d4/d11/d12/db2/fd1 [836735,126304] 0 2026-03-09T16:14:45.438 INFO:tasks.workunit.client.0.vm03.stdout:8/735: truncate da/d45/faa 4615810 0 2026-03-09T16:14:45.451 INFO:tasks.workunit.client.0.vm03.stdout:5/785: fdatasync d2/fdf 0 2026-03-09T16:14:45.458 INFO:tasks.workunit.client.0.vm03.stdout:1/621: write d4/d6/d1d/d20/d23/f30 [4361221,26517] 0 2026-03-09T16:14:45.472 INFO:tasks.workunit.client.0.vm03.stdout:7/648: rmdir d4/d2d/d4b 39 2026-03-09T16:14:45.473 INFO:tasks.workunit.client.0.vm03.stdout:3/693: truncate d5/fb 2199357 0 
2026-03-09T16:14:45.474 INFO:tasks.workunit.client.0.vm03.stdout:3/694: read d5/f2b [3165791,110008] 0 2026-03-09T16:14:45.482 INFO:tasks.workunit.client.0.vm03.stdout:0/705: creat d0/d7/d3e/d57/d5a/d5f/db2/d8e/dba/ff2 x:0 0 0 2026-03-09T16:14:45.484 INFO:tasks.workunit.client.0.vm03.stdout:3/695: dread d5/d53/f96 [0,4194304] 0 2026-03-09T16:14:45.497 INFO:tasks.workunit.client.0.vm03.stdout:8/736: symlink da/d10/d28/d4f/d68/ddc/lf4 0 2026-03-09T16:14:45.512 INFO:tasks.workunit.client.0.vm03.stdout:4/727: dwrite d5/db/d25/f26 [0,4194304] 0 2026-03-09T16:14:45.528 INFO:tasks.workunit.client.0.vm03.stdout:3/696: rmdir d5/d53 39 2026-03-09T16:14:45.528 INFO:tasks.workunit.client.0.vm03.stdout:3/697: chown d5/d1e/d42/d55 69775 1 2026-03-09T16:14:45.531 INFO:tasks.workunit.client.0.vm03.stdout:6/684: symlink d9/d42/d45/d65/dbf/dc9/ld2 0 2026-03-09T16:14:45.533 INFO:tasks.workunit.client.0.vm03.stdout:2/712: symlink db/d12/d2a/d99/de7/df9/lfe 0 2026-03-09T16:14:45.535 INFO:tasks.workunit.client.0.vm03.stdout:8/737: creat da/d32/ff5 x:0 0 0 2026-03-09T16:14:45.535 INFO:tasks.workunit.client.0.vm03.stdout:8/738: stat da/d10/d28/d4f/d68/f8f 0 2026-03-09T16:14:45.547 INFO:tasks.workunit.client.0.vm03.stdout:8/739: dwrite da/d32/dad/fd9 [0,4194304] 0 2026-03-09T16:14:45.553 INFO:tasks.workunit.client.0.vm03.stdout:9/755: link d2/d4/d11/f13 d2/d4/d11/d29/d2a/d4d/fe6 0 2026-03-09T16:14:45.554 INFO:tasks.workunit.client.0.vm03.stdout:9/756: fdatasync d2/d54/d7d/d8f/fbb 0 2026-03-09T16:14:45.558 INFO:tasks.workunit.client.0.vm03.stdout:0/706: write d0/d7/d3e/d57/d5a/d82/d89/dbd/f7e [796044,7086] 0 2026-03-09T16:14:45.570 INFO:tasks.workunit.client.0.vm03.stdout:7/649: creat d4/d2d/d4b/fd6 x:0 0 0 2026-03-09T16:14:45.579 INFO:tasks.workunit.client.0.vm03.stdout:5/786: dwrite d2/d7/d3c/d3d/f93 [0,4194304] 0 2026-03-09T16:14:45.606 INFO:tasks.workunit.client.0.vm03.stdout:8/740: symlink da/d1d/d3b/lf6 0 2026-03-09T16:14:45.606 INFO:tasks.workunit.client.0.vm03.stdout:9/757: creat d2/d4/d11/d29/d2a/d38/db6/fe7 x:0 0 0 2026-03-09T16:14:45.607 INFO:tasks.workunit.client.0.vm03.stdout:0/707: creat d0/da/d1b/d9b/ff3 x:0 0 0 2026-03-09T16:14:45.607 INFO:tasks.workunit.client.0.vm03.stdout:0/708: stat d0/c22 0 2026-03-09T16:14:45.607 INFO:tasks.workunit.client.0.vm03.stdout:1/622: link d4/d6/d3b/d6b/d25/c6c d4/d6/d3b/d63/ccd 0 2026-03-09T16:14:45.608 INFO:tasks.workunit.client.0.vm03.stdout:7/650: rmdir d4/da/d45/d51 39 2026-03-09T16:14:45.608 INFO:tasks.workunit.client.0.vm03.stdout:5/787: mkdir d2/d7/d8/d16/d5c/dfc/d106/d108 0 2026-03-09T16:14:45.609 INFO:tasks.workunit.client.0.vm03.stdout:7/651: readlink d4/da/d45/l8e 0 2026-03-09T16:14:45.613 INFO:tasks.workunit.client.0.vm03.stdout:5/788: dread d2/d7/de/d11/dbf/fc5 [0,4194304] 0 2026-03-09T16:14:45.615 INFO:tasks.workunit.client.0.vm03.stdout:5/789: chown d2/d7/d1a 40 1 2026-03-09T16:14:45.615 INFO:tasks.workunit.client.0.vm03.stdout:5/790: chown d2/d7/d3c/c8c 23657 1 2026-03-09T16:14:45.628 INFO:tasks.workunit.client.0.vm03.stdout:4/728: dwrite d5/db/d25/d31/d33/fa5 [0,4194304] 0 2026-03-09T16:14:45.629 INFO:tasks.workunit.client.0.vm03.stdout:2/713: dwrite db/d12/d2a/d61/f4c [0,4194304] 0 2026-03-09T16:14:45.635 INFO:tasks.workunit.client.0.vm03.stdout:6/685: creat d9/d8e/fd3 x:0 0 0 2026-03-09T16:14:45.636 INFO:tasks.workunit.client.0.vm03.stdout:6/686: write d9/d42/fa0 [1606628,109147] 0 2026-03-09T16:14:45.653 INFO:tasks.workunit.client.0.vm03.stdout:9/758: creat d2/d4/d1f/d83/fe8 x:0 0 0 2026-03-09T16:14:45.654 
INFO:tasks.workunit.client.0.vm03.stdout:9/759: write d2/d4/d11/d29/d2a/d46/f9e [2881227,54081] 0 2026-03-09T16:14:45.655 INFO:tasks.workunit.client.0.vm03.stdout:8/741: dread da/d10/d28/d64/fab [0,4194304] 0 2026-03-09T16:14:45.658 INFO:tasks.workunit.client.0.vm03.stdout:0/709: write d0/d7/d3e/d57/d5a/d5f/db2/dcf/fd7 [431509,67386] 0 2026-03-09T16:14:45.677 INFO:tasks.workunit.client.0.vm03.stdout:1/623: write d4/d6/d3b/d63/f78 [4996051,45036] 0 2026-03-09T16:14:45.679 INFO:tasks.workunit.client.0.vm03.stdout:7/652: write d4/da/f20 [1083322,13964] 0 2026-03-09T16:14:45.686 INFO:tasks.workunit.client.0.vm03.stdout:3/698: rename d5/d53/d88/l89 to d5/d6d/d6a/dbd/lcf 0 2026-03-09T16:14:45.687 INFO:tasks.workunit.client.0.vm03.stdout:5/791: write d2/d7/d1a/d1c/f5e [9273944,52995] 0 2026-03-09T16:14:45.695 INFO:tasks.workunit.client.0.vm03.stdout:4/729: mknod d5/db/d25/d31/d4d/d5b/cda 0 2026-03-09T16:14:45.696 INFO:tasks.workunit.client.0.vm03.stdout:4/730: chown d5/db/f34 861984 1 2026-03-09T16:14:45.705 INFO:tasks.workunit.client.0.vm03.stdout:9/760: unlink d2/lc1 0 2026-03-09T16:14:45.713 INFO:tasks.workunit.client.0.vm03.stdout:0/710: dread d0/f4e [0,4194304] 0 2026-03-09T16:14:45.721 INFO:tasks.workunit.client.0.vm03.stdout:1/624: dread d4/d6/d3b/d6b/d25/f84 [0,4194304] 0 2026-03-09T16:14:45.724 INFO:tasks.workunit.client.0.vm03.stdout:7/653: dread - d4/da/d45/fb9 zero size 2026-03-09T16:14:45.728 INFO:tasks.workunit.client.0.vm03.stdout:3/699: creat d5/d1e/d42/d8b/fd0 x:0 0 0 2026-03-09T16:14:45.731 INFO:tasks.workunit.client.0.vm03.stdout:5/792: creat d2/d7/de/d11/d19/f109 x:0 0 0 2026-03-09T16:14:45.735 INFO:tasks.workunit.client.0.vm03.stdout:4/731: mknod d5/db/d25/d8b/da8/d81/cdb 0 2026-03-09T16:14:45.748 INFO:tasks.workunit.client.0.vm03.stdout:9/761: dwrite d2/df/f64 [0,4194304] 0 2026-03-09T16:14:45.751 INFO:tasks.workunit.client.0.vm03.stdout:8/742: creat da/d10/d28/d4f/daf/dee/ff7 x:0 0 0 2026-03-09T16:14:45.753 INFO:tasks.workunit.client.0.vm03.stdout:0/711: mkdir d0/d7/d3e/d57/d5a/d5f/db2/dab/df4 0 2026-03-09T16:14:45.754 INFO:tasks.workunit.client.0.vm03.stdout:1/625: dread - d4/db/d8b/fb0 zero size 2026-03-09T16:14:45.763 INFO:tasks.workunit.client.0.vm03.stdout:2/714: creat db/d12/fff x:0 0 0 2026-03-09T16:14:45.763 INFO:tasks.workunit.client.0.vm03.stdout:2/715: readlink db/d12/d2a/d61/la2 0 2026-03-09T16:14:45.763 INFO:tasks.workunit.client.0.vm03.stdout:2/716: fdatasync db/d12/d2a/f38 0 2026-03-09T16:14:45.764 INFO:tasks.workunit.client.0.vm03.stdout:2/717: write db/d12/d2a/d99/fcd [4193320,103969] 0 2026-03-09T16:14:45.765 INFO:tasks.workunit.client.0.vm03.stdout:2/718: write db/d12/fe8 [1094832,3924] 0 2026-03-09T16:14:45.771 INFO:tasks.workunit.client.0.vm03.stdout:7/654: write d4/d2d/f8c [4113723,113928] 0 2026-03-09T16:14:45.772 INFO:tasks.workunit.client.0.vm03.stdout:7/655: truncate d4/da/d5d/db0/d61/fd2 470346 0 2026-03-09T16:14:45.781 INFO:tasks.workunit.client.0.vm03.stdout:9/762: mkdir d2/d4/d11/d12/db2/dc2/de9 0 2026-03-09T16:14:45.783 INFO:tasks.workunit.client.0.vm03.stdout:8/743: truncate da/d1d/f4a 4785284 0 2026-03-09T16:14:45.790 INFO:tasks.workunit.client.0.vm03.stdout:1/626: rename d4/d6/d1d/d3d/c8f to d4/d39/cce 0 2026-03-09T16:14:45.791 INFO:tasks.workunit.client.0.vm03.stdout:0/712: write d0/da/d5c/f39 [4533105,73532] 0 2026-03-09T16:14:45.794 INFO:tasks.workunit.client.0.vm03.stdout:3/700: getdents d5/d6d/d5a/d63/dce 0 2026-03-09T16:14:45.797 INFO:tasks.workunit.client.0.vm03.stdout:5/793: unlink d2/c15 0 2026-03-09T16:14:45.797 
INFO:tasks.workunit.client.0.vm03.stdout:5/794: chown d2/d7/de/d11/d19/d29/d90/fac 878 1 2026-03-09T16:14:45.805 INFO:tasks.workunit.client.0.vm03.stdout:2/719: truncate db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4/fdc 469936 0 2026-03-09T16:14:45.807 INFO:tasks.workunit.client.0.vm03.stdout:2/720: dread db/d12/d2a/d61/d6d/f81 [0,4194304] 0 2026-03-09T16:14:45.813 INFO:tasks.workunit.client.0.vm03.stdout:4/732: symlink d5/db/d25/d8b/da8/d81/dd4/ldc 0 2026-03-09T16:14:45.816 INFO:tasks.workunit.client.0.vm03.stdout:6/687: getdents d9/d42/d45/d50 0 2026-03-09T16:14:45.817 INFO:tasks.workunit.client.0.vm03.stdout:9/763: truncate d2/df/f76 320584 0 2026-03-09T16:14:45.820 INFO:tasks.workunit.client.0.vm03.stdout:8/744: symlink da/d32/dad/lf8 0 2026-03-09T16:14:45.831 INFO:tasks.workunit.client.0.vm03.stdout:0/713: dwrite d0/d7/d48/f2e [0,4194304] 0 2026-03-09T16:14:45.833 INFO:tasks.workunit.client.0.vm03.stdout:0/714: chown d0/d7/d3e/d57/d5a/d82/d89/dc0/fe1 2493228 1 2026-03-09T16:14:45.837 INFO:tasks.workunit.client.0.vm03.stdout:5/795: creat d2/d7/de/d11/d19/d31/d35/d87/f10a x:0 0 0 2026-03-09T16:14:45.841 INFO:tasks.workunit.client.0.vm03.stdout:2/721: rename db/d12/d2a/d99/de7/df9/d64/dbd/f9e to db/d12/d2a/d61/d79/f100 0 2026-03-09T16:14:45.842 INFO:tasks.workunit.client.0.vm03.stdout:2/722: chown db/d12/d2a/d99/fd4 2536 1 2026-03-09T16:14:45.842 INFO:tasks.workunit.client.0.vm03.stdout:2/723: chown db/d12/d2a/d99/de7/df9/cee 0 1 2026-03-09T16:14:45.853 INFO:tasks.workunit.client.0.vm03.stdout:7/656: symlink d4/da/d18/ld7 0 2026-03-09T16:14:45.865 INFO:tasks.workunit.client.0.vm03.stdout:4/733: dwrite d5/f7 [0,4194304] 0 2026-03-09T16:14:45.879 INFO:tasks.workunit.client.0.vm03.stdout:8/745: dwrite da/db/f75 [0,4194304] 0 2026-03-09T16:14:45.897 INFO:tasks.workunit.client.0.vm03.stdout:0/715: creat d0/da/d7a/ff5 x:0 0 0 2026-03-09T16:14:45.898 INFO:tasks.workunit.client.0.vm03.stdout:0/716: chown d0/d7/d3e/d57/d5a/d5f/db2/dcf 1248 1 2026-03-09T16:14:45.915 INFO:tasks.workunit.client.0.vm03.stdout:7/657: rename d4/da/d18 to d4/da/d5d/dd8 0 2026-03-09T16:14:45.916 INFO:tasks.workunit.client.0.vm03.stdout:7/658: read d4/da/f5f [3149933,72900] 0 2026-03-09T16:14:45.925 INFO:tasks.workunit.client.0.vm03.stdout:9/764: dwrite d2/d4/d11/d29/d2a/d38/fa7 [0,4194304] 0 2026-03-09T16:14:45.927 INFO:tasks.workunit.client.0.vm03.stdout:7/659: dread d4/da/d45/f63 [0,4194304] 0 2026-03-09T16:14:45.937 INFO:tasks.workunit.client.0.vm03.stdout:6/688: fsync d9/f5c 0 2026-03-09T16:14:45.941 INFO:tasks.workunit.client.0.vm03.stdout:5/796: dwrite d2/d7/d3c/fdb [0,4194304] 0 2026-03-09T16:14:45.957 INFO:tasks.workunit.client.0.vm03.stdout:1/627: rmdir d4/d6/d3b/dc5 0 2026-03-09T16:14:45.960 INFO:tasks.workunit.client.0.vm03.stdout:3/701: link d5/c93 d5/d53/d6c/d79/cd1 0 2026-03-09T16:14:45.967 INFO:tasks.workunit.client.0.vm03.stdout:2/724: symlink db/d12/d2a/d99/de7/df9/d64/dbd/df5/l101 0 2026-03-09T16:14:45.967 INFO:tasks.workunit.client.0.vm03.stdout:2/725: chown db/d12/f84 11 1 2026-03-09T16:14:45.972 INFO:tasks.workunit.client.0.vm03.stdout:2/726: dwrite db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/dda/ffd [0,4194304] 0 2026-03-09T16:14:45.974 INFO:tasks.workunit.client.0.vm03.stdout:2/727: chown db/d12/d2a/d99/de7/df9/d52/fd0 117988 1 2026-03-09T16:14:45.975 INFO:tasks.workunit.client.0.vm03.stdout:2/728: read f7 [1317432,113110] 0 2026-03-09T16:14:45.984 INFO:tasks.workunit.client.0.vm03.stdout:0/717: write d0/d7/d48/fb8 [120332,44192] 0 2026-03-09T16:14:45.985 
INFO:tasks.workunit.client.0.vm03.stdout:0/718: read - d0/d7/d3e/d57/d5a/d82/d89/dbd/f9e zero size 2026-03-09T16:14:45.997 INFO:tasks.workunit.client.0.vm03.stdout:6/689: mkdir d9/d42/d45/d50/d80/d8a/dc1/dd4 0 2026-03-09T16:14:45.998 INFO:tasks.workunit.client.0.vm03.stdout:6/690: stat d9/d42/d45/d50/d80/d8a/d9c/d97/faf 0 2026-03-09T16:14:46.002 INFO:tasks.workunit.client.0.vm03.stdout:5/797: read d2/d7/d1a/d1c/d3f/f92 [1011150,98022] 0 2026-03-09T16:14:46.009 INFO:tasks.workunit.client.0.vm03.stdout:4/734: dwrite d5/db/d25/d8b/fa3 [0,4194304] 0 2026-03-09T16:14:46.017 INFO:tasks.workunit.client.0.vm03.stdout:1/628: rmdir d4/d6/d3b/d8e 39 2026-03-09T16:14:46.026 INFO:tasks.workunit.client.0.vm03.stdout:2/729: read db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/fa3 [84533,81486] 0 2026-03-09T16:14:46.032 INFO:tasks.workunit.client.0.vm03.stdout:9/765: creat d2/d4/d11/d12/db2/dbf/fea x:0 0 0 2026-03-09T16:14:46.036 INFO:tasks.workunit.client.0.vm03.stdout:8/746: write da/d45/faa [90954,92225] 0 2026-03-09T16:14:46.037 INFO:tasks.workunit.client.0.vm03.stdout:8/747: stat da/db/c7b 0 2026-03-09T16:14:46.037 INFO:tasks.workunit.client.0.vm03.stdout:8/748: chown da/d10/d28/d4f/d68/ld4 25602237 1 2026-03-09T16:14:46.038 INFO:tasks.workunit.client.0.vm03.stdout:8/749: fsync da/d10/d28/d4f/d68/fa7 0 2026-03-09T16:14:46.042 INFO:tasks.workunit.client.0.vm03.stdout:0/719: dwrite d0/d7/d3e/d57/d5a/d82/f8a [0,4194304] 0 2026-03-09T16:14:46.044 INFO:tasks.workunit.client.0.vm03.stdout:7/660: dread - d4/da/d45/d51/f9e zero size 2026-03-09T16:14:46.045 INFO:tasks.workunit.client.0.vm03.stdout:7/661: truncate d4/da/d5d/db0/d9d/fc2 754669 0 2026-03-09T16:14:46.072 INFO:tasks.workunit.client.0.vm03.stdout:6/691: dwrite d9/d42/d45/d50/d80/fa1 [0,4194304] 0 2026-03-09T16:14:46.073 INFO:tasks.workunit.client.0.vm03.stdout:6/692: readlink d9/d42/d45/d50/d80/d90/db7/lbc 0 2026-03-09T16:14:46.087 INFO:tasks.workunit.client.0.vm03.stdout:2/730: creat db/d12/d2a/d99/de7/df9/d64/dbd/df5/f102 x:0 0 0 2026-03-09T16:14:46.088 INFO:tasks.workunit.client.0.vm03.stdout:9/766: unlink d2/f7 0 2026-03-09T16:14:46.092 INFO:tasks.workunit.client.0.vm03.stdout:0/720: mkdir d0/d7/d3e/d57/d5a/d47/dce/df6 0 2026-03-09T16:14:46.097 INFO:tasks.workunit.client.0.vm03.stdout:5/798: unlink d2/d7/d8/d24/fb1 0 2026-03-09T16:14:46.100 INFO:tasks.workunit.client.0.vm03.stdout:1/629: mknod d4/db/ccf 0 2026-03-09T16:14:46.102 INFO:tasks.workunit.client.0.vm03.stdout:9/767: creat d2/d4/d11/d12/db2/dc2/feb x:0 0 0 2026-03-09T16:14:46.106 INFO:tasks.workunit.client.0.vm03.stdout:2/731: creat db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/f103 x:0 0 0 2026-03-09T16:14:46.110 INFO:tasks.workunit.client.0.vm03.stdout:3/702: getdents d5/d1e/d42/d55 0 2026-03-09T16:14:46.114 INFO:tasks.workunit.client.0.vm03.stdout:3/703: dwrite d5/d1e/d42/d34/d70/fc7 [0,4194304] 0 2026-03-09T16:14:46.124 INFO:tasks.workunit.client.0.vm03.stdout:0/721: creat d0/d7/d3e/d57/de9/ff7 x:0 0 0 2026-03-09T16:14:46.125 INFO:tasks.workunit.client.0.vm03.stdout:7/662: sync 2026-03-09T16:14:46.132 INFO:tasks.workunit.client.0.vm03.stdout:6/693: creat d9/d42/d45/fd5 x:0 0 0 2026-03-09T16:14:46.137 INFO:tasks.workunit.client.0.vm03.stdout:3/704: rename d5/d6d/d5a/d63 to d5/d1e/d42/d34/dd2 0 2026-03-09T16:14:46.143 INFO:tasks.workunit.client.0.vm03.stdout:7/663: mkdir d4/da/d5d/dd8/d22/d24/d16/d3e/d77/dd9 0 2026-03-09T16:14:46.149 INFO:tasks.workunit.client.0.vm03.stdout:8/750: dwrite da/d10/d28/f57 [0,4194304] 0 2026-03-09T16:14:46.161 INFO:tasks.workunit.client.0.vm03.stdout:4/735: truncate 
d5/db/d25/f26 1806741 0 2026-03-09T16:14:46.161 INFO:tasks.workunit.client.0.vm03.stdout:1/630: write d4/d6/d3b/d6b/d25/f4e [1160388,8092] 0 2026-03-09T16:14:46.161 INFO:tasks.workunit.client.0.vm03.stdout:2/732: write db/d12/f63 [4337288,82342] 0 2026-03-09T16:14:46.163 INFO:tasks.workunit.client.0.vm03.stdout:1/631: stat d4/db/l4b 0 2026-03-09T16:14:46.165 INFO:tasks.workunit.client.0.vm03.stdout:0/722: dwrite d0/d7/d3e/d95/f99 [0,4194304] 0 2026-03-09T16:14:46.166 INFO:tasks.workunit.client.0.vm03.stdout:9/768: dwrite d2/d4/d11/d29/d2a/d4d/fe6 [0,4194304] 0 2026-03-09T16:14:46.166 INFO:tasks.workunit.client.0.vm03.stdout:1/632: write d4/d6/d1d/d20/d93/f85 [3835774,41069] 0 2026-03-09T16:14:46.174 INFO:tasks.workunit.client.0.vm03.stdout:6/694: creat d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/fd6 x:0 0 0 2026-03-09T16:14:46.177 INFO:tasks.workunit.client.0.vm03.stdout:3/705: truncate d5/f33 8398281 0 2026-03-09T16:14:46.190 INFO:tasks.workunit.client.0.vm03.stdout:8/751: unlink da/db/da8/ccc 0 2026-03-09T16:14:46.190 INFO:tasks.workunit.client.0.vm03.stdout:5/799: link d2/d7/de/d11/dbf/cc2 d2/d7/de/d11/c10b 0 2026-03-09T16:14:46.205 INFO:tasks.workunit.client.0.vm03.stdout:4/736: dread d5/db/f5d [0,4194304] 0 2026-03-09T16:14:46.206 INFO:tasks.workunit.client.0.vm03.stdout:3/706: dread d5/d1e/d42/f1d [0,4194304] 0 2026-03-09T16:14:46.210 INFO:tasks.workunit.client.0.vm03.stdout:2/733: rename db/d12/d2a/lc4 to db/d12/da5/de4/l104 0 2026-03-09T16:14:46.210 INFO:tasks.workunit.client.0.vm03.stdout:9/769: symlink d2/d4/d11/d29/d2a/d46/dd6/lec 0 2026-03-09T16:14:46.211 INFO:tasks.workunit.client.0.vm03.stdout:6/695: creat d9/d42/d45/d50/fd7 x:0 0 0 2026-03-09T16:14:46.211 INFO:tasks.workunit.client.0.vm03.stdout:9/770: write d2/d54/d7d/fa4 [826721,10250] 0 2026-03-09T16:14:46.213 INFO:tasks.workunit.client.0.vm03.stdout:9/771: stat d2/d4/d11/d29/d2a/d46/ca2 0 2026-03-09T16:14:46.216 INFO:tasks.workunit.client.0.vm03.stdout:1/633: dwrite d4/d6/d1d/d20/fc2 [0,4194304] 0 2026-03-09T16:14:46.222 INFO:tasks.workunit.client.0.vm03.stdout:1/634: dread d4/d6/d1d/d20/fc8 [0,4194304] 0 2026-03-09T16:14:46.223 INFO:tasks.workunit.client.0.vm03.stdout:1/635: write d4/d6/d3b/f98 [496702,75714] 0 2026-03-09T16:14:46.225 INFO:tasks.workunit.client.0.vm03.stdout:7/664: link d4/da/d45/d51/f91 d4/da/d5d/db0/d9d/dc9/fda 0 2026-03-09T16:14:46.232 INFO:tasks.workunit.client.0.vm03.stdout:6/696: sync 2026-03-09T16:14:46.239 INFO:tasks.workunit.client.0.vm03.stdout:7/665: sync 2026-03-09T16:14:46.245 INFO:tasks.workunit.client.0.vm03.stdout:1/636: creat d4/d31/d5c/da8/fd0 x:0 0 0 2026-03-09T16:14:46.245 INFO:tasks.workunit.client.0.vm03.stdout:1/637: stat d4/d6/d3b/d6b/da5/la7 0 2026-03-09T16:14:46.249 INFO:tasks.workunit.client.0.vm03.stdout:5/800: write d2/d7/de/d11/f32 [3831476,90385] 0 2026-03-09T16:14:46.250 INFO:tasks.workunit.client.0.vm03.stdout:8/752: dwrite da/d10/d28/d4f/d68/d80/f58 [0,4194304] 0 2026-03-09T16:14:46.251 INFO:tasks.workunit.client.0.vm03.stdout:8/753: chown da/d10/d28/c6d 12392 1 2026-03-09T16:14:46.256 INFO:tasks.workunit.client.0.vm03.stdout:6/697: unlink d9/d14/l25 0 2026-03-09T16:14:46.257 INFO:tasks.workunit.client.0.vm03.stdout:4/737: mknod d5/db/d25/d8b/dd6/cdd 0 2026-03-09T16:14:46.263 INFO:tasks.workunit.client.0.vm03.stdout:0/723: creat d0/d7/ff8 x:0 0 0 2026-03-09T16:14:46.268 INFO:tasks.workunit.client.0.vm03.stdout:7/666: creat d4/da/d5d/db0/d61/fdb x:0 0 0 2026-03-09T16:14:46.269 INFO:tasks.workunit.client.0.vm03.stdout:7/667: fdatasync d4/da/d5d/dd8/d22/d24/d15/f34 0 
2026-03-09T16:14:46.275 INFO:tasks.workunit.client.0.vm03.stdout:3/707: write d5/f33 [772754,49064] 0 2026-03-09T16:14:46.290 INFO:tasks.workunit.client.0.vm03.stdout:1/638: dread d4/d6/d1d/d3d/f45 [0,4194304] 0 2026-03-09T16:14:46.292 INFO:tasks.workunit.client.0.vm03.stdout:8/754: creat da/d6c/d7a/ff9 x:0 0 0 2026-03-09T16:14:46.298 INFO:tasks.workunit.client.0.vm03.stdout:6/698: write d9/d14/f1d [72943,33768] 0 2026-03-09T16:14:46.299 INFO:tasks.workunit.client.0.vm03.stdout:4/738: creat d5/db/d25/d31/d4d/d5b/d9a/fde x:0 0 0 2026-03-09T16:14:46.302 INFO:tasks.workunit.client.0.vm03.stdout:0/724: creat d0/da/d7a/d98/ff9 x:0 0 0 2026-03-09T16:14:46.303 INFO:tasks.workunit.client.0.vm03.stdout:0/725: dread - d0/d7/d3e/d57/fe4 zero size 2026-03-09T16:14:46.305 INFO:tasks.workunit.client.0.vm03.stdout:9/772: link d2/d4/d11/d12/d28/c98 d2/de/ced 0 2026-03-09T16:14:46.316 INFO:tasks.workunit.client.0.vm03.stdout:1/639: creat d4/d31/d5c/da8/da1/fd1 x:0 0 0 2026-03-09T16:14:46.316 INFO:tasks.workunit.client.0.vm03.stdout:1/640: chown d4/db/l4c 353018831 1 2026-03-09T16:14:46.321 INFO:tasks.workunit.client.0.vm03.stdout:4/739: mknod d5/db/d25/d31/d33/d79/cdf 0 2026-03-09T16:14:46.322 INFO:tasks.workunit.client.0.vm03.stdout:4/740: fsync d5/db/d25/d8b/da8/dbe/fc5 0 2026-03-09T16:14:46.325 INFO:tasks.workunit.client.0.vm03.stdout:6/699: dread d9/f40 [0,4194304] 0 2026-03-09T16:14:46.327 INFO:tasks.workunit.client.0.vm03.stdout:0/726: creat d0/d7/d3e/d95/ffa x:0 0 0 2026-03-09T16:14:46.334 INFO:tasks.workunit.client.0.vm03.stdout:9/773: rename d2/d4/d11/d12/db2 to d2/d4/d11/d12/dc7/dee 0 2026-03-09T16:14:46.334 INFO:tasks.workunit.client.0.vm03.stdout:9/774: write d2/fc6 [3052548,52796] 0 2026-03-09T16:14:46.339 INFO:tasks.workunit.client.0.vm03.stdout:7/668: link d4/da/d5d/dd8/f44 d4/da/d5d/db0/d61/dbc/fdc 0 2026-03-09T16:14:46.345 INFO:tasks.workunit.client.0.vm03.stdout:2/734: symlink db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/l105 0 2026-03-09T16:14:46.345 INFO:tasks.workunit.client.0.vm03.stdout:2/735: readlink db/l75 0 2026-03-09T16:14:46.348 INFO:tasks.workunit.client.0.vm03.stdout:3/708: truncate d5/d1e/fc5 4059471 0 2026-03-09T16:14:46.353 INFO:tasks.workunit.client.0.vm03.stdout:1/641: read d4/d39/d7f/f88 [234951,125987] 0 2026-03-09T16:14:46.354 INFO:tasks.workunit.client.0.vm03.stdout:8/755: mkdir da/d6c/dfa 0 2026-03-09T16:14:46.355 INFO:tasks.workunit.client.0.vm03.stdout:8/756: chown da/d32/d79/d95 4527 1 2026-03-09T16:14:46.359 INFO:tasks.workunit.client.0.vm03.stdout:6/700: mkdir d9/d14/da5/dd8 0 2026-03-09T16:14:46.360 INFO:tasks.workunit.client.0.vm03.stdout:3/709: sync 2026-03-09T16:14:46.369 INFO:tasks.workunit.client.0.vm03.stdout:9/775: rename d2/df to d2/d54/d7d/d8f/dad/def 0 2026-03-09T16:14:46.381 INFO:tasks.workunit.client.0.vm03.stdout:7/669: dwrite d4/da/d5d/db0/d9d/dc9/fda [0,4194304] 0 2026-03-09T16:14:46.383 INFO:tasks.workunit.client.0.vm03.stdout:2/736: chown db/d12/d2a/d99/de7/df9/d64/l8b 70 1 2026-03-09T16:14:46.390 INFO:tasks.workunit.client.0.vm03.stdout:1/642: stat d4/d6/d3b/d6b/d25/d50/c7a 0 2026-03-09T16:14:46.390 INFO:tasks.workunit.client.0.vm03.stdout:1/643: fsync d4/d6/d1d/d20/d93/f85 0 2026-03-09T16:14:46.393 INFO:tasks.workunit.client.0.vm03.stdout:8/757: creat da/d32/d79/d95/ffb x:0 0 0 2026-03-09T16:14:46.396 INFO:tasks.workunit.client.0.vm03.stdout:6/701: rmdir d9/d42/d45/d65/dbf/dc9 39 2026-03-09T16:14:46.402 INFO:tasks.workunit.client.0.vm03.stdout:0/727: link d0/d7/d3e/d57/d5a/d82/d89/dbd/f9e d0/d7/d3e/d57/d5a/d47/dce/ffb 0 2026-03-09T16:14:46.402 
INFO:tasks.workunit.client.0.vm03.stdout:0/728: chown d0/d7/d48 1429 1 2026-03-09T16:14:46.402 INFO:tasks.workunit.client.0.vm03.stdout:0/729: chown d0/da/f1c 85 1 2026-03-09T16:14:46.406 INFO:tasks.workunit.client.0.vm03.stdout:4/741: rename d5/dd/d1f/c6d to d5/db/d25/dc8/dd2/dd1/ce0 0 2026-03-09T16:14:46.440 INFO:tasks.workunit.client.0.vm03.stdout:7/670: mkdir d4/da/d5d/db0/d61/dca/ddd 0 2026-03-09T16:14:46.443 INFO:tasks.workunit.client.0.vm03.stdout:7/671: dwrite d4/d2d/d4b/f6b [0,4194304] 0 2026-03-09T16:14:46.444 INFO:tasks.workunit.client.0.vm03.stdout:7/672: chown d4/da/d5d/db0/l7c 80781 1 2026-03-09T16:14:46.448 INFO:tasks.workunit.client.0.vm03.stdout:5/801: link d2/d7/d8/d16/l4f d2/d7/de/d11/d19/d29/d90/dbe/df5/l10c 0 2026-03-09T16:14:46.456 INFO:tasks.workunit.client.0.vm03.stdout:1/644: fdatasync d4/d6/d1d/d3d/f49 0 2026-03-09T16:14:46.464 INFO:tasks.workunit.client.0.vm03.stdout:8/758: creat da/d32/d79/d95/ffc x:0 0 0 2026-03-09T16:14:46.465 INFO:tasks.workunit.client.0.vm03.stdout:6/702: rmdir d9/d42/d45/d65/dbf 39 2026-03-09T16:14:46.466 INFO:tasks.workunit.client.0.vm03.stdout:0/730: creat d0/da/d5c/db6/ffc x:0 0 0 2026-03-09T16:14:46.466 INFO:tasks.workunit.client.0.vm03.stdout:0/731: chown d0/d7/d3e/d57/d5a/d5f/db2/d8e 1007683 1 2026-03-09T16:14:46.473 INFO:tasks.workunit.client.0.vm03.stdout:4/742: dread d5/d17/d44/f90 [0,4194304] 0 2026-03-09T16:14:46.473 INFO:tasks.workunit.client.0.vm03.stdout:6/703: sync 2026-03-09T16:14:46.477 INFO:tasks.workunit.client.0.vm03.stdout:9/776: mkdir d2/d4/d11/d29/d2a/d38/dcd/df0 0 2026-03-09T16:14:46.498 INFO:tasks.workunit.client.0.vm03.stdout:0/732: rename d0/da/d7a/ff5 to d0/da/d1b/ffd 0 2026-03-09T16:14:46.501 INFO:tasks.workunit.client.0.vm03.stdout:2/737: write db/d12/d2a/f8d [1000550,1537] 0 2026-03-09T16:14:46.504 INFO:tasks.workunit.client.0.vm03.stdout:5/802: dwrite d2/d7/de/fd8 [0,4194304] 0 2026-03-09T16:14:46.507 INFO:tasks.workunit.client.0.vm03.stdout:6/704: symlink d9/d42/d45/d65/dae/ld9 0 2026-03-09T16:14:46.514 INFO:tasks.workunit.client.0.vm03.stdout:4/743: read d5/db/d25/d31/d4d/d5b/d72/f75 [1302462,128449] 0 2026-03-09T16:14:46.515 INFO:tasks.workunit.client.0.vm03.stdout:1/645: symlink d4/d6/ld2 0 2026-03-09T16:14:46.517 INFO:tasks.workunit.client.0.vm03.stdout:8/759: creat da/d10/d28/db1/dce/de8/ffd x:0 0 0 2026-03-09T16:14:46.520 INFO:tasks.workunit.client.0.vm03.stdout:0/733: symlink d0/d7/d75/lfe 0 2026-03-09T16:14:46.525 INFO:tasks.workunit.client.0.vm03.stdout:5/803: chown d2/d7/d1a/d1c/l85 126 1 2026-03-09T16:14:46.528 INFO:tasks.workunit.client.0.vm03.stdout:5/804: write d2/d7/d1a/f6e [2870943,6442] 0 2026-03-09T16:14:46.535 INFO:tasks.workunit.client.0.vm03.stdout:7/673: symlink d4/da/d5d/db0/da9/db8/lde 0 2026-03-09T16:14:46.541 INFO:tasks.workunit.client.0.vm03.stdout:7/674: dread d4/da/d45/d51/f5b [0,4194304] 0 2026-03-09T16:14:46.546 INFO:tasks.workunit.client.0.vm03.stdout:3/710: rmdir d5/d1e/d42/d34/dd2/dce 0 2026-03-09T16:14:46.548 INFO:tasks.workunit.client.0.vm03.stdout:0/734: mknod d0/da/d1b/de0/cff 0 2026-03-09T16:14:46.559 INFO:tasks.workunit.client.0.vm03.stdout:2/738: write db/d12/d2a/d61/f54 [792021,116505] 0 2026-03-09T16:14:46.559 INFO:tasks.workunit.client.0.vm03.stdout:2/739: readlink db/d12/d2a/d61/la6 0 2026-03-09T16:14:46.564 INFO:tasks.workunit.client.0.vm03.stdout:4/744: symlink d5/db/d25/d31/d4d/da9/le1 0 2026-03-09T16:14:46.566 INFO:tasks.workunit.client.0.vm03.stdout:1/646: creat d4/d6/da2/fd3 x:0 0 0 2026-03-09T16:14:46.569 INFO:tasks.workunit.client.0.vm03.stdout:7/675: 
mkdir d4/da/d5d/db0/da9/db8/ddf 0 2026-03-09T16:14:46.569 INFO:tasks.workunit.client.0.vm03.stdout:7/676: chown d4/da/d45/d51/d36/d66 2748869 1 2026-03-09T16:14:46.571 INFO:tasks.workunit.client.0.vm03.stdout:8/760: symlink da/def/lfe 0 2026-03-09T16:14:46.574 INFO:tasks.workunit.client.0.vm03.stdout:3/711: chown d5/d53/c9a 20282 1 2026-03-09T16:14:46.574 INFO:tasks.workunit.client.0.vm03.stdout:3/712: fsync d5/d1e/d42/d4c/f7d 0 2026-03-09T16:14:46.576 INFO:tasks.workunit.client.0.vm03.stdout:0/735: rmdir d0/d7/d3e/d57/d5a/d82 39 2026-03-09T16:14:46.576 INFO:tasks.workunit.client.0.vm03.stdout:0/736: chown d0/d7/d3e/d57/d5a/d5f/db2 16 1 2026-03-09T16:14:46.579 INFO:tasks.workunit.client.0.vm03.stdout:0/737: dwrite d0/da/d5c/fe7 [0,4194304] 0 2026-03-09T16:14:46.581 INFO:tasks.workunit.client.0.vm03.stdout:0/738: chown d0/f60 2951 1 2026-03-09T16:14:46.588 INFO:tasks.workunit.client.0.vm03.stdout:2/740: creat db/d12/da5/de4/f106 x:0 0 0 2026-03-09T16:14:46.592 INFO:tasks.workunit.client.0.vm03.stdout:6/705: mknod d9/d42/d45/d65/dbf/cda 0 2026-03-09T16:14:46.593 INFO:tasks.workunit.client.0.vm03.stdout:6/706: fsync d9/d42/d45/d65/f7f 0 2026-03-09T16:14:46.595 INFO:tasks.workunit.client.0.vm03.stdout:9/777: getdents d2/de/d88 0 2026-03-09T16:14:46.599 INFO:tasks.workunit.client.0.vm03.stdout:9/778: dwrite d2/d4/d11/d29/d2a/d38/fca [0,4194304] 0 2026-03-09T16:14:46.601 INFO:tasks.workunit.client.0.vm03.stdout:4/745: truncate d5/db/f34 4014860 0 2026-03-09T16:14:46.603 INFO:tasks.workunit.client.0.vm03.stdout:1/647: mknod d4/d6/da2/cd4 0 2026-03-09T16:14:46.611 INFO:tasks.workunit.client.0.vm03.stdout:8/761: unlink da/d45/fe0 0 2026-03-09T16:14:46.614 INFO:tasks.workunit.client.0.vm03.stdout:3/713: truncate d5/d1e/d42/f20 5675498 0 2026-03-09T16:14:46.625 INFO:tasks.workunit.client.0.vm03.stdout:2/741: sync 2026-03-09T16:14:46.625 INFO:tasks.workunit.client.0.vm03.stdout:2/742: dread - db/d12/d2a/d99/de7/df9/d64/ff6 zero size 2026-03-09T16:14:46.629 INFO:tasks.workunit.client.0.vm03.stdout:2/743: dwrite db/d12/d2a/d99/fcd [0,4194304] 0 2026-03-09T16:14:46.631 INFO:tasks.workunit.client.0.vm03.stdout:2/744: stat db/d12/d2a/d99/de7/df9/cae 0 2026-03-09T16:14:46.631 INFO:tasks.workunit.client.0.vm03.stdout:2/745: chown db/d12/da5/de2 36 1 2026-03-09T16:14:46.642 INFO:tasks.workunit.client.0.vm03.stdout:4/746: read - d5/db/d25/d31/d4d/d5b/d9a/fac zero size 2026-03-09T16:14:46.651 INFO:tasks.workunit.client.0.vm03.stdout:7/677: write d4/d2d/f90 [5120441,63545] 0 2026-03-09T16:14:46.660 INFO:tasks.workunit.client.0.vm03.stdout:0/739: dwrite d0/da/d5c/f33 [0,4194304] 0 2026-03-09T16:14:46.663 INFO:tasks.workunit.client.0.vm03.stdout:1/648: write d4/d6/d1d/d20/f2a [2896304,26047] 0 2026-03-09T16:14:46.671 INFO:tasks.workunit.client.0.vm03.stdout:7/678: dread d4/da/d5d/db0/fb2 [0,4194304] 0 2026-03-09T16:14:46.672 INFO:tasks.workunit.client.0.vm03.stdout:7/679: stat d4/d2d/d4b/f4c 0 2026-03-09T16:14:46.678 INFO:tasks.workunit.client.0.vm03.stdout:5/805: truncate d2/d7/d8/d16/d5c/dfc/d106/d52/fc6 2984389 0 2026-03-09T16:14:46.680 INFO:tasks.workunit.client.0.vm03.stdout:9/779: write d2/d4/d11/d29/d2a/d38/fb7 [423230,83628] 0 2026-03-09T16:14:46.681 INFO:tasks.workunit.client.0.vm03.stdout:9/780: read d2/f15 [8349114,129713] 0 2026-03-09T16:14:46.685 INFO:tasks.workunit.client.0.vm03.stdout:8/762: mkdir da/d32/d79/d95/dff 0 2026-03-09T16:14:46.688 INFO:tasks.workunit.client.0.vm03.stdout:3/714: rename d5/d2e/db6 to d5/d53/d88/dd3 0 2026-03-09T16:14:46.693 INFO:tasks.workunit.client.0.vm03.stdout:6/707: 
symlink d9/d42/d45/d65/ldb 0 2026-03-09T16:14:46.705 INFO:tasks.workunit.client.0.vm03.stdout:0/740: readlink d0/d7/l3a 0 2026-03-09T16:14:46.711 INFO:tasks.workunit.client.0.vm03.stdout:7/680: symlink d4/da/d5d/dd8/d22/d24/d15/le0 0 2026-03-09T16:14:46.721 INFO:tasks.workunit.client.0.vm03.stdout:5/806: dwrite d2/d7/de/faa [4194304,4194304] 0 2026-03-09T16:14:46.724 INFO:tasks.workunit.client.0.vm03.stdout:9/781: symlink d2/d4/d1f/lf1 0 2026-03-09T16:14:46.734 INFO:tasks.workunit.client.0.vm03.stdout:8/763: symlink da/d32/db5/l100 0 2026-03-09T16:14:46.735 INFO:tasks.workunit.client.0.vm03.stdout:8/764: stat da/d10/d28/d4f/daf 0 2026-03-09T16:14:46.738 INFO:tasks.workunit.client.0.vm03.stdout:8/765: dwrite da/fba [0,4194304] 0 2026-03-09T16:14:46.755 INFO:tasks.workunit.client.0.vm03.stdout:2/746: mknod db/d12/d2a/c107 0 2026-03-09T16:14:46.759 INFO:tasks.workunit.client.0.vm03.stdout:2/747: dread db/d12/fe8 [0,4194304] 0 2026-03-09T16:14:46.763 INFO:tasks.workunit.client.0.vm03.stdout:2/748: fsync db/d12/da5/dc2/dc9/ff1 0 2026-03-09T16:14:46.769 INFO:tasks.workunit.client.0.vm03.stdout:1/649: dread d4/d6/d3b/d63/f89 [0,4194304] 0 2026-03-09T16:14:46.774 INFO:tasks.workunit.client.0.vm03.stdout:4/747: dwrite d5/dd/d1f/d5f/f98 [0,4194304] 0 2026-03-09T16:14:46.781 INFO:tasks.workunit.client.0.vm03.stdout:5/807: creat d2/d7/d8/d16/d5c/f10d x:0 0 0 2026-03-09T16:14:46.783 INFO:tasks.workunit.client.0.vm03.stdout:9/782: creat d2/d4/d11/d29/d2a/d46/ff2 x:0 0 0 2026-03-09T16:14:46.787 INFO:tasks.workunit.client.0.vm03.stdout:8/766: symlink da/db/d43/l101 0 2026-03-09T16:14:46.793 INFO:tasks.workunit.client.0.vm03.stdout:2/749: rmdir db/d12/da5/dc2 39 2026-03-09T16:14:46.796 INFO:tasks.workunit.client.0.vm03.stdout:0/741: mknod d0/d7/d3e/c100 0 2026-03-09T16:14:46.802 INFO:tasks.workunit.client.0.vm03.stdout:1/650: rename d4/d6/d3b/d6b/d25/lab to d4/d6/d3b/d6b/da5/dc0/ld5 0 2026-03-09T16:14:46.803 INFO:tasks.workunit.client.0.vm03.stdout:1/651: write d4/d31/f4f [3578557,8040] 0 2026-03-09T16:14:46.808 INFO:tasks.workunit.client.0.vm03.stdout:3/715: truncate d5/d1e/f9b 537948 0 2026-03-09T16:14:46.816 INFO:tasks.workunit.client.0.vm03.stdout:9/783: mknod d2/d4/d11/d29/d2a/db3/dbe/cf3 0 2026-03-09T16:14:46.826 INFO:tasks.workunit.client.0.vm03.stdout:0/742: truncate d0/d7/d3e/d57/d5a/d52/d9f/fda 772078 0 2026-03-09T16:14:46.835 INFO:tasks.workunit.client.0.vm03.stdout:4/748: write d5/dd/d1f/fbb [326570,64194] 0 2026-03-09T16:14:46.839 INFO:tasks.workunit.client.0.vm03.stdout:0/743: dread d0/d7/d3e/d57/d5a/d52/d9f/fd3 [0,4194304] 0 2026-03-09T16:14:46.840 INFO:tasks.workunit.client.0.vm03.stdout:0/744: chown d0/d7/d3e/d57/d5a/d52/d9f/caf 207 1 2026-03-09T16:14:46.842 INFO:tasks.workunit.client.0.vm03.stdout:8/767: write da/db/da8/fc5 [788312,62852] 0 2026-03-09T16:14:46.843 INFO:tasks.workunit.client.0.vm03.stdout:8/768: readlink da/d1d/lf3 0 2026-03-09T16:14:46.847 INFO:tasks.workunit.client.0.vm03.stdout:1/652: fdatasync d4/d6/d3b/d6b/d25/fc7 0 2026-03-09T16:14:46.849 INFO:tasks.workunit.client.0.vm03.stdout:3/716: rmdir d5/d6d 39 2026-03-09T16:14:46.850 INFO:tasks.workunit.client.0.vm03.stdout:3/717: chown d5/d1e/f26 1244 1 2026-03-09T16:14:46.853 INFO:tasks.workunit.client.0.vm03.stdout:7/681: creat d4/da/d5d/dd8/d22/d24/d16/d2b/fe1 x:0 0 0 2026-03-09T16:14:46.860 INFO:tasks.workunit.client.0.vm03.stdout:9/784: truncate d2/d4/d11/d29/f70 936211 0 2026-03-09T16:14:46.863 INFO:tasks.workunit.client.0.vm03.stdout:6/708: mknod d9/d42/d45/d50/d80/cdc 0 2026-03-09T16:14:46.876 
INFO:tasks.workunit.client.0.vm03.stdout:0/745: truncate d0/d7/d3e/d57/d5a/d5f/db2/fa2 575708 0 2026-03-09T16:14:46.887 INFO:tasks.workunit.client.0.vm03.stdout:1/653: dread d4/d6/d1d/d20/f72 [0,4194304] 0 2026-03-09T16:14:46.893 INFO:tasks.workunit.client.0.vm03.stdout:7/682: symlink d4/da/d45/d51/d36/le2 0 2026-03-09T16:14:46.893 INFO:tasks.workunit.client.0.vm03.stdout:7/683: write d4/da/d5d/db0/d61/f8b [2836165,77920] 0 2026-03-09T16:14:46.894 INFO:tasks.workunit.client.0.vm03.stdout:7/684: write d4/d2d/d4b/fd6 [391005,52857] 0 2026-03-09T16:14:46.898 INFO:tasks.workunit.client.0.vm03.stdout:7/685: fsync d4/d2d/d4b/f4c 0 2026-03-09T16:14:46.898 INFO:tasks.workunit.client.0.vm03.stdout:7/686: fdatasync d4/da/d5d/dd8/f6a 0 2026-03-09T16:14:46.907 INFO:tasks.workunit.client.0.vm03.stdout:9/785: unlink d2/d4/d11/d12/l21 0 2026-03-09T16:14:46.913 INFO:tasks.workunit.client.0.vm03.stdout:2/750: mknod db/d12/d2a/d99/de7/df9/c108 0 2026-03-09T16:14:46.922 INFO:tasks.workunit.client.0.vm03.stdout:0/746: creat d0/d7/d3e/d57/d5a/d5f/db2/dcf/f101 x:0 0 0 2026-03-09T16:14:46.935 INFO:tasks.workunit.client.0.vm03.stdout:0/747: fsync d0/d7/d3e/d57/d5a/d5f/db2/dcf/f101 0 2026-03-09T16:14:46.936 INFO:tasks.workunit.client.0.vm03.stdout:8/769: write da/db/f1c [617463,102334] 0 2026-03-09T16:14:46.938 INFO:tasks.workunit.client.0.vm03.stdout:8/770: read da/d10/d28/d4f/d85/fa1 [294858,29336] 0 2026-03-09T16:14:46.938 INFO:tasks.workunit.client.0.vm03.stdout:1/654: write d4/d6/d1d/d20/d93/f8c [202131,51128] 0 2026-03-09T16:14:46.953 INFO:tasks.workunit.client.0.vm03.stdout:3/718: write d5/d1e/f9b [1516742,10387] 0 2026-03-09T16:14:46.956 INFO:tasks.workunit.client.0.vm03.stdout:5/808: link d2/d7/d8/d16/c17 d2/d7/de/d54/c10e 0 2026-03-09T16:14:46.957 INFO:tasks.workunit.client.0.vm03.stdout:5/809: write d2/d7/d8/d16/d5c/dcf/fe4 [1899169,39312] 0 2026-03-09T16:14:46.963 INFO:tasks.workunit.client.0.vm03.stdout:9/786: write d2/d54/d7d/d8f/dad/def/f14 [4250166,60256] 0 2026-03-09T16:14:46.964 INFO:tasks.workunit.client.0.vm03.stdout:9/787: dread - d2/d4/d11/d29/d2a/db3/fe1 zero size 2026-03-09T16:14:46.972 INFO:tasks.workunit.client.0.vm03.stdout:6/709: dwrite d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f83 [0,4194304] 0 2026-03-09T16:14:46.980 INFO:tasks.workunit.client.0.vm03.stdout:4/749: link d5/dd/d1f/l67 d5/db/d25/d31/d4d/d5b/d72/d77/le2 0 2026-03-09T16:14:46.992 INFO:tasks.workunit.client.0.vm03.stdout:0/748: fsync d0/d7/fbb 0 2026-03-09T16:14:46.992 INFO:tasks.workunit.client.0.vm03.stdout:0/749: chown d0/d7/d3e/d57 42 1 2026-03-09T16:14:46.995 INFO:tasks.workunit.client.0.vm03.stdout:8/771: mknod da/d1d/c102 0 2026-03-09T16:14:46.996 INFO:tasks.workunit.client.0.vm03.stdout:8/772: dread - da/d32/ff5 zero size 2026-03-09T16:14:46.996 INFO:tasks.workunit.client.0.vm03.stdout:8/773: chown da/d10/d28/d4f/d85/d9c/ld7 87 1 2026-03-09T16:14:46.999 INFO:tasks.workunit.client.0.vm03.stdout:7/687: mknod d4/d2d/ce3 0 2026-03-09T16:14:46.999 INFO:tasks.workunit.client.0.vm03.stdout:7/688: stat d4/da/d5d/dd8/d22/d24/d15/d71/d79 0 2026-03-09T16:14:47.002 INFO:tasks.workunit.client.0.vm03.stdout:3/719: rename d5/d44/f5d to d5/d2e/fd4 0 2026-03-09T16:14:47.011 INFO:tasks.workunit.client.0.vm03.stdout:0/750: symlink d0/d7/d3e/d57/d5a/d47/l102 0 2026-03-09T16:14:47.016 INFO:tasks.workunit.client.0.vm03.stdout:6/710: sync 2026-03-09T16:14:47.019 INFO:tasks.workunit.client.0.vm03.stdout:6/711: dread d9/d42/d45/d50/d80/fa1 [0,4194304] 0 2026-03-09T16:14:47.021 INFO:tasks.workunit.client.0.vm03.stdout:7/689: symlink 
d4/da/d5d/db0/d61/dca/le4 0 2026-03-09T16:14:47.026 INFO:tasks.workunit.client.0.vm03.stdout:2/751: write db/d12/d2a/f60 [1033269,15552] 0 2026-03-09T16:14:47.037 INFO:tasks.workunit.client.0.vm03.stdout:4/750: rename d5/db/d25/d8b/da8/fae to d5/dd/d1f/fe3 0 2026-03-09T16:14:47.038 INFO:tasks.workunit.client.0.vm03.stdout:4/751: write d5/db/d25/d31/d33/d79/fa6 [160840,54748] 0 2026-03-09T16:14:47.038 INFO:tasks.workunit.client.0.vm03.stdout:4/752: write d5/db/d25/d8b/fc6 [5052895,24128] 0 2026-03-09T16:14:47.040 INFO:tasks.workunit.client.0.vm03.stdout:3/720: stat d5/d1e/d42/f20 0 2026-03-09T16:14:47.050 INFO:tasks.workunit.client.0.vm03.stdout:1/655: link d4/d39/d7f/fcb d4/db/fd6 0 2026-03-09T16:14:47.052 INFO:tasks.workunit.client.0.vm03.stdout:1/656: read d4/d6/d1d/d20/d93/f48 [3625835,117313] 0 2026-03-09T16:14:47.058 INFO:tasks.workunit.client.0.vm03.stdout:6/712: read d9/d42/fa6 [27437,95530] 0 2026-03-09T16:14:47.060 INFO:tasks.workunit.client.0.vm03.stdout:7/690: mkdir d4/da/d5d/dd8/d22/d24/d16/d3e/db5/de5 0 2026-03-09T16:14:47.061 INFO:tasks.workunit.client.0.vm03.stdout:2/752: mkdir db/d12/d2a/d99/d109 0 2026-03-09T16:14:47.062 INFO:tasks.workunit.client.0.vm03.stdout:5/810: write d2/d7/d8/d16/d5c/dfc/d106/d52/fc6 [3596554,66267] 0 2026-03-09T16:14:47.068 INFO:tasks.workunit.client.0.vm03.stdout:3/721: dread d5/d1e/d42/d34/d70/fc7 [0,4194304] 0 2026-03-09T16:14:47.073 INFO:tasks.workunit.client.0.vm03.stdout:9/788: link d2/d54/f90 d2/d4/d11/d29/d92/ff4 0 2026-03-09T16:14:47.074 INFO:tasks.workunit.client.0.vm03.stdout:8/774: getdents da/d10/d28/d4f/d85 0 2026-03-09T16:14:47.074 INFO:tasks.workunit.client.0.vm03.stdout:1/657: symlink d4/d6/d1d/d20/d5f/ld7 0 2026-03-09T16:14:47.075 INFO:tasks.workunit.client.0.vm03.stdout:1/658: fsync d4/d6/d1d/d20/d23/f30 0 2026-03-09T16:14:47.075 INFO:tasks.workunit.client.0.vm03.stdout:1/659: chown d4/d6/d3b/d6b/da5 26 1 2026-03-09T16:14:47.080 INFO:tasks.workunit.client.0.vm03.stdout:3/722: dread d5/d1e/f72 [0,4194304] 0 2026-03-09T16:14:47.082 INFO:tasks.workunit.client.0.vm03.stdout:6/713: write d9/d42/d45/d50/d80/d90/f64 [2326936,101947] 0 2026-03-09T16:14:47.089 INFO:tasks.workunit.client.0.vm03.stdout:2/753: dread db/d12/d2a/d61/d79/fb7 [0,4194304] 0 2026-03-09T16:14:47.092 INFO:tasks.workunit.client.0.vm03.stdout:4/753: write d5/db/d25/d31/d4d/d5b/d72/d77/f91 [1097333,92764] 0 2026-03-09T16:14:47.094 INFO:tasks.workunit.client.0.vm03.stdout:4/754: read d5/d17/da0/fb9 [2814301,25610] 0 2026-03-09T16:14:47.095 INFO:tasks.workunit.client.0.vm03.stdout:5/811: rename d2/d75/c95 to d2/d7/d8/d16/d5c/dfc/c10f 0 2026-03-09T16:14:47.104 INFO:tasks.workunit.client.0.vm03.stdout:0/751: symlink d0/d7/d3e/d57/d5a/d82/d89/l103 0 2026-03-09T16:14:47.105 INFO:tasks.workunit.client.0.vm03.stdout:8/775: creat da/d32/d79/f103 x:0 0 0 2026-03-09T16:14:47.107 INFO:tasks.workunit.client.0.vm03.stdout:9/789: creat d2/d4/d11/d12/ff5 x:0 0 0 2026-03-09T16:14:47.109 INFO:tasks.workunit.client.0.vm03.stdout:8/776: dwrite da/d10/d28/d64/fc8 [0,4194304] 0 2026-03-09T16:14:47.117 INFO:tasks.workunit.client.0.vm03.stdout:1/660: rmdir d4/d31/d5c/da8 39 2026-03-09T16:14:47.118 INFO:tasks.workunit.client.0.vm03.stdout:1/661: chown d4/d6/d1d/d20/fcc 4065803 1 2026-03-09T16:14:47.119 INFO:tasks.workunit.client.0.vm03.stdout:3/723: mknod d5/d2e/cd5 0 2026-03-09T16:14:47.142 INFO:tasks.workunit.client.0.vm03.stdout:3/724: sync 2026-03-09T16:14:47.143 INFO:tasks.workunit.client.0.vm03.stdout:3/725: write d5/d53/fcc [937141,32296] 0 2026-03-09T16:14:47.145 
INFO:tasks.workunit.client.0.vm03.stdout:0/752: mkdir d0/da/d1b/dc8/d104 0 2026-03-09T16:14:47.148 INFO:tasks.workunit.client.0.vm03.stdout:2/754: write db/f93 [495376,94427] 0 2026-03-09T16:14:47.151 INFO:tasks.workunit.client.0.vm03.stdout:7/691: dwrite d4/da/d5d/db0/d9d/dc9/fd0 [0,4194304] 0 2026-03-09T16:14:47.153 INFO:tasks.workunit.client.0.vm03.stdout:9/790: read d2/d4/d11/d12/f1e [3549300,54516] 0 2026-03-09T16:14:47.157 INFO:tasks.workunit.client.0.vm03.stdout:4/755: dwrite d5/db/d25/d31/d4d/d5b/d9a/fac [0,4194304] 0 2026-03-09T16:14:47.158 INFO:tasks.workunit.client.0.vm03.stdout:9/791: fdatasync d2/d54/d7d/d8f/dad/def/f14 0 2026-03-09T16:14:47.161 INFO:tasks.workunit.client.0.vm03.stdout:4/756: chown d5/dd/d1f/f59 894795448 1 2026-03-09T16:14:47.164 INFO:tasks.workunit.client.0.vm03.stdout:1/662: fdatasync d4/d6/d1d/d20/d23/f9f 0 2026-03-09T16:14:47.171 INFO:tasks.workunit.client.0.vm03.stdout:5/812: symlink d2/d7/de/d11/d19/d29/d90/l110 0 2026-03-09T16:14:47.182 INFO:tasks.workunit.client.0.vm03.stdout:8/777: creat da/d6c/dfa/f104 x:0 0 0 2026-03-09T16:14:47.184 INFO:tasks.workunit.client.0.vm03.stdout:7/692: mknod d4/da/d5d/dd8/d22/d24/d16/d6e/d7e/ce6 0 2026-03-09T16:14:47.185 INFO:tasks.workunit.client.0.vm03.stdout:7/693: write d4/da/d5d/dd8/d22/d24/d15/d71/fd1 [632011,89541] 0 2026-03-09T16:14:47.190 INFO:tasks.workunit.client.0.vm03.stdout:4/757: mkdir d5/db/d25/d31/d4d/d5b/d72/d82/de4 0 2026-03-09T16:14:47.201 INFO:tasks.workunit.client.0.vm03.stdout:3/726: dwrite d5/f11 [0,4194304] 0 2026-03-09T16:14:47.202 INFO:tasks.workunit.client.0.vm03.stdout:2/755: dwrite db/d12/d2a/d99/de7/df9/d64/f80 [0,4194304] 0 2026-03-09T16:14:47.219 INFO:tasks.workunit.client.0.vm03.stdout:0/753: link d0/d7/d3e/d57/d5a/d82/d89/dbd/f7e d0/d7/d3e/d57/de9/f105 0 2026-03-09T16:14:47.221 INFO:tasks.workunit.client.0.vm03.stdout:8/778: mkdir da/d6c/dfa/d105 0 2026-03-09T16:14:47.223 INFO:tasks.workunit.client.0.vm03.stdout:7/694: creat d4/da/d5d/dd8/d22/d24/d16/d6e/d7e/fe7 x:0 0 0 2026-03-09T16:14:47.228 INFO:tasks.workunit.client.0.vm03.stdout:9/792: unlink d2/d4/d11/d29/d2a/d4d/f56 0 2026-03-09T16:14:47.232 INFO:tasks.workunit.client.0.vm03.stdout:4/758: fsync d5/dd/f1e 0 2026-03-09T16:14:47.235 INFO:tasks.workunit.client.0.vm03.stdout:1/663: dwrite d4/d6/d3b/d6b/d25/fb8 [0,4194304] 0 2026-03-09T16:14:47.241 INFO:tasks.workunit.client.0.vm03.stdout:6/714: link d9/d14/da5/lb6 d9/d42/d45/d50/d80/d8a/d9c/d97/da8/dbd/ldd 0 2026-03-09T16:14:47.246 INFO:tasks.workunit.client.0.vm03.stdout:3/727: symlink d5/d53/d88/dd3/ld6 0 2026-03-09T16:14:47.248 INFO:tasks.workunit.client.0.vm03.stdout:2/756: mknod db/d12/da5/de4/c10a 0 2026-03-09T16:14:47.251 INFO:tasks.workunit.client.0.vm03.stdout:8/779: truncate da/d10/d28/d4f/d68/fa9 597772 0 2026-03-09T16:14:47.256 INFO:tasks.workunit.client.0.vm03.stdout:7/695: truncate d4/da/d45/d51/f5b 4262128 0 2026-03-09T16:14:47.257 INFO:tasks.workunit.client.0.vm03.stdout:7/696: write d4/da/f20 [5241584,44132] 0 2026-03-09T16:14:47.261 INFO:tasks.workunit.client.0.vm03.stdout:9/793: read d2/d4/d11/d29/f95 [155187,23149] 0 2026-03-09T16:14:47.263 INFO:tasks.workunit.client.0.vm03.stdout:9/794: read d2/d4/d11/d12/d28/f2f [956593,72822] 0 2026-03-09T16:14:47.267 INFO:tasks.workunit.client.0.vm03.stdout:1/664: symlink d4/d6/d1d/d20/d5f/ld8 0 2026-03-09T16:14:47.268 INFO:tasks.workunit.client.0.vm03.stdout:1/665: stat d4/d6/d1d/d20/fc1 0 2026-03-09T16:14:47.268 INFO:tasks.workunit.client.0.vm03.stdout:1/666: fsync d4/d6/d1d/d69/f76 0 2026-03-09T16:14:47.271 
INFO:tasks.workunit.client.0.vm03.stdout:3/728: rename d5/d1e/d42/d34/d70 to d5/d53/d88/dd7 0 2026-03-09T16:14:47.272 INFO:tasks.workunit.client.0.vm03.stdout:7/697: sync 2026-03-09T16:14:47.277 INFO:tasks.workunit.client.0.vm03.stdout:2/757: symlink db/d12/da5/de2/dd5/l10b 0 2026-03-09T16:14:47.280 INFO:tasks.workunit.client.0.vm03.stdout:2/758: dread db/d12/d2a/d99/de7/df9/d64/f80 [0,4194304] 0 2026-03-09T16:14:47.280 INFO:tasks.workunit.client.0.vm03.stdout:2/759: stat db/d12/f63 0 2026-03-09T16:14:47.289 INFO:tasks.workunit.client.0.vm03.stdout:8/780: mknod da/db/d30/dc7/c106 0 2026-03-09T16:14:47.291 INFO:tasks.workunit.client.0.vm03.stdout:8/781: dread - da/d10/d28/d4f/d68/fc1 zero size 2026-03-09T16:14:47.292 INFO:tasks.workunit.client.0.vm03.stdout:2/760: dread db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4/fd6 [0,4194304] 0 2026-03-09T16:14:47.297 INFO:tasks.workunit.client.0.vm03.stdout:9/795: fsync d2/d4/d11/d12/dc7/dee/fe4 0 2026-03-09T16:14:47.300 INFO:tasks.workunit.client.0.vm03.stdout:4/759: mknod d5/dd/dd5/ce5 0 2026-03-09T16:14:47.305 INFO:tasks.workunit.client.0.vm03.stdout:7/698: dread - d4/da/d5d/dd8/d22/d24/fa3 zero size 2026-03-09T16:14:47.305 INFO:tasks.workunit.client.0.vm03.stdout:4/760: sync 2026-03-09T16:14:47.307 INFO:tasks.workunit.client.0.vm03.stdout:4/761: sync 2026-03-09T16:14:47.308 INFO:tasks.workunit.client.0.vm03.stdout:4/762: write d5/db/d25/d31/d4d/fb2 [3148998,100209] 0 2026-03-09T16:14:47.316 INFO:tasks.workunit.client.0.vm03.stdout:5/813: link d2/d7/de/d54/c9c d2/d7/de9/c111 0 2026-03-09T16:14:47.318 INFO:tasks.workunit.client.0.vm03.stdout:3/729: dwrite d5/d1e/d42/f1d [4194304,4194304] 0 2026-03-09T16:14:47.320 INFO:tasks.workunit.client.0.vm03.stdout:0/754: creat d0/d7/d3e/d57/d5a/d5f/db2/f106 x:0 0 0 2026-03-09T16:14:47.321 INFO:tasks.workunit.client.0.vm03.stdout:0/755: stat d0/da/d1b/de0/cf1 0 2026-03-09T16:14:47.325 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: Active manager daemon vm03.gbgzmu restarted 2026-03-09T16:14:47.325 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: Activating manager daemon vm03.gbgzmu 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: osdmap e45: 6 total, 6 up, 6 in 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: mgrmap e26: vm03.gbgzmu(active, starting, since 0.0103141s) 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kygyjl"}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kntrco"}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 
vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.sqhria"}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.jgzfvu"}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr metadata", "who": "vm03.gbgzmu", "id": "vm03.gbgzmu"}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.gbgzmu/crt"}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.gbgzmu/key"}]: dispatch 
2026-03-09T16:14:47.326 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T16:14:47.329 INFO:tasks.workunit.client.0.vm03.stdout:2/761: truncate db/d12/d2a/d61/f5c 920394 0 2026-03-09T16:14:47.335 INFO:tasks.workunit.client.0.vm03.stdout:6/715: creat d9/d42/d45/d50/d80/fde x:0 0 0 2026-03-09T16:14:47.340 INFO:tasks.workunit.client.0.vm03.stdout:4/763: truncate d5/db/d25/d31/d4d/d5b/d7d/f9d 910227 0 2026-03-09T16:14:47.340 INFO:tasks.workunit.client.0.vm03.stdout:0/756: rename d0/d7/d75/feb to d0/d7/d3e/d57/f107 0 2026-03-09T16:14:47.346 INFO:tasks.workunit.client.0.vm03.stdout:9/796: creat d2/d4/d11/d12/dc7/dee/dc2/de9/ff6 x:0 0 0 2026-03-09T16:14:47.357 INFO:tasks.workunit.client.0.vm03.stdout:8/782: dwrite da/d10/f33 [0,4194304] 0 2026-03-09T16:14:47.370 INFO:tasks.workunit.client.0.vm03.stdout:9/797: mknod d2/d4/d11/d29/d2a/db3/dbe/cf7 0 2026-03-09T16:14:47.371 INFO:tasks.workunit.client.0.vm03.stdout:1/667: getdents d4/d6/d1d/d3d 0 2026-03-09T16:14:47.376 INFO:tasks.workunit.client.0.vm03.stdout:6/716: mkdir d9/d42/d45/ddf 0 2026-03-09T16:14:47.378 INFO:tasks.workunit.client.0.vm03.stdout:2/762: dwrite db/d12/d2a/d99/fde [0,4194304] 0 2026-03-09T16:14:47.383 INFO:tasks.workunit.client.0.vm03.stdout:4/764: write d5/db/fb3 [468761,81069] 0 2026-03-09T16:14:47.386 INFO:tasks.workunit.client.0.vm03.stdout:7/699: dwrite d4/da/d45/d51/d36/f94 [0,4194304] 0 2026-03-09T16:14:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: Active manager daemon vm03.gbgzmu restarted 2026-03-09T16:14:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: Activating manager daemon vm03.gbgzmu 2026-03-09T16:14:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: osdmap e45: 6 total, 6 up, 6 in 2026-03-09T16:14:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: mgrmap e26: vm03.gbgzmu(active, starting, since 0.0103141s) 2026-03-09T16:14:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:14:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:14:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kygyjl"}]: dispatch 2026-03-09T16:14:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kntrco"}]: dispatch 2026-03-09T16:14:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.sqhria"}]: dispatch 2026-03-09T16:14:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 
192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.jgzfvu"}]: dispatch 2026-03-09T16:14:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr metadata", "who": "vm03.gbgzmu", "id": "vm03.gbgzmu"}]: dispatch 2026-03-09T16:14:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:14:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:14:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:14:47.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:14:47.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:14:47.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:14:47.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T16:14:47.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T16:14:47.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T16:14:47.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.gbgzmu/crt"}]: dispatch 2026-03-09T16:14:47.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T16:14:47.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.gbgzmu/key"}]: dispatch 2026-03-09T16:14:47.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T16:14:47.393 
INFO:tasks.workunit.client.0.vm03.stdout:5/814: truncate d2/d7/de/f48 3569586 0 2026-03-09T16:14:47.409 INFO:tasks.workunit.client.0.vm03.stdout:7/700: mkdir d4/da/d5d/db0/da9/de8 0 2026-03-09T16:14:47.409 INFO:tasks.workunit.client.0.vm03.stdout:3/730: getdents d5/d53/d88 0 2026-03-09T16:14:47.410 INFO:tasks.workunit.client.0.vm03.stdout:0/757: creat d0/d7/d3e/d57/d5a/d52/f108 x:0 0 0 2026-03-09T16:14:47.413 INFO:tasks.workunit.client.0.vm03.stdout:9/798: creat d2/d4/d11/dac/ff8 x:0 0 0 2026-03-09T16:14:47.414 INFO:tasks.workunit.client.0.vm03.stdout:9/799: chown d2/d54/d7d/d8f/dad/def/f64 3 1 2026-03-09T16:14:47.414 INFO:tasks.workunit.client.0.vm03.stdout:9/800: write d2/d4/d11/d29/d2a/db3/fe1 [349207,84625] 0 2026-03-09T16:14:47.416 INFO:tasks.workunit.client.0.vm03.stdout:1/668: truncate d4/f1b 294194 0 2026-03-09T16:14:47.416 INFO:tasks.workunit.client.0.vm03.stdout:6/717: mkdir d9/d42/d45/d65/dbf/dc9/de0 0 2026-03-09T16:14:47.418 INFO:tasks.workunit.client.0.vm03.stdout:4/765: dread d5/db/d25/d31/fa1 [0,4194304] 0 2026-03-09T16:14:47.427 INFO:tasks.workunit.client.0.vm03.stdout:9/801: fsync d2/d4/d1f/f51 0 2026-03-09T16:14:47.427 INFO:tasks.workunit.client.0.vm03.stdout:3/731: dread d5/d1e/d42/f74 [4194304,4194304] 0 2026-03-09T16:14:47.436 INFO:tasks.workunit.client.0.vm03.stdout:6/718: unlink d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/c2b 0 2026-03-09T16:14:47.437 INFO:tasks.workunit.client.0.vm03.stdout:4/766: symlink d5/db/d25/d31/d4d/d5b/le6 0 2026-03-09T16:14:47.441 INFO:tasks.workunit.client.0.vm03.stdout:0/758: unlink d0/d7/d48/f13 0 2026-03-09T16:14:47.443 INFO:tasks.workunit.client.0.vm03.stdout:9/802: mknod d2/de/d88/d7a/cf9 0 2026-03-09T16:14:47.448 INFO:tasks.workunit.client.0.vm03.stdout:2/763: link db/f34 db/d12/d2a/d61/f10c 0 2026-03-09T16:14:47.449 INFO:tasks.workunit.client.0.vm03.stdout:0/759: rmdir d0/da/d5c/db6 39 2026-03-09T16:14:47.451 INFO:tasks.workunit.client.0.vm03.stdout:9/803: symlink d2/d4/d11/d29/d92/lfa 0 2026-03-09T16:14:47.452 INFO:tasks.workunit.client.0.vm03.stdout:9/804: truncate d2/d4/d11/d12/dc7/dee/dc2/feb 484828 0 2026-03-09T16:14:47.459 INFO:tasks.workunit.client.0.vm03.stdout:0/760: dwrite d0/da/d5c/db6/ffc [0,4194304] 0 2026-03-09T16:14:47.473 INFO:tasks.workunit.client.0.vm03.stdout:9/805: read d2/d4/d11/d29/d92/ff4 [3117796,23077] 0 2026-03-09T16:14:47.475 INFO:tasks.workunit.client.0.vm03.stdout:2/764: symlink db/d12/l10d 0 2026-03-09T16:14:47.476 INFO:tasks.workunit.client.0.vm03.stdout:2/765: read db/d12/d2a/d61/d6d/fa8 [160383,28151] 0 2026-03-09T16:14:47.478 INFO:tasks.workunit.client.0.vm03.stdout:0/761: fsync d0/da/d5c/fae 0 2026-03-09T16:14:47.478 INFO:tasks.workunit.client.0.vm03.stdout:8/783: write da/d10/d28/fd3 [493223,40404] 0 2026-03-09T16:14:47.482 INFO:tasks.workunit.client.0.vm03.stdout:3/732: getdents d5/d53/d6c 0 2026-03-09T16:14:47.482 INFO:tasks.workunit.client.0.vm03.stdout:3/733: read d5/d1e/d42/f20 [5381809,94622] 0 2026-03-09T16:14:47.483 INFO:tasks.workunit.client.0.vm03.stdout:9/806: truncate d2/d4/d11/f66 2596629 0 2026-03-09T16:14:47.484 INFO:tasks.workunit.client.0.vm03.stdout:5/815: write d2/d7/de/d33/f9e [804076,67121] 0 2026-03-09T16:14:47.484 INFO:tasks.workunit.client.0.vm03.stdout:9/807: chown d2/d4/d11/d29/d2a/d38/dcd 992092 1 2026-03-09T16:14:47.485 INFO:tasks.workunit.client.0.vm03.stdout:9/808: readlink d2/d4/d11/d12/l1b 0 2026-03-09T16:14:47.485 INFO:tasks.workunit.client.0.vm03.stdout:9/809: fdatasync d2/d4/d11/d29/d2a/db3/fe1 0 2026-03-09T16:14:47.492 INFO:tasks.workunit.client.0.vm03.stdout:3/734: 
write d5/d1e/d42/d34/fad [1474018,70125] 0 2026-03-09T16:14:47.495 INFO:tasks.workunit.client.0.vm03.stdout:3/735: chown d5/d44/f56 3 1 2026-03-09T16:14:47.495 INFO:tasks.workunit.client.0.vm03.stdout:5/816: fdatasync d2/d7/d1a/d1c/d3f/f92 0 2026-03-09T16:14:47.497 INFO:tasks.workunit.client.0.vm03.stdout:1/669: dwrite d4/d6/d3b/d6b/d25/f84 [0,4194304] 0 2026-03-09T16:14:47.504 INFO:tasks.workunit.client.0.vm03.stdout:8/784: sync 2026-03-09T16:14:47.510 INFO:tasks.workunit.client.0.vm03.stdout:7/701: truncate d4/d2d/f8c 336501 0 2026-03-09T16:14:47.511 INFO:tasks.workunit.client.0.vm03.stdout:3/736: rmdir d5/d53/d88/dd3 39 2026-03-09T16:14:47.514 INFO:tasks.workunit.client.0.vm03.stdout:8/785: readlink da/lda 0 2026-03-09T16:14:47.515 INFO:tasks.workunit.client.0.vm03.stdout:8/786: write da/d6c/d7a/ff9 [18564,106717] 0 2026-03-09T16:14:47.516 INFO:tasks.workunit.client.0.vm03.stdout:4/767: dwrite d5/dd/f1e [0,4194304] 0 2026-03-09T16:14:47.523 INFO:tasks.workunit.client.0.vm03.stdout:5/817: mkdir d2/d7/d8/d16/d5c/dfc/d106/d108/d112 0 2026-03-09T16:14:47.523 INFO:tasks.workunit.client.0.vm03.stdout:8/787: creat da/db/da8/f107 x:0 0 0 2026-03-09T16:14:47.524 INFO:tasks.workunit.client.0.vm03.stdout:4/768: rmdir d5/db/d25/d31/d4d/da9 39 2026-03-09T16:14:47.524 INFO:tasks.workunit.client.0.vm03.stdout:0/762: link d0/f4d d0/d7/d3e/d57/d5a/d52/d9f/f109 0 2026-03-09T16:14:47.525 INFO:tasks.workunit.client.0.vm03.stdout:8/788: write da/d10/d28/db1/dce/de8/ffd [1017183,45202] 0 2026-03-09T16:14:47.525 INFO:tasks.workunit.client.0.vm03.stdout:7/702: creat d4/da/d5d/db0/da9/db8/ddf/fe9 x:0 0 0 2026-03-09T16:14:47.529 INFO:tasks.workunit.client.0.vm03.stdout:7/703: dwrite d4/da/d5d/db0/d61/fd2 [0,4194304] 0 2026-03-09T16:14:47.533 INFO:tasks.workunit.client.0.vm03.stdout:5/818: creat d2/d7/d3c/f113 x:0 0 0 2026-03-09T16:14:47.543 INFO:tasks.workunit.client.0.vm03.stdout:4/769: fsync d5/d17/f80 0 2026-03-09T16:14:47.546 INFO:tasks.workunit.client.0.vm03.stdout:0/763: dread d0/d7/d3e/d57/d5a/d47/f88 [0,4194304] 0 2026-03-09T16:14:47.554 INFO:tasks.workunit.client.0.vm03.stdout:6/719: dwrite d9/f35 [0,4194304] 0 2026-03-09T16:14:47.556 INFO:tasks.workunit.client.0.vm03.stdout:1/670: getdents d4/d39/d7f 0 2026-03-09T16:14:47.562 INFO:tasks.workunit.client.0.vm03.stdout:7/704: fsync d4/da/f42 0 2026-03-09T16:14:47.565 INFO:tasks.workunit.client.0.vm03.stdout:0/764: sync 2026-03-09T16:14:47.568 INFO:tasks.workunit.client.0.vm03.stdout:4/770: creat d5/db/d25/d31/d4d/d5b/d72/d82/fe7 x:0 0 0 2026-03-09T16:14:47.569 INFO:tasks.workunit.client.0.vm03.stdout:4/771: chown d5/db/d25/d31/d33/d79/cdf 661 1 2026-03-09T16:14:47.569 INFO:tasks.workunit.client.0.vm03.stdout:4/772: stat d5/dd/f23 0 2026-03-09T16:14:47.573 INFO:tasks.workunit.client.0.vm03.stdout:1/671: stat d4/d6/f9 0 2026-03-09T16:14:47.576 INFO:tasks.workunit.client.0.vm03.stdout:2/766: dwrite db/d12/d2a/d61/f74 [0,4194304] 0 2026-03-09T16:14:47.583 INFO:tasks.workunit.client.0.vm03.stdout:2/767: dwrite db/d12/d2a/d99/de7/df9/d64/dbd/f6b [0,4194304] 0 2026-03-09T16:14:47.587 INFO:tasks.workunit.client.0.vm03.stdout:9/810: write d2/d4/d11/d12/d28/fc5 [6487692,98019] 0 2026-03-09T16:14:47.588 INFO:tasks.workunit.client.0.vm03.stdout:7/705: mkdir d4/da/d45/d51/dea 0 2026-03-09T16:14:47.599 INFO:tasks.workunit.client.0.vm03.stdout:5/819: chown d2/d7/de/f48 430 1 2026-03-09T16:14:47.605 INFO:tasks.workunit.client.0.vm03.stdout:3/737: dwrite d5/d1e/d42/f20 [4194304,4194304] 0 2026-03-09T16:14:47.616 INFO:tasks.workunit.client.0.vm03.stdout:9/811: creat 
d2/d4/d11/d12/dc7/dee/dc2/de9/ffb x:0 0 0 2026-03-09T16:14:47.616 INFO:tasks.workunit.client.0.vm03.stdout:0/765: fsync d0/d7/d3e/d57/d5a/d52/d9f/f109 0 2026-03-09T16:14:47.616 INFO:tasks.workunit.client.0.vm03.stdout:8/789: getdents da/d10/d28/d4f/daf/dee 0 2026-03-09T16:14:47.616 INFO:tasks.workunit.client.0.vm03.stdout:5/820: mkdir d2/d7/d8/d16/d5c/dfc/d106/d3b/d114 0 2026-03-09T16:14:47.616 INFO:tasks.workunit.client.0.vm03.stdout:7/706: mkdir d4/da/dbf/deb 0 2026-03-09T16:14:47.620 INFO:tasks.workunit.client.0.vm03.stdout:3/738: mknod d5/d53/d6c/d79/d91/cd8 0 2026-03-09T16:14:47.623 INFO:tasks.workunit.client.0.vm03.stdout:0/766: creat d0/d7/d3e/d57/d5a/d47/f10a x:0 0 0 2026-03-09T16:14:47.624 INFO:tasks.workunit.client.0.vm03.stdout:0/767: chown d0/d7/d3e/d57/d5a/d5f/db2/d8e 12265 1 2026-03-09T16:14:47.626 INFO:tasks.workunit.client.0.vm03.stdout:0/768: dwrite d0/da/d5c/f33 [0,4194304] 0 2026-03-09T16:14:47.633 INFO:tasks.workunit.client.0.vm03.stdout:0/769: dwrite d0/fde [0,4194304] 0 2026-03-09T16:14:47.636 INFO:tasks.workunit.client.0.vm03.stdout:1/672: link d4/db/d8b/la3 d4/db/d59/ld9 0 2026-03-09T16:14:47.637 INFO:tasks.workunit.client.0.vm03.stdout:8/790: truncate da/d10/d28/f8c 1321989 0 2026-03-09T16:14:47.650 INFO:tasks.workunit.client.0.vm03.stdout:8/791: dread da/d10/d28/fd3 [0,4194304] 0 2026-03-09T16:14:47.653 INFO:tasks.workunit.client.0.vm03.stdout:3/739: mkdir d5/d53/d6c/d79/dd9 0 2026-03-09T16:14:47.654 INFO:tasks.workunit.client.0.vm03.stdout:6/720: write d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/fb2 [546606,39545] 0 2026-03-09T16:14:47.660 INFO:tasks.workunit.client.0.vm03.stdout:2/768: write db/d12/d2a/d61/d79/f7f [7724055,42417] 0 2026-03-09T16:14:47.662 INFO:tasks.workunit.client.0.vm03.stdout:5/821: truncate d2/d7/d8/d24/d27/d43/d4b/fd1 367301 0 2026-03-09T16:14:47.669 INFO:tasks.workunit.client.0.vm03.stdout:4/773: dwrite d5/d56/f6a [0,4194304] 0 2026-03-09T16:14:47.674 INFO:tasks.workunit.client.0.vm03.stdout:3/740: dread d5/d1e/d42/d4c/f7d [0,4194304] 0 2026-03-09T16:14:47.676 INFO:tasks.workunit.client.0.vm03.stdout:3/741: chown d5/d2e/c81 7167 1 2026-03-09T16:14:47.685 INFO:tasks.workunit.client.0.vm03.stdout:8/792: fsync da/f4c 0 2026-03-09T16:14:47.693 INFO:tasks.workunit.client.0.vm03.stdout:9/812: dwrite d2/d4/d11/d12/d28/f2c [0,4194304] 0 2026-03-09T16:14:47.704 INFO:tasks.workunit.client.0.vm03.stdout:1/673: fsync d4/d6/d1d/d20/d23/f62 0 2026-03-09T16:14:47.708 INFO:tasks.workunit.client.0.vm03.stdout:6/721: dread d9/d14/f29 [4194304,4194304] 0 2026-03-09T16:14:47.709 INFO:tasks.workunit.client.0.vm03.stdout:2/769: rmdir db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4 39 2026-03-09T16:14:47.714 INFO:tasks.workunit.client.0.vm03.stdout:2/770: dwrite db/d12/d2a/d99/de7/df9/d64/ff6 [0,4194304] 0 2026-03-09T16:14:47.730 INFO:tasks.workunit.client.0.vm03.stdout:7/707: rmdir d4/da/d5d/dd8/d22/d24/d15/d71/d79 0 2026-03-09T16:14:47.736 INFO:tasks.workunit.client.0.vm03.stdout:1/674: creat d4/d39/d7f/fda x:0 0 0 2026-03-09T16:14:47.736 INFO:tasks.workunit.client.0.vm03.stdout:1/675: fsync d4/db/d8b/db2/fc6 0 2026-03-09T16:14:47.737 INFO:tasks.workunit.client.0.vm03.stdout:1/676: dread - d4/db/d8b/fb0 zero size 2026-03-09T16:14:47.738 INFO:tasks.workunit.client.0.vm03.stdout:6/722: mknod d9/d8e/ce1 0 2026-03-09T16:14:47.743 INFO:tasks.workunit.client.0.vm03.stdout:3/742: link d5/d53/fcc d5/d1e/d42/d55/d86/dae/fda 0 2026-03-09T16:14:47.745 INFO:tasks.workunit.client.0.vm03.stdout:7/708: mknod d4/da/d45/d51/d36/d66/cec 0 2026-03-09T16:14:47.747 
INFO:tasks.workunit.client.0.vm03.stdout:1/677: creat d4/d6/d3b/d63/fdb x:0 0 0 2026-03-09T16:14:47.747 INFO:tasks.workunit.client.0.vm03.stdout:9/813: sync 2026-03-09T16:14:47.748 INFO:tasks.workunit.client.0.vm03.stdout:1/678: write d4/d6/d1d/d20/fc1 [894395,105408] 0 2026-03-09T16:14:47.752 INFO:tasks.workunit.client.0.vm03.stdout:6/723: dread d9/d14/f3d [0,4194304] 0 2026-03-09T16:14:47.756 INFO:tasks.workunit.client.0.vm03.stdout:1/679: dwrite d4/d6/d3b/f36 [0,4194304] 0 2026-03-09T16:14:47.770 INFO:tasks.workunit.client.0.vm03.stdout:4/774: link d5/c42 d5/db/d25/d8b/ce8 0 2026-03-09T16:14:47.772 INFO:tasks.workunit.client.0.vm03.stdout:8/793: link da/d10/d28/l3f da/d6c/dfa/d105/l108 0 2026-03-09T16:14:47.773 INFO:tasks.workunit.client.0.vm03.stdout:8/794: chown da/d10/d28/d4f/d68/d80/ccf 122345240 1 2026-03-09T16:14:47.775 INFO:tasks.workunit.client.0.vm03.stdout:7/709: fsync d4/da/d5d/dd8/d22/d24/f41 0 2026-03-09T16:14:47.779 INFO:tasks.workunit.client.0.vm03.stdout:9/814: creat d2/d4/d11/d29/d2a/d38/db6/ffc x:0 0 0 2026-03-09T16:14:47.779 INFO:tasks.workunit.client.0.vm03.stdout:9/815: chown d2/d4/d11/d12/d28/c8e 1913 1 2026-03-09T16:14:47.785 INFO:tasks.workunit.client.0.vm03.stdout:7/710: sync 2026-03-09T16:14:47.785 INFO:tasks.workunit.client.0.vm03.stdout:7/711: write d4/da/f20 [1008537,119365] 0 2026-03-09T16:14:47.796 INFO:tasks.workunit.client.0.vm03.stdout:1/680: mknod d4/d6/d1d/d20/d5f/cdc 0 2026-03-09T16:14:47.803 INFO:tasks.workunit.client.0.vm03.stdout:8/795: truncate da/d10/d28/d4f/d68/f8f 2621334 0 2026-03-09T16:14:47.806 INFO:tasks.workunit.client.0.vm03.stdout:8/796: dwrite da/d10/fa4 [4194304,4194304] 0 2026-03-09T16:14:47.817 INFO:tasks.workunit.client.0.vm03.stdout:0/770: dwrite d0/f29 [0,4194304] 0 2026-03-09T16:14:47.838 INFO:tasks.workunit.client.0.vm03.stdout:9/816: symlink d2/d4/d11/d29/d2a/d46/lfd 0 2026-03-09T16:14:47.838 INFO:tasks.workunit.client.0.vm03.stdout:5/822: rename d2/d7/d8 to d2/d7/d115 0 2026-03-09T16:14:47.851 INFO:tasks.workunit.client.0.vm03.stdout:6/724: mknod d9/d42/d45/d50/ce2 0 2026-03-09T16:14:47.857 INFO:tasks.workunit.client.0.vm03.stdout:9/817: sync 2026-03-09T16:14:47.861 INFO:tasks.workunit.client.0.vm03.stdout:4/775: truncate d5/d17/f2b 2610384 0 2026-03-09T16:14:47.861 INFO:tasks.workunit.client.0.vm03.stdout:4/776: stat d5/db/d25/d31/d4d/d5b/d72/dcb 0 2026-03-09T16:14:47.862 INFO:tasks.workunit.client.0.vm03.stdout:4/777: read d5/d17/f83 [257931,14020] 0 2026-03-09T16:14:47.867 INFO:tasks.workunit.client.0.vm03.stdout:8/797: dread - da/db/f9e zero size 2026-03-09T16:14:47.870 INFO:tasks.workunit.client.0.vm03.stdout:8/798: dwrite da/d32/ff5 [0,4194304] 0 2026-03-09T16:14:47.872 INFO:tasks.workunit.client.0.vm03.stdout:8/799: chown da/d10/f33 100133985 1 2026-03-09T16:14:47.881 INFO:tasks.workunit.client.0.vm03.stdout:0/771: dread d0/d7/f8 [0,4194304] 0 2026-03-09T16:14:47.886 INFO:tasks.workunit.client.0.vm03.stdout:5/823: creat d2/d7/d115/d16/d5c/dfc/f116 x:0 0 0 2026-03-09T16:14:47.889 INFO:tasks.workunit.client.0.vm03.stdout:0/772: dread d0/d7/d3e/d57/d5a/d5f/db2/f5e [0,4194304] 0 2026-03-09T16:14:47.903 INFO:tasks.workunit.client.0.vm03.stdout:1/681: mknod d4/d31/d5c/cdd 0 2026-03-09T16:14:47.903 INFO:tasks.workunit.client.0.vm03.stdout:9/818: truncate d2/d4/d1f/d83/fb9 434933 0 2026-03-09T16:14:47.904 INFO:tasks.workunit.client.0.vm03.stdout:9/819: read - d2/d4/d11/d12/dc7/dee/dc2/de9/ff6 zero size 2026-03-09T16:14:47.905 INFO:tasks.workunit.client.0.vm03.stdout:2/771: dwrite 
db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4/fdc [0,4194304] 0 2026-03-09T16:14:47.917 INFO:tasks.workunit.client.0.vm03.stdout:3/743: creat d5/d6d/fdb x:0 0 0 2026-03-09T16:14:47.923 INFO:tasks.workunit.client.0.vm03.stdout:4/778: fsync d5/db/d25/d31/d4d/d5b/d72/f94 0 2026-03-09T16:14:47.925 INFO:tasks.workunit.client.0.vm03.stdout:4/779: readlink d5/dd/d1f/lcd 0 2026-03-09T16:14:47.925 INFO:tasks.workunit.client.0.vm03.stdout:4/780: read d5/dd/d1f/d5f/f98 [3192025,43388] 0 2026-03-09T16:14:47.931 INFO:tasks.workunit.client.0.vm03.stdout:4/781: dread d5/db/d25/d8b/da8/f62 [0,4194304] 0 2026-03-09T16:14:47.935 INFO:tasks.workunit.client.0.vm03.stdout:8/800: mkdir da/db/da8/db8/d109 0 2026-03-09T16:14:47.935 INFO:tasks.workunit.client.0.vm03.stdout:8/801: chown da/d32/fa6 12019 1 2026-03-09T16:14:47.948 INFO:tasks.workunit.client.0.vm03.stdout:5/824: creat d2/d7/d115/d16/d5c/dfc/d106/d52/f117 x:0 0 0 2026-03-09T16:14:47.973 INFO:tasks.workunit.client.0.vm03.stdout:7/712: creat d4/da/d45/fed x:0 0 0 2026-03-09T16:14:47.977 INFO:tasks.workunit.client.0.vm03.stdout:7/713: dwrite d4/d2d/fd5 [0,4194304] 0 2026-03-09T16:14:47.986 INFO:tasks.workunit.client.0.vm03.stdout:6/725: symlink d9/d42/d45/d65/dbf/le3 0 2026-03-09T16:14:47.992 INFO:tasks.workunit.client.0.vm03.stdout:0/773: write d0/da/d7a/d98/f9d [5053918,23494] 0 2026-03-09T16:14:47.998 INFO:tasks.workunit.client.0.vm03.stdout:0/774: dread d0/d7/f8 [0,4194304] 0 2026-03-09T16:14:47.999 INFO:tasks.workunit.client.0.vm03.stdout:2/772: read db/d12/d2a/d99/de7/df9/d64/f80 [4049952,12243] 0 2026-03-09T16:14:47.999 INFO:tasks.workunit.client.0.vm03.stdout:3/744: rename d5/d53/c64 to d5/d53/d6c/d79/cdc 0 2026-03-09T16:14:48.000 INFO:tasks.workunit.client.0.vm03.stdout:3/745: stat d5/d1e/d42/d55/d86/dae/fda 0 2026-03-09T16:14:48.001 INFO:tasks.workunit.client.0.vm03.stdout:3/746: write d5/d1e/d42/d4c/f7d [4991542,118193] 0 2026-03-09T16:14:48.002 INFO:tasks.workunit.client.0.vm03.stdout:3/747: chown d5/d44/d61 10880 1 2026-03-09T16:14:48.013 INFO:tasks.workunit.client.0.vm03.stdout:1/682: write d4/d6/d1d/d3d/f45 [3435347,112685] 0 2026-03-09T16:14:48.033 INFO:tasks.workunit.client.0.vm03.stdout:4/782: creat d5/dd/fe9 x:0 0 0 2026-03-09T16:14:48.034 INFO:tasks.workunit.client.0.vm03.stdout:4/783: chown d5/db/d25/d31/d4d/d5b/le6 75 1 2026-03-09T16:14:48.048 INFO:tasks.workunit.client.0.vm03.stdout:5/825: write d2/d7/de/d11/d19/d29/d90/fac [1421000,12005] 0 2026-03-09T16:14:48.057 INFO:tasks.workunit.client.0.vm03.stdout:7/714: symlink d4/da/d45/d51/d36/d66/lee 0 2026-03-09T16:14:48.058 INFO:tasks.workunit.client.0.vm03.stdout:7/715: write d4/da/d45/d51/f91 [4042622,3444] 0 2026-03-09T16:14:48.064 INFO:tasks.workunit.client.0.vm03.stdout:6/726: chown d9/d8e/f9f 1 1 2026-03-09T16:14:48.080 INFO:tasks.workunit.client.0.vm03.stdout:6/727: sync 2026-03-09T16:14:48.084 INFO:tasks.workunit.client.0.vm03.stdout:1/683: unlink d4/c53 0 2026-03-09T16:14:48.099 INFO:tasks.workunit.client.0.vm03.stdout:2/773: dwrite db/d12/d2a/d61/f9b [0,4194304] 0 2026-03-09T16:14:48.101 INFO:tasks.workunit.client.0.vm03.stdout:2/774: stat db/d12/d2a/d99/de7/df9/d64/ff6 0 2026-03-09T16:14:48.135 INFO:tasks.workunit.client.0.vm03.stdout:4/784: unlink d5/db/d25/d8b/da8/fca 0 2026-03-09T16:14:48.159 INFO:tasks.workunit.client.0.vm03.stdout:5/826: write d2/d7/de/d11/d19/d31/f99 [7899668,15543] 0 2026-03-09T16:14:48.164 INFO:tasks.workunit.client.0.vm03.stdout:0/775: truncate d0/d7/d3e/d57/d5a/d52/d9f/fe3 755 0 2026-03-09T16:14:48.168 
INFO:tasks.workunit.client.0.vm03.stdout:6/728: mkdir d9/d42/d45/d65/dbf/dc9/de4 0 2026-03-09T16:14:48.187 INFO:tasks.workunit.client.0.vm03.stdout:2/775: truncate db/f23 2104350 0 2026-03-09T16:14:48.188 INFO:tasks.workunit.client.0.vm03.stdout:2/776: chown db/d12/d2a/d99/de7/df9/d64/dbd/da0/fcb 106392746 1 2026-03-09T16:14:48.196 INFO:tasks.workunit.client.0.vm03.stdout:8/802: rmdir da/db/da8/db8/d109 0 2026-03-09T16:14:48.198 INFO:tasks.workunit.client.0.vm03.stdout:1/684: write d4/d6/d3b/f35 [3214226,47580] 0 2026-03-09T16:14:48.199 INFO:tasks.workunit.client.0.vm03.stdout:7/716: symlink d4/da/d5d/dd8/d22/d24/d16/d3e/db5/de5/lef 0 2026-03-09T16:14:48.208 INFO:tasks.workunit.client.0.vm03.stdout:9/820: getdents d2/d4/d11/d12/dc7/dcc 0 2026-03-09T16:14:48.213 INFO:tasks.workunit.client.0.vm03.stdout:9/821: truncate d2/d4/d11/d12/dc7/dee/dce/fde 897114 0 2026-03-09T16:14:48.217 INFO:tasks.workunit.client.0.vm03.stdout:7/717: dread d4/da/f42 [0,4194304] 0 2026-03-09T16:14:48.218 INFO:tasks.workunit.client.0.vm03.stdout:7/718: chown d4/da/d5d/db0/l7c 264 1 2026-03-09T16:14:48.226 INFO:tasks.workunit.client.0.vm03.stdout:6/729: readlink d9/lb1 0 2026-03-09T16:14:48.246 INFO:tasks.workunit.client.0.vm03.stdout:8/803: mknod da/d10/d28/c10a 0 2026-03-09T16:14:48.252 INFO:tasks.workunit.client.0.vm03.stdout:1/685: creat d4/d7b/fde x:0 0 0 2026-03-09T16:14:48.254 INFO:tasks.workunit.client.0.vm03.stdout:9/822: creat d2/d4/d11/d12/d28/ffe x:0 0 0 2026-03-09T16:14:48.257 INFO:tasks.workunit.client.0.vm03.stdout:3/748: getdents d5 0 2026-03-09T16:14:48.272 INFO:tasks.workunit.client.0.vm03.stdout:2/777: creat db/d12/d2a/d99/de7/df9/d64/dbd/dec/dfa/f10e x:0 0 0 2026-03-09T16:14:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:48 vm05.local ceph-mon[58702]: Standby manager daemon vm05.dygxfv started 2026-03-09T16:14:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:48 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.105:0/2400477097' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/crt"}]: dispatch 2026-03-09T16:14:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:48 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.105:0/2400477097' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T16:14:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:48 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.105:0/2400477097' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/key"}]: dispatch 2026-03-09T16:14:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:48 vm05.local ceph-mon[58702]: from='mgr.? 
192.168.123.105:0/2400477097' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T16:14:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:48 vm05.local ceph-mon[58702]: Manager daemon vm03.gbgzmu is now available 2026-03-09T16:14:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:48 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:14:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:48 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:48 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/mirror_snapshot_schedule"}]: dispatch 2026-03-09T16:14:48.277 INFO:tasks.workunit.client.0.vm03.stdout:8/804: mkdir da/d1d/d10b 0 2026-03-09T16:14:48.277 INFO:tasks.workunit.client.0.vm03.stdout:8/805: read - da/d10/d28/d4f/d68/fc1 zero size 2026-03-09T16:14:48.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:48 vm03.local ceph-mon[51019]: Standby manager daemon vm05.dygxfv started 2026-03-09T16:14:48.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:48 vm03.local ceph-mon[51019]: from='mgr.? 192.168.123.105:0/2400477097' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/crt"}]: dispatch 2026-03-09T16:14:48.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:48 vm03.local ceph-mon[51019]: from='mgr.? 192.168.123.105:0/2400477097' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T16:14:48.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:48 vm03.local ceph-mon[51019]: from='mgr.? 192.168.123.105:0/2400477097' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/key"}]: dispatch 2026-03-09T16:14:48.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:48 vm03.local ceph-mon[51019]: from='mgr.? 
192.168.123.105:0/2400477097' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T16:14:48.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:48 vm03.local ceph-mon[51019]: Manager daemon vm03.gbgzmu is now available 2026-03-09T16:14:48.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:48 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:14:48.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:48 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:48.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:48 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/mirror_snapshot_schedule"}]: dispatch 2026-03-09T16:14:48.282 INFO:tasks.workunit.client.0.vm03.stdout:5/827: creat d2/f118 x:0 0 0 2026-03-09T16:14:48.297 INFO:tasks.workunit.client.0.vm03.stdout:9/823: fsync d2/de/f85 0 2026-03-09T16:14:48.304 INFO:tasks.workunit.client.0.vm03.stdout:1/686: dwrite d4/d6/d1d/d20/d23/f9f [0,4194304] 0 2026-03-09T16:14:48.306 INFO:tasks.workunit.client.0.vm03.stdout:3/749: truncate d5/d44/f56 2114118 0 2026-03-09T16:14:48.315 INFO:tasks.workunit.client.0.vm03.stdout:1/687: dread d4/d6/d1d/d20/fcc [0,4194304] 0 2026-03-09T16:14:48.328 INFO:tasks.workunit.client.0.vm03.stdout:7/719: mknod d4/da/d5d/dd8/d22/d24/d16/d3e/db5/dd4/cf0 0 2026-03-09T16:14:48.339 INFO:tasks.workunit.client.0.vm03.stdout:7/720: dread d4/d2d/d4b/f4c [0,4194304] 0 2026-03-09T16:14:48.343 INFO:tasks.workunit.client.0.vm03.stdout:4/785: getdents d5/dd/d1f/d5f 0 2026-03-09T16:14:48.357 INFO:tasks.workunit.client.0.vm03.stdout:2/778: creat db/d12/d2a/d61/dbe/f10f x:0 0 0 2026-03-09T16:14:48.379 INFO:tasks.workunit.client.0.vm03.stdout:5/828: mkdir d2/d75/d119 0 2026-03-09T16:14:48.391 INFO:tasks.workunit.client.0.vm03.stdout:9/824: write d2/d4/d11/d29/d2a/d38/fb4 [2177053,116049] 0 2026-03-09T16:14:48.392 INFO:tasks.workunit.client.0.vm03.stdout:9/825: chown d2/d4/d11/d12 954 1 2026-03-09T16:14:48.412 INFO:tasks.workunit.client.0.vm03.stdout:1/688: mknod d4/d6/d1d/d3d/cdf 0 2026-03-09T16:14:48.419 INFO:tasks.workunit.client.0.vm03.stdout:6/730: mkdir d9/d42/d45/d50/d80/d8a/dc1/dd4/de5 0 2026-03-09T16:14:48.421 INFO:tasks.workunit.client.0.vm03.stdout:1/689: write d4/d7b/fde [838048,19395] 0 2026-03-09T16:14:48.421 INFO:tasks.workunit.client.0.vm03.stdout:7/721: creat d4/da/d5d/dd8/d22/d24/d16/d6e/d7e/ff1 x:0 0 0 2026-03-09T16:14:48.450 INFO:tasks.workunit.client.0.vm03.stdout:2/779: unlink db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/c98 0 2026-03-09T16:14:48.513 INFO:tasks.workunit.client.0.vm03.stdout:0/776: link d0/d7/d3e/d57/d5a/d52/d9f/fe3 d0/da/d5c/f10b 0 2026-03-09T16:14:48.517 INFO:tasks.workunit.client.0.vm03.stdout:6/731: chown d9/d84/c9b 1 1 2026-03-09T16:14:48.521 INFO:tasks.workunit.client.0.vm03.stdout:7/722: mkdir d4/da/d5d/dd8/d22/d24/d16/d3e/db5/de5/df2 0 2026-03-09T16:14:48.521 INFO:tasks.workunit.client.0.vm03.stdout:7/723: chown d4/da/d45/c87 17698 1 2026-03-09T16:14:48.521 INFO:tasks.workunit.client.0.vm03.stdout:4/786: mknod d5/db/d25/d31/cea 0 2026-03-09T16:14:48.524 INFO:tasks.workunit.client.0.vm03.stdout:8/806: creat da/d6c/f10c x:0 0 0 2026-03-09T16:14:48.526 
INFO:tasks.workunit.client.0.vm03.stdout:2/780: fdatasync db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/fa3 0 2026-03-09T16:14:48.526 INFO:tasks.workunit.client.0.vm03.stdout:5/829: creat d2/d75/d119/f11a x:0 0 0 2026-03-09T16:14:48.527 INFO:tasks.workunit.client.0.vm03.stdout:5/830: read - d2/d7/d115/d16/d5c/ff1 zero size 2026-03-09T16:14:48.529 INFO:tasks.workunit.client.0.vm03.stdout:0/777: fsync d0/f4e 0 2026-03-09T16:14:48.529 INFO:tasks.workunit.client.0.vm03.stdout:0/778: readlink d0/da/d5c/l49 0 2026-03-09T16:14:48.530 INFO:tasks.workunit.client.0.vm03.stdout:0/779: readlink d0/d7/d75/le8 0 2026-03-09T16:14:48.535 INFO:tasks.workunit.client.0.vm03.stdout:1/690: rename d4/d31/d5c/da8/c4d to d4/db/d8b/db2/ce0 0 2026-03-09T16:14:48.546 INFO:tasks.workunit.client.0.vm03.stdout:4/787: mkdir d5/dd/dd5/deb 0 2026-03-09T16:14:48.547 INFO:tasks.workunit.client.0.vm03.stdout:8/807: mknod da/d32/db5/c10d 0 2026-03-09T16:14:48.547 INFO:tasks.workunit.client.0.vm03.stdout:3/750: getdents d5/d2e 0 2026-03-09T16:14:48.548 INFO:tasks.workunit.client.0.vm03.stdout:6/732: symlink d9/d14/da5/dd8/le6 0 2026-03-09T16:14:48.548 INFO:tasks.workunit.client.0.vm03.stdout:0/780: creat d0/d7/d3e/d57/d5a/d47/f10c x:0 0 0 2026-03-09T16:14:48.548 INFO:tasks.workunit.client.0.vm03.stdout:1/691: dread - d4/d31/f81 zero size 2026-03-09T16:14:48.549 INFO:tasks.workunit.client.0.vm03.stdout:5/831: sync 2026-03-09T16:14:48.550 INFO:tasks.workunit.client.0.vm03.stdout:0/781: dread - d0/d7/d3e/d57/d5a/d5f/db2/d8e/fcc zero size 2026-03-09T16:14:48.551 INFO:tasks.workunit.client.0.vm03.stdout:5/832: write d2/d7/de/ffd [435153,35696] 0 2026-03-09T16:14:48.556 INFO:tasks.workunit.client.0.vm03.stdout:2/781: mkdir db/d12/da5/dc2/d110 0 2026-03-09T16:14:48.563 INFO:tasks.workunit.client.0.vm03.stdout:0/782: dwrite d0/fde [0,4194304] 0 2026-03-09T16:14:48.572 INFO:tasks.workunit.client.0.vm03.stdout:9/826: dread d2/d54/d7d/d8f/dad/def/f9f [0,4194304] 0 2026-03-09T16:14:48.572 INFO:tasks.workunit.client.0.vm03.stdout:0/783: sync 2026-03-09T16:14:48.574 INFO:tasks.workunit.client.0.vm03.stdout:9/827: sync 2026-03-09T16:14:48.583 INFO:tasks.workunit.client.0.vm03.stdout:5/833: dread d2/d7/d1a/d1c/d6c/f79 [0,4194304] 0 2026-03-09T16:14:48.595 INFO:tasks.workunit.client.0.vm03.stdout:0/784: dread d0/da/d5c/f39 [0,4194304] 0 2026-03-09T16:14:48.602 INFO:tasks.workunit.client.0.vm03.stdout:4/788: dread d5/d17/d44/f4a [8388608,4194304] 0 2026-03-09T16:14:48.611 INFO:tasks.workunit.client.0.vm03.stdout:0/785: dread d0/d7/d3e/d95/f99 [0,4194304] 0 2026-03-09T16:14:48.621 INFO:tasks.workunit.client.0.vm03.stdout:6/733: creat d9/d42/d45/d50/d80/d8a/d9c/fe7 x:0 0 0 2026-03-09T16:14:48.625 INFO:tasks.workunit.client.0.vm03.stdout:3/751: mkdir d5/d1e/d42/ddd 0 2026-03-09T16:14:48.637 INFO:tasks.workunit.client.0.vm03.stdout:5/834: creat d2/d7/de/d11/d19/d31/d35/f11b x:0 0 0 2026-03-09T16:14:48.637 INFO:tasks.workunit.client.0.vm03.stdout:9/828: dread d2/d4/d11/d29/d2a/d46/f81 [0,4194304] 0 2026-03-09T16:14:48.646 INFO:tasks.workunit.client.0.vm03.stdout:4/789: stat d5/dd/d1f/fe3 0 2026-03-09T16:14:48.652 INFO:tasks.workunit.client.0.vm03.stdout:1/692: mknod d4/d31/d5c/ce1 0 2026-03-09T16:14:48.655 INFO:tasks.workunit.client.0.vm03.stdout:7/724: dwrite d4/da/d45/f63 [0,4194304] 0 2026-03-09T16:14:48.694 INFO:tasks.workunit.client.0.vm03.stdout:2/782: getdents db/d12/da5/dc2/dc9 0 2026-03-09T16:14:48.715 INFO:tasks.workunit.client.0.vm03.stdout:4/790: creat d5/d17/d44/fec x:0 0 0 2026-03-09T16:14:48.719 
INFO:tasks.workunit.client.0.vm03.stdout:4/791: dwrite d5/db/d25/d31/d4d/d5b/d72/d82/fe7 [0,4194304] 0 2026-03-09T16:14:48.856 INFO:tasks.workunit.client.0.vm03.stdout:8/808: write da/d6c/dc4/fe6 [79267,55642] 0 2026-03-09T16:14:48.860 INFO:tasks.workunit.client.0.vm03.stdout:6/734: write d9/d84/f91 [170784,51781] 0 2026-03-09T16:14:48.871 INFO:tasks.workunit.client.0.vm03.stdout:3/752: dwrite d5/d1e/d42/f74 [0,4194304] 0 2026-03-09T16:14:48.877 INFO:tasks.workunit.client.0.vm03.stdout:9/829: write d2/d4/d11/d12/dc7/dee/fe4 [1196004,29644] 0 2026-03-09T16:14:48.877 INFO:tasks.workunit.client.0.vm03.stdout:5/835: write d2/d7/de/d11/f80 [3328052,37356] 0 2026-03-09T16:14:48.917 INFO:tasks.workunit.client.0.vm03.stdout:3/753: dread d5/f33 [0,4194304] 0 2026-03-09T16:14:48.924 INFO:tasks.workunit.client.0.vm03.stdout:3/754: dread d5/d1e/d42/d55/d86/dae/fda [0,4194304] 0 2026-03-09T16:14:48.940 INFO:tasks.workunit.client.0.vm03.stdout:2/783: creat db/d12/da5/de2/f111 x:0 0 0 2026-03-09T16:14:48.941 INFO:tasks.workunit.client.0.vm03.stdout:1/693: symlink d4/d6/d3b/d6b/da5/dc3/le2 0 2026-03-09T16:14:48.944 INFO:tasks.workunit.client.0.vm03.stdout:7/725: creat d4/da/d45/d51/dea/ff3 x:0 0 0 2026-03-09T16:14:48.944 INFO:tasks.workunit.client.0.vm03.stdout:1/694: readlink d4/db/d59/l68 0 2026-03-09T16:14:48.949 INFO:tasks.workunit.client.0.vm03.stdout:1/695: chown d4/d6/d1d/d20/d93/f48 22650413 1 2026-03-09T16:14:48.950 INFO:tasks.workunit.client.0.vm03.stdout:0/786: rmdir d0/d7/d3e/d57/d5a/d5f/db2/dab/df4 0 2026-03-09T16:14:48.953 INFO:tasks.workunit.client.0.vm03.stdout:3/755: dread d5/d53/d88/dd7/fc7 [0,4194304] 0 2026-03-09T16:14:48.962 INFO:tasks.workunit.client.0.vm03.stdout:6/735: unlink d9/d42/d45/f4d 0 2026-03-09T16:14:48.965 INFO:tasks.workunit.client.0.vm03.stdout:3/756: dwrite d5/d1e/d42/f74 [4194304,4194304] 0 2026-03-09T16:14:48.992 INFO:tasks.workunit.client.0.vm03.stdout:9/830: creat d2/d4/d1f/fff x:0 0 0 2026-03-09T16:14:48.999 INFO:tasks.workunit.client.0.vm03.stdout:9/831: chown d2/d54/d7d/d8f/dad/def/l24 924343 1 2026-03-09T16:14:49.024 INFO:tasks.workunit.client.0.vm03.stdout:1/696: mkdir d4/d6/d3b/d6b/d25/d50/de3 0 2026-03-09T16:14:49.024 INFO:tasks.workunit.client.0.vm03.stdout:9/832: dread - d2/de/d88/d7a/fc8 zero size 2026-03-09T16:14:49.025 INFO:tasks.workunit.client.0.vm03.stdout:1/697: chown d4/db/d59 120905 1 2026-03-09T16:14:49.032 INFO:tasks.workunit.client.0.vm03.stdout:8/809: rename da/d6c/dfa to da/d10/d28/d4f/d85/d9c/d10e 0 2026-03-09T16:14:49.035 INFO:tasks.workunit.client.0.vm03.stdout:4/792: creat d5/fed x:0 0 0 2026-03-09T16:14:49.037 INFO:tasks.workunit.client.0.vm03.stdout:8/810: dwrite da/d6c/dc4/fe6 [0,4194304] 0 2026-03-09T16:14:49.068 INFO:tasks.workunit.client.0.vm03.stdout:1/698: creat d4/db/d59/fe4 x:0 0 0 2026-03-09T16:14:49.069 INFO:tasks.workunit.client.0.vm03.stdout:1/699: write d4/d31/d5c/da8/fd0 [864236,7158] 0 2026-03-09T16:14:49.074 INFO:tasks.workunit.client.0.vm03.stdout:5/836: rename d2/d7/d115/d24/d27/c34 to d2/d7/de/c11c 0 2026-03-09T16:14:49.091 INFO:tasks.workunit.client.0.vm03.stdout:8/811: dread da/d10/d28/d4f/d68/ddc/f73 [0,4194304] 0 2026-03-09T16:14:49.094 INFO:tasks.workunit.client.0.vm03.stdout:9/833: mknod d2/d4/d11/d29/d2a/c100 0 2026-03-09T16:14:49.095 INFO:tasks.workunit.client.0.vm03.stdout:5/837: fdatasync d2/f5a 0 2026-03-09T16:14:49.100 INFO:tasks.workunit.client.0.vm03.stdout:6/736: rename d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f43 to d9/d42/d45/d50/d80/d8a/dc1/fe8 0 2026-03-09T16:14:49.107 
INFO:tasks.workunit.client.0.vm03.stdout:0/787: link d0/da/fbe d0/d7/d3e/d57/d5a/f10d 0 2026-03-09T16:14:49.108 INFO:tasks.workunit.client.0.vm03.stdout:1/700: sync 2026-03-09T16:14:49.108 INFO:tasks.workunit.client.0.vm03.stdout:0/788: truncate d0/da/d5c/fae 4869192 0 2026-03-09T16:14:49.109 INFO:tasks.workunit.client.0.vm03.stdout:9/834: sync 2026-03-09T16:14:49.113 INFO:tasks.workunit.client.0.vm03.stdout:3/757: getdents d5/d6d 0 2026-03-09T16:14:49.113 INFO:tasks.workunit.client.0.vm03.stdout:3/758: read - d5/d1e/d42/d8b/fd0 zero size 2026-03-09T16:14:49.121 INFO:tasks.workunit.client.0.vm03.stdout:8/812: unlink da/d6c/d7a/f7f 0 2026-03-09T16:14:49.129 INFO:tasks.workunit.client.0.vm03.stdout:5/838: mknod d2/d7/d115/c11d 0 2026-03-09T16:14:49.138 INFO:tasks.workunit.client.0.vm03.stdout:4/793: rename d5/db/d25/d31/d4d/d5b/d72/dcb to d5/db/d25/d9f/dc7/dee 0 2026-03-09T16:14:49.140 INFO:tasks.workunit.client.0.vm03.stdout:2/784: truncate db/d12/d2a/d61/f4c 2938430 0 2026-03-09T16:14:49.144 INFO:tasks.workunit.client.0.vm03.stdout:7/726: dwrite d4/f8d [0,4194304] 0 2026-03-09T16:14:49.146 INFO:tasks.workunit.client.0.vm03.stdout:7/727: fdatasync d4/d2d/fd5 0 2026-03-09T16:14:49.151 INFO:tasks.workunit.client.0.vm03.stdout:5/839: creat d2/d7/de/da9/f11e x:0 0 0 2026-03-09T16:14:49.158 INFO:tasks.workunit.client.0.vm03.stdout:6/737: rename d9/d42/d45/d50/d80/d8a/d9c/f8c to d9/d42/d45/d50/d80/d8a/fe9 0 2026-03-09T16:14:49.163 INFO:tasks.workunit.client.0.vm03.stdout:2/785: mkdir db/d12/d2a/d99/de7/df9/d64/dbd/da0/d112 0 2026-03-09T16:14:49.163 INFO:tasks.workunit.client.0.vm03.stdout:2/786: chown db/d12/d2a/d99/de7/df9/lfb 2255896 1 2026-03-09T16:14:49.167 INFO:tasks.workunit.client.0.vm03.stdout:0/789: mknod d0/da/d5c/c10e 0 2026-03-09T16:14:49.178 INFO:tasks.workunit.client.0.vm03.stdout:8/813: write da/d10/f1f [2130113,38014] 0 2026-03-09T16:14:49.181 INFO:tasks.workunit.client.0.vm03.stdout:8/814: dwrite da/d10/d28/f57 [0,4194304] 0 2026-03-09T16:14:49.183 INFO:tasks.workunit.client.0.vm03.stdout:8/815: chown da/d1d/d3b 0 1 2026-03-09T16:14:49.190 INFO:tasks.workunit.client.0.vm03.stdout:3/759: creat d5/d1e/d42/d55/d86/dae/fde x:0 0 0 2026-03-09T16:14:49.191 INFO:tasks.workunit.client.0.vm03.stdout:3/760: readlink d5/d1e/l45 0 2026-03-09T16:14:49.200 INFO:tasks.workunit.client.0.vm03.stdout:6/738: creat d9/d42/d45/d50/d80/d8a/dc1/dd4/fea x:0 0 0 2026-03-09T16:14:49.215 INFO:tasks.workunit.client.0.vm03.stdout:2/787: dwrite db/d12/d2a/f8d [0,4194304] 0 2026-03-09T16:14:49.219 INFO:tasks.workunit.client.0.vm03.stdout:2/788: chown db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/cba 3682 1 2026-03-09T16:14:49.237 INFO:tasks.workunit.client.0.vm03.stdout:7/728: dwrite d4/da/d5d/dd8/d22/d24/f41 [0,4194304] 0 2026-03-09T16:14:49.239 INFO:tasks.workunit.client.0.vm03.stdout:7/729: chown d4/da/d5d/db0/da9/db8/ddf 1 1 2026-03-09T16:14:49.240 INFO:tasks.workunit.client.0.vm03.stdout:5/840: write d2/f5a [109898,48285] 0 2026-03-09T16:14:49.243 INFO:tasks.workunit.client.0.vm03.stdout:5/841: dwrite d2/d7/d115/d24/d27/fc3 [0,4194304] 0 2026-03-09T16:14:49.244 INFO:tasks.workunit.client.0.vm03.stdout:9/835: creat d2/f101 x:0 0 0 2026-03-09T16:14:49.245 INFO:tasks.workunit.client.0.vm03.stdout:9/836: dread - d2/de/d88/d7a/fc8 zero size 2026-03-09T16:14:49.264 INFO:tasks.workunit.client.0.vm03.stdout:3/761: mkdir d5/d1e/d42/d8b/ddf 0 2026-03-09T16:14:49.272 INFO:tasks.workunit.client.0.vm03.stdout:0/790: dwrite d0/d7/d75/f62 [0,4194304] 0 2026-03-09T16:14:49.286 
INFO:tasks.workunit.client.0.vm03.stdout:6/739: rename d9/d42/d45/d50/d80/d8a/d9c/d97/da8/f7d to d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/feb 0 2026-03-09T16:14:49.288 INFO:tasks.workunit.client.0.vm03.stdout:3/762: dread d5/d6d/d6a/fa9 [0,4194304] 0 2026-03-09T16:14:49.300 INFO:tasks.workunit.client.0.vm03.stdout:6/740: dwrite d9/d42/d45/d50/d80/d8a/d9c/fe7 [0,4194304] 0 2026-03-09T16:14:49.315 INFO:tasks.workunit.client.0.vm03.stdout:2/789: rmdir db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/dda 39 2026-03-09T16:14:49.338 INFO:tasks.workunit.client.0.vm03.stdout:7/730: mkdir d4/da/d45/d51/d36/d66/df4 0 2026-03-09T16:14:49.341 INFO:tasks.workunit.client.0.vm03.stdout:7/731: write d4/d2d/d4b/fd6 [42082,82003] 0 2026-03-09T16:14:49.350 INFO:tasks.workunit.client.0.vm03.stdout:9/837: creat d2/d54/d7d/f102 x:0 0 0 2026-03-09T16:14:49.350 INFO:tasks.workunit.client.0.vm03.stdout:9/838: chown d2/d4/c30 1 1 2026-03-09T16:14:49.352 INFO:tasks.workunit.client.0.vm03.stdout:1/701: link d4/d6/d1d/d20/d93/c54 d4/db/ce5 0 2026-03-09T16:14:49.353 INFO:tasks.workunit.client.0.vm03.stdout:9/839: write d2/d4/d11/d29/d2a/d38/fb4 [1164086,121467] 0 2026-03-09T16:14:49.384 INFO:tasks.workunit.client.0.vm03.stdout:5/842: rename d2/d75/l8a to d2/d7/d115/d16/d5c/dfc/d106/d3b/l11f 0 2026-03-09T16:14:49.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/trash_purge_schedule"}]: dispatch 2026-03-09T16:14:49.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:49 vm03.local ceph-mon[51019]: mgrmap e27: vm03.gbgzmu(active, since 1.11187s), standbys: vm05.dygxfv 2026-03-09T16:14:49.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr metadata", "who": "vm05.dygxfv", "id": "vm05.dygxfv"}]: dispatch 2026-03-09T16:14:49.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:49 vm03.local ceph-mon[51019]: pgmap v3: 65 pgs: 65 active+clean; 1.7 GiB data, 6.0 GiB used, 114 GiB / 120 GiB avail 2026-03-09T16:14:49.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:49 vm03.local ceph-mon[51019]: pgmap v4: 65 pgs: 65 active+clean; 1.7 GiB data, 6.0 GiB used, 114 GiB / 120 GiB avail 2026-03-09T16:14:49.404 INFO:tasks.workunit.client.0.vm03.stdout:8/816: truncate da/d10/d28/f57 1338024 0 2026-03-09T16:14:49.405 INFO:tasks.workunit.client.0.vm03.stdout:4/794: getdents d5/db/d25/d31/d33 0 2026-03-09T16:14:49.405 INFO:tasks.workunit.client.0.vm03.stdout:6/741: fdatasync d9/d42/d45/d50/d80/d8a/fc5 0 2026-03-09T16:14:49.412 INFO:tasks.workunit.client.0.vm03.stdout:7/732: symlink d4/da/d5d/db0/d61/lf5 0 2026-03-09T16:14:49.426 INFO:tasks.workunit.client.0.vm03.stdout:1/702: fsync d4/d6/d1d/d20/d5f/f57 0 2026-03-09T16:14:49.437 INFO:tasks.workunit.client.0.vm03.stdout:2/790: dwrite db/d12/d2a/d61/f47 [0,4194304] 0 2026-03-09T16:14:49.494 INFO:tasks.workunit.client.0.vm03.stdout:0/791: rename d0/d7/d3e/d57/d5a/d5f/db2/dab/fc4 to d0/d7/d3e/d57/d5a/d82/dd2/f10f 0 2026-03-09T16:14:49.494 INFO:tasks.workunit.client.0.vm03.stdout:0/792: dread - d0/da/d7a/d98/ff9 zero size 2026-03-09T16:14:49.495 INFO:tasks.workunit.client.0.vm03.stdout:0/793: read d0/da/d5c/f33 [2049205,75686] 0 2026-03-09T16:14:49.499 INFO:tasks.workunit.client.0.vm03.stdout:5/843: creat d2/d7/d115/d24/d27/d43/f120 x:0 0 0 
2026-03-09T16:14:49.499 INFO:tasks.workunit.client.0.vm03.stdout:3/763: mkdir d5/d6d/db9/de0 0 2026-03-09T16:14:49.517 INFO:tasks.workunit.client.0.vm03.stdout:1/703: rmdir d4/d39/d70 39 2026-03-09T16:14:49.518 INFO:tasks.workunit.client.0.vm03.stdout:1/704: fdatasync d4/db/f7d 0 2026-03-09T16:14:49.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/trash_purge_schedule"}]: dispatch 2026-03-09T16:14:49.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:49 vm05.local ceph-mon[58702]: mgrmap e27: vm03.gbgzmu(active, since 1.11187s), standbys: vm05.dygxfv 2026-03-09T16:14:49.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr metadata", "who": "vm05.dygxfv", "id": "vm05.dygxfv"}]: dispatch 2026-03-09T16:14:49.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:49 vm05.local ceph-mon[58702]: pgmap v3: 65 pgs: 65 active+clean; 1.7 GiB data, 6.0 GiB used, 114 GiB / 120 GiB avail 2026-03-09T16:14:49.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:49 vm05.local ceph-mon[58702]: pgmap v4: 65 pgs: 65 active+clean; 1.7 GiB data, 6.0 GiB used, 114 GiB / 120 GiB avail 2026-03-09T16:14:49.536 INFO:tasks.workunit.client.0.vm03.stdout:2/791: rmdir db/d12/da5/dc2 39 2026-03-09T16:14:49.536 INFO:tasks.workunit.client.0.vm03.stdout:7/733: rename d4/da/d5d/db0/d61/dca/fcc to d4/da/d45/d51/d36/ff6 0 2026-03-09T16:14:49.536 INFO:tasks.workunit.client.0.vm03.stdout:4/795: dwrite d5/db/d25/d31/d4d/d5b/d7d/fbd [0,4194304] 0 2026-03-09T16:14:49.537 INFO:tasks.workunit.client.0.vm03.stdout:9/840: dwrite d2/d4/d11/d12/f3d [0,4194304] 0 2026-03-09T16:14:49.543 INFO:tasks.workunit.client.0.vm03.stdout:9/841: stat d2/d54/d7d/ce2 0 2026-03-09T16:14:49.543 INFO:tasks.workunit.client.0.vm03.stdout:2/792: fsync db/d12/da5/de4/f106 0 2026-03-09T16:14:49.544 INFO:tasks.workunit.client.0.vm03.stdout:7/734: truncate d4/da/f20 5516066 0 2026-03-09T16:14:49.548 INFO:tasks.workunit.client.0.vm03.stdout:5/844: creat d2/d7/d115/d24/d27/d43/f121 x:0 0 0 2026-03-09T16:14:49.577 INFO:tasks.workunit.client.0.vm03.stdout:4/796: truncate d5/dd/f22 2850858 0 2026-03-09T16:14:49.577 INFO:tasks.workunit.client.0.vm03.stdout:9/842: creat d2/d4/d11/d12/dc7/dcc/f103 x:0 0 0 2026-03-09T16:14:49.580 INFO:tasks.workunit.client.0.vm03.stdout:9/843: write d2/d4/d11/d29/d2a/d46/f9e [4441898,21696] 0 2026-03-09T16:14:49.596 INFO:tasks.workunit.client.0.vm03.stdout:2/793: truncate db/d12/da5/dbb/dc3/fe5 922659 0 2026-03-09T16:14:49.608 INFO:tasks.workunit.client.0.vm03.stdout:1/705: unlink d4/d6/d3b/d6b/da5/dc0/ld5 0 2026-03-09T16:14:49.608 INFO:tasks.workunit.client.0.vm03.stdout:3/764: write d5/d1e/f26 [3810336,103013] 0 2026-03-09T16:14:49.612 INFO:tasks.workunit.client.0.vm03.stdout:8/817: getdents da/d32 0 2026-03-09T16:14:49.614 INFO:tasks.workunit.client.0.vm03.stdout:6/742: dwrite d9/d42/d45/d50/d80/d8a/dc1/fe8 [0,4194304] 0 2026-03-09T16:14:49.620 INFO:tasks.workunit.client.0.vm03.stdout:9/844: sync 2026-03-09T16:14:49.624 INFO:tasks.workunit.client.0.vm03.stdout:8/818: dwrite da/d32/d79/d95/ffc [0,4194304] 0 2026-03-09T16:14:49.629 INFO:tasks.workunit.client.0.vm03.stdout:0/794: rename d0/d7/d3e/d57/d5a/f10d to d0/da/d1b/dc8/d104/f110 0 2026-03-09T16:14:49.634 INFO:tasks.workunit.client.0.vm03.stdout:9/845: 
read d2/d4/d11/d12/dc7/dee/dc2/fdd [2104077,7167] 0 2026-03-09T16:14:49.634 INFO:tasks.workunit.client.0.vm03.stdout:8/819: read da/d45/faa [844166,124268] 0 2026-03-09T16:14:49.644 INFO:tasks.workunit.client.0.vm03.stdout:8/820: truncate da/d10/d28/db1/dce/de8/ffd 1775408 0 2026-03-09T16:14:49.656 INFO:tasks.workunit.client.0.vm03.stdout:4/797: rename d5/d17/d44/f90 to d5/dd/dd5/fef 0 2026-03-09T16:14:49.657 INFO:tasks.workunit.client.0.vm03.stdout:8/821: symlink da/d32/dad/l10f 0 2026-03-09T16:14:49.660 INFO:tasks.workunit.client.0.vm03.stdout:0/795: mkdir d0/d7/d3e/d111 0 2026-03-09T16:14:49.665 INFO:tasks.workunit.client.0.vm03.stdout:9/846: mkdir d2/d4/d11/d29/d2a/d104 0 2026-03-09T16:14:49.667 INFO:tasks.workunit.client.0.vm03.stdout:9/847: fsync d2/d54/d7d/f102 0 2026-03-09T16:14:49.669 INFO:tasks.workunit.client.0.vm03.stdout:9/848: chown d2/d4/d11/lc3 1 1 2026-03-09T16:14:49.702 INFO:tasks.workunit.client.0.vm03.stdout:3/765: rename d5/d1e/d42/l71 to d5/d1e/d42/d34/dd2/le1 0 2026-03-09T16:14:49.706 INFO:tasks.workunit.client.0.vm03.stdout:4/798: symlink d5/db/d25/d31/dcc/lf0 0 2026-03-09T16:14:49.710 INFO:tasks.workunit.client.0.vm03.stdout:7/735: truncate d4/da/d5d/dd8/d22/d24/f41 1893365 0 2026-03-09T16:14:49.710 INFO:tasks.workunit.client.0.vm03.stdout:3/766: fsync d5/d1e/d42/d55/d86/dae/fde 0 2026-03-09T16:14:49.712 INFO:tasks.workunit.client.0.vm03.stdout:7/736: dread - d4/da/d45/fed zero size 2026-03-09T16:14:49.715 INFO:tasks.workunit.client.0.vm03.stdout:2/794: write db/d12/d2a/d99/de7/df9/d64/f80 [4754559,80011] 0 2026-03-09T16:14:49.720 INFO:tasks.workunit.client.0.vm03.stdout:7/737: write d4/da/d5d/db0/d9d/fc2 [247696,125095] 0 2026-03-09T16:14:49.724 INFO:tasks.workunit.client.0.vm03.stdout:7/738: dread - d4/da/d5d/dd8/d22/d24/d16/d2b/fe1 zero size 2026-03-09T16:14:49.725 INFO:tasks.workunit.client.0.vm03.stdout:5/845: dwrite d2/d7/d115/d16/fe5 [0,4194304] 0 2026-03-09T16:14:49.727 INFO:tasks.workunit.client.0.vm03.stdout:2/795: sync 2026-03-09T16:14:49.727 INFO:tasks.workunit.client.0.vm03.stdout:2/796: readlink db/d12/d2a/d61/dbe/ldb 0 2026-03-09T16:14:49.734 INFO:tasks.workunit.client.0.vm03.stdout:6/743: rename d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/lcc to d9/d42/d45/d50/d80/d8a/lec 0 2026-03-09T16:14:49.735 INFO:tasks.workunit.client.0.vm03.stdout:6/744: write d9/d42/d45/d65/f7f [2904633,2448] 0 2026-03-09T16:14:49.746 INFO:tasks.workunit.client.0.vm03.stdout:7/739: dread d4/da/f20 [0,4194304] 0 2026-03-09T16:14:49.765 INFO:tasks.workunit.client.0.vm03.stdout:3/767: read d5/d53/d6c/d79/f9d [93990,70404] 0 2026-03-09T16:14:49.779 INFO:tasks.workunit.client.0.vm03.stdout:3/768: dread d5/d1e/d42/d34/fad [0,4194304] 0 2026-03-09T16:14:49.779 INFO:tasks.workunit.client.0.vm03.stdout:3/769: chown d5/d1e/d42/d34 8 1 2026-03-09T16:14:49.781 INFO:tasks.workunit.client.0.vm03.stdout:9/849: write d2/d54/d7d/d8f/dad/def/f42 [3416371,54531] 0 2026-03-09T16:14:49.783 INFO:tasks.workunit.client.0.vm03.stdout:0/796: dwrite d0/da/d1b/d9b/f93 [0,4194304] 0 2026-03-09T16:14:49.800 INFO:tasks.workunit.client.0.vm03.stdout:6/745: rmdir d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/dad 39 2026-03-09T16:14:49.849 INFO:tasks.workunit.client.0.vm03.stdout:1/706: rename d4/c7 to d4/d39/d70/ce6 0 2026-03-09T16:14:49.873 INFO:tasks.workunit.client.0.vm03.stdout:6/746: write d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f37 [1511518,440] 0 2026-03-09T16:14:49.877 INFO:tasks.workunit.client.0.vm03.stdout:3/770: dwrite d5/d53/d6c/f9f [0,4194304] 0 2026-03-09T16:14:49.878 
INFO:tasks.workunit.client.0.vm03.stdout:3/771: write d5/d1e/f26 [3459332,60462] 0 2026-03-09T16:14:49.878 INFO:tasks.workunit.client.0.vm03.stdout:6/747: truncate d9/d42/d45/d50/d80/d8a/d9c/d97/f9d 1853674 0 2026-03-09T16:14:49.879 INFO:tasks.workunit.client.0.vm03.stdout:6/748: fdatasync d9/d14/f1d 0 2026-03-09T16:14:49.880 INFO:tasks.workunit.client.0.vm03.stdout:3/772: chown d5/d1e/c46 447757 1 2026-03-09T16:14:49.892 INFO:tasks.workunit.client.0.vm03.stdout:9/850: mknod d2/d4/d11/d29/d2a/db3/dbe/de0/c105 0 2026-03-09T16:14:49.893 INFO:tasks.workunit.client.0.vm03.stdout:9/851: fdatasync d2/d54/d7d/d8f/dad/def/f42 0 2026-03-09T16:14:49.903 INFO:tasks.workunit.client.0.vm03.stdout:9/852: dwrite d2/d4/d11/d29/d2a/d38/fca [0,4194304] 0 2026-03-09T16:14:49.920 INFO:tasks.workunit.client.0.vm03.stdout:5/846: link d2/d7/d115/d16/d5c/dfc/d106/d52/fc6 d2/d7/de/d11/f122 0 2026-03-09T16:14:49.932 INFO:tasks.workunit.client.0.vm03.stdout:8/822: rename da/d10/d28/l5b to da/d1d/l110 0 2026-03-09T16:14:49.936 INFO:tasks.workunit.client.0.vm03.stdout:8/823: truncate da/d32/d79/f103 539812 0 2026-03-09T16:14:49.942 INFO:tasks.workunit.client.0.vm03.stdout:7/740: rmdir d4/da/d5d/db0/d61/dca/ddd 0 2026-03-09T16:14:49.954 INFO:tasks.workunit.client.0.vm03.stdout:4/799: link d5/db/d25/dc8/dd2/dd1/ce0 d5/d17/cf1 0 2026-03-09T16:14:50.007 INFO:tasks.workunit.client.0.vm03.stdout:5/847: mknod d2/d7/d3c/d3d/c123 0 2026-03-09T16:14:50.010 INFO:tasks.workunit.client.0.vm03.stdout:5/848: fsync d2/d75/d119/f11a 0 2026-03-09T16:14:50.010 INFO:tasks.workunit.client.0.vm03.stdout:5/849: stat d2/d7/de/d54/c10e 0 2026-03-09T16:14:50.015 INFO:tasks.workunit.client.0.vm03.stdout:0/797: creat d0/d7/d3e/d57/d5a/d5f/db2/f112 x:0 0 0 2026-03-09T16:14:50.022 INFO:tasks.workunit.client.0.vm03.stdout:2/797: rename db/d12/d2a/d99/de7/df9/d64/dbd/df5/f102 to db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/df7/f113 0 2026-03-09T16:14:50.033 INFO:tasks.workunit.client.0.vm03.stdout:8/824: mknod da/d6c/dc4/c111 0 2026-03-09T16:14:50.040 INFO:tasks.workunit.client.0.vm03.stdout:1/707: link d4/d31/d5c/f9e d4/d6/da2/fe7 0 2026-03-09T16:14:50.057 INFO:tasks.workunit.client.0.vm03.stdout:0/798: dread d0/da/d5c/db6/ffc [0,4194304] 0 2026-03-09T16:14:50.057 INFO:tasks.workunit.client.0.vm03.stdout:0/799: readlink d0/d7/d75/le8 0 2026-03-09T16:14:50.064 INFO:tasks.workunit.client.0.vm03.stdout:7/741: write d4/d2d/f52 [4535856,1385] 0 2026-03-09T16:14:50.064 INFO:tasks.workunit.client.0.vm03.stdout:4/800: creat d5/dd/d1f/d5f/ff2 x:0 0 0 2026-03-09T16:14:50.065 INFO:tasks.workunit.client.0.vm03.stdout:7/742: chown d4/da/d5d/db0/d61/dbc/fdc 5327 1 2026-03-09T16:14:50.067 INFO:tasks.workunit.client.0.vm03.stdout:6/749: symlink d9/d42/d45/led 0 2026-03-09T16:14:50.068 INFO:tasks.workunit.client.0.vm03.stdout:3/773: creat d5/d53/d6c/d79/d91/dc9/fe2 x:0 0 0 2026-03-09T16:14:50.070 INFO:tasks.workunit.client.0.vm03.stdout:9/853: link d2/d4/d11/d29/d2a/d46/ff2 d2/d4/d11/d29/d2a/db3/dbe/f106 0 2026-03-09T16:14:50.077 INFO:tasks.workunit.client.0.vm03.stdout:5/850: creat d2/d7/de/d33/f124 x:0 0 0 2026-03-09T16:14:50.077 INFO:tasks.workunit.client.0.vm03.stdout:0/800: sync 2026-03-09T16:14:50.077 INFO:tasks.workunit.client.0.vm03.stdout:7/743: sync 2026-03-09T16:14:50.108 INFO:tasks.workunit.client.0.vm03.stdout:2/798: rmdir db/d12/d2a/d61/d79 39 2026-03-09T16:14:50.108 INFO:tasks.workunit.client.0.vm03.stdout:8/825: rename c4 to da/d10/d28/d4f/d85/d9c/c112 0 2026-03-09T16:14:50.109 INFO:tasks.workunit.client.0.vm03.stdout:8/826: truncate da/fba 
4697351 0 2026-03-09T16:14:50.124 INFO:tasks.workunit.client.0.vm03.stdout:6/750: creat d9/d42/d45/d50/d80/d8a/dc1/fee x:0 0 0 2026-03-09T16:14:50.129 INFO:tasks.workunit.client.0.vm03.stdout:7/744: chown d4/d2d/f8c 2094723 1 2026-03-09T16:14:50.133 INFO:tasks.workunit.client.0.vm03.stdout:5/851: stat d2/d7/d115/d16/d5c/dfc/d106/d3b/l11f 0 2026-03-09T16:14:50.134 INFO:tasks.workunit.client.0.vm03.stdout:1/708: dread d4/d6/d3b/d6b/f42 [0,4194304] 0 2026-03-09T16:14:50.135 INFO:tasks.workunit.client.0.vm03.stdout:0/801: creat d0/d7/d3e/d57/d5a/d5f/db2/dab/f113 x:0 0 0 2026-03-09T16:14:50.145 INFO:tasks.workunit.client.0.vm03.stdout:8/827: creat da/d32/d79/d95/f113 x:0 0 0 2026-03-09T16:14:50.146 INFO:tasks.workunit.client.0.vm03.stdout:8/828: chown da/d32/cc9 21 1 2026-03-09T16:14:50.148 INFO:tasks.workunit.client.0.vm03.stdout:6/751: mkdir d9/d8e/def 0 2026-03-09T16:14:50.149 INFO:tasks.workunit.client.0.vm03.stdout:6/752: chown d9/d42/d45/f4a 51 1 2026-03-09T16:14:50.164 INFO:tasks.workunit.client.0.vm03.stdout:0/802: dread d0/da/d5c/f31 [0,4194304] 0 2026-03-09T16:14:50.165 INFO:tasks.workunit.client.0.vm03.stdout:0/803: fsync d0/d7/d3e/d57/d5a/d5f/db2/dcf/f101 0 2026-03-09T16:14:50.166 INFO:tasks.workunit.client.0.vm03.stdout:1/709: mknod d4/d31/d5c/da8/ce8 0 2026-03-09T16:14:50.172 INFO:tasks.workunit.client.0.vm03.stdout:7/745: symlink d4/da/d5d/dd8/d22/d24/d16/d3e/db5/de5/df2/lf7 0 2026-03-09T16:14:50.172 INFO:tasks.workunit.client.0.vm03.stdout:0/804: sync 2026-03-09T16:14:50.173 INFO:tasks.workunit.client.0.vm03.stdout:7/746: read d4/f8 [296282,78741] 0 2026-03-09T16:14:50.175 INFO:tasks.workunit.client.0.vm03.stdout:6/753: mkdir d9/d42/d45/d65/dae/df0 0 2026-03-09T16:14:50.187 INFO:tasks.workunit.client.0.vm03.stdout:5/852: truncate d2/d7/d1a/d1c/f5e 1265696 0 2026-03-09T16:14:50.193 INFO:tasks.workunit.client.0.vm03.stdout:4/801: rename d5/db/d25/d9f/dc7/dee to d5/db/d25/d8b/da8/df3 0 2026-03-09T16:14:50.198 INFO:tasks.workunit.client.0.vm03.stdout:1/710: dwrite d4/d31/d5c/da8/da1/fbf [0,4194304] 0 2026-03-09T16:14:50.216 INFO:tasks.workunit.client.0.vm03.stdout:5/853: dread - d2/d7/d115/d24/ff3 zero size 2026-03-09T16:14:50.216 INFO:tasks.workunit.client.0.vm03.stdout:5/854: read - d2/d7/de9/f103 zero size 2026-03-09T16:14:50.217 INFO:tasks.workunit.client.0.vm03.stdout:5/855: fdatasync d2/d75/d119/f11a 0 2026-03-09T16:14:50.217 INFO:tasks.workunit.client.0.vm03.stdout:5/856: stat d2/d7/d115/d16/d5c/dfc/d106/d52/f117 0 2026-03-09T16:14:50.218 INFO:tasks.workunit.client.0.vm03.stdout:5/857: dread - d2/d7/de/da9/f11e zero size 2026-03-09T16:14:50.223 INFO:tasks.workunit.client.0.vm03.stdout:5/858: dread d2/d7/de/d33/f9e [0,4194304] 0 2026-03-09T16:14:50.226 INFO:tasks.workunit.client.0.vm03.stdout:3/774: rename d5/d53/l60 to d5/d53/d6c/d79/d91/le3 0 2026-03-09T16:14:50.233 INFO:tasks.workunit.client.0.vm03.stdout:8/829: creat da/f114 x:0 0 0 2026-03-09T16:14:50.239 INFO:tasks.workunit.client.0.vm03.stdout:0/805: dwrite d0/f60 [0,4194304] 0 2026-03-09T16:14:50.241 INFO:tasks.workunit.client.0.vm03.stdout:7/747: write d4/da/d5d/dd8/f44 [2915002,104196] 0 2026-03-09T16:14:50.243 INFO:tasks.workunit.client.0.vm03.stdout:0/806: write d0/da/d1b/d9b/f61 [3182128,28126] 0 2026-03-09T16:14:50.283 INFO:tasks.workunit.client.0.vm03.stdout:6/754: dwrite d9/d84/fa9 [0,4194304] 0 2026-03-09T16:14:50.285 INFO:tasks.workunit.client.0.vm03.stdout:3/775: dwrite d5/d1e/d42/d55/f57 [0,4194304] 0 2026-03-09T16:14:50.309 INFO:tasks.workunit.client.0.vm03.stdout:0/807: unlink d0/d7/d3e/d57/d5a/d47/f10a 0 
2026-03-09T16:14:50.313 INFO:tasks.workunit.client.0.vm03.stdout:5/859: getdents d2/d7/d115/d24/d27/d43/d4b/de6 0 2026-03-09T16:14:50.321 INFO:tasks.workunit.client.0.vm03.stdout:0/808: dread d0/d7/d3e/d57/d5a/d5f/f71 [0,4194304] 0 2026-03-09T16:14:50.329 INFO:tasks.workunit.client.0.vm03.stdout:8/830: truncate da/d1d/f4a 551888 0 2026-03-09T16:14:50.335 INFO:tasks.workunit.client.0.vm03.stdout:5/860: symlink d2/d7/de9/l125 0 2026-03-09T16:14:50.339 INFO:tasks.workunit.client.0.vm03.stdout:9/854: rename d2/d54/d7d/d8f/dad/def/f42 to d2/d4/d11/d29/d2a/f107 0 2026-03-09T16:14:50.342 INFO:tasks.workunit.client.0.vm03.stdout:6/755: truncate d9/d14/d71/fac 1009048 0 2026-03-09T16:14:50.345 INFO:tasks.workunit.client.0.vm03.stdout:4/802: getdents d5/d17/da0 0 2026-03-09T16:14:50.351 INFO:tasks.workunit.client.0.vm03.stdout:8/831: creat da/d10/d28/d4f/d85/d9c/d10e/f115 x:0 0 0 2026-03-09T16:14:50.368 INFO:tasks.workunit.client.0.vm03.stdout:5/861: write d2/d7/d115/d16/d5c/fb5 [4117527,55391] 0 2026-03-09T16:14:50.373 INFO:tasks.workunit.client.0.vm03.stdout:9/855: symlink d2/d4/d11/d12/dc7/dee/dbf/l108 0 2026-03-09T16:14:50.374 INFO:tasks.workunit.client.0.vm03.stdout:9/856: write d2/d4/d11/d12/dc7/dcc/f103 [767522,97000] 0 2026-03-09T16:14:50.377 INFO:tasks.workunit.client.0.vm03.stdout:4/803: mkdir d5/db/d25/d31/d33/d79/df4 0 2026-03-09T16:14:50.383 INFO:tasks.workunit.client.0.vm03.stdout:7/748: getdents d4/da/dbf 0 2026-03-09T16:14:50.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:50 vm03.local ceph-mon[51019]: mgrmap e28: vm03.gbgzmu(active, since 2s), standbys: vm05.dygxfv 2026-03-09T16:14:50.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:50 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:50.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:50 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:50.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:50 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:50.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:50 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:50.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:50 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:14:50.395 INFO:tasks.workunit.client.0.vm03.stdout:9/857: mkdir d2/d4/d11/d12/d28/d109 0 2026-03-09T16:14:50.399 INFO:tasks.workunit.client.0.vm03.stdout:6/756: creat d9/d42/d45/d50/d80/d90/db7/ff1 x:0 0 0 2026-03-09T16:14:50.406 INFO:tasks.workunit.client.0.vm03.stdout:4/804: write d5/db/d25/d31/d4d/d5b/d7d/fc3 [22046,38800] 0 2026-03-09T16:14:50.407 INFO:tasks.workunit.client.0.vm03.stdout:3/776: getdents d5/d6d/db9 0 2026-03-09T16:14:50.417 INFO:tasks.workunit.client.0.vm03.stdout:3/777: dread d5/d1e/f72 [0,4194304] 0 2026-03-09T16:14:50.418 INFO:tasks.workunit.client.0.vm03.stdout:8/832: getdents da/d6c/d7a/de4 0 2026-03-09T16:14:50.436 INFO:tasks.workunit.client.0.vm03.stdout:2/799: rename db/d12/d2a/d99/de7/df9/d52/l89 to db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/l114 0 2026-03-09T16:14:50.436 INFO:tasks.workunit.client.0.vm03.stdout:2/800: stat db/d12/d2a/f60 0 2026-03-09T16:14:50.444 
INFO:tasks.workunit.client.0.vm03.stdout:5/862: truncate d2/d7/d3c/d3d/f93 1438731 0 2026-03-09T16:14:50.447 INFO:tasks.workunit.client.0.vm03.stdout:9/858: creat d2/d4/d11/d12/dc7/dee/dbf/f10a x:0 0 0 2026-03-09T16:14:50.451 INFO:tasks.workunit.client.0.vm03.stdout:4/805: creat d5/db/d25/d8b/da8/df3/ff5 x:0 0 0 2026-03-09T16:14:50.453 INFO:tasks.workunit.client.0.vm03.stdout:8/833: unlink da/d10/d28/d4f/d68/fc1 0 2026-03-09T16:14:50.461 INFO:tasks.workunit.client.0.vm03.stdout:8/834: dwrite da/d32/d79/d95/ffc [0,4194304] 0 2026-03-09T16:14:50.471 INFO:tasks.workunit.client.0.vm03.stdout:7/749: symlink d4/da/d5d/dd8/lf8 0 2026-03-09T16:14:50.471 INFO:tasks.workunit.client.0.vm03.stdout:2/801: unlink db/d12/d2a/d61/d6d/l7d 0 2026-03-09T16:14:50.471 INFO:tasks.workunit.client.0.vm03.stdout:9/859: mkdir d2/d4/d11/d29/d2a/d38/dcd/d10b 0 2026-03-09T16:14:50.472 INFO:tasks.workunit.client.0.vm03.stdout:7/750: write d4/da/d5d/dd8/d22/f48 [3340175,41824] 0 2026-03-09T16:14:50.472 INFO:tasks.workunit.client.0.vm03.stdout:9/860: fsync d2/d4/d11/d12/dc7/dee/fd1 0 2026-03-09T16:14:50.473 INFO:tasks.workunit.client.0.vm03.stdout:9/861: stat d2/d4/d11/d29/d2a/d46/ca2 0 2026-03-09T16:14:50.474 INFO:tasks.workunit.client.0.vm03.stdout:9/862: dread - d2/d4/d11/d12/dc7/dee/dc2/de9/ffb zero size 2026-03-09T16:14:50.475 INFO:tasks.workunit.client.0.vm03.stdout:6/757: unlink d9/d42/d45/f57 0 2026-03-09T16:14:50.476 INFO:tasks.workunit.client.0.vm03.stdout:4/806: creat d5/db/d25/d31/d4d/d5b/d9a/ff6 x:0 0 0 2026-03-09T16:14:50.476 INFO:tasks.workunit.client.0.vm03.stdout:3/778: creat d5/d1e/d42/d55/d86/dbe/fe4 x:0 0 0 2026-03-09T16:14:50.477 INFO:tasks.workunit.client.0.vm03.stdout:4/807: dread - d5/db/d25/d8b/da8/df3/ff5 zero size 2026-03-09T16:14:50.489 INFO:tasks.workunit.client.0.vm03.stdout:3/779: mknod d5/d6d/d6a/dbd/ce5 0 2026-03-09T16:14:50.496 INFO:tasks.workunit.client.0.vm03.stdout:1/711: rename d4/d6/d1d/d20/d93/cb6 to d4/d6/d3b/ce9 0 2026-03-09T16:14:50.503 INFO:tasks.workunit.client.0.vm03.stdout:9/863: dread d2/d4/d11/d29/d2a/d4d/fab [0,4194304] 0 2026-03-09T16:14:50.505 INFO:tasks.workunit.client.0.vm03.stdout:6/758: creat d9/d42/d45/d65/dbf/dc9/de0/ff2 x:0 0 0 2026-03-09T16:14:50.506 INFO:tasks.workunit.client.0.vm03.stdout:8/835: sync 2026-03-09T16:14:50.521 INFO:tasks.workunit.client.0.vm03.stdout:0/809: rename d0/da/d1b/de0/cf1 to d0/da/d7a/c114 0 2026-03-09T16:14:50.525 INFO:tasks.workunit.client.0.vm03.stdout:1/712: mkdir d4/d6/da2/dea 0 2026-03-09T16:14:50.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:50 vm05.local ceph-mon[58702]: mgrmap e28: vm03.gbgzmu(active, since 2s), standbys: vm05.dygxfv 2026-03-09T16:14:50.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:50 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:50.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:50 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:50.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:50 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:50.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:50 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:50.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:50 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' 
entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:14:50.527 INFO:tasks.workunit.client.0.vm03.stdout:1/713: chown d4/fa 6221 1 2026-03-09T16:14:50.531 INFO:tasks.workunit.client.0.vm03.stdout:7/751: link d4/da/d5d/dd8/d22/d24/d16/lb4 d4/da/d5d/dd8/d22/d24/d15/d71/db7/lf9 0 2026-03-09T16:14:50.533 INFO:tasks.workunit.client.0.vm03.stdout:7/752: chown d4/da/d5d/dd8/d22/d24/d16/d2b 21379 1 2026-03-09T16:14:50.535 INFO:tasks.workunit.client.0.vm03.stdout:5/863: write d2/d75/fe3 [395410,116815] 0 2026-03-09T16:14:50.536 INFO:tasks.workunit.client.0.vm03.stdout:5/864: chown d2/d7/de/d11/d19/d31/d35/d87 11412 1 2026-03-09T16:14:50.545 INFO:tasks.workunit.client.0.vm03.stdout:9/864: fsync d2/d54/fba 0 2026-03-09T16:14:50.547 INFO:tasks.workunit.client.0.vm03.stdout:9/865: dread - d2/d4/d11/d12/dc7/dcc/fd0 zero size 2026-03-09T16:14:50.547 INFO:tasks.workunit.client.0.vm03.stdout:9/866: readlink d2/d4/d11/d29/d2a/d46/ld4 0 2026-03-09T16:14:50.554 INFO:tasks.workunit.client.0.vm03.stdout:8/836: mknod da/d32/dad/c116 0 2026-03-09T16:14:50.554 INFO:tasks.workunit.client.0.vm03.stdout:3/780: unlink d5/d1e/d42/f2c 0 2026-03-09T16:14:50.555 INFO:tasks.workunit.client.0.vm03.stdout:4/808: rename d5/db/d25/d31 to d5/db/d25/d8b/da8/df3/df7 0 2026-03-09T16:14:50.555 INFO:tasks.workunit.client.0.vm03.stdout:2/802: getdents db/d12 0 2026-03-09T16:14:50.555 INFO:tasks.workunit.client.0.vm03.stdout:8/837: chown da/d10/fa4 512359435 1 2026-03-09T16:14:50.556 INFO:tasks.workunit.client.0.vm03.stdout:4/809: dread - d5/dd/d1f/d5f/ff2 zero size 2026-03-09T16:14:50.556 INFO:tasks.workunit.client.0.vm03.stdout:3/781: chown d5/f2b 157501353 1 2026-03-09T16:14:50.557 INFO:tasks.workunit.client.0.vm03.stdout:4/810: write d5/dd/f1e [3759924,128016] 0 2026-03-09T16:14:50.562 INFO:tasks.workunit.client.0.vm03.stdout:0/810: symlink d0/da/d5c/db6/l115 0 2026-03-09T16:14:50.566 INFO:tasks.workunit.client.0.vm03.stdout:9/867: mkdir d2/d54/d7d/d10c 0 2026-03-09T16:14:50.569 INFO:tasks.workunit.client.0.vm03.stdout:7/753: rename d4/da/d5d/db0/l17 to d4/d2d/d4b/lfa 0 2026-03-09T16:14:50.570 INFO:tasks.workunit.client.0.vm03.stdout:2/803: mknod db/d12/d2a/d99/de7/df9/d64/dbd/dec/c115 0 2026-03-09T16:14:50.574 INFO:tasks.workunit.client.0.vm03.stdout:1/714: mkdir d4/d39/deb 0 2026-03-09T16:14:50.575 INFO:tasks.workunit.client.0.vm03.stdout:6/759: creat d9/d42/d45/ff3 x:0 0 0 2026-03-09T16:14:50.577 INFO:tasks.workunit.client.0.vm03.stdout:1/715: read d4/d6/d3b/d6b/d25/f84 [1937093,62423] 0 2026-03-09T16:14:50.579 INFO:tasks.workunit.client.0.vm03.stdout:0/811: symlink d0/d7/d3e/d57/d5a/d52/d9f/l116 0 2026-03-09T16:14:50.580 INFO:tasks.workunit.client.0.vm03.stdout:9/868: creat d2/d4/d11/d29/d2a/d38/db6/f10d x:0 0 0 2026-03-09T16:14:50.584 INFO:tasks.workunit.client.0.vm03.stdout:6/760: dwrite d9/d84/f91 [0,4194304] 0 2026-03-09T16:14:50.584 INFO:tasks.workunit.client.0.vm03.stdout:0/812: truncate d0/d7/d3e/d57/d5a/d82/d89/dbd/f9e 136465 0 2026-03-09T16:14:50.585 INFO:tasks.workunit.client.0.vm03.stdout:7/754: chown d4/f26 108 1 2026-03-09T16:14:50.586 INFO:tasks.workunit.client.0.vm03.stdout:7/755: fdatasync d4/d2d/d4b/f4c 0 2026-03-09T16:14:50.587 INFO:tasks.workunit.client.0.vm03.stdout:3/782: symlink d5/d1e/d42/d4c/da1/le6 0 2026-03-09T16:14:50.595 INFO:tasks.workunit.client.0.vm03.stdout:3/783: chown d5/d2e/l52 41515200 1 2026-03-09T16:14:50.601 INFO:tasks.workunit.client.0.vm03.stdout:4/811: truncate d5/db/f34 4079821 0 2026-03-09T16:14:50.609 
INFO:tasks.workunit.client.0.vm03.stdout:1/716: mkdir d4/d6/d3b/d6b/da5/dc0/dec 0 2026-03-09T16:14:50.616 INFO:tasks.workunit.client.0.vm03.stdout:8/838: truncate da/d10/f1f 68441 0 2026-03-09T16:14:50.616 INFO:tasks.workunit.client.0.vm03.stdout:9/869: dread d2/d4/d11/d12/dc7/dee/fcb [0,4194304] 0 2026-03-09T16:14:50.616 INFO:tasks.workunit.client.0.vm03.stdout:0/813: mknod d0/d7/d3e/d57/d5a/d5f/db2/dab/c117 0 2026-03-09T16:14:50.618 INFO:tasks.workunit.client.0.vm03.stdout:5/865: dwrite d2/d7/d3c/d3d/f93 [0,4194304] 0 2026-03-09T16:14:50.636 INFO:tasks.workunit.client.0.vm03.stdout:7/756: fsync d4/da/d5d/dd8/d22/d24/d16/d6e/f73 0 2026-03-09T16:14:50.639 INFO:tasks.workunit.client.0.vm03.stdout:9/870: dread d2/d4/d11/d12/dc7/dee/fd1 [0,4194304] 0 2026-03-09T16:14:50.647 INFO:tasks.workunit.client.0.vm03.stdout:3/784: mknod d5/d53/d6c/d79/d91/ce7 0 2026-03-09T16:14:50.655 INFO:tasks.workunit.client.0.vm03.stdout:4/812: write d5/f9 [605785,33694] 0 2026-03-09T16:14:50.655 INFO:tasks.workunit.client.0.vm03.stdout:6/761: mknod d9/d14/cf4 0 2026-03-09T16:14:50.656 INFO:tasks.workunit.client.0.vm03.stdout:1/717: fdatasync d4/d6/d3b/d6b/d25/f84 0 2026-03-09T16:14:50.660 INFO:tasks.workunit.client.0.vm03.stdout:2/804: dwrite db/d12/d2a/f5f [0,4194304] 0 2026-03-09T16:14:50.662 INFO:tasks.workunit.client.0.vm03.stdout:8/839: rename da/d10/fa4 to da/d10/d28/d4f/d85/d9c/d10e/f117 0 2026-03-09T16:14:50.681 INFO:tasks.workunit.client.0.vm03.stdout:1/718: sync 2026-03-09T16:14:50.685 INFO:tasks.workunit.client.0.vm03.stdout:0/814: symlink d0/d7/d3e/d111/l118 0 2026-03-09T16:14:50.697 INFO:tasks.workunit.client.0.vm03.stdout:2/805: dread - db/ff0 zero size 2026-03-09T16:14:50.699 INFO:tasks.workunit.client.0.vm03.stdout:8/840: unlink da/d32/ff5 0 2026-03-09T16:14:50.706 INFO:tasks.workunit.client.0.vm03.stdout:5/866: write d2/d7/d115/d16/d5c/dfc/d106/d3b/f88 [578700,13412] 0 2026-03-09T16:14:50.711 INFO:tasks.workunit.client.0.vm03.stdout:0/815: creat d0/d7/d3e/d111/f119 x:0 0 0 2026-03-09T16:14:50.712 INFO:tasks.workunit.client.0.vm03.stdout:7/757: write d4/da/d5d/db0/d61/f80 [911032,122122] 0 2026-03-09T16:14:50.712 INFO:tasks.workunit.client.0.vm03.stdout:0/816: chown d0/d7/d3e/d57/d5a/d82/d89/dc0 20 1 2026-03-09T16:14:50.714 INFO:tasks.workunit.client.0.vm03.stdout:0/817: dread d0/d7/d3e/d57/d5a/d82/d89/dbd/f9e [0,4194304] 0 2026-03-09T16:14:50.717 INFO:tasks.workunit.client.0.vm03.stdout:3/785: rename d5/d1e/d42/d34/cac to d5/d53/d6c/d79/ce8 0 2026-03-09T16:14:50.718 INFO:tasks.workunit.client.0.vm03.stdout:9/871: write d2/de/d88/d7a/fc8 [925066,29926] 0 2026-03-09T16:14:50.722 INFO:tasks.workunit.client.0.vm03.stdout:2/806: read fa [775164,79653] 0 2026-03-09T16:14:50.722 INFO:tasks.workunit.client.0.vm03.stdout:7/758: dread d4/da/d5d/db0/d9d/fac [0,4194304] 0 2026-03-09T16:14:50.723 INFO:tasks.workunit.client.0.vm03.stdout:7/759: dread - d4/da/d5d/dd8/d22/d24/d16/d6e/d7e/fe7 zero size 2026-03-09T16:14:50.726 INFO:tasks.workunit.client.0.vm03.stdout:7/760: dwrite d4/da/d45/d51/f91 [0,4194304] 0 2026-03-09T16:14:50.739 INFO:tasks.workunit.client.0.vm03.stdout:5/867: symlink d2/d7/de/d11/d19/d29/d90/db6/l126 0 2026-03-09T16:14:50.747 INFO:tasks.workunit.client.0.vm03.stdout:1/719: write d4/d6/d3b/d63/f89 [644491,40270] 0 2026-03-09T16:14:50.747 INFO:tasks.workunit.client.0.vm03.stdout:1/720: chown d4/d6/d1d/d20/d23/f9f 1427748435 1 2026-03-09T16:14:50.748 INFO:tasks.workunit.client.0.vm03.stdout:1/721: chown d4/d31/d5c/da8/da1 1 1 2026-03-09T16:14:50.753 
INFO:tasks.workunit.client.0.vm03.stdout:0/818: mkdir d0/d7/d3e/d57/d5a/d82/dd2/d11a 0 2026-03-09T16:14:50.754 INFO:tasks.workunit.client.0.vm03.stdout:0/819: write d0/da/d7a/d98/f9d [6037833,16649] 0 2026-03-09T16:14:50.756 INFO:tasks.workunit.client.0.vm03.stdout:0/820: write d0/d7/d3e/d57/d5a/d47/f10c [92269,50060] 0 2026-03-09T16:14:50.767 INFO:tasks.workunit.client.0.vm03.stdout:9/872: unlink d2/d4/d11/d12/dc7/dee/dc2/feb 0 2026-03-09T16:14:50.775 INFO:tasks.workunit.client.0.vm03.stdout:8/841: symlink da/d10/d28/d4f/d85/d9c/d10e/d105/l118 0 2026-03-09T16:14:50.777 INFO:tasks.workunit.client.0.vm03.stdout:4/813: mkdir d5/db/d25/d8b/da8/df8 0 2026-03-09T16:14:50.779 INFO:tasks.workunit.client.0.vm03.stdout:4/814: read d5/f54 [11730951,58301] 0 2026-03-09T16:14:50.783 INFO:tasks.workunit.client.0.vm03.stdout:7/761: mkdir d4/da/d5d/db0/da9/dfb 0 2026-03-09T16:14:50.784 INFO:tasks.workunit.client.0.vm03.stdout:7/762: chown d4/da/d5d/dd8/d22/d24/d15/le0 48391 1 2026-03-09T16:14:50.786 INFO:tasks.workunit.client.0.vm03.stdout:6/762: getdents d9/d42/d45/d50/d80/d8a/d9c/d97 0 2026-03-09T16:14:50.786 INFO:tasks.workunit.client.0.vm03.stdout:6/763: dread - d9/d8e/fd3 zero size 2026-03-09T16:14:50.786 INFO:tasks.workunit.client.0.vm03.stdout:6/764: chown d9 594 1 2026-03-09T16:14:50.790 INFO:tasks.workunit.client.0.vm03.stdout:1/722: rmdir d4/d6/d1d/d69 39 2026-03-09T16:14:50.796 INFO:tasks.workunit.client.0.vm03.stdout:3/786: unlink d5/d44/f56 0 2026-03-09T16:14:50.797 INFO:tasks.workunit.client.0.vm03.stdout:3/787: truncate d5/d1e/d42/f25 4254746 0 2026-03-09T16:14:50.804 INFO:tasks.workunit.client.0.vm03.stdout:0/821: dwrite d0/d7/f3d [4194304,4194304] 0 2026-03-09T16:14:50.817 INFO:tasks.workunit.client.0.vm03.stdout:8/842: rename da/d10/d28/d64/cc0 to da/d10/d28/d4f/daf/dee/c119 0 2026-03-09T16:14:50.821 INFO:tasks.workunit.client.0.vm03.stdout:4/815: truncate f1 4096018 0 2026-03-09T16:14:50.825 INFO:tasks.workunit.client.0.vm03.stdout:8/843: dread da/d6c/d7a/ff9 [0,4194304] 0 2026-03-09T16:14:50.825 INFO:tasks.workunit.client.0.vm03.stdout:4/816: read d5/db/d25/d8b/da8/df3/df7/fa1 [876831,40767] 0 2026-03-09T16:14:50.842 INFO:tasks.workunit.client.0.vm03.stdout:8/844: dread da/d10/f33 [0,4194304] 0 2026-03-09T16:14:50.848 INFO:tasks.workunit.client.0.vm03.stdout:2/807: write db/d12/d2a/d61/f10c [1155724,62725] 0 2026-03-09T16:14:50.853 INFO:tasks.workunit.client.0.vm03.stdout:2/808: dread db/d12/d2a/f8d [0,4194304] 0 2026-03-09T16:14:50.866 INFO:tasks.workunit.client.0.vm03.stdout:0/822: dread d0/d7/d3e/d57/d5a/d52/d9f/fe3 [0,4194304] 0 2026-03-09T16:14:50.870 INFO:tasks.workunit.client.0.vm03.stdout:9/873: mkdir d2/d54/d7d/dd3/d10e 0 2026-03-09T16:14:50.873 INFO:tasks.workunit.client.0.vm03.stdout:7/763: write d4/da/d5d/dd8/d22/d24/fa3 [755480,48525] 0 2026-03-09T16:14:50.874 INFO:tasks.workunit.client.0.vm03.stdout:8/845: dread da/d10/d28/d4f/d85/d9c/d10e/f117 [0,4194304] 0 2026-03-09T16:14:50.874 INFO:tasks.workunit.client.0.vm03.stdout:3/788: write d5/fb3 [3468679,77011] 0 2026-03-09T16:14:50.881 INFO:tasks.workunit.client.0.vm03.stdout:2/809: symlink db/d12/l116 0 2026-03-09T16:14:50.881 INFO:tasks.workunit.client.0.vm03.stdout:2/810: readlink db/laa 0 2026-03-09T16:14:50.882 INFO:tasks.workunit.client.0.vm03.stdout:8/846: chown da/def/lfe 1683769662 1 2026-03-09T16:14:50.885 INFO:tasks.workunit.client.0.vm03.stdout:5/868: getdents d2/d7/de/d11/d19 0 2026-03-09T16:14:50.885 INFO:tasks.workunit.client.0.vm03.stdout:6/765: creat d9/d14/ff5 x:0 0 0 2026-03-09T16:14:50.890 
INFO:tasks.workunit.client.0.vm03.stdout:9/874: sync 2026-03-09T16:14:50.895 INFO:tasks.workunit.client.0.vm03.stdout:0/823: dwrite d0/da/d7a/d98/f9d [0,4194304] 0 2026-03-09T16:14:50.905 INFO:tasks.workunit.client.0.vm03.stdout:1/723: dwrite d4/fd [0,4194304] 0 2026-03-09T16:14:50.905 INFO:tasks.workunit.client.0.vm03.stdout:4/817: creat d5/ff9 x:0 0 0 2026-03-09T16:14:50.908 INFO:tasks.workunit.client.0.vm03.stdout:6/766: symlink d9/d42/d45/d65/dae/lf6 0 2026-03-09T16:14:50.912 INFO:tasks.workunit.client.0.vm03.stdout:5/869: creat d2/d7/d115/d24/d27/d43/d4b/f127 x:0 0 0 2026-03-09T16:14:50.916 INFO:tasks.workunit.client.0.vm03.stdout:3/789: link d5/fb3 d5/d1e/d42/d55/d86/fe9 0 2026-03-09T16:14:50.916 INFO:tasks.workunit.client.0.vm03.stdout:5/870: stat d2/d7/de/d11/d19/d31/d35/fd3 0 2026-03-09T16:14:50.916 INFO:tasks.workunit.client.0.vm03.stdout:5/871: stat d2/d75/d119 0 2026-03-09T16:14:50.926 INFO:tasks.workunit.client.0.vm03.stdout:9/875: dread d2/d54/d7d/d8f/dad/def/d84/d8a/fb5 [0,4194304] 0 2026-03-09T16:14:50.927 INFO:tasks.workunit.client.0.vm03.stdout:5/872: dwrite d2/d7/de/d11/d19/f109 [0,4194304] 0 2026-03-09T16:14:50.933 INFO:tasks.workunit.client.0.vm03.stdout:6/767: dread d9/d42/d45/d50/d80/d90/f64 [0,4194304] 0 2026-03-09T16:14:50.948 INFO:tasks.workunit.client.0.vm03.stdout:0/824: fdatasync d0/d7/d3e/d57/d5a/d82/d89/dbd/f9e 0 2026-03-09T16:14:50.953 INFO:tasks.workunit.client.0.vm03.stdout:1/724: creat d4/d6/d3b/d6b/da5/dc0/fed x:0 0 0 2026-03-09T16:14:50.960 INFO:tasks.workunit.client.0.vm03.stdout:4/818: creat d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/d77/ffa x:0 0 0 2026-03-09T16:14:50.970 INFO:tasks.workunit.client.0.vm03.stdout:3/790: rename d5/d1e/d42/d55/f57 to d5/d53/d6c/fea 0 2026-03-09T16:14:50.981 INFO:tasks.workunit.client.0.vm03.stdout:8/847: dwrite da/d10/d28/d64/fed [0,4194304] 0 2026-03-09T16:14:50.991 INFO:tasks.workunit.client.0.vm03.stdout:7/764: link d4/da/d5d/db0/l7c d4/da/d5d/db0/da9/db8/lfc 0 2026-03-09T16:14:50.997 INFO:tasks.workunit.client.0.vm03.stdout:5/873: unlink d2/d7/d1a/d1c/l85 0 2026-03-09T16:14:51.006 INFO:tasks.workunit.client.0.vm03.stdout:0/825: truncate d0/d7/d3e/d57/d5a/d5f/db2/f76 1796279 0 2026-03-09T16:14:51.006 INFO:tasks.workunit.client.0.vm03.stdout:0/826: chown d0/da 31200258 1 2026-03-09T16:14:51.006 INFO:tasks.workunit.client.0.vm03.stdout:0/827: chown d0/d7/d3e/d57/d5a/fc1 102014 1 2026-03-09T16:14:51.033 INFO:tasks.workunit.client.0.vm03.stdout:9/876: unlink d2/d4/d11/d12/l2e 0 2026-03-09T16:14:51.034 INFO:tasks.workunit.client.0.vm03.stdout:4/819: dwrite d5/db/d25/d8b/da8/df3/df7/d33/fb6 [0,4194304] 0 2026-03-09T16:14:51.042 INFO:tasks.workunit.client.0.vm03.stdout:8/848: dwrite da/d32/f61 [0,4194304] 0 2026-03-09T16:14:51.049 INFO:tasks.workunit.client.0.vm03.stdout:5/874: fsync d2/d7/de/d11/d19/d31/fcb 0 2026-03-09T16:14:51.049 INFO:tasks.workunit.client.0.vm03.stdout:5/875: readlink d2/l66 0 2026-03-09T16:14:51.053 INFO:tasks.workunit.client.0.vm03.stdout:0/828: symlink d0/da/d7a/l11b 0 2026-03-09T16:14:51.053 INFO:tasks.workunit.client.0.vm03.stdout:0/829: dread - d0/d7/d3e/d57/d5a/d5f/db2/f106 zero size 2026-03-09T16:14:51.054 INFO:tasks.workunit.client.0.vm03.stdout:6/768: mknod d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/cf7 0 2026-03-09T16:14:51.059 INFO:tasks.workunit.client.0.vm03.stdout:1/725: symlink d4/d6/d1d/db5/lee 0 2026-03-09T16:14:51.059 INFO:tasks.workunit.client.0.vm03.stdout:1/726: write d4/d31/d5c/da8/da1/fd1 [604521,54462] 0 2026-03-09T16:14:51.062 INFO:tasks.workunit.client.0.vm03.stdout:7/765: getdents 
d4/da/dbf/deb 0 2026-03-09T16:14:51.063 INFO:tasks.workunit.client.0.vm03.stdout:8/849: creat da/d10/d28/d64/f11a x:0 0 0 2026-03-09T16:14:51.065 INFO:tasks.workunit.client.0.vm03.stdout:5/876: creat d2/d7/de/d54/dce/f128 x:0 0 0 2026-03-09T16:14:51.067 INFO:tasks.workunit.client.0.vm03.stdout:1/727: dread d4/d6/d1d/d20/d93/f8c [0,4194304] 0 2026-03-09T16:14:51.069 INFO:tasks.workunit.client.0.vm03.stdout:6/769: fdatasync d9/d14/f31 0 2026-03-09T16:14:51.069 INFO:tasks.workunit.client.0.vm03.stdout:0/830: chown d0/da/d1b/c80 61 1 2026-03-09T16:14:51.072 INFO:tasks.workunit.client.0.vm03.stdout:0/831: dread - d0/d7/d3e/d57/d5a/d5f/db2/d8e/dba/ff2 zero size 2026-03-09T16:14:51.081 INFO:tasks.workunit.client.0.vm03.stdout:2/811: symlink db/d12/d2a/d61/d79/l117 0 2026-03-09T16:14:51.081 INFO:tasks.workunit.client.0.vm03.stdout:7/766: creat d4/da/d5d/dd8/d22/d24/d16/d3e/db5/de5/df2/ffd x:0 0 0 2026-03-09T16:14:51.081 INFO:tasks.workunit.client.0.vm03.stdout:8/850: symlink da/d10/d28/db1/l11b 0 2026-03-09T16:14:51.081 INFO:tasks.workunit.client.0.vm03.stdout:1/728: chown d4/d39/cce 34586 1 2026-03-09T16:14:51.083 INFO:tasks.workunit.client.0.vm03.stdout:7/767: write d4/da/d5d/dd8/d22/d24/d16/d2b/fe1 [715895,71563] 0 2026-03-09T16:14:51.091 INFO:tasks.workunit.client.0.vm03.stdout:0/832: creat d0/d7/d3e/d57/d5a/d5f/db2/dab/f11c x:0 0 0 2026-03-09T16:14:51.094 INFO:tasks.workunit.client.0.vm03.stdout:9/877: creat d2/f10f x:0 0 0 2026-03-09T16:14:51.094 INFO:tasks.workunit.client.0.vm03.stdout:0/833: dread - d0/d7/fbb zero size 2026-03-09T16:14:51.096 INFO:tasks.workunit.client.0.vm03.stdout:7/768: dwrite d4/da/d5d/dd8/d22/d24/d16/d6e/d7e/fe7 [0,4194304] 0 2026-03-09T16:14:51.096 INFO:tasks.workunit.client.0.vm03.stdout:7/769: truncate d4/d2d/fd5 4399840 0 2026-03-09T16:14:51.097 INFO:tasks.workunit.client.0.vm03.stdout:7/770: write d4/d2d/fd5 [3660568,46086] 0 2026-03-09T16:14:51.101 INFO:tasks.workunit.client.0.vm03.stdout:7/771: write d4/d2d/f52 [995873,127353] 0 2026-03-09T16:14:51.110 INFO:tasks.workunit.client.0.vm03.stdout:4/820: dread d5/db/d25/d8b/da8/df3/df7/d4d/f85 [0,4194304] 0 2026-03-09T16:14:51.126 INFO:tasks.workunit.client.0.vm03.stdout:1/729: mkdir d4/d6/d1d/d20/d5f/def 0 2026-03-09T16:14:51.130 INFO:tasks.workunit.client.0.vm03.stdout:3/791: getdents d5/d6d 0 2026-03-09T16:14:51.137 INFO:tasks.workunit.client.0.vm03.stdout:6/770: write d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f3f [590043,41412] 0 2026-03-09T16:14:51.140 INFO:tasks.workunit.client.0.vm03.stdout:4/821: sync 2026-03-09T16:14:51.149 INFO:tasks.workunit.client.0.vm03.stdout:4/822: readlink d5/dd/d1f/l27 0 2026-03-09T16:14:51.150 INFO:tasks.workunit.client.0.vm03.stdout:5/877: dwrite d2/d7/d115/d16/d5c/f94 [0,4194304] 0 2026-03-09T16:14:51.154 INFO:tasks.workunit.client.0.vm03.stdout:9/878: write d2/d4/d1f/f51 [792195,107492] 0 2026-03-09T16:14:51.156 INFO:tasks.workunit.client.0.vm03.stdout:3/792: dwrite d5/d1e/f9b [0,4194304] 0 2026-03-09T16:14:51.162 INFO:tasks.workunit.client.0.vm03.stdout:3/793: chown d5/d53/d6c/cb5 13972 1 2026-03-09T16:14:51.163 INFO:tasks.workunit.client.0.vm03.stdout:0/834: dread d0/da/fbe [0,4194304] 0 2026-03-09T16:14:51.180 INFO:tasks.workunit.client.0.vm03.stdout:7/772: write d4/da/d5d/dd8/d22/f33 [2393378,95772] 0 2026-03-09T16:14:51.180 INFO:tasks.workunit.client.0.vm03.stdout:2/812: write db/d12/fe8 [1442650,37991] 0 2026-03-09T16:14:51.190 INFO:tasks.workunit.client.0.vm03.stdout:8/851: dwrite da/d10/d28/d4f/d68/fa9 [0,4194304] 0 2026-03-09T16:14:51.197 
INFO:tasks.workunit.client.0.vm03.stdout:8/852: stat f8 0 2026-03-09T16:14:51.197 INFO:tasks.workunit.client.0.vm03.stdout:5/878: creat d2/d7/de/d33/f129 x:0 0 0 2026-03-09T16:14:51.197 INFO:tasks.workunit.client.0.vm03.stdout:4/823: rename d5/db/d25/d8b/da8/d81/f8c to d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/d82/de4/ffb 0 2026-03-09T16:14:51.210 INFO:tasks.workunit.client.0.vm03.stdout:3/794: creat d5/d1e/d42/d4c/feb x:0 0 0 2026-03-09T16:14:51.215 INFO:tasks.workunit.client.0.vm03.stdout:0/835: rmdir d0/da/d1b/dc8/d104 39 2026-03-09T16:14:51.215 INFO:tasks.workunit.client.0.vm03.stdout:2/813: fsync db/d12/d2a/d61/d6d/f91 0 2026-03-09T16:14:51.218 INFO:tasks.workunit.client.0.vm03.stdout:4/824: creat d5/d17/ffc x:0 0 0 2026-03-09T16:14:51.218 INFO:tasks.workunit.client.0.vm03.stdout:3/795: fdatasync d5/d6d/d6a/fc6 0 2026-03-09T16:14:51.221 INFO:tasks.workunit.client.0.vm03.stdout:7/773: link d4/d2d/f90 d4/da/d5d/db0/da9/db8/ddf/ffe 0 2026-03-09T16:14:51.223 INFO:tasks.workunit.client.0.vm03.stdout:1/730: truncate d4/d6/d1d/d69/f76 2040722 0 2026-03-09T16:14:51.227 INFO:tasks.workunit.client.0.vm03.stdout:3/796: dwrite d5/d6d/d6a/fc6 [0,4194304] 0 2026-03-09T16:14:51.254 INFO:tasks.workunit.client.0.vm03.stdout:4/825: rmdir d5/db/d25/d8b 39 2026-03-09T16:14:51.258 INFO:tasks.workunit.client.0.vm03.stdout:1/731: mkdir d4/d39/d70/df0 0 2026-03-09T16:14:51.261 INFO:tasks.workunit.client.0.vm03.stdout:9/879: dwrite d2/d4/d11/d29/fc0 [0,4194304] 0 2026-03-09T16:14:51.262 INFO:tasks.workunit.client.0.vm03.stdout:6/771: getdents d9/d42/d45/d50/d80/d8a/dc1/dd4 0 2026-03-09T16:14:51.268 INFO:tasks.workunit.client.0.vm03.stdout:9/880: write d2/d4/d11/f13 [1221120,38573] 0 2026-03-09T16:14:51.273 INFO:tasks.workunit.client.0.vm03.stdout:0/836: mknod d0/da/d1b/dc8/c11d 0 2026-03-09T16:14:51.274 INFO:tasks.workunit.client.0.vm03.stdout:8/853: dwrite f8 [4194304,4194304] 0 2026-03-09T16:14:51.275 INFO:tasks.workunit.client.0.vm03.stdout:8/854: chown da/d10/d28/d4f/d68/ddc/lf4 0 1 2026-03-09T16:14:51.283 INFO:tasks.workunit.client.0.vm03.stdout:1/732: dread - d4/d6/d1d/d69/fa9 zero size 2026-03-09T16:14:51.283 INFO:tasks.workunit.client.0.vm03.stdout:3/797: dwrite d5/d2e/fd4 [4194304,4194304] 0 2026-03-09T16:14:51.285 INFO:tasks.workunit.client.0.vm03.stdout:6/772: dwrite d9/d84/fa9 [0,4194304] 0 2026-03-09T16:14:51.293 INFO:tasks.workunit.client.0.vm03.stdout:6/773: write d9/d42/d45/d50/d80/d90/db7/ff1 [78252,18071] 0 2026-03-09T16:14:51.303 INFO:tasks.workunit.client.0.vm03.stdout:9/881: mknod d2/d4/d11/d29/d2a/db3/c110 0 2026-03-09T16:14:51.311 INFO:tasks.workunit.client.0.vm03.stdout:2/814: write db/d12/d2a/d61/d6d/f8a [783904,59358] 0 2026-03-09T16:14:51.311 INFO:tasks.workunit.client.0.vm03.stdout:8/855: fsync da/d10/f33 0 2026-03-09T16:14:51.311 INFO:tasks.workunit.client.0.vm03.stdout:5/879: getdents d2/d7/de/d11/d19/d29/d90/dbe 0 2026-03-09T16:14:51.325 INFO:tasks.workunit.client.0.vm03.stdout:1/733: unlink d4/d31/d5c/ce1 0 2026-03-09T16:14:51.326 INFO:tasks.workunit.client.0.vm03.stdout:2/815: dread db/d12/d2a/f60 [0,4194304] 0 2026-03-09T16:14:51.331 INFO:tasks.workunit.client.0.vm03.stdout:3/798: rename d5/d1e/f9b to d5/d2e/fec 0 2026-03-09T16:14:51.331 INFO:tasks.workunit.client.0.vm03.stdout:6/774: creat d9/d42/d45/d65/dbf/dc9/ff8 x:0 0 0 2026-03-09T16:14:51.331 INFO:tasks.workunit.client.0.vm03.stdout:3/799: chown d5/d53/d6c/l4f 0 1 2026-03-09T16:14:51.331 INFO:tasks.workunit.client.0.vm03.stdout:3/800: chown d5/ca 189 1 2026-03-09T16:14:51.337 INFO:tasks.workunit.client.0.vm03.stdout:8/856: 
dread da/db/f6a [0,4194304] 0 2026-03-09T16:14:51.338 INFO:tasks.workunit.client.0.vm03.stdout:7/774: link d4/da/d5d/dd8/ld7 d4/da/lff 0 2026-03-09T16:14:51.339 INFO:tasks.workunit.client.0.vm03.stdout:5/880: rmdir d2/d7/de/d11/d19/d31 39 2026-03-09T16:14:51.339 INFO:tasks.workunit.client.0.vm03.stdout:9/882: dread - d2/d4/d11/d29/d2a/f8b zero size 2026-03-09T16:14:51.339 INFO:tasks.workunit.client.0.vm03.stdout:7/775: readlink d4/da/d5d/db0/l83 0 2026-03-09T16:14:51.343 INFO:tasks.workunit.client.0.vm03.stdout:7/776: write d4/d2d/d4b/fd6 [665545,314] 0 2026-03-09T16:14:51.343 INFO:tasks.workunit.client.0.vm03.stdout:1/734: unlink d4/d6/d1d/d20/d93/f48 0 2026-03-09T16:14:51.344 INFO:tasks.workunit.client.0.vm03.stdout:7/777: write d4/d2d/f52 [5246380,89448] 0 2026-03-09T16:14:51.356 INFO:tasks.workunit.client.0.vm03.stdout:3/801: mkdir d5/ded 0 2026-03-09T16:14:51.357 INFO:tasks.workunit.client.0.vm03.stdout:8/857: symlink da/d10/d28/d4f/d68/ddc/l11c 0 2026-03-09T16:14:51.365 INFO:tasks.workunit.client.0.vm03.stdout:0/837: write d0/da/d5c/db6/fbc [5042030,119170] 0 2026-03-09T16:14:51.371 INFO:tasks.workunit.client.0.vm03.stdout:6/775: dread d9/f3b [0,4194304] 0 2026-03-09T16:14:51.374 INFO:tasks.workunit.client.0.vm03.stdout:9/883: mkdir d2/d4/d11/d29/d2a/d46/dd6/d111 0 2026-03-09T16:14:51.375 INFO:tasks.workunit.client.0.vm03.stdout:9/884: readlink d2/d4/d11/d12/l94 0 2026-03-09T16:14:51.382 INFO:tasks.workunit.client.0.vm03.stdout:3/802: sync 2026-03-09T16:14:51.382 INFO:tasks.workunit.client.0.vm03.stdout:7/778: read d4/da/d5d/dd8/d22/d24/d16/f6c [291847,127210] 0 2026-03-09T16:14:51.392 INFO:tasks.workunit.client.0.vm03.stdout:8/858: unlink da/d10/d28/d4f/d85/d9c/ld7 0 2026-03-09T16:14:51.397 INFO:tasks.workunit.client.0.vm03.stdout:2/816: dwrite db/d12/d2a/d61/d6d/f81 [0,4194304] 0 2026-03-09T16:14:51.397 INFO:tasks.workunit.client.0.vm03.stdout:8/859: sync 2026-03-09T16:14:51.398 INFO:tasks.workunit.client.0.vm03.stdout:8/860: write da/d32/d79/f103 [683982,53082] 0 2026-03-09T16:14:51.398 INFO:tasks.workunit.client.0.vm03.stdout:4/826: getdents d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d9a 0 2026-03-09T16:14:51.402 INFO:tasks.workunit.client.0.vm03.stdout:0/838: mknod d0/da/d1b/de0/c11e 0 2026-03-09T16:14:51.409 INFO:tasks.workunit.client.0.vm03.stdout:6/776: unlink d9/d14/c2e 0 2026-03-09T16:14:51.409 INFO:tasks.workunit.client.0.vm03.stdout:9/885: mkdir d2/d4/d11/d29/d2a/d46/d112 0 2026-03-09T16:14:51.416 INFO:tasks.workunit.client.0.vm03.stdout:3/803: dwrite d5/d1e/d42/d4c/feb [0,4194304] 0 2026-03-09T16:14:51.417 INFO:tasks.workunit.client.0.vm03.stdout:5/881: dwrite d2/d7/d1a/d1c/f5e [0,4194304] 0 2026-03-09T16:14:51.425 INFO:tasks.workunit.client.0.vm03.stdout:7/779: dread d4/da/d45/f63 [0,4194304] 0 2026-03-09T16:14:51.429 INFO:tasks.workunit.client.0.vm03.stdout:3/804: dwrite d5/d1e/d42/d55/d86/dbe/fe4 [0,4194304] 0 2026-03-09T16:14:51.438 INFO:tasks.workunit.client.0.vm03.stdout:3/805: dread - d5/d1e/d42/d55/d86/dae/fde zero size 2026-03-09T16:14:51.446 INFO:tasks.workunit.client.0.vm03.stdout:3/806: dwrite d5/d1e/d42/d4c/feb [0,4194304] 0 2026-03-09T16:14:51.470 INFO:tasks.workunit.client.0.vm03.stdout:8/861: symlink da/d6c/d7a/l11d 0 2026-03-09T16:14:51.470 INFO:tasks.workunit.client.0.vm03.stdout:4/827: truncate d5/d17/d44/f84 946155 0 2026-03-09T16:14:51.471 INFO:tasks.workunit.client.0.vm03.stdout:2/817: chown db/d12/c19 46 1 2026-03-09T16:14:51.473 INFO:tasks.workunit.client.0.vm03.stdout:1/735: rmdir d4/d6/d3b/d6b/da5/dc0/dec 0 2026-03-09T16:14:51.474 
INFO:tasks.workunit.client.0.vm03.stdout:1/736: stat d4/d6/d3b/d6b/d25/fb8 0 2026-03-09T16:14:51.478 INFO:tasks.workunit.client.0.vm03.stdout:6/777: mkdir d9/d42/d45/d50/d80/d8a/dc1/dd4/df9 0 2026-03-09T16:14:51.479 INFO:tasks.workunit.client.0.vm03.stdout:9/886: symlink d2/d4/d11/d12/dc7/dee/l113 0 2026-03-09T16:14:51.486 INFO:tasks.workunit.client.0.vm03.stdout:8/862: sync 2026-03-09T16:14:51.489 INFO:tasks.workunit.client.0.vm03.stdout:7/780: dread d4/da/f20 [0,4194304] 0 2026-03-09T16:14:51.495 INFO:tasks.workunit.client.0.vm03.stdout:7/781: readlink d4/lb6 0 2026-03-09T16:14:51.495 INFO:tasks.workunit.client.0.vm03.stdout:7/782: chown d4/da/d45/d51/d36/cce 252 1 2026-03-09T16:14:51.495 INFO:tasks.workunit.client.0.vm03.stdout:6/778: dread d9/d42/d45/d50/d80/d8a/dc1/fe8 [0,4194304] 0 2026-03-09T16:14:51.495 INFO:tasks.workunit.client.0.vm03.stdout:3/807: readlink d5/d6d/la3 0 2026-03-09T16:14:51.501 INFO:tasks.workunit.client.0.vm03.stdout:2/818: creat db/d12/d2a/d99/de7/df9/d64/dbd/da0/f118 x:0 0 0 2026-03-09T16:14:51.515 INFO:tasks.workunit.client.0.vm03.stdout:8/863: creat da/d32/d79/f11e x:0 0 0 2026-03-09T16:14:51.517 INFO:tasks.workunit.client.0.vm03.stdout:8/864: chown da/d10/d28/f29 0 1 2026-03-09T16:14:51.521 INFO:tasks.workunit.client.0.vm03.stdout:1/737: dread d4/d6/f19 [0,4194304] 0 2026-03-09T16:14:51.522 INFO:tasks.workunit.client.0.vm03.stdout:3/808: read d5/d1e/d42/f29 [4862954,100892] 0 2026-03-09T16:14:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:51 vm05.local ceph-mon[58702]: [09/Mar/2026:16:14:50] ENGINE Bus STARTING 2026-03-09T16:14:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:51 vm05.local ceph-mon[58702]: [09/Mar/2026:16:14:50] ENGINE Serving on https://192.168.123.103:7150 2026-03-09T16:14:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:51 vm05.local ceph-mon[58702]: [09/Mar/2026:16:14:50] ENGINE Client ('192.168.123.103', 51900) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T16:14:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:51 vm05.local ceph-mon[58702]: [09/Mar/2026:16:14:50] ENGINE Serving on http://192.168.123.103:8765 2026-03-09T16:14:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:51 vm05.local ceph-mon[58702]: [09/Mar/2026:16:14:50] ENGINE Bus STARTED 2026-03-09T16:14:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:51 vm05.local ceph-mon[58702]: pgmap v5: 65 pgs: 65 active+clean; 1.7 GiB data, 6.0 GiB used, 114 GiB / 120 GiB avail 2026-03-09T16:14:51.535 INFO:tasks.workunit.client.0.vm03.stdout:0/839: truncate d0/d7/d3e/d95/ffa 1015485 0 2026-03-09T16:14:51.542 INFO:tasks.workunit.client.0.vm03.stdout:2/819: mknod db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/df7/c119 0 2026-03-09T16:14:51.543 INFO:tasks.workunit.client.0.vm03.stdout:2/820: write db/d12/d2a/d61/d6d/f8a [614576,15404] 0 2026-03-09T16:14:51.548 INFO:tasks.workunit.client.0.vm03.stdout:4/828: rename d5/db/d25/d8b/da8/df3/df7/d4d/da9/fc1 to d5/db/d25/d8b/da8/df3/df7/d33/ffd 0 2026-03-09T16:14:51.551 INFO:tasks.workunit.client.0.vm03.stdout:5/882: write d2/d7/de/d11/dbf/f102 [440397,127995] 0 2026-03-09T16:14:51.552 INFO:tasks.workunit.client.0.vm03.stdout:5/883: stat d2/d7/de/d11/d19/d29/c7c 0 2026-03-09T16:14:51.558 INFO:tasks.workunit.client.0.vm03.stdout:7/783: mkdir d4/da/dbf/deb/d100 0 2026-03-09T16:14:51.569 INFO:tasks.workunit.client.0.vm03.stdout:9/887: dwrite d2/d54/d7d/d8f/dad/def/d84/fa0 
[0,4194304] 0 2026-03-09T16:14:51.571 INFO:tasks.workunit.client.0.vm03.stdout:0/840: mkdir d0/d7/d3e/d57/d5a/d47/dce/d11f 0 2026-03-09T16:14:51.572 INFO:tasks.workunit.client.0.vm03.stdout:6/779: write d9/d42/d45/f4a [955553,73772] 0 2026-03-09T16:14:51.583 INFO:tasks.workunit.client.0.vm03.stdout:3/809: dread d5/d1e/f66 [0,4194304] 0 2026-03-09T16:14:51.584 INFO:tasks.workunit.client.0.vm03.stdout:9/888: dwrite d2/d4/d11/d29/d2a/db3/fe1 [0,4194304] 0 2026-03-09T16:14:51.588 INFO:tasks.workunit.client.0.vm03.stdout:8/865: rename da/d10/d28/d4f/d68/ddc/l96 to da/d10/d28/d64/l11f 0 2026-03-09T16:14:51.591 INFO:tasks.workunit.client.0.vm03.stdout:9/889: fsync d2/d4/d11/d12/dc7/dee/dc2/de9/ffb 0 2026-03-09T16:14:51.595 INFO:tasks.workunit.client.0.vm03.stdout:4/829: mkdir d5/db/d25/d8b/dd6/dfe 0 2026-03-09T16:14:51.595 INFO:tasks.workunit.client.0.vm03.stdout:9/890: chown d2/d4/d1f 1901576 1 2026-03-09T16:14:51.596 INFO:tasks.workunit.client.0.vm03.stdout:1/738: write d4/d31/d5c/f9e [468707,82969] 0 2026-03-09T16:14:51.598 INFO:tasks.workunit.client.0.vm03.stdout:1/739: chown d4/d6/d1d/d20/d93/l67 169 1 2026-03-09T16:14:51.599 INFO:tasks.workunit.client.0.vm03.stdout:5/884: rmdir d2/d7/d115/d24 39 2026-03-09T16:14:51.607 INFO:tasks.workunit.client.0.vm03.stdout:2/821: mkdir db/d12/d11a 0 2026-03-09T16:14:51.607 INFO:tasks.workunit.client.0.vm03.stdout:6/780: truncate d9/f5c 287069 0 2026-03-09T16:14:51.608 INFO:tasks.workunit.client.0.vm03.stdout:4/830: dread d5/dd/d1f/fbb [0,4194304] 0 2026-03-09T16:14:51.610 INFO:tasks.workunit.client.0.vm03.stdout:1/740: dwrite d4/d6/da2/fe7 [0,4194304] 0 2026-03-09T16:14:51.616 INFO:tasks.workunit.client.0.vm03.stdout:1/741: stat d4/d6/da2 0 2026-03-09T16:14:51.616 INFO:tasks.workunit.client.0.vm03.stdout:7/784: rename d4/d2d/d4b/c82 to d4/da/d5d/dd8/d22/d24/d15/c101 0 2026-03-09T16:14:51.619 INFO:tasks.workunit.client.0.vm03.stdout:2/822: write db/d12/d2a/d61/d6d/f81 [4516920,22369] 0 2026-03-09T16:14:51.623 INFO:tasks.workunit.client.0.vm03.stdout:1/742: dread d4/d6/d1d/d20/d23/f62 [0,4194304] 0 2026-03-09T16:14:51.635 INFO:tasks.workunit.client.0.vm03.stdout:4/831: fsync d5/db/d25/d8b/da8/dbe/fc5 0 2026-03-09T16:14:51.637 INFO:tasks.workunit.client.0.vm03.stdout:6/781: mkdir d9/d42/d45/dfa 0 2026-03-09T16:14:51.639 INFO:tasks.workunit.client.0.vm03.stdout:9/891: read d2/de/d88/f86 [3575089,65731] 0 2026-03-09T16:14:51.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:51 vm03.local ceph-mon[51019]: [09/Mar/2026:16:14:50] ENGINE Bus STARTING 2026-03-09T16:14:51.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:51 vm03.local ceph-mon[51019]: [09/Mar/2026:16:14:50] ENGINE Serving on https://192.168.123.103:7150 2026-03-09T16:14:51.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:51 vm03.local ceph-mon[51019]: [09/Mar/2026:16:14:50] ENGINE Client ('192.168.123.103', 51900) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T16:14:51.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:51 vm03.local ceph-mon[51019]: [09/Mar/2026:16:14:50] ENGINE Serving on http://192.168.123.103:8765 2026-03-09T16:14:51.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:51 vm03.local ceph-mon[51019]: [09/Mar/2026:16:14:50] ENGINE Bus STARTED 2026-03-09T16:14:51.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:51 vm03.local ceph-mon[51019]: pgmap v5: 65 pgs: 65 active+clean; 1.7 GiB data, 6.0 GiB used, 114 GiB / 120 GiB avail 
2026-03-09T16:14:51.645 INFO:tasks.workunit.client.0.vm03.stdout:2/823: chown db/d12/c19 0 1 2026-03-09T16:14:51.653 INFO:tasks.workunit.client.0.vm03.stdout:1/743: unlink d4/d7b/fde 0 2026-03-09T16:14:51.653 INFO:tasks.workunit.client.0.vm03.stdout:1/744: chown d4/d6/d1d/d20/d93/c94 30520 1 2026-03-09T16:14:51.653 INFO:tasks.workunit.client.0.vm03.stdout:3/810: link d5/d1e/d42/d8b/fa5 d5/d53/d88/dd7/fee 0 2026-03-09T16:14:51.653 INFO:tasks.workunit.client.0.vm03.stdout:3/811: chown d5/d2e/cb4 1 1 2026-03-09T16:14:51.653 INFO:tasks.workunit.client.0.vm03.stdout:3/812: chown d5/d1e/d42/d55/d86/dbe 0 1 2026-03-09T16:14:51.655 INFO:tasks.workunit.client.0.vm03.stdout:4/832: sync 2026-03-09T16:14:51.656 INFO:tasks.workunit.client.0.vm03.stdout:0/841: rename d0/d7/d48/f63 to d0/d7/d3e/d57/d5a/d47/dce/df6/f120 0 2026-03-09T16:14:51.661 INFO:tasks.workunit.client.0.vm03.stdout:1/745: stat d4/d6/d3b/d6b/c2c 0 2026-03-09T16:14:51.661 INFO:tasks.workunit.client.0.vm03.stdout:3/813: truncate d5/d1e/d42/f29 1383556 0 2026-03-09T16:14:51.669 INFO:tasks.workunit.client.0.vm03.stdout:8/866: rename da/d10/d28/d4f/daf/cd6 to da/d6c/d7a/de4/c120 0 2026-03-09T16:14:51.670 INFO:tasks.workunit.client.0.vm03.stdout:8/867: dread - da/d10/d28/d4f/d85/d9c/d10e/f104 zero size 2026-03-09T16:14:51.677 INFO:tasks.workunit.client.0.vm03.stdout:5/885: dwrite d2/d7/de/d11/f32 [8388608,4194304] 0 2026-03-09T16:14:51.683 INFO:tasks.workunit.client.0.vm03.stdout:5/886: sync 2026-03-09T16:14:51.685 INFO:tasks.workunit.client.0.vm03.stdout:7/785: dwrite d4/da/d5d/dd8/f37 [0,4194304] 0 2026-03-09T16:14:51.690 INFO:tasks.workunit.client.0.vm03.stdout:7/786: chown d4/da/c31 1285 1 2026-03-09T16:14:51.695 INFO:tasks.workunit.client.0.vm03.stdout:7/787: chown d4/da/d5d/db0/d61/lf5 15 1 2026-03-09T16:14:51.704 INFO:tasks.workunit.client.0.vm03.stdout:7/788: dwrite d4/da/d5d/dd8/d22/f33 [0,4194304] 0 2026-03-09T16:14:51.714 INFO:tasks.workunit.client.0.vm03.stdout:7/789: dwrite d4/da/d5d/dd8/d22/f48 [0,4194304] 0 2026-03-09T16:14:51.719 INFO:tasks.workunit.client.0.vm03.stdout:0/842: mknod d0/d7/d3e/d57/d5a/d5f/db2/d8e/dba/c121 0 2026-03-09T16:14:51.719 INFO:tasks.workunit.client.0.vm03.stdout:6/782: truncate d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/feb 1151987 0 2026-03-09T16:14:51.719 INFO:tasks.workunit.client.0.vm03.stdout:9/892: getdents d2/d4/d11/d29/d2a/d104 0 2026-03-09T16:14:51.719 INFO:tasks.workunit.client.0.vm03.stdout:7/790: fsync d4/da/d5d/db0/d61/fd2 0 2026-03-09T16:14:51.722 INFO:tasks.workunit.client.0.vm03.stdout:4/833: creat d5/db/d25/d8b/da8/df3/df7/d4d/da9/fff x:0 0 0 2026-03-09T16:14:51.727 INFO:tasks.workunit.client.0.vm03.stdout:9/893: dread d2/d4/d11/d12/dc7/dee/fe4 [0,4194304] 0 2026-03-09T16:14:51.731 INFO:tasks.workunit.client.0.vm03.stdout:1/746: mkdir d4/db/d59/df1 0 2026-03-09T16:14:51.732 INFO:tasks.workunit.client.0.vm03.stdout:7/791: dwrite d4/f8d [0,4194304] 0 2026-03-09T16:14:51.735 INFO:tasks.workunit.client.0.vm03.stdout:8/868: mknod da/d10/d28/c121 0 2026-03-09T16:14:51.736 INFO:tasks.workunit.client.0.vm03.stdout:8/869: read - da/d6c/f10c zero size 2026-03-09T16:14:51.746 INFO:tasks.workunit.client.0.vm03.stdout:6/783: fsync d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f4e 0 2026-03-09T16:14:51.749 INFO:tasks.workunit.client.0.vm03.stdout:0/843: sync 2026-03-09T16:14:51.755 INFO:tasks.workunit.client.0.vm03.stdout:4/834: mknod d5/db/c100 0 2026-03-09T16:14:51.755 INFO:tasks.workunit.client.0.vm03.stdout:9/894: mkdir d2/d4/d11/d29/d2a/d38/dcd/d114 0 2026-03-09T16:14:51.756 
INFO:tasks.workunit.client.0.vm03.stdout:2/824: mknod db/d12/da5/dc2/c11b 0 2026-03-09T16:14:51.758 INFO:tasks.workunit.client.0.vm03.stdout:1/747: symlink d4/d6/d1d/d3d/lf2 0 2026-03-09T16:14:51.762 INFO:tasks.workunit.client.0.vm03.stdout:5/887: dread d2/d7/d3c/f9a [0,4194304] 0 2026-03-09T16:14:51.767 INFO:tasks.workunit.client.0.vm03.stdout:2/825: sync 2026-03-09T16:14:51.787 INFO:tasks.workunit.client.0.vm03.stdout:9/895: mknod d2/de/d88/c115 0 2026-03-09T16:14:51.788 INFO:tasks.workunit.client.0.vm03.stdout:9/896: readlink d2/d4/d11/d12/l94 0 2026-03-09T16:14:51.814 INFO:tasks.workunit.client.0.vm03.stdout:1/748: dread d4/db/f21 [0,4194304] 0 2026-03-09T16:14:51.815 INFO:tasks.workunit.client.0.vm03.stdout:1/749: write d4/d31/d5c/da8/fd0 [72833,71279] 0 2026-03-09T16:14:51.818 INFO:tasks.workunit.client.0.vm03.stdout:7/792: symlink d4/da/d5d/db0/l102 0 2026-03-09T16:14:51.827 INFO:tasks.workunit.client.0.vm03.stdout:3/814: truncate d5/d1e/d42/d4c/feb 4160065 0 2026-03-09T16:14:51.828 INFO:tasks.workunit.client.0.vm03.stdout:9/897: truncate d2/d4/d11/d12/d28/f2f 4861134 0 2026-03-09T16:14:51.829 INFO:tasks.workunit.client.0.vm03.stdout:9/898: chown d2/d4/d1f/c62 62746585 1 2026-03-09T16:14:51.836 INFO:tasks.workunit.client.0.vm03.stdout:0/844: write d0/d7/d3e/d57/fa8 [1705924,11187] 0 2026-03-09T16:14:51.841 INFO:tasks.workunit.client.0.vm03.stdout:2/826: mkdir db/d12/d2a/d11c 0 2026-03-09T16:14:51.848 INFO:tasks.workunit.client.0.vm03.stdout:4/835: dwrite d5/db/d25/d8b/da8/df3/df7/d33/f69 [0,4194304] 0 2026-03-09T16:14:51.850 INFO:tasks.workunit.client.0.vm03.stdout:5/888: write d2/d7/de/d33/f8b [4450939,33429] 0 2026-03-09T16:14:51.878 INFO:tasks.workunit.client.0.vm03.stdout:8/870: link da/d10/d28/f8c da/d10/d28/d4f/f122 0 2026-03-09T16:14:51.878 INFO:tasks.workunit.client.0.vm03.stdout:2/827: creat db/d12/d2a/d99/de7/df9/d64/dbd/f11d x:0 0 0 2026-03-09T16:14:51.878 INFO:tasks.workunit.client.0.vm03.stdout:1/750: write d4/d31/f81 [1014953,6747] 0 2026-03-09T16:14:51.880 INFO:tasks.workunit.client.0.vm03.stdout:2/828: stat db/d12/da5/fd2 0 2026-03-09T16:14:51.880 INFO:tasks.workunit.client.0.vm03.stdout:6/784: link d9/c75 d9/d42/d45/d50/d80/d8a/cfb 0 2026-03-09T16:14:51.887 INFO:tasks.workunit.client.0.vm03.stdout:9/899: symlink d2/d54/l116 0 2026-03-09T16:14:51.888 INFO:tasks.workunit.client.0.vm03.stdout:7/793: link d4/da/l4a d4/da/dbf/deb/l103 0 2026-03-09T16:14:51.893 INFO:tasks.workunit.client.0.vm03.stdout:8/871: creat da/d10/d28/d4f/daf/f123 x:0 0 0 2026-03-09T16:14:51.894 INFO:tasks.workunit.client.0.vm03.stdout:8/872: read da/d32/f61 [143896,75250] 0 2026-03-09T16:14:51.894 INFO:tasks.workunit.client.0.vm03.stdout:4/836: write d5/dd/d1f/f4c [3296653,130411] 0 2026-03-09T16:14:51.902 INFO:tasks.workunit.client.0.vm03.stdout:1/751: rmdir d4/d6/d1d/d20 39 2026-03-09T16:14:51.905 INFO:tasks.workunit.client.0.vm03.stdout:3/815: dwrite d5/d1e/d42/d4c/feb [0,4194304] 0 2026-03-09T16:14:51.906 INFO:tasks.workunit.client.0.vm03.stdout:0/845: rename d0/da/d1b/d9b/c6d to d0/d7/d3e/d57/d5a/d5f/db2/c122 0 2026-03-09T16:14:51.907 INFO:tasks.workunit.client.0.vm03.stdout:0/846: read d0/da/d1b/f46 [687965,126289] 0 2026-03-09T16:14:51.923 INFO:tasks.workunit.client.0.vm03.stdout:9/900: truncate d2/d4/d11/d12/d28/fd5 428606 0 2026-03-09T16:14:51.933 INFO:tasks.workunit.client.0.vm03.stdout:3/816: dread d5/fb [0,4194304] 0 2026-03-09T16:14:51.940 INFO:tasks.workunit.client.0.vm03.stdout:8/873: dwrite da/db/f6a [4194304,4194304] 0 2026-03-09T16:14:51.940 
INFO:tasks.workunit.client.0.vm03.stdout:2/829: dwrite db/d12/f69 [0,4194304] 0 2026-03-09T16:14:51.944 INFO:tasks.workunit.client.0.vm03.stdout:6/785: mknod d9/d42/cfc 0 2026-03-09T16:14:51.945 INFO:tasks.workunit.client.0.vm03.stdout:4/837: dread - d5/dd/d1f/d95/fad zero size 2026-03-09T16:14:51.949 INFO:tasks.workunit.client.0.vm03.stdout:4/838: dread - d5/dd/d1f/f59 zero size 2026-03-09T16:14:51.950 INFO:tasks.workunit.client.0.vm03.stdout:1/752: mkdir d4/d39/d7f/df3 0 2026-03-09T16:14:51.963 INFO:tasks.workunit.client.0.vm03.stdout:5/889: rename d2/f5a to d2/d7/d115/d16/d5c/f12a 0 2026-03-09T16:14:51.966 INFO:tasks.workunit.client.0.vm03.stdout:9/901: creat d2/de/f117 x:0 0 0 2026-03-09T16:14:51.967 INFO:tasks.workunit.client.0.vm03.stdout:9/902: fsync d2/d4/d11/d29/fc0 0 2026-03-09T16:14:51.971 INFO:tasks.workunit.client.0.vm03.stdout:3/817: mkdir d5/d53/d6c/d79/d91/dc9/def 0 2026-03-09T16:14:51.975 INFO:tasks.workunit.client.0.vm03.stdout:8/874: fsync da/db/f53 0 2026-03-09T16:14:51.983 INFO:tasks.workunit.client.0.vm03.stdout:4/839: mkdir d5/db/d25/d8b/da8/df3/df7/d33/d79/d101 0 2026-03-09T16:14:51.985 INFO:tasks.workunit.client.0.vm03.stdout:3/818: dwrite d5/d1e/d42/d55/d86/dae/fde [0,4194304] 0 2026-03-09T16:14:51.997 INFO:tasks.workunit.client.0.vm03.stdout:1/753: symlink d4/d6/d3b/d63/lf4 0 2026-03-09T16:14:51.999 INFO:tasks.workunit.client.0.vm03.stdout:5/890: rmdir d2/d7/d115/d16 39 2026-03-09T16:14:52.005 INFO:tasks.workunit.client.0.vm03.stdout:9/903: creat d2/d4/d1f/d83/f118 x:0 0 0 2026-03-09T16:14:52.006 INFO:tasks.workunit.client.0.vm03.stdout:2/830: creat db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/dda/f11e x:0 0 0 2026-03-09T16:14:52.012 INFO:tasks.workunit.client.0.vm03.stdout:4/840: symlink d5/db/d25/d8b/da8/df3/df7/d4d/da9/l102 0 2026-03-09T16:14:52.015 INFO:tasks.workunit.client.0.vm03.stdout:1/754: chown d4/d6/d1d/d20/d93/l44 46 1 2026-03-09T16:14:52.016 INFO:tasks.workunit.client.0.vm03.stdout:9/904: dwrite d2/d4/d11/d29/d2a/d46/f9e [0,4194304] 0 2026-03-09T16:14:52.020 INFO:tasks.workunit.client.0.vm03.stdout:5/891: dread - d2/d7/de/d11/d19/d31/d35/f11b zero size 2026-03-09T16:14:52.020 INFO:tasks.workunit.client.0.vm03.stdout:1/755: write d4/d6/d1d/d20/f2a [2583076,121227] 0 2026-03-09T16:14:52.021 INFO:tasks.workunit.client.0.vm03.stdout:5/892: dread - d2/d7/de9/f103 zero size 2026-03-09T16:14:52.021 INFO:tasks.workunit.client.0.vm03.stdout:1/756: read d4/f1b [288739,1640] 0 2026-03-09T16:14:52.035 INFO:tasks.workunit.client.0.vm03.stdout:9/905: dwrite d2/d4/d11/d12/d28/fc5 [0,4194304] 0 2026-03-09T16:14:52.045 INFO:tasks.workunit.client.0.vm03.stdout:7/794: link d4/da/d5d/dd8/d22/d24/d15/l1f d4/da/d45/l104 0 2026-03-09T16:14:52.051 INFO:tasks.workunit.client.0.vm03.stdout:5/893: dread d2/d7/de/faa [4194304,4194304] 0 2026-03-09T16:14:52.054 INFO:tasks.workunit.client.0.vm03.stdout:8/875: unlink da/d10/d28/d4f/daf/fb7 0 2026-03-09T16:14:52.079 INFO:tasks.workunit.client.0.vm03.stdout:2/831: write db/d12/d2a/d99/de7/df9/f87 [811368,84203] 0 2026-03-09T16:14:52.079 INFO:tasks.workunit.client.0.vm03.stdout:6/786: creat d9/d42/d45/ffd x:0 0 0 2026-03-09T16:14:52.092 INFO:tasks.workunit.client.0.vm03.stdout:0/847: rename d0/d7/d3e/d57/d5a/d5f/db2/d8e/dba/c121 to d0/d7/d3e/d57/d5a/d5f/db2/c123 0 2026-03-09T16:14:52.098 INFO:tasks.workunit.client.0.vm03.stdout:4/841: rmdir d5/db/d25/d8b/da8/df3/df7/dcc 39 2026-03-09T16:14:52.101 INFO:tasks.workunit.client.0.vm03.stdout:9/906: creat d2/d4/d11/d29/d2a/d38/dcd/f119 x:0 0 0 2026-03-09T16:14:52.111 
INFO:tasks.workunit.client.0.vm03.stdout:4/842: dread d5/db/d25/d8b/da8/f62 [8388608,4194304] 0 2026-03-09T16:14:52.112 INFO:tasks.workunit.client.0.vm03.stdout:3/819: rename d5/d53/d6c/l41 to d5/d1e/d42/d34/dd2/lf0 0 2026-03-09T16:14:52.114 INFO:tasks.workunit.client.0.vm03.stdout:1/757: mknod d4/d39/d7f/df3/cf5 0 2026-03-09T16:14:52.116 INFO:tasks.workunit.client.0.vm03.stdout:8/876: fsync da/d1d/f4a 0 2026-03-09T16:14:52.117 INFO:tasks.workunit.client.0.vm03.stdout:1/758: read - d4/d6/d3b/d6b/da5/dc0/fed zero size 2026-03-09T16:14:52.135 INFO:tasks.workunit.client.0.vm03.stdout:7/795: link d4/da/d45/d51/f9e d4/da/dbf/f105 0 2026-03-09T16:14:52.138 INFO:tasks.workunit.client.0.vm03.stdout:0/848: dread d0/da/d7a/fac [0,4194304] 0 2026-03-09T16:14:52.141 INFO:tasks.workunit.client.0.vm03.stdout:6/787: write d9/d42/d45/d50/d80/fa1 [4543834,9589] 0 2026-03-09T16:14:52.142 INFO:tasks.workunit.client.0.vm03.stdout:5/894: dwrite d2/d7/d3c/d3d/f56 [0,4194304] 0 2026-03-09T16:14:52.144 INFO:tasks.workunit.client.0.vm03.stdout:8/877: fdatasync da/d32/f66 0 2026-03-09T16:14:52.146 INFO:tasks.workunit.client.0.vm03.stdout:8/878: chown da/def 0 1 2026-03-09T16:14:52.147 INFO:tasks.workunit.client.0.vm03.stdout:7/796: write d4/da/d45/d51/d36/ff6 [1017639,116417] 0 2026-03-09T16:14:52.148 INFO:tasks.workunit.client.0.vm03.stdout:4/843: dwrite d5/d17/f83 [0,4194304] 0 2026-03-09T16:14:52.154 INFO:tasks.workunit.client.0.vm03.stdout:6/788: unlink d9/d42/d45/d50/d80/d8a/d9c/l82 0 2026-03-09T16:14:52.158 INFO:tasks.workunit.client.0.vm03.stdout:1/759: creat d4/db/d59/df1/ff6 x:0 0 0 2026-03-09T16:14:52.162 INFO:tasks.workunit.client.0.vm03.stdout:2/832: rename db/d12/d2a/d61/c4e to db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/c11f 0 2026-03-09T16:14:52.162 INFO:tasks.workunit.client.0.vm03.stdout:0/849: dread d0/da/d5c/f41 [0,4194304] 0 2026-03-09T16:14:52.188 INFO:tasks.workunit.client.0.vm03.stdout:3/820: dwrite d5/d53/d88/dd3/fba [0,4194304] 0 2026-03-09T16:14:52.189 INFO:tasks.workunit.client.0.vm03.stdout:3/821: fsync d5/d1e/d42/f74 0 2026-03-09T16:14:52.190 INFO:tasks.workunit.client.0.vm03.stdout:3/822: chown d5/d2e/c4d 423 1 2026-03-09T16:14:52.204 INFO:tasks.workunit.client.0.vm03.stdout:7/797: creat d4/da/d5d/db0/d61/f106 x:0 0 0 2026-03-09T16:14:52.207 INFO:tasks.workunit.client.0.vm03.stdout:4/844: fdatasync d5/dd/d1f/d5f/f7c 0 2026-03-09T16:14:52.220 INFO:tasks.workunit.client.0.vm03.stdout:0/850: mknod d0/da/d5c/db6/c124 0 2026-03-09T16:14:52.233 INFO:tasks.workunit.client.0.vm03.stdout:3/823: mkdir d5/d53/d88/dd7/df1 0 2026-03-09T16:14:52.245 INFO:tasks.workunit.client.0.vm03.stdout:6/789: write d9/d14/f31 [577218,118227] 0 2026-03-09T16:14:52.249 INFO:tasks.workunit.client.0.vm03.stdout:5/895: dread d2/d7/d115/d16/d5c/dfc/d106/d3b/fa2 [0,4194304] 0 2026-03-09T16:14:52.258 INFO:tasks.workunit.client.0.vm03.stdout:1/760: creat d4/d6/d3b/dbe/ff7 x:0 0 0 2026-03-09T16:14:52.259 INFO:tasks.workunit.client.0.vm03.stdout:6/790: dread d9/d14/f29 [0,4194304] 0 2026-03-09T16:14:52.262 INFO:tasks.workunit.client.0.vm03.stdout:6/791: write d9/d42/d45/d50/d80/d8a/d9c/d97/f9d [1031978,96321] 0 2026-03-09T16:14:52.265 INFO:tasks.workunit.client.0.vm03.stdout:0/851: fsync d0/da/d5c/f66 0 2026-03-09T16:14:52.266 INFO:tasks.workunit.client.0.vm03.stdout:0/852: chown d0/da/d5c/db6 707223158 1 2026-03-09T16:14:52.268 INFO:tasks.workunit.client.0.vm03.stdout:5/896: dread - d2/d7/d115/d24/d27/d43/f120 zero size 2026-03-09T16:14:52.274 INFO:tasks.workunit.client.0.vm03.stdout:3/824: sync 
2026-03-09T16:14:52.278 INFO:tasks.workunit.client.0.vm03.stdout:8/879: link da/d10/d28/f29 da/d6c/d7a/de4/f124 0 2026-03-09T16:14:52.278 INFO:tasks.workunit.client.0.vm03.stdout:3/825: sync 2026-03-09T16:14:52.278 INFO:tasks.workunit.client.0.vm03.stdout:8/880: readlink da/d10/d28/l9a 0 2026-03-09T16:14:52.283 INFO:tasks.workunit.client.0.vm03.stdout:4/845: truncate d5/db/d25/d8b/da8/df3/df7/d33/fb6 1072374 0 2026-03-09T16:14:52.285 INFO:tasks.workunit.client.0.vm03.stdout:9/907: rename d2/d54/f69 to d2/d4/f11a 0 2026-03-09T16:14:52.289 INFO:tasks.workunit.client.0.vm03.stdout:9/908: dread d2/d4/d11/d12/dc7/dee/fcb [0,4194304] 0 2026-03-09T16:14:52.293 INFO:tasks.workunit.client.0.vm03.stdout:0/853: write d0/da/d1b/ffd [982101,104523] 0 2026-03-09T16:14:52.298 INFO:tasks.workunit.client.0.vm03.stdout:6/792: write d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f1c [2831328,30891] 0 2026-03-09T16:14:52.303 INFO:tasks.workunit.client.0.vm03.stdout:7/798: link d4/da/d45/d51/d36/f6f d4/da/d45/d51/d36/f107 0 2026-03-09T16:14:52.303 INFO:tasks.workunit.client.0.vm03.stdout:7/799: fdatasync d4/da/d5d/dd8/f44 0 2026-03-09T16:14:52.304 INFO:tasks.workunit.client.0.vm03.stdout:7/800: write d4/da/d5d/dd8/f44 [207078,104625] 0 2026-03-09T16:14:52.308 INFO:tasks.workunit.client.0.vm03.stdout:8/881: rmdir da/d32/db5 39 2026-03-09T16:14:52.309 INFO:tasks.workunit.client.0.vm03.stdout:8/882: write da/d32/d79/f11e [771668,89500] 0 2026-03-09T16:14:52.317 INFO:tasks.workunit.client.0.vm03.stdout:4/846: dread d5/db/d25/d8b/da8/df3/df7/d33/d79/f89 [0,4194304] 0 2026-03-09T16:14:52.317 INFO:tasks.workunit.client.0.vm03.stdout:4/847: chown d5/dd/dba 260299 1 2026-03-09T16:14:52.317 INFO:tasks.workunit.client.0.vm03.stdout:5/897: write d2/d7/d115/d24/fef [47462,34317] 0 2026-03-09T16:14:52.319 INFO:tasks.workunit.client.0.vm03.stdout:3/826: write d5/d53/fcc [986719,93750] 0 2026-03-09T16:14:52.322 INFO:tasks.workunit.client.0.vm03.stdout:4/848: chown d5/db/d25/d8b/da8/df3/df7/fa1 6667108 1 2026-03-09T16:14:52.325 INFO:tasks.workunit.client.0.vm03.stdout:2/833: rename db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/dda/ffd to db/d12/d2a/d61/d6d/f120 0 2026-03-09T16:14:52.325 INFO:tasks.workunit.client.0.vm03.stdout:2/834: readlink db/laa 0 2026-03-09T16:14:52.331 INFO:tasks.workunit.client.0.vm03.stdout:0/854: unlink d0/d7/f56 0 2026-03-09T16:14:52.336 INFO:tasks.workunit.client.0.vm03.stdout:7/801: dread - d4/da/d5d/dd8/d22/d24/d16/d6e/fa1 zero size 2026-03-09T16:14:52.339 INFO:tasks.workunit.client.0.vm03.stdout:8/883: truncate da/d32/d79/f84 1138655 0 2026-03-09T16:14:52.341 INFO:tasks.workunit.client.0.vm03.stdout:8/884: read da/d10/d28/d4f/d68/fb3 [4182365,79152] 0 2026-03-09T16:14:52.350 INFO:tasks.workunit.client.0.vm03.stdout:7/802: dread d4/d2d/f32 [0,4194304] 0 2026-03-09T16:14:52.356 INFO:tasks.workunit.client.0.vm03.stdout:4/849: fsync d5/db/d25/d8b/da8/df3/df7/d4d/f85 0 2026-03-09T16:14:52.357 INFO:tasks.workunit.client.0.vm03.stdout:4/850: read - d5/dd/fe9 zero size 2026-03-09T16:14:52.358 INFO:tasks.workunit.client.0.vm03.stdout:7/803: sync 2026-03-09T16:14:52.363 INFO:tasks.workunit.client.0.vm03.stdout:9/909: rename d2/d4/d11/f6c to d2/d4/d1f/f11b 0 2026-03-09T16:14:52.369 INFO:tasks.workunit.client.0.vm03.stdout:2/835: write db/d12/d2a/d99/de7/df9/d64/fc0 [400726,7238] 0 2026-03-09T16:14:52.370 INFO:tasks.workunit.client.0.vm03.stdout:2/836: fdatasync db/d12/d2a/d99/fcd 0 2026-03-09T16:14:52.376 INFO:tasks.workunit.client.0.vm03.stdout:0/855: dread d0/d7/d3e/d57/f90 [0,4194304] 0 
2026-03-09T16:14:52.378 INFO:tasks.workunit.client.0.vm03.stdout:7/804: dwrite d4/da/d45/d51/d36/f94 [4194304,4194304] 0 2026-03-09T16:14:52.381 INFO:tasks.workunit.client.0.vm03.stdout:5/898: truncate d2/d7/d115/d24/d27/d43/d4b/fd1 1412722 0 2026-03-09T16:14:52.381 INFO:tasks.workunit.client.0.vm03.stdout:1/761: getdents d4/d6/da2 0 2026-03-09T16:14:52.381 INFO:tasks.workunit.client.0.vm03.stdout:4/851: creat d5/db/d25/d8b/da8/d81/f103 x:0 0 0 2026-03-09T16:14:52.381 INFO:tasks.workunit.client.0.vm03.stdout:6/793: write d9/d42/d45/d50/fba [816794,73074] 0 2026-03-09T16:14:52.383 INFO:tasks.workunit.client.0.vm03.stdout:8/885: chown da/d10/d28/l50 5277 1 2026-03-09T16:14:52.385 INFO:tasks.workunit.client.0.vm03.stdout:8/886: chown da/f114 5 1 2026-03-09T16:14:52.389 INFO:tasks.workunit.client.0.vm03.stdout:7/805: sync 2026-03-09T16:14:52.390 INFO:tasks.workunit.client.0.vm03.stdout:5/899: dwrite d2/d7/d115/d24/d27/fc3 [0,4194304] 0 2026-03-09T16:14:52.402 INFO:tasks.workunit.client.0.vm03.stdout:7/806: dread d4/da/d45/d51/f91 [0,4194304] 0 2026-03-09T16:14:52.405 INFO:tasks.workunit.client.0.vm03.stdout:3/827: rename d5/d1e/d42/d55/d86 to d5/d6d/db9/df2 0 2026-03-09T16:14:52.406 INFO:tasks.workunit.client.0.vm03.stdout:6/794: dread d9/d42/d45/f4a [0,4194304] 0 2026-03-09T16:14:52.409 INFO:tasks.workunit.client.0.vm03.stdout:7/807: dread d4/d2d/d4b/f4c [0,4194304] 0 2026-03-09T16:14:52.418 INFO:tasks.workunit.client.0.vm03.stdout:1/762: symlink d4/d6/d1d/d69/lf8 0 2026-03-09T16:14:52.419 INFO:tasks.workunit.client.0.vm03.stdout:1/763: chown d4/db 2560 1 2026-03-09T16:14:52.421 INFO:tasks.workunit.client.0.vm03.stdout:4/852: chown f1 3384040 1 2026-03-09T16:14:52.423 INFO:tasks.workunit.client.0.vm03.stdout:3/828: sync 2026-03-09T16:14:52.438 INFO:tasks.workunit.client.0.vm03.stdout:2/837: dread db/d12/f63 [0,4194304] 0 2026-03-09T16:14:52.461 INFO:tasks.workunit.client.0.vm03.stdout:8/887: creat da/d32/d79/f125 x:0 0 0 2026-03-09T16:14:52.464 INFO:tasks.workunit.client.0.vm03.stdout:8/888: dwrite da/d32/d79/f125 [0,4194304] 0 2026-03-09T16:14:52.476 INFO:tasks.workunit.client.0.vm03.stdout:5/900: read - d2/fd4 zero size 2026-03-09T16:14:52.478 INFO:tasks.workunit.client.0.vm03.stdout:9/910: mkdir d2/d54/d7d/dd3/d10e/d11c 0 2026-03-09T16:14:52.480 INFO:tasks.workunit.client.0.vm03.stdout:7/808: mknod d4/da/d45/d51/dea/c108 0 2026-03-09T16:14:52.481 INFO:tasks.workunit.client.0.vm03.stdout:8/889: dread da/d10/d28/d64/fed [0,4194304] 0 2026-03-09T16:14:52.482 INFO:tasks.workunit.client.0.vm03.stdout:0/856: truncate d0/da/d5c/f66 2543327 0 2026-03-09T16:14:52.482 INFO:tasks.workunit.client.0.vm03.stdout:8/890: read - da/db/da8/f107 zero size 2026-03-09T16:14:52.483 INFO:tasks.workunit.client.0.vm03.stdout:4/853: creat d5/d17/da0/f104 x:0 0 0 2026-03-09T16:14:52.486 INFO:tasks.workunit.client.0.vm03.stdout:3/829: dread - d5/d44/d61/fb8 zero size 2026-03-09T16:14:52.488 INFO:tasks.workunit.client.0.vm03.stdout:2/838: rmdir db/d12/d2a/d61/dbe 39 2026-03-09T16:14:52.496 INFO:tasks.workunit.client.0.vm03.stdout:5/901: creat d2/d7/de/d11/d19/d29/d90/dbe/df5/f12b x:0 0 0 2026-03-09T16:14:52.496 INFO:tasks.workunit.client.0.vm03.stdout:5/902: fsync d2/d7/d115/d16/d5c/dfc/d106/d3b/f88 0 2026-03-09T16:14:52.511 INFO:tasks.workunit.client.0.vm03.stdout:6/795: mkdir d9/d42/d45/d50/d80/d8a/dc1/dd4/de5/dfe 0 2026-03-09T16:14:52.515 INFO:tasks.workunit.client.0.vm03.stdout:6/796: read d9/d42/d45/d50/d80/d90/f64 [1222850,130245] 0 2026-03-09T16:14:52.516 INFO:tasks.workunit.client.0.vm03.stdout:7/809: creat 
d4/da/d5d/dd8/d22/f109 x:0 0 0 2026-03-09T16:14:52.519 INFO:tasks.workunit.client.0.vm03.stdout:1/764: truncate d4/d6/f15 1359846 0 2026-03-09T16:14:52.522 INFO:tasks.workunit.client.0.vm03.stdout:9/911: dwrite d2/d4/d11/d29/d2a/d38/fb7 [0,4194304] 0 2026-03-09T16:14:52.525 INFO:tasks.workunit.client.0.vm03.stdout:0/857: rename d0/d7/d3e/d57/d5a/d82/dd2/d11a to d0/d7/d3e/d57/d5a/d82/d89/def/d125 0 2026-03-09T16:14:52.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:52 vm05.local ceph-mon[58702]: mgrmap e29: vm03.gbgzmu(active, since 4s), standbys: vm05.dygxfv 2026-03-09T16:14:52.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:52 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:52.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:52 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:52.528 INFO:tasks.workunit.client.0.vm03.stdout:0/858: chown d0/da/d5c/c10e 186 1 2026-03-09T16:14:52.528 INFO:tasks.workunit.client.0.vm03.stdout:3/830: chown d5/d44/d61/cbc 172661 1 2026-03-09T16:14:52.534 INFO:tasks.workunit.client.0.vm03.stdout:2/839: mkdir db/d12/da5/de4/d121 0 2026-03-09T16:14:52.539 INFO:tasks.workunit.client.0.vm03.stdout:7/810: rmdir d4/da/d5d/db0/d9d 39 2026-03-09T16:14:52.540 INFO:tasks.workunit.client.0.vm03.stdout:4/854: dread d5/db/f34 [0,4194304] 0 2026-03-09T16:14:52.541 INFO:tasks.workunit.client.0.vm03.stdout:0/859: stat d0/da/d5c/f33 0 2026-03-09T16:14:52.547 INFO:tasks.workunit.client.0.vm03.stdout:8/891: mkdir da/d32/db5/d126 0 2026-03-09T16:14:52.548 INFO:tasks.workunit.client.0.vm03.stdout:9/912: fdatasync d2/d4/d11/d12/dc7/dee/fcb 0 2026-03-09T16:14:52.549 INFO:tasks.workunit.client.0.vm03.stdout:5/903: dwrite d2/d7/de/d11/ff4 [0,4194304] 0 2026-03-09T16:14:52.564 INFO:tasks.workunit.client.0.vm03.stdout:4/855: write d5/dd/d1f/d5f/f98 [1152669,116118] 0 2026-03-09T16:14:52.583 INFO:tasks.workunit.client.0.vm03.stdout:6/797: rmdir d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/dad 39 2026-03-09T16:14:52.584 INFO:tasks.workunit.client.0.vm03.stdout:7/811: rename d4/da/d45/d51/d36/f107 to d4/da/d5d/db0/da9/db8/f10a 0 2026-03-09T16:14:52.584 INFO:tasks.workunit.client.0.vm03.stdout:0/860: truncate d0/d7/d3e/d57/d5a/d52/d9f/fe3 784290 0 2026-03-09T16:14:52.592 INFO:tasks.workunit.client.0.vm03.stdout:2/840: rename db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4/leb to db/d12/d2a/d11c/l122 0 2026-03-09T16:14:52.599 INFO:tasks.workunit.client.0.vm03.stdout:6/798: creat d9/d42/d45/d65/dae/fff x:0 0 0 2026-03-09T16:14:52.611 INFO:tasks.workunit.client.0.vm03.stdout:6/799: dread d9/d84/fa9 [0,4194304] 0 2026-03-09T16:14:52.621 INFO:tasks.workunit.client.0.vm03.stdout:4/856: rmdir d5/db/d25/d9f 39 2026-03-09T16:14:52.625 INFO:tasks.workunit.client.0.vm03.stdout:9/913: rename d2/d4/d11/d12/dc7/dcc/fd0 to d2/d4/d11/d29/d2a/d46/dd6/d111/f11d 0 2026-03-09T16:14:52.628 INFO:tasks.workunit.client.0.vm03.stdout:1/765: dwrite d4/d6/d1d/d20/fc2 [8388608,4194304] 0 2026-03-09T16:14:52.630 INFO:tasks.workunit.client.0.vm03.stdout:2/841: unlink db/d12/da5/de4/f106 0 2026-03-09T16:14:52.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:52 vm03.local ceph-mon[51019]: mgrmap e29: vm03.gbgzmu(active, since 4s), standbys: vm05.dygxfv 2026-03-09T16:14:52.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:52 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:52.640 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:52 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:52.645 INFO:tasks.workunit.client.0.vm03.stdout:7/812: mkdir d4/da/d5d/dd8/d22/d24/d16/d10b 0 2026-03-09T16:14:52.651 INFO:tasks.workunit.client.0.vm03.stdout:8/892: dwrite da/db/f53 [4194304,4194304] 0 2026-03-09T16:14:52.665 INFO:tasks.workunit.client.0.vm03.stdout:5/904: dwrite d2/d7/de/fd8 [4194304,4194304] 0 2026-03-09T16:14:52.665 INFO:tasks.workunit.client.0.vm03.stdout:5/905: read d2/d7/d115/d24/d27/fc3 [331174,90463] 0 2026-03-09T16:14:52.671 INFO:tasks.workunit.client.0.vm03.stdout:3/831: link d5/d53/d88/dd7/cb2 d5/d6d/d6a/dbd/cf3 0 2026-03-09T16:14:52.671 INFO:tasks.workunit.client.0.vm03.stdout:6/800: read d9/d14/d71/f95 [3145950,100697] 0 2026-03-09T16:14:52.673 INFO:tasks.workunit.client.0.vm03.stdout:4/857: mkdir d5/db/d25/d8b/da8/df3/df7/d4d/da9/d105 0 2026-03-09T16:14:52.677 INFO:tasks.workunit.client.0.vm03.stdout:0/861: rename d0/f4e to d0/da/d1b/d9b/f126 0 2026-03-09T16:14:52.680 INFO:tasks.workunit.client.0.vm03.stdout:9/914: fsync d2/d4/d11/fa8 0 2026-03-09T16:14:52.680 INFO:tasks.workunit.client.0.vm03.stdout:2/842: sync 2026-03-09T16:14:52.690 INFO:tasks.workunit.client.0.vm03.stdout:1/766: rmdir d4/d6/d3b/d63 39 2026-03-09T16:14:52.692 INFO:tasks.workunit.client.0.vm03.stdout:7/813: readlink d4/da/d45/l104 0 2026-03-09T16:14:52.698 INFO:tasks.workunit.client.0.vm03.stdout:8/893: truncate da/db/d43/fe5 981329 0 2026-03-09T16:14:52.702 INFO:tasks.workunit.client.0.vm03.stdout:5/906: mkdir d2/d7/de/d12c 0 2026-03-09T16:14:52.704 INFO:tasks.workunit.client.0.vm03.stdout:5/907: chown d2/d7/d115/d16/d5c/dfc/d106/d3b 61 1 2026-03-09T16:14:52.711 INFO:tasks.workunit.client.0.vm03.stdout:3/832: unlink d5/d53/fcc 0 2026-03-09T16:14:52.712 INFO:tasks.workunit.client.0.vm03.stdout:6/801: dread - d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/fcf zero size 2026-03-09T16:14:52.713 INFO:tasks.workunit.client.0.vm03.stdout:4/858: truncate d5/d17/f8a 197031 0 2026-03-09T16:14:52.715 INFO:tasks.workunit.client.0.vm03.stdout:0/862: creat d0/d7/d3e/d57/d5a/d47/dce/df6/f127 x:0 0 0 2026-03-09T16:14:52.716 INFO:tasks.workunit.client.0.vm03.stdout:2/843: truncate db/d12/da5/fd2 183769 0 2026-03-09T16:14:52.721 INFO:tasks.workunit.client.0.vm03.stdout:7/814: stat d4/da/d5d/db0/d9d 0 2026-03-09T16:14:52.722 INFO:tasks.workunit.client.0.vm03.stdout:7/815: stat d4/da/d45/l8e 0 2026-03-09T16:14:52.725 INFO:tasks.workunit.client.0.vm03.stdout:1/767: write d4/d39/f5a [410115,27471] 0 2026-03-09T16:14:52.727 INFO:tasks.workunit.client.0.vm03.stdout:1/768: write d4/d6/d1d/d3d/f45 [4998631,85363] 0 2026-03-09T16:14:52.728 INFO:tasks.workunit.client.0.vm03.stdout:1/769: readlink d4/d6/d3b/d6b/da5/la7 0 2026-03-09T16:14:52.742 INFO:tasks.workunit.client.0.vm03.stdout:4/859: write d5/db/d25/d8b/da8/df3/df7/d33/d79/f7b [1239015,119782] 0 2026-03-09T16:14:52.744 INFO:tasks.workunit.client.0.vm03.stdout:0/863: dwrite d0/d7/d3e/d95/f99 [0,4194304] 0 2026-03-09T16:14:52.754 INFO:tasks.workunit.client.0.vm03.stdout:2/844: write db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/fc6 [606030,3631] 0 2026-03-09T16:14:52.766 INFO:tasks.workunit.client.0.vm03.stdout:3/833: link d5/d53/d6c/d79/d91/dc9/fe2 d5/d6d/d6a/dbd/ff4 0 2026-03-09T16:14:52.766 INFO:tasks.workunit.client.0.vm03.stdout:6/802: creat d9/d42/d45/d65/dbf/dc9/de4/f100 x:0 0 0 2026-03-09T16:14:52.766 INFO:tasks.workunit.client.0.vm03.stdout:9/915: link d2/de/d88/d7a/cf9 
d2/d54/d7d/d8f/dad/def/d89/c11e 0 2026-03-09T16:14:52.770 INFO:tasks.workunit.client.0.vm03.stdout:4/860: creat d5/dd/d1f/d95/f106 x:0 0 0 2026-03-09T16:14:52.771 INFO:tasks.workunit.client.0.vm03.stdout:9/916: write d2/d4/d11/d12/d28/fc5 [230076,26035] 0 2026-03-09T16:14:52.774 INFO:tasks.workunit.client.0.vm03.stdout:0/864: mkdir d0/da/d1b/d9b/d128 0 2026-03-09T16:14:52.776 INFO:tasks.workunit.client.0.vm03.stdout:6/803: dread d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f4e [0,4194304] 0 2026-03-09T16:14:52.784 INFO:tasks.workunit.client.0.vm03.stdout:5/908: truncate d2/d7/de/d11/d19/d29/d90/fac 1147727 0 2026-03-09T16:14:52.786 INFO:tasks.workunit.client.0.vm03.stdout:5/909: chown d2/d7/d115/d24/d27/d43/d4b/dbc/ce1 224336874 1 2026-03-09T16:14:52.793 INFO:tasks.workunit.client.0.vm03.stdout:7/816: dwrite d4/da/d45/d51/d36/f6f [0,4194304] 0 2026-03-09T16:14:52.802 INFO:tasks.workunit.client.0.vm03.stdout:3/834: unlink d5/d1e/d42/c32 0 2026-03-09T16:14:52.807 INFO:tasks.workunit.client.0.vm03.stdout:0/865: mknod d0/da/d5c/db6/c129 0 2026-03-09T16:14:52.808 INFO:tasks.workunit.client.0.vm03.stdout:4/861: rename d5/db/d25/d8b/da8/df3/df7/d33/c3b to d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d9a/c107 0 2026-03-09T16:14:52.809 INFO:tasks.workunit.client.0.vm03.stdout:6/804: dread - d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/fcd zero size 2026-03-09T16:14:52.814 INFO:tasks.workunit.client.0.vm03.stdout:8/894: getdents da/db/d43 0 2026-03-09T16:14:52.817 INFO:tasks.workunit.client.0.vm03.stdout:1/770: link d4/d39/l51 d4/d39/d7f/lf9 0 2026-03-09T16:14:52.817 INFO:tasks.workunit.client.0.vm03.stdout:7/817: mkdir d4/da/d5d/dd8/d22/d24/d16/d3e/db5/de5/d10c 0 2026-03-09T16:14:52.817 INFO:tasks.workunit.client.0.vm03.stdout:9/917: truncate d2/d4/d11/f66 3217951 0 2026-03-09T16:14:52.819 INFO:tasks.workunit.client.0.vm03.stdout:3/835: read d5/d53/d6c/f9f [430450,23652] 0 2026-03-09T16:14:52.819 INFO:tasks.workunit.client.0.vm03.stdout:9/918: write d2/d4/d11/d29/d2a/d38/fb7 [2262918,28086] 0 2026-03-09T16:14:52.821 INFO:tasks.workunit.client.0.vm03.stdout:0/866: readlink d0/d7/d3e/l2d 0 2026-03-09T16:14:52.824 INFO:tasks.workunit.client.0.vm03.stdout:4/862: symlink d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/d77/l108 0 2026-03-09T16:14:52.833 INFO:tasks.workunit.client.0.vm03.stdout:5/910: dread d2/d7/d3c/d3d/f93 [0,4194304] 0 2026-03-09T16:14:52.837 INFO:tasks.workunit.client.0.vm03.stdout:7/818: symlink d4/da/d45/d51/dea/l10d 0 2026-03-09T16:14:52.846 INFO:tasks.workunit.client.0.vm03.stdout:7/819: dread d4/da/d45/d51/d36/f6f [0,4194304] 0 2026-03-09T16:14:52.850 INFO:tasks.workunit.client.0.vm03.stdout:3/836: chown d5/d1e/d42/d34/dd2/f8a 6316 1 2026-03-09T16:14:52.851 INFO:tasks.workunit.client.0.vm03.stdout:9/919: creat d2/d4/d11/d29/d2a/db3/dbe/de0/f11f x:0 0 0 2026-03-09T16:14:52.855 INFO:tasks.workunit.client.0.vm03.stdout:2/845: getdents db/d12/d2a/d99/de7/df9 0 2026-03-09T16:14:52.875 INFO:tasks.workunit.client.0.vm03.stdout:0/867: mknod d0/da/d5c/c12a 0 2026-03-09T16:14:52.884 INFO:tasks.workunit.client.0.vm03.stdout:5/911: dwrite d2/d7/de/d11/dbf/fc5 [0,4194304] 0 2026-03-09T16:14:52.889 INFO:tasks.workunit.client.0.vm03.stdout:7/820: dwrite d4/da/d5d/dd8/d22/d24/d15/d71/fd1 [0,4194304] 0 2026-03-09T16:14:52.908 INFO:tasks.workunit.client.0.vm03.stdout:5/912: mknod d2/d7/de/d11/d19/d29/d90/dbe/c12d 0 2026-03-09T16:14:52.914 INFO:tasks.workunit.client.0.vm03.stdout:8/895: getdents da/db/da8/db8 0 2026-03-09T16:14:52.915 INFO:tasks.workunit.client.0.vm03.stdout:2/846: creat db/d12/da5/dc2/d110/f123 x:0 0 0 
2026-03-09T16:14:52.916 INFO:tasks.workunit.client.0.vm03.stdout:1/771: rename d4/d6/d3b/d8e to d4/d6/d1d/dfa 0 2026-03-09T16:14:52.916 INFO:tasks.workunit.client.0.vm03.stdout:0/868: symlink d0/d7/d3e/d57/d5a/d47/dce/d11f/l12b 0 2026-03-09T16:14:52.921 INFO:tasks.workunit.client.0.vm03.stdout:6/805: link d9/d14/c1b d9/d42/d45/d50/d80/c101 0 2026-03-09T16:14:52.925 INFO:tasks.workunit.client.0.vm03.stdout:9/920: creat d2/d4/d11/f120 x:0 0 0 2026-03-09T16:14:52.926 INFO:tasks.workunit.client.0.vm03.stdout:9/921: dread - d2/d54/f5e zero size 2026-03-09T16:14:52.929 INFO:tasks.workunit.client.0.vm03.stdout:3/837: rename d5/d1e/f72 to d5/d6d/d6a/ff5 0 2026-03-09T16:14:52.929 INFO:tasks.workunit.client.0.vm03.stdout:4/863: rename d5 to d5/db/d25/dc8/d109 22 2026-03-09T16:14:52.932 INFO:tasks.workunit.client.0.vm03.stdout:8/896: mkdir da/db/d127 0 2026-03-09T16:14:52.932 INFO:tasks.workunit.client.0.vm03.stdout:1/772: truncate d4/d31/f4f 305128 0 2026-03-09T16:14:52.933 INFO:tasks.workunit.client.0.vm03.stdout:1/773: chown d4/db/d8b 175421164 1 2026-03-09T16:14:52.935 INFO:tasks.workunit.client.0.vm03.stdout:6/806: creat d9/d42/d45/d50/d80/d8a/dc1/f102 x:0 0 0 2026-03-09T16:14:52.937 INFO:tasks.workunit.client.0.vm03.stdout:9/922: symlink d2/d4/d11/d12/dc7/dee/dce/l121 0 2026-03-09T16:14:52.946 INFO:tasks.workunit.client.0.vm03.stdout:5/913: dread d2/d7/d1a/d1c/d3f/f92 [0,4194304] 0 2026-03-09T16:14:52.952 INFO:tasks.workunit.client.0.vm03.stdout:1/774: creat d4/d6/d1d/d20/ffb x:0 0 0 2026-03-09T16:14:52.952 INFO:tasks.workunit.client.0.vm03.stdout:3/838: symlink d5/d1e/d42/d34/lf6 0 2026-03-09T16:14:52.953 INFO:tasks.workunit.client.0.vm03.stdout:0/869: sync 2026-03-09T16:14:52.955 INFO:tasks.workunit.client.0.vm03.stdout:9/923: creat d2/d4/d1f/f122 x:0 0 0 2026-03-09T16:14:52.959 INFO:tasks.workunit.client.0.vm03.stdout:1/775: chown d4/db/d59/f9d 218878256 1 2026-03-09T16:14:52.960 INFO:tasks.workunit.client.0.vm03.stdout:9/924: sync 2026-03-09T16:14:52.987 INFO:tasks.workunit.client.0.vm03.stdout:6/807: creat d9/d42/d45/d50/d80/d8a/dc1/dd4/df9/f103 x:0 0 0 2026-03-09T16:14:53.003 INFO:tasks.workunit.client.0.vm03.stdout:7/821: write d4/da/d5d/db0/d61/f8b [3518322,86222] 0 2026-03-09T16:14:53.019 INFO:tasks.workunit.client.0.vm03.stdout:2/847: dwrite db/d12/d2a/d99/de7/df9/fb4 [0,4194304] 0 2026-03-09T16:14:53.037 INFO:tasks.workunit.client.0.vm03.stdout:8/897: link da/d10/d28/f8b da/d32/db5/f128 0 2026-03-09T16:14:53.047 INFO:tasks.workunit.client.0.vm03.stdout:4/864: getdents d5/d17/d44 0 2026-03-09T16:14:53.048 INFO:tasks.workunit.client.0.vm03.stdout:6/808: truncate f7 1214593 0 2026-03-09T16:14:53.053 INFO:tasks.workunit.client.0.vm03.stdout:0/870: write d0/da/f8b [19952,67473] 0 2026-03-09T16:14:53.054 INFO:tasks.workunit.client.0.vm03.stdout:7/822: rmdir d4/da/d5d/db0/da9/db8 39 2026-03-09T16:14:53.059 INFO:tasks.workunit.client.0.vm03.stdout:1/776: creat d4/d6/da2/dea/ffc x:0 0 0 2026-03-09T16:14:53.061 INFO:tasks.workunit.client.0.vm03.stdout:2/848: mkdir db/d12/d2a/d99/de7/df9/d64/dbd/d124 0 2026-03-09T16:14:53.076 INFO:tasks.workunit.client.0.vm03.stdout:3/839: write d5/d1e/d42/d34/dd2/f8a [1320259,94862] 0 2026-03-09T16:14:53.076 INFO:tasks.workunit.client.0.vm03.stdout:8/898: creat da/db/da8/db8/f129 x:0 0 0 2026-03-09T16:14:53.078 INFO:tasks.workunit.client.0.vm03.stdout:4/865: readlink d5/l68 0 2026-03-09T16:14:53.097 INFO:tasks.workunit.client.0.vm03.stdout:9/925: dread d2/fc6 [0,4194304] 0 2026-03-09T16:14:53.103 INFO:tasks.workunit.client.0.vm03.stdout:6/809: rmdir 
d9/d42/d45/d50/d80/d8a/d9c 39 2026-03-09T16:14:53.118 INFO:tasks.workunit.client.0.vm03.stdout:1/777: mkdir d4/d7b/dfd 0 2026-03-09T16:14:53.123 INFO:tasks.workunit.client.0.vm03.stdout:1/778: dread d4/d6/d1d/d3d/f49 [0,4194304] 0 2026-03-09T16:14:53.123 INFO:tasks.workunit.client.0.vm03.stdout:5/914: getdents d2/d7/d1a/d1c/d6c 0 2026-03-09T16:14:53.125 INFO:tasks.workunit.client.0.vm03.stdout:3/840: fdatasync d5/d53/d88/dd7/fc7 0 2026-03-09T16:14:53.125 INFO:tasks.workunit.client.0.vm03.stdout:3/841: chown d5/d1e/d42/d34/dd2 230187 1 2026-03-09T16:14:53.126 INFO:tasks.workunit.client.0.vm03.stdout:8/899: creat da/db/da8/db8/f12a x:0 0 0 2026-03-09T16:14:53.144 INFO:tasks.workunit.client.0.vm03.stdout:4/866: write f1 [4388847,38489] 0 2026-03-09T16:14:53.147 INFO:tasks.workunit.client.0.vm03.stdout:9/926: dwrite d2/d4/d11/d29/d2a/d46/f81 [0,4194304] 0 2026-03-09T16:14:53.150 INFO:tasks.workunit.client.0.vm03.stdout:9/927: dread - d2/d4/d11/d29/d2a/db3/dbe/de0/f11f zero size 2026-03-09T16:14:53.160 INFO:tasks.workunit.client.0.vm03.stdout:6/810: chown d9/cc 43904049 1 2026-03-09T16:14:53.162 INFO:tasks.workunit.client.0.vm03.stdout:9/928: dwrite d2/d4/d1f/d83/f118 [0,4194304] 0 2026-03-09T16:14:53.184 INFO:tasks.workunit.client.0.vm03.stdout:2/849: rename db/d12/l5b to db/d12/d2a/d11c/l125 0 2026-03-09T16:14:53.184 INFO:tasks.workunit.client.0.vm03.stdout:1/779: symlink d4/d6/d3b/d6b/da5/dc0/lfe 0 2026-03-09T16:14:53.184 INFO:tasks.workunit.client.0.vm03.stdout:2/850: fdatasync db/d12/f69 0 2026-03-09T16:14:53.193 INFO:tasks.workunit.client.0.vm03.stdout:4/867: fdatasync d5/d17/d44/f61 0 2026-03-09T16:14:53.195 INFO:tasks.workunit.client.0.vm03.stdout:6/811: unlink d9/d42/d45/d50/d80/d90/l96 0 2026-03-09T16:14:53.196 INFO:tasks.workunit.client.0.vm03.stdout:9/929: fdatasync d2/d54/d7d/d8f/dad/def/f9f 0 2026-03-09T16:14:53.197 INFO:tasks.workunit.client.0.vm03.stdout:7/823: rename d4/da/d45/d51/d36/f94 to d4/da/d5d/db0/da9/db8/ddf/f10e 0 2026-03-09T16:14:53.207 INFO:tasks.workunit.client.0.vm03.stdout:9/930: dwrite d2/d4/d11/d12/dc7/dee/dce/fde [0,4194304] 0 2026-03-09T16:14:53.211 INFO:tasks.workunit.client.0.vm03.stdout:9/931: chown d2/d4/d11/d12/dc7/dee 239 1 2026-03-09T16:14:53.213 INFO:tasks.workunit.client.0.vm03.stdout:2/851: dread db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4/fdc [0,4194304] 0 2026-03-09T16:14:53.213 INFO:tasks.workunit.client.0.vm03.stdout:9/932: read - d2/d4/d11/dac/ff8 zero size 2026-03-09T16:14:53.218 INFO:tasks.workunit.client.0.vm03.stdout:0/871: write d0/d7/d3e/d57/d5a/d82/d89/dbd/d9c/fe2 [635828,115843] 0 2026-03-09T16:14:53.219 INFO:tasks.workunit.client.0.vm03.stdout:8/900: write da/d6c/d7a/f91 [4492858,71622] 0 2026-03-09T16:14:53.227 INFO:tasks.workunit.client.0.vm03.stdout:7/824: dread d4/da/d5d/db0/d9d/dc9/fda [0,4194304] 0 2026-03-09T16:14:53.227 INFO:tasks.workunit.client.0.vm03.stdout:3/842: dwrite d5/d6d/db9/df2/fe9 [0,4194304] 0 2026-03-09T16:14:53.232 INFO:tasks.workunit.client.0.vm03.stdout:7/825: chown d4/da/d5d/dd8 15330849 1 2026-03-09T16:14:53.235 INFO:tasks.workunit.client.0.vm03.stdout:7/826: truncate d4/d2d/d4b/fd6 760356 0 2026-03-09T16:14:53.239 INFO:tasks.workunit.client.0.vm03.stdout:7/827: read - d4/da/d45/d51/dea/ff3 zero size 2026-03-09T16:14:53.247 INFO:tasks.workunit.client.0.vm03.stdout:8/901: sync 2026-03-09T16:14:53.253 INFO:tasks.workunit.client.0.vm03.stdout:7/828: dread d4/f26 [4194304,4194304] 0 2026-03-09T16:14:53.256 INFO:tasks.workunit.client.0.vm03.stdout:0/872: rmdir d0/d7/d3e/d57/d5a/d47/dce/d11f 39 
2026-03-09T16:14:53.256 INFO:tasks.workunit.client.0.vm03.stdout:0/873: chown d0/da/f8b 43816 1 2026-03-09T16:14:53.257 INFO:tasks.workunit.client.0.vm03.stdout:0/874: read d0/da/d5c/f31 [933274,105876] 0 2026-03-09T16:14:53.263 INFO:tasks.workunit.client.0.vm03.stdout:2/852: rename db/d12/d2a/d99/cd3 to db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4/c126 0 2026-03-09T16:14:53.266 INFO:tasks.workunit.client.0.vm03.stdout:3/843: creat d5/d1e/d42/d34/dd2/ff7 x:0 0 0 2026-03-09T16:14:53.269 INFO:tasks.workunit.client.0.vm03.stdout:8/902: truncate da/d10/d28/db1/fbb 455380 0 2026-03-09T16:14:53.272 INFO:tasks.workunit.client.0.vm03.stdout:1/780: truncate d4/fd 2257442 0 2026-03-09T16:14:53.272 INFO:tasks.workunit.client.0.vm03.stdout:4/868: write d5/db/d25/d8b/da8/df3/fd3 [581186,70581] 0 2026-03-09T16:14:53.279 INFO:tasks.workunit.client.0.vm03.stdout:5/915: link d2/d7/d115/d16/c58 d2/d7/de/d11/d19/c12e 0 2026-03-09T16:14:53.281 INFO:tasks.workunit.client.0.vm03.stdout:7/829: symlink d4/da/d5d/db0/da9/db8/l10f 0 2026-03-09T16:14:53.284 INFO:tasks.workunit.client.0.vm03.stdout:6/812: dwrite d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/fd6 [0,4194304] 0 2026-03-09T16:14:53.296 INFO:tasks.workunit.client.0.vm03.stdout:3/844: creat d5/d6d/d6a/ff8 x:0 0 0 2026-03-09T16:14:53.301 INFO:tasks.workunit.client.0.vm03.stdout:4/869: creat d5/db/d25/dc8/f10a x:0 0 0 2026-03-09T16:14:53.301 INFO:tasks.workunit.client.0.vm03.stdout:5/916: read d2/d7/de/d11/d19/d29/d90/fac [313494,21138] 0 2026-03-09T16:14:53.306 INFO:tasks.workunit.client.0.vm03.stdout:1/781: symlink d4/d39/deb/lff 0 2026-03-09T16:14:53.309 INFO:tasks.workunit.client.0.vm03.stdout:1/782: write d4/db/d59/fe4 [968380,50126] 0 2026-03-09T16:14:53.309 INFO:tasks.workunit.client.0.vm03.stdout:9/933: getdents d2/d4/d11/d29/d2a/d4d 0 2026-03-09T16:14:53.313 INFO:tasks.workunit.client.0.vm03.stdout:4/870: sync 2026-03-09T16:14:53.313 INFO:tasks.workunit.client.0.vm03.stdout:6/813: sync 2026-03-09T16:14:53.313 INFO:tasks.workunit.client.0.vm03.stdout:7/830: creat d4/da/d5d/dd8/d22/d24/d16/d3e/db5/de5/d10c/f110 x:0 0 0 2026-03-09T16:14:53.315 INFO:tasks.workunit.client.0.vm03.stdout:5/917: dwrite d2/d7/de/d11/d19/d29/d90/dbe/df5/f12b [0,4194304] 0 2026-03-09T16:14:53.324 INFO:tasks.workunit.client.0.vm03.stdout:5/918: truncate d2/d7/d115/d24/d27/d43/f120 789393 0 2026-03-09T16:14:53.329 INFO:tasks.workunit.client.0.vm03.stdout:8/903: link da/d32/dad/lf8 da/d10/d28/d4f/d85/l12b 0 2026-03-09T16:14:53.329 INFO:tasks.workunit.client.0.vm03.stdout:3/845: symlink d5/d53/d6c/d79/d91/dc9/def/lf9 0 2026-03-09T16:14:53.329 INFO:tasks.workunit.client.0.vm03.stdout:9/934: fsync d2/d54/d7d/d8f/fbb 0 2026-03-09T16:14:53.330 INFO:tasks.workunit.client.0.vm03.stdout:4/871: mknod d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/c10b 0 2026-03-09T16:14:53.330 INFO:tasks.workunit.client.0.vm03.stdout:3/846: chown d5/d53/d6c/d79/d91/cd8 3172807 1 2026-03-09T16:14:53.331 INFO:tasks.workunit.client.0.vm03.stdout:7/831: rename d4/da/d5d/db0/d61/l8a to d4/da/d5d/dd8/d22/d24/d15/d71/db7/l111 0 2026-03-09T16:14:53.331 INFO:tasks.workunit.client.0.vm03.stdout:9/935: chown d2/d4/d11/d29/d2a/d46/ff2 165 1 2026-03-09T16:14:53.338 INFO:tasks.workunit.client.0.vm03.stdout:8/904: chown da/d10/d28/d64/c71 7 1 2026-03-09T16:14:53.340 INFO:tasks.workunit.client.0.vm03.stdout:2/853: dread db/d12/d2a/d61/f5d [0,4194304] 0 2026-03-09T16:14:53.373 INFO:tasks.workunit.client.0.vm03.stdout:3/847: rename d5/lf to d5/d53/d6c/d79/dd9/lfa 0 2026-03-09T16:14:53.384 INFO:tasks.workunit.client.0.vm03.stdout:7/832: 
dwrite d4/da/d5d/db0/d9d/fac [0,4194304] 0 2026-03-09T16:14:53.387 INFO:tasks.workunit.client.0.vm03.stdout:7/833: readlink d4/da/l76 0 2026-03-09T16:14:53.421 INFO:tasks.workunit.client.0.vm03.stdout:4/872: getdents d5/dd/dba 0 2026-03-09T16:14:53.427 INFO:tasks.workunit.client.0.vm03.stdout:3/848: symlink d5/lfb 0 2026-03-09T16:14:53.431 INFO:tasks.workunit.client.0.vm03.stdout:0/875: truncate d0/d7/d3e/d57/d5a/d5f/db2/f5e 1537769 0 2026-03-09T16:14:53.432 INFO:tasks.workunit.client.0.vm03.stdout:0/876: read d0/d7/d3e/d57/d5a/d82/d89/dbd/d9c/fe2 [220447,76764] 0 2026-03-09T16:14:53.445 INFO:tasks.workunit.client.0.vm03.stdout:1/783: dwrite d4/d6/d1d/f66 [0,4194304] 0 2026-03-09T16:14:53.445 INFO:tasks.workunit.client.0.vm03.stdout:6/814: write d9/d14/d71/f95 [3497700,102189] 0 2026-03-09T16:14:53.446 INFO:tasks.workunit.client.0.vm03.stdout:5/919: write d2/fb9 [175220,27420] 0 2026-03-09T16:14:53.461 INFO:tasks.workunit.client.0.vm03.stdout:5/920: dread d2/d7/d1a/d1c/d6c/f79 [0,4194304] 0 2026-03-09T16:14:53.465 INFO:tasks.workunit.client.0.vm03.stdout:2/854: dwrite db/d12/d2a/d99/fb8 [0,4194304] 0 2026-03-09T16:14:53.489 INFO:tasks.workunit.client.0.vm03.stdout:9/936: write d2/d4/d11/d12/d28/fd5 [592044,125626] 0 2026-03-09T16:14:53.515 INFO:tasks.workunit.client.0.vm03.stdout:7/834: rename d4/da/d5d/dd8/d22/d24/d16/d3e/db5/de5/df2/lf7 to d4/da/d5d/dd8/d22/d24/d15/d71/db7/l112 0 2026-03-09T16:14:53.517 INFO:tasks.workunit.client.0.vm03.stdout:8/905: truncate da/d10/f1f 147162 0 2026-03-09T16:14:53.523 INFO:tasks.workunit.client.0.vm03.stdout:0/877: mknod d0/d7/d3e/d57/d5a/d82/d89/dc0/c12c 0 2026-03-09T16:14:53.523 INFO:tasks.workunit.client.0.vm03.stdout:3/849: write d5/d1e/d42/f20 [6326420,101407] 0 2026-03-09T16:14:53.532 INFO:tasks.workunit.client.0.vm03.stdout:1/784: mkdir d4/d6/da2/dea/d100 0 2026-03-09T16:14:53.536 INFO:tasks.workunit.client.0.vm03.stdout:6/815: dread d9/d42/d45/f4a [0,4194304] 0 2026-03-09T16:14:53.541 INFO:tasks.workunit.client.0.vm03.stdout:5/921: creat d2/d7/de/d11/d19/d29/d90/dbe/f12f x:0 0 0 2026-03-09T16:14:53.552 INFO:tasks.workunit.client.0.vm03.stdout:5/922: dread d2/d7/d115/d16/d5c/fb5 [0,4194304] 0 2026-03-09T16:14:53.560 INFO:tasks.workunit.client.0.vm03.stdout:8/906: rmdir da/db/d43 39 2026-03-09T16:14:53.566 INFO:tasks.workunit.client.0.vm03.stdout:0/878: rmdir d0/da/d7a/d98 39 2026-03-09T16:14:53.587 INFO:tasks.workunit.client.0.vm03.stdout:4/873: write d5/db/d25/d8b/da8/df3/df7/d33/d79/fb0 [31285,122739] 0 2026-03-09T16:14:53.596 INFO:tasks.workunit.client.0.vm03.stdout:9/937: write d2/d54/d7d/d8f/fbb [260669,120465] 0 2026-03-09T16:14:53.604 INFO:tasks.workunit.client.0.vm03.stdout:2/855: dwrite db/d12/d2a/f58 [0,4194304] 0 2026-03-09T16:14:53.608 INFO:tasks.workunit.client.0.vm03.stdout:5/923: rmdir d2/d7/d115/d16/d5c/dcf 39 2026-03-09T16:14:53.623 INFO:tasks.workunit.client.0.vm03.stdout:8/907: readlink da/d10/d28/d64/l11f 0 2026-03-09T16:14:53.623 INFO:tasks.workunit.client.0.vm03.stdout:0/879: creat d0/d7/d75/f12d x:0 0 0 2026-03-09T16:14:53.625 INFO:tasks.workunit.client.0.vm03.stdout:3/850: mknod d5/d6d/db9/de0/cfc 0 2026-03-09T16:14:53.629 INFO:tasks.workunit.client.0.vm03.stdout:4/874: creat d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d9a/f10c x:0 0 0 2026-03-09T16:14:53.642 INFO:tasks.workunit.client.0.vm03.stdout:7/835: rename d4/da/d5d/db0/da9 to d4/da/d5d/db0/d113 0 2026-03-09T16:14:53.642 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:53 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 
2026-03-09T16:14:53.642 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:53 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:53.642 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:53 vm03.local ceph-mon[51019]: pgmap v6: 65 pgs: 65 active+clean; 1.7 GiB data, 6.0 GiB used, 114 GiB / 120 GiB avail 2026-03-09T16:14:53.643 INFO:tasks.workunit.client.0.vm03.stdout:2/856: sync 2026-03-09T16:14:53.647 INFO:tasks.workunit.client.0.vm03.stdout:9/938: dread d2/d4/d11/f66 [0,4194304] 0 2026-03-09T16:14:53.648 INFO:tasks.workunit.client.0.vm03.stdout:9/939: stat d2/d4/d11/d29/d2a/d46/ld4 0 2026-03-09T16:14:53.651 INFO:tasks.workunit.client.0.vm03.stdout:9/940: dwrite d2/f15 [4194304,4194304] 0 2026-03-09T16:14:53.657 INFO:tasks.workunit.client.0.vm03.stdout:9/941: sync 2026-03-09T16:14:53.658 INFO:tasks.workunit.client.0.vm03.stdout:9/942: sync 2026-03-09T16:14:53.659 INFO:tasks.workunit.client.0.vm03.stdout:9/943: chown d2/d4/d11/d12/f9a 0 1 2026-03-09T16:14:53.666 INFO:tasks.workunit.client.0.vm03.stdout:0/880: write d0/d7/d3e/d57/d5a/d5f/f71 [3685817,48720] 0 2026-03-09T16:14:53.667 INFO:tasks.workunit.client.0.vm03.stdout:3/851: write d5/d1e/d42/f84 [2586935,13358] 0 2026-03-09T16:14:53.670 INFO:tasks.workunit.client.0.vm03.stdout:6/816: link d9/d42/d45/d65/dae/lf6 d9/d42/d45/ddf/l104 0 2026-03-09T16:14:53.671 INFO:tasks.workunit.client.0.vm03.stdout:4/875: symlink d5/db/d25/d8b/da8/df3/df7/d33/d79/l10d 0 2026-03-09T16:14:53.674 INFO:tasks.workunit.client.0.vm03.stdout:5/924: creat d2/d7/d115/d16/d5c/dcf/f130 x:0 0 0 2026-03-09T16:14:53.675 INFO:tasks.workunit.client.0.vm03.stdout:5/925: chown d2/d7/de/d11/d19/d29/d90/dbe/lc4 0 1 2026-03-09T16:14:53.678 INFO:tasks.workunit.client.0.vm03.stdout:7/836: fsync d4/da/d45/d51/f50 0 2026-03-09T16:14:53.683 INFO:tasks.workunit.client.0.vm03.stdout:8/908: creat da/db/d43/f12c x:0 0 0 2026-03-09T16:14:53.685 INFO:tasks.workunit.client.0.vm03.stdout:2/857: truncate db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/fd7 831993 0 2026-03-09T16:14:53.687 INFO:tasks.workunit.client.0.vm03.stdout:1/785: getdents d4/d6/d1d/d3d 0 2026-03-09T16:14:53.689 INFO:tasks.workunit.client.0.vm03.stdout:3/852: symlink d5/d6d/d6a/dbd/lfd 0 2026-03-09T16:14:53.699 INFO:tasks.workunit.client.0.vm03.stdout:9/944: dwrite d2/d4/d11/d12/f1e [0,4194304] 0 2026-03-09T16:14:53.712 INFO:tasks.workunit.client.0.vm03.stdout:8/909: truncate da/d1d/f4a 177645 0 2026-03-09T16:14:53.713 INFO:tasks.workunit.client.0.vm03.stdout:1/786: mkdir d4/d6/d1d/d3d/d101 0 2026-03-09T16:14:53.714 INFO:tasks.workunit.client.0.vm03.stdout:1/787: chown d4/f6d 145353 1 2026-03-09T16:14:53.716 INFO:tasks.workunit.client.0.vm03.stdout:0/881: rename d0/d7/d75/f69 to d0/d7/d75/f12e 0 2026-03-09T16:14:53.720 INFO:tasks.workunit.client.0.vm03.stdout:5/926: symlink d2/d7/d115/d24/d27/d43/d4b/de6/l131 0 2026-03-09T16:14:53.725 INFO:tasks.workunit.client.0.vm03.stdout:9/945: truncate d2/d4/d11/d29/f4e 221974 0 2026-03-09T16:14:53.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:53 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:53.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:53 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:53.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:53 vm05.local ceph-mon[58702]: pgmap v6: 65 pgs: 65 active+clean; 1.7 GiB data, 6.0 GiB used, 
114 GiB / 120 GiB avail 2026-03-09T16:14:53.818 INFO:tasks.workunit.client.0.vm03.stdout:4/876: dwrite d5/d17/f8a [0,4194304] 0 2026-03-09T16:14:53.848 INFO:tasks.workunit.client.0.vm03.stdout:3/853: creat d5/d53/d88/dd7/df1/ffe x:0 0 0 2026-03-09T16:14:53.870 INFO:tasks.workunit.client.0.vm03.stdout:3/854: sync 2026-03-09T16:14:53.902 INFO:tasks.workunit.client.0.vm03.stdout:9/946: symlink d2/d4/d11/d29/d2a/db3/dbe/de0/l123 0 2026-03-09T16:14:53.902 INFO:tasks.workunit.client.0.vm03.stdout:9/947: chown d2/d4/d11/d29/d2a/d38/dcd/f119 386 1 2026-03-09T16:14:53.970 INFO:tasks.workunit.client.0.vm03.stdout:2/858: mknod db/d12/d2a/c127 0 2026-03-09T16:14:53.973 INFO:tasks.workunit.client.0.vm03.stdout:2/859: dread db/d12/d2a/d99/de7/df9/fa7 [0,4194304] 0 2026-03-09T16:14:53.976 INFO:tasks.workunit.client.0.vm03.stdout:0/882: getdents d0/da/d7a 0 2026-03-09T16:14:53.984 INFO:tasks.workunit.client.0.vm03.stdout:4/877: getdents d5/db/d25/dc8/dd2 0 2026-03-09T16:14:53.987 INFO:tasks.workunit.client.0.vm03.stdout:4/878: dwrite d5/dd/d1f/d5f/f98 [4194304,4194304] 0 2026-03-09T16:14:54.006 INFO:tasks.workunit.client.0.vm03.stdout:2/860: dread f0 [0,4194304] 0 2026-03-09T16:14:54.007 INFO:tasks.workunit.client.0.vm03.stdout:2/861: chown db/d12/d2a/d99/de7/df9/d64/dbd/fd8 604338033 1 2026-03-09T16:14:54.011 INFO:tasks.workunit.client.0.vm03.stdout:0/883: write d0/d7/d48/f18 [3126168,71229] 0 2026-03-09T16:14:54.024 INFO:tasks.workunit.client.0.vm03.stdout:0/884: mknod d0/d7/d3e/d57/d5a/d5f/db2/dcf/c12f 0 2026-03-09T16:14:54.027 INFO:tasks.workunit.client.0.vm03.stdout:4/879: mkdir d5/d17/db7/d10e 0 2026-03-09T16:14:54.042 INFO:tasks.workunit.client.0.vm03.stdout:4/880: sync 2026-03-09T16:14:54.062 INFO:tasks.workunit.client.0.vm03.stdout:0/885: creat d0/da/d7a/f130 x:0 0 0 2026-03-09T16:14:54.065 INFO:tasks.workunit.client.0.vm03.stdout:0/886: dread d0/d7/f8 [0,4194304] 0 2026-03-09T16:14:54.065 INFO:tasks.workunit.client.0.vm03.stdout:0/887: readlink d0/d7/d75/le8 0 2026-03-09T16:14:54.066 INFO:tasks.workunit.client.0.vm03.stdout:0/888: chown d0/da/d1b/d9b/f93 937923 1 2026-03-09T16:14:54.067 INFO:tasks.workunit.client.0.vm03.stdout:0/889: mknod d0/d7/d3e/d57/d5a/d47/dce/df6/c131 0 2026-03-09T16:14:54.076 INFO:tasks.workunit.client.0.vm03.stdout:0/890: dread d0/da/d1b/d9b/f61 [0,4194304] 0 2026-03-09T16:14:54.101 INFO:tasks.workunit.client.0.vm03.stdout:0/891: creat d0/d7/d3e/d57/d5a/d5f/db2/d8e/dba/f132 x:0 0 0 2026-03-09T16:14:54.152 INFO:tasks.workunit.client.0.vm03.stdout:6/817: creat d9/d42/d45/d50/f105 x:0 0 0 2026-03-09T16:14:54.153 INFO:tasks.workunit.client.0.vm03.stdout:1/788: unlink d4/db/f7d 0 2026-03-09T16:14:54.173 INFO:tasks.workunit.client.0.vm03.stdout:8/910: write da/d6c/d7a/de4/f124 [3153768,53414] 0 2026-03-09T16:14:54.176 INFO:tasks.workunit.client.0.vm03.stdout:8/911: creat da/db/da8/f12d x:0 0 0 2026-03-09T16:14:54.180 INFO:tasks.workunit.client.0.vm03.stdout:8/912: dwrite da/db/da8/f12d [0,4194304] 0 2026-03-09T16:14:54.190 INFO:tasks.workunit.client.0.vm03.stdout:8/913: creat da/d10/d28/d4f/d68/f12e x:0 0 0 2026-03-09T16:14:54.190 INFO:tasks.workunit.client.0.vm03.stdout:8/914: chown da/f114 226281 1 2026-03-09T16:14:54.191 INFO:tasks.workunit.client.0.vm03.stdout:8/915: readlink da/d10/d28/d4f/d68/ld4 0 2026-03-09T16:14:54.196 INFO:tasks.workunit.client.0.vm03.stdout:8/916: symlink da/db/l12f 0 2026-03-09T16:14:54.196 INFO:tasks.workunit.client.0.vm03.stdout:8/917: fdatasync da/db/f6a 0 2026-03-09T16:14:54.199 INFO:tasks.workunit.client.0.vm03.stdout:8/918: read - da/d6c/fae 
zero size 2026-03-09T16:14:54.200 INFO:tasks.workunit.client.0.vm03.stdout:8/919: symlink da/d6c/dc4/l130 0 2026-03-09T16:14:54.202 INFO:tasks.workunit.client.0.vm03.stdout:9/948: write d2/d4/d11/d12/d28/ffe [509347,28642] 0 2026-03-09T16:14:54.206 INFO:tasks.workunit.client.0.vm03.stdout:9/949: read d2/d4/d11/d12/f45 [699067,86174] 0 2026-03-09T16:14:54.211 INFO:tasks.workunit.client.0.vm03.stdout:9/950: truncate d2/d54/d7d/d8f/dad/fae 217745 0 2026-03-09T16:14:54.214 INFO:tasks.workunit.client.0.vm03.stdout:8/920: dread da/d10/d28/d4f/d68/d80/f2f [0,4194304] 0 2026-03-09T16:14:54.215 INFO:tasks.workunit.client.0.vm03.stdout:8/921: chown da/d1d/fcb 165 1 2026-03-09T16:14:54.215 INFO:tasks.workunit.client.0.vm03.stdout:9/951: dwrite d2/d4/d1f/fff [0,4194304] 0 2026-03-09T16:14:54.220 INFO:tasks.workunit.client.0.vm03.stdout:9/952: dread d2/d54/d7d/d8f/dad/def/d84/d8a/fb5 [0,4194304] 0 2026-03-09T16:14:54.230 INFO:tasks.workunit.client.0.vm03.stdout:7/837: rename d4/da/d5d/dd8/d22/d24/d16/d3e/db5/de5 to d4/da/d5d/dd8/d22/d24/d16/d3e/d114 0 2026-03-09T16:14:54.234 INFO:tasks.workunit.client.0.vm03.stdout:8/922: creat da/db/da8/db8/f131 x:0 0 0 2026-03-09T16:14:54.245 INFO:tasks.workunit.client.0.vm03.stdout:5/927: rename d2/d7/de/d11/dbf/f102 to d2/d7/de/d11/d19/d29/f132 0 2026-03-09T16:14:54.245 INFO:tasks.workunit.client.0.vm03.stdout:7/838: read d4/da/d5d/dd8/f37 [3607577,76215] 0 2026-03-09T16:14:54.245 INFO:tasks.workunit.client.0.vm03.stdout:9/953: rename d2/d54/d7d/d8f/dad/def/d84/d8a/fb5 to d2/d4/d11/d12/dc7/dee/dc2/f124 0 2026-03-09T16:14:54.245 INFO:tasks.workunit.client.0.vm03.stdout:9/954: rmdir d2/d54/d7d/d8f/dad/def/d89 39 2026-03-09T16:14:54.246 INFO:tasks.workunit.client.0.vm03.stdout:7/839: getdents d4/da/d5d/db0/d9d/dc9 0 2026-03-09T16:14:54.247 INFO:tasks.workunit.client.0.vm03.stdout:7/840: stat d4/da/d5d/dd8/d22/d24/d16/d3e/d114/lef 0 2026-03-09T16:14:54.255 INFO:tasks.workunit.client.0.vm03.stdout:3/855: dwrite d5/d1e/d42/d34/f73 [4194304,4194304] 0 2026-03-09T16:14:54.259 INFO:tasks.workunit.client.0.vm03.stdout:3/856: fsync d5/d2e/fd4 0 2026-03-09T16:14:54.274 INFO:tasks.workunit.client.0.vm03.stdout:3/857: link d5/d53/d88/dd7/lc4 d5/d1e/d42/lff 0 2026-03-09T16:14:54.275 INFO:tasks.workunit.client.0.vm03.stdout:3/858: unlink d5/d6d/la3 0 2026-03-09T16:14:54.280 INFO:tasks.workunit.client.0.vm03.stdout:3/859: fsync d5/d6d/db9/df2/dae/fde 0 2026-03-09T16:14:54.282 INFO:tasks.workunit.client.0.vm03.stdout:9/955: sync 2026-03-09T16:14:54.292 INFO:tasks.workunit.client.0.vm03.stdout:3/860: truncate d5/d6d/d5a/f78 7136071 0 2026-03-09T16:14:54.300 INFO:tasks.workunit.client.0.vm03.stdout:3/861: dread d5/d1e/d42/d34/fad [0,4194304] 0 2026-03-09T16:14:54.304 INFO:tasks.workunit.client.0.vm03.stdout:9/956: dread d2/d54/d7d/fa4 [0,4194304] 0 2026-03-09T16:14:54.305 INFO:tasks.workunit.client.0.vm03.stdout:3/862: unlink d5/d53/d88/dd3/fc0 0 2026-03-09T16:14:54.306 INFO:tasks.workunit.client.0.vm03.stdout:9/957: rmdir d2/de/d88 39 2026-03-09T16:14:54.313 INFO:tasks.workunit.client.0.vm03.stdout:2/862: write db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4/fb9 [258530,125641] 0 2026-03-09T16:14:54.326 INFO:tasks.workunit.client.0.vm03.stdout:2/863: rename db/d12/d2a/d99/de7/df9/cee to db/d12/d2a/d99/de7/df9/d52/c128 0 2026-03-09T16:14:54.342 INFO:tasks.workunit.client.0.vm03.stdout:9/958: creat d2/d54/d7d/f125 x:0 0 0 2026-03-09T16:14:54.342 INFO:tasks.workunit.client.0.vm03.stdout:4/881: write d5/dd/dd5/fef [773969,49112] 0 2026-03-09T16:14:54.355 
INFO:tasks.workunit.client.0.vm03.stdout:9/959: dwrite d2/d4/d11/d29/d2a/d46/f81 [0,4194304] 0 2026-03-09T16:14:54.367 INFO:tasks.workunit.client.0.vm03.stdout:4/882: unlink d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d9a/fac 0 2026-03-09T16:14:54.371 INFO:tasks.workunit.client.0.vm03.stdout:4/883: truncate d5/dd/d1f/d5f/f7c 544893 0 2026-03-09T16:14:54.373 INFO:tasks.workunit.client.0.vm03.stdout:0/892: dwrite d0/d7/d48/f2e [0,4194304] 0 2026-03-09T16:14:54.376 INFO:tasks.workunit.client.0.vm03.stdout:0/893: fdatasync d0/d7/d48/f43 0 2026-03-09T16:14:54.417 INFO:tasks.workunit.client.0.vm03.stdout:6/818: write d9/d42/f78 [1396132,87555] 0 2026-03-09T16:14:54.417 INFO:tasks.workunit.client.0.vm03.stdout:1/789: write d4/d6/f15 [632650,99754] 0 2026-03-09T16:14:54.447 INFO:tasks.workunit.client.0.vm03.stdout:0/894: truncate d0/da/d1b/dc8/d104/f110 468126 0 2026-03-09T16:14:54.453 INFO:tasks.workunit.client.0.vm03.stdout:9/960: dread d2/d54/d7d/d8f/dad/fae [0,4194304] 0 2026-03-09T16:14:54.459 INFO:tasks.workunit.client.0.vm03.stdout:8/923: write da/f52 [3436899,99170] 0 2026-03-09T16:14:54.477 INFO:tasks.workunit.client.0.vm03.stdout:0/895: sync 2026-03-09T16:14:54.477 INFO:tasks.workunit.client.0.vm03.stdout:5/928: dwrite d2/d7/de/d11/d19/d31/d35/d87/f10a [0,4194304] 0 2026-03-09T16:14:54.487 INFO:tasks.workunit.client.0.vm03.stdout:7/841: write d4/da/d5d/db0/d113/db8/ddf/ffe [1921364,103366] 0 2026-03-09T16:14:54.517 INFO:tasks.workunit.client.0.vm03.stdout:1/790: rename d4/d6/d1d/d20/d93/l44 to d4/d6/d3b/l102 0 2026-03-09T16:14:54.529 INFO:tasks.workunit.client.0.vm03.stdout:3/863: dwrite d5/f2b [4194304,4194304] 0 2026-03-09T16:14:54.532 INFO:tasks.workunit.client.0.vm03.stdout:4/884: write d5/db/d25/d8b/da8/df3/df7/fa1 [492788,115900] 0 2026-03-09T16:14:54.536 INFO:tasks.workunit.client.0.vm03.stdout:3/864: chown d5/d6d/db9/df2/dbe 23 1 2026-03-09T16:14:54.536 INFO:tasks.workunit.client.0.vm03.stdout:4/885: write f1 [1505498,83681] 0 2026-03-09T16:14:54.545 INFO:tasks.workunit.client.0.vm03.stdout:5/929: mknod d2/d7/de/d11/d19/c133 0 2026-03-09T16:14:54.551 INFO:tasks.workunit.client.0.vm03.stdout:9/961: write d2/d54/d7d/d8f/dad/fae [684946,68059] 0 2026-03-09T16:14:54.554 INFO:tasks.workunit.client.0.vm03.stdout:7/842: symlink d4/da/d5d/dd8/d22/d24/d16/d2b/l115 0 2026-03-09T16:14:54.556 INFO:tasks.workunit.client.0.vm03.stdout:4/886: unlink d5/dd/d1f/f5e 0 2026-03-09T16:14:54.556 INFO:tasks.workunit.client.0.vm03.stdout:3/865: sync 2026-03-09T16:14:54.561 INFO:tasks.workunit.client.0.vm03.stdout:0/896: link d0/d7/ldf d0/d7/d3e/d57/d5a/d82/d89/def/d125/l133 0 2026-03-09T16:14:54.565 INFO:tasks.workunit.client.0.vm03.stdout:6/819: dwrite d9/f20 [4194304,4194304] 0 2026-03-09T16:14:54.566 INFO:tasks.workunit.client.0.vm03.stdout:1/791: dread d4/d6/d3b/f35 [0,4194304] 0 2026-03-09T16:14:54.568 INFO:tasks.workunit.client.0.vm03.stdout:0/897: sync 2026-03-09T16:14:54.570 INFO:tasks.workunit.client.0.vm03.stdout:2/864: dread db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4/fb9 [0,4194304] 0 2026-03-09T16:14:54.580 INFO:tasks.workunit.client.0.vm03.stdout:5/930: fdatasync d2/d7/d115/d16/d5c/fb5 0 2026-03-09T16:14:54.586 INFO:tasks.workunit.client.0.vm03.stdout:9/962: mknod d2/d54/d7d/c126 0 2026-03-09T16:14:54.586 INFO:tasks.workunit.client.0.vm03.stdout:9/963: chown d2/d4/c20 328 1 2026-03-09T16:14:54.590 INFO:tasks.workunit.client.0.vm03.stdout:7/843: creat d4/da/d5d/dd8/d22/d24/d16/d3e/d114/df2/f116 x:0 0 0 2026-03-09T16:14:54.594 INFO:tasks.workunit.client.0.vm03.stdout:4/887: fdatasync 
d5/db/d25/d8b/da8/df3/df7/d4d/da9/fab 0 2026-03-09T16:14:54.595 INFO:tasks.workunit.client.0.vm03.stdout:4/888: chown d5/db/d25/d8b/da8/df3/df7/d4d/da9/fff 60680 1 2026-03-09T16:14:54.596 INFO:tasks.workunit.client.0.vm03.stdout:4/889: write d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d9a/ff6 [216701,123471] 0 2026-03-09T16:14:54.609 INFO:tasks.workunit.client.0.vm03.stdout:6/820: creat d9/d42/d45/d50/d80/d90/db7/f106 x:0 0 0 2026-03-09T16:14:54.631 INFO:tasks.workunit.client.0.vm03.stdout:8/924: dwrite da/d10/d28/d4f/d68/f8f [0,4194304] 0 2026-03-09T16:14:54.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.632+0000 7fdf70bc2640 1 -- 192.168.123.103:0/1944376737 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdf6c072390 msgr2=0x7fdf6c10c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:54.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.632+0000 7fdf70bc2640 1 --2- 192.168.123.103:0/1944376737 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdf6c072390 0x7fdf6c10c590 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fdf6000b0a0 tx=0x7fdf6002f4a0 comp rx=0 tx=0).stop 2026-03-09T16:14:54.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.634+0000 7fdf70bc2640 1 -- 192.168.123.103:0/1944376737 shutdown_connections 2026-03-09T16:14:54.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.634+0000 7fdf70bc2640 1 --2- 192.168.123.103:0/1944376737 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdf6c072390 0x7fdf6c10c590 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:54.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.634+0000 7fdf70bc2640 1 --2- 192.168.123.103:0/1944376737 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf6c0719c0 0x7fdf6c071dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:54.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.634+0000 7fdf70bc2640 1 -- 192.168.123.103:0/1944376737 >> 192.168.123.103:0/1944376737 conn(0x7fdf6c06d4f0 msgr2=0x7fdf6c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:54.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.636+0000 7fdf70bc2640 1 -- 192.168.123.103:0/1944376737 shutdown_connections 2026-03-09T16:14:54.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.636+0000 7fdf70bc2640 1 -- 192.168.123.103:0/1944376737 wait complete. 
2026-03-09T16:14:54.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.637+0000 7fdf70bc2640 1 Processor -- start 2026-03-09T16:14:54.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.637+0000 7fdf70bc2640 1 -- start start 2026-03-09T16:14:54.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.637+0000 7fdf70bc2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf6c0719c0 0x7fdf6c1afd70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:54.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.637+0000 7fdf70bc2640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdf6c072390 0x7fdf6c1b02f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:54.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.637+0000 7fdf70bc2640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdf6c1b0960 con 0x7fdf6c0719c0 2026-03-09T16:14:54.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.637+0000 7fdf70bc2640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdf6c1b0ad0 con 0x7fdf6c072390 2026-03-09T16:14:54.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.637+0000 7fdf69d74640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdf6c072390 0x7fdf6c1b02f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:54.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.637+0000 7fdf69d74640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdf6c072390 0x7fdf6c1b02f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:58094/0 (socket says 192.168.123.103:58094) 2026-03-09T16:14:54.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.637+0000 7fdf69d74640 1 -- 192.168.123.103:0/1998463867 learned_addr learned my addr 192.168.123.103:0/1998463867 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:14:54.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.638+0000 7fdf6a575640 1 --2- 192.168.123.103:0/1998463867 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf6c0719c0 0x7fdf6c1afd70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:54.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.638+0000 7fdf69d74640 1 -- 192.168.123.103:0/1998463867 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf6c0719c0 msgr2=0x7fdf6c1afd70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:54.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.638+0000 7fdf69d74640 1 --2- 192.168.123.103:0/1998463867 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf6c0719c0 0x7fdf6c1afd70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:54.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.638+0000 7fdf69d74640 1 -- 192.168.123.103:0/1998463867 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fdf60009d00 con 0x7fdf6c072390 2026-03-09T16:14:54.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.638+0000 7fdf69d74640 1 --2- 192.168.123.103:0/1998463867 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdf6c072390 0x7fdf6c1b02f0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fdf60009840 tx=0x7fdf60002bd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:54.640 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.638+0000 7fdf577fe640 1 -- 192.168.123.103:0/1998463867 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdf600090d0 con 0x7fdf6c072390 2026-03-09T16:14:54.640 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.638+0000 7fdf70bc2640 1 -- 192.168.123.103:0/1998463867 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdf6c1b5170 con 0x7fdf6c072390 2026-03-09T16:14:54.640 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.638+0000 7fdf70bc2640 1 -- 192.168.123.103:0/1998463867 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdf6c1b56c0 con 0x7fdf6c072390 2026-03-09T16:14:54.640 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.639+0000 7fdf557fa640 1 -- 192.168.123.103:0/1998463867 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdf6c10f5f0 con 0x7fdf6c072390 2026-03-09T16:14:54.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.640+0000 7fdf577fe640 1 -- 192.168.123.103:0/1998463867 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fdf6002fe90 con 0x7fdf6c072390 2026-03-09T16:14:54.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.641+0000 7fdf577fe640 1 -- 192.168.123.103:0/1998463867 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdf60038740 con 0x7fdf6c072390 2026-03-09T16:14:54.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.641+0000 7fdf577fe640 1 -- 192.168.123.103:0/1998463867 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 29) v1 ==== 100078+0+0 (secure 0 0 0) 0x7fdf60007660 con 0x7fdf6c072390 2026-03-09T16:14:54.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.642+0000 7fdf577fe640 1 --2- 192.168.123.103:0/1998463867 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fdf3c077420 0x7fdf3c0798e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:54.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.642+0000 7fdf577fe640 1 -- 192.168.123.103:0/1998463867 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fdf600be5e0 con 0x7fdf6c072390 2026-03-09T16:14:54.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.643+0000 7fdf6a575640 1 --2- 192.168.123.103:0/1998463867 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fdf3c077420 0x7fdf3c0798e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:54.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.644+0000 7fdf577fe640 1 -- 192.168.123.103:0/1998463867 <== mon.1 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7fdf600870a0 con 0x7fdf6c072390 2026-03-09T16:14:54.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.646+0000 7fdf6a575640 1 --2- 192.168.123.103:0/1998463867 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fdf3c077420 0x7fdf3c0798e0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fdf5c004500 tx=0x7fdf5c009290 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:54.648 INFO:tasks.workunit.client.0.vm03.stdout:0/898: truncate d0/d7/d3e/d57/d5a/d47/f88 3990586 0 2026-03-09T16:14:54.649 INFO:tasks.workunit.client.0.vm03.stdout:0/899: fsync d0/d7/d3e/d57/d5a/d5f/db2/f112 0 2026-03-09T16:14:54.658 INFO:tasks.workunit.client.0.vm03.stdout:3/866: symlink d5/d1e/d42/l100 0 2026-03-09T16:14:54.663 INFO:tasks.workunit.client.0.vm03.stdout:4/890: mknod d5/db/d25/d8b/da8/d81/dd4/c10f 0 2026-03-09T16:14:54.664 INFO:tasks.workunit.client.0.vm03.stdout:4/891: readlink d5/l29 0 2026-03-09T16:14:54.673 INFO:tasks.workunit.client.0.vm03.stdout:8/925: mknod da/d10/d28/d4f/d85/d9c/c132 0 2026-03-09T16:14:54.679 INFO:tasks.workunit.client.0.vm03.stdout:0/900: symlink d0/d7/d75/l134 0 2026-03-09T16:14:54.683 INFO:tasks.workunit.client.0.vm03.stdout:2/865: mkdir db/d12/d2a/d99/de7/df9/d64/dbd/d124/d129 0 2026-03-09T16:14:54.683 INFO:tasks.workunit.client.0.vm03.stdout:1/792: dwrite d4/d39/d7f/fcb [0,4194304] 0 2026-03-09T16:14:54.713 INFO:tasks.workunit.client.0.vm03.stdout:3/867: mkdir d5/d6d/d6a/d101 0 2026-03-09T16:14:54.714 INFO:tasks.workunit.client.0.vm03.stdout:3/868: fdatasync d5/fb3 0 2026-03-09T16:14:54.719 INFO:tasks.workunit.client.0.vm03.stdout:6/821: write d9/d42/d45/d50/fb0 [8692777,121801] 0 2026-03-09T16:14:54.746 INFO:tasks.workunit.client.0.vm03.stdout:1/793: creat d4/d6/da2/dea/f103 x:0 0 0 2026-03-09T16:14:54.747 INFO:tasks.workunit.client.0.vm03.stdout:1/794: dread - d4/d6/d1d/d69/fa9 zero size 2026-03-09T16:14:54.766 INFO:tasks.workunit.client.0.vm03.stdout:9/964: rmdir d2/d4/d11/d29/d2a/d38/dcd/d114 0 2026-03-09T16:14:54.768 INFO:tasks.workunit.client.0.vm03.stdout:9/965: fdatasync d2/d4/d11/d29/d2a/d46/f9e 0 2026-03-09T16:14:54.770 INFO:tasks.workunit.client.0.vm03.stdout:9/966: dread - d2/d54/f5e zero size 2026-03-09T16:14:54.780 INFO:tasks.workunit.client.0.vm03.stdout:9/967: dread d2/d54/d7d/d8f/dad/def/f22 [0,4194304] 0 2026-03-09T16:14:54.793 INFO:tasks.workunit.client.0.vm03.stdout:2/866: dwrite db/d12/d2a/d61/d6d/f91 [0,4194304] 0 2026-03-09T16:14:54.817 INFO:tasks.workunit.client.0.vm03.stdout:5/931: getdents d2/d7/de/d11/d19/d29/d90/dbe 0 2026-03-09T16:14:54.818 INFO:tasks.workunit.client.0.vm03.stdout:5/932: chown d2/d7/de/d54/c9c 17834 1 2026-03-09T16:14:54.823 INFO:tasks.workunit.client.0.vm03.stdout:1/795: write d4/d6/d3b/f98 [554276,61546] 0 2026-03-09T16:14:54.832 INFO:tasks.workunit.client.0.vm03.stdout:9/968: creat d2/d4/d11/d12/dc7/dcc/f127 x:0 0 0 2026-03-09T16:14:54.841 INFO:tasks.workunit.client.0.vm03.stdout:2/867: creat db/d12/da5/de4/f12a x:0 0 0 2026-03-09T16:14:54.847 INFO:tasks.workunit.client.0.vm03.stdout:7/844: getdents d4/da/d5d/db0/d9d 0 2026-03-09T16:14:54.864 INFO:tasks.workunit.client.0.vm03.stdout:4/892: creat d5/db/d25/d8b/da8/df3/df7/f110 x:0 0 0 2026-03-09T16:14:54.865 INFO:tasks.workunit.client.0.vm03.stdout:4/893: chown d5/db/d25/d8b/da8/df3/df7/d4d/d5b/fc2 153726 1 2026-03-09T16:14:54.866 INFO:tasks.workunit.client.0.vm03.stdout:4/894: 
chown d5/dd/dd5 14 1 2026-03-09T16:14:54.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.869+0000 7fdf557fa640 1 -- 192.168.123.103:0/1998463867 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fdf6c10f800 con 0x7fdf3c077420 2026-03-09T16:14:54.873 INFO:tasks.workunit.client.0.vm03.stdout:6/822: creat d9/d42/d45/dfa/f107 x:0 0 0 2026-03-09T16:14:54.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.875+0000 7fdf577fe640 1 -- 192.168.123.103:0/1998463867 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+329 (secure 0 0 0) 0x7fdf6c10f800 con 0x7fdf3c077420 2026-03-09T16:14:54.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.881+0000 7fdf557fa640 1 -- 192.168.123.103:0/1998463867 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fdf3c077420 msgr2=0x7fdf3c0798e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:54.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.881+0000 7fdf557fa640 1 --2- 192.168.123.103:0/1998463867 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fdf3c077420 0x7fdf3c0798e0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fdf5c004500 tx=0x7fdf5c009290 comp rx=0 tx=0).stop 2026-03-09T16:14:54.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.881+0000 7fdf557fa640 1 -- 192.168.123.103:0/1998463867 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdf6c072390 msgr2=0x7fdf6c1b02f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:54.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.881+0000 7fdf557fa640 1 --2- 192.168.123.103:0/1998463867 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdf6c072390 0x7fdf6c1b02f0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fdf60009840 tx=0x7fdf60002bd0 comp rx=0 tx=0).stop 2026-03-09T16:14:54.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.881+0000 7fdf557fa640 1 -- 192.168.123.103:0/1998463867 shutdown_connections 2026-03-09T16:14:54.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.881+0000 7fdf557fa640 1 --2- 192.168.123.103:0/1998463867 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fdf3c077420 0x7fdf3c0798e0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:54.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.881+0000 7fdf557fa640 1 --2- 192.168.123.103:0/1998463867 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdf6c072390 0x7fdf6c1b02f0 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:54.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.881+0000 7fdf557fa640 1 --2- 192.168.123.103:0/1998463867 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdf6c0719c0 0x7fdf6c1afd70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:54.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.881+0000 7fdf557fa640 1 -- 192.168.123.103:0/1998463867 >> 192.168.123.103:0/1998463867 conn(0x7fdf6c06d4f0 msgr2=0x7fdf6c10a7c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:54.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.882+0000 
7fdf557fa640 1 -- 192.168.123.103:0/1998463867 shutdown_connections 2026-03-09T16:14:54.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:54.882+0000 7fdf557fa640 1 -- 192.168.123.103:0/1998463867 wait complete. 2026-03-09T16:14:54.884 INFO:tasks.workunit.client.0.vm03.stdout:3/869: mkdir d5/d44/d102 0 2026-03-09T16:14:54.885 INFO:tasks.workunit.client.0.vm03.stdout:8/926: rename da/d10/d28/d4f/d68 to da/d32/d133 0 2026-03-09T16:14:54.890 INFO:tasks.workunit.client.0.vm03.stdout:6/823: sync 2026-03-09T16:14:54.895 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:14:54.896 INFO:tasks.workunit.client.0.vm03.stdout:0/901: link d0/da/d5c/f39 d0/d7/d3e/d57/d5a/d5f/db2/dcf/f135 0 2026-03-09T16:14:54.898 INFO:tasks.workunit.client.0.vm03.stdout:4/895: dwrite d5/db/d25/d8b/da8/d81/f103 [0,4194304] 0 2026-03-09T16:14:54.904 INFO:tasks.workunit.client.0.vm03.stdout:7/845: creat d4/da/d5d/dd8/d22/d24/d16/d3e/d114/d10c/f117 x:0 0 0 2026-03-09T16:14:54.920 INFO:tasks.workunit.client.0.vm03.stdout:7/846: dread d4/da/d5d/dd8/f6a [0,4194304] 0 2026-03-09T16:14:54.926 INFO:tasks.workunit.client.0.vm03.stdout:0/902: dwrite d0/d7/d3e/d57/d5a/d5f/db2/dcf/f101 [0,4194304] 0 2026-03-09T16:14:54.957 INFO:tasks.workunit.client.0.vm03.stdout:4/896: readlink d5/db/d25/l43 0 2026-03-09T16:14:54.965 INFO:tasks.workunit.client.0.vm03.stdout:8/927: write da/db/d30/dc7/fec [98060,13471] 0 2026-03-09T16:14:54.967 INFO:tasks.workunit.client.0.vm03.stdout:3/870: dwrite d5/d6d/d6a/fa9 [0,4194304] 0 2026-03-09T16:14:54.968 INFO:tasks.workunit.client.0.vm03.stdout:4/897: sync 2026-03-09T16:14:54.969 INFO:tasks.workunit.client.0.vm03.stdout:4/898: write d5/dd/d1f/f58 [2763208,37049] 0 2026-03-09T16:14:54.978 INFO:tasks.workunit.client.0.vm03.stdout:3/871: dread d5/d53/d6c/d79/f9d [0,4194304] 0 2026-03-09T16:14:54.980 INFO:tasks.workunit.client.0.vm03.stdout:7/847: stat d4/f8f 0 2026-03-09T16:14:54.992 INFO:tasks.workunit.client.0.vm03.stdout:0/903: creat d0/d7/d3e/d57/d5a/d5f/f136 x:0 0 0 2026-03-09T16:14:54.995 INFO:tasks.workunit.client.0.vm03.stdout:7/848: sync 2026-03-09T16:14:54.997 INFO:tasks.workunit.client.0.vm03.stdout:7/849: chown d4/f3b 7017 1 2026-03-09T16:14:54.998 INFO:tasks.workunit.client.0.vm03.stdout:5/933: creat d2/d7/d115/d24/d27/f134 x:0 0 0 2026-03-09T16:14:55.007 INFO:tasks.workunit.client.0.vm03.stdout:1/796: rmdir d4/d6/da2/dea/d100 0 2026-03-09T16:14:55.008 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.007+0000 7f8edb6e1640 1 -- 192.168.123.103:0/4264854562 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ed4103c80 msgr2=0x7f8ed4104100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:55.008 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.007+0000 7f8edb6e1640 1 --2- 192.168.123.103:0/4264854562 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ed4103c80 0x7f8ed4104100 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f8ec80099b0 tx=0x7f8ec802f240 comp rx=0 tx=0).stop 2026-03-09T16:14:55.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.008+0000 7f8edb6e1640 1 -- 192.168.123.103:0/4264854562 shutdown_connections 2026-03-09T16:14:55.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.008+0000 7f8edb6e1640 1 --2- 192.168.123.103:0/4264854562 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ed4103c80 0x7f8ed4104100 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.010 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.008+0000 7f8edb6e1640 1 --2- 192.168.123.103:0/4264854562 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ed4102a80 0x7f8ed4102e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.008+0000 7f8edb6e1640 1 -- 192.168.123.103:0/4264854562 >> 192.168.123.103:0/4264854562 conn(0x7f8ed40fe250 msgr2=0x7f8ed4100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:55.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.010+0000 7f8edb6e1640 1 -- 192.168.123.103:0/4264854562 shutdown_connections 2026-03-09T16:14:55.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.010+0000 7f8edb6e1640 1 -- 192.168.123.103:0/4264854562 wait complete. 2026-03-09T16:14:55.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.010+0000 7f8edb6e1640 1 Processor -- start 2026-03-09T16:14:55.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.010+0000 7f8edb6e1640 1 -- start start 2026-03-09T16:14:55.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.010+0000 7f8edb6e1640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ed4102a80 0x7f8ed419e6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:55.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.010+0000 7f8edb6e1640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ed4103c80 0x7f8ed419ec30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:55.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.010+0000 7f8edb6e1640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ed419f200 con 0x7f8ed4102a80 2026-03-09T16:14:55.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.010+0000 7f8edb6e1640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ed419f370 con 0x7f8ed4103c80 2026-03-09T16:14:55.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.011+0000 7f8ed8c55640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ed4103c80 0x7f8ed419ec30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:55.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.011+0000 7f8ed8c55640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ed4103c80 0x7f8ed419ec30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:58118/0 (socket says 192.168.123.103:58118) 2026-03-09T16:14:55.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.011+0000 7f8ed8c55640 1 -- 192.168.123.103:0/2412287217 learned_addr learned my addr 192.168.123.103:0/2412287217 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:14:55.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.011+0000 7f8ed8c55640 1 -- 192.168.123.103:0/2412287217 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ed4102a80 msgr2=0x7f8ed419e6f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:55.014 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.011+0000 7f8ed8c55640 1 --2- 192.168.123.103:0/2412287217 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ed4102a80 0x7f8ed419e6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.011+0000 7f8ed8c55640 1 -- 192.168.123.103:0/2412287217 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8ec8009660 con 0x7f8ed4103c80 2026-03-09T16:14:55.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.014+0000 7f8ed8c55640 1 --2- 192.168.123.103:0/2412287217 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ed4103c80 0x7f8ed419ec30 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f8ec8009630 tx=0x7f8ec8031cf0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:55.015 INFO:tasks.workunit.client.0.vm03.stdout:6/824: creat d9/d42/d45/d50/d80/d8a/dc1/dd4/de5/dfe/f108 x:0 0 0 2026-03-09T16:14:55.015 INFO:tasks.workunit.client.0.vm03.stdout:2/868: creat db/d12/d2a/d61/f12b x:0 0 0 2026-03-09T16:14:55.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.015+0000 7f8ec27fc640 1 -- 192.168.123.103:0/2412287217 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8ec803d070 con 0x7f8ed4103c80 2026-03-09T16:14:55.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.015+0000 7f8edb6e1640 1 -- 192.168.123.103:0/2412287217 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8ed41a3db0 con 0x7f8ed4103c80 2026-03-09T16:14:55.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.015+0000 7f8edb6e1640 1 -- 192.168.123.103:0/2412287217 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8ed41a4240 con 0x7f8ed4103c80 2026-03-09T16:14:55.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.016+0000 7f8ec27fc640 1 -- 192.168.123.103:0/2412287217 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8ec8002e70 con 0x7f8ed4103c80 2026-03-09T16:14:55.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.016+0000 7f8ec27fc640 1 -- 192.168.123.103:0/2412287217 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8ec8031280 con 0x7f8ed4103c80 2026-03-09T16:14:55.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.016+0000 7f8eabfff640 1 -- 192.168.123.103:0/2412287217 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8ed40769a0 con 0x7f8ed4103c80 2026-03-09T16:14:55.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.017+0000 7f8ec27fc640 1 -- 192.168.123.103:0/2412287217 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 29) v1 ==== 100078+0+0 (secure 0 0 0) 0x7f8ec8038830 con 0x7f8ed4103c80 2026-03-09T16:14:55.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.018+0000 7f8ec27fc640 1 --2- 192.168.123.103:0/2412287217 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f8e9c077680 0x7f8e9c079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:55.019 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.018+0000 7f8ec27fc640 1 -- 192.168.123.103:0/2412287217 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f8ec80be350 con 0x7f8ed4103c80 2026-03-09T16:14:55.021 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.021+0000 7f8ed9456640 1 --2- 192.168.123.103:0/2412287217 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f8e9c077680 0x7f8e9c079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:55.022 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.021+0000 7f8ec27fc640 1 -- 192.168.123.103:0/2412287217 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f8ec8086d60 con 0x7f8ed4103c80 2026-03-09T16:14:55.024 INFO:tasks.workunit.client.0.vm03.stdout:3/872: rename d5/d6d/db9/df2/dae/fde to d5/d53/d88/dd7/df1/f103 0 2026-03-09T16:14:55.025 INFO:tasks.workunit.client.0.vm03.stdout:0/904: symlink d0/da/d5c/db6/l137 0 2026-03-09T16:14:55.025 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.024+0000 7f8ed9456640 1 --2- 192.168.123.103:0/2412287217 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f8e9c077680 0x7f8e9c079b40 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f8ec4005fd0 tx=0x7f8ec40094f0 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:55.040 INFO:tasks.workunit.client.0.vm03.stdout:5/934: mkdir d2/d7/d1a/d135 0 2026-03-09T16:14:55.043 INFO:tasks.workunit.client.0.vm03.stdout:6/825: mknod d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/c109 0 2026-03-09T16:14:55.048 INFO:tasks.workunit.client.0.vm03.stdout:8/928: write da/d32/f4d [649704,9062] 0 2026-03-09T16:14:55.057 INFO:tasks.workunit.client.0.vm03.stdout:2/869: mkdir db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/df7/d12c 0 2026-03-09T16:14:55.057 INFO:tasks.workunit.client.0.vm03.stdout:4/899: write d5/db/d25/d8b/da8/dbe/fc5 [523098,99995] 0 2026-03-09T16:14:55.066 INFO:tasks.workunit.client.0.vm03.stdout:7/850: rename d4/da/d5d/dd8/d22/d24/d16/d69 to d4/da/d45/d51/d36/d66/d118 0 2026-03-09T16:14:55.068 INFO:tasks.workunit.client.0.vm03.stdout:8/929: dwrite da/d32/d133/fa9 [0,4194304] 0 2026-03-09T16:14:55.069 INFO:tasks.workunit.client.0.vm03.stdout:1/797: mkdir d4/d6/d1d/d3d/d101/d104 0 2026-03-09T16:14:55.081 INFO:tasks.workunit.client.0.vm03.stdout:9/969: getdents d2/de/d88 0 2026-03-09T16:14:55.084 INFO:tasks.workunit.client.0.vm03.stdout:4/900: creat d5/d56/f111 x:0 0 0 2026-03-09T16:14:55.086 INFO:tasks.workunit.client.0.vm03.stdout:4/901: chown d5/db/d25/d8b/da8/df3/df7/f110 3682 1 2026-03-09T16:14:55.101 INFO:tasks.workunit.client.0.vm03.stdout:3/873: creat d5/d44/d102/f104 x:0 0 0 2026-03-09T16:14:55.104 INFO:tasks.workunit.client.0.vm03.stdout:7/851: readlink d4/da/d5d/dd8/d22/d24/d16/d3e/lba 0 2026-03-09T16:14:55.105 INFO:tasks.workunit.client.0.vm03.stdout:4/902: dwrite d5/dd/dd5/fef [0,4194304] 0 2026-03-09T16:14:55.110 INFO:tasks.workunit.client.0.vm03.stdout:3/874: dread - d5/d53/d88/dd7/df1/ffe zero size 2026-03-09T16:14:55.114 INFO:tasks.workunit.client.0.vm03.stdout:1/798: mkdir d4/db/d8b/d105 0 2026-03-09T16:14:55.123 INFO:tasks.workunit.client.0.vm03.stdout:0/905: write d0/d7/d3e/d57/d5a/d82/d89/dbd/f7e [1027125,112984] 0 2026-03-09T16:14:55.125 
INFO:tasks.workunit.client.0.vm03.stdout:5/935: mknod d2/d7/d115/c136 0 2026-03-09T16:14:55.125 INFO:tasks.workunit.client.0.vm03.stdout:2/870: write db/ff0 [76243,78635] 0 2026-03-09T16:14:55.125 INFO:tasks.workunit.client.0.vm03.stdout:9/970: symlink d2/d4/d11/d29/d2a/d46/dd6/d111/l128 0 2026-03-09T16:14:55.125 INFO:tasks.workunit.client.0.vm03.stdout:1/799: stat d4/d6/d1d/d20/d93/f85 0 2026-03-09T16:14:55.130 INFO:tasks.workunit.client.0.vm03.stdout:1/800: chown d4/d6/d3b/f36 279 1 2026-03-09T16:14:55.149 INFO:tasks.workunit.client.0.vm03.stdout:6/826: link d9/d42/d45/d50/d80/d8a/dc1/f102 d9/d42/d45/d50/d80/f10a 0 2026-03-09T16:14:55.167 INFO:tasks.workunit.client.0.vm03.stdout:1/801: dwrite d4/d6/da2/dea/f103 [0,4194304] 0 2026-03-09T16:14:55.168 INFO:tasks.workunit.client.0.vm03.stdout:5/936: creat d2/d7/d115/d24/d27/d43/d4b/de6/f137 x:0 0 0 2026-03-09T16:14:55.176 INFO:tasks.workunit.client.0.vm03.stdout:4/903: dwrite d5/db/d25/d8b/da8/df3/df7/d33/d79/fa6 [0,4194304] 0 2026-03-09T16:14:55.185 INFO:tasks.workunit.client.0.vm03.stdout:0/906: dwrite d0/d7/ff8 [0,4194304] 0 2026-03-09T16:14:55.191 INFO:tasks.workunit.client.0.vm03.stdout:1/802: dread d4/d31/d5c/da8/fd0 [0,4194304] 0 2026-03-09T16:14:55.211 INFO:tasks.workunit.client.0.vm03.stdout:9/971: truncate d2/d4/d11/d12/ff5 784816 0 2026-03-09T16:14:55.215 INFO:tasks.workunit.client.0.vm03.stdout:1/803: sync 2026-03-09T16:14:55.224 INFO:tasks.workunit.client.0.vm03.stdout:1/804: dwrite d4/db/f60 [0,4194304] 0 2026-03-09T16:14:55.229 INFO:tasks.workunit.client.0.vm03.stdout:7/852: mknod d4/da/d5d/db0/d113/de8/c119 0 2026-03-09T16:14:55.229 INFO:tasks.workunit.client.0.vm03.stdout:6/827: rename d9/d42/f78 to d9/d8e/def/f10b 0 2026-03-09T16:14:55.230 INFO:tasks.workunit.client.0.vm03.stdout:1/805: chown d4/d6/d1d/d3d/f45 108999709 1 2026-03-09T16:14:55.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.250+0000 7f8eabfff640 1 -- 192.168.123.103:0/2412287217 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8ed4061be0 con 0x7f8e9c077680 2026-03-09T16:14:55.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.252+0000 7f8ec27fc640 1 -- 192.168.123.103:0/2412287217 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+329 (secure 0 0 0) 0x7f8ed4061be0 con 0x7f8e9c077680 2026-03-09T16:14:55.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.256+0000 7f8edb6e1640 1 -- 192.168.123.103:0/2412287217 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f8e9c077680 msgr2=0x7f8e9c079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:55.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.256+0000 7f8edb6e1640 1 --2- 192.168.123.103:0/2412287217 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f8e9c077680 0x7f8e9c079b40 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f8ec4005fd0 tx=0x7f8ec40094f0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.256+0000 7f8edb6e1640 1 -- 192.168.123.103:0/2412287217 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ed4103c80 msgr2=0x7f8ed419ec30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:55.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.256+0000 7f8edb6e1640 1 --2- 
192.168.123.103:0/2412287217 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ed4103c80 0x7f8ed419ec30 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f8ec8009630 tx=0x7f8ec8031cf0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.257+0000 7f8edb6e1640 1 -- 192.168.123.103:0/2412287217 shutdown_connections 2026-03-09T16:14:55.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.257+0000 7f8edb6e1640 1 --2- 192.168.123.103:0/2412287217 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f8e9c077680 0x7f8e9c079b40 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.257+0000 7f8edb6e1640 1 --2- 192.168.123.103:0/2412287217 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ed4103c80 0x7f8ed419ec30 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.257+0000 7f8edb6e1640 1 --2- 192.168.123.103:0/2412287217 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ed4102a80 0x7f8ed419e6f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.257+0000 7f8edb6e1640 1 -- 192.168.123.103:0/2412287217 >> 192.168.123.103:0/2412287217 conn(0x7f8ed40fe250 msgr2=0x7f8ed4104ea0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:55.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.258+0000 7f8edb6e1640 1 -- 192.168.123.103:0/2412287217 shutdown_connections 2026-03-09T16:14:55.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.259+0000 7f8edb6e1640 1 -- 192.168.123.103:0/2412287217 wait complete. 
2026-03-09T16:14:55.260 INFO:tasks.workunit.client.0.vm03.stdout:5/937: creat d2/d7/de/da9/f138 x:0 0 0 2026-03-09T16:14:55.260 INFO:tasks.workunit.client.0.vm03.stdout:4/904: dread - d5/db/d25/d8b/da8/dbe/fd0 zero size 2026-03-09T16:14:55.279 INFO:tasks.workunit.client.0.vm03.stdout:8/930: getdents da/d6c/d7a 0 2026-03-09T16:14:55.279 INFO:tasks.workunit.client.0.vm03.stdout:7/853: chown d4/da/lff 0 1 2026-03-09T16:14:55.287 INFO:tasks.workunit.client.0.vm03.stdout:1/806: readlink d4/d6/l6a 0 2026-03-09T16:14:55.293 INFO:tasks.workunit.client.0.vm03.stdout:5/938: dread - d2/d7/d115/d16/d5c/ff1 zero size 2026-03-09T16:14:55.295 INFO:tasks.workunit.client.0.vm03.stdout:5/939: chown d2/d7/de/d33/f129 107760803 1 2026-03-09T16:14:55.295 INFO:tasks.workunit.client.0.vm03.stdout:3/875: getdents d5/d1e/d42/d34/dd2 0 2026-03-09T16:14:55.309 INFO:tasks.workunit.client.0.vm03.stdout:8/931: symlink da/d32/l134 0 2026-03-09T16:14:55.309 INFO:tasks.workunit.client.0.vm03.stdout:7/854: unlink d4/da/c64 0 2026-03-09T16:14:55.309 INFO:tasks.workunit.client.0.vm03.stdout:8/932: chown da/db/da8/db8 2034053 1 2026-03-09T16:14:55.310 INFO:tasks.workunit.client.0.vm03.stdout:5/940: mknod d2/d7/d115/d16/d5c/dcf/c139 0 2026-03-09T16:14:55.310 INFO:tasks.workunit.client.0.vm03.stdout:7/855: write d4/da/d5d/db0/d9d/fac [794409,120415] 0 2026-03-09T16:14:55.310 INFO:tasks.workunit.client.0.vm03.stdout:8/933: write da/d32/d133/f12e [68417,16786] 0 2026-03-09T16:14:55.319 INFO:tasks.workunit.client.0.vm03.stdout:1/807: unlink d4/cf 0 2026-03-09T16:14:55.328 INFO:tasks.workunit.client.0.vm03.stdout:8/934: creat da/d45/f135 x:0 0 0 2026-03-09T16:14:55.328 INFO:tasks.workunit.client.0.vm03.stdout:3/876: dread d5/d2e/fec [0,4194304] 0 2026-03-09T16:14:55.342 INFO:tasks.workunit.client.0.vm03.stdout:2/871: dwrite db/d12/d2a/d99/de7/df9/d64/dbd/da0/fcb [0,4194304] 0 2026-03-09T16:14:55.351 INFO:tasks.workunit.client.0.vm03.stdout:3/877: creat d5/d6d/d6a/f105 x:0 0 0 2026-03-09T16:14:55.351 INFO:tasks.workunit.client.0.vm03.stdout:1/808: truncate d4/d6/d1d/d20/d23/f74 1843840 0 2026-03-09T16:14:55.359 INFO:tasks.workunit.client.0.vm03.stdout:7/856: link d4/da/d5d/dd8/d22/d24/f59 d4/da/d45/d51/d36/f11a 0 2026-03-09T16:14:55.364 INFO:tasks.workunit.client.0.vm03.stdout:3/878: read d5/d44/f54 [2336421,70942] 0 2026-03-09T16:14:55.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.367+0000 7f6a6dff2640 1 -- 192.168.123.103:0/1452147802 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a68072440 msgr2=0x7f6a680771b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:55.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.367+0000 7f6a6dff2640 1 --2- 192.168.123.103:0/1452147802 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a68072440 0x7f6a680771b0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f6a60009040 tx=0x7f6a6002fc10 comp rx=0 tx=0).stop 2026-03-09T16:14:55.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.367+0000 7f6a6dff2640 1 -- 192.168.123.103:0/1452147802 shutdown_connections 2026-03-09T16:14:55.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.367+0000 7f6a6dff2640 1 --2- 192.168.123.103:0/1452147802 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a68072440 0x7f6a680771b0 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.367+0000 7f6a6dff2640 1 --2- 
192.168.123.103:0/1452147802 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a68071a70 0x7f6a68071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.367+0000 7f6a6dff2640 1 -- 192.168.123.103:0/1452147802 >> 192.168.123.103:0/1452147802 conn(0x7f6a6806d4f0 msgr2=0x7f6a6806f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:55.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.367+0000 7f6a6dff2640 1 -- 192.168.123.103:0/1452147802 shutdown_connections 2026-03-09T16:14:55.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.367+0000 7f6a6dff2640 1 -- 192.168.123.103:0/1452147802 wait complete. 2026-03-09T16:14:55.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.367+0000 7f6a6dff2640 1 Processor -- start 2026-03-09T16:14:55.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.367+0000 7f6a6dff2640 1 -- start start 2026-03-09T16:14:55.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.367+0000 7f6a6dff2640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a68071a70 0x7f6a68084090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:55.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.367+0000 7f6a6dff2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a680826e0 0x7f6a68082b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:55.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.367+0000 7f6a6dff2640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a680845d0 con 0x7f6a680826e0 2026-03-09T16:14:55.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.367+0000 7f6a6dff2640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a680830a0 con 0x7f6a68071a70 2026-03-09T16:14:55.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.368+0000 7f6a677fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a68071a70 0x7f6a68084090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:55.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.368+0000 7f6a677fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a68071a70 0x7f6a68084090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:58136/0 (socket says 192.168.123.103:58136) 2026-03-09T16:14:55.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.368+0000 7f6a677fe640 1 -- 192.168.123.103:0/2682134274 learned_addr learned my addr 192.168.123.103:0/2682134274 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:14:55.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.368+0000 7f6a677fe640 1 -- 192.168.123.103:0/2682134274 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a680826e0 msgr2=0x7f6a68082b60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:55.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.368+0000 7f6a677fe640 1 --2- 192.168.123.103:0/2682134274 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a680826e0 0x7f6a68082b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.368+0000 7f6a677fe640 1 -- 192.168.123.103:0/2682134274 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6a60008cf0 con 0x7f6a68071a70 2026-03-09T16:14:55.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.369+0000 7f6a677fe640 1 --2- 192.168.123.103:0/2682134274 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a68071a70 0x7f6a68084090 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f6a58009870 tx=0x7f6a58009d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:55.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.369+0000 7f6a64ff9640 1 -- 192.168.123.103:0/2682134274 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6a58010040 con 0x7f6a68071a70 2026-03-09T16:14:55.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.369+0000 7f6a6dff2640 1 -- 192.168.123.103:0/2682134274 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6a68083380 con 0x7f6a68071a70 2026-03-09T16:14:55.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.369+0000 7f6a6dff2640 1 -- 192.168.123.103:0/2682134274 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6a6812ef70 con 0x7f6a68071a70 2026-03-09T16:14:55.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.370+0000 7f6a64ff9640 1 -- 192.168.123.103:0/2682134274 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6a5800ecf0 con 0x7f6a68071a70 2026-03-09T16:14:55.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.370+0000 7f6a64ff9640 1 -- 192.168.123.103:0/2682134274 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6a58002cf0 con 0x7f6a68071a70 2026-03-09T16:14:55.371 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.370+0000 7f6a6dff2640 1 -- 192.168.123.103:0/2682134274 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6a68072440 con 0x7f6a68071a70 2026-03-09T16:14:55.372 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.371+0000 7f6a64ff9640 1 -- 192.168.123.103:0/2682134274 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 29) v1 ==== 100078+0+0 (secure 0 0 0) 0x7f6a5800e830 con 0x7f6a68071a70 2026-03-09T16:14:55.372 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.371+0000 7f6a64ff9640 1 --2- 192.168.123.103:0/2682134274 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f6a54077750 0x7f6a54079c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:55.372 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.372+0000 7f6a64ff9640 1 -- 192.168.123.103:0/2682134274 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f6a58099100 con 0x7f6a68071a70 2026-03-09T16:14:55.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.373+0000 7f6a66ffd640 1 --2- 192.168.123.103:0/2682134274 >> 
[v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f6a54077750 0x7f6a54079c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:55.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.373+0000 7f6a64ff9640 1 -- 192.168.123.103:0/2682134274 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f6a58061b30 con 0x7f6a68071a70 2026-03-09T16:14:55.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.374+0000 7f6a66ffd640 1 --2- 192.168.123.103:0/2682134274 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f6a54077750 0x7f6a54079c10 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f6a68083e10 tx=0x7f6a60031f30 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:55.377 INFO:tasks.workunit.client.0.vm03.stdout:2/872: creat db/d12/d11a/f12d x:0 0 0 2026-03-09T16:14:55.382 INFO:tasks.workunit.client.0.vm03.stdout:9/972: dwrite d2/d54/d7d/d8f/dad/def/f9f [0,4194304] 0 2026-03-09T16:14:55.388 INFO:tasks.workunit.client.0.vm03.stdout:2/873: sync 2026-03-09T16:14:55.402 INFO:tasks.workunit.client.0.vm03.stdout:3/879: creat d5/d1e/d42/d4c/f106 x:0 0 0 2026-03-09T16:14:55.403 INFO:tasks.workunit.client.0.vm03.stdout:0/907: truncate d0/d7/d48/f2e 3878288 0 2026-03-09T16:14:55.405 INFO:tasks.workunit.client.0.vm03.stdout:0/908: fsync d0/d7/d3e/d57/d5a/d82/d89/dbd/f7e 0 2026-03-09T16:14:55.407 INFO:tasks.workunit.client.0.vm03.stdout:6/828: dwrite d9/d14/f29 [4194304,4194304] 0 2026-03-09T16:14:55.413 INFO:tasks.workunit.client.0.vm03.stdout:4/905: dwrite d5/db/d25/d8b/da8/f62 [0,4194304] 0 2026-03-09T16:14:55.414 INFO:tasks.workunit.client.0.vm03.stdout:7/857: symlink d4/da/d5d/db0/d61/dbc/l11b 0 2026-03-09T16:14:55.415 INFO:tasks.workunit.client.0.vm03.stdout:9/973: creat d2/d4/d11/d12/dc7/dee/dc2/de9/f129 x:0 0 0 2026-03-09T16:14:55.419 INFO:tasks.workunit.client.0.vm03.stdout:9/974: fdatasync d2/d4/d11/d29/d2a/d38/dcd/f119 0 2026-03-09T16:14:55.434 INFO:tasks.workunit.client.0.vm03.stdout:2/874: rename db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/lcf to db/d12/d2a/d99/de7/df9/d64/dbd/l12e 0 2026-03-09T16:14:55.436 INFO:tasks.workunit.client.0.vm03.stdout:2/875: stat db/d12/d2a/d99/fd4 0 2026-03-09T16:14:55.437 INFO:tasks.workunit.client.0.vm03.stdout:2/876: chown db/d12/da5/dc2 2597764 1 2026-03-09T16:14:55.450 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:55 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:55.450 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:55 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:55.450 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:55 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:14:55.450 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:55 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:14:55.450 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:55 vm03.local ceph-mon[51019]: from='mgr.24403 
192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T16:14:55.450 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:55 vm03.local ceph-mon[51019]: Updating vm03:/etc/ceph/ceph.conf
2026-03-09T16:14:55.450 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:55 vm03.local ceph-mon[51019]: Updating vm05:/etc/ceph/ceph.conf
2026-03-09T16:14:55.450 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:55 vm03.local ceph-mon[51019]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf
2026-03-09T16:14:55.450 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:55 vm03.local ceph-mon[51019]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf
2026-03-09T16:14:55.450 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:55 vm03.local ceph-mon[51019]: from='client.24431 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T16:14:55.450 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:55 vm03.local ceph-mon[51019]: Updating vm05:/etc/ceph/ceph.client.admin.keyring
2026-03-09T16:14:55.450 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:55 vm03.local ceph-mon[51019]: pgmap v7: 65 pgs: 65 active+clean; 1.9 GiB data, 6.4 GiB used, 114 GiB / 120 GiB avail; 33 MiB/s rd, 51 MiB/s wr, 172 op/s
2026-03-09T16:14:55.464 INFO:tasks.workunit.client.0.vm03.stdout:5/941: dwrite d2/fd4 [0,4194304] 0
2026-03-09T16:14:55.482 INFO:tasks.workunit.client.0.vm03.stdout:8/935: write da/d10/d28/fd3 [1082409,45841] 0
2026-03-09T16:14:55.483 INFO:tasks.workunit.client.0.vm03.stdout:8/936: fdatasync da/db/da8/db8/f12a 0
2026-03-09T16:14:55.517 INFO:tasks.workunit.client.0.vm03.stdout:1/809: dwrite d4/d6/d3b/d63/f78 [0,4194304] 0
2026-03-09T16:14:55.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.582+0000 7f6a6dff2640 1 -- 192.168.123.103:0/2682134274 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f6a68075c90 con 0x7f6a54077750
2026-03-09T16:14:55.592 INFO:tasks.workunit.client.0.vm03.stdout:0/909: dwrite d0/da/d1b/d9b/ff3 [0,4194304] 0
2026-03-09T16:14:55.598 INFO:tasks.workunit.client.0.vm03.stdout:7/858: creat d4/da/d5d/db0/d61/dca/f11c x:0 0 0
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.598+0000 7f6a64ff9640 1 -- 192.168.123.103:0/2682134274 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f6a68075c90 con 0x7f6a54077750
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (4m) 4s ago 4m 24.9M - 0.25.0 c8568f914cd2 062551060e4c
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (4m) 4s ago 4m 8921k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (4m) 6s ago 4m 8762k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (4m) 4s ago 4m 7625k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 05e9be717970
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (4m) 6s ago 4m 7608k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 32f80ccecaa9
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (4m) 4s ago 4m 88.3M - 9.4.7 954c08fa6188 9b9ef5226e00
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (2m) 4s ago 2m 16.7M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8e7e3eb06891
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (2m) 4s ago 2m 233M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 f23b1415c23e
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (2m) 6s ago 2m 14.7M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 fbf69f4859f1
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (2m) 6s ago 2m 16.6M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 e7155e6e0a47
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:8443,9283,8765 running (19s) 4s ago 5m 576M - 19.2.3-678-ge911bdeb 654f31e6858e f10e9f43c355
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (41s) 6s ago 4m 501M - 19.2.3-678-ge911bdeb 654f31e6858e b47787a071c8
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (5m) 4s ago 5m 54.1M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 b86752d320b6
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (4m) 6s ago 4m 46.4M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 90242efb0978
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (4m) 4s ago 4m 14.3M - 1.5.0 0da6a335fe13 8c7f00e55632
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (4m) 6s ago 4m 14.9M - 1.5.0 0da6a335fe13 4c3ab3bdf8cf
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (3m) 4s ago 3m 238M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2ea78f0d62f8
2026-03-09T16:14:55.599 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (3m) 4s ago 3m 247M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6169f9824413
2026-03-09T16:14:55.600 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (3m) 4s ago 3m 200M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 31188175e77b
2026-03-09T16:14:55.600 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (3m) 6s ago 3m 257M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d95aab347c9f
2026-03-09T16:14:55.600 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (3m) 6s ago 3m 214M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 5076005b452d
2026-03-09T16:14:55.600 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (3m) 6s ago 3m 213M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 56fb3849b087
2026-03-09T16:14:55.600 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (23s) 4s ago 4m 37.8M - 2.43.0 a07b618ecd1d 8dff9dfb84c9
2026-03-09T16:14:55.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.601+0000 7f6a6dff2640 1 -- 192.168.123.103:0/2682134274 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f6a54077750 msgr2=0x7f6a54079c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:14:55.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.601+0000 7f6a6dff2640 1 --2-
192.168.123.103:0/2682134274 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f6a54077750 0x7f6a54079c10 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f6a68083e10 tx=0x7f6a60031f30 comp rx=0 tx=0).stop 2026-03-09T16:14:55.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.601+0000 7f6a6dff2640 1 -- 192.168.123.103:0/2682134274 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a68071a70 msgr2=0x7f6a68084090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:55.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.601+0000 7f6a6dff2640 1 --2- 192.168.123.103:0/2682134274 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a68071a70 0x7f6a68084090 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f6a58009870 tx=0x7f6a58009d40 comp rx=0 tx=0).stop 2026-03-09T16:14:55.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.601+0000 7f6a6dff2640 1 -- 192.168.123.103:0/2682134274 shutdown_connections 2026-03-09T16:14:55.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.601+0000 7f6a6dff2640 1 --2- 192.168.123.103:0/2682134274 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f6a54077750 0x7f6a54079c10 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.601+0000 7f6a6dff2640 1 --2- 192.168.123.103:0/2682134274 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a680826e0 0x7f6a68082b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.601+0000 7f6a6dff2640 1 --2- 192.168.123.103:0/2682134274 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a68071a70 0x7f6a68084090 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.601+0000 7f6a6dff2640 1 -- 192.168.123.103:0/2682134274 >> 192.168.123.103:0/2682134274 conn(0x7f6a6806d4f0 msgr2=0x7f6a68073150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:55.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.602+0000 7f6a6dff2640 1 -- 192.168.123.103:0/2682134274 shutdown_connections 2026-03-09T16:14:55.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.602+0000 7f6a6dff2640 1 -- 192.168.123.103:0/2682134274 wait complete. 
2026-03-09T16:14:55.613 INFO:tasks.workunit.client.0.vm03.stdout:6/829: creat d9/d42/d45/dfa/f10c x:0 0 0 2026-03-09T16:14:55.613 INFO:tasks.workunit.client.0.vm03.stdout:4/906: mknod d5/db/d25/d8b/da8/df3/df7/d4d/d5b/c112 0 2026-03-09T16:14:55.620 INFO:tasks.workunit.client.0.vm03.stdout:5/942: creat d2/d7/de/d11/d19/dbb/f13a x:0 0 0 2026-03-09T16:14:55.620 INFO:tasks.workunit.client.0.vm03.stdout:8/937: creat da/d10/d28/d64/f136 x:0 0 0 2026-03-09T16:14:55.630 INFO:tasks.workunit.client.0.vm03.stdout:3/880: fdatasync d5/d1e/f31 0 2026-03-09T16:14:55.657 INFO:tasks.workunit.client.0.vm03.stdout:1/810: write d4/d6/d1d/d20/d23/f9f [4595758,102897] 0 2026-03-09T16:14:55.680 INFO:tasks.workunit.client.0.vm03.stdout:9/975: rename d2/d54/d7d/d8f/dad/def/f76 to d2/d4/d11/d12/dc7/dee/dce/f12a 0 2026-03-09T16:14:55.687 INFO:tasks.workunit.client.0.vm03.stdout:6/830: write d9/d42/f9a [396661,23112] 0 2026-03-09T16:14:55.700 INFO:tasks.workunit.client.0.vm03.stdout:4/907: creat d5/d17/db7/f113 x:0 0 0 2026-03-09T16:14:55.701 INFO:tasks.workunit.client.0.vm03.stdout:4/908: rename d5/db to d5/db/d25/d8b/da8/df3/df7/d33/d79/d101/d114 22 2026-03-09T16:14:55.702 INFO:tasks.workunit.client.0.vm03.stdout:9/976: dread d2/d4/d11/d29/d2a/d46/f9e [0,4194304] 0 2026-03-09T16:14:55.707 INFO:tasks.workunit.client.0.vm03.stdout:9/977: dread - d2/d4/d11/d29/d2a/d46/ff2 zero size 2026-03-09T16:14:55.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7286e39640 1 -- 192.168.123.103:0/566802524 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7280072440 msgr2=0x7f72800771b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:55.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7286e39640 1 --2- 192.168.123.103:0/566802524 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7280072440 0x7f72800771b0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f7278009040 tx=0x7f727802fc10 comp rx=0 tx=0).stop 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7286e39640 1 -- 192.168.123.103:0/566802524 shutdown_connections 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7286e39640 1 --2- 192.168.123.103:0/566802524 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7280072440 0x7f72800771b0 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7286e39640 1 --2- 192.168.123.103:0/566802524 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7280071a70 0x7f7280071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7286e39640 1 -- 192.168.123.103:0/566802524 >> 192.168.123.103:0/566802524 conn(0x7f728006d4f0 msgr2=0x7f728006f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7286e39640 1 -- 192.168.123.103:0/566802524 shutdown_connections 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7286e39640 1 -- 192.168.123.103:0/566802524 wait complete. 
2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7286e39640 1 Processor -- start 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7286e39640 1 -- start start 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7286e39640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7280071a70 0x7f7280083fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7286e39640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7280082690 0x7f7280082b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7286e39640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7280084510 con 0x7f7280082690 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7286e39640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7280083050 con 0x7f7280071a70 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.718+0000 7f7284bae640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7280071a70 0x7f7280083fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.719+0000 7f7284bae640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7280071a70 0x7f7280083fd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:58150/0 (socket says 192.168.123.103:58150) 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.719+0000 7f7284bae640 1 -- 192.168.123.103:0/406955774 learned_addr learned my addr 192.168.123.103:0/406955774 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.719+0000 7f727ffff640 1 --2- 192.168.123.103:0/406955774 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7280082690 0x7f7280082b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.719+0000 7f7284bae640 1 -- 192.168.123.103:0/406955774 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7280082690 msgr2=0x7f7280082b10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.719+0000 7f7284bae640 1 --2- 192.168.123.103:0/406955774 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7280082690 0x7f7280082b10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.719+0000 7f7284bae640 1 -- 192.168.123.103:0/406955774 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7278008cf0 con 
0x7f7280071a70 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.719+0000 7f7284bae640 1 --2- 192.168.123.103:0/406955774 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7280071a70 0x7f7280083fd0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f7270009870 tx=0x7f7270009d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:55.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.719+0000 7f727dffb640 1 -- 192.168.123.103:0/406955774 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7270010040 con 0x7f7280071a70 2026-03-09T16:14:55.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.719+0000 7f7286e39640 1 -- 192.168.123.103:0/406955774 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7280083330 con 0x7f7280071a70 2026-03-09T16:14:55.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.720+0000 7f7286e39640 1 -- 192.168.123.103:0/406955774 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f728012ef70 con 0x7f7280071a70 2026-03-09T16:14:55.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.720+0000 7f727dffb640 1 -- 192.168.123.103:0/406955774 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f727000ecf0 con 0x7f7280071a70 2026-03-09T16:14:55.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.720+0000 7f727dffb640 1 -- 192.168.123.103:0/406955774 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7270002cf0 con 0x7f7280071a70 2026-03-09T16:14:55.721 INFO:tasks.workunit.client.0.vm03.stdout:2/877: dwrite db/d12/f77 [0,4194304] 0 2026-03-09T16:14:55.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.722+0000 7f727dffb640 1 -- 192.168.123.103:0/406955774 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 29) v1 ==== 100078+0+0 (secure 0 0 0) 0x7f727001e3b0 con 0x7f7280071a70 2026-03-09T16:14:55.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.722+0000 7f727dffb640 1 --2- 192.168.123.103:0/406955774 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f72680776b0 0x7f7268079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:55.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.723+0000 7f727dffb640 1 -- 192.168.123.103:0/406955774 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f7270099f20 con 0x7f7280071a70 2026-03-09T16:14:55.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.723+0000 7f7286e39640 1 -- 192.168.123.103:0/406955774 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7244005350 con 0x7f7280071a70 2026-03-09T16:14:55.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.726+0000 7f727ffff640 1 --2- 192.168.123.103:0/406955774 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f72680776b0 0x7f7268079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:55.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.727+0000 7f727dffb640 1 -- 
192.168.123.103:0/406955774 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f72700628d0 con 0x7f7280071a70 2026-03-09T16:14:55.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.732+0000 7f727ffff640 1 --2- 192.168.123.103:0/406955774 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f72680776b0 0x7f7268079b70 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f7278002790 tx=0x7f7278007480 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:55.743 INFO:tasks.workunit.client.0.vm03.stdout:5/943: dwrite d2/d7/d115/f36 [0,4194304] 0 2026-03-09T16:14:55.755 INFO:tasks.workunit.client.0.vm03.stdout:8/938: truncate da/d32/d133/d80/f58 910281 0 2026-03-09T16:14:55.765 INFO:tasks.workunit.client.0.vm03.stdout:0/910: symlink d0/d7/d3e/l138 0 2026-03-09T16:14:55.775 INFO:tasks.workunit.client.0.vm03.stdout:3/881: readlink d5/d53/d6c/d79/dd9/lfa 0 2026-03-09T16:14:55.776 INFO:tasks.workunit.client.0.vm03.stdout:3/882: readlink d5/d6d/l5e 0 2026-03-09T16:14:55.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:55 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:55.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:55 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:55.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:55 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:14:55.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:55 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:14:55.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:55 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:14:55.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:55 vm05.local ceph-mon[58702]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T16:14:55.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:55 vm05.local ceph-mon[58702]: Updating vm05:/etc/ceph/ceph.conf 2026-03-09T16:14:55.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:55 vm05.local ceph-mon[58702]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:14:55.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:55 vm05.local ceph-mon[58702]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:14:55.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:55 vm05.local ceph-mon[58702]: from='client.24431 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:55.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:55 vm05.local ceph-mon[58702]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:14:55.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:55 vm05.local ceph-mon[58702]: pgmap v7: 65 pgs: 65 active+clean; 1.9 GiB data, 6.4 GiB used, 114 GiB / 120 GiB avail; 33 MiB/s rd, 51 MiB/s wr, 
172 op/s 2026-03-09T16:14:55.787 INFO:tasks.workunit.client.0.vm03.stdout:6/831: mkdir d9/d14/da5/dd8/d10d 0 2026-03-09T16:14:55.816 INFO:tasks.workunit.client.0.vm03.stdout:2/878: rmdir db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad 39 2026-03-09T16:14:55.817 INFO:tasks.workunit.client.0.vm03.stdout:5/944: read - d2/f118 zero size 2026-03-09T16:14:55.817 INFO:tasks.workunit.client.0.vm03.stdout:2/879: stat db/d12/d2a/d61/l6f 0 2026-03-09T16:14:55.818 INFO:tasks.workunit.client.0.vm03.stdout:2/880: chown db/d12/d2a/d99/de7/df9/d64/dbd/cbc 730734 1 2026-03-09T16:14:55.821 INFO:tasks.workunit.client.0.vm03.stdout:8/939: rename da/db/da8/f107 to da/d32/df1/f137 0 2026-03-09T16:14:55.821 INFO:tasks.workunit.client.0.vm03.stdout:7/859: creat d4/da/d5d/dd8/d22/d24/f11d x:0 0 0 2026-03-09T16:14:55.822 INFO:tasks.workunit.client.0.vm03.stdout:7/860: chown d4/da/f42 0 1 2026-03-09T16:14:55.824 INFO:tasks.workunit.client.0.vm03.stdout:6/832: unlink d9/lb1 0 2026-03-09T16:14:55.824 INFO:tasks.workunit.client.0.vm03.stdout:4/909: mknod d5/db/d25/d8b/da8/df3/df7/d4d/c115 0 2026-03-09T16:14:55.825 INFO:tasks.workunit.client.0.vm03.stdout:6/833: chown d9/d42/d45/d50/d80/d8a/d9c/d97/da8/c6b 7071639 1 2026-03-09T16:14:55.829 INFO:tasks.workunit.client.0.vm03.stdout:5/945: symlink d2/d7/d3c/d3d/l13b 0 2026-03-09T16:14:55.829 INFO:tasks.workunit.client.0.vm03.stdout:7/861: dread d4/da/d5d/dd8/f6a [0,4194304] 0 2026-03-09T16:14:55.839 INFO:tasks.workunit.client.0.vm03.stdout:0/911: rename d0/d7/d3e/d57/d5a/d47/dce/df6 to d0/da/d5c/db6/d139 0 2026-03-09T16:14:55.839 INFO:tasks.workunit.client.0.vm03.stdout:1/811: dread d4/d6/d3b/f98 [0,4194304] 0 2026-03-09T16:14:55.857 INFO:tasks.workunit.client.0.vm03.stdout:4/910: write d5/d17/da0/fb9 [4966983,28062] 0 2026-03-09T16:14:55.866 INFO:tasks.workunit.client.0.vm03.stdout:2/881: symlink db/d12/d2a/d99/de7/df9/d64/dbd/da0/db6/l12f 0 2026-03-09T16:14:55.879 INFO:tasks.workunit.client.0.vm03.stdout:7/862: creat d4/da/dbf/f11e x:0 0 0 2026-03-09T16:14:55.882 INFO:tasks.workunit.client.0.vm03.stdout:5/946: chown d2/d7/d115/d16/l5f 2442473 1 2026-03-09T16:14:55.891 INFO:tasks.workunit.client.0.vm03.stdout:3/883: rename d5/f33 to d5/d6d/db9/df2/dbe/f107 0 2026-03-09T16:14:55.892 INFO:tasks.workunit.client.0.vm03.stdout:0/912: write d0/d7/d3e/d57/fdb [558143,12460] 0 2026-03-09T16:14:55.892 INFO:tasks.workunit.client.0.vm03.stdout:0/913: readlink d0/da/d5c/l49 0 2026-03-09T16:14:55.894 INFO:tasks.workunit.client.0.vm03.stdout:3/884: write d5/d53/d88/dd3/fba [5033124,51360] 0 2026-03-09T16:14:55.902 INFO:tasks.workunit.client.0.vm03.stdout:6/834: truncate f7 1579840 0 2026-03-09T16:14:55.903 INFO:tasks.workunit.client.0.vm03.stdout:6/835: stat d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/l26 0 2026-03-09T16:14:55.906 INFO:tasks.workunit.client.0.vm03.stdout:7/863: sync 2026-03-09T16:14:55.907 INFO:tasks.workunit.client.0.vm03.stdout:7/864: fdatasync d4/da/d5d/db0/d113/db8/ddf/ffe 0 2026-03-09T16:14:55.919 INFO:tasks.workunit.client.0.vm03.stdout:4/911: rename d5/d56/f111 to d5/d17/d44/f116 0 2026-03-09T16:14:55.935 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.934+0000 7f7286e39640 1 -- 192.168.123.103:0/406955774 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f72440058d0 con 0x7f7280071a70 2026-03-09T16:14:55.936 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.936+0000 7f727dffb640 1 -- 192.168.123.103:0/406955774 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": 
"versions"}]=0 v0) v1 ==== 56+0+739 (secure 0 0 0) 0x7f727001a070 con 0x7f7280071a70 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 12, 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T16:14:55.937 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:14:55.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.943+0000 7f72577fe640 1 -- 192.168.123.103:0/406955774 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f72680776b0 msgr2=0x7f7268079b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:55.945 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.943+0000 7f72577fe640 1 --2- 192.168.123.103:0/406955774 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f72680776b0 0x7f7268079b70 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f7278002790 tx=0x7f7278007480 comp rx=0 tx=0).stop 2026-03-09T16:14:55.945 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.943+0000 7f72577fe640 1 -- 192.168.123.103:0/406955774 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7280071a70 msgr2=0x7f7280083fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:55.945 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.943+0000 7f72577fe640 1 --2- 192.168.123.103:0/406955774 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7280071a70 0x7f7280083fd0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f7270009870 tx=0x7f7270009d40 comp rx=0 tx=0).stop 2026-03-09T16:14:55.945 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.944+0000 7f72577fe640 1 -- 192.168.123.103:0/406955774 shutdown_connections 2026-03-09T16:14:55.945 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.944+0000 7f72577fe640 1 --2- 192.168.123.103:0/406955774 >> 
[v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f72680776b0 0x7f7268079b70 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.945 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.944+0000 7f72577fe640 1 --2- 192.168.123.103:0/406955774 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7280082690 0x7f7280082b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.945 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.944+0000 7f72577fe640 1 --2- 192.168.123.103:0/406955774 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7280071a70 0x7f7280083fd0 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:55.945 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.944+0000 7f72577fe640 1 -- 192.168.123.103:0/406955774 >> 192.168.123.103:0/406955774 conn(0x7f728006d4f0 msgr2=0x7f728006e3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:55.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.944+0000 7f72577fe640 1 -- 192.168.123.103:0/406955774 shutdown_connections 2026-03-09T16:14:55.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:55.944+0000 7f72577fe640 1 -- 192.168.123.103:0/406955774 wait complete. 2026-03-09T16:14:55.949 INFO:tasks.workunit.client.0.vm03.stdout:2/882: write db/d12/d2a/d99/de7/df9/d64/ff6 [3782673,105380] 0 2026-03-09T16:14:55.956 INFO:tasks.workunit.client.0.vm03.stdout:0/914: dread d0/d7/d3e/d57/d5a/d82/d89/dbd/fc7 [0,4194304] 0 2026-03-09T16:14:55.967 INFO:tasks.workunit.client.0.vm03.stdout:3/885: creat d5/d6d/f108 x:0 0 0 2026-03-09T16:14:55.967 INFO:tasks.workunit.client.0.vm03.stdout:8/940: link da/d32/d79/f90 da/d32/db5/d126/f138 0 2026-03-09T16:14:55.970 INFO:tasks.workunit.client.0.vm03.stdout:3/886: chown d5/d53/d6c/d79/cd1 256982765 1 2026-03-09T16:14:55.971 INFO:tasks.workunit.client.0.vm03.stdout:9/978: link d2/d4/d11/d29/d2a/d46/dd6/d111/f11d d2/f12b 0 2026-03-09T16:14:55.971 INFO:tasks.workunit.client.0.vm03.stdout:1/812: creat d4/f106 x:0 0 0 2026-03-09T16:14:55.978 INFO:tasks.workunit.client.0.vm03.stdout:4/912: creat d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/d77/f117 x:0 0 0 2026-03-09T16:14:55.982 INFO:tasks.workunit.client.0.vm03.stdout:6/836: write d9/d42/d45/d50/d80/d8a/d9c/fe7 [1020923,50785] 0 2026-03-09T16:14:55.982 INFO:tasks.workunit.client.0.vm03.stdout:4/913: read d5/db/d25/d8b/da8/df3/df7/d33/d79/fa6 [1991648,4781] 0 2026-03-09T16:14:55.994 INFO:tasks.workunit.client.0.vm03.stdout:0/915: fdatasync d0/d7/d3e/d57/d5a/f38 0 2026-03-09T16:14:55.995 INFO:tasks.workunit.client.0.vm03.stdout:3/887: creat d5/f109 x:0 0 0 2026-03-09T16:14:55.995 INFO:tasks.workunit.client.0.vm03.stdout:1/813: rmdir d4/d6/d1d 39 2026-03-09T16:14:55.999 INFO:tasks.workunit.client.0.vm03.stdout:4/914: creat d5/db/d25/d8b/da8/df3/df7/d4d/d5b/f118 x:0 0 0 2026-03-09T16:14:56.001 INFO:tasks.workunit.client.0.vm03.stdout:8/941: mkdir da/d32/d133/d139 0 2026-03-09T16:14:56.018 INFO:tasks.workunit.client.0.vm03.stdout:5/947: rename d2/d7/de9/f103 to d2/d7/de/d11/d19/f13c 0 2026-03-09T16:14:56.019 INFO:tasks.workunit.client.0.vm03.stdout:1/814: dwrite d4/d6/d3b/f36 [4194304,4194304] 0 2026-03-09T16:14:56.024 INFO:tasks.workunit.client.0.vm03.stdout:8/942: readlink da/d10/d28/d4f/d85/d9c/d10e/d105/l108 0 2026-03-09T16:14:56.025 INFO:tasks.workunit.client.0.vm03.stdout:4/915: fdatasync d5/db/d25/d8b/da8/dbe/fd0 0 
2026-03-09T16:14:56.025 INFO:tasks.workunit.client.0.vm03.stdout:5/948: truncate d2/d7/de/d11/d19/dbb/f13a 156205 0 2026-03-09T16:14:56.029 INFO:tasks.workunit.client.0.vm03.stdout:9/979: rename d2/d54/d7d/c9c to d2/d54/d7d/dd3/c12c 0 2026-03-09T16:14:56.031 INFO:tasks.workunit.client.0.vm03.stdout:2/883: write db/d12/da5/dc2/dc9/ff1 [74976,109503] 0 2026-03-09T16:14:56.047 INFO:tasks.workunit.client.0.vm03.stdout:5/949: truncate d2/fdf 396613 0 2026-03-09T16:14:56.050 INFO:tasks.workunit.client.0.vm03.stdout:6/837: rename d9/d42/d45/d50/d80/d8a/fe9 to d9/d14/d71/f10e 0 2026-03-09T16:14:56.065 INFO:tasks.workunit.client.0.vm03.stdout:1/815: mkdir d4/d6/d107 0 2026-03-09T16:14:56.066 INFO:tasks.workunit.client.0.vm03.stdout:3/888: link d5/d6d/l5e d5/d1e/d42/d8b/ddf/l10a 0 2026-03-09T16:14:56.067 INFO:tasks.workunit.client.0.vm03.stdout:3/889: stat d5/d1e/d42/f99 0 2026-03-09T16:14:56.069 INFO:tasks.workunit.client.0.vm03.stdout:8/943: mknod da/db/c13a 0 2026-03-09T16:14:56.070 INFO:tasks.workunit.client.0.vm03.stdout:8/944: write da/d32/d79/f11e [995789,77434] 0 2026-03-09T16:14:56.073 INFO:tasks.workunit.client.0.vm03.stdout:6/838: chown d9/d14/c49 1 1 2026-03-09T16:14:56.078 INFO:tasks.workunit.client.0.vm03.stdout:7/865: truncate d4/da/d5d/db0/d9d/dc9/fd0 4014510 0 2026-03-09T16:14:56.079 INFO:tasks.workunit.client.0.vm03.stdout:0/916: write d0/d7/d3e/d57/d5a/d5f/db2/fa2 [577367,23691] 0 2026-03-09T16:14:56.080 INFO:tasks.workunit.client.0.vm03.stdout:7/866: chown d4/da/d45/d51/d36/fc8 2531247 1 2026-03-09T16:14:56.080 INFO:tasks.workunit.client.0.vm03.stdout:0/917: dread - d0/d7/d3e/d57/d5a/d5f/db2/d8e/fcc zero size 2026-03-09T16:14:56.081 INFO:tasks.workunit.client.0.vm03.stdout:3/890: sync 2026-03-09T16:14:56.082 INFO:tasks.workunit.client.0.vm03.stdout:1/816: sync 2026-03-09T16:14:56.086 INFO:tasks.workunit.client.0.vm03.stdout:5/950: dread d2/d7/d1a/d1c/d3f/f92 [0,4194304] 0 2026-03-09T16:14:56.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.092+0000 7f7f3db95640 1 -- 192.168.123.103:0/1256068787 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f380719a0 msgr2=0x7f7f38071da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:56.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.092+0000 7f7f3db95640 1 --2- 192.168.123.103:0/1256068787 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f380719a0 0x7f7f38071da0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f7f240099b0 tx=0x7f7f2402f240 comp rx=0 tx=0).stop 2026-03-09T16:14:56.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.095+0000 7f7f3db95640 1 -- 192.168.123.103:0/1256068787 shutdown_connections 2026-03-09T16:14:56.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.095+0000 7f7f3db95640 1 --2- 192.168.123.103:0/1256068787 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f380722e0 0x7f7f38110d20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:56.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.095+0000 7f7f3db95640 1 --2- 192.168.123.103:0/1256068787 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f380719a0 0x7f7f38071da0 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:56.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.095+0000 7f7f3db95640 1 -- 192.168.123.103:0/1256068787 >> 192.168.123.103:0/1256068787 conn(0x7f7f3806d4f0 
msgr2=0x7f7f3806f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:56.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.095+0000 7f7f3db95640 1 -- 192.168.123.103:0/1256068787 shutdown_connections 2026-03-09T16:14:56.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.095+0000 7f7f3db95640 1 -- 192.168.123.103:0/1256068787 wait complete. 2026-03-09T16:14:56.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.096+0000 7f7f3db95640 1 Processor -- start 2026-03-09T16:14:56.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.096+0000 7f7f3db95640 1 -- start start 2026-03-09T16:14:56.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.096+0000 7f7f3db95640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f380719a0 0x7f7f381a2c50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:56.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.096+0000 7f7f3db95640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f380722e0 0x7f7f381a3190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:56.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.096+0000 7f7f3db95640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f381a3760 con 0x7f7f380719a0 2026-03-09T16:14:56.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.096+0000 7f7f3db95640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f381a38d0 con 0x7f7f380722e0 2026-03-09T16:14:56.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.097+0000 7f7f37fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f380722e0 0x7f7f381a3190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:56.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.097+0000 7f7f37fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f380722e0 0x7f7f381a3190 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:58176/0 (socket says 192.168.123.103:58176) 2026-03-09T16:14:56.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.097+0000 7f7f37fff640 1 -- 192.168.123.103:0/4271450033 learned_addr learned my addr 192.168.123.103:0/4271450033 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:14:56.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.097+0000 7f7f37fff640 1 -- 192.168.123.103:0/4271450033 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f380719a0 msgr2=0x7f7f381a2c50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:56.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.097+0000 7f7f37fff640 1 --2- 192.168.123.103:0/4271450033 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f380719a0 0x7f7f381a2c50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:56.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.097+0000 7f7f37fff640 1 -- 192.168.123.103:0/4271450033 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f7f24009660 con 0x7f7f380722e0 2026-03-09T16:14:56.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.097+0000 7f7f37fff640 1 --2- 192.168.123.103:0/4271450033 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f380722e0 0x7f7f381a3190 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f7f2800d8d0 tx=0x7f7f2800dda0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:56.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.098+0000 7f7f35ffb640 1 -- 192.168.123.103:0/4271450033 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7f28004490 con 0x7f7f380722e0 2026-03-09T16:14:56.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.098+0000 7f7f3db95640 1 -- 192.168.123.103:0/4271450033 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7f381a8360 con 0x7f7f380722e0 2026-03-09T16:14:56.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.098+0000 7f7f3db95640 1 -- 192.168.123.103:0/4271450033 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7f381a88b0 con 0x7f7f380722e0 2026-03-09T16:14:56.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.098+0000 7f7f35ffb640 1 -- 192.168.123.103:0/4271450033 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f7f2800bd00 con 0x7f7f380722e0 2026-03-09T16:14:56.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.098+0000 7f7f35ffb640 1 -- 192.168.123.103:0/4271450033 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7f28010460 con 0x7f7f380722e0 2026-03-09T16:14:56.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.100+0000 7f7f3db95640 1 -- 192.168.123.103:0/4271450033 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7f38113ee0 con 0x7f7f380722e0 2026-03-09T16:14:56.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.100+0000 7f7f35ffb640 1 -- 192.168.123.103:0/4271450033 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 29) v1 ==== 100078+0+0 (secure 0 0 0) 0x7f7f280027e0 con 0x7f7f380722e0 2026-03-09T16:14:56.101 INFO:tasks.workunit.client.0.vm03.stdout:4/916: rename d5/dd/dd5/ce5 to d5/dd/c119 0 2026-03-09T16:14:56.101 INFO:tasks.workunit.client.0.vm03.stdout:2/884: mkdir db/d12/d2a/d99/d109/d130 0 2026-03-09T16:14:56.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.101+0000 7f7f35ffb640 1 --2- 192.168.123.103:0/4271450033 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f7f04077680 0x7f7f04079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:14:56.103 INFO:tasks.workunit.client.0.vm03.stdout:4/917: write d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/d77/ffa [578438,36342] 0 2026-03-09T16:14:56.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.102+0000 7f7f35ffb640 1 -- 192.168.123.103:0/4271450033 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f7f280999d0 con 0x7f7f380722e0 2026-03-09T16:14:56.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.104+0000 7f7f3cb93640 1 --2- 192.168.123.103:0/4271450033 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f7f04077680 
0x7f7f04079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:14:56.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.104+0000 7f7f3cb93640 1 --2- 192.168.123.103:0/4271450033 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f7f04077680 0x7f7f04079b40 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f7f2402f750 tx=0x7f7f240047c0 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:14:56.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.107+0000 7f7f35ffb640 1 -- 192.168.123.103:0/4271450033 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f7f28062360 con 0x7f7f380722e0 2026-03-09T16:14:56.113 INFO:tasks.workunit.client.0.vm03.stdout:8/945: read - da/d10/d28/d4f/daf/dee/ff7 zero size 2026-03-09T16:14:56.116 INFO:tasks.workunit.client.0.vm03.stdout:9/980: dread d2/d4/d11/d12/d28/f2c [0,4194304] 0 2026-03-09T16:14:56.120 INFO:tasks.workunit.client.0.vm03.stdout:7/867: dread d4/da/d5d/dd8/f37 [0,4194304] 0 2026-03-09T16:14:56.120 INFO:tasks.workunit.client.0.vm03.stdout:9/981: sync 2026-03-09T16:14:56.120 INFO:tasks.workunit.client.0.vm03.stdout:9/982: sync 2026-03-09T16:14:56.122 INFO:tasks.workunit.client.0.vm03.stdout:6/839: chown d9/d42/d45/c5e 25003036 1 2026-03-09T16:14:56.124 INFO:tasks.workunit.client.0.vm03.stdout:0/918: fdatasync d0/da/d1b/d9b/f61 0 2026-03-09T16:14:56.126 INFO:tasks.workunit.client.0.vm03.stdout:1/817: rmdir d4/d39/deb 39 2026-03-09T16:14:56.146 INFO:tasks.workunit.client.0.vm03.stdout:3/891: rename d5/d53/d6c/fea to d5/d53/d6c/d79/d91/dc9/f10b 0 2026-03-09T16:14:56.150 INFO:tasks.workunit.client.0.vm03.stdout:3/892: dwrite d5/d6d/d6a/f8e [0,4194304] 0 2026-03-09T16:14:56.168 INFO:tasks.workunit.client.0.vm03.stdout:5/951: dwrite d2/d7/de/f78 [0,4194304] 0 2026-03-09T16:14:56.177 INFO:tasks.workunit.client.0.vm03.stdout:4/918: rmdir d5/db/d25/d8b/da8/d81/dd4 39 2026-03-09T16:14:56.177 INFO:tasks.workunit.client.0.vm03.stdout:4/919: fsync f1 0 2026-03-09T16:14:56.181 INFO:tasks.workunit.client.0.vm03.stdout:8/946: symlink da/d10/d28/d4f/daf/l13b 0 2026-03-09T16:14:56.183 INFO:tasks.workunit.client.0.vm03.stdout:7/868: creat d4/da/d45/d51/d36/d66/f11f x:0 0 0 2026-03-09T16:14:56.201 INFO:tasks.workunit.client.0.vm03.stdout:3/893: mkdir d5/d1e/d42/d8b/d10c 0 2026-03-09T16:14:56.208 INFO:tasks.workunit.client.0.vm03.stdout:1/818: dwrite d4/fa [4194304,4194304] 0 2026-03-09T16:14:56.213 INFO:tasks.workunit.client.0.vm03.stdout:2/885: mknod db/d12/d2a/d61/c131 0 2026-03-09T16:14:56.224 INFO:tasks.workunit.client.0.vm03.stdout:2/886: sync 2026-03-09T16:14:56.229 INFO:tasks.workunit.client.0.vm03.stdout:2/887: dwrite db/d12/f69 [0,4194304] 0 2026-03-09T16:14:56.232 INFO:tasks.workunit.client.0.vm03.stdout:2/888: chown db/d12/da5/de2 5533791 1 2026-03-09T16:14:56.235 INFO:tasks.workunit.client.0.vm03.stdout:2/889: dread - db/d12/d11a/f12d zero size 2026-03-09T16:14:56.243 INFO:tasks.workunit.client.0.vm03.stdout:2/890: truncate db/d12/da5/dc2/d110/f123 913401 0 2026-03-09T16:14:56.243 INFO:tasks.workunit.client.0.vm03.stdout:4/920: dread d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/d82/fe7 [0,4194304] 0 2026-03-09T16:14:56.255 INFO:tasks.workunit.client.0.vm03.stdout:7/869: dwrite d4/da/d45/d51/f91 [4194304,4194304] 0 2026-03-09T16:14:56.257 
INFO:tasks.workunit.client.0.vm03.stdout:8/947: dwrite da/d32/d133/d80/f2f [0,4194304] 0 2026-03-09T16:14:56.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.274+0000 7f7f3db95640 1 -- 192.168.123.103:0/4271450033 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7f3810f8a0 con 0x7f7f04077680 2026-03-09T16:14:56.283 INFO:tasks.workunit.client.0.vm03.stdout:0/919: mknod d0/da/d5c/c13a 0 2026-03-09T16:14:56.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.287+0000 7f7f35ffb640 1 -- 192.168.123.103:0/4271450033 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+329 (secure 0 0 0) 0x7f7f3810f8a0 con 0x7f7f04077680 2026-03-09T16:14:56.287 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:14:56.288 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T16:14:56.288 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T16:14:56.288 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-09T16:14:56.288 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-09T16:14:56.288 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-09T16:14:56.288 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-09T16:14:56.288 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "2/2 daemons upgraded", 2026-03-09T16:14:56.288 INFO:teuthology.orchestra.run.vm03.stdout: "message": "", 2026-03-09T16:14:56.288 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T16:14:56.288 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:14:56.289 INFO:tasks.workunit.client.0.vm03.stdout:9/983: write d2/d4/d11/d29/f4e [1170969,90622] 0 2026-03-09T16:14:56.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.290+0000 7f7f0b7fe640 1 -- 192.168.123.103:0/4271450033 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f7f04077680 msgr2=0x7f7f04079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:56.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.290+0000 7f7f0b7fe640 1 --2- 192.168.123.103:0/4271450033 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f7f04077680 0x7f7f04079b40 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f7f2402f750 tx=0x7f7f240047c0 comp rx=0 tx=0).stop 2026-03-09T16:14:56.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.290+0000 7f7f0b7fe640 1 -- 192.168.123.103:0/4271450033 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f380722e0 msgr2=0x7f7f381a3190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:14:56.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.290+0000 7f7f0b7fe640 1 --2- 192.168.123.103:0/4271450033 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f380722e0 0x7f7f381a3190 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f7f2800d8d0 tx=0x7f7f2800dda0 comp rx=0 tx=0).stop 2026-03-09T16:14:56.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.290+0000 7f7f0b7fe640 1 -- 192.168.123.103:0/4271450033 shutdown_connections 2026-03-09T16:14:56.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.290+0000 7f7f0b7fe640 1 --2- 
192.168.123.103:0/4271450033 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f7f04077680 0x7f7f04079b40 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:56.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.290+0000 7f7f0b7fe640 1 --2- 192.168.123.103:0/4271450033 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f380722e0 0x7f7f381a3190 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:56.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.290+0000 7f7f0b7fe640 1 --2- 192.168.123.103:0/4271450033 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f380719a0 0x7f7f381a2c50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:14:56.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.290+0000 7f7f0b7fe640 1 -- 192.168.123.103:0/4271450033 >> 192.168.123.103:0/4271450033 conn(0x7f7f3806d4f0 msgr2=0x7f7f3810f190 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:14:56.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.291+0000 7f7f0b7fe640 1 -- 192.168.123.103:0/4271450033 shutdown_connections 2026-03-09T16:14:56.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:14:56.291+0000 7f7f0b7fe640 1 -- 192.168.123.103:0/4271450033 wait complete. 2026-03-09T16:14:56.297 INFO:tasks.workunit.client.0.vm03.stdout:3/894: dread - d5/d53/d88/dd7/fee zero size 2026-03-09T16:14:56.307 INFO:tasks.workunit.client.0.vm03.stdout:1/819: chown d4/d6/d1d 665696 1 2026-03-09T16:14:56.312 INFO:tasks.workunit.client.0.vm03.stdout:5/952: mkdir d2/d13d 0 2026-03-09T16:14:56.316 INFO:tasks.workunit.client.0.vm03.stdout:2/891: readlink db/d12/d2a/l67 0 2026-03-09T16:14:56.319 INFO:tasks.workunit.client.0.vm03.stdout:4/921: truncate d5/d17/d44/f64 25464 0 2026-03-09T16:14:56.349 INFO:tasks.workunit.client.0.vm03.stdout:9/984: dwrite d2/de/f85 [0,4194304] 0 2026-03-09T16:14:56.353 INFO:tasks.workunit.client.0.vm03.stdout:3/895: dwrite d5/d2e/fec [0,4194304] 0 2026-03-09T16:14:56.372 INFO:tasks.workunit.client.0.vm03.stdout:5/953: unlink d2/d7/l89 0 2026-03-09T16:14:56.380 INFO:tasks.workunit.client.0.vm03.stdout:5/954: dread d2/d7/de/d11/d19/d29/d90/dbe/df5/f12b [0,4194304] 0 2026-03-09T16:14:56.381 INFO:tasks.workunit.client.0.vm03.stdout:5/955: write d2/d7/d115/f36 [5236566,79235] 0 2026-03-09T16:14:56.399 INFO:tasks.workunit.client.0.vm03.stdout:4/922: write d5/db/f34 [2075252,57307] 0 2026-03-09T16:14:56.405 INFO:tasks.workunit.client.0.vm03.stdout:6/840: getdents d9/d42/d45/ddf 0 2026-03-09T16:14:56.406 INFO:tasks.workunit.client.0.vm03.stdout:8/948: write da/d10/d28/f8c [791532,63824] 0 2026-03-09T16:14:56.408 INFO:tasks.workunit.client.0.vm03.stdout:0/920: chown d0/da/d7a/d98/ff9 711 1 2026-03-09T16:14:56.415 INFO:tasks.workunit.client.0.vm03.stdout:9/985: mknod d2/d4/d11/d12/dc7/dcc/c12d 0 2026-03-09T16:14:56.416 INFO:tasks.workunit.client.0.vm03.stdout:9/986: write d2/d4/d11/d12/dc7/dee/dc2/de9/ffb [1001517,37340] 0 2026-03-09T16:14:56.423 INFO:tasks.workunit.client.0.vm03.stdout:3/896: creat d5/d53/d6c/d79/d91/dc9/def/f10d x:0 0 0 2026-03-09T16:14:56.431 INFO:tasks.workunit.client.0.vm03.stdout:2/892: mknod db/d12/d2a/d99/de7/df9/c132 0 2026-03-09T16:14:56.437 INFO:tasks.workunit.client.0.vm03.stdout:5/956: dread d2/d75/f107 [0,4194304] 0 2026-03-09T16:14:56.638 INFO:tasks.workunit.client.0.vm03.stdout:4/923: creat d5/db/d25/d8b/da8/df3/f11a 
x:0 0 0 2026-03-09T16:14:56.639 INFO:tasks.workunit.client.0.vm03.stdout:4/924: readlink d5/dd/d1f/l67 0 2026-03-09T16:14:56.647 INFO:tasks.workunit.client.0.vm03.stdout:4/925: dwrite d5/db/d25/dc8/f10a [0,4194304] 0 2026-03-09T16:14:56.661 INFO:tasks.workunit.client.0.vm03.stdout:8/949: fdatasync da/db/f75 0 2026-03-09T16:14:56.672 INFO:tasks.workunit.client.0.vm03.stdout:9/987: unlink d2/d4/d1f/f23 0 2026-03-09T16:14:56.687 INFO:tasks.workunit.client.0.vm03.stdout:9/988: dread d2/d4/d11/d29/d2a/f58 [0,4194304] 0 2026-03-09T16:14:56.701 INFO:tasks.workunit.client.0.vm03.stdout:1/820: rename d4/d6/d3b/f35 to d4/d6/d3b/d6b/da5/f108 0 2026-03-09T16:14:56.712 INFO:tasks.workunit.client.0.vm03.stdout:5/957: truncate d2/d7/de/d11/d19/d31/d35/fd3 677367 0 2026-03-09T16:14:56.715 INFO:tasks.workunit.client.0.vm03.stdout:7/870: getdents d4/da/d45 0 2026-03-09T16:14:56.716 INFO:tasks.workunit.client.0.vm03.stdout:5/958: chown d2/d7/d115/d16/d5c/dfc/d106/d52/c55 5568 1 2026-03-09T16:14:56.719 INFO:tasks.workunit.client.0.vm03.stdout:6/841: mknod d9/d42/d45/d65/c10f 0 2026-03-09T16:14:56.720 INFO:tasks.workunit.client.0.vm03.stdout:6/842: chown d9/d42/d45/d50/d80/d8a/dc1/dd4 19 1 2026-03-09T16:14:56.728 INFO:tasks.workunit.client.0.vm03.stdout:9/989: truncate d2/d4/d11/d29/d2a/f8b 686706 0 2026-03-09T16:14:56.733 INFO:tasks.workunit.client.0.vm03.stdout:9/990: write d2/d4/d11/d29/d2a/db3/dbe/de0/f11f [1012174,77536] 0 2026-03-09T16:14:56.738 INFO:tasks.workunit.client.0.vm03.stdout:8/950: write da/d10/d28/fb0 [4775786,39769] 0 2026-03-09T16:14:56.744 INFO:tasks.workunit.client.0.vm03.stdout:5/959: rmdir d2/d7/de/d11/d19/d31 39 2026-03-09T16:14:56.746 INFO:tasks.workunit.client.0.vm03.stdout:6/843: write d9/d14/d71/f10e [1415749,87555] 0 2026-03-09T16:14:56.746 INFO:tasks.workunit.client.0.vm03.stdout:3/897: dwrite d5/d1e/d42/f1d [0,4194304] 0 2026-03-09T16:14:56.753 INFO:tasks.workunit.client.0.vm03.stdout:6/844: chown d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f27 302 1 2026-03-09T16:14:56.754 INFO:tasks.workunit.client.0.vm03.stdout:6/845: dread - d9/d42/d45/d50/d80/d90/db7/f106 zero size 2026-03-09T16:14:56.758 INFO:tasks.workunit.client.0.vm03.stdout:3/898: dwrite d5/d53/d88/dd7/df1/ffe [0,4194304] 0 2026-03-09T16:14:56.765 INFO:tasks.workunit.client.0.vm03.stdout:0/921: creat d0/da/f13b x:0 0 0 2026-03-09T16:14:56.772 INFO:tasks.workunit.client.0.vm03.stdout:1/821: mknod d4/db/c109 0 2026-03-09T16:14:56.773 INFO:tasks.workunit.client.0.vm03.stdout:8/951: fdatasync da/db/d43/fe5 0 2026-03-09T16:14:56.773 INFO:tasks.workunit.client.0.vm03.stdout:1/822: stat d4/fa 0 2026-03-09T16:14:56.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:56 vm05.local ceph-mon[58702]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:14:56.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:56 vm05.local ceph-mon[58702]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:14:56.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:56 vm05.local ceph-mon[58702]: from='client.24435 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:56.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:56 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:56.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:56 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' 
entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:56.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:56 vm05.local ceph-mon[58702]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:14:56.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:56 vm05.local ceph-mon[58702]: from='client.24439 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:56.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:56 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:56.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:56 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:56.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:56 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:56.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:56 vm05.local ceph-mon[58702]: from='client.? 192.168.123.103:0/406955774' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:14:56.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:56 vm05.local ceph-mon[58702]: Reconfiguring prometheus.vm03 (dependencies changed)... 2026-03-09T16:14:56.778 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:56 vm03.local ceph-mon[51019]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:14:56.778 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:56 vm03.local ceph-mon[51019]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:14:56.778 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:56 vm03.local ceph-mon[51019]: from='client.24435 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:56.778 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:56 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:56.778 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:56 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:56.778 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:56 vm03.local ceph-mon[51019]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:14:56.778 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:56 vm03.local ceph-mon[51019]: from='client.24439 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:56.778 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:56 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:56.778 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:56 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:56.778 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:56 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:56.778 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:56 vm03.local ceph-mon[51019]: from='client.? 
192.168.123.103:0/406955774' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:14:56.778 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:56 vm03.local ceph-mon[51019]: Reconfiguring prometheus.vm03 (dependencies changed)... 2026-03-09T16:14:56.779 INFO:tasks.workunit.client.0.vm03.stdout:1/823: dwrite d4/d6/da2/dea/ffc [0,4194304] 0 2026-03-09T16:14:56.798 INFO:tasks.workunit.client.0.vm03.stdout:5/960: truncate d2/d7/d115/d16/d5c/fb5 3037840 0 2026-03-09T16:14:56.813 INFO:tasks.workunit.client.0.vm03.stdout:6/846: mkdir d9/d42/d45/d50/d80/d8a/dc1/dd4/de5/dfe/d110 0 2026-03-09T16:14:56.821 INFO:tasks.workunit.client.0.vm03.stdout:0/922: fdatasync d0/d7/d3e/d57/d5a/d5f/db2/d8e/fcc 0 2026-03-09T16:14:56.822 INFO:tasks.workunit.client.0.vm03.stdout:4/926: truncate d5/f54 9654930 0 2026-03-09T16:14:56.826 INFO:tasks.workunit.client.0.vm03.stdout:7/871: dwrite d4/da/d5d/dd8/d22/d24/d16/d3e/db5/fcf [0,4194304] 0 2026-03-09T16:14:56.858 INFO:tasks.workunit.client.0.vm03.stdout:8/952: unlink da/d10/d28/d64/l11f 0 2026-03-09T16:14:56.858 INFO:tasks.workunit.client.0.vm03.stdout:8/953: stat da/db/d30/f94 0 2026-03-09T16:14:56.863 INFO:tasks.workunit.client.0.vm03.stdout:9/991: write d2/d4/d11/d29/f70 [761506,68937] 0 2026-03-09T16:14:56.873 INFO:tasks.workunit.client.0.vm03.stdout:1/824: write d4/f6d [5239083,46540] 0 2026-03-09T16:14:56.900 INFO:tasks.workunit.client.0.vm03.stdout:3/899: mknod d5/d53/d88/c10e 0 2026-03-09T16:14:56.902 INFO:tasks.workunit.client.0.vm03.stdout:3/900: fsync d5/d1e/d42/d34/dd2/ff7 0 2026-03-09T16:14:56.902 INFO:tasks.workunit.client.0.vm03.stdout:0/923: creat d0/d7/d3e/d57/d5a/d82/dd2/f13c x:0 0 0 2026-03-09T16:14:56.909 INFO:tasks.workunit.client.0.vm03.stdout:0/924: dwrite d0/d7/d3e/d57/d5a/d5f/db2/f112 [0,4194304] 0 2026-03-09T16:14:56.915 INFO:tasks.workunit.client.0.vm03.stdout:5/961: dwrite d2/d7/d115/d16/d5c/dfc/f116 [0,4194304] 0 2026-03-09T16:14:56.915 INFO:tasks.workunit.client.0.vm03.stdout:0/925: chown d0/d7/d3e/d57/d5a/d82/d89 1798 1 2026-03-09T16:14:56.954 INFO:tasks.workunit.client.0.vm03.stdout:7/872: truncate d4/da/d5d/db0/d61/f84 1725303 0 2026-03-09T16:14:56.954 INFO:tasks.workunit.client.0.vm03.stdout:7/873: chown d4/da/d5d/dd8/d22/d24/d15/la7 1 1 2026-03-09T16:14:56.955 INFO:tasks.workunit.client.0.vm03.stdout:7/874: write d4/da/d5d/db0/d61/f8b [1685043,57929] 0 2026-03-09T16:14:56.955 INFO:tasks.workunit.client.0.vm03.stdout:2/893: getdents db/d12/d2a/d99/de7/df9/d64/dbd/da0/db6 0 2026-03-09T16:14:56.965 INFO:tasks.workunit.client.0.vm03.stdout:8/954: mknod da/d10/d28/db1/dce/de8/c13c 0 2026-03-09T16:14:56.998 INFO:tasks.workunit.client.0.vm03.stdout:0/926: mkdir d0/d7/d3e/d57/d5a/d5f/db2/dab/d13d 0 2026-03-09T16:14:56.998 INFO:tasks.workunit.client.0.vm03.stdout:7/875: symlink d4/d2d/d4b/l120 0 2026-03-09T16:14:56.998 INFO:tasks.workunit.client.0.vm03.stdout:8/955: symlink da/d32/d133/ddc/l13d 0 2026-03-09T16:14:56.999 INFO:tasks.workunit.client.0.vm03.stdout:3/901: symlink d5/d44/l10f 0 2026-03-09T16:14:57.001 INFO:tasks.workunit.client.0.vm03.stdout:7/876: write d4/d2d/f90 [756288,62401] 0 2026-03-09T16:14:57.017 INFO:tasks.workunit.client.0.vm03.stdout:3/902: dwrite d5/d6d/f108 [0,4194304] 0 2026-03-09T16:14:57.019 INFO:tasks.workunit.client.0.vm03.stdout:2/894: dread db/d12/f49 [0,4194304] 0 2026-03-09T16:14:57.019 INFO:tasks.workunit.client.0.vm03.stdout:8/956: dread da/d32/d79/f90 [0,4194304] 0 2026-03-09T16:14:57.021 INFO:tasks.workunit.client.0.vm03.stdout:8/957: fdatasync da/d10/d28/d64/f136 0 
2026-03-09T16:14:57.038 INFO:tasks.workunit.client.0.vm03.stdout:9/992: rmdir d2/d4/d11/d29/d2a/d46/dd6/dd9 0 2026-03-09T16:14:57.040 INFO:tasks.workunit.client.0.vm03.stdout:6/847: link d9/d42/d45/d50/d80/d8a/cfb d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/c111 0 2026-03-09T16:14:57.047 INFO:tasks.workunit.client.0.vm03.stdout:4/927: getdents d5/db/d25/d8b 0 2026-03-09T16:14:57.051 INFO:tasks.workunit.client.0.vm03.stdout:4/928: truncate d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d9a/fde 225381 0 2026-03-09T16:14:57.053 INFO:tasks.workunit.client.0.vm03.stdout:4/929: fsync d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/d77/f117 0 2026-03-09T16:14:57.061 INFO:tasks.workunit.client.0.vm03.stdout:3/903: creat d5/d53/d6c/d79/f110 x:0 0 0 2026-03-09T16:14:57.061 INFO:tasks.workunit.client.0.vm03.stdout:4/930: chown d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/d77/f91 1 1 2026-03-09T16:14:57.062 INFO:tasks.workunit.client.0.vm03.stdout:3/904: write d5/fb3 [3781269,46999] 0 2026-03-09T16:14:57.072 INFO:tasks.workunit.client.0.vm03.stdout:0/927: dread d0/d7/d3e/d57/d5a/d52/d9f/fd3 [0,4194304] 0 2026-03-09T16:14:57.075 INFO:tasks.workunit.client.0.vm03.stdout:0/928: truncate d0/d7/d3e/d57/d5a/d82/dd2/f13c 806684 0 2026-03-09T16:14:57.076 INFO:tasks.workunit.client.0.vm03.stdout:9/993: read d2/d4/d11/d12/dc7/dee/dc2/fdd [3490929,27549] 0 2026-03-09T16:14:57.077 INFO:tasks.workunit.client.0.vm03.stdout:9/994: write d2/d54/d7d/f125 [368688,67366] 0 2026-03-09T16:14:57.078 INFO:tasks.workunit.client.0.vm03.stdout:9/995: chown d2/d54/d7d/d8f 392 1 2026-03-09T16:14:57.081 INFO:tasks.workunit.client.0.vm03.stdout:0/929: dread d0/d7/ff8 [0,4194304] 0 2026-03-09T16:14:57.095 INFO:tasks.workunit.client.0.vm03.stdout:5/962: write d2/d7/de/d11/d19/d31/fcb [1151164,40074] 0 2026-03-09T16:14:57.102 INFO:tasks.workunit.client.0.vm03.stdout:1/825: link d4/fd d4/d6/d1d/d20/d93/f10a 0 2026-03-09T16:14:57.137 INFO:tasks.workunit.client.0.vm03.stdout:3/905: rename d5/d1e/l45 to d5/d6d/db9/df2/dbe/l111 0 2026-03-09T16:14:57.159 INFO:tasks.workunit.client.0.vm03.stdout:9/996: mkdir d2/d4/d11/d29/d2a/db3/d12e 0 2026-03-09T16:14:57.160 INFO:tasks.workunit.client.0.vm03.stdout:0/930: creat d0/da/d5c/db6/f13e x:0 0 0 2026-03-09T16:14:57.162 INFO:tasks.workunit.client.0.vm03.stdout:5/963: readlink d2/d7/d115/d16/d5c/dfc/d106/d52/l98 0 2026-03-09T16:14:57.162 INFO:tasks.workunit.client.0.vm03.stdout:9/997: readlink d2/d4/d11/d29/d2a/db3/dbe/de0/l123 0 2026-03-09T16:14:57.162 INFO:tasks.workunit.client.0.vm03.stdout:0/931: chown d0/d7/d3e/d57/d5a/d5f/db2/dab/lb4 479481657 1 2026-03-09T16:14:57.165 INFO:tasks.workunit.client.0.vm03.stdout:0/932: chown d0/d7/d3e/d57/d5a/d5f/db2/l67 1237283961 1 2026-03-09T16:14:57.169 INFO:tasks.workunit.client.0.vm03.stdout:1/826: unlink d4/d6/d1d/d20/d23/f62 0 2026-03-09T16:14:57.171 INFO:tasks.workunit.client.0.vm03.stdout:8/958: link da/db/l2a da/d6c/dc4/l13e 0 2026-03-09T16:14:57.171 INFO:tasks.workunit.client.0.vm03.stdout:8/959: readlink da/d32/db5/lea 0 2026-03-09T16:14:57.174 INFO:tasks.workunit.client.0.vm03.stdout:3/906: readlink d5/d1e/d42/lff 0 2026-03-09T16:14:57.175 INFO:tasks.workunit.client.0.vm03.stdout:3/907: fdatasync d5/d6d/db9/df2/fe9 0 2026-03-09T16:14:57.180 INFO:tasks.workunit.client.0.vm03.stdout:3/908: dwrite d5/d2e/fec [0,4194304] 0 2026-03-09T16:14:57.198 INFO:tasks.workunit.client.0.vm03.stdout:5/964: symlink d2/d7/de9/l13e 0 2026-03-09T16:14:57.202 INFO:tasks.workunit.client.0.vm03.stdout:5/965: dread d2/d7/de/d11/d19/d29/d90/fac [0,4194304] 0 2026-03-09T16:14:57.206 
INFO:tasks.workunit.client.0.vm03.stdout:4/931: write d5/db/d25/d8b/da8/dbe/fd0 [1021132,28916] 0 2026-03-09T16:14:57.206 INFO:tasks.workunit.client.0.vm03.stdout:2/895: write db/d12/fff [754079,7663] 0 2026-03-09T16:14:57.208 INFO:tasks.workunit.client.0.vm03.stdout:6/848: dwrite d9/d42/d45/d50/f51 [4194304,4194304] 0 2026-03-09T16:14:57.217 INFO:tasks.workunit.client.0.vm03.stdout:5/966: dread d2/d7/d115/d16/d5c/dfc/d106/d3b/fb3 [0,4194304] 0 2026-03-09T16:14:57.235 INFO:tasks.workunit.client.0.vm03.stdout:0/933: mknod d0/d7/d3e/d57/d5a/d82/c13f 0 2026-03-09T16:14:57.237 INFO:tasks.workunit.client.0.vm03.stdout:7/877: getdents d4/da/d5d/db0/d113/db8 0 2026-03-09T16:14:57.248 INFO:tasks.workunit.client.0.vm03.stdout:6/849: mkdir d9/d8e/def/d112 0 2026-03-09T16:14:57.251 INFO:tasks.workunit.client.0.vm03.stdout:0/934: dread - d0/d7/d3e/d57/d5a/d5f/db2/d8e/fcc zero size 2026-03-09T16:14:57.252 INFO:tasks.workunit.client.0.vm03.stdout:6/850: read d9/f20 [2017644,75590] 0 2026-03-09T16:14:57.254 INFO:tasks.workunit.client.0.vm03.stdout:4/932: dread d5/db/d25/d8b/da8/df3/df7/d4d/fb2 [0,4194304] 0 2026-03-09T16:14:57.280 INFO:tasks.workunit.client.0.vm03.stdout:8/960: link da/d10/d28/c121 da/d10/d28/d4f/daf/dee/c13f 0 2026-03-09T16:14:57.284 INFO:tasks.workunit.client.0.vm03.stdout:5/967: sync 2026-03-09T16:14:57.287 INFO:tasks.workunit.client.0.vm03.stdout:5/968: dwrite d2/d7/d115/f86 [0,4194304] 0 2026-03-09T16:14:57.289 INFO:tasks.workunit.client.0.vm03.stdout:5/969: write d2/d7/d115/d24/d27/d43/d4b/de6/f137 [609010,67640] 0 2026-03-09T16:14:57.296 INFO:tasks.workunit.client.0.vm03.stdout:0/935: unlink d0/d7/d3e/d57/d5a/d5f/db2/f76 0 2026-03-09T16:14:57.296 INFO:tasks.workunit.client.0.vm03.stdout:4/933: symlink d5/dd/dba/l11b 0 2026-03-09T16:14:57.304 INFO:tasks.workunit.client.0.vm03.stdout:6/851: getdents d9/d8e/def/d112 0 2026-03-09T16:14:57.305 INFO:tasks.workunit.client.0.vm03.stdout:6/852: dread - d9/d42/d45/d50/d80/d90/db7/f106 zero size 2026-03-09T16:14:57.306 INFO:tasks.workunit.client.0.vm03.stdout:7/878: getdents d4/da/d5d/dd8/d22/d24/d16/d3e/db5/dd4 0 2026-03-09T16:14:57.310 INFO:tasks.workunit.client.0.vm03.stdout:5/970: dwrite d2/d7/de/d11/d19/d29/f132 [0,4194304] 0 2026-03-09T16:14:57.323 INFO:tasks.workunit.client.0.vm03.stdout:8/961: getdents da/db/d127 0 2026-03-09T16:14:57.323 INFO:tasks.workunit.client.0.vm03.stdout:0/936: creat d0/da/d5c/db6/f140 x:0 0 0 2026-03-09T16:14:57.323 INFO:tasks.workunit.client.0.vm03.stdout:6/853: creat d9/d42/d45/d65/dae/f113 x:0 0 0 2026-03-09T16:14:57.332 INFO:tasks.workunit.client.0.vm03.stdout:8/962: read da/d32/f61 [1472345,104071] 0 2026-03-09T16:14:57.332 INFO:tasks.workunit.client.0.vm03.stdout:0/937: write d0/d7/d3e/d57/d5a/d82/dd2/f13c [1032229,90818] 0 2026-03-09T16:14:57.346 INFO:tasks.workunit.client.0.vm03.stdout:5/971: rmdir d2/d7/de/d11/d19/dbb 39 2026-03-09T16:14:57.346 INFO:tasks.workunit.client.0.vm03.stdout:4/934: creat d5/d17/db7/d10e/f11c x:0 0 0 2026-03-09T16:14:57.346 INFO:tasks.workunit.client.0.vm03.stdout:9/998: write d2/d4/d11/d29/d2a/d38/fca [1311849,90336] 0 2026-03-09T16:14:57.346 INFO:tasks.workunit.client.0.vm03.stdout:8/963: sync 2026-03-09T16:14:57.346 INFO:tasks.workunit.client.0.vm03.stdout:0/938: mknod d0/d7/d3e/d57/d5a/d5f/c141 0 2026-03-09T16:14:57.347 INFO:tasks.workunit.client.0.vm03.stdout:6/854: creat d9/d42/d45/d50/d80/d8a/dc1/dd4/df9/f114 x:0 0 0 2026-03-09T16:14:57.348 INFO:tasks.workunit.client.0.vm03.stdout:0/939: chown d0/d7/d48/fb8 36 1 2026-03-09T16:14:57.355 
INFO:tasks.workunit.client.0.vm03.stdout:8/964: dread da/d32/d79/f103 [0,4194304] 0 2026-03-09T16:14:57.356 INFO:tasks.workunit.client.0.vm03.stdout:7/879: creat d4/da/d5d/db0/f121 x:0 0 0 2026-03-09T16:14:57.358 INFO:tasks.workunit.client.0.vm03.stdout:8/965: dwrite da/d10/d28/fb0 [4194304,4194304] 0 2026-03-09T16:14:57.372 INFO:tasks.workunit.client.0.vm03.stdout:1/827: truncate d4/d6/da2/dea/ffc 1010429 0 2026-03-09T16:14:57.372 INFO:tasks.workunit.client.0.vm03.stdout:4/935: fdatasync d5/f74 0 2026-03-09T16:14:57.372 INFO:tasks.workunit.client.0.vm03.stdout:2/896: write db/d12/fe8 [992759,7921] 0 2026-03-09T16:14:57.372 INFO:tasks.workunit.client.0.vm03.stdout:3/909: write d5/d1e/f31 [4052057,3259] 0 2026-03-09T16:14:57.383 INFO:tasks.workunit.client.0.vm03.stdout:0/940: chown d0/d7/d3e/d57/d5a/d5f/db2/c122 3881800 1 2026-03-09T16:14:57.392 INFO:tasks.workunit.client.0.vm03.stdout:8/966: creat da/d10/d28/db1/f140 x:0 0 0 2026-03-09T16:14:57.408 INFO:tasks.workunit.client.0.vm03.stdout:3/910: mknod d5/d6d/c112 0 2026-03-09T16:14:57.426 INFO:tasks.workunit.client.0.vm03.stdout:9/999: write d2/d4/d11/d29/d92/f6a [979553,102590] 0 2026-03-09T16:14:57.427 INFO:tasks.workunit.client.0.vm03.stdout:8/967: mkdir da/d10/d28/d4f/daf/d141 0 2026-03-09T16:14:57.427 INFO:tasks.workunit.client.0.vm03.stdout:0/941: creat d0/d7/d3e/d57/d5a/d82/f142 x:0 0 0 2026-03-09T16:14:57.427 INFO:tasks.workunit.client.0.vm03.stdout:7/880: write d4/f8f [1046229,102657] 0 2026-03-09T16:14:57.428 INFO:tasks.workunit.client.0.vm03.stdout:2/897: chown db/d12/d2a/d61/d6d/f120 79 1 2026-03-09T16:14:57.431 INFO:tasks.workunit.client.0.vm03.stdout:6/855: dwrite d9/d42/d45/d50/d80/d8a/dc1/f102 [0,4194304] 0 2026-03-09T16:14:57.435 INFO:tasks.workunit.client.0.vm03.stdout:2/898: dread db/d12/f49 [0,4194304] 0 2026-03-09T16:14:57.470 INFO:tasks.workunit.client.0.vm03.stdout:1/828: dwrite d4/d6/da2/fd3 [0,4194304] 0 2026-03-09T16:14:57.518 INFO:tasks.workunit.client.0.vm03.stdout:3/911: creat d5/f113 x:0 0 0 2026-03-09T16:14:57.519 INFO:tasks.workunit.client.0.vm03.stdout:3/912: read - d5/f113 zero size 2026-03-09T16:14:57.524 INFO:tasks.workunit.client.0.vm03.stdout:3/913: dread d5/fb [0,4194304] 0 2026-03-09T16:14:57.547 INFO:tasks.workunit.client.0.vm03.stdout:0/942: truncate d0/d7/d3e/d57/d5a/fc1 470910 0 2026-03-09T16:14:57.550 INFO:tasks.workunit.client.0.vm03.stdout:4/936: dwrite d5/fa [0,4194304] 0 2026-03-09T16:14:57.559 INFO:tasks.workunit.client.0.vm03.stdout:0/943: dread d0/f4d [0,4194304] 0 2026-03-09T16:14:57.563 INFO:tasks.workunit.client.0.vm03.stdout:1/829: creat d4/d6/d1d/db5/f10b x:0 0 0 2026-03-09T16:14:57.572 INFO:tasks.workunit.client.0.vm03.stdout:1/830: dread d4/d6/da2/dea/f103 [0,4194304] 0 2026-03-09T16:14:57.582 INFO:tasks.workunit.client.0.vm03.stdout:1/831: dread d4/db/fd6 [0,4194304] 0 2026-03-09T16:14:57.588 INFO:tasks.workunit.client.0.vm03.stdout:2/899: write db/d12/d2a/d61/d6d/f8f [4557003,9702] 0 2026-03-09T16:14:57.590 INFO:tasks.workunit.client.0.vm03.stdout:1/832: dread d4/d6/d3b/d6b/f42 [0,4194304] 0 2026-03-09T16:14:57.591 INFO:tasks.workunit.client.0.vm03.stdout:1/833: chown d4/d39 63538 1 2026-03-09T16:14:57.597 INFO:tasks.workunit.client.0.vm03.stdout:5/972: getdents d2/d7/d115/d24/d27/d43/d4b/de6 0 2026-03-09T16:14:57.600 INFO:tasks.workunit.client.0.vm03.stdout:5/973: dwrite d2/d7/d115/d16/d5c/dfc/f116 [0,4194304] 0 2026-03-09T16:14:57.605 INFO:tasks.workunit.client.0.vm03.stdout:5/974: chown d2/d7/d3c/f113 2612 1 2026-03-09T16:14:57.606 
INFO:tasks.workunit.client.0.vm03.stdout:5/975: dread d2/d75/f107 [0,4194304] 0 2026-03-09T16:14:57.606 INFO:tasks.workunit.client.0.vm03.stdout:5/976: read - d2/d7/de/da9/f138 zero size 2026-03-09T16:14:57.621 INFO:tasks.workunit.client.0.vm03.stdout:8/968: rename da/d32/d133/d80/c17 to da/d32/d133/d139/c142 0 2026-03-09T16:14:57.621 INFO:tasks.workunit.client.0.vm03.stdout:6/856: unlink d9/d14/da5/lb6 0 2026-03-09T16:14:57.621 INFO:tasks.workunit.client.0.vm03.stdout:7/881: creat d4/da/dbf/deb/d100/f122 x:0 0 0 2026-03-09T16:14:57.622 INFO:tasks.workunit.client.0.vm03.stdout:7/882: readlink d4/d2d/l6d 0 2026-03-09T16:14:57.622 INFO:tasks.workunit.client.0.vm03.stdout:8/969: chown da/d32/l134 111319 1 2026-03-09T16:14:57.627 INFO:tasks.workunit.client.0.vm03.stdout:6/857: dread d9/d42/d45/f4a [0,4194304] 0 2026-03-09T16:14:57.627 INFO:tasks.workunit.client.0.vm03.stdout:6/858: chown d9/d14/cf4 58 1 2026-03-09T16:14:57.639 INFO:tasks.workunit.client.0.vm03.stdout:6/859: dread d9/d42/d45/d50/f51 [0,4194304] 0 2026-03-09T16:14:57.639 INFO:tasks.workunit.client.0.vm03.stdout:4/937: mkdir d5/dd/dd5/d11d 0 2026-03-09T16:14:57.639 INFO:tasks.workunit.client.0.vm03.stdout:4/938: stat d5/f9 0 2026-03-09T16:14:57.668 INFO:tasks.workunit.client.0.vm03.stdout:2/900: dwrite db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/f103 [0,4194304] 0 2026-03-09T16:14:57.670 INFO:tasks.workunit.client.0.vm03.stdout:3/914: creat d5/d6d/d6a/d101/f114 x:0 0 0 2026-03-09T16:14:57.670 INFO:tasks.workunit.client.0.vm03.stdout:3/915: fdatasync d5/d1e/d42/d34/f73 0 2026-03-09T16:14:57.679 INFO:tasks.workunit.client.0.vm03.stdout:0/944: rename d0/d7/d3e/d57/d5a/d52/f97 to d0/d7/d3e/d57/d5a/d82/f143 0 2026-03-09T16:14:57.682 INFO:tasks.workunit.client.0.vm03.stdout:7/883: mkdir d4/da/dbf/deb/d100/d123 0 2026-03-09T16:14:57.687 INFO:tasks.workunit.client.0.vm03.stdout:6/860: truncate d9/d42/d45/d50/d80/d8a/dc1/fc7 971045 0 2026-03-09T16:14:57.704 INFO:tasks.workunit.client.0.vm03.stdout:1/834: mknod d4/d39/deb/c10c 0 2026-03-09T16:14:57.715 INFO:tasks.workunit.client.0.vm03.stdout:4/939: write d5/f74 [4173219,22754] 0 2026-03-09T16:14:57.715 INFO:tasks.workunit.client.0.vm03.stdout:8/970: write da/d6c/fae [616078,74821] 0 2026-03-09T16:14:57.716 INFO:tasks.workunit.client.0.vm03.stdout:8/971: stat da/d1d/d3b 0 2026-03-09T16:14:57.718 INFO:tasks.workunit.client.0.vm03.stdout:5/977: symlink d2/d7/de/d11/l13f 0 2026-03-09T16:14:57.719 INFO:tasks.workunit.client.0.vm03.stdout:5/978: stat d2/d7/de/d11/d19/d29 0 2026-03-09T16:14:57.727 INFO:tasks.workunit.client.0.vm03.stdout:2/901: rename db/d12/d2a/d99/d109 to db/d12/da5/dbb/dc3/d133 0 2026-03-09T16:14:57.736 INFO:tasks.workunit.client.0.vm03.stdout:8/972: mknod da/d32/d79/c143 0 2026-03-09T16:14:57.741 INFO:tasks.workunit.client.0.vm03.stdout:5/979: fsync d2/d7/d115/d16/fe5 0 2026-03-09T16:14:57.749 INFO:tasks.workunit.client.0.vm03.stdout:3/916: write d5/d6d/d5a/f78 [1098185,23369] 0 2026-03-09T16:14:57.753 INFO:tasks.workunit.client.0.vm03.stdout:1/835: rename d4/d6/d1d/d3d to d4/d6/d3b/d6b/d25/d50/d10d 0 2026-03-09T16:14:57.754 INFO:tasks.workunit.client.0.vm03.stdout:1/836: chown d4/d6/d3b/f36 41286013 1 2026-03-09T16:14:57.757 INFO:tasks.workunit.client.0.vm03.stdout:2/902: fsync db/f14 0 2026-03-09T16:14:57.760 INFO:tasks.workunit.client.0.vm03.stdout:0/945: mknod d0/d7/d3e/c144 0 2026-03-09T16:14:57.770 INFO:tasks.workunit.client.0.vm03.stdout:7/884: dwrite d4/da/dbf/f105 [0,4194304] 0 2026-03-09T16:14:57.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:57 
vm05.local ceph-mon[58702]: from='client.24447 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:57.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:57 vm05.local ceph-mon[58702]: Reconfiguring daemon prometheus.vm03 on vm03 2026-03-09T16:14:57.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:57 vm05.local ceph-mon[58702]: pgmap v8: 65 pgs: 65 active+clean; 1.9 GiB data, 6.4 GiB used, 114 GiB / 120 GiB avail; 25 MiB/s rd, 39 MiB/s wr, 133 op/s 2026-03-09T16:14:57.777 INFO:tasks.workunit.client.0.vm03.stdout:8/973: mknod da/d32/dad/c144 0 2026-03-09T16:14:57.777 INFO:tasks.workunit.client.0.vm03.stdout:8/974: readlink da/d32/d79/l9f 0 2026-03-09T16:14:57.788 INFO:tasks.workunit.client.0.vm03.stdout:4/940: truncate d5/db/f5d 7552856 0 2026-03-09T16:14:57.789 INFO:tasks.workunit.client.0.vm03.stdout:2/903: symlink db/d12/d2a/d61/d79/l134 0 2026-03-09T16:14:57.792 INFO:tasks.workunit.client.0.vm03.stdout:0/946: chown d0/d7/d3e/d57/d5a/d52/d9f/fe3 2 1 2026-03-09T16:14:57.797 INFO:tasks.workunit.client.0.vm03.stdout:6/861: creat d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f115 x:0 0 0 2026-03-09T16:14:57.804 INFO:tasks.workunit.client.0.vm03.stdout:8/975: rmdir da/d10/d28 39 2026-03-09T16:14:57.821 INFO:tasks.workunit.client.0.vm03.stdout:5/980: mknod d2/d7/d115/d16/c140 0 2026-03-09T16:14:57.825 INFO:tasks.workunit.client.0.vm03.stdout:3/917: mknod d5/d53/d88/c115 0 2026-03-09T16:14:57.826 INFO:tasks.workunit.client.0.vm03.stdout:1/837: mknod d4/db/c10e 0 2026-03-09T16:14:57.829 INFO:tasks.workunit.client.0.vm03.stdout:4/941: fdatasync d5/fed 0 2026-03-09T16:14:57.832 INFO:tasks.workunit.client.0.vm03.stdout:2/904: chown db/d12/da5/de4/l104 7338624 1 2026-03-09T16:14:57.835 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:57 vm03.local ceph-mon[51019]: from='client.24447 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:14:57.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:57 vm03.local ceph-mon[51019]: Reconfiguring daemon prometheus.vm03 on vm03 2026-03-09T16:14:57.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:57 vm03.local ceph-mon[51019]: pgmap v8: 65 pgs: 65 active+clean; 1.9 GiB data, 6.4 GiB used, 114 GiB / 120 GiB avail; 25 MiB/s rd, 39 MiB/s wr, 133 op/s 2026-03-09T16:14:57.844 INFO:tasks.workunit.client.0.vm03.stdout:8/976: sync 2026-03-09T16:14:57.846 INFO:tasks.workunit.client.0.vm03.stdout:7/885: symlink d4/da/d5d/db0/d113/l124 0 2026-03-09T16:14:57.852 INFO:tasks.workunit.client.0.vm03.stdout:5/981: rename d2/d7/de/d11/d19/d31/fcb to d2/d7/de/d54/f141 0 2026-03-09T16:14:57.853 INFO:tasks.workunit.client.0.vm03.stdout:5/982: stat d2/d7/de/d11/dbf/lf2 0 2026-03-09T16:14:57.866 INFO:tasks.workunit.client.0.vm03.stdout:1/838: dread d4/d6/d3b/d6b/d25/fb8 [0,4194304] 0 2026-03-09T16:14:57.870 INFO:tasks.workunit.client.0.vm03.stdout:4/942: mknod d5/dd/d1f/c11e 0 2026-03-09T16:14:57.873 INFO:tasks.workunit.client.0.vm03.stdout:2/905: mkdir db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4/d135 0 2026-03-09T16:14:57.877 INFO:tasks.workunit.client.0.vm03.stdout:0/947: link d0/d7/d3e/d57/d5a/d5f/f71 d0/d7/d3e/d57/d5a/d5f/db2/d8e/dba/f145 0 2026-03-09T16:14:57.882 INFO:tasks.workunit.client.0.vm03.stdout:6/862: symlink d9/d14/da5/dd8/d10d/l116 0 2026-03-09T16:14:57.885 INFO:tasks.workunit.client.0.vm03.stdout:6/863: dwrite d9/d42/d45/d65/dae/fff [0,4194304] 0 2026-03-09T16:14:57.898 
INFO:tasks.workunit.client.0.vm03.stdout:8/977: rmdir da/d32/d133/d139 39 2026-03-09T16:14:57.902 INFO:tasks.workunit.client.0.vm03.stdout:7/886: creat d4/da/d5d/db0/d9d/f125 x:0 0 0 2026-03-09T16:14:57.909 INFO:tasks.workunit.client.0.vm03.stdout:1/839: mkdir d4/d6/d1d/d20/d93/d10f 0 2026-03-09T16:14:57.917 INFO:tasks.workunit.client.0.vm03.stdout:0/948: mkdir d0/d7/d3e/d57/d5a/d52/d9f/d146 0 2026-03-09T16:14:57.922 INFO:tasks.workunit.client.0.vm03.stdout:0/949: readlink d0/d7/d3e/d111/l118 0 2026-03-09T16:14:57.924 INFO:tasks.workunit.client.0.vm03.stdout:8/978: write da/d32/f61 [2089834,47335] 0 2026-03-09T16:14:57.927 INFO:tasks.workunit.client.0.vm03.stdout:3/918: rename d5/d6d/d6a/dbd/lcf to d5/d1e/d42/d4c/l116 0 2026-03-09T16:14:57.931 INFO:tasks.workunit.client.0.vm03.stdout:1/840: rmdir d4/d6/da2/dea 39 2026-03-09T16:14:57.935 INFO:tasks.workunit.client.0.vm03.stdout:0/950: unlink d0/d7/d3e/d57/d5a/d5f/db2/dab/f113 0 2026-03-09T16:14:57.938 INFO:tasks.workunit.client.0.vm03.stdout:6/864: mkdir d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/dad/d117 0 2026-03-09T16:14:57.943 INFO:tasks.workunit.client.0.vm03.stdout:1/841: read d4/d6/f15 [88576,71226] 0 2026-03-09T16:14:57.943 INFO:tasks.workunit.client.0.vm03.stdout:8/979: readlink da/d10/d28/db1/l11b 0 2026-03-09T16:14:57.943 INFO:tasks.workunit.client.0.vm03.stdout:8/980: truncate da/db/d43/f12c 143315 0 2026-03-09T16:14:57.947 INFO:tasks.workunit.client.0.vm03.stdout:5/983: rename d2/d7/d115/d24/l51 to d2/d7/d115/d16/d5c/dfc/d106/d52/l142 0 2026-03-09T16:14:57.947 INFO:tasks.workunit.client.0.vm03.stdout:4/943: rename d5/db/d25/d8b/da8/df3/df7 to d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d7d/d11f 22 2026-03-09T16:14:57.954 INFO:tasks.workunit.client.0.vm03.stdout:0/951: creat d0/d7/d3e/d57/de9/f147 x:0 0 0 2026-03-09T16:14:57.957 INFO:tasks.workunit.client.0.vm03.stdout:6/865: mknod d9/d8e/def/c118 0 2026-03-09T16:14:57.971 INFO:tasks.workunit.client.0.vm03.stdout:2/906: dwrite db/d12/d2a/d99/de7/df9/d64/f80 [0,4194304] 0 2026-03-09T16:14:57.972 INFO:tasks.workunit.client.0.vm03.stdout:2/907: stat db/c76 0 2026-03-09T16:14:57.977 INFO:tasks.workunit.client.0.vm03.stdout:8/981: truncate da/f4c 694817 0 2026-03-09T16:14:57.980 INFO:tasks.workunit.client.0.vm03.stdout:4/944: creat d5/db/d25/d8b/da8/df3/f120 x:0 0 0 2026-03-09T16:14:57.999 INFO:tasks.workunit.client.0.vm03.stdout:6/866: rmdir d9/d14/da5 39 2026-03-09T16:14:58.004 INFO:tasks.workunit.client.0.vm03.stdout:1/842: symlink d4/d6/d3b/d6b/d25/d50/d10d/d101/d104/l110 0 2026-03-09T16:14:58.029 INFO:tasks.workunit.client.0.vm03.stdout:5/984: getdents d2/d13d 0 2026-03-09T16:14:58.032 INFO:tasks.workunit.client.0.vm03.stdout:4/945: creat d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/d77/f121 x:0 0 0 2026-03-09T16:14:58.042 INFO:tasks.workunit.client.0.vm03.stdout:7/887: rename d4/da/d5d/dd8/d22/d24/d16/d6e/d7e/c7f to d4/da/d5d/dd8/c126 0 2026-03-09T16:14:58.051 INFO:tasks.workunit.client.0.vm03.stdout:6/867: creat d9/d14/da5/dd8/d10d/f119 x:0 0 0 2026-03-09T16:14:58.056 INFO:tasks.workunit.client.0.vm03.stdout:5/985: truncate d2/d7/de/d11/f26 4047324 0 2026-03-09T16:14:58.057 INFO:tasks.workunit.client.0.vm03.stdout:5/986: write d2/d7/d115/d16/d5c/dcf/f130 [349157,76041] 0 2026-03-09T16:14:58.057 INFO:tasks.workunit.client.0.vm03.stdout:5/987: readlink d2/d7/d115/d16/l5f 0 2026-03-09T16:14:58.063 INFO:tasks.workunit.client.0.vm03.stdout:3/919: rename d5/d53/d88/dd7/df1/f103 to d5/d6d/d5a/f117 0 2026-03-09T16:14:58.065 INFO:tasks.workunit.client.0.vm03.stdout:7/888: mknod d4/da/dbf/c127 0 
2026-03-09T16:14:58.072 INFO:tasks.workunit.client.0.vm03.stdout:0/952: getdents d0/d7/d3e/d57 0 2026-03-09T16:14:58.091 INFO:tasks.workunit.client.0.vm03.stdout:4/946: dwrite d5/dd/f22 [0,4194304] 0 2026-03-09T16:14:58.092 INFO:tasks.workunit.client.0.vm03.stdout:0/953: sync 2026-03-09T16:14:58.095 INFO:tasks.workunit.client.0.vm03.stdout:4/947: dread - d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/d77/f121 zero size 2026-03-09T16:14:58.117 INFO:tasks.workunit.client.0.vm03.stdout:5/988: mkdir d2/d75/d119/d143 0 2026-03-09T16:14:58.117 INFO:tasks.workunit.client.0.vm03.stdout:2/908: rename db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/f103 to db/d12/d2a/d61/dca/f136 0 2026-03-09T16:14:58.118 INFO:tasks.workunit.client.0.vm03.stdout:5/989: chown d2/d7/de/d11/c61 0 1 2026-03-09T16:14:58.119 INFO:tasks.workunit.client.0.vm03.stdout:5/990: dread d2/d7/d115/d24/d27/d43/d4b/fd1 [0,4194304] 0 2026-03-09T16:14:58.121 INFO:tasks.workunit.client.0.vm03.stdout:3/920: creat d5/d1e/f118 x:0 0 0 2026-03-09T16:14:58.123 INFO:tasks.workunit.client.0.vm03.stdout:4/948: chown d5/db/d25/d8b/ce8 706317 1 2026-03-09T16:14:58.124 INFO:tasks.workunit.client.0.vm03.stdout:6/868: link d9/d42/d45/d50/f105 d9/d42/d45/d65/dae/df0/f11a 0 2026-03-09T16:14:58.125 INFO:tasks.workunit.client.0.vm03.stdout:1/843: getdents d4/db 0 2026-03-09T16:14:58.129 INFO:tasks.workunit.client.0.vm03.stdout:1/844: dwrite d4/d39/f5a [8388608,4194304] 0 2026-03-09T16:14:58.138 INFO:tasks.workunit.client.0.vm03.stdout:8/982: rename da/db/da8/db8 to da/d1d/d10b/d145 0 2026-03-09T16:14:58.138 INFO:tasks.workunit.client.0.vm03.stdout:2/909: truncate f0 5911152 0 2026-03-09T16:14:58.147 INFO:tasks.workunit.client.0.vm03.stdout:5/991: symlink d2/d7/de/d11/d19/d31/l144 0 2026-03-09T16:14:58.151 INFO:tasks.workunit.client.0.vm03.stdout:5/992: write d2/fd4 [1874829,107156] 0 2026-03-09T16:14:58.151 INFO:tasks.workunit.client.0.vm03.stdout:3/921: mknod d5/d44/d61/c119 0 2026-03-09T16:14:58.160 INFO:tasks.workunit.client.0.vm03.stdout:0/954: rename d0/d7/d3e/d57/d5a/f4b to d0/d7/d3e/d111/f148 0 2026-03-09T16:14:58.164 INFO:tasks.workunit.client.0.vm03.stdout:8/983: mknod da/d6c/d7a/de4/c146 0 2026-03-09T16:14:58.174 INFO:tasks.workunit.client.0.vm03.stdout:3/922: mkdir d5/d1e/d11a 0 2026-03-09T16:14:58.174 INFO:tasks.workunit.client.0.vm03.stdout:3/923: chown d5/d53/d6c/c4e 7208 1 2026-03-09T16:14:58.179 INFO:tasks.workunit.client.0.vm03.stdout:3/924: dread d5/d53/d88/dd7/fc7 [0,4194304] 0 2026-03-09T16:14:58.183 INFO:tasks.workunit.client.0.vm03.stdout:7/889: creat d4/f128 x:0 0 0 2026-03-09T16:14:58.184 INFO:tasks.workunit.client.0.vm03.stdout:7/890: truncate d4/da/dbf/f11e 736154 0 2026-03-09T16:14:58.188 INFO:tasks.workunit.client.0.vm03.stdout:1/845: mkdir d4/d31/d5c/d111 0 2026-03-09T16:14:58.190 INFO:tasks.workunit.client.0.vm03.stdout:4/949: write d5/db/fb3 [222963,62063] 0 2026-03-09T16:14:58.197 INFO:tasks.workunit.client.0.vm03.stdout:6/869: dwrite d9/d42/d45/ffd [0,4194304] 0 2026-03-09T16:14:58.201 INFO:tasks.workunit.client.0.vm03.stdout:0/955: dread d0/d7/d75/f62 [0,4194304] 0 2026-03-09T16:14:58.211 INFO:tasks.workunit.client.0.vm03.stdout:3/925: creat d5/d53/d6c/d79/dd9/f11b x:0 0 0 2026-03-09T16:14:58.213 INFO:tasks.workunit.client.0.vm03.stdout:7/891: chown d4/c6 2 1 2026-03-09T16:14:58.217 INFO:tasks.workunit.client.0.vm03.stdout:5/993: write d2/d7/d115/d16/d5c/f10d [406688,6227] 0 2026-03-09T16:14:58.219 INFO:tasks.workunit.client.0.vm03.stdout:2/910: write db/d12/d2a/d99/de7/df9/fa7 [893632,56822] 0 2026-03-09T16:14:58.224 
INFO:tasks.workunit.client.0.vm03.stdout:8/984: mknod da/d10/c147 0 2026-03-09T16:14:58.224 INFO:tasks.workunit.client.0.vm03.stdout:8/985: chown da/d10/ld5 10 1 2026-03-09T16:14:58.230 INFO:tasks.workunit.client.0.vm03.stdout:3/926: unlink d5/d6d/c112 0 2026-03-09T16:14:58.230 INFO:tasks.workunit.client.0.vm03.stdout:1/846: mkdir d4/d6/d1d/d20/d23/d112 0 2026-03-09T16:14:58.232 INFO:tasks.workunit.client.0.vm03.stdout:5/994: sync 2026-03-09T16:14:58.232 INFO:tasks.workunit.client.0.vm03.stdout:8/986: sync 2026-03-09T16:14:58.235 INFO:tasks.workunit.client.0.vm03.stdout:0/956: read d0/f60 [2040283,73907] 0 2026-03-09T16:14:58.236 INFO:tasks.workunit.client.0.vm03.stdout:6/870: symlink d9/d42/d45/d50/d80/d8a/d9c/d97/da8/dbd/l11b 0 2026-03-09T16:14:58.237 INFO:tasks.workunit.client.0.vm03.stdout:0/957: sync 2026-03-09T16:14:58.238 INFO:tasks.workunit.client.0.vm03.stdout:0/958: chown d0/da/d5c/c64 2845798 1 2026-03-09T16:14:58.239 INFO:tasks.workunit.client.0.vm03.stdout:0/959: sync 2026-03-09T16:14:58.240 INFO:tasks.workunit.client.0.vm03.stdout:0/960: readlink d0/d7/d3e/d57/d5a/d5f/db2/dcf/ldd 0 2026-03-09T16:14:58.245 INFO:tasks.workunit.client.0.vm03.stdout:2/911: symlink db/d12/d2a/d99/de7/df9/d52/l137 0 2026-03-09T16:14:58.247 INFO:tasks.workunit.client.0.vm03.stdout:3/927: fdatasync d5/d1e/d42/d8b/fd0 0 2026-03-09T16:14:58.252 INFO:tasks.workunit.client.0.vm03.stdout:1/847: rename d4/d6/d3b/d6b/d25/fc7 to d4/d6/d1d/d20/d5f/def/f113 0 2026-03-09T16:14:58.254 INFO:tasks.workunit.client.0.vm03.stdout:7/892: write d4/da/f20 [1945657,41848] 0 2026-03-09T16:14:58.255 INFO:tasks.workunit.client.0.vm03.stdout:7/893: chown d4/da/d45/l104 733378 1 2026-03-09T16:14:58.256 INFO:tasks.workunit.client.0.vm03.stdout:8/987: dread - da/d10/d28/d4f/daf/f123 zero size 2026-03-09T16:14:58.257 INFO:tasks.workunit.client.0.vm03.stdout:8/988: truncate da/d10/d28/f29 4288642 0 2026-03-09T16:14:58.260 INFO:tasks.workunit.client.0.vm03.stdout:6/871: stat d9/d42/d45/d50/d80/d90/db7/f106 0 2026-03-09T16:14:58.264 INFO:tasks.workunit.client.0.vm03.stdout:4/950: getdents d5/db/d25/d8b/da8/df3/df7/d4d/da9 0 2026-03-09T16:14:58.266 INFO:tasks.workunit.client.0.vm03.stdout:1/848: unlink d4/d6/d1d/d20/fc1 0 2026-03-09T16:14:58.266 INFO:tasks.workunit.client.0.vm03.stdout:5/995: creat d2/d7/d1a/d135/f145 x:0 0 0 2026-03-09T16:14:58.271 INFO:tasks.workunit.client.0.vm03.stdout:2/912: dread db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4/fd6 [0,4194304] 0 2026-03-09T16:14:58.272 INFO:tasks.workunit.client.0.vm03.stdout:2/913: chown db/d12/d2a/d99/de7/df9/d52 261531754 1 2026-03-09T16:14:58.273 INFO:tasks.workunit.client.0.vm03.stdout:2/914: truncate db/d12/da5/dc2/dc9/ff1 1009061 0 2026-03-09T16:14:58.278 INFO:tasks.workunit.client.0.vm03.stdout:6/872: rename d9/d42/d45/d50/d80/d8a/d9c/d97/da8/c6b to d9/d8e/c11c 0 2026-03-09T16:14:58.279 INFO:tasks.workunit.client.0.vm03.stdout:6/873: chown d9/d42/d45/d50/fb0 215890073 1 2026-03-09T16:14:58.279 INFO:tasks.workunit.client.0.vm03.stdout:7/894: dread d4/da/d5d/db0/d9d/fac [0,4194304] 0 2026-03-09T16:14:58.279 INFO:tasks.workunit.client.0.vm03.stdout:6/874: chown d9/d42/d45/d50/d80/d8a/d9c/d97/da8/f81 89388 1 2026-03-09T16:14:58.289 INFO:tasks.workunit.client.0.vm03.stdout:0/961: mkdir d0/d7/d3e/d57/d5a/d47/dce/d11f/d149 0 2026-03-09T16:14:58.290 INFO:tasks.workunit.client.0.vm03.stdout:3/928: mknod d5/d44/c11c 0 2026-03-09T16:14:58.290 INFO:tasks.workunit.client.0.vm03.stdout:3/929: chown d5/f113 0 1 2026-03-09T16:14:58.293 INFO:tasks.workunit.client.0.vm03.stdout:8/989: 
dwrite da/d32/db5/f128 [0,4194304] 0 2026-03-09T16:14:58.306 INFO:tasks.workunit.client.0.vm03.stdout:4/951: mknod d5/db/d25/d8b/da8/df3/df7/d4d/da9/d105/c122 0 2026-03-09T16:14:58.315 INFO:tasks.workunit.client.0.vm03.stdout:2/915: mknod db/d12/d2a/d99/de7/df9/c138 0 2026-03-09T16:14:58.319 INFO:tasks.workunit.client.0.vm03.stdout:6/875: mknod d9/d42/c11d 0 2026-03-09T16:14:58.323 INFO:tasks.workunit.client.0.vm03.stdout:4/952: truncate d5/db/d25/d8b/da8/df3/df7/d33/d79/f89 1435511 0 2026-03-09T16:14:58.324 INFO:tasks.workunit.client.0.vm03.stdout:4/953: chown d5/db/d25/d8b/fc6 203 1 2026-03-09T16:14:58.326 INFO:tasks.workunit.client.0.vm03.stdout:5/996: write d2/d7/d115/d16/d5c/ff1 [437909,106835] 0 2026-03-09T16:14:58.329 INFO:tasks.workunit.client.0.vm03.stdout:0/962: write d0/d7/d3e/d57/d5a/d82/d89/dc0/fe1 [1026584,56459] 0 2026-03-09T16:14:58.331 INFO:tasks.workunit.client.0.vm03.stdout:1/849: write d4/d6/d3b/d63/fdb [978346,10493] 0 2026-03-09T16:14:58.346 INFO:tasks.workunit.client.0.vm03.stdout:8/990: link da/d32/d133/fa9 da/d32/db5/f148 0 2026-03-09T16:14:58.346 INFO:tasks.workunit.client.0.vm03.stdout:7/895: creat d4/f129 x:0 0 0 2026-03-09T16:14:58.346 INFO:tasks.workunit.client.0.vm03.stdout:4/954: symlink d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/d82/l123 0 2026-03-09T16:14:58.347 INFO:tasks.workunit.client.0.vm03.stdout:5/997: mknod d2/d7/d115/d24/c146 0 2026-03-09T16:14:58.347 INFO:tasks.workunit.client.0.vm03.stdout:8/991: chown da/d32/d133 14053 1 2026-03-09T16:14:58.348 INFO:tasks.workunit.client.0.vm03.stdout:5/998: chown d2/d7/d115/d16/d5c/dfc/d106/d108 1431011 1 2026-03-09T16:14:58.356 INFO:tasks.workunit.client.0.vm03.stdout:1/850: fsync d4/d6/f15 0 2026-03-09T16:14:58.358 INFO:tasks.workunit.client.0.vm03.stdout:6/876: dread d9/d42/d45/d50/d80/d8a/d9c/f6a [0,4194304] 0 2026-03-09T16:14:58.361 INFO:tasks.workunit.client.0.vm03.stdout:5/999: sync 2026-03-09T16:14:58.371 INFO:tasks.workunit.client.0.vm03.stdout:2/916: creat db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4/d135/f139 x:0 0 0 2026-03-09T16:14:58.375 INFO:tasks.workunit.client.0.vm03.stdout:4/955: creat d5/db/d25/d8b/f124 x:0 0 0 2026-03-09T16:14:58.381 INFO:tasks.workunit.client.0.vm03.stdout:0/963: fdatasync d0/d7/d75/f12e 0 2026-03-09T16:14:58.396 INFO:tasks.workunit.client.0.vm03.stdout:8/992: mknod da/d10/d28/d4f/daf/dee/c149 0 2026-03-09T16:14:58.396 INFO:tasks.workunit.client.0.vm03.stdout:1/851: creat d4/d6/d3b/d6b/d25/d50/f114 x:0 0 0 2026-03-09T16:14:58.396 INFO:tasks.workunit.client.0.vm03.stdout:3/930: getdents d5/d53/d88 0 2026-03-09T16:14:58.396 INFO:tasks.workunit.client.0.vm03.stdout:3/931: chown d5/d6d/d5a/f7c 796 1 2026-03-09T16:14:58.396 INFO:tasks.workunit.client.0.vm03.stdout:6/877: dread d9/d14/f44 [0,4194304] 0 2026-03-09T16:14:58.397 INFO:tasks.workunit.client.0.vm03.stdout:2/917: truncate db/d12/d2a/d61/d6d/fb0 428806 0 2026-03-09T16:14:58.403 INFO:tasks.workunit.client.0.vm03.stdout:0/964: mknod d0/d7/d3e/d57/c14a 0 2026-03-09T16:14:58.404 INFO:tasks.workunit.client.0.vm03.stdout:8/993: creat da/d10/d28/d64/f14a x:0 0 0 2026-03-09T16:14:58.405 INFO:tasks.workunit.client.0.vm03.stdout:8/994: chown da/d10/d28/d4f/d85/d9c/d10e/f117 129263 1 2026-03-09T16:14:58.407 INFO:tasks.workunit.client.0.vm03.stdout:8/995: dwrite da/d10/d28/f29 [0,4194304] 0 2026-03-09T16:14:58.426 INFO:tasks.workunit.client.0.vm03.stdout:7/896: symlink d4/da/d45/d51/d36/d66/df4/l12a 0 2026-03-09T16:14:58.426 INFO:tasks.workunit.client.0.vm03.stdout:7/897: chown d4/d2d/d4b/fd6 82565439 1 2026-03-09T16:14:58.433 
INFO:tasks.workunit.client.0.vm03.stdout:6/878: unlink d9/d42/d45/d65/c10f 0 2026-03-09T16:14:58.454 INFO:tasks.workunit.client.0.vm03.stdout:1/852: write d4/d6/d3b/d6b/d25/fb8 [3147571,99380] 0 2026-03-09T16:14:58.464 INFO:tasks.workunit.client.0.vm03.stdout:1/853: dread d4/d6/d1d/d20/d23/f9f [0,4194304] 0 2026-03-09T16:14:58.465 INFO:tasks.workunit.client.0.vm03.stdout:4/956: dwrite d5/db/d25/d8b/da8/df3/df7/d33/ffd [0,4194304] 0 2026-03-09T16:14:58.476 INFO:tasks.workunit.client.0.vm03.stdout:1/854: sync 2026-03-09T16:14:58.484 INFO:tasks.workunit.client.0.vm03.stdout:0/965: creat d0/da/d5c/db6/f14b x:0 0 0 2026-03-09T16:14:58.487 INFO:tasks.workunit.client.0.vm03.stdout:3/932: getdents d5/d1e/d11a 0 2026-03-09T16:14:58.487 INFO:tasks.workunit.client.0.vm03.stdout:3/933: stat d5/d1e/d42/d34/dd2/ff7 0 2026-03-09T16:14:58.487 INFO:tasks.workunit.client.0.vm03.stdout:3/934: stat d5/c93 0 2026-03-09T16:14:58.491 INFO:tasks.workunit.client.0.vm03.stdout:4/957: creat d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/d82/de4/f125 x:0 0 0 2026-03-09T16:14:58.493 INFO:tasks.workunit.client.0.vm03.stdout:1/855: truncate d4/d6/da2/dea/f103 5169873 0 2026-03-09T16:14:58.493 INFO:tasks.workunit.client.0.vm03.stdout:1/856: stat d4/d31/d5c/da8/da1 0 2026-03-09T16:14:58.494 INFO:tasks.workunit.client.0.vm03.stdout:3/935: symlink d5/d53/d6c/d79/dd9/l11d 0 2026-03-09T16:14:58.495 INFO:tasks.workunit.client.0.vm03.stdout:2/918: link db/d12/d2a/d99/de7/df9/d64/l71 db/d12/da5/dc2/d110/l13a 0 2026-03-09T16:14:58.508 INFO:tasks.workunit.client.0.vm03.stdout:3/936: dread d5/d6d/f7a [0,4194304] 0 2026-03-09T16:14:58.508 INFO:tasks.workunit.client.0.vm03.stdout:3/937: dread - d5/d44/d102/f104 zero size 2026-03-09T16:14:58.509 INFO:tasks.workunit.client.0.vm03.stdout:0/966: mknod d0/c14c 0 2026-03-09T16:14:58.518 INFO:tasks.workunit.client.0.vm03.stdout:2/919: chown db/d12/d2a/d61/f4c 722887 1 2026-03-09T16:14:58.521 INFO:tasks.workunit.client.0.vm03.stdout:3/938: creat d5/d6d/f11e x:0 0 0 2026-03-09T16:14:58.529 INFO:tasks.workunit.client.0.vm03.stdout:1/857: mknod d4/c115 0 2026-03-09T16:14:58.530 INFO:tasks.workunit.client.0.vm03.stdout:8/996: link da/d6c/d7a/ff9 da/d32/d133/d80/f14b 0 2026-03-09T16:14:58.530 INFO:tasks.workunit.client.0.vm03.stdout:8/997: readlink da/d10/l54 0 2026-03-09T16:14:58.532 INFO:tasks.workunit.client.0.vm03.stdout:6/879: rename d9/d42/d45/d50/d80/f10a to d9/d42/d45/d65/f11e 0 2026-03-09T16:14:58.533 INFO:tasks.workunit.client.0.vm03.stdout:6/880: dread - d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/fcf zero size 2026-03-09T16:14:58.545 INFO:tasks.workunit.client.0.vm03.stdout:1/858: mknod d4/d39/d7f/df3/c116 0 2026-03-09T16:14:58.550 INFO:tasks.workunit.client.0.vm03.stdout:6/881: read d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/feb [1023094,101180] 0 2026-03-09T16:14:58.551 INFO:tasks.workunit.client.0.vm03.stdout:6/882: chown d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/fcd 217 1 2026-03-09T16:14:58.552 INFO:tasks.workunit.client.0.vm03.stdout:6/883: write d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f115 [900355,60698] 0 2026-03-09T16:14:58.552 INFO:tasks.workunit.client.0.vm03.stdout:6/884: readlink d9/d42/d45/ddf/l104 0 2026-03-09T16:14:58.558 INFO:tasks.workunit.client.0.vm03.stdout:1/859: truncate d4/d31/d5c/faf 303128 0 2026-03-09T16:14:58.558 INFO:tasks.workunit.client.0.vm03.stdout:1/860: chown d4/c5e 55800 1 2026-03-09T16:14:58.564 INFO:tasks.workunit.client.0.vm03.stdout:1/861: unlink d4/d6/d1d/d20/fc8 0 2026-03-09T16:14:58.565 INFO:tasks.workunit.client.0.vm03.stdout:8/998: creat da/d32/d133/f14c x:0 0 0 
2026-03-09T16:14:58.572 INFO:tasks.workunit.client.0.vm03.stdout:8/999: dread da/d10/f33 [0,4194304] 0 2026-03-09T16:14:58.574 INFO:tasks.workunit.client.0.vm03.stdout:7/898: write d4/da/f42 [4687214,91205] 0 2026-03-09T16:14:58.578 INFO:tasks.workunit.client.0.vm03.stdout:4/958: write d5/db/d25/d8b/da8/df3/df7/d33/f92 [663803,55915] 0 2026-03-09T16:14:58.582 INFO:tasks.workunit.client.0.vm03.stdout:4/959: dwrite d5/db/d25/d8b/da8/df3/df7/fa1 [0,4194304] 0 2026-03-09T16:14:58.594 INFO:tasks.workunit.client.0.vm03.stdout:1/862: symlink d4/l117 0 2026-03-09T16:14:58.600 INFO:tasks.workunit.client.0.vm03.stdout:2/920: dwrite db/d12/d2a/d61/f54 [0,4194304] 0 2026-03-09T16:14:58.605 INFO:tasks.workunit.client.0.vm03.stdout:4/960: creat d5/db/d25/d8b/da8/d81/dd4/f126 x:0 0 0 2026-03-09T16:14:58.607 INFO:tasks.workunit.client.0.vm03.stdout:0/967: dwrite d0/d7/d3e/d57/d5a/d82/dd2/f10f [0,4194304] 0 2026-03-09T16:14:58.608 INFO:tasks.workunit.client.0.vm03.stdout:0/968: readlink d0/d7/d3e/d57/d5a/d52/l8d 0 2026-03-09T16:14:58.622 INFO:tasks.workunit.client.0.vm03.stdout:1/863: dread d4/d6/da2/fe7 [0,4194304] 0 2026-03-09T16:14:58.626 INFO:tasks.workunit.client.0.vm03.stdout:3/939: write d5/d6d/db9/df2/dae/fda [1991787,49360] 0 2026-03-09T16:14:58.629 INFO:tasks.workunit.client.0.vm03.stdout:6/885: write d9/d14/f1d [182732,71173] 0 2026-03-09T16:14:58.631 INFO:tasks.workunit.client.0.vm03.stdout:0/969: fdatasync d0/d7/d3e/d57/f90 0 2026-03-09T16:14:58.631 INFO:tasks.workunit.client.0.vm03.stdout:0/970: readlink d0/d7/ld6 0 2026-03-09T16:14:58.632 INFO:tasks.workunit.client.0.vm03.stdout:0/971: fdatasync d0/d7/d3e/d57/d5a/d5f/db2/fa2 0 2026-03-09T16:14:58.635 INFO:tasks.workunit.client.0.vm03.stdout:3/940: creat d5/d53/d88/dd3/f11f x:0 0 0 2026-03-09T16:14:58.644 INFO:tasks.workunit.client.0.vm03.stdout:0/972: truncate d0/da/d1b/d9b/f126 198878 0 2026-03-09T16:14:58.644 INFO:tasks.workunit.client.0.vm03.stdout:6/886: link d9/d42/d45/d50/fba d9/d8e/f11f 0 2026-03-09T16:14:58.645 INFO:tasks.workunit.client.0.vm03.stdout:0/973: stat d0/d7/c3f 0 2026-03-09T16:14:58.647 INFO:tasks.workunit.client.0.vm03.stdout:0/974: mknod d0/d7/d3e/d95/c14d 0 2026-03-09T16:14:58.653 INFO:tasks.workunit.client.0.vm03.stdout:6/887: creat d9/d42/f120 x:0 0 0 2026-03-09T16:14:58.653 INFO:tasks.workunit.client.0.vm03.stdout:6/888: chown d9/d84/la2 126 1 2026-03-09T16:14:58.655 INFO:tasks.workunit.client.0.vm03.stdout:6/889: dread d9/d42/d45/f4a [0,4194304] 0 2026-03-09T16:14:58.658 INFO:tasks.workunit.client.0.vm03.stdout:0/975: creat d0/da/d5c/f14e x:0 0 0 2026-03-09T16:14:58.661 INFO:tasks.workunit.client.0.vm03.stdout:7/899: dwrite d4/da/d5d/dd8/d22/d24/d16/d2b/f56 [0,4194304] 0 2026-03-09T16:14:58.662 INFO:tasks.workunit.client.0.vm03.stdout:7/900: stat d4/d2d/d4b/f6b 0 2026-03-09T16:14:58.663 INFO:tasks.workunit.client.0.vm03.stdout:7/901: chown d4/f128 51818 1 2026-03-09T16:14:58.664 INFO:tasks.workunit.client.0.vm03.stdout:6/890: mkdir d9/d14/da5/dd8/d10d/d121 0 2026-03-09T16:14:58.666 INFO:tasks.workunit.client.0.vm03.stdout:0/976: mknod d0/d7/d3e/d57/d5a/d5f/c14f 0 2026-03-09T16:14:58.668 INFO:tasks.workunit.client.0.vm03.stdout:7/902: rename d4/da/dbf/deb/d100/d123 to d4/da/d5d/dd8/d22/d24/d16/d10b/d12b 0 2026-03-09T16:14:58.669 INFO:tasks.workunit.client.0.vm03.stdout:7/903: readlink d4/da/l76 0 2026-03-09T16:14:58.669 INFO:tasks.workunit.client.0.vm03.stdout:7/904: write d4/f128 [342652,114684] 0 2026-03-09T16:14:58.680 INFO:tasks.workunit.client.0.vm03.stdout:2/921: write 
db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4/fdc [3608158,56445] 0 2026-03-09T16:14:58.682 INFO:tasks.workunit.client.0.vm03.stdout:1/864: write d4/d6/d1d/fae [826014,3705] 0 2026-03-09T16:14:58.685 INFO:tasks.workunit.client.0.vm03.stdout:6/891: rename d9/d42/c5f to d9/d42/d45/d50/d80/d8a/d9c/d97/da8/c122 0 2026-03-09T16:14:58.687 INFO:tasks.workunit.client.0.vm03.stdout:4/961: truncate d5/dd/dd5/fef 1225537 0 2026-03-09T16:14:58.688 INFO:tasks.workunit.client.0.vm03.stdout:4/962: dread - d5/db/d25/d8b/da8/df3/df7/d4d/da9/fff zero size 2026-03-09T16:14:58.690 INFO:tasks.workunit.client.0.vm03.stdout:2/922: creat db/d12/da5/dc2/d110/f13b x:0 0 0 2026-03-09T16:14:58.692 INFO:tasks.workunit.client.0.vm03.stdout:3/941: dwrite d5/d44/f54 [0,4194304] 0 2026-03-09T16:14:58.694 INFO:tasks.workunit.client.0.vm03.stdout:1/865: creat d4/d6/d3b/d6b/da5/dc0/f118 x:0 0 0 2026-03-09T16:14:58.701 INFO:tasks.workunit.client.0.vm03.stdout:4/963: unlink d5/dd/d1f/d5f/ff2 0 2026-03-09T16:14:58.704 INFO:tasks.workunit.client.0.vm03.stdout:2/923: mknod db/d12/d2a/d61/d6d/c13c 0 2026-03-09T16:14:58.704 INFO:tasks.workunit.client.0.vm03.stdout:2/924: chown db/laa 0 1 2026-03-09T16:14:58.704 INFO:tasks.workunit.client.0.vm03.stdout:0/977: creat d0/da/d5c/f150 x:0 0 0 2026-03-09T16:14:58.705 INFO:tasks.workunit.client.0.vm03.stdout:1/866: mknod d4/db/d8b/c119 0 2026-03-09T16:14:58.706 INFO:tasks.workunit.client.0.vm03.stdout:3/942: rename d5/d1e/d42/d4c/f7d to d5/d44/d61/f120 0 2026-03-09T16:14:58.708 INFO:tasks.workunit.client.0.vm03.stdout:4/964: mkdir d5/d17/db7/d127 0 2026-03-09T16:14:58.710 INFO:tasks.workunit.client.0.vm03.stdout:3/943: dwrite d5/f109 [0,4194304] 0 2026-03-09T16:14:58.713 INFO:tasks.workunit.client.0.vm03.stdout:2/925: mknod db/d12/d2a/d61/d79/c13d 0 2026-03-09T16:14:58.715 INFO:tasks.workunit.client.0.vm03.stdout:2/926: chown db/d12/d2a/d61/d6d/fa8 0 1 2026-03-09T16:14:58.716 INFO:tasks.workunit.client.0.vm03.stdout:0/978: truncate d0/d7/d48/f43 3664596 0 2026-03-09T16:14:58.718 INFO:tasks.workunit.client.0.vm03.stdout:0/979: chown d0/d7/d3e/d57/d5a/d82/d89/c96 527296 1 2026-03-09T16:14:58.722 INFO:tasks.workunit.client.0.vm03.stdout:4/965: truncate d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/f94 1719214 0 2026-03-09T16:14:58.778 INFO:tasks.workunit.client.0.vm03.stdout:3/944: link d5/d2e/fd4 d5/d44/d102/f121 0 2026-03-09T16:14:58.781 INFO:tasks.workunit.client.0.vm03.stdout:2/927: mkdir db/d12/da5/d13e 0 2026-03-09T16:14:58.787 INFO:tasks.workunit.client.0.vm03.stdout:4/966: symlink d5/db/d25/d8b/dd6/dfe/l128 0 2026-03-09T16:14:58.788 INFO:tasks.workunit.client.0.vm03.stdout:4/967: readlink d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/la2 0 2026-03-09T16:14:58.797 INFO:tasks.workunit.client.0.vm03.stdout:7/905: dwrite d4/da/d45/d51/f50 [0,4194304] 0 2026-03-09T16:14:58.809 INFO:tasks.workunit.client.0.vm03.stdout:4/968: mkdir d5/db/d25/d8b/da8/df3/df7/d33/d79/d101/d129 0 2026-03-09T16:14:58.816 INFO:tasks.workunit.client.0.vm03.stdout:6/892: dwrite d9/d42/d45/d50/d80/d8a/dc1/dd4/fea [0,4194304] 0 2026-03-09T16:14:58.817 INFO:tasks.workunit.client.0.vm03.stdout:6/893: chown d9/d8e/def 5632 1 2026-03-09T16:14:58.828 INFO:tasks.workunit.client.0.vm03.stdout:6/894: unlink d9/d14/da5/laa 0 2026-03-09T16:14:58.838 INFO:tasks.workunit.client.0.vm03.stdout:6/895: dread d9/d42/d45/d50/fb0 [4194304,4194304] 0 2026-03-09T16:14:58.903 INFO:tasks.workunit.client.0.vm03.stdout:2/928: write db/d12/d2a/f5f [2501838,89342] 0 2026-03-09T16:14:58.903 INFO:tasks.workunit.client.0.vm03.stdout:0/980: dwrite d0/d7/d3e/f72 
[0,4194304] 0 2026-03-09T16:14:58.910 INFO:tasks.workunit.client.0.vm03.stdout:3/945: truncate d5/d6d/d5a/f78 2515653 0 2026-03-09T16:14:58.912 INFO:tasks.workunit.client.0.vm03.stdout:2/929: symlink db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/df7/l13f 0 2026-03-09T16:14:58.912 INFO:tasks.workunit.client.0.vm03.stdout:3/946: truncate d5/d44/d61/fb8 500589 0 2026-03-09T16:14:58.914 INFO:tasks.workunit.client.0.vm03.stdout:3/947: dread - d5/d1e/d42/d8b/fd0 zero size 2026-03-09T16:14:58.915 INFO:tasks.workunit.client.0.vm03.stdout:0/981: dread d0/d7/f8 [0,4194304] 0 2026-03-09T16:14:58.918 INFO:tasks.workunit.client.0.vm03.stdout:3/948: dread d5/d1e/d42/d55/f7e [4194304,4194304] 0 2026-03-09T16:14:58.922 INFO:tasks.workunit.client.0.vm03.stdout:0/982: creat d0/d7/d3e/d57/d5a/d82/d89/def/d125/f151 x:0 0 0 2026-03-09T16:14:58.926 INFO:tasks.workunit.client.0.vm03.stdout:2/930: creat db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/f140 x:0 0 0 2026-03-09T16:14:58.928 INFO:tasks.workunit.client.0.vm03.stdout:3/949: mkdir d5/d44/d61/d122 0 2026-03-09T16:14:58.930 INFO:tasks.workunit.client.0.vm03.stdout:0/983: mkdir d0/d7/d3e/d57/d5a/d152 0 2026-03-09T16:14:58.931 INFO:tasks.workunit.client.0.vm03.stdout:0/984: read d0/d7/d75/f62 [247996,121804] 0 2026-03-09T16:14:58.935 INFO:tasks.workunit.client.0.vm03.stdout:3/950: dwrite d5/d44/d102/f104 [0,4194304] 0 2026-03-09T16:14:58.942 INFO:tasks.workunit.client.0.vm03.stdout:2/931: getdents db/d12/d2a/d99 0 2026-03-09T16:14:58.943 INFO:tasks.workunit.client.0.vm03.stdout:3/951: rename d5/d2e/c81 to d5/d53/d6c/c123 0 2026-03-09T16:14:58.947 INFO:tasks.workunit.client.0.vm03.stdout:3/952: fsync d5/d44/d61/fab 0 2026-03-09T16:14:58.954 INFO:tasks.workunit.client.0.vm03.stdout:0/985: dread d0/da/d1b/d9b/ff3 [0,4194304] 0 2026-03-09T16:14:58.955 INFO:tasks.workunit.client.0.vm03.stdout:3/953: creat d5/d2e/f124 x:0 0 0 2026-03-09T16:14:58.962 INFO:tasks.workunit.client.0.vm03.stdout:7/906: dwrite d4/da/d5d/db0/d61/fdb [0,4194304] 0 2026-03-09T16:14:58.963 INFO:tasks.workunit.client.0.vm03.stdout:0/986: mknod d0/d7/d3e/d95/c153 0 2026-03-09T16:14:58.964 INFO:tasks.workunit.client.0.vm03.stdout:7/907: fsync d4/da/f20 0 2026-03-09T16:14:58.976 INFO:tasks.workunit.client.0.vm03.stdout:3/954: symlink d5/d6d/d6a/d101/l125 0 2026-03-09T16:14:58.980 INFO:tasks.workunit.client.0.vm03.stdout:4/969: dwrite d5/dd/dd5/fef [0,4194304] 0 2026-03-09T16:14:58.980 INFO:tasks.workunit.client.0.vm03.stdout:4/970: readlink d5/d17/d44/l5c 0 2026-03-09T16:14:58.987 INFO:tasks.workunit.client.0.vm03.stdout:2/932: getdents db/d12/d2a/d99/de7/df9/d64 0 2026-03-09T16:14:58.999 INFO:tasks.workunit.client.0.vm03.stdout:2/933: fdatasync db/d12/d2a/f8d 0 2026-03-09T16:14:59.003 INFO:tasks.workunit.client.0.vm03.stdout:7/908: truncate d4/da/d5d/db0/d61/f84 912313 0 2026-03-09T16:14:59.005 INFO:tasks.workunit.client.0.vm03.stdout:6/896: write d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f27 [500562,27450] 0 2026-03-09T16:14:59.017 INFO:tasks.workunit.client.0.vm03.stdout:4/971: creat d5/dd/dd5/d11d/f12a x:0 0 0 2026-03-09T16:14:59.027 INFO:tasks.workunit.client.0.vm03.stdout:2/934: rmdir db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/dda 39 2026-03-09T16:14:59.031 INFO:tasks.workunit.client.0.vm03.stdout:7/909: fsync d4/d2d/fd5 0 2026-03-09T16:14:59.034 INFO:tasks.workunit.client.0.vm03.stdout:0/987: link d0/da/d1b/d9b/f93 d0/d7/d3e/d57/d5a/d5f/db2/d8e/f154 0 2026-03-09T16:14:59.035 INFO:tasks.workunit.client.0.vm03.stdout:6/897: mkdir d9/d84/d123 0 2026-03-09T16:14:59.036 
INFO:tasks.workunit.client.0.vm03.stdout:4/972: mknod d5/db/d25/d8b/da8/df3/df7/d33/d79/c12b 0 2026-03-09T16:14:59.040 INFO:tasks.workunit.client.0.vm03.stdout:2/935: unlink db/d12/d2a/d61/d79/c13d 0 2026-03-09T16:14:59.049 INFO:tasks.workunit.client.0.vm03.stdout:7/910: symlink d4/d2d/d4b/l12c 0 2026-03-09T16:14:59.052 INFO:tasks.workunit.client.0.vm03.stdout:4/973: truncate d5/d17/ffc 455558 0 2026-03-09T16:14:59.052 INFO:tasks.workunit.client.0.vm03.stdout:4/974: chown d5/db/d25/d8b/da8/d81 659 1 2026-03-09T16:14:59.056 INFO:tasks.workunit.client.0.vm03.stdout:7/911: creat d4/da/d5d/db0/d113/db8/f12d x:0 0 0 2026-03-09T16:14:59.060 INFO:tasks.workunit.client.0.vm03.stdout:7/912: creat d4/d2d/d4b/f12e x:0 0 0 2026-03-09T16:14:59.061 INFO:tasks.workunit.client.0.vm03.stdout:0/988: link d0/d7/d3e/d57/d5a/d47/f10c d0/da/d1b/dc8/d104/f155 0 2026-03-09T16:14:59.062 INFO:tasks.workunit.client.0.vm03.stdout:1/867: creat d4/d6/d1d/f11a x:0 0 0 2026-03-09T16:14:59.063 INFO:tasks.workunit.client.0.vm03.stdout:1/868: readlink d4/d6/d1d/d20/d5f/lb7 0 2026-03-09T16:14:59.064 INFO:tasks.workunit.client.0.vm03.stdout:6/898: link d9/d14/f1d d9/d42/d45/d50/f124 0 2026-03-09T16:14:59.069 INFO:tasks.workunit.client.0.vm03.stdout:2/936: rename db/d12/d2a/d99/de7/df9/d64/dbd/df5 to db/d12/d2a/d99/de7/df9/d141 0 2026-03-09T16:14:59.074 INFO:tasks.workunit.client.0.vm03.stdout:0/989: creat d0/da/d1b/de0/f156 x:0 0 0 2026-03-09T16:14:59.076 INFO:tasks.workunit.client.0.vm03.stdout:1/869: symlink d4/d6/d3b/d6b/da5/dc3/l11b 0 2026-03-09T16:14:59.083 INFO:tasks.workunit.client.0.vm03.stdout:6/899: rename d9/d42/d45/d65/dae to d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/d125 0 2026-03-09T16:14:59.085 INFO:tasks.workunit.client.0.vm03.stdout:2/937: dread db/d12/d2a/d61/d6d/fb0 [0,4194304] 0 2026-03-09T16:14:59.086 INFO:tasks.workunit.client.0.vm03.stdout:2/938: write db/d12/fe8 [1662378,106039] 0 2026-03-09T16:14:59.089 INFO:tasks.workunit.client.0.vm03.stdout:0/990: chown d0/d7/c34 0 1 2026-03-09T16:14:59.089 INFO:tasks.workunit.client.0.vm03.stdout:0/991: dread - d0/d7/d3e/d57/fe4 zero size 2026-03-09T16:14:59.093 INFO:tasks.workunit.client.0.vm03.stdout:4/975: getdents d5/db/d25/d8b/da8/df3/df7/d4d/d5b 0 2026-03-09T16:14:59.097 INFO:tasks.workunit.client.0.vm03.stdout:2/939: readlink db/d12/d2a/d99/de7/df9/lfb 0 2026-03-09T16:14:59.119 INFO:tasks.workunit.client.0.vm03.stdout:1/870: symlink d4/d6/d3b/l11c 0 2026-03-09T16:14:59.121 INFO:tasks.workunit.client.0.vm03.stdout:4/976: rename d5/d17/d44/f4a to d5/dd/dba/f12c 0 2026-03-09T16:14:59.123 INFO:tasks.workunit.client.0.vm03.stdout:6/900: mkdir d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/dad/d117/d126 0 2026-03-09T16:14:59.125 INFO:tasks.workunit.client.0.vm03.stdout:7/913: getdents d4/da/d45/d51/d36/d66 0 2026-03-09T16:14:59.127 INFO:tasks.workunit.client.0.vm03.stdout:1/871: fdatasync d4/fd 0 2026-03-09T16:14:59.130 INFO:tasks.workunit.client.0.vm03.stdout:6/901: creat d9/d42/d45/d50/d80/d8a/d9c/d97/f127 x:0 0 0 2026-03-09T16:14:59.138 INFO:tasks.workunit.client.0.vm03.stdout:3/955: dwrite d5/d2e/fd4 [0,4194304] 0 2026-03-09T16:14:59.165 INFO:tasks.workunit.client.0.vm03.stdout:3/956: write d5/d44/d102/f121 [5319876,80344] 0 2026-03-09T16:14:59.165 INFO:tasks.workunit.client.0.vm03.stdout:3/957: chown d5/d1e/d42/d55/c95 107 1 2026-03-09T16:14:59.165 INFO:tasks.workunit.client.0.vm03.stdout:0/992: link d0/d7/d3e/d57/d5a/d82/d89/dc0/c12c d0/d7/d3e/d57/d5a/d82/d89/c157 0 2026-03-09T16:14:59.165 INFO:tasks.workunit.client.0.vm03.stdout:1/872: fsync d4/d6/d1d/d20/d23/f74 0 
2026-03-09T16:14:59.165 INFO:tasks.workunit.client.0.vm03.stdout:0/993: rmdir d0/da/d7a 39 2026-03-09T16:14:59.165 INFO:tasks.workunit.client.0.vm03.stdout:1/873: creat d4/d6/da2/dea/f11d x:0 0 0 2026-03-09T16:14:59.165 INFO:tasks.workunit.client.0.vm03.stdout:4/977: getdents d5/d17/db7/d10e 0 2026-03-09T16:14:59.165 INFO:tasks.workunit.client.0.vm03.stdout:1/874: truncate d4/d7b/f90 2517997 0 2026-03-09T16:14:59.165 INFO:tasks.workunit.client.0.vm03.stdout:4/978: creat d5/d17/d44/f12d x:0 0 0 2026-03-09T16:14:59.167 INFO:tasks.workunit.client.0.vm03.stdout:1/875: truncate d4/d6/d3b/d6b/f42 3054048 0 2026-03-09T16:14:59.170 INFO:tasks.workunit.client.0.vm03.stdout:7/914: dread d4/da/d5d/db0/d113/db8/f10a [0,4194304] 0 2026-03-09T16:14:59.175 INFO:tasks.workunit.client.0.vm03.stdout:6/902: dread d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f3e [0,4194304] 0 2026-03-09T16:14:59.176 INFO:tasks.workunit.client.0.vm03.stdout:1/876: dread d4/d6/da2/dea/f103 [0,4194304] 0 2026-03-09T16:14:59.179 INFO:tasks.workunit.client.0.vm03.stdout:4/979: chown d5/dd/f73 19 1 2026-03-09T16:14:59.181 INFO:tasks.workunit.client.0.vm03.stdout:7/915: rmdir d4/da/d5d/db0 39 2026-03-09T16:14:59.182 INFO:tasks.workunit.client.0.vm03.stdout:1/877: fsync d4/d6/f15 0 2026-03-09T16:14:59.182 INFO:tasks.workunit.client.0.vm03.stdout:7/916: dread - d4/da/d5d/dd8/d22/d24/d16/d6e/fa1 zero size 2026-03-09T16:14:59.185 INFO:tasks.workunit.client.0.vm03.stdout:2/940: sync 2026-03-09T16:14:59.193 INFO:tasks.workunit.client.0.vm03.stdout:4/980: rmdir d5/db/d25 39 2026-03-09T16:14:59.194 INFO:tasks.workunit.client.0.vm03.stdout:1/878: dread d4/d39/f5a [4194304,4194304] 0 2026-03-09T16:14:59.195 INFO:tasks.workunit.client.0.vm03.stdout:1/879: readlink d4/d6/d3b/d6b/da5/dc3/le2 0 2026-03-09T16:14:59.204 INFO:tasks.workunit.client.0.vm03.stdout:1/880: truncate d4/db/d8b/db2/fc6 653777 0 2026-03-09T16:14:59.207 INFO:tasks.workunit.client.0.vm03.stdout:4/981: dwrite d5/db/d25/d8b/da8/dbe/fd0 [0,4194304] 0 2026-03-09T16:14:59.208 INFO:tasks.workunit.client.0.vm03.stdout:4/982: chown d5/db/d25 0 1 2026-03-09T16:14:59.209 INFO:tasks.workunit.client.0.vm03.stdout:6/903: link d9/d14/ff5 d9/d42/d45/dfa/f128 0 2026-03-09T16:14:59.214 INFO:tasks.workunit.client.0.vm03.stdout:4/983: dread d5/db/d25/d8b/da8/df3/df7/fa1 [0,4194304] 0 2026-03-09T16:14:59.215 INFO:tasks.workunit.client.0.vm03.stdout:4/984: write d5/db/d25/d8b/da8/df3/f11a [178628,12985] 0 2026-03-09T16:14:59.219 INFO:tasks.workunit.client.0.vm03.stdout:1/881: readlink d4/d31/l52 0 2026-03-09T16:14:59.221 INFO:tasks.workunit.client.0.vm03.stdout:7/917: rename d4/da/d5d/dd8/d22/d24/d15/le0 to d4/l12f 0 2026-03-09T16:14:59.225 INFO:tasks.workunit.client.0.vm03.stdout:3/958: write d5/d53/d6c/f9f [1218263,111021] 0 2026-03-09T16:14:59.226 INFO:tasks.workunit.client.0.vm03.stdout:3/959: readlink d5/d53/d6c/l4f 0 2026-03-09T16:14:59.228 INFO:tasks.workunit.client.0.vm03.stdout:0/994: write d0/d7/d3e/d57/fe4 [197959,12287] 0 2026-03-09T16:14:59.233 INFO:tasks.workunit.client.0.vm03.stdout:1/882: unlink d4/d6/d1d/d20/fcc 0 2026-03-09T16:14:59.238 INFO:tasks.workunit.client.0.vm03.stdout:7/918: mkdir d4/da/d5d/dd8/d22/d24/d16/d3e/d114/d130 0 2026-03-09T16:14:59.240 INFO:tasks.workunit.client.0.vm03.stdout:3/960: creat d5/d53/d6c/f126 x:0 0 0 2026-03-09T16:14:59.244 INFO:tasks.workunit.client.0.vm03.stdout:0/995: dwrite d0/d7/d3e/d57/d5a/d52/d9f/fd3 [4194304,4194304] 0 2026-03-09T16:14:59.246 INFO:tasks.workunit.client.0.vm03.stdout:0/996: write d0/d7/d3e/d57/d5a/d5f/db2/fa2 [1458987,61339] 0 
2026-03-09T16:14:59.247 INFO:tasks.workunit.client.0.vm03.stdout:0/997: chown d0/d7/d3e/d57/d5a/d5f/db2/dab 13693 1 2026-03-09T16:14:59.258 INFO:tasks.workunit.client.0.vm03.stdout:4/985: link d5/dd/d1f/c11e d5/db/d25/d8b/da8/df3/df7/d4d/d5b/d72/d82/de4/c12e 0 2026-03-09T16:14:59.265 INFO:tasks.workunit.client.0.vm03.stdout:4/986: creat d5/db/d25/d8b/da8/df3/df7/d33/d79/f12f x:0 0 0 2026-03-09T16:14:59.267 INFO:tasks.workunit.client.0.vm03.stdout:2/941: rename db/d12/da5/dc2/d110/l13a to db/d12/d2a/d61/l142 0 2026-03-09T16:14:59.274 INFO:tasks.workunit.client.0.vm03.stdout:6/904: rename d9/d14/d71/fac to d9/d42/d45/d50/d80/d90/db7/f129 0 2026-03-09T16:14:59.275 INFO:tasks.workunit.client.0.vm03.stdout:6/905: chown d9/d42/d45/ddf 2705229 1 2026-03-09T16:14:59.276 INFO:tasks.workunit.client.0.vm03.stdout:7/919: write d4/da/f5f [5854938,124039] 0 2026-03-09T16:14:59.277 INFO:tasks.workunit.client.0.vm03.stdout:4/987: dread d5/d56/f6a [0,4194304] 0 2026-03-09T16:14:59.281 INFO:tasks.workunit.client.0.vm03.stdout:3/961: write d5/d6d/d6a/ff5 [2575517,84331] 0 2026-03-09T16:14:59.283 INFO:tasks.workunit.client.0.vm03.stdout:1/883: truncate d4/d6/d3b/d63/fdb 641983 0 2026-03-09T16:14:59.286 INFO:tasks.workunit.client.0.vm03.stdout:0/998: rename d0/d7/d3e/d57/d5a/f38 to d0/d7/d75/f158 0 2026-03-09T16:14:59.287 INFO:tasks.workunit.client.0.vm03.stdout:0/999: truncate d0/da/d1b/de0/f156 793889 0 2026-03-09T16:14:59.289 INFO:tasks.workunit.client.0.vm03.stdout:6/906: dread d9/d42/d45/d50/d80/d90/db7/ff1 [0,4194304] 0 2026-03-09T16:14:59.291 INFO:tasks.workunit.client.0.vm03.stdout:4/988: mknod d5/db/d25/d8b/da8/d81/dd4/c130 0 2026-03-09T16:14:59.293 INFO:tasks.workunit.client.0.vm03.stdout:2/942: write db/d12/d2a/d61/dbe/f10f [1012787,85256] 0 2026-03-09T16:14:59.294 INFO:tasks.workunit.client.0.vm03.stdout:3/962: creat d5/f127 x:0 0 0 2026-03-09T16:14:59.303 INFO:tasks.workunit.client.0.vm03.stdout:1/884: write d4/d31/d5c/f9e [1427976,3130] 0 2026-03-09T16:14:59.304 INFO:tasks.workunit.client.0.vm03.stdout:7/920: write d4/da/d5d/dd8/d22/d24/f41 [805065,110950] 0 2026-03-09T16:14:59.307 INFO:tasks.workunit.client.0.vm03.stdout:6/907: mkdir d9/d14/da5/d12a 0 2026-03-09T16:14:59.311 INFO:tasks.workunit.client.0.vm03.stdout:2/943: dread db/d12/d2a/d99/de7/df9/d64/dbd/fd8 [0,4194304] 0 2026-03-09T16:14:59.313 INFO:tasks.workunit.client.0.vm03.stdout:3/963: rmdir d5/d44 39 2026-03-09T16:14:59.314 INFO:tasks.workunit.client.0.vm03.stdout:3/964: read d5/d53/d6c/f9f [4185610,50531] 0 2026-03-09T16:14:59.319 INFO:tasks.workunit.client.0.vm03.stdout:4/989: mknod d5/db/d25/d8b/da8/df8/c131 0 2026-03-09T16:14:59.321 INFO:tasks.workunit.client.0.vm03.stdout:2/944: stat db/d12/f39 0 2026-03-09T16:14:59.322 INFO:tasks.workunit.client.0.vm03.stdout:3/965: rmdir d5/d6d/db9/df2 39 2026-03-09T16:14:59.323 INFO:tasks.workunit.client.0.vm03.stdout:1/885: symlink d4/d6/l11e 0 2026-03-09T16:14:59.325 INFO:tasks.workunit.client.0.vm03.stdout:7/921: mknod d4/da/d5d/c131 0 2026-03-09T16:14:59.327 INFO:tasks.workunit.client.0.vm03.stdout:4/990: dread - d5/dd/d1f/d95/f106 zero size 2026-03-09T16:14:59.329 INFO:tasks.workunit.client.0.vm03.stdout:2/945: fdatasync db/d12/d2a/d99/de7/df9/d52/fd0 0 2026-03-09T16:14:59.333 INFO:tasks.workunit.client.0.vm03.stdout:1/886: truncate d4/d39/d7f/fda 526882 0 2026-03-09T16:14:59.334 INFO:tasks.workunit.client.0.vm03.stdout:7/922: symlink d4/da/dbf/l132 0 2026-03-09T16:14:59.335 INFO:tasks.workunit.client.0.vm03.stdout:7/923: write d4/da/f42 [461661,7052] 0 2026-03-09T16:14:59.341 
INFO:tasks.workunit.client.0.vm03.stdout:6/908: getdents d9/d8e/def 0 2026-03-09T16:14:59.343 INFO:tasks.workunit.client.0.vm03.stdout:7/924: mkdir d4/da/d5d/dd8/d22/d24/d16/d3e/d114/df2/d133 0 2026-03-09T16:14:59.345 INFO:tasks.workunit.client.0.vm03.stdout:2/946: mkdir db/d12/d2a/d99/de7/df9/d64/dbd/d124/d129/d143 0 2026-03-09T16:14:59.346 INFO:tasks.workunit.client.0.vm03.stdout:2/947: readlink db/d12/d2a/d99/de7/df9/d64/dbd/da0/lac 0 2026-03-09T16:14:59.347 INFO:tasks.workunit.client.0.vm03.stdout:4/991: dread d5/db/d25/d8b/da8/df3/fd3 [0,4194304] 0 2026-03-09T16:14:59.347 INFO:tasks.workunit.client.0.vm03.stdout:3/966: dread d5/f11 [0,4194304] 0 2026-03-09T16:14:59.347 INFO:tasks.workunit.client.0.vm03.stdout:1/887: symlink d4/d6/l11f 0 2026-03-09T16:14:59.348 INFO:tasks.workunit.client.0.vm03.stdout:7/925: fdatasync d4/da/d45/d51/d36/ff6 0 2026-03-09T16:14:59.356 INFO:tasks.workunit.client.0.vm03.stdout:1/888: dread d4/d6/d3b/d63/f89 [0,4194304] 0 2026-03-09T16:14:59.357 INFO:tasks.workunit.client.0.vm03.stdout:1/889: chown d4/d6/d3b/d6b/da5/la7 25716 1 2026-03-09T16:14:59.358 INFO:tasks.workunit.client.0.vm03.stdout:3/967: write d5/fc8 [419592,99434] 0 2026-03-09T16:14:59.363 INFO:tasks.workunit.client.0.vm03.stdout:1/890: dread d4/d39/d7f/f88 [0,4194304] 0 2026-03-09T16:14:59.364 INFO:tasks.workunit.client.0.vm03.stdout:4/992: mkdir d5/db/d132 0 2026-03-09T16:14:59.368 INFO:tasks.workunit.client.0.vm03.stdout:6/909: getdents d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/dad 0 2026-03-09T16:14:59.369 INFO:tasks.workunit.client.0.vm03.stdout:4/993: creat d5/dd/d1f/f133 x:0 0 0 2026-03-09T16:14:59.372 INFO:tasks.workunit.client.0.vm03.stdout:7/926: dwrite d4/da/d5d/dd8/d22/d24/d16/d6e/d7e/fe7 [4194304,4194304] 0 2026-03-09T16:14:59.381 INFO:tasks.workunit.client.0.vm03.stdout:2/948: write db/d12/d2a/d99/de7/df9/fe9 [903981,11012] 0 2026-03-09T16:14:59.381 INFO:tasks.workunit.client.0.vm03.stdout:3/968: symlink d5/d6d/db9/l128 0 2026-03-09T16:14:59.381 INFO:tasks.workunit.client.0.vm03.stdout:1/891: mkdir d4/d6/d3b/d6b/d120 0 2026-03-09T16:14:59.383 INFO:tasks.workunit.client.0.vm03.stdout:3/969: truncate d5/d6d/d6a/dbd/ff4 758197 0 2026-03-09T16:14:59.384 INFO:tasks.workunit.client.0.vm03.stdout:3/970: stat d5/d1e/d42/d8b/ddf 0 2026-03-09T16:14:59.389 INFO:tasks.workunit.client.0.vm03.stdout:3/971: dwrite d5/d6d/d6a/fa9 [4194304,4194304] 0 2026-03-09T16:14:59.395 INFO:tasks.workunit.client.0.vm03.stdout:3/972: write d5/f113 [987219,64311] 0 2026-03-09T16:14:59.396 INFO:tasks.workunit.client.0.vm03.stdout:6/910: mknod d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/c12b 0 2026-03-09T16:14:59.396 INFO:tasks.workunit.client.0.vm03.stdout:7/927: chown d4/da/d5d/db0/d113/db8/f12d 74015459 1 2026-03-09T16:14:59.404 INFO:tasks.workunit.client.0.vm03.stdout:7/928: truncate d4/d2d/d4b/f9f 1922118 0 2026-03-09T16:14:59.404 INFO:tasks.workunit.client.0.vm03.stdout:2/949: mkdir db/d12/d2a/d99/de7/df9/d64/dbd/d124/d129/d143/d144 0 2026-03-09T16:14:59.405 INFO:tasks.workunit.client.0.vm03.stdout:1/892: dread d4/d6/d1d/d20/d5f/f57 [0,4194304] 0 2026-03-09T16:14:59.407 INFO:tasks.workunit.client.0.vm03.stdout:1/893: dread - d4/d6/d3b/d6b/da5/dc0/f118 zero size 2026-03-09T16:14:59.408 INFO:tasks.workunit.client.0.vm03.stdout:1/894: readlink d4/d39/l51 0 2026-03-09T16:14:59.412 INFO:tasks.workunit.client.0.vm03.stdout:6/911: dread d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f3f [0,4194304] 0 2026-03-09T16:14:59.415 INFO:tasks.workunit.client.0.vm03.stdout:2/950: dwrite 
db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/f140 [0,4194304] 0 2026-03-09T16:14:59.423 INFO:tasks.workunit.client.0.vm03.stdout:2/951: dwrite db/d12/d2a/d61/d6d/f8f [0,4194304] 0 2026-03-09T16:14:59.433 INFO:tasks.workunit.client.0.vm03.stdout:4/994: write d5/d17/f80 [308270,121916] 0 2026-03-09T16:14:59.442 INFO:tasks.workunit.client.0.vm03.stdout:2/952: mkdir db/d12/d2a/d99/d145 0 2026-03-09T16:14:59.443 INFO:tasks.workunit.client.0.vm03.stdout:2/953: chown db/d12/d2a/d99 2313 1 2026-03-09T16:14:59.445 INFO:tasks.workunit.client.0.vm03.stdout:1/895: symlink d4/d39/d70/df0/l121 0 2026-03-09T16:14:59.452 INFO:tasks.workunit.client.0.vm03.stdout:1/896: mknod d4/d31/d5c/da8/da1/c122 0 2026-03-09T16:14:59.453 INFO:tasks.workunit.client.0.vm03.stdout:1/897: dread - d4/d6/d3b/d6b/da5/dc0/f118 zero size 2026-03-09T16:14:59.453 INFO:tasks.workunit.client.0.vm03.stdout:1/898: chown d4/d6/d1d/d20/d93/c86 134845780 1 2026-03-09T16:14:59.454 INFO:tasks.workunit.client.0.vm03.stdout:1/899: chown d4/d6/d1d/d20/d93/c86 17 1 2026-03-09T16:14:59.454 INFO:tasks.workunit.client.0.vm03.stdout:1/900: chown d4/d31 62984577 1 2026-03-09T16:14:59.457 INFO:tasks.workunit.client.0.vm03.stdout:6/912: rename d9/d42/l60 to d9/d42/l12c 0 2026-03-09T16:14:59.461 INFO:tasks.workunit.client.0.vm03.stdout:7/929: link d4/da/c25 d4/da/d5d/db0/d113/dfb/c134 0 2026-03-09T16:14:59.462 INFO:tasks.workunit.client.0.vm03.stdout:1/901: truncate d4/d6/d3b/f98 552547 0 2026-03-09T16:14:59.466 INFO:tasks.workunit.client.0.vm03.stdout:7/930: dwrite d4/da/d5d/dd8/f6a [0,4194304] 0 2026-03-09T16:14:59.471 INFO:tasks.workunit.client.0.vm03.stdout:7/931: creat d4/da/dbf/f135 x:0 0 0 2026-03-09T16:14:59.473 INFO:tasks.workunit.client.0.vm03.stdout:7/932: symlink d4/da/d5d/dd8/d22/d24/d16/d3e/d114/df2/l136 0 2026-03-09T16:14:59.474 INFO:tasks.workunit.client.0.vm03.stdout:7/933: dread - d4/da/d5d/db0/d9d/f125 zero size 2026-03-09T16:14:59.483 INFO:tasks.workunit.client.0.vm03.stdout:7/934: creat d4/da/d5d/db0/d113/f137 x:0 0 0 2026-03-09T16:14:59.484 INFO:tasks.workunit.client.0.vm03.stdout:7/935: symlink d4/da/d5d/db0/d61/dca/l138 0 2026-03-09T16:14:59.485 INFO:tasks.workunit.client.0.vm03.stdout:7/936: mknod d4/da/dbf/deb/c139 0 2026-03-09T16:14:59.489 INFO:tasks.workunit.client.0.vm03.stdout:7/937: dwrite d4/da/d5d/dd8/d22/d24/f11d [0,4194304] 0 2026-03-09T16:14:59.492 INFO:tasks.workunit.client.0.vm03.stdout:7/938: dread - d4/da/dbf/f135 zero size 2026-03-09T16:14:59.493 INFO:tasks.workunit.client.0.vm03.stdout:7/939: fdatasync d4/d2d/f90 0 2026-03-09T16:14:59.497 INFO:tasks.workunit.client.0.vm03.stdout:7/940: dwrite d4/d2d/d4b/f12e [0,4194304] 0 2026-03-09T16:14:59.504 INFO:tasks.workunit.client.0.vm03.stdout:3/973: dwrite d5/d44/d61/f120 [0,4194304] 0 2026-03-09T16:14:59.513 INFO:tasks.workunit.client.0.vm03.stdout:7/941: symlink d4/da/d5d/dd8/d22/d24/d15/d71/db7/l13a 0 2026-03-09T16:14:59.520 INFO:tasks.workunit.client.0.vm03.stdout:4/995: dwrite d5/dd/d1f/d95/fad [0,4194304] 0 2026-03-09T16:14:59.524 INFO:tasks.workunit.client.0.vm03.stdout:2/954: write db/d12/d2a/d61/f5c [199352,90522] 0 2026-03-09T16:14:59.533 INFO:tasks.workunit.client.0.vm03.stdout:6/913: truncate d9/d42/d45/d50/d80/d8a/d9c/fe7 3241649 0 2026-03-09T16:14:59.533 INFO:tasks.workunit.client.0.vm03.stdout:7/942: truncate d4/da/d5d/dd8/d22/d24/d16/d6e/f73 1124856 0 2026-03-09T16:14:59.533 INFO:tasks.workunit.client.0.vm03.stdout:3/974: creat d5/d6d/d6a/dbd/f129 x:0 0 0 2026-03-09T16:14:59.534 INFO:tasks.workunit.client.0.vm03.stdout:3/975: write 
d5/d1e/d42/d34/dd2/ff7 [707122,34594] 0 2026-03-09T16:14:59.538 INFO:tasks.workunit.client.0.vm03.stdout:1/902: dwrite d4/d6/d3b/d63/fdb [0,4194304] 0 2026-03-09T16:14:59.549 INFO:tasks.workunit.client.0.vm03.stdout:2/955: creat db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3/df7/f146 x:0 0 0 2026-03-09T16:14:59.551 INFO:tasks.workunit.client.0.vm03.stdout:7/943: fdatasync d4/f26 0 2026-03-09T16:14:59.553 INFO:tasks.workunit.client.0.vm03.stdout:1/903: fsync d4/d6/d1d/d20/d23/f9f 0 2026-03-09T16:14:59.555 INFO:tasks.workunit.client.0.vm03.stdout:7/944: read d4/f8 [153217,117122] 0 2026-03-09T16:14:59.560 INFO:tasks.workunit.client.0.vm03.stdout:4/996: link d5/db/d25/dc8/dd2/dd1/ce0 d5/d17/c134 0 2026-03-09T16:14:59.562 INFO:tasks.workunit.client.0.vm03.stdout:2/956: dread db/d12/d2a/d99/de7/df9/f7b [0,4194304] 0 2026-03-09T16:14:59.576 INFO:tasks.workunit.client.0.vm03.stdout:7/945: truncate d4/da/d5d/db0/d9d/fac 2333749 0 2026-03-09T16:14:59.576 INFO:tasks.workunit.client.0.vm03.stdout:2/957: chown db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4/c126 129013 1 2026-03-09T16:14:59.576 INFO:tasks.workunit.client.0.vm03.stdout:2/958: mkdir db/d12/d2a/d61/dca/d147 0 2026-03-09T16:14:59.576 INFO:tasks.workunit.client.0.vm03.stdout:2/959: dread db/d12/d2a/d99/de7/df9/d64/f80 [4194304,4194304] 0 2026-03-09T16:14:59.577 INFO:tasks.workunit.client.0.vm03.stdout:2/960: chown db/d12/d2a/d61/d6d/c13c 189 1 2026-03-09T16:14:59.608 INFO:tasks.workunit.client.0.vm03.stdout:6/914: sync 2026-03-09T16:14:59.615 INFO:tasks.workunit.client.0.vm03.stdout:3/976: read d5/d1e/d42/d34/dd2/ff7 [562178,29333] 0 2026-03-09T16:14:59.617 INFO:tasks.workunit.client.0.vm03.stdout:3/977: truncate d5/d1e/d42/f99 2800934 0 2026-03-09T16:14:59.621 INFO:tasks.workunit.client.0.vm03.stdout:3/978: dwrite d5/d53/d88/dd7/df1/ffe [0,4194304] 0 2026-03-09T16:14:59.622 INFO:tasks.workunit.client.0.vm03.stdout:6/915: dread d9/d14/d71/f95 [0,4194304] 0 2026-03-09T16:14:59.633 INFO:tasks.workunit.client.0.vm03.stdout:7/946: write d4/f26 [6615649,67510] 0 2026-03-09T16:14:59.635 INFO:tasks.workunit.client.0.vm03.stdout:7/947: stat d4/da/d5d/db0/d61/dca/l138 0 2026-03-09T16:14:59.637 INFO:tasks.workunit.client.0.vm03.stdout:1/904: dwrite d4/d6/d3b/d63/fdb [4194304,4194304] 0 2026-03-09T16:14:59.637 INFO:tasks.workunit.client.0.vm03.stdout:4/997: dwrite d5/d17/f83 [0,4194304] 0 2026-03-09T16:14:59.645 INFO:tasks.workunit.client.0.vm03.stdout:3/979: creat d5/d53/d6c/d79/f12a x:0 0 0 2026-03-09T16:14:59.649 INFO:tasks.workunit.client.0.vm03.stdout:7/948: mkdir d4/da/d5d/dd8/d22/d24/d16/d6e/d13b 0 2026-03-09T16:14:59.656 INFO:tasks.workunit.client.0.vm03.stdout:6/916: creat d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/d125/df0/f12d x:0 0 0 2026-03-09T16:14:59.659 INFO:tasks.workunit.client.0.vm03.stdout:6/917: dread d9/d14/f44 [0,4194304] 0 2026-03-09T16:14:59.665 INFO:tasks.workunit.client.0.vm03.stdout:4/998: truncate d5/db/d25/f78 5271514 0 2026-03-09T16:14:59.669 INFO:tasks.workunit.client.0.vm03.stdout:6/918: creat d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f12e x:0 0 0 2026-03-09T16:14:59.673 INFO:tasks.workunit.client.0.vm03.stdout:6/919: stat d9/d14/ff5 0 2026-03-09T16:14:59.676 INFO:tasks.workunit.client.0.vm03.stdout:6/920: creat d9/d42/d45/d65/dbf/dc9/de0/f12f x:0 0 0 2026-03-09T16:14:59.676 INFO:tasks.workunit.client.0.vm03.stdout:6/921: chown d9/d42/lab 5 1 2026-03-09T16:14:59.678 INFO:tasks.workunit.client.0.vm03.stdout:6/922: rename d9/d8e/def/c118 to d9/d8e/def/d112/c130 0 2026-03-09T16:14:59.680 
INFO:tasks.workunit.client.0.vm03.stdout:6/923: mkdir d9/d84/d123/d131 0 2026-03-09T16:14:59.681 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:59 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:59.681 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:59 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:59.681 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:59 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:14:59.681 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:59 vm05.local ceph-mon[58702]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:14:59.681 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:59 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:59.681 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:59 vm05.local ceph-mon[58702]: Upgrade: Updating mgr.vm05.dygxfv 2026-03-09T16:14:59.681 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:59 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.dygxfv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:14:59.681 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:59 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:14:59.681 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:59 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:14:59.681 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:59 vm05.local ceph-mon[58702]: Deploying daemon mgr.vm05.dygxfv on vm05 2026-03-09T16:14:59.681 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:14:59 vm05.local ceph-mon[58702]: pgmap v9: 65 pgs: 65 active+clean; 2.0 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 34 MiB/s rd, 60 MiB/s wr, 201 op/s 2026-03-09T16:14:59.681 INFO:tasks.workunit.client.0.vm03.stdout:4/999: sync 2026-03-09T16:14:59.684 INFO:tasks.workunit.client.0.vm03.stdout:6/924: dwrite d9/d42/d45/d50/d80/d8a/dc1/dd4/fea [0,4194304] 0 2026-03-09T16:14:59.693 INFO:tasks.workunit.client.0.vm03.stdout:6/925: rename d9/d42/d45/fd5 to d9/d42/d45/d50/d80/d90/f132 0 2026-03-09T16:14:59.719 INFO:tasks.workunit.client.0.vm03.stdout:2/961: write db/f2d [325688,94644] 0 2026-03-09T16:14:59.731 INFO:tasks.workunit.client.0.vm03.stdout:1/905: dwrite d4/d39/d7f/fda [0,4194304] 0 2026-03-09T16:14:59.739 INFO:tasks.workunit.client.0.vm03.stdout:3/980: dwrite d5/d6d/d5a/f78 [0,4194304] 0 2026-03-09T16:14:59.745 INFO:tasks.workunit.client.0.vm03.stdout:7/949: truncate d4/da/d45/d51/f50 3219966 0 2026-03-09T16:14:59.748 INFO:tasks.workunit.client.0.vm03.stdout:1/906: truncate d4/d6/da2/fbb 661016 0 2026-03-09T16:14:59.755 INFO:tasks.workunit.client.0.vm03.stdout:2/962: creat db/d12/da5/f148 x:0 0 0 2026-03-09T16:14:59.760 INFO:tasks.workunit.client.0.vm03.stdout:2/963: dwrite db/d12/d2a/f5f [4194304,4194304] 0 2026-03-09T16:14:59.761 
INFO:tasks.workunit.client.0.vm03.stdout:7/950: symlink d4/da/d5d/db0/l13c 0 2026-03-09T16:14:59.770 INFO:tasks.workunit.client.0.vm03.stdout:6/926: dwrite d9/d8e/fd3 [0,4194304] 0 2026-03-09T16:14:59.779 INFO:tasks.workunit.client.0.vm03.stdout:6/927: write d9/d8e/def/f10b [1770846,61266] 0 2026-03-09T16:14:59.780 INFO:tasks.workunit.client.0.vm03.stdout:6/928: readlink d9/d42/d45/d50/d80/d90/db7/lbc 0 2026-03-09T16:14:59.789 INFO:tasks.workunit.client.0.vm03.stdout:2/964: dread db/d12/d2a/d61/f9d [0,4194304] 0 2026-03-09T16:14:59.790 INFO:tasks.workunit.client.0.vm03.stdout:2/965: mknod db/d12/d11a/c149 0 2026-03-09T16:14:59.793 INFO:tasks.workunit.client.0.vm03.stdout:2/966: unlink db/d12/d2a/d99/de7/df9/d64/dbd/da0/fcb 0 2026-03-09T16:14:59.796 INFO:tasks.workunit.client.0.vm03.stdout:2/967: rename db/d12/d2a/d99/de7/df9/lfe to db/d12/da5/dbb/dc3/d133/d130/l14a 0 2026-03-09T16:14:59.797 INFO:tasks.workunit.client.0.vm03.stdout:2/968: mknod db/d12/da5/dbb/c14b 0 2026-03-09T16:14:59.810 INFO:tasks.workunit.client.0.vm03.stdout:3/981: dwrite d5/d1e/d42/f74 [0,4194304] 0 2026-03-09T16:14:59.812 INFO:tasks.workunit.client.0.vm03.stdout:1/907: write d4/d6/d3b/f95 [1296290,95001] 0 2026-03-09T16:14:59.814 INFO:tasks.workunit.client.0.vm03.stdout:1/908: fdatasync d4/d31/d5c/f9e 0 2026-03-09T16:14:59.816 INFO:tasks.workunit.client.0.vm03.stdout:1/909: chown d4/d31/d5c/da8 121218322 1 2026-03-09T16:14:59.817 INFO:tasks.workunit.client.0.vm03.stdout:6/929: write d9/f40 [2160337,94477] 0 2026-03-09T16:14:59.818 INFO:tasks.workunit.client.0.vm03.stdout:7/951: dwrite d4/da/d5d/f9b [4194304,4194304] 0 2026-03-09T16:14:59.825 INFO:tasks.workunit.client.0.vm03.stdout:2/969: write db/d12/d2a/d61/d79/f100 [19773,64295] 0 2026-03-09T16:14:59.825 INFO:tasks.workunit.client.0.vm03.stdout:3/982: creat d5/d1e/f12b x:0 0 0 2026-03-09T16:14:59.830 INFO:tasks.workunit.client.0.vm03.stdout:2/970: stat db/d12/d2a/d99/de7/df9/d64/dbd/l12e 0 2026-03-09T16:14:59.843 INFO:tasks.workunit.client.0.vm03.stdout:2/971: chown db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/db3 687385466 1 2026-03-09T16:14:59.843 INFO:tasks.workunit.client.0.vm03.stdout:2/972: readlink db/d12/d2a/d99/de7/df9/d64/dbd/da0/lac 0 2026-03-09T16:14:59.843 INFO:tasks.workunit.client.0.vm03.stdout:7/952: link d4/d2d/d4b/l86 d4/da/d5d/dd8/d22/d24/d16/d3e/d114/d10c/l13d 0 2026-03-09T16:14:59.843 INFO:tasks.workunit.client.0.vm03.stdout:2/973: symlink db/d12/da5/dc2/dc9/l14c 0 2026-03-09T16:14:59.843 INFO:tasks.workunit.client.0.vm03.stdout:3/983: mkdir d5/d6d/db9/df2/dae/d12c 0 2026-03-09T16:14:59.843 INFO:tasks.workunit.client.0.vm03.stdout:3/984: unlink d5/d2e/fd4 0 2026-03-09T16:14:59.843 INFO:tasks.workunit.client.0.vm03.stdout:7/953: creat d4/da/d5d/dd8/d22/d24/d16/d3e/d114/df2/d133/f13e x:0 0 0 2026-03-09T16:14:59.843 INFO:tasks.workunit.client.0.vm03.stdout:3/985: creat d5/d53/d88/dd7/f12d x:0 0 0 2026-03-09T16:14:59.843 INFO:tasks.workunit.client.0.vm03.stdout:2/974: read db/d12/d2a/d61/f47 [4676963,111798] 0 2026-03-09T16:14:59.848 INFO:tasks.workunit.client.0.vm03.stdout:2/975: dread db/f34 [0,4194304] 0 2026-03-09T16:14:59.852 INFO:tasks.workunit.client.0.vm03.stdout:2/976: getdents db/d12/d2a/d99/de7/df9/d64/dbd/da0 0 2026-03-09T16:14:59.856 INFO:tasks.workunit.client.0.vm03.stdout:2/977: creat db/d12/d2a/d99/de7/df9/d64/dbd/da0/db6/f14d x:0 0 0 2026-03-09T16:14:59.857 INFO:tasks.workunit.client.0.vm03.stdout:2/978: readlink db/d12/d2a/d61/la2 0 2026-03-09T16:14:59.857 INFO:tasks.workunit.client.0.vm03.stdout:1/910: sync 
2026-03-09T16:14:59.876 INFO:tasks.workunit.client.0.vm03.stdout:6/930: dwrite d9/f73 [0,4194304] 0 2026-03-09T16:14:59.876 INFO:tasks.workunit.client.0.vm03.stdout:3/986: write d5/d1e/d42/d8b/fd0 [588923,106175] 0 2026-03-09T16:14:59.878 INFO:tasks.workunit.client.0.vm03.stdout:7/954: truncate d4/da/dbf/f105 3726100 0 2026-03-09T16:14:59.889 INFO:tasks.workunit.client.0.vm03.stdout:1/911: dwrite d4/d6/d3b/f98 [0,4194304] 0 2026-03-09T16:14:59.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:59 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:59.895 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:59 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:14:59.895 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:59 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:14:59.895 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:59 vm03.local ceph-mon[51019]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:14:59.895 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:59 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:14:59.895 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:59 vm03.local ceph-mon[51019]: Upgrade: Updating mgr.vm05.dygxfv 2026-03-09T16:14:59.895 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:59 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.dygxfv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:14:59.895 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:59 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:14:59.895 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:59 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:14:59.895 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:59 vm03.local ceph-mon[51019]: Deploying daemon mgr.vm05.dygxfv on vm05 2026-03-09T16:14:59.895 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:14:59 vm03.local ceph-mon[51019]: pgmap v9: 65 pgs: 65 active+clean; 2.0 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 34 MiB/s rd, 60 MiB/s wr, 201 op/s 2026-03-09T16:14:59.896 INFO:tasks.workunit.client.0.vm03.stdout:2/979: dwrite f0 [0,4194304] 0 2026-03-09T16:14:59.896 INFO:tasks.workunit.client.0.vm03.stdout:2/980: truncate db/d12/fe8 2090599 0 2026-03-09T16:14:59.896 INFO:tasks.workunit.client.0.vm03.stdout:2/981: stat db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/da4/c126 0 2026-03-09T16:14:59.906 INFO:tasks.workunit.client.0.vm03.stdout:7/955: dread d4/f3b [4194304,4194304] 0 2026-03-09T16:14:59.910 INFO:tasks.workunit.client.0.vm03.stdout:2/982: rename db/l3a to db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad/l14e 0 2026-03-09T16:14:59.911 INFO:tasks.workunit.client.0.vm03.stdout:7/956: stat d4/da/d5d/dd8/d22/d24/d15/d71/db7/l112 0 2026-03-09T16:14:59.912 
INFO:tasks.workunit.client.0.vm03.stdout:1/912: read d4/d6/d1d/d20/d93/f85 [1134256,70871] 0 2026-03-09T16:14:59.913 INFO:tasks.workunit.client.0.vm03.stdout:2/983: creat db/d12/d2a/d99/de7/df9/d64/dbd/da0/d112/f14f x:0 0 0 2026-03-09T16:14:59.913 INFO:tasks.workunit.client.0.vm03.stdout:2/984: chown db/d12/d2a/d61/d6d/c13c 137925 1 2026-03-09T16:14:59.914 INFO:tasks.workunit.client.0.vm03.stdout:1/913: symlink d4/d31/l123 0 2026-03-09T16:14:59.915 INFO:tasks.workunit.client.0.vm03.stdout:2/985: mkdir db/d12/da5/de4/d150 0 2026-03-09T16:14:59.916 INFO:tasks.workunit.client.0.vm03.stdout:2/986: chown db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/d94/dad 171384 1 2026-03-09T16:14:59.917 INFO:tasks.workunit.client.0.vm03.stdout:1/914: creat d4/d6/d1d/d20/d23/d112/f124 x:0 0 0 2026-03-09T16:14:59.925 INFO:tasks.workunit.client.0.vm03.stdout:2/987: read db/d12/d2a/d61/d6d/f91 [4034110,40378] 0 2026-03-09T16:14:59.927 INFO:tasks.workunit.client.0.vm03.stdout:2/988: dread db/d12/da5/dc2/d110/f123 [0,4194304] 0 2026-03-09T16:14:59.928 INFO:tasks.workunit.client.0.vm03.stdout:2/989: creat db/d12/d11a/f151 x:0 0 0 2026-03-09T16:14:59.932 INFO:tasks.workunit.client.0.vm03.stdout:7/957: sync 2026-03-09T16:14:59.957 INFO:tasks.workunit.client.0.vm03.stdout:6/931: dwrite d9/d42/d45/d50/d80/d8a/d9c/d97/faf [0,4194304] 0 2026-03-09T16:14:59.968 INFO:tasks.workunit.client.0.vm03.stdout:3/987: write d5/d6d/db9/df2/dbe/f107 [7974632,40894] 0 2026-03-09T16:14:59.969 INFO:tasks.workunit.client.0.vm03.stdout:3/988: dread d5/d6d/d6a/dbd/ff4 [0,4194304] 0 2026-03-09T16:14:59.976 INFO:tasks.workunit.client.0.vm03.stdout:6/932: mknod d9/d84/d123/d131/c133 0 2026-03-09T16:14:59.979 INFO:tasks.workunit.client.0.vm03.stdout:3/989: rename d5/d1e/d42/f29 to d5/d53/d6c/d79/d91/f12e 0 2026-03-09T16:14:59.979 INFO:tasks.workunit.client.0.vm03.stdout:6/933: dwrite d9/d42/d45/ffd [0,4194304] 0 2026-03-09T16:14:59.980 INFO:tasks.workunit.client.0.vm03.stdout:3/990: stat d5/d1e/d42/d55/cb1 0 2026-03-09T16:14:59.981 INFO:tasks.workunit.client.0.vm03.stdout:3/991: write d5/d1e/f118 [938074,64476] 0 2026-03-09T16:14:59.994 INFO:tasks.workunit.client.0.vm03.stdout:3/992: mkdir d5/d53/d88/dd7/d12f 0 2026-03-09T16:14:59.998 INFO:tasks.workunit.client.0.vm03.stdout:1/915: dwrite d4/d39/d70/f97 [0,4194304] 0 2026-03-09T16:15:00.002 INFO:tasks.workunit.client.0.vm03.stdout:6/934: rename d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f4e to d9/d42/d45/d65/f134 0 2026-03-09T16:15:00.012 INFO:tasks.workunit.client.0.vm03.stdout:2/990: write db/d12/f85 [1366545,124512] 0 2026-03-09T16:15:00.013 INFO:tasks.workunit.client.0.vm03.stdout:2/991: dread - db/d12/d2a/d99/de7/df9/d64/f66 zero size 2026-03-09T16:15:00.016 INFO:tasks.workunit.client.0.vm03.stdout:3/993: rename d5/d53/d6c/l68 to d5/d1e/d42/d8b/l130 0 2026-03-09T16:15:00.024 INFO:tasks.workunit.client.0.vm03.stdout:6/935: chown d9/d42/d45/d50/d80/d8a/d9c/d97/da8/dbd/ldd 3543 1 2026-03-09T16:15:00.024 INFO:tasks.workunit.client.0.vm03.stdout:7/958: dwrite d4/da/d5d/dd8/d22/f33 [0,4194304] 0 2026-03-09T16:15:00.024 INFO:tasks.workunit.client.0.vm03.stdout:7/959: stat d4/da/dbf/deb/d100 0 2026-03-09T16:15:00.028 INFO:tasks.workunit.client.0.vm03.stdout:2/992: readlink db/d12/l32 0 2026-03-09T16:15:00.028 INFO:tasks.workunit.client.0.vm03.stdout:1/916: creat d4/d31/d5c/d111/f125 x:0 0 0 2026-03-09T16:15:00.034 INFO:tasks.workunit.client.0.vm03.stdout:3/994: truncate d5/d1e/f26 502875 0 2026-03-09T16:15:00.036 INFO:tasks.workunit.client.0.vm03.stdout:6/936: fdatasync d9/d42/d45/d50/f124 0 
2026-03-09T16:15:00.041 INFO:tasks.workunit.client.0.vm03.stdout:1/917: creat d4/d6/d3b/d6b/da5/f126 x:0 0 0 2026-03-09T16:15:00.047 INFO:tasks.workunit.client.0.vm03.stdout:2/993: mkdir db/d12/d2a/d99/de7/df9/d64/dbd/da0/d152 0 2026-03-09T16:15:00.047 INFO:tasks.workunit.client.0.vm03.stdout:3/995: rename d5/d53/d88/c10e to d5/d44/d61/d122/c131 0 2026-03-09T16:15:00.047 INFO:tasks.workunit.client.0.vm03.stdout:1/918: mkdir d4/d39/d70/d127 0 2026-03-09T16:15:00.047 INFO:tasks.workunit.client.0.vm03.stdout:1/919: write d4/fa [2203383,3789] 0 2026-03-09T16:15:00.048 INFO:tasks.workunit.client.0.vm03.stdout:1/920: dread - d4/d6/d1d/f11a zero size 2026-03-09T16:15:00.052 INFO:tasks.workunit.client.0.vm03.stdout:1/921: dwrite d4/d6/d3b/d6b/da5/f126 [0,4194304] 0 2026-03-09T16:15:00.064 INFO:tasks.workunit.client.0.vm03.stdout:6/937: dwrite d9/d84/f91 [0,4194304] 0 2026-03-09T16:15:00.066 INFO:tasks.workunit.client.0.vm03.stdout:2/994: write db/d12/d2a/d61/d6d/f81 [1144903,22993] 0 2026-03-09T16:15:00.074 INFO:tasks.workunit.client.0.vm03.stdout:6/938: creat d9/d8e/def/f135 x:0 0 0 2026-03-09T16:15:00.078 INFO:tasks.workunit.client.0.vm03.stdout:3/996: dwrite d5/d53/d88/dd3/f11f [0,4194304] 0 2026-03-09T16:15:00.079 INFO:tasks.workunit.client.0.vm03.stdout:6/939: dread d9/d42/f9a [0,4194304] 0 2026-03-09T16:15:00.080 INFO:tasks.workunit.client.0.vm03.stdout:3/997: readlink d5/d6d/d6a/dbd/lfd 0 2026-03-09T16:15:00.083 INFO:tasks.workunit.client.0.vm03.stdout:6/940: chown d9/d42/d45/d50/f124 2005343 1 2026-03-09T16:15:00.086 INFO:tasks.workunit.client.0.vm03.stdout:7/960: getdents d4/da/d45/d51/d36/d66 0 2026-03-09T16:15:00.088 INFO:tasks.workunit.client.0.vm03.stdout:7/961: stat d4/da/d5d/db0/d113/de8 0 2026-03-09T16:15:00.088 INFO:tasks.workunit.client.0.vm03.stdout:1/922: creat d4/d6/d3b/d6b/d25/d50/de3/f128 x:0 0 0 2026-03-09T16:15:00.089 INFO:tasks.workunit.client.0.vm03.stdout:7/962: chown d4/da/d45/d51/d36/c9c 1943 1 2026-03-09T16:15:00.093 INFO:tasks.workunit.client.0.vm03.stdout:2/995: dread db/f14 [0,4194304] 0 2026-03-09T16:15:00.096 INFO:tasks.workunit.client.0.vm03.stdout:3/998: rename d5/d53/d6c/c4e to d5/d1e/d42/d34/dd2/c132 0 2026-03-09T16:15:00.096 INFO:tasks.workunit.client.0.vm03.stdout:1/923: dwrite d4/d6/d1d/f11a [0,4194304] 0 2026-03-09T16:15:00.102 INFO:tasks.workunit.client.0.vm03.stdout:7/963: mknod d4/da/d5d/dd8/d22/d24/d16/d3e/d114/c13f 0 2026-03-09T16:15:00.108 INFO:tasks.workunit.client.0.vm03.stdout:7/964: mknod d4/da/d5d/dd8/d22/d24/d16/d3e/d77/c140 0 2026-03-09T16:15:00.112 INFO:tasks.workunit.client.0.vm03.stdout:6/941: getdents d9/d84 0 2026-03-09T16:15:00.112 INFO:tasks.workunit.client.0.vm03.stdout:7/965: symlink d4/d2d/l141 0 2026-03-09T16:15:00.113 INFO:tasks.workunit.client.0.vm03.stdout:7/966: chown d4/da/d5d/db0/d61/fd2 28686487 1 2026-03-09T16:15:00.116 INFO:tasks.workunit.client.0.vm03.stdout:7/967: stat d4/da/d5d/dd8/d22/d24/d16/lb4 0 2026-03-09T16:15:00.120 INFO:tasks.workunit.client.0.vm03.stdout:7/968: dwrite d4/da/d45/d51/f91 [0,4194304] 0 2026-03-09T16:15:00.126 INFO:tasks.workunit.client.0.vm03.stdout:7/969: chown d4/da/d5d/db0/la2 0 1 2026-03-09T16:15:00.145 INFO:tasks.workunit.client.0.vm03.stdout:6/942: sync 2026-03-09T16:15:00.146 INFO:tasks.workunit.client.0.vm03.stdout:6/943: fsync d9/d42/d45/dfa/f128 0 2026-03-09T16:15:00.148 INFO:tasks.workunit.client.0.vm03.stdout:6/944: creat d9/d42/d45/dfa/f136 x:0 0 0 2026-03-09T16:15:00.149 INFO:tasks.workunit.client.0.vm03.stdout:6/945: chown d9/d42/lab 3026 1 2026-03-09T16:15:00.151 
INFO:tasks.workunit.client.0.vm03.stdout:6/946: getdents d9/d42/d45/d50/d80/d8a/dc1/dd4/de5/dfe/d110 0 2026-03-09T16:15:00.156 INFO:tasks.workunit.client.0.vm03.stdout:6/947: link d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f27 d9/d42/d45/d65/dbf/dc9/f137 0 2026-03-09T16:15:00.160 INFO:tasks.workunit.client.0.vm03.stdout:3/999: dwrite d5/d1e/f66 [4194304,4194304] 0 2026-03-09T16:15:00.162 INFO:tasks.workunit.client.0.vm03.stdout:2/996: write db/d12/d2a/d61/f65 [339945,71458] 0 2026-03-09T16:15:00.167 INFO:tasks.workunit.client.0.vm03.stdout:1/924: dwrite d4/d6/f15 [0,4194304] 0 2026-03-09T16:15:00.180 INFO:tasks.workunit.client.0.vm03.stdout:7/970: write d4/d2d/fd5 [4249715,13516] 0 2026-03-09T16:15:00.181 INFO:tasks.workunit.client.0.vm03.stdout:7/971: dread - d4/da/d5d/dd8/d22/d24/d16/d6e/fa1 zero size 2026-03-09T16:15:00.183 INFO:tasks.workunit.client.0.vm03.stdout:7/972: dread - d4/da/d5d/dd8/d22/d24/d16/d3e/d114/df2/d133/f13e zero size 2026-03-09T16:15:00.189 INFO:tasks.workunit.client.0.vm03.stdout:2/997: mknod db/d12/d2a/d99/de7/df9/d64/dbd/dec/df8/c153 0 2026-03-09T16:15:00.189 INFO:tasks.workunit.client.0.vm03.stdout:7/973: truncate d4/da/d45/d51/d36/f6f 1989057 0 2026-03-09T16:15:00.189 INFO:tasks.workunit.client.0.vm03.stdout:1/925: symlink d4/d39/l129 0 2026-03-09T16:15:00.189 INFO:tasks.workunit.client.0.vm03.stdout:1/926: readlink d4/d39/d7f/lf9 0 2026-03-09T16:15:00.190 INFO:tasks.workunit.client.0.vm03.stdout:6/948: dread d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/feb [0,4194304] 0 2026-03-09T16:15:00.193 INFO:tasks.workunit.client.0.vm03.stdout:6/949: fsync d9/d42/d45/d50/d80/d8a/d9c/d97/f99 0 2026-03-09T16:15:00.195 INFO:tasks.workunit.client.0.vm03.stdout:6/950: creat d9/d84/d123/d131/f138 x:0 0 0 2026-03-09T16:15:00.196 INFO:tasks.workunit.client.0.vm03.stdout:6/951: mknod d9/d42/d45/d50/d80/d90/c139 0 2026-03-09T16:15:00.198 INFO:tasks.workunit.client.0.vm03.stdout:6/952: unlink d9/d42/d45/lb4 0 2026-03-09T16:15:00.199 INFO:tasks.workunit.client.0.vm03.stdout:6/953: unlink d9/d84/d123/d131/f138 0 2026-03-09T16:15:00.201 INFO:tasks.workunit.client.0.vm03.stdout:6/954: creat d9/d42/d45/d65/f13a x:0 0 0 2026-03-09T16:15:00.201 INFO:tasks.workunit.client.0.vm03.stdout:6/955: symlink d9/d42/d45/ddf/l13b 0 2026-03-09T16:15:00.203 INFO:tasks.workunit.client.0.vm03.stdout:6/956: truncate d9/f15 925162 0 2026-03-09T16:15:00.205 INFO:tasks.workunit.client.0.vm03.stdout:6/957: creat d9/d42/d45/ddf/f13c x:0 0 0 2026-03-09T16:15:00.207 INFO:tasks.workunit.client.0.vm03.stdout:6/958: truncate d9/d42/d45/d65/dbf/dc9/ff8 315006 0 2026-03-09T16:15:00.215 INFO:tasks.workunit.client.0.vm03.stdout:2/998: dwrite db/d12/da5/dc2/d110/f123 [0,4194304] 0 2026-03-09T16:15:00.218 INFO:tasks.workunit.client.0.vm03.stdout:7/974: truncate d4/d2d/d4b/f12e 567445 0 2026-03-09T16:15:00.222 INFO:tasks.workunit.client.0.vm03.stdout:1/927: dwrite d4/d6/d3b/d6b/d25/d50/d10d/f45 [0,4194304] 0 2026-03-09T16:15:00.225 INFO:tasks.workunit.client.0.vm03.stdout:7/975: mkdir d4/d2d/d4b/d142 0 2026-03-09T16:15:00.226 INFO:tasks.workunit.client.0.vm03.stdout:7/976: chown d4/da/d5d/db0/d113 60 1 2026-03-09T16:15:00.231 INFO:tasks.workunit.client.0.vm03.stdout:2/999: dread db/d12/d2a/d99/de7/df9/d52/fd0 [0,4194304] 0 2026-03-09T16:15:00.236 INFO:tasks.workunit.client.0.vm03.stdout:7/977: dread d4/da/d5d/dd8/d22/d24/d16/d2b/fe1 [0,4194304] 0 2026-03-09T16:15:00.237 INFO:tasks.workunit.client.0.vm03.stdout:7/978: unlink d4/f129 0 2026-03-09T16:15:00.291 INFO:tasks.workunit.client.0.vm03.stdout:6/959: dwrite 
d9/d42/d45/d65/dbf/dc9/de4/f100 [0,4194304] 0 2026-03-09T16:15:00.292 INFO:tasks.workunit.client.0.vm03.stdout:6/960: chown d9/d42/d45/d50/d80/d90/l8b 17762908 1 2026-03-09T16:15:00.295 INFO:tasks.workunit.client.0.vm03.stdout:6/961: mknod d9/d8e/def/d112/c13d 0 2026-03-09T16:15:00.296 INFO:tasks.workunit.client.0.vm03.stdout:6/962: fdatasync d9/d42/d45/d50/fb0 0 2026-03-09T16:15:00.300 INFO:tasks.workunit.client.0.vm03.stdout:6/963: symlink d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/dad/d117/d126/l13e 0 2026-03-09T16:15:00.301 INFO:tasks.workunit.client.0.vm03.stdout:1/928: write d4/d6/d1d/f66 [4494532,29197] 0 2026-03-09T16:15:00.305 INFO:tasks.workunit.client.0.vm03.stdout:1/929: dwrite d4/d6/d1d/d20/d23/d112/f124 [0,4194304] 0 2026-03-09T16:15:00.309 INFO:tasks.workunit.client.0.vm03.stdout:1/930: symlink d4/d39/d70/df0/l12a 0 2026-03-09T16:15:00.310 INFO:tasks.workunit.client.0.vm03.stdout:1/931: truncate d4/d31/d5c/da8/da1/fbf 139158 0 2026-03-09T16:15:00.312 INFO:tasks.workunit.client.0.vm03.stdout:1/932: truncate d4/d6/d1d/d20/d93/f85 3932893 0 2026-03-09T16:15:00.313 INFO:tasks.workunit.client.0.vm03.stdout:1/933: stat d4/d6/d3b/d6b/c8a 0 2026-03-09T16:15:00.315 INFO:tasks.workunit.client.0.vm03.stdout:1/934: rmdir d4/d39/d70/d127 0 2026-03-09T16:15:00.320 INFO:tasks.workunit.client.0.vm03.stdout:1/935: rename d4/d6/ld2 to d4/d39/l12b 0 2026-03-09T16:15:00.324 INFO:tasks.workunit.client.0.vm03.stdout:1/936: creat d4/d6/d3b/d6b/d25/f12c x:0 0 0 2026-03-09T16:15:00.328 INFO:tasks.workunit.client.0.vm03.stdout:1/937: readlink d4/d31/l73 0 2026-03-09T16:15:00.328 INFO:tasks.workunit.client.0.vm03.stdout:1/938: fdatasync d4/db/f21 0 2026-03-09T16:15:00.328 INFO:tasks.workunit.client.0.vm03.stdout:1/939: mknod d4/d39/deb/c12d 0 2026-03-09T16:15:00.328 INFO:tasks.workunit.client.0.vm03.stdout:1/940: write d4/d6/d3b/f36 [9084288,3150] 0 2026-03-09T16:15:00.330 INFO:tasks.workunit.client.0.vm03.stdout:7/979: write d4/d2d/f95 [5105015,15732] 0 2026-03-09T16:15:00.331 INFO:tasks.workunit.client.0.vm03.stdout:1/941: stat d4/d6/d3b/d6b/d25/f84 0 2026-03-09T16:15:00.332 INFO:tasks.workunit.client.0.vm03.stdout:7/980: symlink d4/da/d45/d51/dea/l143 0 2026-03-09T16:15:00.333 INFO:tasks.workunit.client.0.vm03.stdout:7/981: chown d4/da/d5d/dd8/d22/d24/d16/d2b/c2e 46536430 1 2026-03-09T16:15:00.334 INFO:tasks.workunit.client.0.vm03.stdout:1/942: rename d4/d6/d3b/d6b/da5/dc3/l11b to d4/db/d8b/db2/l12e 0 2026-03-09T16:15:00.335 INFO:tasks.workunit.client.0.vm03.stdout:7/982: fdatasync d4/da/d45/d51/d36/fc8 0 2026-03-09T16:15:00.336 INFO:tasks.workunit.client.0.vm03.stdout:1/943: symlink d4/d6/da2/dea/l12f 0 2026-03-09T16:15:00.336 INFO:tasks.workunit.client.0.vm03.stdout:1/944: stat d4/db/f21 0 2026-03-09T16:15:00.338 INFO:tasks.workunit.client.0.vm03.stdout:1/945: creat d4/d6/d1d/d20/d5f/f130 x:0 0 0 2026-03-09T16:15:00.339 INFO:tasks.workunit.client.0.vm03.stdout:7/983: rename d4/da/l21 to d4/da/d5d/dd8/l144 0 2026-03-09T16:15:00.340 INFO:tasks.workunit.client.0.vm03.stdout:7/984: dread - d4/da/d45/d51/dea/ff3 zero size 2026-03-09T16:15:00.342 INFO:tasks.workunit.client.0.vm03.stdout:7/985: creat d4/da/d5d/f145 x:0 0 0 2026-03-09T16:15:00.372 INFO:tasks.workunit.client.0.vm03.stdout:6/964: write d9/d42/d45/d50/d80/d8a/dc1/fe8 [3277326,76291] 0 2026-03-09T16:15:00.380 INFO:tasks.workunit.client.0.vm03.stdout:1/946: write d4/db/d59/df1/ff6 [232641,106713] 0 2026-03-09T16:15:00.381 INFO:tasks.workunit.client.0.vm03.stdout:7/986: write d4/da/d5d/dd8/d22/d24/d15/fd3 [1744653,124118] 0 2026-03-09T16:15:00.385 
INFO:tasks.workunit.client.0.vm03.stdout:1/947: symlink d4/d6/d3b/d6b/d25/d50/l131 0 2026-03-09T16:15:00.386 INFO:tasks.workunit.client.0.vm03.stdout:7/987: fdatasync d4/da/d5d/dd8/d22/d24/d16/d6e/fa1 0 2026-03-09T16:15:00.387 INFO:tasks.workunit.client.0.vm03.stdout:7/988: read d4/da/d5d/dd8/d22/d24/d16/d6e/d7e/fe7 [6868732,124227] 0 2026-03-09T16:15:00.389 INFO:tasks.workunit.client.0.vm03.stdout:7/989: dread d4/da/d5d/f9b [4194304,4194304] 0 2026-03-09T16:15:00.389 INFO:tasks.workunit.client.0.vm03.stdout:6/965: truncate f7 431501 0 2026-03-09T16:15:00.392 INFO:tasks.workunit.client.0.vm03.stdout:1/948: creat d4/d6/da2/dea/f132 x:0 0 0 2026-03-09T16:15:00.392 INFO:tasks.workunit.client.0.vm03.stdout:7/990: dread d4/d2d/f8c [0,4194304] 0 2026-03-09T16:15:00.398 INFO:tasks.workunit.client.0.vm03.stdout:1/949: dwrite d4/d6/da2/dea/f132 [0,4194304] 0 2026-03-09T16:15:00.400 INFO:tasks.workunit.client.0.vm03.stdout:7/991: rename d4/da/d5d/dd8/d22/d24/d16/d3e/c47 to d4/da/d5d/dd8/d22/d24/d16/d3e/d114/c146 0 2026-03-09T16:15:00.408 INFO:tasks.workunit.client.0.vm03.stdout:7/992: fsync d4/da/d45/fb9 0 2026-03-09T16:15:00.418 INFO:tasks.workunit.client.0.vm03.stdout:7/993: fdatasync d4/da/d45/fb9 0 2026-03-09T16:15:00.421 INFO:tasks.workunit.client.0.vm03.stdout:1/950: link d4/d39/l12b d4/d6/d1d/dfa/l133 0 2026-03-09T16:15:00.422 INFO:tasks.workunit.client.0.vm03.stdout:1/951: fdatasync d4/f6d 0 2026-03-09T16:15:00.423 INFO:tasks.workunit.client.0.vm03.stdout:1/952: write d4/d6/d3b/d6b/d25/fb8 [376094,100798] 0 2026-03-09T16:15:00.430 INFO:tasks.workunit.client.0.vm03.stdout:6/966: sync 2026-03-09T16:15:00.439 INFO:tasks.workunit.client.0.vm03.stdout:6/967: sync 2026-03-09T16:15:00.442 INFO:tasks.workunit.client.0.vm03.stdout:6/968: creat d9/d42/d45/d50/d80/d8a/dc1/dd4/f13f x:0 0 0 2026-03-09T16:15:00.446 INFO:tasks.workunit.client.0.vm03.stdout:6/969: dwrite d9/d42/d45/d50/d80/d90/f64 [0,4194304] 0 2026-03-09T16:15:00.449 INFO:tasks.workunit.client.0.vm03.stdout:6/970: mknod d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/dad/d117/c140 0 2026-03-09T16:15:00.450 INFO:tasks.workunit.client.0.vm03.stdout:6/971: readlink d9/d42/d45/d50/d80/d8a/d9c/d97/da8/la4 0 2026-03-09T16:15:00.453 INFO:tasks.workunit.client.0.vm03.stdout:6/972: dread d9/d14/f3d [0,4194304] 0 2026-03-09T16:15:00.453 INFO:tasks.workunit.client.0.vm03.stdout:7/994: dwrite d4/da/d5d/dd8/d22/d24/d16/d2b/f5a [0,4194304] 0 2026-03-09T16:15:00.459 INFO:tasks.workunit.client.0.vm03.stdout:1/953: dwrite d4/d31/f81 [0,4194304] 0 2026-03-09T16:15:00.471 INFO:tasks.workunit.client.0.vm03.stdout:6/973: link d9/l2a d9/d42/d45/d65/l141 0 2026-03-09T16:15:00.472 INFO:tasks.workunit.client.0.vm03.stdout:6/974: read d9/d42/d45/d65/dbf/dc9/de4/f100 [903412,97996] 0 2026-03-09T16:15:00.473 INFO:tasks.workunit.client.0.vm03.stdout:6/975: readlink d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/d125/lca 0 2026-03-09T16:15:00.474 INFO:tasks.workunit.client.0.vm03.stdout:6/976: write d9/d8e/fd3 [1692132,41810] 0 2026-03-09T16:15:00.478 INFO:tasks.workunit.client.0.vm03.stdout:1/954: getdents d4/d6/d1d/d20/d93 0 2026-03-09T16:15:00.481 INFO:tasks.workunit.client.0.vm03.stdout:1/955: fsync d4/d6/d3b/d63/f77 0 2026-03-09T16:15:00.481 INFO:tasks.workunit.client.0.vm03.stdout:1/956: chown d4/d6/d3b/d63/ccd 19 1 2026-03-09T16:15:00.483 INFO:tasks.workunit.client.0.vm03.stdout:7/995: dread d4/da/d5d/db0/d61/fdb [0,4194304] 0 2026-03-09T16:15:00.484 INFO:tasks.workunit.client.0.vm03.stdout:1/957: unlink d4/d6/d3b/d63/lf4 0 2026-03-09T16:15:00.485 
INFO:tasks.workunit.client.0.vm03.stdout:1/958: chown d4/fa 207612 1 2026-03-09T16:15:00.485 INFO:tasks.workunit.client.0.vm03.stdout:1/959: chown d4/d6/da2/cd4 1635 1 2026-03-09T16:15:00.485 INFO:tasks.workunit.client.0.vm03.stdout:7/996: symlink d4/da/dbf/deb/d100/l147 0 2026-03-09T16:15:00.487 INFO:tasks.workunit.client.0.vm03.stdout:1/960: mkdir d4/d6/d3b/d6b/da5/dc0/d134 0 2026-03-09T16:15:00.489 INFO:tasks.workunit.client.0.vm03.stdout:7/997: mknod d4/da/c148 0 2026-03-09T16:15:00.491 INFO:tasks.workunit.client.0.vm03.stdout:1/961: mkdir d4/d6/d3b/d6b/da5/d135 0 2026-03-09T16:15:00.493 INFO:tasks.workunit.client.0.vm03.stdout:1/962: truncate d4/d39/d7f/fcb 5007878 0 2026-03-09T16:15:00.493 INFO:tasks.workunit.client.0.vm03.stdout:7/998: creat d4/da/d5d/dd8/d22/d24/d16/d3e/d114/df2/f149 x:0 0 0 2026-03-09T16:15:00.494 INFO:tasks.workunit.client.0.vm03.stdout:6/977: dread d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f3e [0,4194304] 0 2026-03-09T16:15:00.495 INFO:tasks.workunit.client.0.vm03.stdout:1/963: fsync d4/d6/d3b/f95 0 2026-03-09T16:15:00.495 INFO:tasks.workunit.client.0.vm03.stdout:7/999: read d4/da/d5d/db0/d113/db8/ddf/ffe [2403652,41071] 0 2026-03-09T16:15:00.496 INFO:tasks.workunit.client.0.vm03.stdout:1/964: readlink d4/d6/d3b/d6b/d25/l8d 0 2026-03-09T16:15:00.497 INFO:tasks.workunit.client.0.vm03.stdout:1/965: write d4/d6/d1d/f66 [1370940,80167] 0 2026-03-09T16:15:00.503 INFO:tasks.workunit.client.0.vm03.stdout:6/978: dwrite d9/d42/d45/d50/d80/d8a/d9c/d97/faf [4194304,4194304] 0 2026-03-09T16:15:00.505 INFO:tasks.workunit.client.0.vm03.stdout:6/979: chown d9/d42/d45/d65/dbf/dc9 9 1 2026-03-09T16:15:00.513 INFO:tasks.workunit.client.0.vm03.stdout:6/980: creat d9/d42/d45/d50/d80/d8a/dc1/dd4/df9/f142 x:0 0 0 2026-03-09T16:15:00.515 INFO:tasks.workunit.client.0.vm03.stdout:1/966: getdents d4/d6/d1d/d20/d93 0 2026-03-09T16:15:00.517 INFO:tasks.workunit.client.0.vm03.stdout:1/967: fdatasync d4/fbc 0 2026-03-09T16:15:00.520 INFO:tasks.workunit.client.0.vm03.stdout:1/968: rename d4/d6/l11e to d4/d39/deb/l136 0 2026-03-09T16:15:00.527 INFO:tasks.workunit.client.0.vm03.stdout:1/969: mknod d4/d31/d5c/da8/c137 0 2026-03-09T16:15:00.527 INFO:tasks.workunit.client.0.vm03.stdout:1/970: mknod d4/d39/c138 0 2026-03-09T16:15:00.527 INFO:tasks.workunit.client.0.vm03.stdout:1/971: truncate d4/d6/d3b/d6b/f42 432494 0 2026-03-09T16:15:00.529 INFO:tasks.workunit.client.0.vm03.stdout:1/972: symlink d4/d6/d3b/l139 0 2026-03-09T16:15:00.530 INFO:tasks.workunit.client.0.vm03.stdout:6/981: rmdir d9/d42/d45/d50/d80/d8a/dc1/dd4 39 2026-03-09T16:15:00.531 INFO:tasks.workunit.client.0.vm03.stdout:1/973: mknod d4/d39/d70/c13a 0 2026-03-09T16:15:00.535 INFO:tasks.workunit.client.0.vm03.stdout:1/974: dread d4/d39/d70/f97 [0,4194304] 0 2026-03-09T16:15:00.538 INFO:tasks.workunit.client.0.vm03.stdout:6/982: getdents d9/d42/d45/d50/d80/d8a/d9c 0 2026-03-09T16:15:00.539 INFO:tasks.workunit.client.0.vm03.stdout:1/975: dread d4/d39/d7f/f88 [0,4194304] 0 2026-03-09T16:15:00.543 INFO:tasks.workunit.client.0.vm03.stdout:1/976: dwrite d4/d6/d3b/f95 [0,4194304] 0 2026-03-09T16:15:00.596 INFO:tasks.workunit.client.0.vm03.stdout:6/983: write d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f27 [4000195,23500] 0 2026-03-09T16:15:00.602 INFO:tasks.workunit.client.0.vm03.stdout:6/984: link d9/d42/f74 d9/d42/d45/d50/d80/d8a/dc1/dd4/df9/f143 0 2026-03-09T16:15:00.602 INFO:tasks.workunit.client.0.vm03.stdout:1/977: write d4/d6/d1d/d20/d93/f8c [759378,100011] 0 2026-03-09T16:15:00.603 INFO:tasks.workunit.client.0.vm03.stdout:6/985: read 
d9/d42/d45/d50/d80/d8a/d9c/d97/faf [4916276,85488] 0 2026-03-09T16:15:00.609 INFO:tasks.workunit.client.0.vm03.stdout:6/986: mkdir d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/d125/d144 0 2026-03-09T16:15:00.610 INFO:tasks.workunit.client.0.vm03.stdout:6/987: write d9/d42/d45/d50/d80/d8a/dc1/dd4/fea [1243577,46163] 0 2026-03-09T16:15:00.614 INFO:tasks.workunit.client.0.vm03.stdout:1/978: fdatasync d4/f1b 0 2026-03-09T16:15:00.614 INFO:tasks.workunit.client.0.vm03.stdout:1/979: chown d4 173 1 2026-03-09T16:15:00.617 INFO:tasks.workunit.client.0.vm03.stdout:6/988: rmdir d9/d14/da5/dd8/d10d 39 2026-03-09T16:15:00.620 INFO:tasks.workunit.client.0.vm03.stdout:1/980: truncate d4/d6/f9 1285140 0 2026-03-09T16:15:00.626 INFO:tasks.workunit.client.0.vm03.stdout:6/989: dread d9/d42/d45/d50/d80/d8a/d9c/d97/dbe/dc6/f37 [0,4194304] 0 2026-03-09T16:15:00.627 INFO:tasks.workunit.client.0.vm03.stdout:1/981: unlink d4/d39/d70/c13a 0 2026-03-09T16:15:00.630 INFO:tasks.workunit.client.0.vm03.stdout:1/982: dwrite d4/d6/d3b/d6b/d25/d50/d10d/f45 [4194304,4194304] 0 2026-03-09T16:15:00.632 INFO:tasks.workunit.client.0.vm03.stdout:1/983: readlink d4/db/d8b/db2/l12e 0 2026-03-09T16:15:00.633 INFO:tasks.workunit.client.0.vm03.stdout:6/990: mknod d9/d14/da5/d12a/c145 0 2026-03-09T16:15:00.634 INFO:tasks.workunit.client.0.vm03.stdout:6/991: read d9/d14/f44 [2803148,17473] 0 2026-03-09T16:15:00.636 INFO:tasks.workunit.client.0.vm03.stdout:1/984: mknod d4/d31/d5c/da8/da1/c13b 0 2026-03-09T16:15:00.636 INFO:tasks.workunit.client.0.vm03.stdout:1/985: stat d4/d6/d1d/c26 0 2026-03-09T16:15:00.637 INFO:tasks.workunit.client.0.vm03.stdout:1/986: write d4/d6/d3b/f95 [4236746,60664] 0 2026-03-09T16:15:00.638 INFO:tasks.workunit.client.0.vm03.stdout:1/987: write d4/d6/da2/fe7 [5135667,102109] 0 2026-03-09T16:15:00.645 INFO:tasks.workunit.client.0.vm03.stdout:1/988: dwrite d4/d39/d7f/fcb [0,4194304] 0 2026-03-09T16:15:00.646 INFO:tasks.workunit.client.0.vm03.stdout:1/989: stat d4/d6/d3b/d6b/d25/d50/l131 0 2026-03-09T16:15:00.654 INFO:tasks.workunit.client.0.vm03.stdout:1/990: creat d4/d31/d5c/da8/da1/f13c x:0 0 0 2026-03-09T16:15:00.656 INFO:tasks.workunit.client.0.vm03.stdout:1/991: creat d4/d6/d1d/d20/d5f/f13d x:0 0 0 2026-03-09T16:15:00.696 INFO:tasks.workunit.client.0.vm03.stdout:6/992: dwrite d9/d42/f74 [4194304,4194304] 0 2026-03-09T16:15:00.714 INFO:tasks.workunit.client.0.vm03.stdout:1/992: write d4/d6/d1d/d20/d5f/f57 [1280369,124468] 0 2026-03-09T16:15:00.715 INFO:tasks.workunit.client.0.vm03.stdout:1/993: chown d4/db/c109 3539 1 2026-03-09T16:15:00.718 INFO:tasks.workunit.client.0.vm03.stdout:1/994: dread d4/d31/f81 [0,4194304] 0 2026-03-09T16:15:00.721 INFO:tasks.workunit.client.0.vm03.stdout:1/995: fdatasync d4/d6/d1d/d20/d93/f10a 0 2026-03-09T16:15:00.724 INFO:tasks.workunit.client.0.vm03.stdout:1/996: mkdir d4/d6/d1d/d20/d13e 0 2026-03-09T16:15:00.740 INFO:tasks.workunit.client.0.vm03.stdout:1/997: symlink d4/d6/d1d/d69/l13f 0 2026-03-09T16:15:00.740 INFO:tasks.workunit.client.0.vm03.stdout:1/998: rename d4 to d4/d6/d3b/d6b/d25/d50/d10d/d140 22 2026-03-09T16:15:00.740 INFO:tasks.workunit.client.0.vm03.stdout:1/999: rename d4/d6/d1d/d20/ffb to d4/d6/d107/f141 0 2026-03-09T16:15:00.750 INFO:tasks.workunit.client.0.vm03.stdout:6/993: sync 2026-03-09T16:15:00.750 INFO:tasks.workunit.client.0.vm03.stdout:6/994: stat d9/d42/d45/ddf/l13b 0 2026-03-09T16:15:00.753 INFO:tasks.workunit.client.0.vm03.stdout:6/995: truncate d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/fd6 3882718 0 2026-03-09T16:15:00.755 
INFO:tasks.workunit.client.0.vm03.stdout:6/996: mkdir d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d146 0 2026-03-09T16:15:00.757 INFO:tasks.workunit.client.0.vm03.stdout:6/997: creat d9/d42/d45/d50/d80/f147 x:0 0 0 2026-03-09T16:15:00.759 INFO:tasks.workunit.client.0.vm03.stdout:6/998: getdents d9/d84/d123 0 2026-03-09T16:15:00.761 INFO:tasks.workunit.client.0.vm03.stdout:6/999: symlink d9/d42/d45/d50/d80/d8a/d9c/d97/da8/d92/l148 0 2026-03-09T16:15:00.764 INFO:tasks.workunit.client.0.vm03.stderr:+ rm -rf -- ./tmp.rYSksf2L2n 2026-03-09T16:15:01.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:01 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:01.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:01 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:01.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:01 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:01.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:01 vm05.local ceph-mon[58702]: pgmap v10: 65 pgs: 65 active+clean; 2.0 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 31 MiB/s rd, 55 MiB/s wr, 182 op/s 2026-03-09T16:15:02.101 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:01 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:02.101 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:01 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:02.101 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:01 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:02.101 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:01 vm03.local ceph-mon[51019]: pgmap v10: 65 pgs: 65 active+clean; 2.0 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 31 MiB/s rd, 55 MiB/s wr, 182 op/s 2026-03-09T16:15:02.208 INFO:tasks.workunit.client.1.vm05.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress 2026-03-09T16:15:02.219 INFO:tasks.workunit.client.1.vm05.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-09T16:15:02.219 INFO:tasks.workunit.client.1.vm05.stderr:+ make 2026-03-09T16:15:02.261 INFO:tasks.workunit.client.1.vm05.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress 2026-03-09T16:15:02.706 INFO:tasks.workunit.client.1.vm05.stderr:++ readlink -f fsstress 2026-03-09T16:15:02.710 INFO:tasks.workunit.client.1.vm05.stderr:+ BIN=/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress 2026-03-09T16:15:02.710 INFO:tasks.workunit.client.1.vm05.stderr:+ popd 2026-03-09T16:15:02.711 INFO:tasks.workunit.client.1.vm05.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-09T16:15:02.711 INFO:tasks.workunit.client.1.vm05.stderr:+ popd 2026-03-09T16:15:02.712 INFO:tasks.workunit.client.1.vm05.stdout:~/cephtest/mnt.1/client.1/tmp 
2026-03-09T16:15:02.715 INFO:tasks.workunit.client.1.vm05.stderr:++ mktemp -d -p . 2026-03-09T16:15:02.727 INFO:tasks.workunit.client.1.vm05.stderr:+ T=./tmp.ArsmWsNydV 2026-03-09T16:15:02.727 INFO:tasks.workunit.client.1.vm05.stderr:+ /home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.ArsmWsNydV -l 1 -n 1000 -p 10 -v 2026-03-09T16:15:02.738 INFO:tasks.workunit.client.1.vm05.stdout:seed = 1772468215 2026-03-09T16:15:02.753 INFO:tasks.workunit.client.1.vm05.stdout:5/0: mknod c0 0 2026-03-09T16:15:02.753 INFO:tasks.workunit.client.1.vm05.stdout:0/0: creat f0 x:0 0 0 2026-03-09T16:15:02.756 INFO:tasks.workunit.client.1.vm05.stdout:0/1: unlink f0 0 2026-03-09T16:15:02.758 INFO:tasks.workunit.client.1.vm05.stdout:5/1: creat f1 x:0 0 0 2026-03-09T16:15:02.759 INFO:tasks.workunit.client.1.vm05.stdout:5/2: write f1 [676077,107999] 0 2026-03-09T16:15:02.760 INFO:tasks.workunit.client.1.vm05.stdout:8/0: unlink - no file 2026-03-09T16:15:02.760 INFO:tasks.workunit.client.1.vm05.stdout:8/1: truncate - no filename 2026-03-09T16:15:02.760 INFO:tasks.workunit.client.1.vm05.stdout:8/2: stat - no entries 2026-03-09T16:15:02.760 INFO:tasks.workunit.client.1.vm05.stdout:8/3: stat - no entries 2026-03-09T16:15:02.760 INFO:tasks.workunit.client.1.vm05.stdout:8/4: truncate - no filename 2026-03-09T16:15:02.760 INFO:tasks.workunit.client.1.vm05.stdout:8/5: dread - no filename 2026-03-09T16:15:02.760 INFO:tasks.workunit.client.1.vm05.stdout:8/6: dwrite - no filename 2026-03-09T16:15:02.760 INFO:tasks.workunit.client.1.vm05.stdout:8/7: dread - no filename 2026-03-09T16:15:02.764 INFO:tasks.workunit.client.1.vm05.stdout:0/2: mknod c1 0 2026-03-09T16:15:02.764 INFO:tasks.workunit.client.1.vm05.stdout:0/3: dread - no filename 2026-03-09T16:15:02.764 INFO:tasks.workunit.client.1.vm05.stdout:0/4: write - no filename 2026-03-09T16:15:02.764 INFO:tasks.workunit.client.1.vm05.stdout:0/5: dwrite - no filename 2026-03-09T16:15:02.764 INFO:tasks.workunit.client.1.vm05.stdout:0/6: dread - no filename 2026-03-09T16:15:02.764 INFO:tasks.workunit.client.1.vm05.stdout:0/7: fdatasync - no filename 2026-03-09T16:15:02.764 INFO:tasks.workunit.client.1.vm05.stdout:7/0: symlink l0 0 2026-03-09T16:15:02.764 INFO:tasks.workunit.client.1.vm05.stdout:7/1: fsync - no filename 2026-03-09T16:15:02.765 INFO:tasks.workunit.client.1.vm05.stdout:7/2: chown l0 1506096 1 2026-03-09T16:15:02.765 INFO:tasks.workunit.client.1.vm05.stdout:7/3: dread - no filename 2026-03-09T16:15:02.766 INFO:tasks.workunit.client.1.vm05.stdout:9/0: dread - no filename 2026-03-09T16:15:02.766 INFO:tasks.workunit.client.1.vm05.stdout:9/1: dwrite - no filename 2026-03-09T16:15:02.766 INFO:tasks.workunit.client.1.vm05.stdout:9/2: unlink - no file 2026-03-09T16:15:02.766 INFO:tasks.workunit.client.1.vm05.stdout:9/3: dwrite - no filename 2026-03-09T16:15:02.766 INFO:tasks.workunit.client.1.vm05.stdout:9/4: dwrite - no filename 2026-03-09T16:15:02.766 INFO:tasks.workunit.client.1.vm05.stdout:9/5: fsync - no filename 2026-03-09T16:15:02.766 INFO:tasks.workunit.client.1.vm05.stdout:9/6: write - no filename 2026-03-09T16:15:02.766 INFO:tasks.workunit.client.1.vm05.stdout:9/7: read - no filename 2026-03-09T16:15:02.769 INFO:tasks.workunit.client.1.vm05.stdout:0/8: rename c1 to c2 0 2026-03-09T16:15:02.770 INFO:tasks.workunit.client.1.vm05.stdout:7/4: mkdir d1 0 2026-03-09T16:15:02.770 INFO:tasks.workunit.client.1.vm05.stdout:7/5: dwrite - no filename 2026-03-09T16:15:02.770 
INFO:tasks.workunit.client.1.vm05.stdout:7/6: chown d1 2531774 1 2026-03-09T16:15:02.770 INFO:tasks.workunit.client.1.vm05.stdout:7/7: chown l0 11 1 2026-03-09T16:15:02.771 INFO:tasks.workunit.client.1.vm05.stdout:8/8: creat f0 x:0 0 0 2026-03-09T16:15:02.776 INFO:tasks.workunit.client.1.vm05.stdout:4/0: dwrite - no filename 2026-03-09T16:15:02.776 INFO:tasks.workunit.client.1.vm05.stdout:4/1: dread - no filename 2026-03-09T16:15:02.777 INFO:tasks.workunit.client.1.vm05.stdout:0/9: mknod c3 0 2026-03-09T16:15:02.779 INFO:tasks.workunit.client.1.vm05.stdout:8/9: creat f1 x:0 0 0 2026-03-09T16:15:02.780 INFO:tasks.workunit.client.1.vm05.stdout:9/8: creat f0 x:0 0 0 2026-03-09T16:15:02.781 INFO:tasks.workunit.client.1.vm05.stdout:3/0: write - no filename 2026-03-09T16:15:02.783 INFO:tasks.workunit.client.1.vm05.stdout:0/10: rename c3 to c4 0 2026-03-09T16:15:02.800 INFO:tasks.workunit.client.1.vm05.stdout:0/11: fdatasync - no filename 2026-03-09T16:15:02.800 INFO:tasks.workunit.client.1.vm05.stdout:0/12: dwrite - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:4/2: creat f0 x:0 0 0 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:4/3: dread - f0 zero size 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:4/4: write f0 [576020,39676] 0 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:9/9: mkdir d1 0 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:0/13: mkdir d5 0 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:0/14: dread - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:8/10: link f0 f2 0 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:3/1: getdents . 0 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:3/2: write - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:2/0: dread - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:2/1: dwrite - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:2/2: fdatasync - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:2/3: dwrite - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:2/4: dwrite - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:2/5: dread - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:2/6: dread - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:2/7: dread - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:2/8: rename - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:2/9: link - no file 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:2/10: dread - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:8/11: symlink l3 0 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:0/15: mknod d5/c6 0 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:0/16: dread - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:8/12: truncate f0 781060 0 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:8/13: read f2 [453675,60888] 0 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:1/0: write - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:1/1: fdatasync - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:9/10: rmdir d1 0 2026-03-09T16:15:02.801 
INFO:tasks.workunit.client.1.vm05.stdout:8/14: dread f0 [0,4194304] 0 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:3/3: mkdir d0 0 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:3/4: write - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:3/5: chown d0 247 1 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:3/6: readlink - no filename 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:0/17: rmdir d5 39 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:2/11: symlink l0 0 2026-03-09T16:15:02.801 INFO:tasks.workunit.client.1.vm05.stdout:2/12: fdatasync - no filename 2026-03-09T16:15:02.807 INFO:tasks.workunit.client.1.vm05.stdout:2/13: creat f1 x:0 0 0 2026-03-09T16:15:02.814 INFO:tasks.workunit.client.1.vm05.stdout:9/11: dwrite f0 [0,4194304] 0 2026-03-09T16:15:02.818 INFO:tasks.workunit.client.1.vm05.stdout:1/2: creat f0 x:0 0 0 2026-03-09T16:15:02.818 INFO:tasks.workunit.client.1.vm05.stdout:3/7: creat d0/f1 x:0 0 0 2026-03-09T16:15:02.820 INFO:tasks.workunit.client.1.vm05.stdout:3/8: read - d0/f1 zero size 2026-03-09T16:15:02.828 INFO:tasks.workunit.client.1.vm05.stdout:1/3: dread - f0 zero size 2026-03-09T16:15:02.837 INFO:tasks.workunit.client.1.vm05.stdout:2/14: dwrite f1 [0,4194304] 0 2026-03-09T16:15:02.837 INFO:tasks.workunit.client.1.vm05.stdout:9/12: creat f2 x:0 0 0 2026-03-09T16:15:02.837 INFO:tasks.workunit.client.1.vm05.stdout:8/15: dwrite f1 [0,4194304] 0 2026-03-09T16:15:02.837 INFO:tasks.workunit.client.1.vm05.stdout:8/16: write f1 [3881724,92596] 0 2026-03-09T16:15:02.837 INFO:tasks.workunit.client.1.vm05.stdout:4/5: fdatasync f0 0 2026-03-09T16:15:02.838 INFO:tasks.workunit.client.1.vm05.stdout:3/9: creat d0/f2 x:0 0 0 2026-03-09T16:15:02.838 INFO:tasks.workunit.client.1.vm05.stdout:4/6: stat f0 0 2026-03-09T16:15:02.838 INFO:tasks.workunit.client.1.vm05.stdout:9/13: mknod c3 0 2026-03-09T16:15:02.838 INFO:tasks.workunit.client.1.vm05.stdout:2/15: rename f1 to f2 0 2026-03-09T16:15:02.839 INFO:tasks.workunit.client.1.vm05.stdout:9/14: truncate f2 869087 0 2026-03-09T16:15:02.839 INFO:tasks.workunit.client.1.vm05.stdout:2/16: chown l0 0 1 2026-03-09T16:15:02.840 INFO:tasks.workunit.client.1.vm05.stdout:8/17: mkdir d4 0 2026-03-09T16:15:02.845 INFO:tasks.workunit.client.1.vm05.stdout:4/7: creat f1 x:0 0 0 2026-03-09T16:15:02.846 INFO:tasks.workunit.client.1.vm05.stdout:4/8: chown f0 55 1 2026-03-09T16:15:02.846 INFO:tasks.workunit.client.1.vm05.stdout:4/9: chown f1 28871 1 2026-03-09T16:15:02.851 INFO:tasks.workunit.client.1.vm05.stdout:8/18: dwrite f1 [0,4194304] 0 2026-03-09T16:15:02.852 INFO:tasks.workunit.client.1.vm05.stdout:3/10: unlink d0/f1 0 2026-03-09T16:15:02.852 INFO:tasks.workunit.client.1.vm05.stdout:9/15: mkdir d4 0 2026-03-09T16:15:02.855 INFO:tasks.workunit.client.1.vm05.stdout:9/16: chown f2 0 1 2026-03-09T16:15:02.864 INFO:tasks.workunit.client.1.vm05.stdout:4/10: symlink l2 0 2026-03-09T16:15:02.871 INFO:tasks.workunit.client.1.vm05.stdout:3/11: symlink d0/l3 0 2026-03-09T16:15:02.872 INFO:tasks.workunit.client.1.vm05.stdout:9/17: dwrite f2 [0,4194304] 0 2026-03-09T16:15:02.872 INFO:tasks.workunit.client.1.vm05.stdout:3/12: symlink d0/l4 0 2026-03-09T16:15:02.873 INFO:tasks.workunit.client.1.vm05.stdout:3/13: write d0/f2 [829109,37626] 0 2026-03-09T16:15:02.882 INFO:tasks.workunit.client.1.vm05.stdout:9/18: unlink c3 0 2026-03-09T16:15:02.884 INFO:tasks.workunit.client.1.vm05.stdout:9/19: dread f2 [0,4194304] 0 2026-03-09T16:15:02.884 
INFO:tasks.workunit.client.1.vm05.stdout:9/20: chown f0 36 1 2026-03-09T16:15:02.884 INFO:tasks.workunit.client.1.vm05.stdout:9/21: readlink - no filename 2026-03-09T16:15:02.885 INFO:tasks.workunit.client.1.vm05.stdout:9/22: truncate f2 4344171 0 2026-03-09T16:15:02.885 INFO:tasks.workunit.client.1.vm05.stdout:3/14: unlink d0/l4 0 2026-03-09T16:15:02.931 INFO:tasks.workunit.client.1.vm05.stdout:5/3: getdents . 0 2026-03-09T16:15:02.934 INFO:tasks.workunit.client.1.vm05.stdout:0/18: chown d5/c6 444 1 2026-03-09T16:15:02.936 INFO:tasks.workunit.client.1.vm05.stdout:7/8: sync 2026-03-09T16:15:02.936 INFO:tasks.workunit.client.1.vm05.stdout:6/0: sync 2026-03-09T16:15:02.936 INFO:tasks.workunit.client.1.vm05.stdout:4/11: sync 2026-03-09T16:15:02.936 INFO:tasks.workunit.client.1.vm05.stdout:9/23: sync 2026-03-09T16:15:02.936 INFO:tasks.workunit.client.1.vm05.stdout:1/4: sync 2026-03-09T16:15:02.937 INFO:tasks.workunit.client.1.vm05.stdout:4/12: write f0 [405002,55322] 0 2026-03-09T16:15:02.937 INFO:tasks.workunit.client.1.vm05.stdout:5/4: dread f1 [0,4194304] 0 2026-03-09T16:15:02.937 INFO:tasks.workunit.client.1.vm05.stdout:0/19: creat d5/f7 x:0 0 0 2026-03-09T16:15:02.939 INFO:tasks.workunit.client.1.vm05.stdout:1/5: sync 2026-03-09T16:15:02.939 INFO:tasks.workunit.client.1.vm05.stdout:1/6: rmdir - no directory 2026-03-09T16:15:02.939 INFO:tasks.workunit.client.1.vm05.stdout:6/1: sync 2026-03-09T16:15:02.939 INFO:tasks.workunit.client.1.vm05.stdout:6/2: dwrite - no filename 2026-03-09T16:15:02.939 INFO:tasks.workunit.client.1.vm05.stdout:6/3: write - no filename 2026-03-09T16:15:02.939 INFO:tasks.workunit.client.1.vm05.stdout:6/4: dread - no filename 2026-03-09T16:15:02.939 INFO:tasks.workunit.client.1.vm05.stdout:6/5: chown . 15968570 1 2026-03-09T16:15:02.939 INFO:tasks.workunit.client.1.vm05.stdout:6/6: write - no filename 2026-03-09T16:15:02.939 INFO:tasks.workunit.client.1.vm05.stdout:4/13: sync 2026-03-09T16:15:02.945 INFO:tasks.workunit.client.1.vm05.stdout:5/5: write f1 [386833,79609] 0 2026-03-09T16:15:02.948 INFO:tasks.workunit.client.1.vm05.stdout:5/6: sync 2026-03-09T16:15:02.949 INFO:tasks.workunit.client.1.vm05.stdout:4/14: unlink l2 0 2026-03-09T16:15:02.951 INFO:tasks.workunit.client.1.vm05.stdout:1/7: creat f1 x:0 0 0 2026-03-09T16:15:02.952 INFO:tasks.workunit.client.1.vm05.stdout:7/9: mkdir d1/d2 0 2026-03-09T16:15:02.952 INFO:tasks.workunit.client.1.vm05.stdout:7/10: dwrite - no filename 2026-03-09T16:15:02.953 INFO:tasks.workunit.client.1.vm05.stdout:8/19: truncate f0 1824910 0 2026-03-09T16:15:02.956 INFO:tasks.workunit.client.1.vm05.stdout:0/20: link d5/f7 d5/f8 0 2026-03-09T16:15:02.958 INFO:tasks.workunit.client.1.vm05.stdout:9/24: dwrite f2 [0,4194304] 0 2026-03-09T16:15:02.960 INFO:tasks.workunit.client.1.vm05.stdout:6/7: symlink l0 0 2026-03-09T16:15:02.960 INFO:tasks.workunit.client.1.vm05.stdout:6/8: fdatasync - no filename 2026-03-09T16:15:02.961 INFO:tasks.workunit.client.1.vm05.stdout:4/15: mknod c3 0 2026-03-09T16:15:02.961 INFO:tasks.workunit.client.1.vm05.stdout:9/25: chown d4 20762600 1 2026-03-09T16:15:02.962 INFO:tasks.workunit.client.1.vm05.stdout:8/20: unlink f1 0 2026-03-09T16:15:02.962 INFO:tasks.workunit.client.1.vm05.stdout:0/21: creat d5/f9 x:0 0 0 2026-03-09T16:15:02.963 INFO:tasks.workunit.client.1.vm05.stdout:6/9: stat l0 0 2026-03-09T16:15:02.964 INFO:tasks.workunit.client.1.vm05.stdout:0/22: truncate d5/f7 915172 0 2026-03-09T16:15:02.966 INFO:tasks.workunit.client.1.vm05.stdout:1/8: mknod c2 0 2026-03-09T16:15:02.967 
INFO:tasks.workunit.client.1.vm05.stdout:5/7: dwrite f1 [0,4194304] 0 2026-03-09T16:15:02.971 INFO:tasks.workunit.client.1.vm05.stdout:7/11: mknod d1/d2/c3 0 2026-03-09T16:15:02.972 INFO:tasks.workunit.client.1.vm05.stdout:4/16: write f1 [153675,125585] 0 2026-03-09T16:15:02.972 INFO:tasks.workunit.client.1.vm05.stdout:4/17: readlink - no filename 2026-03-09T16:15:02.972 INFO:tasks.workunit.client.1.vm05.stdout:7/12: chown d1/d2 90511345 1 2026-03-09T16:15:02.972 INFO:tasks.workunit.client.1.vm05.stdout:7/13: write - no filename 2026-03-09T16:15:02.972 INFO:tasks.workunit.client.1.vm05.stdout:7/14: dwrite - no filename 2026-03-09T16:15:02.972 INFO:tasks.workunit.client.1.vm05.stdout:7/15: dwrite - no filename 2026-03-09T16:15:02.972 INFO:tasks.workunit.client.1.vm05.stdout:4/18: stat f0 0 2026-03-09T16:15:02.990 INFO:tasks.workunit.client.1.vm05.stdout:0/23: mknod d5/ca 0 2026-03-09T16:15:02.990 INFO:tasks.workunit.client.1.vm05.stdout:0/24: readlink - no filename 2026-03-09T16:15:02.997 INFO:tasks.workunit.client.1.vm05.stdout:6/10: creat f1 x:0 0 0 2026-03-09T16:15:03.005 INFO:tasks.workunit.client.1.vm05.stdout:5/8: creat f2 x:0 0 0 2026-03-09T16:15:03.005 INFO:tasks.workunit.client.1.vm05.stdout:6/11: dread - f1 zero size 2026-03-09T16:15:03.006 INFO:tasks.workunit.client.1.vm05.stdout:7/16: creat d1/d2/f4 x:0 0 0 2026-03-09T16:15:03.006 INFO:tasks.workunit.client.1.vm05.stdout:9/26: getdents d4 0 2026-03-09T16:15:03.006 INFO:tasks.workunit.client.1.vm05.stdout:0/25: mkdir d5/db 0 2026-03-09T16:15:03.006 INFO:tasks.workunit.client.1.vm05.stdout:5/9: symlink l3 0 2026-03-09T16:15:03.006 INFO:tasks.workunit.client.1.vm05.stdout:7/17: creat d1/d2/f5 x:0 0 0 2026-03-09T16:15:03.006 INFO:tasks.workunit.client.1.vm05.stdout:5/10: write f1 [3500151,127059] 0 2026-03-09T16:15:03.006 INFO:tasks.workunit.client.1.vm05.stdout:9/27: dread f2 [4194304,4194304] 0 2026-03-09T16:15:03.006 INFO:tasks.workunit.client.1.vm05.stdout:8/21: link l3 d4/l5 0 2026-03-09T16:15:03.006 INFO:tasks.workunit.client.1.vm05.stdout:7/18: write d1/d2/f5 [498556,7158] 0 2026-03-09T16:15:03.008 INFO:tasks.workunit.client.1.vm05.stdout:8/22: rmdir d4 39 2026-03-09T16:15:03.009 INFO:tasks.workunit.client.1.vm05.stdout:1/9: link c2 c3 0 2026-03-09T16:15:03.009 INFO:tasks.workunit.client.1.vm05.stdout:5/11: rename f2 to f4 0 2026-03-09T16:15:03.012 INFO:tasks.workunit.client.1.vm05.stdout:0/26: link d5/f8 d5/db/fc 0 2026-03-09T16:15:03.013 INFO:tasks.workunit.client.1.vm05.stdout:7/19: mkdir d1/d2/d6 0 2026-03-09T16:15:03.015 INFO:tasks.workunit.client.1.vm05.stdout:5/12: creat f5 x:0 0 0 2026-03-09T16:15:03.015 INFO:tasks.workunit.client.1.vm05.stdout:5/13: truncate f4 845466 0 2026-03-09T16:15:03.016 INFO:tasks.workunit.client.1.vm05.stdout:1/10: symlink l4 0 2026-03-09T16:15:03.021 INFO:tasks.workunit.client.1.vm05.stdout:0/27: mknod d5/cd 0 2026-03-09T16:15:03.022 INFO:tasks.workunit.client.1.vm05.stdout:9/28: creat d4/f5 x:0 0 0 2026-03-09T16:15:03.026 INFO:tasks.workunit.client.1.vm05.stdout:9/29: dread f2 [0,4194304] 0 2026-03-09T16:15:03.026 INFO:tasks.workunit.client.1.vm05.stdout:7/20: fdatasync d1/d2/f5 0 2026-03-09T16:15:03.028 INFO:tasks.workunit.client.1.vm05.stdout:1/11: symlink l5 0 2026-03-09T16:15:03.028 INFO:tasks.workunit.client.1.vm05.stdout:2/17: rename f2 to f3 0 2026-03-09T16:15:03.031 INFO:tasks.workunit.client.1.vm05.stdout:7/21: truncate d1/d2/f4 47295 0 2026-03-09T16:15:03.032 INFO:tasks.workunit.client.1.vm05.stdout:7/22: chown d1/d2/f5 13833425 1 2026-03-09T16:15:03.035 
INFO:tasks.workunit.client.1.vm05.stdout:7/23: rmdir d1 39 2026-03-09T16:15:03.036 INFO:tasks.workunit.client.1.vm05.stdout:1/12: creat f6 x:0 0 0 2026-03-09T16:15:03.038 INFO:tasks.workunit.client.1.vm05.stdout:2/18: dread f3 [0,4194304] 0 2026-03-09T16:15:03.039 INFO:tasks.workunit.client.1.vm05.stdout:0/28: dwrite d5/db/fc [0,4194304] 0 2026-03-09T16:15:03.045 INFO:tasks.workunit.client.1.vm05.stdout:4/19: dread f1 [0,4194304] 0 2026-03-09T16:15:03.045 INFO:tasks.workunit.client.1.vm05.stdout:0/29: chown d5/f8 320704197 1 2026-03-09T16:15:03.045 INFO:tasks.workunit.client.1.vm05.stdout:4/20: dread f1 [0,4194304] 0 2026-03-09T16:15:03.046 INFO:tasks.workunit.client.1.vm05.stdout:9/30: rename d4/f5 to d4/f6 0 2026-03-09T16:15:03.046 INFO:tasks.workunit.client.1.vm05.stdout:0/30: write d5/f8 [2116766,61955] 0 2026-03-09T16:15:03.047 INFO:tasks.workunit.client.1.vm05.stdout:4/21: write f0 [945606,68097] 0 2026-03-09T16:15:03.049 INFO:tasks.workunit.client.1.vm05.stdout:1/13: unlink f6 0 2026-03-09T16:15:03.057 INFO:tasks.workunit.client.1.vm05.stdout:9/31: mknod d4/c7 0 2026-03-09T16:15:03.057 INFO:tasks.workunit.client.1.vm05.stdout:3/15: rmdir d0 39 2026-03-09T16:15:03.064 INFO:tasks.workunit.client.1.vm05.stdout:2/19: symlink l4 0 2026-03-09T16:15:03.067 INFO:tasks.workunit.client.1.vm05.stdout:7/24: rmdir d1 39 2026-03-09T16:15:03.067 INFO:tasks.workunit.client.1.vm05.stdout:6/12: fsync f1 0 2026-03-09T16:15:03.069 INFO:tasks.workunit.client.1.vm05.stdout:1/14: mkdir d7 0 2026-03-09T16:15:03.069 INFO:tasks.workunit.client.1.vm05.stdout:1/15: dread - f1 zero size 2026-03-09T16:15:03.072 INFO:tasks.workunit.client.1.vm05.stdout:5/14: rename f4 to f6 0 2026-03-09T16:15:03.072 INFO:tasks.workunit.client.1.vm05.stdout:6/13: write f1 [621845,73648] 0 2026-03-09T16:15:03.073 INFO:tasks.workunit.client.1.vm05.stdout:3/16: chown d0 3489176 1 2026-03-09T16:15:03.084 INFO:tasks.workunit.client.1.vm05.stdout:6/14: truncate f1 913811 0 2026-03-09T16:15:03.086 INFO:tasks.workunit.client.1.vm05.stdout:4/22: link f1 f4 0 2026-03-09T16:15:03.086 INFO:tasks.workunit.client.1.vm05.stdout:4/23: chown f4 0 1 2026-03-09T16:15:03.090 INFO:tasks.workunit.client.1.vm05.stdout:1/16: dwrite f1 [0,4194304] 0 2026-03-09T16:15:03.090 INFO:tasks.workunit.client.1.vm05.stdout:6/15: write f1 [491003,92392] 0 2026-03-09T16:15:03.091 INFO:tasks.workunit.client.1.vm05.stdout:0/31: rmdir d5 39 2026-03-09T16:15:03.091 INFO:tasks.workunit.client.1.vm05.stdout:3/17: write d0/f2 [1623565,43821] 0 2026-03-09T16:15:03.102 INFO:tasks.workunit.client.1.vm05.stdout:3/18: write d0/f2 [1355563,41568] 0 2026-03-09T16:15:03.109 INFO:tasks.workunit.client.1.vm05.stdout:4/24: mkdir d5 0 2026-03-09T16:15:03.111 INFO:tasks.workunit.client.1.vm05.stdout:1/17: rename c3 to d7/c8 0 2026-03-09T16:15:03.113 INFO:tasks.workunit.client.1.vm05.stdout:9/32: dwrite f2 [0,4194304] 0 2026-03-09T16:15:03.116 INFO:tasks.workunit.client.1.vm05.stdout:7/25: write d1/d2/f5 [825911,99132] 0 2026-03-09T16:15:03.116 INFO:tasks.workunit.client.1.vm05.stdout:5/15: dwrite f1 [4194304,4194304] 0 2026-03-09T16:15:03.117 INFO:tasks.workunit.client.1.vm05.stdout:0/32: mknod d5/ce 0 2026-03-09T16:15:03.122 INFO:tasks.workunit.client.1.vm05.stdout:3/19: rename d0/f2 to d0/f5 0 2026-03-09T16:15:03.125 INFO:tasks.workunit.client.1.vm05.stdout:9/33: mkdir d4/d8 0 2026-03-09T16:15:03.125 INFO:tasks.workunit.client.1.vm05.stdout:2/20: dwrite f3 [4194304,4194304] 0 2026-03-09T16:15:03.126 INFO:tasks.workunit.client.1.vm05.stdout:5/16: symlink l7 0 2026-03-09T16:15:03.129 
INFO:tasks.workunit.client.1.vm05.stdout:6/16: dwrite f1 [0,4194304] 0 2026-03-09T16:15:03.129 INFO:tasks.workunit.client.1.vm05.stdout:1/18: creat d7/f9 x:0 0 0 2026-03-09T16:15:03.130 INFO:tasks.workunit.client.1.vm05.stdout:0/33: mkdir d5/df 0 2026-03-09T16:15:03.132 INFO:tasks.workunit.client.1.vm05.stdout:7/26: mknod d1/d2/c7 0 2026-03-09T16:15:03.134 INFO:tasks.workunit.client.1.vm05.stdout:3/20: chown d0/l3 7 1 2026-03-09T16:15:03.134 INFO:tasks.workunit.client.1.vm05.stdout:2/21: write f3 [6563925,81448] 0 2026-03-09T16:15:03.134 INFO:tasks.workunit.client.1.vm05.stdout:7/27: write d1/d2/f5 [315987,17967] 0 2026-03-09T16:15:03.141 INFO:tasks.workunit.client.1.vm05.stdout:7/28: chown d1/d2/c3 25123 1 2026-03-09T16:15:03.143 INFO:tasks.workunit.client.1.vm05.stdout:4/25: creat d5/f6 x:0 0 0 2026-03-09T16:15:03.144 INFO:tasks.workunit.client.1.vm05.stdout:7/29: chown d1/d2/f5 132950388 1 2026-03-09T16:15:03.144 INFO:tasks.workunit.client.1.vm05.stdout:6/17: creat f2 x:0 0 0 2026-03-09T16:15:03.144 INFO:tasks.workunit.client.1.vm05.stdout:5/17: mkdir d8 0 2026-03-09T16:15:03.150 INFO:tasks.workunit.client.1.vm05.stdout:0/34: rename d5/c6 to d5/db/c10 0 2026-03-09T16:15:03.157 INFO:tasks.workunit.client.1.vm05.stdout:1/19: unlink c2 0 2026-03-09T16:15:03.157 INFO:tasks.workunit.client.1.vm05.stdout:6/18: unlink l0 0 2026-03-09T16:15:03.161 INFO:tasks.workunit.client.1.vm05.stdout:4/26: dwrite f4 [0,4194304] 0 2026-03-09T16:15:03.166 INFO:tasks.workunit.client.1.vm05.stdout:0/35: rename d5/df to d5/d11 0 2026-03-09T16:15:03.168 INFO:tasks.workunit.client.1.vm05.stdout:1/20: mknod d7/ca 0 2026-03-09T16:15:03.175 INFO:tasks.workunit.client.1.vm05.stdout:9/34: dwrite f0 [0,4194304] 0 2026-03-09T16:15:03.175 INFO:tasks.workunit.client.1.vm05.stdout:4/27: dwrite d5/f6 [0,4194304] 0 2026-03-09T16:15:03.176 INFO:tasks.workunit.client.1.vm05.stdout:6/19: mknod c3 0 2026-03-09T16:15:03.177 INFO:tasks.workunit.client.1.vm05.stdout:2/22: dwrite f3 [4194304,4194304] 0 2026-03-09T16:15:03.181 INFO:tasks.workunit.client.1.vm05.stdout:0/36: truncate d5/f9 716935 0 2026-03-09T16:15:03.181 INFO:tasks.workunit.client.1.vm05.stdout:0/37: read d5/f8 [667079,114260] 0 2026-03-09T16:15:03.181 INFO:tasks.workunit.client.1.vm05.stdout:2/23: write f3 [183578,130507] 0 2026-03-09T16:15:03.184 INFO:tasks.workunit.client.1.vm05.stdout:1/21: creat d7/fb x:0 0 0 2026-03-09T16:15:03.185 INFO:tasks.workunit.client.1.vm05.stdout:1/22: read f1 [437751,52224] 0 2026-03-09T16:15:03.185 INFO:tasks.workunit.client.1.vm05.stdout:0/38: write d5/f8 [2906348,119964] 0 2026-03-09T16:15:03.186 INFO:tasks.workunit.client.1.vm05.stdout:0/39: chown d5/f8 1 1 2026-03-09T16:15:03.190 INFO:tasks.workunit.client.1.vm05.stdout:7/30: dread d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:03.191 INFO:tasks.workunit.client.1.vm05.stdout:6/20: dwrite f1 [0,4194304] 0 2026-03-09T16:15:03.195 INFO:tasks.workunit.client.1.vm05.stdout:2/24: creat f5 x:0 0 0 2026-03-09T16:15:03.195 INFO:tasks.workunit.client.1.vm05.stdout:2/25: rmdir - no directory 2026-03-09T16:15:03.195 INFO:tasks.workunit.client.1.vm05.stdout:2/26: chown f3 41007439 1 2026-03-09T16:15:03.204 INFO:tasks.workunit.client.1.vm05.stdout:1/23: creat d7/fc x:0 0 0 2026-03-09T16:15:03.204 INFO:tasks.workunit.client.1.vm05.stdout:0/40: creat d5/db/f12 x:0 0 0 2026-03-09T16:15:03.207 INFO:tasks.workunit.client.1.vm05.stdout:7/31: write d1/d2/f5 [46768,51656] 0 2026-03-09T16:15:03.207 INFO:tasks.workunit.client.1.vm05.stdout:6/21: creat f4 x:0 0 0 2026-03-09T16:15:03.207 
INFO:tasks.workunit.client.1.vm05.stdout:6/22: readlink - no filename 2026-03-09T16:15:03.207 INFO:tasks.workunit.client.1.vm05.stdout:2/27: symlink l6 0 2026-03-09T16:15:03.208 INFO:tasks.workunit.client.1.vm05.stdout:1/24: mkdir d7/dd 0 2026-03-09T16:15:03.209 INFO:tasks.workunit.client.1.vm05.stdout:0/41: symlink d5/db/l13 0 2026-03-09T16:15:03.209 INFO:tasks.workunit.client.1.vm05.stdout:2/28: creat f7 x:0 0 0 2026-03-09T16:15:03.210 INFO:tasks.workunit.client.1.vm05.stdout:2/29: dread - f7 zero size 2026-03-09T16:15:03.212 INFO:tasks.workunit.client.1.vm05.stdout:2/30: readlink l0 0 2026-03-09T16:15:03.214 INFO:tasks.workunit.client.1.vm05.stdout:7/32: chown d1/d2/f4 142 1 2026-03-09T16:15:03.215 INFO:tasks.workunit.client.1.vm05.stdout:7/33: dread d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:03.215 INFO:tasks.workunit.client.1.vm05.stdout:7/34: dread d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:03.216 INFO:tasks.workunit.client.1.vm05.stdout:2/31: mknod c8 0 2026-03-09T16:15:03.218 INFO:tasks.workunit.client.1.vm05.stdout:0/42: dread d5/f8 [0,4194304] 0 2026-03-09T16:15:03.218 INFO:tasks.workunit.client.1.vm05.stdout:0/43: read - d5/db/f12 zero size 2026-03-09T16:15:03.218 INFO:tasks.workunit.client.1.vm05.stdout:1/25: dwrite d7/fc [0,4194304] 0 2026-03-09T16:15:03.219 INFO:tasks.workunit.client.1.vm05.stdout:2/32: write f3 [2844008,44953] 0 2026-03-09T16:15:03.219 INFO:tasks.workunit.client.1.vm05.stdout:7/35: write d1/d2/f4 [171648,49155] 0 2026-03-09T16:15:03.228 INFO:tasks.workunit.client.1.vm05.stdout:2/33: truncate f5 596923 0 2026-03-09T16:15:03.228 INFO:tasks.workunit.client.1.vm05.stdout:2/34: dread f5 [0,4194304] 0 2026-03-09T16:15:03.238 INFO:tasks.workunit.client.1.vm05.stdout:7/36: mkdir d1/d2/d8 0 2026-03-09T16:15:03.242 INFO:tasks.workunit.client.1.vm05.stdout:0/44: rename d5/cd to d5/db/c14 0 2026-03-09T16:15:03.242 INFO:tasks.workunit.client.1.vm05.stdout:1/26: mkdir d7/dd/de 0 2026-03-09T16:15:03.244 INFO:tasks.workunit.client.1.vm05.stdout:7/37: symlink d1/d2/l9 0 2026-03-09T16:15:03.252 INFO:tasks.workunit.client.1.vm05.stdout:2/35: link f3 f9 0 2026-03-09T16:15:03.256 INFO:tasks.workunit.client.1.vm05.stdout:2/36: dread f9 [0,4194304] 0 2026-03-09T16:15:03.259 INFO:tasks.workunit.client.1.vm05.stdout:2/37: rename f9 to fa 0 2026-03-09T16:15:03.259 INFO:tasks.workunit.client.1.vm05.stdout:2/38: readlink l0 0 2026-03-09T16:15:03.260 INFO:tasks.workunit.client.1.vm05.stdout:1/27: creat d7/dd/de/ff x:0 0 0 2026-03-09T16:15:03.270 INFO:tasks.workunit.client.1.vm05.stdout:7/38: link d1/d2/l9 d1/d2/la 0 2026-03-09T16:15:03.271 INFO:tasks.workunit.client.1.vm05.stdout:0/45: link c2 d5/c15 0 2026-03-09T16:15:03.271 INFO:tasks.workunit.client.1.vm05.stdout:0/46: readlink d5/db/l13 0 2026-03-09T16:15:03.271 INFO:tasks.workunit.client.1.vm05.stdout:2/39: write f3 [7000734,84916] 0 2026-03-09T16:15:03.272 INFO:tasks.workunit.client.1.vm05.stdout:7/39: symlink d1/d2/d8/lb 0 2026-03-09T16:15:03.277 INFO:tasks.workunit.client.1.vm05.stdout:2/40: mkdir db 0 2026-03-09T16:15:03.277 INFO:tasks.workunit.client.1.vm05.stdout:7/40: mkdir d1/d2/d8/dc 0 2026-03-09T16:15:03.278 INFO:tasks.workunit.client.1.vm05.stdout:2/41: rename l6 to db/lc 0 2026-03-09T16:15:03.292 INFO:tasks.workunit.client.1.vm05.stdout:7/41: dwrite d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:03.295 INFO:tasks.workunit.client.1.vm05.stdout:2/42: dwrite f7 [0,4194304] 0 2026-03-09T16:15:03.307 INFO:tasks.workunit.client.1.vm05.stdout:7/42: mknod d1/d2/cd 0 2026-03-09T16:15:03.313 INFO:tasks.workunit.client.1.vm05.stdout:2/43: 
mkdir db/dd 0 2026-03-09T16:15:03.313 INFO:tasks.workunit.client.1.vm05.stdout:7/43: link d1/d2/f4 d1/d2/fe 0 2026-03-09T16:15:03.313 INFO:tasks.workunit.client.1.vm05.stdout:7/44: stat d1/d2 0 2026-03-09T16:15:03.314 INFO:tasks.workunit.client.1.vm05.stdout:7/45: chown d1/d2/d8 218088 1 2026-03-09T16:15:03.315 INFO:tasks.workunit.client.1.vm05.stdout:2/44: mkdir db/dd/de 0 2026-03-09T16:15:03.315 INFO:tasks.workunit.client.1.vm05.stdout:7/46: chown d1/d2/fe 16743616 1 2026-03-09T16:15:03.317 INFO:tasks.workunit.client.1.vm05.stdout:2/45: dread f5 [0,4194304] 0 2026-03-09T16:15:03.318 INFO:tasks.workunit.client.1.vm05.stdout:2/46: read f7 [1551808,10623] 0 2026-03-09T16:15:03.324 INFO:tasks.workunit.client.1.vm05.stdout:2/47: dread f3 [4194304,4194304] 0 2026-03-09T16:15:03.328 INFO:tasks.workunit.client.1.vm05.stdout:2/48: symlink db/dd/de/lf 0 2026-03-09T16:15:03.328 INFO:tasks.workunit.client.1.vm05.stdout:2/49: chown l4 43529 1 2026-03-09T16:15:03.407 INFO:tasks.workunit.client.1.vm05.stdout:8/23: write f0 [1847730,44752] 0 2026-03-09T16:15:03.408 INFO:tasks.workunit.client.1.vm05.stdout:6/23: fsync f1 0 2026-03-09T16:15:03.409 INFO:tasks.workunit.client.1.vm05.stdout:6/24: dread - f4 zero size 2026-03-09T16:15:03.409 INFO:tasks.workunit.client.1.vm05.stdout:6/25: write f1 [3212224,86460] 0 2026-03-09T16:15:03.414 INFO:tasks.workunit.client.1.vm05.stdout:6/26: creat f5 x:0 0 0 2026-03-09T16:15:03.414 INFO:tasks.workunit.client.1.vm05.stdout:8/24: dwrite f0 [0,4194304] 0 2026-03-09T16:15:03.416 INFO:tasks.workunit.client.1.vm05.stdout:6/27: symlink l6 0 2026-03-09T16:15:03.416 INFO:tasks.workunit.client.1.vm05.stdout:6/28: dread - f2 zero size 2026-03-09T16:15:03.423 INFO:tasks.workunit.client.1.vm05.stdout:6/29: rename f4 to f7 0 2026-03-09T16:15:03.431 INFO:tasks.workunit.client.1.vm05.stdout:2/50: fdatasync fa 0 2026-03-09T16:15:03.431 INFO:tasks.workunit.client.1.vm05.stdout:2/51: read f3 [1698578,103213] 0 2026-03-09T16:15:03.444 INFO:tasks.workunit.client.1.vm05.stdout:6/30: sync 2026-03-09T16:15:03.447 INFO:tasks.workunit.client.1.vm05.stdout:6/31: mknod c8 0 2026-03-09T16:15:03.448 INFO:tasks.workunit.client.1.vm05.stdout:6/32: write f1 [3516716,32900] 0 2026-03-09T16:15:03.450 INFO:tasks.workunit.client.1.vm05.stdout:6/33: sync 2026-03-09T16:15:03.450 INFO:tasks.workunit.client.1.vm05.stdout:6/34: dread - f7 zero size 2026-03-09T16:15:03.451 INFO:tasks.workunit.client.1.vm05.stdout:6/35: dread - f2 zero size 2026-03-09T16:15:03.451 INFO:tasks.workunit.client.1.vm05.stdout:6/36: dread - f2 zero size 2026-03-09T16:15:03.453 INFO:tasks.workunit.client.1.vm05.stdout:5/18: truncate f1 7355369 0 2026-03-09T16:15:03.454 INFO:tasks.workunit.client.1.vm05.stdout:5/19: write f5 [564317,71673] 0 2026-03-09T16:15:03.456 INFO:tasks.workunit.client.1.vm05.stdout:6/37: unlink c8 0 2026-03-09T16:15:03.456 INFO:tasks.workunit.client.1.vm05.stdout:6/38: rmdir - no directory 2026-03-09T16:15:03.457 INFO:tasks.workunit.client.1.vm05.stdout:4/28: truncate d5/f6 3809111 0 2026-03-09T16:15:03.458 INFO:tasks.workunit.client.1.vm05.stdout:9/35: truncate f0 1901480 0 2026-03-09T16:15:03.459 INFO:tasks.workunit.client.1.vm05.stdout:9/36: chown d4/f6 7834258 1 2026-03-09T16:15:03.461 INFO:tasks.workunit.client.1.vm05.stdout:6/39: dwrite f1 [0,4194304] 0 2026-03-09T16:15:03.462 INFO:tasks.workunit.client.1.vm05.stdout:6/40: chown f1 45 1 2026-03-09T16:15:03.464 INFO:tasks.workunit.client.1.vm05.stdout:4/29: mknod d5/c7 0 2026-03-09T16:15:03.464 INFO:tasks.workunit.client.1.vm05.stdout:4/30: chown f4 7945 1 
2026-03-09T16:15:03.472 INFO:tasks.workunit.client.1.vm05.stdout:4/31: dwrite f4 [0,4194304] 0 2026-03-09T16:15:03.473 INFO:tasks.workunit.client.1.vm05.stdout:4/32: chown f1 20832 1 2026-03-09T16:15:03.478 INFO:tasks.workunit.client.1.vm05.stdout:9/37: mknod d4/c9 0 2026-03-09T16:15:03.479 INFO:tasks.workunit.client.1.vm05.stdout:9/38: dread - d4/f6 zero size 2026-03-09T16:15:03.479 INFO:tasks.workunit.client.1.vm05.stdout:1/28: getdents d7/dd 0 2026-03-09T16:15:03.479 INFO:tasks.workunit.client.1.vm05.stdout:5/20: mknod d8/c9 0 2026-03-09T16:15:03.480 INFO:tasks.workunit.client.1.vm05.stdout:7/47: truncate d1/d2/f5 3772103 0 2026-03-09T16:15:03.481 INFO:tasks.workunit.client.1.vm05.stdout:6/41: creat f9 x:0 0 0 2026-03-09T16:15:03.481 INFO:tasks.workunit.client.1.vm05.stdout:4/33: symlink d5/l8 0 2026-03-09T16:15:03.485 INFO:tasks.workunit.client.1.vm05.stdout:0/47: unlink d5/db/c14 0 2026-03-09T16:15:03.485 INFO:tasks.workunit.client.1.vm05.stdout:7/48: creat d1/d2/ff x:0 0 0 2026-03-09T16:15:03.486 INFO:tasks.workunit.client.1.vm05.stdout:5/21: sync 2026-03-09T16:15:03.490 INFO:tasks.workunit.client.1.vm05.stdout:6/42: creat fa x:0 0 0 2026-03-09T16:15:03.490 INFO:tasks.workunit.client.1.vm05.stdout:4/34: dread f0 [0,4194304] 0 2026-03-09T16:15:03.491 INFO:tasks.workunit.client.1.vm05.stdout:0/48: creat d5/d11/f16 x:0 0 0 2026-03-09T16:15:03.491 INFO:tasks.workunit.client.1.vm05.stdout:5/22: mknod d8/ca 0 2026-03-09T16:15:03.491 INFO:tasks.workunit.client.1.vm05.stdout:7/49: symlink d1/d2/d8/l10 0 2026-03-09T16:15:03.495 INFO:tasks.workunit.client.1.vm05.stdout:5/23: creat d8/fb x:0 0 0 2026-03-09T16:15:03.496 INFO:tasks.workunit.client.1.vm05.stdout:4/35: dread f0 [0,4194304] 0 2026-03-09T16:15:03.501 INFO:tasks.workunit.client.1.vm05.stdout:4/36: dread f4 [0,4194304] 0 2026-03-09T16:15:03.502 INFO:tasks.workunit.client.1.vm05.stdout:0/49: dwrite d5/f9 [0,4194304] 0 2026-03-09T16:15:03.517 INFO:tasks.workunit.client.1.vm05.stdout:5/24: unlink l7 0 2026-03-09T16:15:03.517 INFO:tasks.workunit.client.1.vm05.stdout:0/50: creat d5/f17 x:0 0 0 2026-03-09T16:15:03.518 INFO:tasks.workunit.client.1.vm05.stdout:5/25: chown f1 0 1 2026-03-09T16:15:03.519 INFO:tasks.workunit.client.1.vm05.stdout:0/51: write d5/d11/f16 [252401,49628] 0 2026-03-09T16:15:03.519 INFO:tasks.workunit.client.1.vm05.stdout:7/50: dwrite d1/d2/fe [0,4194304] 0 2026-03-09T16:15:03.519 INFO:tasks.workunit.client.1.vm05.stdout:5/26: rename d8 to d8/dc 22 2026-03-09T16:15:03.525 INFO:tasks.workunit.client.1.vm05.stdout:4/37: dwrite f1 [0,4194304] 0 2026-03-09T16:15:03.531 INFO:tasks.workunit.client.1.vm05.stdout:4/38: creat d5/f9 x:0 0 0 2026-03-09T16:15:03.532 INFO:tasks.workunit.client.1.vm05.stdout:4/39: write f4 [2879283,6334] 0 2026-03-09T16:15:03.532 INFO:tasks.workunit.client.1.vm05.stdout:7/51: mkdir d1/d2/d11 0 2026-03-09T16:15:03.533 INFO:tasks.workunit.client.1.vm05.stdout:5/27: creat d8/fd x:0 0 0 2026-03-09T16:15:03.533 INFO:tasks.workunit.client.1.vm05.stdout:7/52: truncate d1/d2/f4 4823700 0 2026-03-09T16:15:03.539 INFO:tasks.workunit.client.1.vm05.stdout:0/52: dwrite d5/f9 [0,4194304] 0 2026-03-09T16:15:03.542 INFO:tasks.workunit.client.1.vm05.stdout:4/40: dwrite f0 [0,4194304] 0 2026-03-09T16:15:03.554 INFO:tasks.workunit.client.1.vm05.stdout:0/53: write d5/f8 [463153,28864] 0 2026-03-09T16:15:03.554 INFO:tasks.workunit.client.1.vm05.stdout:4/41: creat d5/fa x:0 0 0 2026-03-09T16:15:03.559 INFO:tasks.workunit.client.1.vm05.stdout:4/42: dwrite d5/fa [0,4194304] 0 2026-03-09T16:15:03.567 
INFO:tasks.workunit.client.1.vm05.stdout:4/43: dread f1 [0,4194304] 0 2026-03-09T16:15:03.570 INFO:tasks.workunit.client.1.vm05.stdout:0/54: unlink c4 0 2026-03-09T16:15:03.570 INFO:tasks.workunit.client.1.vm05.stdout:8/25: fdatasync f2 0 2026-03-09T16:15:03.577 INFO:tasks.workunit.client.1.vm05.stdout:8/26: mkdir d4/d6 0 2026-03-09T16:15:03.577 INFO:tasks.workunit.client.1.vm05.stdout:0/55: creat d5/d11/f18 x:0 0 0 2026-03-09T16:15:03.578 INFO:tasks.workunit.client.1.vm05.stdout:0/56: truncate d5/f17 561724 0 2026-03-09T16:15:03.578 INFO:tasks.workunit.client.1.vm05.stdout:8/27: mknod d4/d6/c7 0 2026-03-09T16:15:03.585 INFO:tasks.workunit.client.1.vm05.stdout:0/57: chown d5/c15 89042768 1 2026-03-09T16:15:03.585 INFO:tasks.workunit.client.1.vm05.stdout:8/28: write f0 [3880323,100017] 0 2026-03-09T16:15:03.585 INFO:tasks.workunit.client.1.vm05.stdout:8/29: rename d4 to d4/d8 22 2026-03-09T16:15:03.600 INFO:tasks.workunit.client.1.vm05.stdout:8/30: dread f0 [0,4194304] 0 2026-03-09T16:15:03.600 INFO:tasks.workunit.client.1.vm05.stdout:0/58: dwrite d5/f9 [0,4194304] 0 2026-03-09T16:15:03.607 INFO:tasks.workunit.client.1.vm05.stdout:0/59: truncate d5/d11/f18 218954 0 2026-03-09T16:15:03.613 INFO:tasks.workunit.client.1.vm05.stdout:4/44: sync 2026-03-09T16:15:03.615 INFO:tasks.workunit.client.1.vm05.stdout:4/45: readlink d5/l8 0 2026-03-09T16:15:03.618 INFO:tasks.workunit.client.1.vm05.stdout:4/46: creat d5/fb x:0 0 0 2026-03-09T16:15:03.619 INFO:tasks.workunit.client.1.vm05.stdout:3/21: dread d0/f5 [0,4194304] 0 2026-03-09T16:15:03.619 INFO:tasks.workunit.client.1.vm05.stdout:4/47: symlink d5/lc 0 2026-03-09T16:15:03.621 INFO:tasks.workunit.client.1.vm05.stdout:3/22: unlink d0/f5 0 2026-03-09T16:15:03.627 INFO:tasks.workunit.client.1.vm05.stdout:3/23: link d0/l3 d0/l6 0 2026-03-09T16:15:03.627 INFO:tasks.workunit.client.1.vm05.stdout:3/24: dread - no filename 2026-03-09T16:15:03.627 INFO:tasks.workunit.client.1.vm05.stdout:3/25: write - no filename 2026-03-09T16:15:03.627 INFO:tasks.workunit.client.1.vm05.stdout:3/26: fsync - no filename 2026-03-09T16:15:03.628 INFO:tasks.workunit.client.1.vm05.stdout:3/27: symlink d0/l7 0 2026-03-09T16:15:03.628 INFO:tasks.workunit.client.1.vm05.stdout:3/28: fsync - no filename 2026-03-09T16:15:03.630 INFO:tasks.workunit.client.1.vm05.stdout:3/29: mkdir d0/d8 0 2026-03-09T16:15:03.638 INFO:tasks.workunit.client.1.vm05.stdout:3/30: mkdir d0/d9 0 2026-03-09T16:15:03.638 INFO:tasks.workunit.client.1.vm05.stdout:3/31: chown d0/d8 606 1 2026-03-09T16:15:03.638 INFO:tasks.workunit.client.1.vm05.stdout:3/32: write - no filename 2026-03-09T16:15:03.654 INFO:tasks.workunit.client.1.vm05.stdout:3/33: creat d0/d9/fa x:0 0 0 2026-03-09T16:15:03.654 INFO:tasks.workunit.client.1.vm05.stdout:3/34: fdatasync d0/d9/fa 0 2026-03-09T16:15:03.655 INFO:tasks.workunit.client.1.vm05.stdout:6/43: rename f7 to fb 0 2026-03-09T16:15:03.655 INFO:tasks.workunit.client.1.vm05.stdout:6/44: truncate fb 739268 0 2026-03-09T16:15:03.656 INFO:tasks.workunit.client.1.vm05.stdout:6/45: dread - fa zero size 2026-03-09T16:15:03.660 INFO:tasks.workunit.client.1.vm05.stdout:3/35: mknod d0/d9/cb 0 2026-03-09T16:15:03.665 INFO:tasks.workunit.client.1.vm05.stdout:6/46: dwrite f2 [0,4194304] 0 2026-03-09T16:15:03.671 INFO:tasks.workunit.client.1.vm05.stdout:6/47: dwrite f9 [0,4194304] 0 2026-03-09T16:15:03.693 INFO:tasks.workunit.client.1.vm05.stdout:2/52: truncate f7 1441795 0 2026-03-09T16:15:03.698 INFO:tasks.workunit.client.1.vm05.stdout:4/48: fsync d5/f6 0 2026-03-09T16:15:03.700 
INFO:tasks.workunit.client.1.vm05.stdout:9/39: dread f0 [0,4194304] 0 2026-03-09T16:15:03.706 INFO:tasks.workunit.client.1.vm05.stdout:7/53: dread d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:03.710 INFO:tasks.workunit.client.1.vm05.stdout:5/28: rmdir d8 39 2026-03-09T16:15:03.715 INFO:tasks.workunit.client.1.vm05.stdout:8/31: getdents d4/d6 0 2026-03-09T16:15:03.716 INFO:tasks.workunit.client.1.vm05.stdout:0/60: truncate d5/db/fc 3030106 0 2026-03-09T16:15:03.718 INFO:tasks.workunit.client.1.vm05.stdout:3/36: rename d0/d8 to d0/d9/dc 0 2026-03-09T16:15:03.718 INFO:tasks.workunit.client.1.vm05.stdout:3/37: chown d0/d9/dc 255555 1 2026-03-09T16:15:03.719 INFO:tasks.workunit.client.1.vm05.stdout:3/38: write d0/d9/fa [834798,99745] 0 2026-03-09T16:15:03.719 INFO:tasks.workunit.client.1.vm05.stdout:6/48: unlink f2 0 2026-03-09T16:15:03.720 INFO:tasks.workunit.client.1.vm05.stdout:9/40: sync 2026-03-09T16:15:03.721 INFO:tasks.workunit.client.1.vm05.stdout:2/53: creat db/dd/f10 x:0 0 0 2026-03-09T16:15:03.722 INFO:tasks.workunit.client.1.vm05.stdout:9/41: write f2 [3612156,67732] 0 2026-03-09T16:15:03.723 INFO:tasks.workunit.client.1.vm05.stdout:7/54: write d1/d2/f5 [4222177,95383] 0 2026-03-09T16:15:03.724 INFO:tasks.workunit.client.1.vm05.stdout:1/29: write d7/dd/de/ff [121792,44475] 0 2026-03-09T16:15:03.725 INFO:tasks.workunit.client.1.vm05.stdout:5/29: truncate d8/fb 949119 0 2026-03-09T16:15:03.725 INFO:tasks.workunit.client.1.vm05.stdout:8/32: write f0 [269635,120147] 0 2026-03-09T16:15:03.726 INFO:tasks.workunit.client.1.vm05.stdout:0/61: symlink d5/d11/l19 0 2026-03-09T16:15:03.727 INFO:tasks.workunit.client.1.vm05.stdout:1/30: rename d7 to d7/dd/d10 22 2026-03-09T16:15:03.727 INFO:tasks.workunit.client.1.vm05.stdout:7/55: write d1/d2/fe [4598455,107454] 0 2026-03-09T16:15:03.728 INFO:tasks.workunit.client.1.vm05.stdout:2/54: unlink f3 0 2026-03-09T16:15:03.728 INFO:tasks.workunit.client.1.vm05.stdout:0/62: dread d5/f17 [0,4194304] 0 2026-03-09T16:15:03.729 INFO:tasks.workunit.client.1.vm05.stdout:7/56: readlink d1/d2/d8/lb 0 2026-03-09T16:15:03.730 INFO:tasks.workunit.client.1.vm05.stdout:3/39: dwrite d0/d9/fa [0,4194304] 0 2026-03-09T16:15:03.734 INFO:tasks.workunit.client.1.vm05.stdout:9/42: creat d4/fa x:0 0 0 2026-03-09T16:15:03.735 INFO:tasks.workunit.client.1.vm05.stdout:3/40: read d0/d9/fa [1642229,17309] 0 2026-03-09T16:15:03.738 INFO:tasks.workunit.client.1.vm05.stdout:5/30: dread f5 [0,4194304] 0 2026-03-09T16:15:03.740 INFO:tasks.workunit.client.1.vm05.stdout:3/41: read d0/d9/fa [1932643,93890] 0 2026-03-09T16:15:03.743 INFO:tasks.workunit.client.1.vm05.stdout:0/63: dwrite d5/d11/f18 [0,4194304] 0 2026-03-09T16:15:03.743 INFO:tasks.workunit.client.1.vm05.stdout:9/43: dread f2 [0,4194304] 0 2026-03-09T16:15:03.743 INFO:tasks.workunit.client.1.vm05.stdout:0/64: fsync d5/db/f12 0 2026-03-09T16:15:03.745 INFO:tasks.workunit.client.1.vm05.stdout:0/65: fsync d5/f9 0 2026-03-09T16:15:03.747 INFO:tasks.workunit.client.1.vm05.stdout:0/66: truncate d5/d11/f16 336714 0 2026-03-09T16:15:03.785 INFO:tasks.workunit.client.1.vm05.stdout:8/33: rmdir d4 39 2026-03-09T16:15:03.786 INFO:tasks.workunit.client.1.vm05.stdout:1/31: truncate d7/f9 173794 0 2026-03-09T16:15:03.787 INFO:tasks.workunit.client.1.vm05.stdout:6/49: link f1 fc 0 2026-03-09T16:15:03.789 INFO:tasks.workunit.client.1.vm05.stdout:5/31: rename d8/c9 to d8/ce 0 2026-03-09T16:15:03.790 INFO:tasks.workunit.client.1.vm05.stdout:3/42: creat d0/fd x:0 0 0 2026-03-09T16:15:03.790 INFO:tasks.workunit.client.1.vm05.stdout:9/44: rename d4/c7 
to d4/d8/cb 0 2026-03-09T16:15:03.792 INFO:tasks.workunit.client.1.vm05.stdout:9/45: dread - d4/fa zero size 2026-03-09T16:15:03.792 INFO:tasks.workunit.client.1.vm05.stdout:9/46: read f0 [697157,104705] 0 2026-03-09T16:15:03.792 INFO:tasks.workunit.client.1.vm05.stdout:1/32: creat d7/dd/f11 x:0 0 0 2026-03-09T16:15:03.792 INFO:tasks.workunit.client.1.vm05.stdout:7/57: mknod d1/d2/d11/c12 0 2026-03-09T16:15:03.793 INFO:tasks.workunit.client.1.vm05.stdout:6/50: mknod cd 0 2026-03-09T16:15:03.794 INFO:tasks.workunit.client.1.vm05.stdout:6/51: chown f5 700 1 2026-03-09T16:15:03.796 INFO:tasks.workunit.client.1.vm05.stdout:6/52: chown cd 1 1 2026-03-09T16:15:03.796 INFO:tasks.workunit.client.1.vm05.stdout:8/34: creat d4/d6/f9 x:0 0 0 2026-03-09T16:15:03.796 INFO:tasks.workunit.client.1.vm05.stdout:9/47: symlink d4/lc 0 2026-03-09T16:15:03.797 INFO:tasks.workunit.client.1.vm05.stdout:1/33: fdatasync d7/fb 0 2026-03-09T16:15:03.799 INFO:tasks.workunit.client.1.vm05.stdout:8/35: write f0 [1292057,66010] 0 2026-03-09T16:15:03.800 INFO:tasks.workunit.client.1.vm05.stdout:6/53: symlink le 0 2026-03-09T16:15:03.801 INFO:tasks.workunit.client.1.vm05.stdout:2/55: getdents db 0 2026-03-09T16:15:03.801 INFO:tasks.workunit.client.1.vm05.stdout:1/34: mknod d7/dd/de/c12 0 2026-03-09T16:15:03.801 INFO:tasks.workunit.client.1.vm05.stdout:9/48: symlink d4/ld 0 2026-03-09T16:15:03.802 INFO:tasks.workunit.client.1.vm05.stdout:7/58: dread d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:03.802 INFO:tasks.workunit.client.1.vm05.stdout:1/35: rename d7 to d7/dd/de/d13 22 2026-03-09T16:15:03.802 INFO:tasks.workunit.client.1.vm05.stdout:8/36: creat d4/d6/fa x:0 0 0 2026-03-09T16:15:03.803 INFO:tasks.workunit.client.1.vm05.stdout:1/36: chown d7/dd 1 1 2026-03-09T16:15:03.804 INFO:tasks.workunit.client.1.vm05.stdout:2/56: mknod db/dd/de/c11 0 2026-03-09T16:15:03.805 INFO:tasks.workunit.client.1.vm05.stdout:1/37: dread d7/f9 [0,4194304] 0 2026-03-09T16:15:03.805 INFO:tasks.workunit.client.1.vm05.stdout:8/37: chown d4/d6/f9 11101 1 2026-03-09T16:15:03.817 INFO:tasks.workunit.client.1.vm05.stdout:2/57: rmdir db 39 2026-03-09T16:15:03.817 INFO:tasks.workunit.client.1.vm05.stdout:9/49: dwrite f0 [0,4194304] 0 2026-03-09T16:15:03.824 INFO:tasks.workunit.client.1.vm05.stdout:7/59: dwrite d1/d2/f4 [4194304,4194304] 0 2026-03-09T16:15:03.836 INFO:tasks.workunit.client.1.vm05.stdout:1/38: symlink d7/dd/de/l14 0 2026-03-09T16:15:03.837 INFO:tasks.workunit.client.1.vm05.stdout:8/38: mkdir d4/d6/db 0 2026-03-09T16:15:03.844 INFO:tasks.workunit.client.1.vm05.stdout:2/58: unlink l0 0 2026-03-09T16:15:03.848 INFO:tasks.workunit.client.1.vm05.stdout:1/39: mkdir d7/d15 0 2026-03-09T16:15:03.856 INFO:tasks.workunit.client.1.vm05.stdout:9/50: dwrite f0 [0,4194304] 0 2026-03-09T16:15:03.856 INFO:tasks.workunit.client.1.vm05.stdout:0/67: dread d5/f8 [0,4194304] 0 2026-03-09T16:15:03.857 INFO:tasks.workunit.client.1.vm05.stdout:0/68: stat d5/d11/l19 0 2026-03-09T16:15:03.858 INFO:tasks.workunit.client.1.vm05.stdout:7/60: dwrite d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:03.858 INFO:tasks.workunit.client.1.vm05.stdout:1/40: stat d7/dd/de/l14 0 2026-03-09T16:15:03.867 INFO:tasks.workunit.client.1.vm05.stdout:7/61: write d1/d2/ff [607725,3649] 0 2026-03-09T16:15:03.868 INFO:tasks.workunit.client.1.vm05.stdout:7/62: truncate d1/d2/ff 1305453 0 2026-03-09T16:15:03.869 INFO:tasks.workunit.client.1.vm05.stdout:9/51: symlink d4/le 0 2026-03-09T16:15:03.870 INFO:tasks.workunit.client.1.vm05.stdout:0/69: mkdir d5/d11/d1a 0 2026-03-09T16:15:03.870 
INFO:tasks.workunit.client.1.vm05.stdout:9/52: rename d4 to d4/d8/df 22 2026-03-09T16:15:03.871 INFO:tasks.workunit.client.1.vm05.stdout:9/53: write f0 [3770497,5556] 0 2026-03-09T16:15:03.871 INFO:tasks.workunit.client.1.vm05.stdout:1/41: getdents d7/d15 0 2026-03-09T16:15:03.873 INFO:tasks.workunit.client.1.vm05.stdout:1/42: write d7/dd/f11 [770666,68445] 0 2026-03-09T16:15:03.873 INFO:tasks.workunit.client.1.vm05.stdout:1/43: truncate f0 654866 0 2026-03-09T16:15:03.873 INFO:tasks.workunit.client.1.vm05.stdout:1/44: chown d7 66 1 2026-03-09T16:15:03.874 INFO:tasks.workunit.client.1.vm05.stdout:7/63: mknod d1/d2/d6/c13 0 2026-03-09T16:15:03.875 INFO:tasks.workunit.client.1.vm05.stdout:9/54: mkdir d4/d10 0 2026-03-09T16:15:03.882 INFO:tasks.workunit.client.1.vm05.stdout:0/70: dwrite d5/db/fc [0,4194304] 0 2026-03-09T16:15:03.883 INFO:tasks.workunit.client.1.vm05.stdout:9/55: read - d4/f6 zero size 2026-03-09T16:15:03.888 INFO:tasks.workunit.client.1.vm05.stdout:0/71: fsync d5/d11/f18 0 2026-03-09T16:15:03.892 INFO:tasks.workunit.client.1.vm05.stdout:9/56: mknod d4/c11 0 2026-03-09T16:15:03.894 INFO:tasks.workunit.client.1.vm05.stdout:1/45: dwrite d7/fc [0,4194304] 0 2026-03-09T16:15:03.894 INFO:tasks.workunit.client.1.vm05.stdout:9/57: dread - d4/fa zero size 2026-03-09T16:15:03.896 INFO:tasks.workunit.client.1.vm05.stdout:0/72: dread d5/f8 [0,4194304] 0 2026-03-09T16:15:03.898 INFO:tasks.workunit.client.1.vm05.stdout:9/58: truncate d4/fa 523611 0 2026-03-09T16:15:03.901 INFO:tasks.workunit.client.1.vm05.stdout:9/59: chown f2 6 1 2026-03-09T16:15:03.901 INFO:tasks.workunit.client.1.vm05.stdout:1/46: unlink l4 0 2026-03-09T16:15:03.902 INFO:tasks.workunit.client.1.vm05.stdout:9/60: truncate f2 4840275 0 2026-03-09T16:15:03.902 INFO:tasks.workunit.client.1.vm05.stdout:0/73: fdatasync d5/f17 0 2026-03-09T16:15:03.907 INFO:tasks.workunit.client.1.vm05.stdout:1/47: mkdir d7/d15/d16 0 2026-03-09T16:15:03.911 INFO:tasks.workunit.client.1.vm05.stdout:1/48: rename d7/dd/de/l14 to d7/dd/de/l17 0 2026-03-09T16:15:03.916 INFO:tasks.workunit.client.1.vm05.stdout:9/61: dwrite d4/fa [0,4194304] 0 2026-03-09T16:15:03.916 INFO:tasks.workunit.client.1.vm05.stdout:9/62: dread - d4/f6 zero size 2026-03-09T16:15:03.917 INFO:tasks.workunit.client.1.vm05.stdout:9/63: unlink f0 0 2026-03-09T16:15:03.926 INFO:tasks.workunit.client.1.vm05.stdout:9/64: write d4/f6 [965224,12997] 0 2026-03-09T16:15:03.937 INFO:tasks.workunit.client.1.vm05.stdout:6/54: fsync f1 0 2026-03-09T16:15:03.942 INFO:tasks.workunit.client.1.vm05.stdout:5/32: getdents d8 0 2026-03-09T16:15:03.943 INFO:tasks.workunit.client.1.vm05.stdout:3/43: fsync d0/fd 0 2026-03-09T16:15:03.943 INFO:tasks.workunit.client.1.vm05.stdout:5/33: readlink l3 0 2026-03-09T16:15:03.946 INFO:tasks.workunit.client.1.vm05.stdout:6/55: dwrite f5 [0,4194304] 0 2026-03-09T16:15:03.947 INFO:tasks.workunit.client.1.vm05.stdout:7/64: fsync d1/d2/f5 0 2026-03-09T16:15:03.947 INFO:tasks.workunit.client.1.vm05.stdout:3/44: dread - d0/fd zero size 2026-03-09T16:15:03.947 INFO:tasks.workunit.client.1.vm05.stdout:6/56: symlink lf 0 2026-03-09T16:15:03.948 INFO:tasks.workunit.client.1.vm05.stdout:6/57: read - fa zero size 2026-03-09T16:15:03.948 INFO:tasks.workunit.client.1.vm05.stdout:6/58: read fc [3691636,82025] 0 2026-03-09T16:15:03.948 INFO:tasks.workunit.client.1.vm05.stdout:6/59: stat l6 0 2026-03-09T16:15:03.950 INFO:tasks.workunit.client.1.vm05.stdout:5/34: mknod d8/cf 0 2026-03-09T16:15:03.952 INFO:tasks.workunit.client.1.vm05.stdout:6/60: unlink lf 0 
2026-03-09T16:15:03.952 INFO:tasks.workunit.client.1.vm05.stdout:6/61: chown cd 1066275145 1 2026-03-09T16:15:03.952 INFO:tasks.workunit.client.1.vm05.stdout:6/62: fdatasync f9 0 2026-03-09T16:15:03.953 INFO:tasks.workunit.client.1.vm05.stdout:6/63: dread - fa zero size 2026-03-09T16:15:03.955 INFO:tasks.workunit.client.1.vm05.stdout:5/35: rmdir d8 39 2026-03-09T16:15:03.955 INFO:tasks.workunit.client.1.vm05.stdout:6/64: unlink f1 0 2026-03-09T16:15:03.958 INFO:tasks.workunit.client.1.vm05.stdout:3/45: getdents d0/d9 0 2026-03-09T16:15:03.962 INFO:tasks.workunit.client.1.vm05.stdout:6/65: dwrite fb [0,4194304] 0 2026-03-09T16:15:03.962 INFO:tasks.workunit.client.1.vm05.stdout:6/66: stat f5 0 2026-03-09T16:15:03.963 INFO:tasks.workunit.client.1.vm05.stdout:3/46: unlink d0/l3 0 2026-03-09T16:15:03.964 INFO:tasks.workunit.client.1.vm05.stdout:6/67: mknod c10 0 2026-03-09T16:15:03.964 INFO:tasks.workunit.client.1.vm05.stdout:3/47: stat d0/l7 0 2026-03-09T16:15:03.966 INFO:tasks.workunit.client.1.vm05.stdout:6/68: dread f5 [0,4194304] 0 2026-03-09T16:15:03.967 INFO:tasks.workunit.client.1.vm05.stdout:9/65: fsync d4/fa 0 2026-03-09T16:15:03.968 INFO:tasks.workunit.client.1.vm05.stdout:6/69: creat f11 x:0 0 0 2026-03-09T16:15:03.969 INFO:tasks.workunit.client.1.vm05.stdout:6/70: write fa [941106,22737] 0 2026-03-09T16:15:03.971 INFO:tasks.workunit.client.1.vm05.stdout:1/49: dread d7/dd/f11 [0,4194304] 0 2026-03-09T16:15:03.971 INFO:tasks.workunit.client.1.vm05.stdout:1/50: stat f1 0 2026-03-09T16:15:03.973 INFO:tasks.workunit.client.1.vm05.stdout:6/71: rename c3 to c12 0 2026-03-09T16:15:03.973 INFO:tasks.workunit.client.1.vm05.stdout:1/51: write d7/dd/de/ff [146863,77785] 0 2026-03-09T16:15:03.979 INFO:tasks.workunit.client.1.vm05.stdout:6/72: symlink l13 0 2026-03-09T16:15:03.979 INFO:tasks.workunit.client.1.vm05.stdout:6/73: chown l13 47631873 1 2026-03-09T16:15:03.984 INFO:tasks.workunit.client.1.vm05.stdout:6/74: dwrite fa [0,4194304] 0 2026-03-09T16:15:03.990 INFO:tasks.workunit.client.1.vm05.stdout:9/66: dwrite d4/fa [0,4194304] 0 2026-03-09T16:15:03.996 INFO:tasks.workunit.client.1.vm05.stdout:6/75: unlink l13 0 2026-03-09T16:15:03.996 INFO:tasks.workunit.client.1.vm05.stdout:8/39: fsync f2 0 2026-03-09T16:15:03.997 INFO:tasks.workunit.client.1.vm05.stdout:1/52: creat d7/d15/d16/f18 x:0 0 0 2026-03-09T16:15:04.004 INFO:tasks.workunit.client.1.vm05.stdout:9/67: unlink d4/lc 0 2026-03-09T16:15:04.004 INFO:tasks.workunit.client.1.vm05.stdout:4/49: truncate d5/f6 530244 0 2026-03-09T16:15:04.005 INFO:tasks.workunit.client.1.vm05.stdout:4/50: write f0 [156814,62948] 0 2026-03-09T16:15:04.005 INFO:tasks.workunit.client.1.vm05.stdout:8/40: mkdir d4/d6/db/dc 0 2026-03-09T16:15:04.005 INFO:tasks.workunit.client.1.vm05.stdout:1/53: creat d7/dd/f19 x:0 0 0 2026-03-09T16:15:04.018 INFO:tasks.workunit.client.1.vm05.stdout:9/68: dwrite d4/f6 [0,4194304] 0 2026-03-09T16:15:04.022 INFO:tasks.workunit.client.1.vm05.stdout:8/41: dwrite d4/d6/f9 [0,4194304] 0 2026-03-09T16:15:04.024 INFO:tasks.workunit.client.1.vm05.stdout:4/51: dwrite f0 [0,4194304] 0 2026-03-09T16:15:04.024 INFO:tasks.workunit.client.1.vm05.stdout:4/52: readlink d5/lc 0 2026-03-09T16:15:04.031 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:03 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:04.031 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:03 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 
2026-03-09T16:15:04.031 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:03 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:15:04.031 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:03 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:04.031 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:03 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:04.031 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:03 vm05.local ceph-mon[58702]: pgmap v11: 65 pgs: 65 active+clean; 2.0 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 31 MiB/s rd, 55 MiB/s wr, 182 op/s 2026-03-09T16:15:04.031 INFO:tasks.workunit.client.1.vm05.stdout:8/42: creat d4/d6/db/fd x:0 0 0 2026-03-09T16:15:04.034 INFO:tasks.workunit.client.1.vm05.stdout:2/59: write f5 [1051900,6879] 0 2026-03-09T16:15:04.035 INFO:tasks.workunit.client.1.vm05.stdout:7/65: rename d1/d2/d6 to d1/d2/d8/dc/d14 0 2026-03-09T16:15:04.035 INFO:tasks.workunit.client.1.vm05.stdout:9/69: dwrite d4/fa [0,4194304] 0 2026-03-09T16:15:04.041 INFO:tasks.workunit.client.1.vm05.stdout:4/53: creat d5/fd x:0 0 0 2026-03-09T16:15:04.041 INFO:tasks.workunit.client.1.vm05.stdout:1/54: fsync d7/f9 0 2026-03-09T16:15:04.045 INFO:tasks.workunit.client.1.vm05.stdout:7/66: unlink l0 0 2026-03-09T16:15:04.046 INFO:tasks.workunit.client.1.vm05.stdout:7/67: write d1/d2/fe [9340897,17855] 0 2026-03-09T16:15:04.047 INFO:tasks.workunit.client.1.vm05.stdout:4/54: mkdir d5/de 0 2026-03-09T16:15:04.050 INFO:tasks.workunit.client.1.vm05.stdout:4/55: fsync d5/f9 0 2026-03-09T16:15:04.050 INFO:tasks.workunit.client.1.vm05.stdout:4/56: symlink d5/lf 0 2026-03-09T16:15:04.050 INFO:tasks.workunit.client.1.vm05.stdout:1/55: chown d7/c8 267 1 2026-03-09T16:15:04.050 INFO:tasks.workunit.client.1.vm05.stdout:7/68: write d1/d2/fe [1741014,106672] 0 2026-03-09T16:15:04.057 INFO:tasks.workunit.client.1.vm05.stdout:4/57: write d5/f9 [91511,76036] 0 2026-03-09T16:15:04.059 INFO:tasks.workunit.client.1.vm05.stdout:4/58: chown d5 432654 1 2026-03-09T16:15:04.060 INFO:tasks.workunit.client.1.vm05.stdout:8/43: dread d4/d6/f9 [0,4194304] 0 2026-03-09T16:15:04.061 INFO:tasks.workunit.client.1.vm05.stdout:9/70: dwrite d4/f6 [0,4194304] 0 2026-03-09T16:15:04.063 INFO:tasks.workunit.client.1.vm05.stdout:1/56: link l5 d7/d15/l1a 0 2026-03-09T16:15:04.081 INFO:tasks.workunit.client.1.vm05.stdout:9/71: mknod d4/c12 0 2026-03-09T16:15:04.081 INFO:tasks.workunit.client.1.vm05.stdout:1/57: fdatasync d7/dd/f11 0 2026-03-09T16:15:04.081 INFO:tasks.workunit.client.1.vm05.stdout:1/58: unlink d7/dd/de/ff 0 2026-03-09T16:15:04.081 INFO:tasks.workunit.client.1.vm05.stdout:9/72: mknod d4/d8/c13 0 2026-03-09T16:15:04.081 INFO:tasks.workunit.client.1.vm05.stdout:9/73: chown d4 1 1 2026-03-09T16:15:04.081 INFO:tasks.workunit.client.1.vm05.stdout:1/59: chown l5 1607 1 2026-03-09T16:15:04.081 INFO:tasks.workunit.client.1.vm05.stdout:4/59: dwrite f4 [0,4194304] 0 2026-03-09T16:15:04.081 INFO:tasks.workunit.client.1.vm05.stdout:8/44: dread f0 [0,4194304] 0 2026-03-09T16:15:04.081 INFO:tasks.workunit.client.1.vm05.stdout:0/74: truncate d5/f7 227458 0 2026-03-09T16:15:04.081 INFO:tasks.workunit.client.1.vm05.stdout:9/74: chown d4/f6 79485390 1 2026-03-09T16:15:04.081 INFO:tasks.workunit.client.1.vm05.stdout:1/60: unlink l5 0 2026-03-09T16:15:04.082 
INFO:tasks.workunit.client.1.vm05.stdout:1/61: symlink d7/dd/de/l1b 0 2026-03-09T16:15:04.084 INFO:tasks.workunit.client.1.vm05.stdout:9/75: rename d4/le to d4/d10/l14 0 2026-03-09T16:15:04.085 INFO:tasks.workunit.client.1.vm05.stdout:0/75: rmdir d5/d11/d1a 0 2026-03-09T16:15:04.086 INFO:tasks.workunit.client.1.vm05.stdout:8/45: rename f2 to d4/d6/db/fe 0 2026-03-09T16:15:04.091 INFO:tasks.workunit.client.1.vm05.stdout:1/62: dwrite d7/fb [0,4194304] 0 2026-03-09T16:15:04.093 INFO:tasks.workunit.client.1.vm05.stdout:9/76: rmdir d4/d8 39 2026-03-09T16:15:04.094 INFO:tasks.workunit.client.1.vm05.stdout:0/76: mkdir d5/d1b 0 2026-03-09T16:15:04.097 INFO:tasks.workunit.client.1.vm05.stdout:1/63: creat d7/d15/d16/f1c x:0 0 0 2026-03-09T16:15:04.097 INFO:tasks.workunit.client.1.vm05.stdout:9/77: creat d4/d10/f15 x:0 0 0 2026-03-09T16:15:04.103 INFO:tasks.workunit.client.1.vm05.stdout:8/46: getdents d4 0 2026-03-09T16:15:04.107 INFO:tasks.workunit.client.1.vm05.stdout:1/64: dwrite f1 [4194304,4194304] 0 2026-03-09T16:15:04.109 INFO:tasks.workunit.client.1.vm05.stdout:1/65: symlink d7/l1d 0 2026-03-09T16:15:04.110 INFO:tasks.workunit.client.1.vm05.stdout:8/47: write f0 [3156879,69610] 0 2026-03-09T16:15:04.110 INFO:tasks.workunit.client.1.vm05.stdout:7/69: sync 2026-03-09T16:15:04.112 INFO:tasks.workunit.client.1.vm05.stdout:4/60: sync 2026-03-09T16:15:04.116 INFO:tasks.workunit.client.1.vm05.stdout:4/61: creat d5/f10 x:0 0 0 2026-03-09T16:15:04.118 INFO:tasks.workunit.client.1.vm05.stdout:7/70: dread d1/d2/ff [0,4194304] 0 2026-03-09T16:15:04.122 INFO:tasks.workunit.client.1.vm05.stdout:4/62: rename d5 to d5/de/d11 22 2026-03-09T16:15:04.125 INFO:tasks.workunit.client.1.vm05.stdout:7/71: mkdir d1/d2/d8/dc/d15 0 2026-03-09T16:15:04.126 INFO:tasks.workunit.client.1.vm05.stdout:7/72: mknod d1/d2/d8/c16 0 2026-03-09T16:15:04.132 INFO:tasks.workunit.client.1.vm05.stdout:7/73: dread d1/d2/fe [8388608,4194304] 0 2026-03-09T16:15:04.132 INFO:tasks.workunit.client.1.vm05.stdout:7/74: chown d1/d2/d8/dc 3979329 1 2026-03-09T16:15:04.134 INFO:tasks.workunit.client.1.vm05.stdout:7/75: rmdir d1/d2/d8 39 2026-03-09T16:15:04.138 INFO:tasks.workunit.client.1.vm05.stdout:4/63: sync 2026-03-09T16:15:04.139 INFO:tasks.workunit.client.1.vm05.stdout:7/76: rename d1/d2/f4 to d1/d2/d8/f17 0 2026-03-09T16:15:04.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:03 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:04.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:03 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:04.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:03 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:15:04.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:03 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:04.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:03 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:04.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:03 vm03.local ceph-mon[51019]: pgmap v11: 65 pgs: 65 active+clean; 2.0 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 31 MiB/s rd, 55 MiB/s wr, 182 op/s 2026-03-09T16:15:04.142 
INFO:tasks.workunit.client.1.vm05.stdout:7/77: stat d1/d2/d8/f17 0 2026-03-09T16:15:04.144 INFO:tasks.workunit.client.1.vm05.stdout:7/78: chown d1/d2/d8/dc/d15 11889043 1 2026-03-09T16:15:04.154 INFO:tasks.workunit.client.1.vm05.stdout:4/64: dwrite d5/f10 [0,4194304] 0 2026-03-09T16:15:04.158 INFO:tasks.workunit.client.1.vm05.stdout:7/79: dwrite d1/d2/fe [4194304,4194304] 0 2026-03-09T16:15:04.166 INFO:tasks.workunit.client.1.vm05.stdout:4/65: mknod d5/c12 0 2026-03-09T16:15:04.168 INFO:tasks.workunit.client.1.vm05.stdout:7/80: getdents d1/d2/d11 0 2026-03-09T16:15:04.175 INFO:tasks.workunit.client.1.vm05.stdout:7/81: dwrite d1/d2/ff [0,4194304] 0 2026-03-09T16:15:04.184 INFO:tasks.workunit.client.1.vm05.stdout:7/82: mkdir d1/d2/d8/dc/d18 0 2026-03-09T16:15:04.185 INFO:tasks.workunit.client.1.vm05.stdout:7/83: mkdir d1/d19 0 2026-03-09T16:15:04.186 INFO:tasks.workunit.client.1.vm05.stdout:7/84: chown d1/d2/d8/dc/d15 30914777 1 2026-03-09T16:15:04.187 INFO:tasks.workunit.client.1.vm05.stdout:7/85: chown d1/d2/l9 6522913 1 2026-03-09T16:15:04.187 INFO:tasks.workunit.client.1.vm05.stdout:7/86: chown d1/d2/d8 3336022 1 2026-03-09T16:15:04.194 INFO:tasks.workunit.client.1.vm05.stdout:7/87: dwrite d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:04.203 INFO:tasks.workunit.client.1.vm05.stdout:7/88: getdents d1/d2/d11 0 2026-03-09T16:15:04.235 INFO:tasks.workunit.client.1.vm05.stdout:7/89: fdatasync d1/d2/f5 0 2026-03-09T16:15:04.241 INFO:tasks.workunit.client.1.vm05.stdout:7/90: dread d1/d2/ff [0,4194304] 0 2026-03-09T16:15:04.249 INFO:tasks.workunit.client.1.vm05.stdout:5/36: getdents d8 0 2026-03-09T16:15:04.251 INFO:tasks.workunit.client.1.vm05.stdout:7/91: sync 2026-03-09T16:15:04.260 INFO:tasks.workunit.client.1.vm05.stdout:5/37: symlink d8/l10 0 2026-03-09T16:15:04.269 INFO:tasks.workunit.client.1.vm05.stdout:5/38: write f6 [760074,103055] 0 2026-03-09T16:15:04.272 INFO:tasks.workunit.client.1.vm05.stdout:3/48: dwrite d0/d9/fa [0,4194304] 0 2026-03-09T16:15:04.274 INFO:tasks.workunit.client.1.vm05.stdout:5/39: creat d8/f11 x:0 0 0 2026-03-09T16:15:04.276 INFO:tasks.workunit.client.1.vm05.stdout:5/40: chown c0 151 1 2026-03-09T16:15:04.277 INFO:tasks.workunit.client.1.vm05.stdout:7/92: dwrite d1/d2/ff [0,4194304] 0 2026-03-09T16:15:04.288 INFO:tasks.workunit.client.1.vm05.stdout:7/93: dwrite d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:04.294 INFO:tasks.workunit.client.1.vm05.stdout:5/41: dread f6 [0,4194304] 0 2026-03-09T16:15:04.297 INFO:tasks.workunit.client.1.vm05.stdout:7/94: dwrite d1/d2/fe [4194304,4194304] 0 2026-03-09T16:15:04.303 INFO:tasks.workunit.client.1.vm05.stdout:5/42: write d8/fb [719697,5600] 0 2026-03-09T16:15:04.304 INFO:tasks.workunit.client.1.vm05.stdout:5/43: read - d8/f11 zero size 2026-03-09T16:15:04.304 INFO:tasks.workunit.client.1.vm05.stdout:7/95: dread d1/d2/ff [0,4194304] 0 2026-03-09T16:15:04.319 INFO:tasks.workunit.client.1.vm05.stdout:7/96: dwrite d1/d2/d8/f17 [0,4194304] 0 2026-03-09T16:15:04.322 INFO:tasks.workunit.client.1.vm05.stdout:8/48: write d4/d6/f9 [4661416,128817] 0 2026-03-09T16:15:04.337 INFO:tasks.workunit.client.1.vm05.stdout:2/60: dread f5 [0,4194304] 0 2026-03-09T16:15:04.340 INFO:tasks.workunit.client.1.vm05.stdout:7/97: dread d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:04.345 INFO:tasks.workunit.client.1.vm05.stdout:7/98: dread d1/d2/d8/f17 [4194304,4194304] 0 2026-03-09T16:15:04.380 INFO:tasks.workunit.client.1.vm05.stdout:0/77: dread d5/f7 [0,4194304] 0 2026-03-09T16:15:04.382 INFO:tasks.workunit.client.1.vm05.stdout:0/78: rename d5/db/c10 to d5/db/c1c 0 
2026-03-09T16:15:04.389 INFO:tasks.workunit.client.1.vm05.stdout:0/79: mkdir d5/db/d1d 0 2026-03-09T16:15:04.390 INFO:tasks.workunit.client.1.vm05.stdout:0/80: creat d5/d11/f1e x:0 0 0 2026-03-09T16:15:04.391 INFO:tasks.workunit.client.1.vm05.stdout:0/81: mkdir d5/d1f 0 2026-03-09T16:15:04.392 INFO:tasks.workunit.client.1.vm05.stdout:0/82: truncate d5/d11/f16 338260 0 2026-03-09T16:15:04.393 INFO:tasks.workunit.client.1.vm05.stdout:0/83: stat d5/d11/f18 0 2026-03-09T16:15:04.397 INFO:tasks.workunit.client.1.vm05.stdout:0/84: dread d5/f9 [0,4194304] 0 2026-03-09T16:15:04.398 INFO:tasks.workunit.client.1.vm05.stdout:0/85: rename d5 to d5/db/d20 22 2026-03-09T16:15:04.403 INFO:tasks.workunit.client.1.vm05.stdout:8/49: fsync d4/d6/db/fe 0 2026-03-09T16:15:04.403 INFO:tasks.workunit.client.1.vm05.stdout:9/78: readlink d4/d10/l14 0 2026-03-09T16:15:04.403 INFO:tasks.workunit.client.1.vm05.stdout:9/79: stat f2 0 2026-03-09T16:15:04.403 INFO:tasks.workunit.client.1.vm05.stdout:0/86: rename d5/db to d5/db/d21 22 2026-03-09T16:15:04.404 INFO:tasks.workunit.client.1.vm05.stdout:8/50: fsync d4/d6/db/fd 0 2026-03-09T16:15:04.405 INFO:tasks.workunit.client.1.vm05.stdout:9/80: chown d4/d10/l14 13548 1 2026-03-09T16:15:04.405 INFO:tasks.workunit.client.1.vm05.stdout:0/87: write d5/db/fc [904496,49565] 0 2026-03-09T16:15:04.408 INFO:tasks.workunit.client.1.vm05.stdout:8/51: mkdir d4/d6/db/df 0 2026-03-09T16:15:04.408 INFO:tasks.workunit.client.1.vm05.stdout:1/66: write d7/f9 [204988,76323] 0 2026-03-09T16:15:04.409 INFO:tasks.workunit.client.1.vm05.stdout:9/81: chown d4/d10/f15 355695703 1 2026-03-09T16:15:04.410 INFO:tasks.workunit.client.1.vm05.stdout:9/82: write d4/d10/f15 [555343,19726] 0 2026-03-09T16:15:04.411 INFO:tasks.workunit.client.1.vm05.stdout:4/66: rmdir d5 39 2026-03-09T16:15:04.412 INFO:tasks.workunit.client.1.vm05.stdout:9/83: readlink d4/ld 0 2026-03-09T16:15:04.412 INFO:tasks.workunit.client.1.vm05.stdout:7/99: getdents d1/d2/d8/dc 0 2026-03-09T16:15:04.413 INFO:tasks.workunit.client.1.vm05.stdout:6/76: rename c12 to c14 0 2026-03-09T16:15:04.418 INFO:tasks.workunit.client.1.vm05.stdout:0/88: sync 2026-03-09T16:15:04.419 INFO:tasks.workunit.client.1.vm05.stdout:0/89: chown d5/db 108617 1 2026-03-09T16:15:04.421 INFO:tasks.workunit.client.1.vm05.stdout:1/67: dwrite f1 [4194304,4194304] 0 2026-03-09T16:15:04.424 INFO:tasks.workunit.client.1.vm05.stdout:5/44: fsync d8/f11 0 2026-03-09T16:15:04.426 INFO:tasks.workunit.client.1.vm05.stdout:7/100: creat d1/d2/d8/dc/f1a x:0 0 0 2026-03-09T16:15:04.431 INFO:tasks.workunit.client.1.vm05.stdout:2/61: truncate f5 278273 0 2026-03-09T16:15:04.431 INFO:tasks.workunit.client.1.vm05.stdout:6/77: mknod c15 0 2026-03-09T16:15:04.432 INFO:tasks.workunit.client.1.vm05.stdout:5/45: fsync f1 0 2026-03-09T16:15:04.435 INFO:tasks.workunit.client.1.vm05.stdout:0/90: symlink d5/db/d1d/l22 0 2026-03-09T16:15:04.435 INFO:tasks.workunit.client.1.vm05.stdout:3/49: rmdir d0/d9 39 2026-03-09T16:15:04.435 INFO:tasks.workunit.client.1.vm05.stdout:8/52: dwrite f0 [0,4194304] 0 2026-03-09T16:15:04.437 INFO:tasks.workunit.client.1.vm05.stdout:6/78: fsync fb 0 2026-03-09T16:15:04.437 INFO:tasks.workunit.client.1.vm05.stdout:7/101: write d1/d2/ff [445062,9784] 0 2026-03-09T16:15:04.439 INFO:tasks.workunit.client.1.vm05.stdout:4/67: link c3 d5/c13 0 2026-03-09T16:15:04.444 INFO:tasks.workunit.client.1.vm05.stdout:4/68: readlink d5/l8 0 2026-03-09T16:15:04.444 INFO:tasks.workunit.client.1.vm05.stdout:2/62: creat db/f12 x:0 0 0 2026-03-09T16:15:04.446 
INFO:tasks.workunit.client.1.vm05.stdout:4/69: write d5/f9 [178923,120239] 0 2026-03-09T16:15:04.447 INFO:tasks.workunit.client.1.vm05.stdout:6/79: stat f11 0 2026-03-09T16:15:04.447 INFO:tasks.workunit.client.1.vm05.stdout:4/70: fsync d5/fa 0 2026-03-09T16:15:04.448 INFO:tasks.workunit.client.1.vm05.stdout:6/80: chown f11 8 1 2026-03-09T16:15:04.449 INFO:tasks.workunit.client.1.vm05.stdout:0/91: dwrite d5/db/f12 [0,4194304] 0 2026-03-09T16:15:04.455 INFO:tasks.workunit.client.1.vm05.stdout:7/102: mkdir d1/d2/d8/dc/d1b 0 2026-03-09T16:15:04.457 INFO:tasks.workunit.client.1.vm05.stdout:5/46: mknod d8/c12 0 2026-03-09T16:15:04.457 INFO:tasks.workunit.client.1.vm05.stdout:3/50: fsync d0/fd 0 2026-03-09T16:15:04.458 INFO:tasks.workunit.client.1.vm05.stdout:0/92: write d5/d11/f16 [527146,127600] 0 2026-03-09T16:15:04.460 INFO:tasks.workunit.client.1.vm05.stdout:6/81: stat cd 0 2026-03-09T16:15:04.460 INFO:tasks.workunit.client.1.vm05.stdout:0/93: chown d5/db/d1d 58321846 1 2026-03-09T16:15:04.461 INFO:tasks.workunit.client.1.vm05.stdout:4/71: rename d5/c7 to d5/c14 0 2026-03-09T16:15:04.462 INFO:tasks.workunit.client.1.vm05.stdout:7/103: dwrite d1/d2/ff [0,4194304] 0 2026-03-09T16:15:04.463 INFO:tasks.workunit.client.1.vm05.stdout:8/53: dread d4/d6/db/fe [0,4194304] 0 2026-03-09T16:15:04.463 INFO:tasks.workunit.client.1.vm05.stdout:0/94: truncate d5/f7 1335182 0 2026-03-09T16:15:04.468 INFO:tasks.workunit.client.1.vm05.stdout:5/47: dread d8/fb [0,4194304] 0 2026-03-09T16:15:04.469 INFO:tasks.workunit.client.1.vm05.stdout:3/51: dwrite d0/fd [0,4194304] 0 2026-03-09T16:15:04.471 INFO:tasks.workunit.client.1.vm05.stdout:7/104: dread d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:04.471 INFO:tasks.workunit.client.1.vm05.stdout:7/105: chown d1/d2/f5 39237 1 2026-03-09T16:15:04.478 INFO:tasks.workunit.client.1.vm05.stdout:8/54: creat d4/f10 x:0 0 0 2026-03-09T16:15:04.489 INFO:tasks.workunit.client.1.vm05.stdout:5/48: creat d8/f13 x:0 0 0 2026-03-09T16:15:04.503 INFO:tasks.workunit.client.1.vm05.stdout:7/106: creat d1/d2/d11/f1c x:0 0 0 2026-03-09T16:15:04.507 INFO:tasks.workunit.client.1.vm05.stdout:7/107: write d1/d2/d8/dc/f1a [196234,74879] 0 2026-03-09T16:15:04.510 INFO:tasks.workunit.client.1.vm05.stdout:4/72: mkdir d5/de/d15 0 2026-03-09T16:15:04.510 INFO:tasks.workunit.client.1.vm05.stdout:4/73: fdatasync f4 0 2026-03-09T16:15:04.513 INFO:tasks.workunit.client.1.vm05.stdout:0/95: dread d5/f8 [0,4194304] 0 2026-03-09T16:15:04.514 INFO:tasks.workunit.client.1.vm05.stdout:8/55: mknod d4/c11 0 2026-03-09T16:15:04.514 INFO:tasks.workunit.client.1.vm05.stdout:0/96: chown d5/ce 3180 1 2026-03-09T16:15:04.533 INFO:tasks.workunit.client.1.vm05.stdout:3/52: rmdir d0/d9 39 2026-03-09T16:15:04.533 INFO:tasks.workunit.client.1.vm05.stdout:5/49: write d8/fb [1544119,106092] 0 2026-03-09T16:15:04.536 INFO:tasks.workunit.client.1.vm05.stdout:3/53: write d0/fd [1877885,103448] 0 2026-03-09T16:15:04.545 INFO:tasks.workunit.client.1.vm05.stdout:7/108: mknod d1/d2/d8/c1d 0 2026-03-09T16:15:04.546 INFO:tasks.workunit.client.1.vm05.stdout:4/74: creat d5/de/f16 x:0 0 0 2026-03-09T16:15:04.550 INFO:tasks.workunit.client.1.vm05.stdout:8/56: creat d4/d6/f12 x:0 0 0 2026-03-09T16:15:04.552 INFO:tasks.workunit.client.1.vm05.stdout:5/50: fsync f5 0 2026-03-09T16:15:04.552 INFO:tasks.workunit.client.1.vm05.stdout:5/51: chown f1 3 1 2026-03-09T16:15:04.553 INFO:tasks.workunit.client.1.vm05.stdout:7/109: creat d1/d2/d8/dc/f1e x:0 0 0 2026-03-09T16:15:04.555 INFO:tasks.workunit.client.1.vm05.stdout:3/54: write d0/d9/fa 
[1611477,96444] 0 2026-03-09T16:15:04.560 INFO:tasks.workunit.client.1.vm05.stdout:8/57: creat d4/f13 x:0 0 0 2026-03-09T16:15:04.563 INFO:tasks.workunit.client.1.vm05.stdout:9/84: truncate d4/f6 2580829 0 2026-03-09T16:15:04.565 INFO:tasks.workunit.client.1.vm05.stdout:1/68: dwrite d7/d15/d16/f18 [0,4194304] 0 2026-03-09T16:15:04.572 INFO:tasks.workunit.client.1.vm05.stdout:7/110: symlink d1/l1f 0 2026-03-09T16:15:04.580 INFO:tasks.workunit.client.1.vm05.stdout:1/69: dread f1 [4194304,4194304] 0 2026-03-09T16:15:04.582 INFO:tasks.workunit.client.1.vm05.stdout:1/70: write d7/fb [3680149,91021] 0 2026-03-09T16:15:04.590 INFO:tasks.workunit.client.1.vm05.stdout:8/58: mkdir d4/d6/db/d14 0 2026-03-09T16:15:04.590 INFO:tasks.workunit.client.1.vm05.stdout:9/85: unlink d4/c9 0 2026-03-09T16:15:04.592 INFO:tasks.workunit.client.1.vm05.stdout:7/111: write d1/d2/fe [4447788,118358] 0 2026-03-09T16:15:04.592 INFO:tasks.workunit.client.1.vm05.stdout:8/59: truncate d4/f10 273060 0 2026-03-09T16:15:04.601 INFO:tasks.workunit.client.1.vm05.stdout:2/63: write fa [2936358,49583] 0 2026-03-09T16:15:04.603 INFO:tasks.workunit.client.1.vm05.stdout:0/97: getdents d5/d11 0 2026-03-09T16:15:04.605 INFO:tasks.workunit.client.1.vm05.stdout:7/112: sync 2026-03-09T16:15:04.613 INFO:tasks.workunit.client.1.vm05.stdout:9/86: dwrite f2 [0,4194304] 0 2026-03-09T16:15:04.613 INFO:tasks.workunit.client.1.vm05.stdout:1/71: mknod d7/dd/de/c1e 0 2026-03-09T16:15:04.614 INFO:tasks.workunit.client.1.vm05.stdout:0/98: creat d5/d11/f23 x:0 0 0 2026-03-09T16:15:04.624 INFO:tasks.workunit.client.1.vm05.stdout:8/60: dwrite d4/d6/db/fd [0,4194304] 0 2026-03-09T16:15:04.624 INFO:tasks.workunit.client.1.vm05.stdout:1/72: readlink d7/dd/de/l1b 0 2026-03-09T16:15:04.624 INFO:tasks.workunit.client.1.vm05.stdout:8/61: dread - d4/d6/fa zero size 2026-03-09T16:15:04.631 INFO:tasks.workunit.client.1.vm05.stdout:0/99: dread d5/f7 [0,4194304] 0 2026-03-09T16:15:04.632 INFO:tasks.workunit.client.1.vm05.stdout:2/64: dread f7 [0,4194304] 0 2026-03-09T16:15:04.633 INFO:tasks.workunit.client.1.vm05.stdout:7/113: mknod d1/d2/c20 0 2026-03-09T16:15:04.637 INFO:tasks.workunit.client.1.vm05.stdout:1/73: truncate d7/dd/f11 965821 0 2026-03-09T16:15:04.639 INFO:tasks.workunit.client.1.vm05.stdout:6/82: dwrite fa [4194304,4194304] 0 2026-03-09T16:15:04.645 INFO:tasks.workunit.client.1.vm05.stdout:4/75: fsync d5/de/f16 0 2026-03-09T16:15:04.645 INFO:tasks.workunit.client.1.vm05.stdout:2/65: symlink db/l13 0 2026-03-09T16:15:04.649 INFO:tasks.workunit.client.1.vm05.stdout:7/114: write d1/d2/ff [3466533,35265] 0 2026-03-09T16:15:04.650 INFO:tasks.workunit.client.1.vm05.stdout:9/87: rename d4/ld to d4/l16 0 2026-03-09T16:15:04.651 INFO:tasks.workunit.client.1.vm05.stdout:7/115: readlink d1/l1f 0 2026-03-09T16:15:04.654 INFO:tasks.workunit.client.1.vm05.stdout:8/62: mkdir d4/d6/db/d14/d15 0 2026-03-09T16:15:04.654 INFO:tasks.workunit.client.1.vm05.stdout:7/116: chown d1/d2/d8/lb 10824 1 2026-03-09T16:15:04.656 INFO:tasks.workunit.client.1.vm05.stdout:7/117: fsync d1/d2/d11/f1c 0 2026-03-09T16:15:04.657 INFO:tasks.workunit.client.1.vm05.stdout:1/74: readlink d7/d15/l1a 0 2026-03-09T16:15:04.657 INFO:tasks.workunit.client.1.vm05.stdout:1/75: readlink d7/d15/l1a 0 2026-03-09T16:15:04.659 INFO:tasks.workunit.client.1.vm05.stdout:4/76: link d5/f9 d5/de/d15/f17 0 2026-03-09T16:15:04.664 INFO:tasks.workunit.client.1.vm05.stdout:8/63: mknod d4/c16 0 2026-03-09T16:15:04.664 INFO:tasks.workunit.client.1.vm05.stdout:6/83: dread f5 [0,4194304] 0 2026-03-09T16:15:04.664 
INFO:tasks.workunit.client.1.vm05.stdout:7/118: dwrite d1/d2/ff [4194304,4194304] 0 2026-03-09T16:15:04.676 INFO:tasks.workunit.client.1.vm05.stdout:8/64: chown d4/d6/db/fe 1432 1 2026-03-09T16:15:04.700 INFO:tasks.workunit.client.1.vm05.stdout:4/77: mknod d5/c18 0 2026-03-09T16:15:04.700 INFO:tasks.workunit.client.1.vm05.stdout:4/78: stat d5/fb 0 2026-03-09T16:15:04.704 INFO:tasks.workunit.client.1.vm05.stdout:9/88: creat d4/f17 x:0 0 0 2026-03-09T16:15:04.709 INFO:tasks.workunit.client.1.vm05.stdout:4/79: dwrite f0 [0,4194304] 0 2026-03-09T16:15:04.709 INFO:tasks.workunit.client.1.vm05.stdout:4/80: stat d5 0 2026-03-09T16:15:04.710 INFO:tasks.workunit.client.1.vm05.stdout:1/76: creat d7/dd/f1f x:0 0 0 2026-03-09T16:15:04.712 INFO:tasks.workunit.client.1.vm05.stdout:9/89: write d4/d10/f15 [1623158,40474] 0 2026-03-09T16:15:04.716 INFO:tasks.workunit.client.1.vm05.stdout:4/81: mkdir d5/d19 0 2026-03-09T16:15:04.725 INFO:tasks.workunit.client.1.vm05.stdout:7/119: symlink d1/d19/l21 0 2026-03-09T16:15:04.725 INFO:tasks.workunit.client.1.vm05.stdout:8/65: creat d4/d6/db/dc/f17 x:0 0 0 2026-03-09T16:15:04.726 INFO:tasks.workunit.client.1.vm05.stdout:5/52: write f5 [154682,58232] 0 2026-03-09T16:15:04.728 INFO:tasks.workunit.client.1.vm05.stdout:7/120: chown d1/d2/c20 12 1 2026-03-09T16:15:04.729 INFO:tasks.workunit.client.1.vm05.stdout:8/66: fsync d4/f10 0 2026-03-09T16:15:04.731 INFO:tasks.workunit.client.1.vm05.stdout:6/84: creat f16 x:0 0 0 2026-03-09T16:15:04.732 INFO:tasks.workunit.client.1.vm05.stdout:4/82: symlink d5/d19/l1a 0 2026-03-09T16:15:04.736 INFO:tasks.workunit.client.1.vm05.stdout:1/77: dwrite d7/f9 [0,4194304] 0 2026-03-09T16:15:04.739 INFO:tasks.workunit.client.1.vm05.stdout:4/83: rename d5/fa to d5/de/d15/f1b 0 2026-03-09T16:15:04.739 INFO:tasks.workunit.client.1.vm05.stdout:1/78: fdatasync d7/dd/f1f 0 2026-03-09T16:15:04.739 INFO:tasks.workunit.client.1.vm05.stdout:5/53: link d8/l10 d8/l14 0 2026-03-09T16:15:04.739 INFO:tasks.workunit.client.1.vm05.stdout:6/85: mkdir d17 0 2026-03-09T16:15:04.739 INFO:tasks.workunit.client.1.vm05.stdout:4/84: rename d5/de/d15 to d5/de/d15/d1c 22 2026-03-09T16:15:04.740 INFO:tasks.workunit.client.1.vm05.stdout:1/79: chown d7/d15/d16/f1c 355 1 2026-03-09T16:15:04.741 INFO:tasks.workunit.client.1.vm05.stdout:7/121: dread d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:04.744 INFO:tasks.workunit.client.1.vm05.stdout:1/80: read d7/dd/f11 [920842,25873] 0 2026-03-09T16:15:04.745 INFO:tasks.workunit.client.1.vm05.stdout:8/67: link d4/d6/db/dc/f17 d4/d6/db/df/f18 0 2026-03-09T16:15:04.745 INFO:tasks.workunit.client.1.vm05.stdout:4/85: symlink d5/l1d 0 2026-03-09T16:15:04.747 INFO:tasks.workunit.client.1.vm05.stdout:4/86: write d5/de/f16 [139528,94169] 0 2026-03-09T16:15:04.749 INFO:tasks.workunit.client.1.vm05.stdout:6/86: write f5 [4931352,121697] 0 2026-03-09T16:15:04.750 INFO:tasks.workunit.client.1.vm05.stdout:1/81: rmdir d7/d15/d16 39 2026-03-09T16:15:04.750 INFO:tasks.workunit.client.1.vm05.stdout:5/54: link d8/c12 d8/c15 0 2026-03-09T16:15:04.754 INFO:tasks.workunit.client.1.vm05.stdout:5/55: symlink d8/l16 0 2026-03-09T16:15:04.755 INFO:tasks.workunit.client.1.vm05.stdout:7/122: creat d1/d2/f22 x:0 0 0 2026-03-09T16:15:04.756 INFO:tasks.workunit.client.1.vm05.stdout:5/56: write d8/fd [151732,108999] 0 2026-03-09T16:15:04.758 INFO:tasks.workunit.client.1.vm05.stdout:6/87: creat d17/f18 x:0 0 0 2026-03-09T16:15:04.758 INFO:tasks.workunit.client.1.vm05.stdout:4/87: dwrite d5/fd [0,4194304] 0 2026-03-09T16:15:04.758 
INFO:tasks.workunit.client.1.vm05.stdout:1/82: sync 2026-03-09T16:15:04.759 INFO:tasks.workunit.client.1.vm05.stdout:7/123: sync 2026-03-09T16:15:04.763 INFO:tasks.workunit.client.1.vm05.stdout:7/124: write d1/d2/d8/f17 [1654554,80150] 0 2026-03-09T16:15:04.767 INFO:tasks.workunit.client.1.vm05.stdout:7/125: readlink d1/l1f 0 2026-03-09T16:15:04.768 INFO:tasks.workunit.client.1.vm05.stdout:7/126: chown d1/d2 165356724 1 2026-03-09T16:15:04.770 INFO:tasks.workunit.client.1.vm05.stdout:5/57: symlink d8/l17 0 2026-03-09T16:15:04.771 INFO:tasks.workunit.client.1.vm05.stdout:5/58: write d8/fb [2132307,57919] 0 2026-03-09T16:15:04.771 INFO:tasks.workunit.client.1.vm05.stdout:8/68: link d4/l5 d4/d6/l19 0 2026-03-09T16:15:04.775 INFO:tasks.workunit.client.1.vm05.stdout:5/59: write f5 [1433064,1159] 0 2026-03-09T16:15:04.780 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:04 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:04.780 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:04 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:04.780 INFO:tasks.workunit.client.1.vm05.stdout:5/60: write f1 [3333785,25312] 0 2026-03-09T16:15:04.780 INFO:tasks.workunit.client.1.vm05.stdout:7/127: dwrite d1/d2/fe [4194304,4194304] 0 2026-03-09T16:15:04.783 INFO:tasks.workunit.client.1.vm05.stdout:1/83: rmdir d7/d15/d16 39 2026-03-09T16:15:04.787 INFO:tasks.workunit.client.1.vm05.stdout:8/69: rename d4/c11 to d4/d6/c1a 0 2026-03-09T16:15:04.788 INFO:tasks.workunit.client.1.vm05.stdout:8/70: fsync d4/d6/f9 0 2026-03-09T16:15:04.791 INFO:tasks.workunit.client.1.vm05.stdout:7/128: truncate d1/d2/f22 659469 0 2026-03-09T16:15:04.791 INFO:tasks.workunit.client.1.vm05.stdout:5/61: fsync f6 0 2026-03-09T16:15:04.792 INFO:tasks.workunit.client.1.vm05.stdout:8/71: creat d4/d6/f1b x:0 0 0 2026-03-09T16:15:04.795 INFO:tasks.workunit.client.1.vm05.stdout:7/129: stat d1/d2/d8/dc/d14 0 2026-03-09T16:15:04.797 INFO:tasks.workunit.client.1.vm05.stdout:1/84: dread f1 [0,4194304] 0 2026-03-09T16:15:04.798 INFO:tasks.workunit.client.1.vm05.stdout:5/62: mkdir d8/d18 0 2026-03-09T16:15:04.799 INFO:tasks.workunit.client.1.vm05.stdout:5/63: dread - d8/f11 zero size 2026-03-09T16:15:04.799 INFO:tasks.workunit.client.1.vm05.stdout:7/130: rmdir d1/d2/d11 39 2026-03-09T16:15:04.803 INFO:tasks.workunit.client.1.vm05.stdout:8/72: link d4/d6/f9 d4/f1c 0 2026-03-09T16:15:04.806 INFO:tasks.workunit.client.1.vm05.stdout:7/131: mknod d1/d2/c23 0 2026-03-09T16:15:04.806 INFO:tasks.workunit.client.1.vm05.stdout:1/85: sync 2026-03-09T16:15:04.809 INFO:tasks.workunit.client.1.vm05.stdout:7/132: write d1/d2/d8/dc/f1a [709617,6712] 0 2026-03-09T16:15:04.810 INFO:tasks.workunit.client.1.vm05.stdout:5/64: chown d8/l10 7695 1 2026-03-09T16:15:04.811 INFO:tasks.workunit.client.1.vm05.stdout:8/73: rename d4/d6/f12 to d4/d6/db/df/f1d 0 2026-03-09T16:15:04.819 INFO:tasks.workunit.client.1.vm05.stdout:5/65: getdents d8/d18 0 2026-03-09T16:15:04.820 INFO:tasks.workunit.client.1.vm05.stdout:1/86: dwrite d7/dd/f11 [0,4194304] 0 2026-03-09T16:15:04.825 INFO:tasks.workunit.client.1.vm05.stdout:7/133: link d1/d2/la d1/l24 0 2026-03-09T16:15:04.827 INFO:tasks.workunit.client.1.vm05.stdout:1/87: symlink d7/dd/de/l20 0 2026-03-09T16:15:04.828 INFO:tasks.workunit.client.1.vm05.stdout:5/66: chown d8/d18 4455 1 2026-03-09T16:15:04.828 INFO:tasks.workunit.client.1.vm05.stdout:1/88: chown d7/dd/f1f 54 1 2026-03-09T16:15:04.832 
INFO:tasks.workunit.client.1.vm05.stdout:1/89: dread - d7/dd/f1f zero size 2026-03-09T16:15:04.836 INFO:tasks.workunit.client.1.vm05.stdout:5/67: unlink f6 0 2026-03-09T16:15:04.838 INFO:tasks.workunit.client.1.vm05.stdout:7/134: creat d1/d2/d11/f25 x:0 0 0 2026-03-09T16:15:04.838 INFO:tasks.workunit.client.1.vm05.stdout:7/135: chown d1 499184903 1 2026-03-09T16:15:04.839 INFO:tasks.workunit.client.1.vm05.stdout:7/136: write d1/d2/ff [7284285,130452] 0 2026-03-09T16:15:04.840 INFO:tasks.workunit.client.1.vm05.stdout:1/90: mkdir d7/dd/d21 0 2026-03-09T16:15:04.848 INFO:tasks.workunit.client.1.vm05.stdout:5/68: rename d8/ca to d8/d18/c19 0 2026-03-09T16:15:04.855 INFO:tasks.workunit.client.1.vm05.stdout:5/69: mknod d8/d18/c1a 0 2026-03-09T16:15:04.861 INFO:tasks.workunit.client.1.vm05.stdout:7/137: link d1/d2/d11/f1c d1/f26 0 2026-03-09T16:15:04.862 INFO:tasks.workunit.client.1.vm05.stdout:3/55: truncate d0/d9/fa 2991445 0 2026-03-09T16:15:04.864 INFO:tasks.workunit.client.1.vm05.stdout:0/100: truncate d5/db/fc 1131640 0 2026-03-09T16:15:04.864 INFO:tasks.workunit.client.1.vm05.stdout:3/56: write d0/fd [3295000,117565] 0 2026-03-09T16:15:04.870 INFO:tasks.workunit.client.1.vm05.stdout:3/57: sync 2026-03-09T16:15:04.870 INFO:tasks.workunit.client.1.vm05.stdout:3/58: chown d0/d9 1751686 1 2026-03-09T16:15:04.874 INFO:tasks.workunit.client.1.vm05.stdout:5/70: rmdir d8/d18 39 2026-03-09T16:15:04.879 INFO:tasks.workunit.client.1.vm05.stdout:0/101: mkdir d5/d24 0 2026-03-09T16:15:04.880 INFO:tasks.workunit.client.1.vm05.stdout:5/71: readlink d8/l10 0 2026-03-09T16:15:04.881 INFO:tasks.workunit.client.1.vm05.stdout:7/138: mknod d1/d2/c27 0 2026-03-09T16:15:04.884 INFO:tasks.workunit.client.1.vm05.stdout:1/91: rename d7/d15/d16/f18 to d7/d15/f22 0 2026-03-09T16:15:04.888 INFO:tasks.workunit.client.1.vm05.stdout:7/139: mknod d1/c28 0 2026-03-09T16:15:04.888 INFO:tasks.workunit.client.1.vm05.stdout:3/59: link d0/d9/cb d0/d9/dc/ce 0 2026-03-09T16:15:04.889 INFO:tasks.workunit.client.1.vm05.stdout:1/92: chown d7/d15/f22 13788297 1 2026-03-09T16:15:04.889 INFO:tasks.workunit.client.1.vm05.stdout:1/93: chown f1 1057 1 2026-03-09T16:15:04.894 INFO:tasks.workunit.client.1.vm05.stdout:5/72: dread d8/fb [0,4194304] 0 2026-03-09T16:15:04.896 INFO:tasks.workunit.client.1.vm05.stdout:7/140: write d1/d2/d11/f1c [641518,103696] 0 2026-03-09T16:15:04.896 INFO:tasks.workunit.client.1.vm05.stdout:2/66: dwrite f7 [0,4194304] 0 2026-03-09T16:15:04.909 INFO:tasks.workunit.client.1.vm05.stdout:3/60: chown d0/d9/fa 61 1 2026-03-09T16:15:04.909 INFO:tasks.workunit.client.1.vm05.stdout:5/73: readlink d8/l14 0 2026-03-09T16:15:04.910 INFO:tasks.workunit.client.1.vm05.stdout:5/74: fsync f5 0 2026-03-09T16:15:04.910 INFO:tasks.workunit.client.1.vm05.stdout:5/75: dread - d8/f11 zero size 2026-03-09T16:15:04.910 INFO:tasks.workunit.client.1.vm05.stdout:2/67: symlink db/dd/de/l14 0 2026-03-09T16:15:04.911 INFO:tasks.workunit.client.1.vm05.stdout:2/68: dread - db/f12 zero size 2026-03-09T16:15:04.913 INFO:tasks.workunit.client.1.vm05.stdout:3/61: mknod d0/d9/dc/cf 0 2026-03-09T16:15:04.914 INFO:tasks.workunit.client.1.vm05.stdout:2/69: mkdir db/dd/d15 0 2026-03-09T16:15:04.915 INFO:tasks.workunit.client.1.vm05.stdout:5/76: dread f1 [4194304,4194304] 0 2026-03-09T16:15:04.918 INFO:tasks.workunit.client.1.vm05.stdout:7/141: symlink d1/d2/d8/dc/d15/l29 0 2026-03-09T16:15:04.918 INFO:tasks.workunit.client.1.vm05.stdout:3/62: mkdir d0/d9/d10 0 2026-03-09T16:15:04.918 INFO:tasks.workunit.client.1.vm05.stdout:2/70: rename db/dd/de/lf to 
db/dd/l16 0 2026-03-09T16:15:04.918 INFO:tasks.workunit.client.1.vm05.stdout:2/71: chown fa 181293 1 2026-03-09T16:15:04.922 INFO:tasks.workunit.client.1.vm05.stdout:7/142: dread d1/d2/d8/f17 [4194304,4194304] 0 2026-03-09T16:15:04.925 INFO:tasks.workunit.client.1.vm05.stdout:9/90: truncate f2 1292947 0 2026-03-09T16:15:04.926 INFO:tasks.workunit.client.1.vm05.stdout:4/88: getdents d5/de/d15 0 2026-03-09T16:15:04.927 INFO:tasks.workunit.client.1.vm05.stdout:6/88: getdents d17 0 2026-03-09T16:15:04.929 INFO:tasks.workunit.client.1.vm05.stdout:7/143: dread d1/d2/d11/f1c [0,4194304] 0 2026-03-09T16:15:04.930 INFO:tasks.workunit.client.1.vm05.stdout:2/72: creat db/f17 x:0 0 0 2026-03-09T16:15:04.932 INFO:tasks.workunit.client.1.vm05.stdout:2/73: dread fa [0,4194304] 0 2026-03-09T16:15:04.936 INFO:tasks.workunit.client.1.vm05.stdout:7/144: unlink d1/d2/ff 0 2026-03-09T16:15:04.936 INFO:tasks.workunit.client.1.vm05.stdout:7/145: stat d1/d2/f5 0 2026-03-09T16:15:04.939 INFO:tasks.workunit.client.1.vm05.stdout:4/89: creat d5/d19/f1e x:0 0 0 2026-03-09T16:15:04.942 INFO:tasks.workunit.client.1.vm05.stdout:8/74: truncate d4/d6/db/fd 2988397 0 2026-03-09T16:15:04.942 INFO:tasks.workunit.client.1.vm05.stdout:9/91: creat d4/d10/f18 x:0 0 0 2026-03-09T16:15:04.943 INFO:tasks.workunit.client.1.vm05.stdout:6/89: dwrite f9 [0,4194304] 0 2026-03-09T16:15:04.943 INFO:tasks.workunit.client.1.vm05.stdout:5/77: mkdir d8/d18/d1b 0 2026-03-09T16:15:04.948 INFO:tasks.workunit.client.1.vm05.stdout:2/74: mknod db/dd/d15/c18 0 2026-03-09T16:15:04.948 INFO:tasks.workunit.client.1.vm05.stdout:6/90: mknod d17/c19 0 2026-03-09T16:15:04.948 INFO:tasks.workunit.client.1.vm05.stdout:2/75: chown db 3437027 1 2026-03-09T16:15:04.948 INFO:tasks.workunit.client.1.vm05.stdout:9/92: write d4/f17 [987342,100918] 0 2026-03-09T16:15:04.950 INFO:tasks.workunit.client.1.vm05.stdout:2/76: dread f7 [0,4194304] 0 2026-03-09T16:15:04.956 INFO:tasks.workunit.client.1.vm05.stdout:5/78: mknod d8/d18/c1c 0 2026-03-09T16:15:04.956 INFO:tasks.workunit.client.1.vm05.stdout:1/94: truncate d7/dd/f11 1918266 0 2026-03-09T16:15:04.957 INFO:tasks.workunit.client.1.vm05.stdout:2/77: dwrite db/dd/f10 [0,4194304] 0 2026-03-09T16:15:04.966 INFO:tasks.workunit.client.1.vm05.stdout:7/146: truncate d1/d2/fe 3834713 0 2026-03-09T16:15:04.966 INFO:tasks.workunit.client.1.vm05.stdout:2/78: dwrite db/f12 [0,4194304] 0 2026-03-09T16:15:04.969 INFO:tasks.workunit.client.1.vm05.stdout:3/63: truncate d0/fd 1078897 0 2026-03-09T16:15:04.969 INFO:tasks.workunit.client.1.vm05.stdout:8/75: creat d4/d6/db/df/f1e x:0 0 0 2026-03-09T16:15:04.971 INFO:tasks.workunit.client.1.vm05.stdout:6/91: creat d17/f1a x:0 0 0 2026-03-09T16:15:04.983 INFO:tasks.workunit.client.1.vm05.stdout:1/95: creat d7/dd/de/f23 x:0 0 0 2026-03-09T16:15:04.985 INFO:tasks.workunit.client.1.vm05.stdout:6/92: dwrite f16 [0,4194304] 0 2026-03-09T16:15:04.986 INFO:tasks.workunit.client.1.vm05.stdout:6/93: dread - f11 zero size 2026-03-09T16:15:04.986 INFO:tasks.workunit.client.1.vm05.stdout:6/94: write d17/f18 [546402,100839] 0 2026-03-09T16:15:04.988 INFO:tasks.workunit.client.1.vm05.stdout:1/96: truncate d7/dd/de/f23 300977 0 2026-03-09T16:15:04.988 INFO:tasks.workunit.client.1.vm05.stdout:1/97: write d7/f9 [3036435,34648] 0 2026-03-09T16:15:04.989 INFO:tasks.workunit.client.1.vm05.stdout:1/98: write d7/dd/de/f23 [376623,45840] 0 2026-03-09T16:15:04.999 INFO:tasks.workunit.client.1.vm05.stdout:6/95: dwrite f16 [0,4194304] 0 2026-03-09T16:15:04.999 INFO:tasks.workunit.client.1.vm05.stdout:6/96: chown 
c15 33508570 1 2026-03-09T16:15:05.004 INFO:tasks.workunit.client.1.vm05.stdout:1/99: dwrite d7/dd/de/f23 [0,4194304] 0 2026-03-09T16:15:05.016 INFO:tasks.workunit.client.1.vm05.stdout:6/97: dread f9 [0,4194304] 0 2026-03-09T16:15:05.016 INFO:tasks.workunit.client.1.vm05.stdout:7/147: mkdir d1/d19/d2a 0 2026-03-09T16:15:05.016 INFO:tasks.workunit.client.1.vm05.stdout:5/79: mkdir d8/d1d 0 2026-03-09T16:15:05.016 INFO:tasks.workunit.client.1.vm05.stdout:2/79: fsync f5 0 2026-03-09T16:15:05.016 INFO:tasks.workunit.client.1.vm05.stdout:1/100: symlink d7/dd/l24 0 2026-03-09T16:15:05.017 INFO:tasks.workunit.client.1.vm05.stdout:2/80: chown db/dd/d15 0 1 2026-03-09T16:15:05.017 INFO:tasks.workunit.client.1.vm05.stdout:7/148: unlink d1/d2/d8/c1d 0 2026-03-09T16:15:05.017 INFO:tasks.workunit.client.1.vm05.stdout:3/64: chown d0/fd 365293297 1 2026-03-09T16:15:05.018 INFO:tasks.workunit.client.1.vm05.stdout:6/98: rename f9 to d17/f1b 0 2026-03-09T16:15:05.019 INFO:tasks.workunit.client.1.vm05.stdout:1/101: symlink d7/l25 0 2026-03-09T16:15:05.020 INFO:tasks.workunit.client.1.vm05.stdout:2/81: dread f7 [0,4194304] 0 2026-03-09T16:15:05.021 INFO:tasks.workunit.client.1.vm05.stdout:7/149: rename d1/d2/c3 to d1/d2/d8/dc/d14/c2b 0 2026-03-09T16:15:05.024 INFO:tasks.workunit.client.1.vm05.stdout:3/65: symlink d0/d9/d10/l11 0 2026-03-09T16:15:05.035 INFO:tasks.workunit.client.1.vm05.stdout:7/150: mknod d1/d2/d8/dc/d1b/c2c 0 2026-03-09T16:15:05.036 INFO:tasks.workunit.client.1.vm05.stdout:3/66: rename d0/d9/dc/cf to d0/d9/d10/c12 0 2026-03-09T16:15:05.038 INFO:tasks.workunit.client.1.vm05.stdout:7/151: mknod d1/d2/d8/dc/d14/c2d 0 2026-03-09T16:15:05.042 INFO:tasks.workunit.client.1.vm05.stdout:7/152: dwrite d1/d2/d8/dc/f1e [0,4194304] 0 2026-03-09T16:15:05.047 INFO:tasks.workunit.client.1.vm05.stdout:3/67: creat d0/f13 x:0 0 0 2026-03-09T16:15:05.056 INFO:tasks.workunit.client.1.vm05.stdout:7/153: dread d1/d2/d8/dc/f1a [0,4194304] 0 2026-03-09T16:15:05.061 INFO:tasks.workunit.client.1.vm05.stdout:7/154: creat d1/d2/d8/dc/d18/f2e x:0 0 0 2026-03-09T16:15:05.061 INFO:tasks.workunit.client.1.vm05.stdout:7/155: readlink d1/l1f 0 2026-03-09T16:15:05.064 INFO:tasks.workunit.client.1.vm05.stdout:7/156: unlink d1/d2/l9 0 2026-03-09T16:15:05.065 INFO:tasks.workunit.client.1.vm05.stdout:7/157: write d1/d2/d11/f1c [207485,60816] 0 2026-03-09T16:15:05.069 INFO:tasks.workunit.client.1.vm05.stdout:7/158: link d1/c28 d1/d2/c2f 0 2026-03-09T16:15:05.073 INFO:tasks.workunit.client.1.vm05.stdout:4/90: write d5/f9 [1221044,64370] 0 2026-03-09T16:15:05.074 INFO:tasks.workunit.client.1.vm05.stdout:2/82: rmdir db 39 2026-03-09T16:15:05.078 INFO:tasks.workunit.client.1.vm05.stdout:2/83: stat db/lc 0 2026-03-09T16:15:05.078 INFO:tasks.workunit.client.1.vm05.stdout:0/102: truncate d5/f8 360947 0 2026-03-09T16:15:05.079 INFO:tasks.workunit.client.1.vm05.stdout:1/102: read d7/dd/f11 [800537,69497] 0 2026-03-09T16:15:05.079 INFO:tasks.workunit.client.1.vm05.stdout:9/93: dread d4/f17 [0,4194304] 0 2026-03-09T16:15:05.079 INFO:tasks.workunit.client.1.vm05.stdout:0/103: fdatasync d5/d11/f23 0 2026-03-09T16:15:05.079 INFO:tasks.workunit.client.1.vm05.stdout:4/91: creat d5/d19/f1f x:0 0 0 2026-03-09T16:15:05.081 INFO:tasks.workunit.client.1.vm05.stdout:2/84: mknod db/c19 0 2026-03-09T16:15:05.097 INFO:tasks.workunit.client.1.vm05.stdout:1/103: rename f1 to d7/d15/d16/f26 0 2026-03-09T16:15:05.098 INFO:tasks.workunit.client.1.vm05.stdout:0/104: unlink d5/d11/f16 0 2026-03-09T16:15:05.099 INFO:tasks.workunit.client.1.vm05.stdout:4/92: mknod 
d5/c20 0 2026-03-09T16:15:05.104 INFO:tasks.workunit.client.1.vm05.stdout:2/85: fdatasync f5 0 2026-03-09T16:15:05.107 INFO:tasks.workunit.client.1.vm05.stdout:1/104: dread d7/dd/de/f23 [0,4194304] 0 2026-03-09T16:15:05.110 INFO:tasks.workunit.client.1.vm05.stdout:2/86: creat db/dd/de/f1a x:0 0 0 2026-03-09T16:15:05.110 INFO:tasks.workunit.client.1.vm05.stdout:4/93: mkdir d5/de/d15/d21 0 2026-03-09T16:15:05.111 INFO:tasks.workunit.client.1.vm05.stdout:0/105: creat d5/d1b/f25 x:0 0 0 2026-03-09T16:15:05.119 INFO:tasks.workunit.client.1.vm05.stdout:0/106: dread - d5/d11/f23 zero size 2026-03-09T16:15:05.119 INFO:tasks.workunit.client.1.vm05.stdout:9/94: link d4/d8/c13 d4/d8/c19 0 2026-03-09T16:15:05.121 INFO:tasks.workunit.client.1.vm05.stdout:4/94: unlink f4 0 2026-03-09T16:15:05.121 INFO:tasks.workunit.client.1.vm05.stdout:0/107: write d5/f9 [3866637,77716] 0 2026-03-09T16:15:05.122 INFO:tasks.workunit.client.1.vm05.stdout:2/87: link db/dd/de/f1a db/dd/f1b 0 2026-03-09T16:15:05.122 INFO:tasks.workunit.client.1.vm05.stdout:2/88: readlink db/l13 0 2026-03-09T16:15:05.122 INFO:tasks.workunit.client.1.vm05.stdout:0/108: dread - d5/d1b/f25 zero size 2026-03-09T16:15:05.128 INFO:tasks.workunit.client.1.vm05.stdout:9/95: rmdir d4/d8 39 2026-03-09T16:15:05.129 INFO:tasks.workunit.client.1.vm05.stdout:0/109: mknod d5/d1f/c26 0 2026-03-09T16:15:05.130 INFO:tasks.workunit.client.1.vm05.stdout:0/110: creat d5/d1f/f27 x:0 0 0 2026-03-09T16:15:05.131 INFO:tasks.workunit.client.1.vm05.stdout:0/111: dread d5/f17 [0,4194304] 0 2026-03-09T16:15:05.132 INFO:tasks.workunit.client.1.vm05.stdout:2/89: link db/dd/d15/c18 db/dd/d15/c1c 0 2026-03-09T16:15:05.133 INFO:tasks.workunit.client.1.vm05.stdout:0/112: rename d5/f9 to d5/d24/f28 0 2026-03-09T16:15:05.134 INFO:tasks.workunit.client.1.vm05.stdout:2/90: symlink db/dd/de/l1d 0 2026-03-09T16:15:05.136 INFO:tasks.workunit.client.1.vm05.stdout:2/91: mknod db/c1e 0 2026-03-09T16:15:05.138 INFO:tasks.workunit.client.1.vm05.stdout:2/92: mkdir db/dd/d15/d1f 0 2026-03-09T16:15:05.139 INFO:tasks.workunit.client.1.vm05.stdout:9/96: link d4/d8/c13 d4/d10/c1a 0 2026-03-09T16:15:05.139 INFO:tasks.workunit.client.1.vm05.stdout:0/113: dread d5/db/f12 [0,4194304] 0 2026-03-09T16:15:05.139 INFO:tasks.workunit.client.1.vm05.stdout:2/93: mkdir db/dd/d15/d1f/d20 0 2026-03-09T16:15:05.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:04 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:05.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:04 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:05.150 INFO:tasks.workunit.client.1.vm05.stdout:9/97: dread d4/f17 [0,4194304] 0 2026-03-09T16:15:05.152 INFO:tasks.workunit.client.1.vm05.stdout:0/114: dwrite d5/f17 [0,4194304] 0 2026-03-09T16:15:05.154 INFO:tasks.workunit.client.1.vm05.stdout:0/115: write d5/d11/f1e [365954,7544] 0 2026-03-09T16:15:05.157 INFO:tasks.workunit.client.1.vm05.stdout:2/94: mkdir db/dd/d15/d1f/d21 0 2026-03-09T16:15:05.158 INFO:tasks.workunit.client.1.vm05.stdout:3/68: chown d0/d9/fa 75909478 1 2026-03-09T16:15:05.158 INFO:tasks.workunit.client.1.vm05.stdout:2/95: truncate db/f17 367711 0 2026-03-09T16:15:05.158 INFO:tasks.workunit.client.1.vm05.stdout:5/80: rmdir d8 39 2026-03-09T16:15:05.158 INFO:tasks.workunit.client.1.vm05.stdout:4/95: dwrite d5/de/d15/f1b [4194304,4194304] 0 2026-03-09T16:15:05.161 INFO:tasks.workunit.client.1.vm05.stdout:9/98: mknod d4/d8/c1b 0 
2026-03-09T16:15:05.162 INFO:tasks.workunit.client.1.vm05.stdout:0/116: creat d5/d1f/f29 x:0 0 0 2026-03-09T16:15:05.164 INFO:tasks.workunit.client.1.vm05.stdout:2/96: write db/dd/f1b [284727,52731] 0 2026-03-09T16:15:05.168 INFO:tasks.workunit.client.1.vm05.stdout:5/81: dwrite d8/f11 [0,4194304] 0 2026-03-09T16:15:05.170 INFO:tasks.workunit.client.1.vm05.stdout:4/96: link d5/fb d5/de/f22 0 2026-03-09T16:15:05.171 INFO:tasks.workunit.client.1.vm05.stdout:8/76: dread d4/d6/db/fd [0,4194304] 0 2026-03-09T16:15:05.176 INFO:tasks.workunit.client.1.vm05.stdout:0/117: write d5/d24/f28 [1062783,19154] 0 2026-03-09T16:15:05.176 INFO:tasks.workunit.client.1.vm05.stdout:4/97: creat d5/de/f23 x:0 0 0 2026-03-09T16:15:05.177 INFO:tasks.workunit.client.1.vm05.stdout:0/118: write d5/d1f/f29 [591428,128684] 0 2026-03-09T16:15:05.180 INFO:tasks.workunit.client.1.vm05.stdout:8/77: dread - d4/d6/db/df/f1e zero size 2026-03-09T16:15:05.181 INFO:tasks.workunit.client.1.vm05.stdout:2/97: getdents db/dd/d15/d1f/d20 0 2026-03-09T16:15:05.185 INFO:tasks.workunit.client.1.vm05.stdout:0/119: dread - d5/d1f/f27 zero size 2026-03-09T16:15:05.187 INFO:tasks.workunit.client.1.vm05.stdout:0/120: fsync d5/d11/f23 0 2026-03-09T16:15:05.188 INFO:tasks.workunit.client.1.vm05.stdout:6/99: getdents d17 0 2026-03-09T16:15:05.188 INFO:tasks.workunit.client.1.vm05.stdout:6/100: stat d17 0 2026-03-09T16:15:05.188 INFO:tasks.workunit.client.1.vm05.stdout:0/121: readlink d5/db/d1d/l22 0 2026-03-09T16:15:05.196 INFO:tasks.workunit.client.1.vm05.stdout:5/82: creat d8/d1d/f1e x:0 0 0 2026-03-09T16:15:05.200 INFO:tasks.workunit.client.1.vm05.stdout:7/159: truncate d1/d2/d8/dc/f1e 46837 0 2026-03-09T16:15:05.200 INFO:tasks.workunit.client.1.vm05.stdout:8/78: dread d4/d6/db/fd [0,4194304] 0 2026-03-09T16:15:05.201 INFO:tasks.workunit.client.1.vm05.stdout:2/98: dread f7 [0,4194304] 0 2026-03-09T16:15:05.206 INFO:tasks.workunit.client.1.vm05.stdout:6/101: creat d17/f1c x:0 0 0 2026-03-09T16:15:05.206 INFO:tasks.workunit.client.1.vm05.stdout:6/102: chown f5 28 1 2026-03-09T16:15:05.214 INFO:tasks.workunit.client.1.vm05.stdout:2/99: readlink db/dd/l16 0 2026-03-09T16:15:05.214 INFO:tasks.workunit.client.1.vm05.stdout:8/79: creat d4/d6/f1f x:0 0 0 2026-03-09T16:15:05.219 INFO:tasks.workunit.client.1.vm05.stdout:2/100: rmdir db/dd/d15 39 2026-03-09T16:15:05.220 INFO:tasks.workunit.client.1.vm05.stdout:8/80: mknod d4/d6/db/d14/c20 0 2026-03-09T16:15:05.223 INFO:tasks.workunit.client.1.vm05.stdout:5/83: creat d8/f1f x:0 0 0 2026-03-09T16:15:05.224 INFO:tasks.workunit.client.1.vm05.stdout:5/84: dread - d8/d1d/f1e zero size 2026-03-09T16:15:05.226 INFO:tasks.workunit.client.1.vm05.stdout:8/81: mknod d4/d6/db/dc/c21 0 2026-03-09T16:15:05.227 INFO:tasks.workunit.client.1.vm05.stdout:8/82: chown d4/d6/db/d14 273271236 1 2026-03-09T16:15:05.228 INFO:tasks.workunit.client.1.vm05.stdout:1/105: truncate d7/f9 3577149 0 2026-03-09T16:15:05.232 INFO:tasks.workunit.client.1.vm05.stdout:5/85: readlink d8/l14 0 2026-03-09T16:15:05.233 INFO:tasks.workunit.client.1.vm05.stdout:5/86: chown d8/d18 7626132 1 2026-03-09T16:15:05.233 INFO:tasks.workunit.client.1.vm05.stdout:2/101: mknod db/dd/d15/d1f/c22 0 2026-03-09T16:15:05.236 INFO:tasks.workunit.client.1.vm05.stdout:9/99: getdents d4/d10 0 2026-03-09T16:15:05.239 INFO:tasks.workunit.client.1.vm05.stdout:5/87: unlink d8/l10 0 2026-03-09T16:15:05.240 INFO:tasks.workunit.client.1.vm05.stdout:8/83: creat d4/d6/db/d14/d15/f22 x:0 0 0 2026-03-09T16:15:05.242 INFO:tasks.workunit.client.1.vm05.stdout:3/69: read d0/fd 
[797220,7662] 0 2026-03-09T16:15:05.242 INFO:tasks.workunit.client.1.vm05.stdout:1/106: mkdir d7/d27 0 2026-03-09T16:15:05.243 INFO:tasks.workunit.client.1.vm05.stdout:9/100: mknod d4/d10/c1c 0 2026-03-09T16:15:05.245 INFO:tasks.workunit.client.1.vm05.stdout:5/88: creat d8/d18/f20 x:0 0 0 2026-03-09T16:15:05.246 INFO:tasks.workunit.client.1.vm05.stdout:8/84: readlink l3 0 2026-03-09T16:15:05.249 INFO:tasks.workunit.client.1.vm05.stdout:7/160: dread d1/d2/fe [0,4194304] 0 2026-03-09T16:15:05.251 INFO:tasks.workunit.client.1.vm05.stdout:4/98: truncate d5/f9 1239959 0 2026-03-09T16:15:05.254 INFO:tasks.workunit.client.1.vm05.stdout:2/102: sync 2026-03-09T16:15:05.256 INFO:tasks.workunit.client.1.vm05.stdout:0/122: truncate d5/db/f12 2065585 0 2026-03-09T16:15:05.258 INFO:tasks.workunit.client.1.vm05.stdout:9/101: creat d4/d10/f1d x:0 0 0 2026-03-09T16:15:05.259 INFO:tasks.workunit.client.1.vm05.stdout:8/85: creat d4/f23 x:0 0 0 2026-03-09T16:15:05.263 INFO:tasks.workunit.client.1.vm05.stdout:5/89: creat d8/d1d/f21 x:0 0 0 2026-03-09T16:15:05.263 INFO:tasks.workunit.client.1.vm05.stdout:6/103: dwrite fc [0,4194304] 0 2026-03-09T16:15:05.264 INFO:tasks.workunit.client.1.vm05.stdout:0/123: dwrite d5/d1f/f27 [0,4194304] 0 2026-03-09T16:15:05.264 INFO:tasks.workunit.client.1.vm05.stdout:4/99: rename d5/de/f22 to d5/de/f24 0 2026-03-09T16:15:05.264 INFO:tasks.workunit.client.1.vm05.stdout:1/107: creat d7/d27/f28 x:0 0 0 2026-03-09T16:15:05.266 INFO:tasks.workunit.client.1.vm05.stdout:8/86: write d4/f13 [930609,59333] 0 2026-03-09T16:15:05.268 INFO:tasks.workunit.client.1.vm05.stdout:6/104: chown fc 105679 1 2026-03-09T16:15:05.269 INFO:tasks.workunit.client.1.vm05.stdout:0/124: sync 2026-03-09T16:15:05.274 INFO:tasks.workunit.client.1.vm05.stdout:4/100: read - d5/de/f23 zero size 2026-03-09T16:15:05.274 INFO:tasks.workunit.client.1.vm05.stdout:2/103: rename db/dd/de to db/dd/d15/d1f/d20/d23 0 2026-03-09T16:15:05.276 INFO:tasks.workunit.client.1.vm05.stdout:9/102: unlink d4/d10/l14 0 2026-03-09T16:15:05.276 INFO:tasks.workunit.client.1.vm05.stdout:4/101: readlink d5/lf 0 2026-03-09T16:15:05.280 INFO:tasks.workunit.client.1.vm05.stdout:9/103: write d4/d10/f15 [287897,106628] 0 2026-03-09T16:15:05.280 INFO:tasks.workunit.client.1.vm05.stdout:4/102: fsync d5/f10 0 2026-03-09T16:15:05.280 INFO:tasks.workunit.client.1.vm05.stdout:7/161: stat d1/d2/c27 0 2026-03-09T16:15:05.281 INFO:tasks.workunit.client.1.vm05.stdout:6/105: rmdir d17 39 2026-03-09T16:15:05.281 INFO:tasks.workunit.client.1.vm05.stdout:0/125: creat d5/d1f/f2a x:0 0 0 2026-03-09T16:15:05.281 INFO:tasks.workunit.client.1.vm05.stdout:5/90: mknod d8/d1d/c22 0 2026-03-09T16:15:05.283 INFO:tasks.workunit.client.1.vm05.stdout:5/91: write d8/d18/f20 [99675,126671] 0 2026-03-09T16:15:05.284 INFO:tasks.workunit.client.1.vm05.stdout:1/108: rename d7/d27/f28 to d7/d15/d16/f29 0 2026-03-09T16:15:05.286 INFO:tasks.workunit.client.1.vm05.stdout:4/103: truncate d5/d19/f1e 774701 0 2026-03-09T16:15:05.291 INFO:tasks.workunit.client.1.vm05.stdout:9/104: dread d4/fa [0,4194304] 0 2026-03-09T16:15:05.295 INFO:tasks.workunit.client.1.vm05.stdout:2/104: dread db/f12 [0,4194304] 0 2026-03-09T16:15:05.296 INFO:tasks.workunit.client.1.vm05.stdout:6/106: chown d17 346777 1 2026-03-09T16:15:05.297 INFO:tasks.workunit.client.1.vm05.stdout:8/87: dwrite d4/d6/db/fd [0,4194304] 0 2026-03-09T16:15:05.302 INFO:tasks.workunit.client.1.vm05.stdout:2/105: sync 2026-03-09T16:15:05.305 INFO:tasks.workunit.client.1.vm05.stdout:8/88: chown d4/c16 63 1 2026-03-09T16:15:05.306 
INFO:tasks.workunit.client.1.vm05.stdout:3/70: dread d0/fd [0,4194304] 0 2026-03-09T16:15:05.306 INFO:tasks.workunit.client.1.vm05.stdout:0/126: symlink d5/d1b/l2b 0 2026-03-09T16:15:05.307 INFO:tasks.workunit.client.1.vm05.stdout:7/162: mkdir d1/d2/d8/dc/d1b/d30 0 2026-03-09T16:15:05.308 INFO:tasks.workunit.client.1.vm05.stdout:0/127: truncate d5/d1f/f29 1030792 0 2026-03-09T16:15:05.312 INFO:tasks.workunit.client.1.vm05.stdout:3/71: dread d0/fd [0,4194304] 0 2026-03-09T16:15:05.312 INFO:tasks.workunit.client.1.vm05.stdout:1/109: dwrite d7/dd/f19 [0,4194304] 0 2026-03-09T16:15:05.312 INFO:tasks.workunit.client.1.vm05.stdout:6/107: mkdir d17/d1d 0 2026-03-09T16:15:05.319 INFO:tasks.workunit.client.1.vm05.stdout:6/108: write d17/f18 [261992,35736] 0 2026-03-09T16:15:05.319 INFO:tasks.workunit.client.1.vm05.stdout:2/106: write db/f12 [2658815,61548] 0 2026-03-09T16:15:05.319 INFO:tasks.workunit.client.1.vm05.stdout:7/163: write d1/d2/f5 [3538255,111931] 0 2026-03-09T16:15:05.324 INFO:tasks.workunit.client.1.vm05.stdout:0/128: rename d5/d24 to d5/d2c 0 2026-03-09T16:15:05.330 INFO:tasks.workunit.client.1.vm05.stdout:3/72: creat d0/d9/dc/f14 x:0 0 0 2026-03-09T16:15:05.330 INFO:tasks.workunit.client.1.vm05.stdout:1/110: fsync d7/f9 0 2026-03-09T16:15:05.330 INFO:tasks.workunit.client.1.vm05.stdout:7/164: mkdir d1/d2/d8/d31 0 2026-03-09T16:15:05.330 INFO:tasks.workunit.client.1.vm05.stdout:1/111: write f0 [586436,13357] 0 2026-03-09T16:15:05.334 INFO:tasks.workunit.client.1.vm05.stdout:2/107: creat db/dd/d15/d1f/f24 x:0 0 0 2026-03-09T16:15:05.337 INFO:tasks.workunit.client.1.vm05.stdout:9/105: link d4/d8/c1b d4/d8/c1e 0 2026-03-09T16:15:05.339 INFO:tasks.workunit.client.1.vm05.stdout:9/106: write d4/d10/f18 [967101,120557] 0 2026-03-09T16:15:05.339 INFO:tasks.workunit.client.1.vm05.stdout:6/109: creat d17/d1d/f1e x:0 0 0 2026-03-09T16:15:05.339 INFO:tasks.workunit.client.1.vm05.stdout:7/165: dread d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:05.343 INFO:tasks.workunit.client.1.vm05.stdout:4/104: getdents d5 0 2026-03-09T16:15:05.344 INFO:tasks.workunit.client.1.vm05.stdout:4/105: write d5/d19/f1f [259563,11852] 0 2026-03-09T16:15:05.346 INFO:tasks.workunit.client.1.vm05.stdout:1/112: creat d7/d15/f2a x:0 0 0 2026-03-09T16:15:05.347 INFO:tasks.workunit.client.1.vm05.stdout:9/107: symlink d4/l1f 0 2026-03-09T16:15:05.348 INFO:tasks.workunit.client.1.vm05.stdout:1/113: write d7/fb [4383189,65187] 0 2026-03-09T16:15:05.348 INFO:tasks.workunit.client.1.vm05.stdout:7/166: mknod d1/d19/c32 0 2026-03-09T16:15:05.348 INFO:tasks.workunit.client.1.vm05.stdout:7/167: chown d1/d2/f5 2410386 1 2026-03-09T16:15:05.349 INFO:tasks.workunit.client.1.vm05.stdout:9/108: chown f2 178111 1 2026-03-09T16:15:05.349 INFO:tasks.workunit.client.1.vm05.stdout:4/106: creat d5/de/d15/f25 x:0 0 0 2026-03-09T16:15:05.349 INFO:tasks.workunit.client.1.vm05.stdout:7/168: readlink d1/d2/d8/lb 0 2026-03-09T16:15:05.350 INFO:tasks.workunit.client.1.vm05.stdout:7/169: chown d1/d19/c32 2945 1 2026-03-09T16:15:05.352 INFO:tasks.workunit.client.1.vm05.stdout:9/109: write d4/d10/f15 [153837,116709] 0 2026-03-09T16:15:05.352 INFO:tasks.workunit.client.1.vm05.stdout:1/114: link d7/d15/d16/f1c d7/dd/d21/f2b 0 2026-03-09T16:15:05.355 INFO:tasks.workunit.client.1.vm05.stdout:4/107: creat d5/de/d15/d21/f26 x:0 0 0 2026-03-09T16:15:05.355 INFO:tasks.workunit.client.1.vm05.stdout:7/170: mkdir d1/d2/d8/dc/d33 0 2026-03-09T16:15:05.356 INFO:tasks.workunit.client.1.vm05.stdout:9/110: sync 2026-03-09T16:15:05.356 
INFO:tasks.workunit.client.1.vm05.stdout:4/108: chown d5/f6 1973946055 1 2026-03-09T16:15:05.356 INFO:tasks.workunit.client.1.vm05.stdout:1/115: symlink d7/d15/d16/l2c 0 2026-03-09T16:15:05.356 INFO:tasks.workunit.client.1.vm05.stdout:7/171: chown d1/d2/d8/d31 0 1 2026-03-09T16:15:05.356 INFO:tasks.workunit.client.1.vm05.stdout:9/111: chown d4/d10/f18 1078975 1 2026-03-09T16:15:05.357 INFO:tasks.workunit.client.1.vm05.stdout:9/112: stat d4/d10 0 2026-03-09T16:15:05.358 INFO:tasks.workunit.client.1.vm05.stdout:1/116: dread d7/dd/f11 [0,4194304] 0 2026-03-09T16:15:05.358 INFO:tasks.workunit.client.1.vm05.stdout:1/117: dread - d7/dd/d21/f2b zero size 2026-03-09T16:15:05.362 INFO:tasks.workunit.client.1.vm05.stdout:5/92: mknod d8/c23 0 2026-03-09T16:15:05.366 INFO:tasks.workunit.client.1.vm05.stdout:7/172: mknod d1/d2/d8/dc/d15/c34 0 2026-03-09T16:15:05.366 INFO:tasks.workunit.client.1.vm05.stdout:9/113: fsync f2 0 2026-03-09T16:15:05.367 INFO:tasks.workunit.client.1.vm05.stdout:5/93: write d8/d18/f20 [617532,106837] 0 2026-03-09T16:15:05.368 INFO:tasks.workunit.client.1.vm05.stdout:1/118: write d7/d15/d16/f29 [687506,45139] 0 2026-03-09T16:15:05.370 INFO:tasks.workunit.client.1.vm05.stdout:5/94: chown d8/d18/c1c 8728024 1 2026-03-09T16:15:05.373 INFO:tasks.workunit.client.1.vm05.stdout:7/173: rename d1/d2/d8/f17 to d1/d2/d8/dc/d14/f35 0 2026-03-09T16:15:05.383 INFO:tasks.workunit.client.1.vm05.stdout:7/174: unlink d1/d2/c27 0 2026-03-09T16:15:05.383 INFO:tasks.workunit.client.1.vm05.stdout:7/175: chown d1/d2/d11 5146892 1 2026-03-09T16:15:05.383 INFO:tasks.workunit.client.1.vm05.stdout:7/176: chown d1/d2/d8/lb 360605 1 2026-03-09T16:15:05.384 INFO:tasks.workunit.client.1.vm05.stdout:7/177: dread d1/d2/f22 [0,4194304] 0 2026-03-09T16:15:05.388 INFO:tasks.workunit.client.1.vm05.stdout:5/95: rmdir d8/d1d 39 2026-03-09T16:15:05.389 INFO:tasks.workunit.client.1.vm05.stdout:7/178: write d1/d2/d8/dc/f1a [1199056,40507] 0 2026-03-09T16:15:05.390 INFO:tasks.workunit.client.1.vm05.stdout:1/119: dwrite d7/dd/d21/f2b [0,4194304] 0 2026-03-09T16:15:05.391 INFO:tasks.workunit.client.1.vm05.stdout:1/120: stat d7/fc 0 2026-03-09T16:15:05.391 INFO:tasks.workunit.client.1.vm05.stdout:1/121: read d7/d15/d16/f1c [3960972,105200] 0 2026-03-09T16:15:05.391 INFO:tasks.workunit.client.1.vm05.stdout:4/109: dread f1 [0,4194304] 0 2026-03-09T16:15:05.400 INFO:tasks.workunit.client.1.vm05.stdout:5/96: symlink d8/l24 0 2026-03-09T16:15:05.406 INFO:tasks.workunit.client.1.vm05.stdout:1/122: mkdir d7/dd/d21/d2d 0 2026-03-09T16:15:05.406 INFO:tasks.workunit.client.1.vm05.stdout:7/179: link d1/d2/d8/c16 d1/d2/d8/dc/c36 0 2026-03-09T16:15:05.406 INFO:tasks.workunit.client.1.vm05.stdout:5/97: dread - d8/f13 zero size 2026-03-09T16:15:05.409 INFO:tasks.workunit.client.1.vm05.stdout:9/114: read d4/d10/f18 [715752,93863] 0 2026-03-09T16:15:05.410 INFO:tasks.workunit.client.1.vm05.stdout:7/180: fsync d1/d2/f22 0 2026-03-09T16:15:05.410 INFO:tasks.workunit.client.1.vm05.stdout:1/123: creat d7/dd/de/f2e x:0 0 0 2026-03-09T16:15:05.411 INFO:tasks.workunit.client.1.vm05.stdout:9/115: write f2 [232011,113973] 0 2026-03-09T16:15:05.412 INFO:tasks.workunit.client.1.vm05.stdout:5/98: getdents d8/d18/d1b 0 2026-03-09T16:15:05.412 INFO:tasks.workunit.client.1.vm05.stdout:7/181: mknod d1/d19/c37 0 2026-03-09T16:15:05.420 INFO:tasks.workunit.client.1.vm05.stdout:5/99: mknod d8/d18/d1b/c25 0 2026-03-09T16:15:05.421 INFO:tasks.workunit.client.1.vm05.stdout:1/124: dwrite d7/fb [4194304,4194304] 0 2026-03-09T16:15:05.443 
INFO:tasks.workunit.client.1.vm05.stdout:9/116: dread d4/d10/f18 [0,4194304] 0 2026-03-09T16:15:05.443 INFO:tasks.workunit.client.1.vm05.stdout:7/182: link d1/d2/d8/dc/d14/c13 d1/d19/d2a/c38 0 2026-03-09T16:15:05.445 INFO:tasks.workunit.client.1.vm05.stdout:8/89: truncate f0 971094 0 2026-03-09T16:15:05.448 INFO:tasks.workunit.client.1.vm05.stdout:2/108: write f7 [783285,11361] 0 2026-03-09T16:15:05.448 INFO:tasks.workunit.client.1.vm05.stdout:3/73: write d0/d9/fa [1679001,66128] 0 2026-03-09T16:15:05.451 INFO:tasks.workunit.client.1.vm05.stdout:2/109: write db/f17 [1391447,13778] 0 2026-03-09T16:15:05.454 INFO:tasks.workunit.client.1.vm05.stdout:7/183: stat d1/c28 0 2026-03-09T16:15:05.454 INFO:tasks.workunit.client.1.vm05.stdout:8/90: rename d4/d6/db/df/f1e to d4/d6/f24 0 2026-03-09T16:15:05.457 INFO:tasks.workunit.client.1.vm05.stdout:1/125: symlink d7/dd/d21/d2d/l2f 0 2026-03-09T16:15:05.457 INFO:tasks.workunit.client.1.vm05.stdout:0/129: dwrite d5/f7 [0,4194304] 0 2026-03-09T16:15:05.458 INFO:tasks.workunit.client.1.vm05.stdout:1/126: chown d7 241377 1 2026-03-09T16:15:05.463 INFO:tasks.workunit.client.1.vm05.stdout:9/117: creat d4/f20 x:0 0 0 2026-03-09T16:15:05.463 INFO:tasks.workunit.client.1.vm05.stdout:2/110: link db/dd/d15/d1f/f24 db/dd/d15/d1f/f25 0 2026-03-09T16:15:05.469 INFO:tasks.workunit.client.1.vm05.stdout:3/74: rename d0/l6 to d0/d9/dc/l15 0 2026-03-09T16:15:05.469 INFO:tasks.workunit.client.1.vm05.stdout:7/184: creat d1/d2/d8/d31/f39 x:0 0 0 2026-03-09T16:15:05.470 INFO:tasks.workunit.client.1.vm05.stdout:9/118: fsync d4/fa 0 2026-03-09T16:15:05.470 INFO:tasks.workunit.client.1.vm05.stdout:1/127: symlink d7/d15/l30 0 2026-03-09T16:15:05.473 INFO:tasks.workunit.client.1.vm05.stdout:2/111: dread db/dd/f10 [0,4194304] 0 2026-03-09T16:15:05.474 INFO:tasks.workunit.client.1.vm05.stdout:3/75: mkdir d0/d9/d16 0 2026-03-09T16:15:05.479 INFO:tasks.workunit.client.1.vm05.stdout:9/119: creat d4/d8/f21 x:0 0 0 2026-03-09T16:15:05.479 INFO:tasks.workunit.client.1.vm05.stdout:7/185: rename d1/d2/c7 to d1/d2/d8/dc/d15/c3a 0 2026-03-09T16:15:05.480 INFO:tasks.workunit.client.1.vm05.stdout:0/130: link d5/ca d5/db/d1d/c2d 0 2026-03-09T16:15:05.481 INFO:tasks.workunit.client.1.vm05.stdout:3/76: truncate d0/f13 964222 0 2026-03-09T16:15:05.481 INFO:tasks.workunit.client.1.vm05.stdout:2/112: mknod db/dd/d15/d1f/d20/d23/c26 0 2026-03-09T16:15:05.481 INFO:tasks.workunit.client.1.vm05.stdout:3/77: rename d0 to d0/d9/d17 22 2026-03-09T16:15:05.482 INFO:tasks.workunit.client.1.vm05.stdout:9/120: creat d4/d10/f22 x:0 0 0 2026-03-09T16:15:05.483 INFO:tasks.workunit.client.1.vm05.stdout:3/78: write d0/f13 [1077383,5354] 0 2026-03-09T16:15:05.486 INFO:tasks.workunit.client.1.vm05.stdout:2/113: mknod db/dd/d15/d1f/d20/d23/c27 0 2026-03-09T16:15:05.488 INFO:tasks.workunit.client.1.vm05.stdout:0/131: dwrite d5/d11/f23 [0,4194304] 0 2026-03-09T16:15:05.501 INFO:tasks.workunit.client.1.vm05.stdout:3/79: dwrite d0/d9/dc/f14 [0,4194304] 0 2026-03-09T16:15:05.501 INFO:tasks.workunit.client.1.vm05.stdout:3/80: chown d0/d9/d10 0 1 2026-03-09T16:15:05.530 INFO:tasks.workunit.client.1.vm05.stdout:6/110: truncate f16 3056850 0 2026-03-09T16:15:05.531 INFO:tasks.workunit.client.1.vm05.stdout:6/111: write d17/f1b [472840,19538] 0 2026-03-09T16:15:05.533 INFO:tasks.workunit.client.1.vm05.stdout:6/112: mknod d17/d1d/c1f 0 2026-03-09T16:15:05.534 INFO:tasks.workunit.client.1.vm05.stdout:6/113: fsync d17/f1c 0 2026-03-09T16:15:05.534 INFO:tasks.workunit.client.1.vm05.stdout:6/114: write d17/d1d/f1e [847113,3209] 0 
2026-03-09T16:15:05.536 INFO:tasks.workunit.client.1.vm05.stdout:4/110: unlink d5/f9 0 2026-03-09T16:15:05.536 INFO:tasks.workunit.client.1.vm05.stdout:4/111: stat f1 0 2026-03-09T16:15:05.537 INFO:tasks.workunit.client.1.vm05.stdout:6/115: dread fb [0,4194304] 0 2026-03-09T16:15:05.548 INFO:tasks.workunit.client.1.vm05.stdout:4/112: unlink d5/de/d15/f17 0 2026-03-09T16:15:05.549 INFO:tasks.workunit.client.1.vm05.stdout:4/113: chown d5/l8 6215739 1 2026-03-09T16:15:05.550 INFO:tasks.workunit.client.1.vm05.stdout:1/128: truncate d7/dd/d21/f2b 2050282 0 2026-03-09T16:15:05.551 INFO:tasks.workunit.client.1.vm05.stdout:4/114: mkdir d5/de/d15/d21/d27 0 2026-03-09T16:15:05.552 INFO:tasks.workunit.client.1.vm05.stdout:5/100: getdents d8 0 2026-03-09T16:15:05.552 INFO:tasks.workunit.client.1.vm05.stdout:5/101: chown d8/fd 19800 1 2026-03-09T16:15:05.555 INFO:tasks.workunit.client.1.vm05.stdout:1/129: symlink d7/l31 0 2026-03-09T16:15:05.555 INFO:tasks.workunit.client.1.vm05.stdout:6/116: sync 2026-03-09T16:15:05.561 INFO:tasks.workunit.client.1.vm05.stdout:1/130: fdatasync d7/d15/d16/f26 0 2026-03-09T16:15:05.561 INFO:tasks.workunit.client.1.vm05.stdout:5/102: dwrite d8/f13 [0,4194304] 0 2026-03-09T16:15:05.563 INFO:tasks.workunit.client.1.vm05.stdout:1/131: sync 2026-03-09T16:15:05.569 INFO:tasks.workunit.client.1.vm05.stdout:1/132: creat d7/dd/de/f32 x:0 0 0 2026-03-09T16:15:05.569 INFO:tasks.workunit.client.1.vm05.stdout:4/115: link d5/c13 d5/de/d15/c28 0 2026-03-09T16:15:05.573 INFO:tasks.workunit.client.1.vm05.stdout:5/103: dwrite d8/d1d/f1e [0,4194304] 0 2026-03-09T16:15:05.574 INFO:tasks.workunit.client.1.vm05.stdout:5/104: write f5 [1868344,6437] 0 2026-03-09T16:15:05.582 INFO:tasks.workunit.client.1.vm05.stdout:1/133: read d7/dd/de/f23 [3279244,89775] 0 2026-03-09T16:15:05.593 INFO:tasks.workunit.client.1.vm05.stdout:4/116: creat d5/de/d15/d21/d27/f29 x:0 0 0 2026-03-09T16:15:05.594 INFO:tasks.workunit.client.1.vm05.stdout:4/117: fsync d5/de/d15/f25 0 2026-03-09T16:15:05.599 INFO:tasks.workunit.client.1.vm05.stdout:8/91: getdents d4/d6 0 2026-03-09T16:15:05.608 INFO:tasks.workunit.client.1.vm05.stdout:8/92: creat d4/d6/db/d14/f25 x:0 0 0 2026-03-09T16:15:05.611 INFO:tasks.workunit.client.1.vm05.stdout:8/93: dread - d4/f23 zero size 2026-03-09T16:15:05.612 INFO:tasks.workunit.client.1.vm05.stdout:4/118: dwrite d5/de/f16 [0,4194304] 0 2026-03-09T16:15:05.614 INFO:tasks.workunit.client.1.vm05.stdout:8/94: sync 2026-03-09T16:15:05.621 INFO:tasks.workunit.client.1.vm05.stdout:8/95: creat d4/d6/db/dc/f26 x:0 0 0 2026-03-09T16:15:05.625 INFO:tasks.workunit.client.1.vm05.stdout:8/96: creat d4/d6/f27 x:0 0 0 2026-03-09T16:15:05.625 INFO:tasks.workunit.client.1.vm05.stdout:4/119: dwrite d5/de/f16 [0,4194304] 0 2026-03-09T16:15:05.625 INFO:tasks.workunit.client.1.vm05.stdout:2/114: fdatasync f7 0 2026-03-09T16:15:05.629 INFO:tasks.workunit.client.1.vm05.stdout:8/97: creat d4/d6/db/d14/f28 x:0 0 0 2026-03-09T16:15:05.632 INFO:tasks.workunit.client.1.vm05.stdout:2/115: dread db/dd/f10 [0,4194304] 0 2026-03-09T16:15:05.632 INFO:tasks.workunit.client.1.vm05.stdout:4/120: creat d5/de/d15/d21/f2a x:0 0 0 2026-03-09T16:15:05.639 INFO:tasks.workunit.client.1.vm05.stdout:4/121: readlink d5/lc 0 2026-03-09T16:15:05.640 INFO:tasks.workunit.client.1.vm05.stdout:7/186: rmdir d1/d2 39 2026-03-09T16:15:05.640 INFO:tasks.workunit.client.1.vm05.stdout:2/116: creat db/dd/d15/f28 x:0 0 0 2026-03-09T16:15:05.641 INFO:tasks.workunit.client.1.vm05.stdout:2/117: write db/f17 [881134,59768] 0 2026-03-09T16:15:05.641 
INFO:tasks.workunit.client.1.vm05.stdout:2/118: dread f5 [0,4194304] 0 2026-03-09T16:15:05.642 INFO:tasks.workunit.client.1.vm05.stdout:2/119: dread - db/dd/d15/d1f/f24 zero size 2026-03-09T16:15:05.642 INFO:tasks.workunit.client.1.vm05.stdout:2/120: readlink db/l13 0 2026-03-09T16:15:05.644 INFO:tasks.workunit.client.1.vm05.stdout:7/187: truncate d1/d2/fe 4719393 0 2026-03-09T16:15:05.645 INFO:tasks.workunit.client.1.vm05.stdout:7/188: fdatasync d1/d2/d11/f25 0 2026-03-09T16:15:05.645 INFO:tasks.workunit.client.1.vm05.stdout:7/189: readlink d1/d2/d8/lb 0 2026-03-09T16:15:05.646 INFO:tasks.workunit.client.1.vm05.stdout:7/190: dread - d1/d2/d8/dc/d18/f2e zero size 2026-03-09T16:15:05.653 INFO:tasks.workunit.client.1.vm05.stdout:2/121: dwrite f7 [0,4194304] 0 2026-03-09T16:15:05.654 INFO:tasks.workunit.client.1.vm05.stdout:7/191: sync 2026-03-09T16:15:05.665 INFO:tasks.workunit.client.1.vm05.stdout:7/192: creat d1/d2/d8/dc/f3b x:0 0 0 2026-03-09T16:15:05.674 INFO:tasks.workunit.client.1.vm05.stdout:2/122: creat db/dd/d15/d1f/d21/f29 x:0 0 0 2026-03-09T16:15:05.697 INFO:tasks.workunit.client.1.vm05.stdout:9/121: write d4/fa [1162101,63502] 0 2026-03-09T16:15:05.699 INFO:tasks.workunit.client.1.vm05.stdout:0/132: truncate d5/d1f/f27 3299548 0 2026-03-09T16:15:05.700 INFO:tasks.workunit.client.1.vm05.stdout:3/81: truncate d0/d9/dc/f14 499996 0 2026-03-09T16:15:05.701 INFO:tasks.workunit.client.1.vm05.stdout:6/117: rmdir d17/d1d 39 2026-03-09T16:15:05.706 INFO:tasks.workunit.client.1.vm05.stdout:6/118: dwrite d17/f1c [0,4194304] 0 2026-03-09T16:15:05.730 INFO:tasks.workunit.client.1.vm05.stdout:7/193: mkdir d1/d19/d3c 0 2026-03-09T16:15:05.730 INFO:tasks.workunit.client.1.vm05.stdout:5/105: write f1 [1245243,89207] 0 2026-03-09T16:15:05.730 INFO:tasks.workunit.client.1.vm05.stdout:7/194: fdatasync d1/d2/d8/d31/f39 0 2026-03-09T16:15:05.731 INFO:tasks.workunit.client.1.vm05.stdout:7/195: write d1/d2/d8/dc/f1a [455623,70561] 0 2026-03-09T16:15:05.736 INFO:tasks.workunit.client.1.vm05.stdout:9/122: symlink d4/d10/l23 0 2026-03-09T16:15:05.736 INFO:tasks.workunit.client.1.vm05.stdout:2/123: read fa [1557171,110158] 0 2026-03-09T16:15:05.739 INFO:tasks.workunit.client.1.vm05.stdout:2/124: chown db/dd/d15/d1f/d20/d23/f1a 222902 1 2026-03-09T16:15:05.744 INFO:tasks.workunit.client.1.vm05.stdout:9/123: chown d4/d8/c1b 1 1 2026-03-09T16:15:05.745 INFO:tasks.workunit.client.1.vm05.stdout:5/106: dread d8/f11 [0,4194304] 0 2026-03-09T16:15:05.745 INFO:tasks.workunit.client.1.vm05.stdout:2/125: fdatasync db/dd/d15/d1f/f25 0 2026-03-09T16:15:05.745 INFO:tasks.workunit.client.1.vm05.stdout:7/196: mknod d1/d2/c3d 0 2026-03-09T16:15:05.746 INFO:tasks.workunit.client.1.vm05.stdout:6/119: link d17/f1b d17/f20 0 2026-03-09T16:15:05.747 INFO:tasks.workunit.client.1.vm05.stdout:5/107: rmdir d8/d1d 39 2026-03-09T16:15:05.753 INFO:tasks.workunit.client.1.vm05.stdout:2/126: dwrite db/dd/d15/d1f/d20/d23/f1a [0,4194304] 0 2026-03-09T16:15:05.756 INFO:tasks.workunit.client.1.vm05.stdout:7/197: mkdir d1/d2/d8/dc/d15/d3e 0 2026-03-09T16:15:05.759 INFO:tasks.workunit.client.1.vm05.stdout:7/198: write d1/d2/d11/f1c [759530,67589] 0 2026-03-09T16:15:05.763 INFO:tasks.workunit.client.1.vm05.stdout:5/108: dread f5 [0,4194304] 0 2026-03-09T16:15:05.764 INFO:tasks.workunit.client.1.vm05.stdout:9/124: dread d4/f6 [0,4194304] 0 2026-03-09T16:15:05.764 INFO:tasks.workunit.client.1.vm05.stdout:6/120: link le d17/l21 0 2026-03-09T16:15:05.765 INFO:tasks.workunit.client.1.vm05.stdout:9/125: write d4/d10/f15 [2209269,52050] 0 
2026-03-09T16:15:05.772 INFO:tasks.workunit.client.1.vm05.stdout:7/199: dwrite d1/d2/d8/dc/f1e [0,4194304] 0 2026-03-09T16:15:05.777 INFO:tasks.workunit.client.1.vm05.stdout:6/121: mkdir d17/d22 0 2026-03-09T16:15:05.788 INFO:tasks.workunit.client.1.vm05.stdout:7/200: symlink d1/d2/d8/dc/d18/l3f 0 2026-03-09T16:15:05.788 INFO:tasks.workunit.client.1.vm05.stdout:5/109: rename d8/d18/c1c to d8/c26 0 2026-03-09T16:15:05.789 INFO:tasks.workunit.client.1.vm05.stdout:7/201: fdatasync d1/d2/f22 0 2026-03-09T16:15:05.791 INFO:tasks.workunit.client.1.vm05.stdout:4/122: dwrite d5/f6 [0,4194304] 0 2026-03-09T16:15:05.798 INFO:tasks.workunit.client.1.vm05.stdout:9/126: dwrite f2 [0,4194304] 0 2026-03-09T16:15:05.800 INFO:tasks.workunit.client.1.vm05.stdout:6/122: creat d17/f23 x:0 0 0 2026-03-09T16:15:05.802 INFO:tasks.workunit.client.1.vm05.stdout:4/123: dwrite d5/fd [0,4194304] 0 2026-03-09T16:15:05.807 INFO:tasks.workunit.client.1.vm05.stdout:2/127: fdatasync db/dd/f1b 0 2026-03-09T16:15:05.809 INFO:tasks.workunit.client.1.vm05.stdout:5/110: mknod d8/d18/c27 0 2026-03-09T16:15:05.812 INFO:tasks.workunit.client.1.vm05.stdout:7/202: mknod d1/d2/c40 0 2026-03-09T16:15:05.812 INFO:tasks.workunit.client.1.vm05.stdout:6/123: chown d17/l21 18 1 2026-03-09T16:15:05.813 INFO:tasks.workunit.client.1.vm05.stdout:2/128: creat db/dd/d15/d1f/d20/f2a x:0 0 0 2026-03-09T16:15:05.813 INFO:tasks.workunit.client.1.vm05.stdout:2/129: write db/f12 [1274097,16968] 0 2026-03-09T16:15:05.814 INFO:tasks.workunit.client.1.vm05.stdout:2/130: chown db 29235 1 2026-03-09T16:15:05.814 INFO:tasks.workunit.client.1.vm05.stdout:6/124: read d17/f1c [717747,81392] 0 2026-03-09T16:15:05.814 INFO:tasks.workunit.client.1.vm05.stdout:6/125: fdatasync f5 0 2026-03-09T16:15:05.818 INFO:tasks.workunit.client.1.vm05.stdout:5/111: chown d8/fd 23964545 1 2026-03-09T16:15:05.820 INFO:tasks.workunit.client.1.vm05.stdout:2/131: rmdir db 39 2026-03-09T16:15:05.826 INFO:tasks.workunit.client.1.vm05.stdout:7/203: creat d1/d2/d8/dc/d14/f41 x:0 0 0 2026-03-09T16:15:05.826 INFO:tasks.workunit.client.1.vm05.stdout:9/127: link d4/c12 d4/d8/c24 0 2026-03-09T16:15:05.826 INFO:tasks.workunit.client.1.vm05.stdout:5/112: link f1 d8/d18/d1b/f28 0 2026-03-09T16:15:05.833 INFO:tasks.workunit.client.1.vm05.stdout:6/126: dwrite d17/f18 [0,4194304] 0 2026-03-09T16:15:05.833 INFO:tasks.workunit.client.1.vm05.stdout:6/127: dwrite fa [0,4194304] 0 2026-03-09T16:15:05.834 INFO:tasks.workunit.client.1.vm05.stdout:9/128: write d4/d10/f1d [463977,26546] 0 2026-03-09T16:15:05.839 INFO:tasks.workunit.client.1.vm05.stdout:7/204: creat d1/d2/d8/dc/d1b/f42 x:0 0 0 2026-03-09T16:15:05.842 INFO:tasks.workunit.client.1.vm05.stdout:2/132: rename db/dd/d15/d1f/d20/f2a to db/dd/d15/d1f/f2b 0 2026-03-09T16:15:05.847 INFO:tasks.workunit.client.1.vm05.stdout:5/113: dwrite d8/d18/d1b/f28 [0,4194304] 0 2026-03-09T16:15:05.849 INFO:tasks.workunit.client.1.vm05.stdout:2/133: write f7 [1836296,108297] 0 2026-03-09T16:15:05.851 INFO:tasks.workunit.client.1.vm05.stdout:5/114: write f1 [6814575,101316] 0 2026-03-09T16:15:05.857 INFO:tasks.workunit.client.1.vm05.stdout:6/128: link d17/f20 d17/d1d/f24 0 2026-03-09T16:15:05.858 INFO:tasks.workunit.client.1.vm05.stdout:2/134: chown db/dd/d15/c1c 491789 1 2026-03-09T16:15:05.860 INFO:tasks.workunit.client.1.vm05.stdout:9/129: sync 2026-03-09T16:15:05.870 INFO:tasks.workunit.client.1.vm05.stdout:5/115: creat d8/f29 x:0 0 0 2026-03-09T16:15:05.871 INFO:tasks.workunit.client.1.vm05.stdout:6/129: rename d17/l21 to d17/d1d/l25 0 2026-03-09T16:15:05.871 
INFO:tasks.workunit.client.1.vm05.stdout:2/135: link db/dd/d15/d1f/d20/d23/c26 db/dd/d15/c2c 0 2026-03-09T16:15:05.872 INFO:tasks.workunit.client.1.vm05.stdout:6/130: unlink d17/f20 0 2026-03-09T16:15:05.873 INFO:tasks.workunit.client.1.vm05.stdout:2/136: creat db/f2d x:0 0 0 2026-03-09T16:15:05.873 INFO:tasks.workunit.client.1.vm05.stdout:6/131: write fa [5079910,89993] 0 2026-03-09T16:15:05.873 INFO:tasks.workunit.client.1.vm05.stdout:5/116: creat d8/d18/d1b/f2a x:0 0 0 2026-03-09T16:15:05.877 INFO:tasks.workunit.client.1.vm05.stdout:9/130: dwrite d4/d10/f15 [0,4194304] 0 2026-03-09T16:15:05.882 INFO:tasks.workunit.client.1.vm05.stdout:2/137: rename db/dd/d15/c1c to db/dd/d15/d1f/d20/c2e 0 2026-03-09T16:15:05.885 INFO:tasks.workunit.client.1.vm05.stdout:5/117: write d8/f11 [2144514,33605] 0 2026-03-09T16:15:05.890 INFO:tasks.workunit.client.1.vm05.stdout:5/118: symlink d8/d18/d1b/l2b 0 2026-03-09T16:15:05.893 INFO:tasks.workunit.client.1.vm05.stdout:6/132: dwrite d17/f23 [0,4194304] 0 2026-03-09T16:15:05.900 INFO:tasks.workunit.client.1.vm05.stdout:2/138: dwrite db/f12 [0,4194304] 0 2026-03-09T16:15:05.904 INFO:tasks.workunit.client.1.vm05.stdout:5/119: dwrite d8/d18/d1b/f2a [0,4194304] 0 2026-03-09T16:15:05.906 INFO:tasks.workunit.client.1.vm05.stdout:2/139: mknod db/dd/d15/c2f 0 2026-03-09T16:15:05.912 INFO:tasks.workunit.client.1.vm05.stdout:6/133: symlink d17/l26 0 2026-03-09T16:15:05.913 INFO:tasks.workunit.client.1.vm05.stdout:2/140: fsync f5 0 2026-03-09T16:15:05.915 INFO:tasks.workunit.client.1.vm05.stdout:6/134: chown d17/f1a 1093 1 2026-03-09T16:15:05.915 INFO:tasks.workunit.client.1.vm05.stdout:2/141: mknod db/c30 0 2026-03-09T16:15:05.916 INFO:tasks.workunit.client.1.vm05.stdout:2/142: chown db/dd/d15/d1f/f2b 1562 1 2026-03-09T16:15:05.916 INFO:tasks.workunit.client.1.vm05.stdout:2/143: read - db/dd/d15/d1f/f24 zero size 2026-03-09T16:15:05.917 INFO:tasks.workunit.client.1.vm05.stdout:2/144: read db/dd/f1b [2866283,47033] 0 2026-03-09T16:15:05.932 INFO:tasks.workunit.client.1.vm05.stdout:2/145: chown db/dd/d15/c2c 14 1 2026-03-09T16:15:05.932 INFO:tasks.workunit.client.1.vm05.stdout:2/146: readlink db/dd/d15/d1f/d20/d23/l14 0 2026-03-09T16:15:05.936 INFO:tasks.workunit.client.1.vm05.stdout:5/120: dwrite f1 [0,4194304] 0 2026-03-09T16:15:05.941 INFO:tasks.workunit.client.1.vm05.stdout:5/121: chown l3 132580860 1 2026-03-09T16:15:05.946 INFO:tasks.workunit.client.1.vm05.stdout:5/122: creat d8/d18/d1b/f2c x:0 0 0 2026-03-09T16:15:05.946 INFO:tasks.workunit.client.1.vm05.stdout:2/147: dwrite f7 [0,4194304] 0 2026-03-09T16:15:05.956 INFO:tasks.workunit.client.1.vm05.stdout:2/148: symlink db/dd/d15/d1f/d21/l31 0 2026-03-09T16:15:05.959 INFO:tasks.workunit.client.1.vm05.stdout:2/149: chown db/dd/d15/d1f/d20/d23/l14 27991078 1 2026-03-09T16:15:05.961 INFO:tasks.workunit.client.1.vm05.stdout:5/123: dwrite f1 [0,4194304] 0 2026-03-09T16:15:05.966 INFO:tasks.workunit.client.1.vm05.stdout:2/150: creat db/dd/f32 x:0 0 0 2026-03-09T16:15:05.972 INFO:tasks.workunit.client.1.vm05.stdout:2/151: fsync db/dd/d15/d1f/f25 0 2026-03-09T16:15:05.972 INFO:tasks.workunit.client.1.vm05.stdout:5/124: link d8/d1d/f1e d8/d18/d1b/f2d 0 2026-03-09T16:15:05.975 INFO:tasks.workunit.client.1.vm05.stdout:5/125: read d8/f13 [2589788,125958] 0 2026-03-09T16:15:05.981 INFO:tasks.workunit.client.1.vm05.stdout:5/126: chown d8/l24 26873 1 2026-03-09T16:15:05.981 INFO:tasks.workunit.client.1.vm05.stdout:5/127: unlink l3 0 2026-03-09T16:15:05.981 INFO:tasks.workunit.client.1.vm05.stdout:5/128: mkdir d8/d18/d1b/d2e 0 
2026-03-09T16:15:05.982 INFO:tasks.workunit.client.1.vm05.stdout:5/129: mknod d8/c2f 0 2026-03-09T16:15:05.993 INFO:tasks.workunit.client.1.vm05.stdout:5/130: dwrite d8/fd [0,4194304] 0 2026-03-09T16:15:05.995 INFO:tasks.workunit.client.1.vm05.stdout:5/131: dread - d8/d1d/f21 zero size 2026-03-09T16:15:06.007 INFO:tasks.workunit.client.1.vm05.stdout:5/132: dread d8/d18/f20 [0,4194304] 0 2026-03-09T16:15:06.012 INFO:tasks.workunit.client.1.vm05.stdout:5/133: dread f5 [0,4194304] 0 2026-03-09T16:15:06.014 INFO:tasks.workunit.client.1.vm05.stdout:5/134: creat d8/d18/d1b/f30 x:0 0 0 2026-03-09T16:15:06.015 INFO:tasks.workunit.client.1.vm05.stdout:5/135: write d8/f1f [30081,1298] 0 2026-03-09T16:15:06.016 INFO:tasks.workunit.client.1.vm05.stdout:5/136: chown d8/d1d/f1e 2140735705 1 2026-03-09T16:15:06.018 INFO:tasks.workunit.client.1.vm05.stdout:5/137: creat d8/d18/d1b/f31 x:0 0 0 2026-03-09T16:15:06.019 INFO:tasks.workunit.client.1.vm05.stdout:5/138: write d8/d18/d1b/f31 [130004,16637] 0 2026-03-09T16:15:06.020 INFO:tasks.workunit.client.1.vm05.stdout:5/139: rmdir d8/d1d 39 2026-03-09T16:15:06.029 INFO:tasks.workunit.client.1.vm05.stdout:5/140: dwrite d8/d1d/f1e [0,4194304] 0 2026-03-09T16:15:06.030 INFO:tasks.workunit.client.1.vm05.stdout:5/141: write d8/f11 [3122912,69625] 0 2026-03-09T16:15:06.032 INFO:tasks.workunit.client.1.vm05.stdout:5/142: write d8/d18/d1b/f31 [290370,62219] 0 2026-03-09T16:15:06.034 INFO:tasks.workunit.client.1.vm05.stdout:5/143: write d8/d18/d1b/f31 [1387909,55724] 0 2026-03-09T16:15:06.035 INFO:tasks.workunit.client.1.vm05.stdout:5/144: chown d8/d18/c27 1841866 1 2026-03-09T16:15:06.039 INFO:tasks.workunit.client.1.vm05.stdout:5/145: creat d8/d18/d1b/f32 x:0 0 0 2026-03-09T16:15:06.040 INFO:tasks.workunit.client.1.vm05.stdout:5/146: mknod d8/d18/d1b/c33 0 2026-03-09T16:15:06.041 INFO:tasks.workunit.client.1.vm05.stdout:5/147: write f5 [1728806,118411] 0 2026-03-09T16:15:06.047 INFO:tasks.workunit.client.1.vm05.stdout:5/148: dread d8/fd [0,4194304] 0 2026-03-09T16:15:06.048 INFO:tasks.workunit.client.1.vm05.stdout:5/149: rename d8 to d8/d34 22 2026-03-09T16:15:06.054 INFO:tasks.workunit.client.1.vm05.stdout:5/150: dwrite d8/f1f [0,4194304] 0 2026-03-09T16:15:06.065 INFO:tasks.workunit.client.1.vm05.stdout:5/151: link f5 d8/d18/d1b/d2e/f35 0 2026-03-09T16:15:06.066 INFO:tasks.workunit.client.1.vm05.stdout:5/152: creat d8/d18/d1b/f36 x:0 0 0 2026-03-09T16:15:06.067 INFO:tasks.workunit.client.1.vm05.stdout:5/153: symlink d8/d18/l37 0 2026-03-09T16:15:06.068 INFO:tasks.workunit.client.1.vm05.stdout:5/154: unlink d8/d18/d1b/l2b 0 2026-03-09T16:15:06.070 INFO:tasks.workunit.client.1.vm05.stdout:5/155: dread d8/d18/d1b/f28 [4194304,4194304] 0 2026-03-09T16:15:06.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:05 vm03.local ceph-mon[51019]: pgmap v12: 65 pgs: 65 active+clean; 2.1 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 40 MiB/s rd, 81 MiB/s wr, 292 op/s 2026-03-09T16:15:06.144 INFO:tasks.workunit.client.1.vm05.stdout:0/133: truncate d5/d2c/f28 701517 0 2026-03-09T16:15:06.144 INFO:tasks.workunit.client.1.vm05.stdout:3/82: truncate d0/d9/fa 1600362 0 2026-03-09T16:15:06.145 INFO:tasks.workunit.client.1.vm05.stdout:0/134: fsync d5/db/fc 0 2026-03-09T16:15:06.145 INFO:tasks.workunit.client.1.vm05.stdout:0/135: dread - d5/d1b/f25 zero size 2026-03-09T16:15:06.146 INFO:tasks.workunit.client.1.vm05.stdout:3/83: dread d0/d9/dc/f14 [0,4194304] 0 2026-03-09T16:15:06.146 INFO:tasks.workunit.client.1.vm05.stdout:1/134: truncate d7/d15/d16/f1c 166645 0 
2026-03-09T16:15:06.147 INFO:tasks.workunit.client.1.vm05.stdout:1/135: dread - d7/dd/f1f zero size 2026-03-09T16:15:06.151 INFO:tasks.workunit.client.1.vm05.stdout:3/84: creat d0/d9/dc/f18 x:0 0 0 2026-03-09T16:15:06.152 INFO:tasks.workunit.client.1.vm05.stdout:0/136: getdents d5/d1b 0 2026-03-09T16:15:06.152 INFO:tasks.workunit.client.1.vm05.stdout:3/85: chown d0/d9/dc/l15 9833 1 2026-03-09T16:15:06.155 INFO:tasks.workunit.client.1.vm05.stdout:1/136: dwrite d7/fc [0,4194304] 0 2026-03-09T16:15:06.159 INFO:tasks.workunit.client.1.vm05.stdout:3/86: truncate d0/f13 1693021 0 2026-03-09T16:15:06.159 INFO:tasks.workunit.client.1.vm05.stdout:1/137: dread - d7/dd/f1f zero size 2026-03-09T16:15:06.164 INFO:tasks.workunit.client.1.vm05.stdout:3/87: rmdir d0/d9/d16 0 2026-03-09T16:15:06.165 INFO:tasks.workunit.client.1.vm05.stdout:0/137: dwrite d5/db/fc [4194304,4194304] 0 2026-03-09T16:15:06.166 INFO:tasks.workunit.client.1.vm05.stdout:1/138: dread d7/f9 [0,4194304] 0 2026-03-09T16:15:06.173 INFO:tasks.workunit.client.1.vm05.stdout:3/88: chown d0/d9/fa 0 1 2026-03-09T16:15:06.188 INFO:tasks.workunit.client.1.vm05.stdout:4/124: write d5/fb [328686,103902] 0 2026-03-09T16:15:06.188 INFO:tasks.workunit.client.1.vm05.stdout:9/131: rmdir d4 39 2026-03-09T16:15:06.194 INFO:tasks.workunit.client.1.vm05.stdout:4/125: symlink d5/l2b 0 2026-03-09T16:15:06.204 INFO:tasks.workunit.client.1.vm05.stdout:7/205: truncate d1/d2/d8/dc/d14/f35 1483233 0 2026-03-09T16:15:06.204 INFO:tasks.workunit.client.1.vm05.stdout:7/206: dwrite d1/d2/d8/dc/f1e [0,4194304] 0 2026-03-09T16:15:06.205 INFO:tasks.workunit.client.1.vm05.stdout:4/126: creat d5/de/d15/d21/d27/f2c x:0 0 0 2026-03-09T16:15:06.209 INFO:tasks.workunit.client.1.vm05.stdout:9/132: getdents d4 0 2026-03-09T16:15:06.209 INFO:tasks.workunit.client.1.vm05.stdout:3/89: sync 2026-03-09T16:15:06.209 INFO:tasks.workunit.client.1.vm05.stdout:4/127: creat d5/f2d x:0 0 0 2026-03-09T16:15:06.209 INFO:tasks.workunit.client.1.vm05.stdout:1/139: sync 2026-03-09T16:15:06.210 INFO:tasks.workunit.client.1.vm05.stdout:8/98: dwrite d4/d6/db/fe [0,4194304] 0 2026-03-09T16:15:06.213 INFO:tasks.workunit.client.1.vm05.stdout:4/128: write d5/f2d [219649,30304] 0 2026-03-09T16:15:06.213 INFO:tasks.workunit.client.1.vm05.stdout:9/133: symlink d4/d10/l25 0 2026-03-09T16:15:06.214 INFO:tasks.workunit.client.1.vm05.stdout:3/90: mknod d0/d9/d10/c19 0 2026-03-09T16:15:06.215 INFO:tasks.workunit.client.1.vm05.stdout:1/140: chown d7/dd/f19 3803889 1 2026-03-09T16:15:06.217 INFO:tasks.workunit.client.1.vm05.stdout:8/99: creat d4/d6/f29 x:0 0 0 2026-03-09T16:15:06.220 INFO:tasks.workunit.client.1.vm05.stdout:1/141: dread d7/fb [4194304,4194304] 0 2026-03-09T16:15:06.232 INFO:tasks.workunit.client.1.vm05.stdout:8/100: fdatasync d4/d6/f24 0 2026-03-09T16:15:06.232 INFO:tasks.workunit.client.1.vm05.stdout:7/207: link d1/d2/d8/dc/d14/c13 d1/d2/d11/c43 0 2026-03-09T16:15:06.232 INFO:tasks.workunit.client.1.vm05.stdout:1/142: write d7/d15/f22 [2465887,122023] 0 2026-03-09T16:15:06.233 INFO:tasks.workunit.client.1.vm05.stdout:1/143: dread - d7/dd/de/f2e zero size 2026-03-09T16:15:06.234 INFO:tasks.workunit.client.1.vm05.stdout:1/144: dread - d7/d15/f2a zero size 2026-03-09T16:15:06.235 INFO:tasks.workunit.client.1.vm05.stdout:9/134: dwrite d4/d8/f21 [0,4194304] 0 2026-03-09T16:15:06.237 INFO:tasks.workunit.client.1.vm05.stdout:7/208: sync 2026-03-09T16:15:06.245 INFO:tasks.workunit.client.1.vm05.stdout:1/145: creat d7/d27/f33 x:0 0 0 2026-03-09T16:15:06.250 
INFO:tasks.workunit.client.1.vm05.stdout:4/129: getdents d5/d19 0 2026-03-09T16:15:06.250 INFO:tasks.workunit.client.1.vm05.stdout:8/101: creat d4/d6/db/dc/f2a x:0 0 0 2026-03-09T16:15:06.252 INFO:tasks.workunit.client.1.vm05.stdout:9/135: dwrite d4/f20 [0,4194304] 0 2026-03-09T16:15:06.263 INFO:tasks.workunit.client.1.vm05.stdout:4/130: link d5/de/f16 d5/f2e 0 2026-03-09T16:15:06.264 INFO:tasks.workunit.client.1.vm05.stdout:4/131: chown d5/lf 836348 1 2026-03-09T16:15:06.265 INFO:tasks.workunit.client.1.vm05.stdout:1/146: dwrite d7/fb [4194304,4194304] 0 2026-03-09T16:15:06.266 INFO:tasks.workunit.client.1.vm05.stdout:4/132: write d5/f10 [429197,26929] 0 2026-03-09T16:15:06.273 INFO:tasks.workunit.client.1.vm05.stdout:4/133: mkdir d5/de/d2f 0 2026-03-09T16:15:06.284 INFO:tasks.workunit.client.1.vm05.stdout:4/134: readlink d5/l2b 0 2026-03-09T16:15:06.284 INFO:tasks.workunit.client.1.vm05.stdout:4/135: creat d5/de/d15/d21/d27/f30 x:0 0 0 2026-03-09T16:15:06.284 INFO:tasks.workunit.client.1.vm05.stdout:4/136: fdatasync f0 0 2026-03-09T16:15:06.284 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:05 vm05.local ceph-mon[58702]: pgmap v12: 65 pgs: 65 active+clean; 2.1 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 40 MiB/s rd, 81 MiB/s wr, 292 op/s 2026-03-09T16:15:06.285 INFO:tasks.workunit.client.1.vm05.stdout:6/135: rmdir d17 39 2026-03-09T16:15:06.285 INFO:tasks.workunit.client.1.vm05.stdout:2/152: getdents db 0 2026-03-09T16:15:06.287 INFO:tasks.workunit.client.1.vm05.stdout:2/153: write db/dd/d15/d1f/f2b [538342,117564] 0 2026-03-09T16:15:06.287 INFO:tasks.workunit.client.1.vm05.stdout:5/156: truncate d8/fd 3806194 0 2026-03-09T16:15:06.290 INFO:tasks.workunit.client.1.vm05.stdout:6/136: mkdir d17/d22/d27 0 2026-03-09T16:15:06.290 INFO:tasks.workunit.client.1.vm05.stdout:2/154: symlink db/dd/d15/d1f/d21/l33 0 2026-03-09T16:15:06.294 INFO:tasks.workunit.client.1.vm05.stdout:0/138: rename d5/d1f/f27 to d5/db/d1d/f2e 0 2026-03-09T16:15:06.296 INFO:tasks.workunit.client.1.vm05.stdout:5/157: link d8/c2f d8/d18/d1b/d2e/c38 0 2026-03-09T16:15:06.296 INFO:tasks.workunit.client.1.vm05.stdout:0/139: unlink d5/db/l13 0 2026-03-09T16:15:06.298 INFO:tasks.workunit.client.1.vm05.stdout:0/140: creat d5/d1f/f2f x:0 0 0 2026-03-09T16:15:06.298 INFO:tasks.workunit.client.1.vm05.stdout:2/155: dread f7 [0,4194304] 0 2026-03-09T16:15:06.299 INFO:tasks.workunit.client.1.vm05.stdout:6/137: getdents d17/d1d 0 2026-03-09T16:15:06.299 INFO:tasks.workunit.client.1.vm05.stdout:2/156: read - db/dd/d15/d1f/f25 zero size 2026-03-09T16:15:06.300 INFO:tasks.workunit.client.1.vm05.stdout:2/157: readlink db/dd/d15/d1f/d20/d23/l1d 0 2026-03-09T16:15:06.303 INFO:tasks.workunit.client.1.vm05.stdout:6/138: truncate f5 2913513 0 2026-03-09T16:15:06.306 INFO:tasks.workunit.client.1.vm05.stdout:4/137: dread d5/fb [0,4194304] 0 2026-03-09T16:15:06.307 INFO:tasks.workunit.client.1.vm05.stdout:6/139: dread d17/f1c [0,4194304] 0 2026-03-09T16:15:06.311 INFO:tasks.workunit.client.1.vm05.stdout:6/140: mkdir d17/d28 0 2026-03-09T16:15:06.312 INFO:tasks.workunit.client.1.vm05.stdout:6/141: mknod d17/d1d/c29 0 2026-03-09T16:15:06.314 INFO:tasks.workunit.client.1.vm05.stdout:6/142: rmdir d17/d28 0 2026-03-09T16:15:06.316 INFO:tasks.workunit.client.1.vm05.stdout:6/143: link f16 d17/d22/d27/f2a 0 2026-03-09T16:15:06.319 INFO:tasks.workunit.client.1.vm05.stdout:6/144: dread fa [4194304,4194304] 0 2026-03-09T16:15:06.320 INFO:tasks.workunit.client.1.vm05.stdout:6/145: mknod d17/d1d/c2b 0 2026-03-09T16:15:06.330 
INFO:tasks.workunit.client.1.vm05.stdout:2/158: dread db/dd/d15/d1f/f2b [0,4194304] 0 2026-03-09T16:15:06.331 INFO:tasks.workunit.client.1.vm05.stdout:2/159: fsync f5 0 2026-03-09T16:15:06.333 INFO:tasks.workunit.client.1.vm05.stdout:2/160: symlink db/l34 0 2026-03-09T16:15:06.344 INFO:tasks.workunit.client.1.vm05.stdout:7/209: rename d1/d2/fe to d1/d2/d8/dc/d14/f44 0 2026-03-09T16:15:06.359 INFO:tasks.workunit.client.1.vm05.stdout:4/138: getdents d5 0 2026-03-09T16:15:06.360 INFO:tasks.workunit.client.1.vm05.stdout:8/102: dwrite d4/d6/db/dc/f17 [0,4194304] 0 2026-03-09T16:15:06.361 INFO:tasks.workunit.client.1.vm05.stdout:9/136: truncate d4/f20 3928449 0 2026-03-09T16:15:06.361 INFO:tasks.workunit.client.1.vm05.stdout:5/158: write d8/fd [2154877,85858] 0 2026-03-09T16:15:06.364 INFO:tasks.workunit.client.1.vm05.stdout:9/137: write d4/d10/f1d [412334,112352] 0 2026-03-09T16:15:06.365 INFO:tasks.workunit.client.1.vm05.stdout:5/159: write d8/d18/d1b/f2a [4042907,68106] 0 2026-03-09T16:15:06.371 INFO:tasks.workunit.client.1.vm05.stdout:9/138: fdatasync d4/fa 0 2026-03-09T16:15:06.374 INFO:tasks.workunit.client.1.vm05.stdout:1/147: dwrite d7/d15/d16/f26 [0,4194304] 0 2026-03-09T16:15:06.378 INFO:tasks.workunit.client.1.vm05.stdout:3/91: truncate d0/d9/fa 1446016 0 2026-03-09T16:15:06.380 INFO:tasks.workunit.client.1.vm05.stdout:5/160: sync 2026-03-09T16:15:06.381 INFO:tasks.workunit.client.1.vm05.stdout:2/161: write db/dd/d15/d1f/f24 [906084,20832] 0 2026-03-09T16:15:06.381 INFO:tasks.workunit.client.1.vm05.stdout:4/139: dwrite d5/fd [0,4194304] 0 2026-03-09T16:15:06.381 INFO:tasks.workunit.client.1.vm05.stdout:0/141: rename d5/d1f to d5/d1b/d30 0 2026-03-09T16:15:06.382 INFO:tasks.workunit.client.1.vm05.stdout:8/103: fdatasync d4/f1c 0 2026-03-09T16:15:06.384 INFO:tasks.workunit.client.1.vm05.stdout:9/139: mknod d4/d10/c26 0 2026-03-09T16:15:06.388 INFO:tasks.workunit.client.1.vm05.stdout:3/92: fsync d0/fd 0 2026-03-09T16:15:06.392 INFO:tasks.workunit.client.1.vm05.stdout:6/146: rename d17/f23 to d17/d22/f2c 0 2026-03-09T16:15:06.395 INFO:tasks.workunit.client.1.vm05.stdout:2/162: rename db/dd/d15 to db/dd/d15/d1f/d35 22 2026-03-09T16:15:06.395 INFO:tasks.workunit.client.1.vm05.stdout:6/147: chown d17 910880 1 2026-03-09T16:15:06.395 INFO:tasks.workunit.client.1.vm05.stdout:4/140: mkdir d5/de/d15/d21/d31 0 2026-03-09T16:15:06.396 INFO:tasks.workunit.client.1.vm05.stdout:6/148: chown d17/d1d/c2b 33115 1 2026-03-09T16:15:06.397 INFO:tasks.workunit.client.1.vm05.stdout:8/104: mknod d4/d6/c2b 0 2026-03-09T16:15:06.399 INFO:tasks.workunit.client.1.vm05.stdout:5/161: symlink d8/d18/d1b/d2e/l39 0 2026-03-09T16:15:06.402 INFO:tasks.workunit.client.1.vm05.stdout:2/163: creat db/dd/d15/d1f/f36 x:0 0 0 2026-03-09T16:15:06.404 INFO:tasks.workunit.client.1.vm05.stdout:4/141: dread f1 [0,4194304] 0 2026-03-09T16:15:06.404 INFO:tasks.workunit.client.1.vm05.stdout:8/105: dwrite d4/d6/f27 [0,4194304] 0 2026-03-09T16:15:06.407 INFO:tasks.workunit.client.1.vm05.stdout:0/142: link d5/c15 d5/d1b/d30/c31 0 2026-03-09T16:15:06.407 INFO:tasks.workunit.client.1.vm05.stdout:8/106: readlink d4/l5 0 2026-03-09T16:15:06.408 INFO:tasks.workunit.client.1.vm05.stdout:8/107: readlink d4/d6/l19 0 2026-03-09T16:15:06.412 INFO:tasks.workunit.client.1.vm05.stdout:5/162: link d8/d18/d1b/f28 d8/d18/f3a 0 2026-03-09T16:15:06.412 INFO:tasks.workunit.client.1.vm05.stdout:2/164: symlink db/dd/d15/d1f/d20/d23/l37 0 2026-03-09T16:15:06.413 INFO:tasks.workunit.client.1.vm05.stdout:4/142: rmdir d5/de/d15/d21/d27 39 2026-03-09T16:15:06.413 
INFO:tasks.workunit.client.1.vm05.stdout:0/143: mknod d5/d11/c32 0 2026-03-09T16:15:06.414 INFO:tasks.workunit.client.1.vm05.stdout:0/144: fsync d5/d11/f18 0 2026-03-09T16:15:06.418 INFO:tasks.workunit.client.1.vm05.stdout:6/149: link d17/f1c d17/f2d 0 2026-03-09T16:15:06.418 INFO:tasks.workunit.client.1.vm05.stdout:0/145: write d5/d1b/d30/f2a [574661,64794] 0 2026-03-09T16:15:06.420 INFO:tasks.workunit.client.1.vm05.stdout:2/165: mknod db/dd/d15/d1f/c38 0 2026-03-09T16:15:06.424 INFO:tasks.workunit.client.1.vm05.stdout:3/93: getdents d0/d9 0 2026-03-09T16:15:06.425 INFO:tasks.workunit.client.1.vm05.stdout:8/108: dwrite f0 [0,4194304] 0 2026-03-09T16:15:06.431 INFO:tasks.workunit.client.1.vm05.stdout:4/143: dwrite d5/d19/f1f [0,4194304] 0 2026-03-09T16:15:06.432 INFO:tasks.workunit.client.1.vm05.stdout:8/109: write d4/f10 [323646,92729] 0 2026-03-09T16:15:06.437 INFO:tasks.workunit.client.1.vm05.stdout:6/150: rename fb to d17/f2e 0 2026-03-09T16:15:06.439 INFO:tasks.workunit.client.1.vm05.stdout:0/146: dwrite d5/d1b/d30/f29 [0,4194304] 0 2026-03-09T16:15:06.442 INFO:tasks.workunit.client.1.vm05.stdout:2/166: dwrite db/dd/f1b [0,4194304] 0 2026-03-09T16:15:06.443 INFO:tasks.workunit.client.1.vm05.stdout:3/94: mknod d0/d9/dc/c1a 0 2026-03-09T16:15:06.443 INFO:tasks.workunit.client.1.vm05.stdout:8/110: mknod d4/d6/db/d14/c2c 0 2026-03-09T16:15:06.446 INFO:tasks.workunit.client.1.vm05.stdout:3/95: write d0/d9/dc/f18 [374924,90106] 0 2026-03-09T16:15:06.447 INFO:tasks.workunit.client.1.vm05.stdout:3/96: chown d0/d9/dc/c1a 64043870 1 2026-03-09T16:15:06.452 INFO:tasks.workunit.client.1.vm05.stdout:2/167: dwrite db/dd/d15/d1f/f36 [0,4194304] 0 2026-03-09T16:15:06.455 INFO:tasks.workunit.client.1.vm05.stdout:4/144: rename d5/d19/f1e to d5/d19/f32 0 2026-03-09T16:15:06.458 INFO:tasks.workunit.client.1.vm05.stdout:4/145: chown d5/d19 876360 1 2026-03-09T16:15:06.461 INFO:tasks.workunit.client.1.vm05.stdout:0/147: symlink d5/db/l33 0 2026-03-09T16:15:06.461 INFO:tasks.workunit.client.1.vm05.stdout:4/146: write f0 [1752876,126332] 0 2026-03-09T16:15:06.463 INFO:tasks.workunit.client.1.vm05.stdout:0/148: readlink d5/d11/l19 0 2026-03-09T16:15:06.464 INFO:tasks.workunit.client.1.vm05.stdout:3/97: stat d0/d9/dc/ce 0 2026-03-09T16:15:06.464 INFO:tasks.workunit.client.1.vm05.stdout:4/147: symlink d5/d19/l33 0 2026-03-09T16:15:06.465 INFO:tasks.workunit.client.1.vm05.stdout:2/168: link db/dd/d15/d1f/f24 db/dd/d15/d1f/d21/f39 0 2026-03-09T16:15:06.466 INFO:tasks.workunit.client.1.vm05.stdout:0/149: mkdir d5/d34 0 2026-03-09T16:15:06.471 INFO:tasks.workunit.client.1.vm05.stdout:3/98: mkdir d0/d1b 0 2026-03-09T16:15:06.471 INFO:tasks.workunit.client.1.vm05.stdout:2/169: write db/dd/d15/d1f/f2b [999982,49701] 0 2026-03-09T16:15:06.472 INFO:tasks.workunit.client.1.vm05.stdout:0/150: creat d5/d34/f35 x:0 0 0 2026-03-09T16:15:06.472 INFO:tasks.workunit.client.1.vm05.stdout:4/148: creat d5/de/d15/f34 x:0 0 0 2026-03-09T16:15:06.473 INFO:tasks.workunit.client.1.vm05.stdout:0/151: chown d5/f17 29710 1 2026-03-09T16:15:06.473 INFO:tasks.workunit.client.1.vm05.stdout:0/152: write d5/d34/f35 [813503,42885] 0 2026-03-09T16:15:06.475 INFO:tasks.workunit.client.1.vm05.stdout:4/149: write d5/de/d15/f1b [804488,39056] 0 2026-03-09T16:15:06.479 INFO:tasks.workunit.client.1.vm05.stdout:4/150: creat d5/f35 x:0 0 0 2026-03-09T16:15:06.480 INFO:tasks.workunit.client.1.vm05.stdout:2/170: getdents db/dd 0 2026-03-09T16:15:06.481 INFO:tasks.workunit.client.1.vm05.stdout:2/171: mknod db/dd/d15/d1f/c3a 0 2026-03-09T16:15:06.487 
INFO:tasks.workunit.client.1.vm05.stdout:2/172: dwrite db/dd/f32 [0,4194304] 0 2026-03-09T16:15:06.488 INFO:tasks.workunit.client.1.vm05.stdout:7/210: truncate d1/d2/d8/dc/f1a 570682 0 2026-03-09T16:15:06.494 INFO:tasks.workunit.client.1.vm05.stdout:2/173: fdatasync f7 0 2026-03-09T16:15:06.494 INFO:tasks.workunit.client.1.vm05.stdout:2/174: read - db/dd/d15/f28 zero size 2026-03-09T16:15:06.495 INFO:tasks.workunit.client.1.vm05.stdout:7/211: creat d1/d2/d8/dc/f45 x:0 0 0 2026-03-09T16:15:06.495 INFO:tasks.workunit.client.1.vm05.stdout:2/175: write db/f17 [614671,60006] 0 2026-03-09T16:15:06.496 INFO:tasks.workunit.client.1.vm05.stdout:2/176: read - db/dd/d15/f28 zero size 2026-03-09T16:15:06.498 INFO:tasks.workunit.client.1.vm05.stdout:2/177: read db/dd/f10 [1820227,8118] 0 2026-03-09T16:15:06.498 INFO:tasks.workunit.client.1.vm05.stdout:7/212: rmdir d1/d2/d8/d31 39 2026-03-09T16:15:06.500 INFO:tasks.workunit.client.1.vm05.stdout:7/213: creat d1/d19/f46 x:0 0 0 2026-03-09T16:15:06.501 INFO:tasks.workunit.client.1.vm05.stdout:7/214: chown d1/d19/d3c 7 1 2026-03-09T16:15:06.501 INFO:tasks.workunit.client.1.vm05.stdout:2/178: mknod db/dd/d15/c3b 0 2026-03-09T16:15:06.502 INFO:tasks.workunit.client.1.vm05.stdout:7/215: rmdir d1/d2/d8 39 2026-03-09T16:15:06.503 INFO:tasks.workunit.client.1.vm05.stdout:2/179: mknod db/dd/d15/d1f/c3c 0 2026-03-09T16:15:06.510 INFO:tasks.workunit.client.1.vm05.stdout:7/216: fsync d1/d2/d8/dc/d1b/f42 0 2026-03-09T16:15:06.510 INFO:tasks.workunit.client.1.vm05.stdout:2/180: creat db/dd/d15/d1f/d20/f3d x:0 0 0 2026-03-09T16:15:06.510 INFO:tasks.workunit.client.1.vm05.stdout:2/181: dread db/dd/f1b [0,4194304] 0 2026-03-09T16:15:06.520 INFO:tasks.workunit.client.1.vm05.stdout:2/182: dwrite db/dd/d15/f28 [0,4194304] 0 2026-03-09T16:15:06.524 INFO:tasks.workunit.client.1.vm05.stdout:7/217: getdents d1/d2/d8/dc/d15 0 2026-03-09T16:15:06.525 INFO:tasks.workunit.client.1.vm05.stdout:2/183: mknod db/dd/c3e 0 2026-03-09T16:15:06.525 INFO:tasks.workunit.client.1.vm05.stdout:7/218: mkdir d1/d47 0 2026-03-09T16:15:06.529 INFO:tasks.workunit.client.1.vm05.stdout:7/219: write d1/d2/d8/dc/f3b [847181,84755] 0 2026-03-09T16:15:06.530 INFO:tasks.workunit.client.1.vm05.stdout:7/220: truncate d1/d2/d8/dc/d18/f2e 948524 0 2026-03-09T16:15:06.532 INFO:tasks.workunit.client.1.vm05.stdout:7/221: chown d1/d2/d8/c16 28412 1 2026-03-09T16:15:06.533 INFO:tasks.workunit.client.1.vm05.stdout:7/222: creat d1/d2/d8/dc/f48 x:0 0 0 2026-03-09T16:15:06.552 INFO:tasks.workunit.client.1.vm05.stdout:9/140: truncate d4/fa 3890947 0 2026-03-09T16:15:06.554 INFO:tasks.workunit.client.1.vm05.stdout:1/148: dread d7/dd/d21/f2b [0,4194304] 0 2026-03-09T16:15:06.556 INFO:tasks.workunit.client.1.vm05.stdout:1/149: read d7/f9 [1266047,10188] 0 2026-03-09T16:15:06.558 INFO:tasks.workunit.client.1.vm05.stdout:0/153: rmdir d5/d11 39 2026-03-09T16:15:06.561 INFO:tasks.workunit.client.1.vm05.stdout:9/141: dread d4/d10/f15 [0,4194304] 0 2026-03-09T16:15:06.565 INFO:tasks.workunit.client.1.vm05.stdout:1/150: creat d7/f34 x:0 0 0 2026-03-09T16:15:06.571 INFO:tasks.workunit.client.1.vm05.stdout:9/142: write d4/f17 [1699387,115762] 0 2026-03-09T16:15:06.572 INFO:tasks.workunit.client.1.vm05.stdout:9/143: symlink d4/d8/l27 0 2026-03-09T16:15:06.572 INFO:tasks.workunit.client.1.vm05.stdout:9/144: creat d4/d8/f28 x:0 0 0 2026-03-09T16:15:06.572 INFO:tasks.workunit.client.1.vm05.stdout:7/223: sync 2026-03-09T16:15:06.575 INFO:tasks.workunit.client.1.vm05.stdout:1/151: dwrite d7/f34 [0,4194304] 0 2026-03-09T16:15:06.579 
INFO:tasks.workunit.client.1.vm05.stdout:9/145: dwrite d4/d10/f22 [0,4194304] 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:9/146: dread d4/d10/f18 [0,4194304] 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:9/147: write d4/d8/f28 [81939,64453] 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:1/152: unlink d7/d15/f2a 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:1/153: creat d7/dd/de/f35 x:0 0 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:9/148: dwrite d4/d8/f21 [0,4194304] 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:1/154: write d7/fc [1811566,48673] 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:9/149: dread d4/f6 [0,4194304] 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:9/150: symlink d4/d10/l29 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:9/151: write d4/d8/f28 [592017,27735] 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:9/152: write d4/f17 [8813,109636] 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:9/153: creat d4/d10/f2a x:0 0 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:9/154: chown f2 191287 1 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:9/155: mkdir d4/d8/d2b 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:9/156: dread d4/d10/f15 [0,4194304] 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:9/157: write d4/d8/f21 [4240302,93093] 0 2026-03-09T16:15:06.614 INFO:tasks.workunit.client.1.vm05.stdout:9/158: fdatasync d4/d10/f1d 0 2026-03-09T16:15:06.618 INFO:tasks.workunit.client.1.vm05.stdout:9/159: link d4/d10/f1d d4/d8/d2b/f2c 0 2026-03-09T16:15:06.618 INFO:tasks.workunit.client.1.vm05.stdout:9/160: readlink d4/d10/l25 0 2026-03-09T16:15:06.619 INFO:tasks.workunit.client.1.vm05.stdout:9/161: creat d4/f2d x:0 0 0 2026-03-09T16:15:06.663 INFO:tasks.workunit.client.1.vm05.stdout:7/224: sync 2026-03-09T16:15:06.664 INFO:tasks.workunit.client.1.vm05.stdout:7/225: chown d1/f26 7500 1 2026-03-09T16:15:06.675 INFO:tasks.workunit.client.1.vm05.stdout:7/226: dwrite d1/d2/d8/dc/f1e [0,4194304] 0 2026-03-09T16:15:06.679 INFO:tasks.workunit.client.1.vm05.stdout:7/227: dwrite d1/d2/d8/dc/d14/f41 [0,4194304] 0 2026-03-09T16:15:06.695 INFO:tasks.workunit.client.1.vm05.stdout:3/99: dread d0/d9/fa [0,4194304] 0 2026-03-09T16:15:06.697 INFO:tasks.workunit.client.1.vm05.stdout:3/100: read d0/f13 [1300961,129942] 0 2026-03-09T16:15:06.705 INFO:tasks.workunit.client.1.vm05.stdout:7/228: creat d1/d2/d8/dc/d15/d3e/f49 x:0 0 0 2026-03-09T16:15:06.717 INFO:tasks.workunit.client.1.vm05.stdout:3/101: sync 2026-03-09T16:15:06.723 INFO:tasks.workunit.client.1.vm05.stdout:3/102: rename d0/d9/dc/l15 to d0/l1c 0 2026-03-09T16:15:06.736 INFO:tasks.workunit.client.1.vm05.stdout:3/103: rename d0/f13 to d0/d9/f1d 0 2026-03-09T16:15:06.736 INFO:tasks.workunit.client.1.vm05.stdout:3/104: read d0/d9/f1d [263273,105118] 0 2026-03-09T16:15:06.736 INFO:tasks.workunit.client.1.vm05.stdout:3/105: link d0/d9/d10/l11 d0/l1e 0 2026-03-09T16:15:06.736 INFO:tasks.workunit.client.1.vm05.stdout:3/106: dread d0/fd [0,4194304] 0 2026-03-09T16:15:06.736 INFO:tasks.workunit.client.1.vm05.stdout:3/107: symlink d0/d9/dc/l1f 0 2026-03-09T16:15:06.736 INFO:tasks.workunit.client.1.vm05.stdout:3/108: rename d0/l1e to d0/d9/dc/l20 0 2026-03-09T16:15:06.736 INFO:tasks.workunit.client.1.vm05.stdout:3/109: creat d0/d9/dc/f21 x:0 0 0 
2026-03-09T16:15:06.738 INFO:tasks.workunit.client.1.vm05.stdout:3/110: dwrite d0/d9/f1d [0,4194304] 0 2026-03-09T16:15:06.770 INFO:tasks.workunit.client.1.vm05.stdout:8/111: rmdir d4/d6 39 2026-03-09T16:15:06.771 INFO:tasks.workunit.client.1.vm05.stdout:5/163: dwrite d8/f13 [0,4194304] 0 2026-03-09T16:15:06.771 INFO:tasks.workunit.client.1.vm05.stdout:6/151: write f5 [621710,80900] 0 2026-03-09T16:15:06.771 INFO:tasks.workunit.client.1.vm05.stdout:5/164: read f5 [72803,24929] 0 2026-03-09T16:15:06.773 INFO:tasks.workunit.client.1.vm05.stdout:4/151: truncate d5/d19/f1f 225091 0 2026-03-09T16:15:06.776 INFO:tasks.workunit.client.1.vm05.stdout:6/152: rename c15 to d17/d22/c2f 0 2026-03-09T16:15:06.777 INFO:tasks.workunit.client.1.vm05.stdout:5/165: mknod d8/d18/d1b/d2e/c3b 0 2026-03-09T16:15:06.777 INFO:tasks.workunit.client.1.vm05.stdout:6/153: stat d17 0 2026-03-09T16:15:06.780 INFO:tasks.workunit.client.1.vm05.stdout:5/166: write d8/d18/d1b/f2c [718200,89485] 0 2026-03-09T16:15:06.782 INFO:tasks.workunit.client.1.vm05.stdout:5/167: stat d8/f29 0 2026-03-09T16:15:06.785 INFO:tasks.workunit.client.1.vm05.stdout:7/229: getdents d1/d19 0 2026-03-09T16:15:06.788 INFO:tasks.workunit.client.1.vm05.stdout:4/152: dwrite d5/de/d15/f1b [4194304,4194304] 0 2026-03-09T16:15:06.800 INFO:tasks.workunit.client.1.vm05.stdout:8/112: dwrite d4/d6/fa [0,4194304] 0 2026-03-09T16:15:06.801 INFO:tasks.workunit.client.1.vm05.stdout:7/230: chown d1/d19/c37 164950928 1 2026-03-09T16:15:06.804 INFO:tasks.workunit.client.1.vm05.stdout:5/168: rename d8/f1f to d8/d18/d1b/d2e/f3c 0 2026-03-09T16:15:06.806 INFO:tasks.workunit.client.1.vm05.stdout:1/155: write d7/d15/d16/f1c [87216,74579] 0 2026-03-09T16:15:06.809 INFO:tasks.workunit.client.1.vm05.stdout:1/156: write d7/d15/d16/f29 [833298,126607] 0 2026-03-09T16:15:06.813 INFO:tasks.workunit.client.1.vm05.stdout:5/169: dwrite d8/fd [4194304,4194304] 0 2026-03-09T16:15:06.814 INFO:tasks.workunit.client.1.vm05.stdout:1/157: fdatasync d7/dd/de/f35 0 2026-03-09T16:15:06.814 INFO:tasks.workunit.client.1.vm05.stdout:8/113: mknod d4/d6/db/d14/d15/c2d 0 2026-03-09T16:15:06.815 INFO:tasks.workunit.client.1.vm05.stdout:6/154: link d17/f2e d17/f30 0 2026-03-09T16:15:06.826 INFO:tasks.workunit.client.1.vm05.stdout:0/154: dwrite d5/db/d1d/f2e [0,4194304] 0 2026-03-09T16:15:06.826 INFO:tasks.workunit.client.1.vm05.stdout:7/231: mkdir d1/d2/d8/d31/d4a 0 2026-03-09T16:15:06.827 INFO:tasks.workunit.client.1.vm05.stdout:8/114: rmdir d4/d6/db/d14 39 2026-03-09T16:15:06.839 INFO:tasks.workunit.client.1.vm05.stdout:4/153: link d5/f2d d5/de/d15/d21/d27/f36 0 2026-03-09T16:15:06.840 INFO:tasks.workunit.client.1.vm05.stdout:0/155: mknod d5/d1b/c36 0 2026-03-09T16:15:06.844 INFO:tasks.workunit.client.1.vm05.stdout:8/115: mkdir d4/d6/db/dc/d2e 0 2026-03-09T16:15:06.849 INFO:tasks.workunit.client.1.vm05.stdout:1/158: dwrite d7/dd/f1f [0,4194304] 0 2026-03-09T16:15:06.852 INFO:tasks.workunit.client.1.vm05.stdout:4/154: mkdir d5/d19/d37 0 2026-03-09T16:15:06.852 INFO:tasks.workunit.client.1.vm05.stdout:7/232: dread d1/d2/d8/dc/f1e [0,4194304] 0 2026-03-09T16:15:06.852 INFO:tasks.workunit.client.1.vm05.stdout:2/184: dread db/dd/f1b [0,4194304] 0 2026-03-09T16:15:06.857 INFO:tasks.workunit.client.1.vm05.stdout:8/116: symlink d4/d6/l2f 0 2026-03-09T16:15:06.859 INFO:tasks.workunit.client.1.vm05.stdout:1/159: symlink d7/dd/de/l36 0 2026-03-09T16:15:06.860 INFO:tasks.workunit.client.1.vm05.stdout:0/156: dread d5/d11/f23 [0,4194304] 0 2026-03-09T16:15:06.864 
INFO:tasks.workunit.client.1.vm05.stdout:1/160: creat d7/d27/f37 x:0 0 0 2026-03-09T16:15:06.870 INFO:tasks.workunit.client.1.vm05.stdout:8/117: rename d4/d6/fa to d4/d6/db/dc/f30 0 2026-03-09T16:15:06.870 INFO:tasks.workunit.client.1.vm05.stdout:0/157: creat d5/d11/f37 x:0 0 0 2026-03-09T16:15:06.870 INFO:tasks.workunit.client.1.vm05.stdout:8/118: chown d4/l5 37962 1 2026-03-09T16:15:06.870 INFO:tasks.workunit.client.1.vm05.stdout:2/185: mkdir db/dd/d15/d3f 0 2026-03-09T16:15:06.870 INFO:tasks.workunit.client.1.vm05.stdout:1/161: write d7/f34 [948374,104581] 0 2026-03-09T16:15:06.870 INFO:tasks.workunit.client.1.vm05.stdout:4/155: link d5/d19/f32 d5/de/d15/f38 0 2026-03-09T16:15:06.870 INFO:tasks.workunit.client.1.vm05.stdout:0/158: read d5/db/d1d/f2e [3506353,96052] 0 2026-03-09T16:15:06.873 INFO:tasks.workunit.client.1.vm05.stdout:0/159: truncate d5/d1b/d30/f2a 963251 0 2026-03-09T16:15:06.874 INFO:tasks.workunit.client.1.vm05.stdout:1/162: creat d7/dd/de/f38 x:0 0 0 2026-03-09T16:15:06.874 INFO:tasks.workunit.client.1.vm05.stdout:9/162: creat d4/f2e x:0 0 0 2026-03-09T16:15:06.876 INFO:tasks.workunit.client.1.vm05.stdout:2/186: symlink db/dd/d15/d1f/d21/l40 0 2026-03-09T16:15:06.880 INFO:tasks.workunit.client.1.vm05.stdout:1/163: mkdir d7/dd/d21/d39 0 2026-03-09T16:15:06.883 INFO:tasks.workunit.client.1.vm05.stdout:4/156: mkdir d5/de/d15/d21/d39 0 2026-03-09T16:15:06.883 INFO:tasks.workunit.client.1.vm05.stdout:9/163: dread d4/f6 [0,4194304] 0 2026-03-09T16:15:06.887 INFO:tasks.workunit.client.1.vm05.stdout:1/164: mkdir d7/dd/d21/d2d/d3a 0 2026-03-09T16:15:06.893 INFO:tasks.workunit.client.1.vm05.stdout:8/119: link d4/d6/l2f d4/d6/db/d14/d15/l31 0 2026-03-09T16:15:06.894 INFO:tasks.workunit.client.1.vm05.stdout:8/120: chown f0 1527 1 2026-03-09T16:15:06.896 INFO:tasks.workunit.client.1.vm05.stdout:0/160: link d5/ce d5/db/c38 0 2026-03-09T16:15:06.898 INFO:tasks.workunit.client.1.vm05.stdout:1/165: mkdir d7/dd/d21/d3b 0 2026-03-09T16:15:06.901 INFO:tasks.workunit.client.1.vm05.stdout:0/161: truncate d5/d34/f35 897151 0 2026-03-09T16:15:06.906 INFO:tasks.workunit.client.1.vm05.stdout:9/164: chown d4/d8/c24 14574555 1 2026-03-09T16:15:06.909 INFO:tasks.workunit.client.1.vm05.stdout:8/121: dwrite d4/d6/f24 [0,4194304] 0 2026-03-09T16:15:06.909 INFO:tasks.workunit.client.1.vm05.stdout:4/157: creat d5/de/d15/d21/f3a x:0 0 0 2026-03-09T16:15:06.909 INFO:tasks.workunit.client.1.vm05.stdout:0/162: write d5/d1b/d30/f2a [1416579,7539] 0 2026-03-09T16:15:06.912 INFO:tasks.workunit.client.1.vm05.stdout:4/158: fdatasync f0 0 2026-03-09T16:15:06.913 INFO:tasks.workunit.client.1.vm05.stdout:0/163: chown d5/d1b/d30/f2a 183869 1 2026-03-09T16:15:06.915 INFO:tasks.workunit.client.1.vm05.stdout:1/166: dwrite d7/dd/d21/f2b [0,4194304] 0 2026-03-09T16:15:06.916 INFO:tasks.workunit.client.1.vm05.stdout:9/165: creat d4/d8/d2b/f2f x:0 0 0 2026-03-09T16:15:06.919 INFO:tasks.workunit.client.1.vm05.stdout:4/159: creat d5/f3b x:0 0 0 2026-03-09T16:15:06.920 INFO:tasks.workunit.client.1.vm05.stdout:1/167: chown d7/dd/de/l1b 3 1 2026-03-09T16:15:06.920 INFO:tasks.workunit.client.1.vm05.stdout:0/164: link d5/d1b/f25 d5/db/d1d/f39 0 2026-03-09T16:15:06.925 INFO:tasks.workunit.client.1.vm05.stdout:1/168: creat d7/d27/f3c x:0 0 0 2026-03-09T16:15:06.925 INFO:tasks.workunit.client.1.vm05.stdout:0/165: creat d5/d2c/f3a x:0 0 0 2026-03-09T16:15:06.926 INFO:tasks.workunit.client.1.vm05.stdout:8/122: dwrite d4/f13 [0,4194304] 0 2026-03-09T16:15:06.927 INFO:tasks.workunit.client.1.vm05.stdout:1/169: chown d7/d15/d16/f29 62473 
1 2026-03-09T16:15:06.935 INFO:tasks.workunit.client.1.vm05.stdout:9/166: dwrite d4/f2d [0,4194304] 0 2026-03-09T16:15:06.935 INFO:tasks.workunit.client.1.vm05.stdout:1/170: creat d7/dd/d21/f3d x:0 0 0 2026-03-09T16:15:06.936 INFO:tasks.workunit.client.1.vm05.stdout:0/166: chown d5/d1b/d30/c31 48059473 1 2026-03-09T16:15:06.937 INFO:tasks.workunit.client.1.vm05.stdout:4/160: read d5/de/d15/d21/d27/f36 [93478,64552] 0 2026-03-09T16:15:06.938 INFO:tasks.workunit.client.1.vm05.stdout:0/167: write d5/f8 [4855043,2060] 0 2026-03-09T16:15:06.939 INFO:tasks.workunit.client.1.vm05.stdout:1/171: truncate d7/dd/de/f2e 803156 0 2026-03-09T16:15:06.943 INFO:tasks.workunit.client.1.vm05.stdout:1/172: dread - d7/d27/f33 zero size 2026-03-09T16:15:06.946 INFO:tasks.workunit.client.1.vm05.stdout:0/168: chown d5/db/f12 402179513 1 2026-03-09T16:15:06.946 INFO:tasks.workunit.client.1.vm05.stdout:8/123: dread d4/d6/f27 [0,4194304] 0 2026-03-09T16:15:06.948 INFO:tasks.workunit.client.1.vm05.stdout:9/167: symlink d4/l30 0 2026-03-09T16:15:06.948 INFO:tasks.workunit.client.1.vm05.stdout:8/124: readlink d4/l5 0 2026-03-09T16:15:06.949 INFO:tasks.workunit.client.1.vm05.stdout:1/173: link d7/f9 d7/dd/de/f3e 0 2026-03-09T16:15:06.952 INFO:tasks.workunit.client.1.vm05.stdout:9/168: fdatasync d4/d10/f15 0 2026-03-09T16:15:06.961 INFO:tasks.workunit.client.1.vm05.stdout:8/125: creat d4/d6/db/d14/f32 x:0 0 0 2026-03-09T16:15:06.961 INFO:tasks.workunit.client.1.vm05.stdout:9/169: fdatasync d4/d10/f18 0 2026-03-09T16:15:06.961 INFO:tasks.workunit.client.1.vm05.stdout:9/170: mkdir d4/d8/d2b/d31 0 2026-03-09T16:15:06.964 INFO:tasks.workunit.client.1.vm05.stdout:9/171: dread d4/d10/f18 [0,4194304] 0 2026-03-09T16:15:06.967 INFO:tasks.workunit.client.1.vm05.stdout:0/169: dwrite d5/d1b/d30/f29 [0,4194304] 0 2026-03-09T16:15:06.967 INFO:tasks.workunit.client.1.vm05.stdout:1/174: dwrite d7/fb [0,4194304] 0 2026-03-09T16:15:06.967 INFO:tasks.workunit.client.1.vm05.stdout:1/175: readlink d7/dd/de/l1b 0 2026-03-09T16:15:06.973 INFO:tasks.workunit.client.1.vm05.stdout:9/172: dread d4/d10/f1d [0,4194304] 0 2026-03-09T16:15:06.975 INFO:tasks.workunit.client.1.vm05.stdout:9/173: unlink d4/d8/f21 0 2026-03-09T16:15:06.998 INFO:tasks.workunit.client.1.vm05.stdout:9/174: sync 2026-03-09T16:15:07.108 INFO:tasks.workunit.client.1.vm05.stdout:0/170: dread d5/d2c/f28 [0,4194304] 0 2026-03-09T16:15:07.108 INFO:tasks.workunit.client.1.vm05.stdout:0/171: dread - d5/d1b/d30/f2f zero size 2026-03-09T16:15:07.109 INFO:tasks.workunit.client.1.vm05.stdout:3/111: fsync d0/d9/f1d 0 2026-03-09T16:15:07.109 INFO:tasks.workunit.client.1.vm05.stdout:0/172: truncate d5/d11/f37 687608 0 2026-03-09T16:15:07.109 INFO:tasks.workunit.client.1.vm05.stdout:3/112: chown d0/d9/dc/f14 5242 1 2026-03-09T16:15:07.113 INFO:tasks.workunit.client.1.vm05.stdout:0/173: dread d5/d34/f35 [0,4194304] 0 2026-03-09T16:15:07.115 INFO:tasks.workunit.client.1.vm05.stdout:3/113: readlink d0/d9/dc/l20 0 2026-03-09T16:15:07.120 INFO:tasks.workunit.client.1.vm05.stdout:6/155: dread f5 [0,4194304] 0 2026-03-09T16:15:07.120 INFO:tasks.workunit.client.1.vm05.stdout:0/174: mkdir d5/d1b/d3b 0 2026-03-09T16:15:07.120 INFO:tasks.workunit.client.1.vm05.stdout:0/175: stat d5/db/d1d/f39 0 2026-03-09T16:15:07.121 INFO:tasks.workunit.client.1.vm05.stdout:0/176: write d5/d1b/d30/f29 [3046242,80362] 0 2026-03-09T16:15:07.124 INFO:tasks.workunit.client.1.vm05.stdout:3/114: rename d0/d9/dc to d0/d9/d22 0 2026-03-09T16:15:07.125 INFO:tasks.workunit.client.1.vm05.stdout:0/177: dwrite d5/d1b/d30/f2a 
[0,4194304] 0 2026-03-09T16:15:07.126 INFO:tasks.workunit.client.1.vm05.stdout:6/156: write d17/f30 [2524956,82254] 0 2026-03-09T16:15:07.129 INFO:tasks.workunit.client.1.vm05.stdout:0/178: unlink d5/db/fc 0 2026-03-09T16:15:07.130 INFO:tasks.workunit.client.1.vm05.stdout:3/115: sync 2026-03-09T16:15:07.131 INFO:tasks.workunit.client.1.vm05.stdout:3/116: write d0/d9/d22/f18 [1406630,64259] 0 2026-03-09T16:15:07.136 INFO:tasks.workunit.client.1.vm05.stdout:5/170: write d8/d18/d1b/f28 [4045715,48556] 0 2026-03-09T16:15:07.140 INFO:tasks.workunit.client.1.vm05.stdout:7/233: write d1/d2/d8/dc/f1a [1124472,86784] 0 2026-03-09T16:15:07.140 INFO:tasks.workunit.client.1.vm05.stdout:3/117: write d0/d9/fa [1020297,80840] 0 2026-03-09T16:15:07.142 INFO:tasks.workunit.client.1.vm05.stdout:3/118: truncate d0/d9/d22/f14 755467 0 2026-03-09T16:15:07.145 INFO:tasks.workunit.client.1.vm05.stdout:7/234: dwrite d1/d2/d8/dc/d1b/f42 [0,4194304] 0 2026-03-09T16:15:07.149 INFO:tasks.workunit.client.1.vm05.stdout:6/157: creat d17/f31 x:0 0 0 2026-03-09T16:15:07.152 INFO:tasks.workunit.client.1.vm05.stdout:7/235: dwrite d1/f26 [0,4194304] 0 2026-03-09T16:15:07.152 INFO:tasks.workunit.client.1.vm05.stdout:0/179: creat d5/d1b/d3b/f3c x:0 0 0 2026-03-09T16:15:07.155 INFO:tasks.workunit.client.1.vm05.stdout:0/180: chown d5/d1b/d30/c31 1 1 2026-03-09T16:15:07.169 INFO:tasks.workunit.client.1.vm05.stdout:3/119: creat d0/d9/d22/f23 x:0 0 0 2026-03-09T16:15:07.183 INFO:tasks.workunit.client.1.vm05.stdout:0/181: dread d5/f17 [0,4194304] 0 2026-03-09T16:15:07.198 INFO:tasks.workunit.client.1.vm05.stdout:3/120: readlink d0/l1c 0 2026-03-09T16:15:07.200 INFO:tasks.workunit.client.1.vm05.stdout:0/182: unlink d5/d11/f18 0 2026-03-09T16:15:07.203 INFO:tasks.workunit.client.1.vm05.stdout:2/187: truncate db/dd/f1b 1624645 0 2026-03-09T16:15:07.207 INFO:tasks.workunit.client.1.vm05.stdout:3/121: unlink d0/d9/d10/c19 0 2026-03-09T16:15:07.207 INFO:tasks.workunit.client.1.vm05.stdout:4/161: truncate f0 2751735 0 2026-03-09T16:15:07.207 INFO:tasks.workunit.client.1.vm05.stdout:4/162: readlink d5/d19/l1a 0 2026-03-09T16:15:07.208 INFO:tasks.workunit.client.1.vm05.stdout:0/183: mknod d5/d11/c3d 0 2026-03-09T16:15:07.208 INFO:tasks.workunit.client.1.vm05.stdout:4/163: read d5/de/d15/f1b [1050140,77490] 0 2026-03-09T16:15:07.209 INFO:tasks.workunit.client.1.vm05.stdout:0/184: dread - d5/d1b/f25 zero size 2026-03-09T16:15:07.210 INFO:tasks.workunit.client.1.vm05.stdout:5/171: getdents d8 0 2026-03-09T16:15:07.210 INFO:tasks.workunit.client.1.vm05.stdout:0/185: stat d5/db/l33 0 2026-03-09T16:15:07.212 INFO:tasks.workunit.client.1.vm05.stdout:2/188: creat db/f41 x:0 0 0 2026-03-09T16:15:07.213 INFO:tasks.workunit.client.1.vm05.stdout:3/122: mknod d0/d9/d22/c24 0 2026-03-09T16:15:07.217 INFO:tasks.workunit.client.1.vm05.stdout:8/126: rename d4/d6/db/d14 to d4/d6/db/df/d33 0 2026-03-09T16:15:07.218 INFO:tasks.workunit.client.1.vm05.stdout:8/127: fsync d4/d6/db/fd 0 2026-03-09T16:15:07.227 INFO:tasks.workunit.client.1.vm05.stdout:5/172: mkdir d8/d3d 0 2026-03-09T16:15:07.228 INFO:tasks.workunit.client.1.vm05.stdout:0/186: write d5/d11/f23 [705366,96021] 0 2026-03-09T16:15:07.234 INFO:tasks.workunit.client.1.vm05.stdout:0/187: dwrite d5/d1b/d30/f29 [0,4194304] 0 2026-03-09T16:15:07.241 INFO:tasks.workunit.client.1.vm05.stdout:0/188: dread d5/d11/f1e [0,4194304] 0 2026-03-09T16:15:07.250 INFO:tasks.workunit.client.1.vm05.stdout:8/128: unlink d4/d6/c7 0 2026-03-09T16:15:07.250 INFO:tasks.workunit.client.1.vm05.stdout:2/189: dread fa 
[4194304,4194304] 0 2026-03-09T16:15:07.261 INFO:tasks.workunit.client.1.vm05.stdout:6/158: dwrite d17/f2d [0,4194304] 0 2026-03-09T16:15:07.263 INFO:tasks.workunit.client.1.vm05.stdout:6/159: fsync d17/f2e 0 2026-03-09T16:15:07.277 INFO:tasks.workunit.client.1.vm05.stdout:0/189: symlink d5/d1b/d30/l3e 0 2026-03-09T16:15:07.277 INFO:tasks.workunit.client.1.vm05.stdout:0/190: chown d5/d1b/d30 7691 1 2026-03-09T16:15:07.281 INFO:tasks.workunit.client.1.vm05.stdout:8/129: symlink d4/d6/db/dc/l34 0 2026-03-09T16:15:07.282 INFO:tasks.workunit.client.1.vm05.stdout:4/164: write d5/d19/f32 [937487,17755] 0 2026-03-09T16:15:07.284 INFO:tasks.workunit.client.1.vm05.stdout:3/123: rmdir d0/d1b 0 2026-03-09T16:15:07.290 INFO:tasks.workunit.client.1.vm05.stdout:2/190: symlink db/dd/d15/l42 0 2026-03-09T16:15:07.294 INFO:tasks.workunit.client.1.vm05.stdout:1/176: rename f0 to d7/f3f 0 2026-03-09T16:15:07.294 INFO:tasks.workunit.client.1.vm05.stdout:2/191: truncate db/dd/d15/d1f/d20/f3d 124596 0 2026-03-09T16:15:07.294 INFO:tasks.workunit.client.1.vm05.stdout:1/177: fdatasync d7/d15/d16/f29 0 2026-03-09T16:15:07.300 INFO:tasks.workunit.client.1.vm05.stdout:6/160: write fa [7732048,58507] 0 2026-03-09T16:15:07.302 INFO:tasks.workunit.client.1.vm05.stdout:0/191: dread d5/d11/f1e [0,4194304] 0 2026-03-09T16:15:07.304 INFO:tasks.workunit.client.1.vm05.stdout:0/192: chown d5/d1b/d30/c26 22 1 2026-03-09T16:15:07.306 INFO:tasks.workunit.client.1.vm05.stdout:8/130: write d4/d6/db/dc/f30 [4117504,72027] 0 2026-03-09T16:15:07.306 INFO:tasks.workunit.client.1.vm05.stdout:8/131: dread - d4/d6/f1b zero size 2026-03-09T16:15:07.307 INFO:tasks.workunit.client.1.vm05.stdout:5/173: stat d8/f29 0 2026-03-09T16:15:07.309 INFO:tasks.workunit.client.1.vm05.stdout:6/161: dwrite d17/f2d [0,4194304] 0 2026-03-09T16:15:07.320 INFO:tasks.workunit.client.1.vm05.stdout:4/165: mkdir d5/de/d15/d21/d27/d3c 0 2026-03-09T16:15:07.333 INFO:tasks.workunit.client.1.vm05.stdout:3/124: symlink d0/d9/l25 0 2026-03-09T16:15:07.334 INFO:tasks.workunit.client.1.vm05.stdout:3/125: write d0/d9/d22/f23 [803013,23645] 0 2026-03-09T16:15:07.342 INFO:tasks.workunit.client.1.vm05.stdout:9/175: rename d4/f2d to d4/d8/f32 0 2026-03-09T16:15:07.349 INFO:tasks.workunit.client.1.vm05.stdout:5/174: chown d8/d18/c1a 430432 1 2026-03-09T16:15:07.353 INFO:tasks.workunit.client.1.vm05.stdout:8/132: rmdir d4/d6/db/df 39 2026-03-09T16:15:07.361 INFO:tasks.workunit.client.1.vm05.stdout:7/236: rename d1/d47 to d1/d2/d8/dc/d1b/d30/d4b 0 2026-03-09T16:15:07.363 INFO:tasks.workunit.client.1.vm05.stdout:9/176: symlink d4/d10/l33 0 2026-03-09T16:15:07.364 INFO:tasks.workunit.client.1.vm05.stdout:1/178: mknod d7/dd/d21/d2d/d3a/c40 0 2026-03-09T16:15:07.364 INFO:tasks.workunit.client.1.vm05.stdout:9/177: dread d4/d10/f1d [0,4194304] 0 2026-03-09T16:15:07.365 INFO:tasks.workunit.client.1.vm05.stdout:1/179: write d7/dd/f1f [1039188,62644] 0 2026-03-09T16:15:07.366 INFO:tasks.workunit.client.1.vm05.stdout:0/193: mknod d5/c3f 0 2026-03-09T16:15:07.366 INFO:tasks.workunit.client.1.vm05.stdout:5/175: fsync d8/d1d/f1e 0 2026-03-09T16:15:07.367 INFO:tasks.workunit.client.1.vm05.stdout:5/176: read - d8/d18/d1b/f30 zero size 2026-03-09T16:15:07.368 INFO:tasks.workunit.client.1.vm05.stdout:9/178: dread d4/d10/f22 [0,4194304] 0 2026-03-09T16:15:07.369 INFO:tasks.workunit.client.1.vm05.stdout:5/177: write d8/d18/d1b/f36 [897377,129889] 0 2026-03-09T16:15:07.369 INFO:tasks.workunit.client.1.vm05.stdout:9/179: chown d4/l30 13417 1 2026-03-09T16:15:07.371 
INFO:tasks.workunit.client.1.vm05.stdout:4/166: creat d5/de/d15/d21/d27/d3c/f3d x:0 0 0 2026-03-09T16:15:07.372 INFO:tasks.workunit.client.1.vm05.stdout:9/180: write d4/d10/f18 [1093588,40511] 0 2026-03-09T16:15:07.373 INFO:tasks.workunit.client.1.vm05.stdout:0/194: dread d5/d1b/d30/f2a [0,4194304] 0 2026-03-09T16:15:07.373 INFO:tasks.workunit.client.1.vm05.stdout:7/237: creat d1/d2/d8/dc/d15/d3e/f4c x:0 0 0 2026-03-09T16:15:07.373 INFO:tasks.workunit.client.1.vm05.stdout:9/181: write d4/d8/f28 [740831,125027] 0 2026-03-09T16:15:07.374 INFO:tasks.workunit.client.1.vm05.stdout:2/192: creat db/dd/f43 x:0 0 0 2026-03-09T16:15:07.379 INFO:tasks.workunit.client.1.vm05.stdout:0/195: read d5/db/f12 [879897,54162] 0 2026-03-09T16:15:07.383 INFO:tasks.workunit.client.1.vm05.stdout:9/182: dread d4/d10/f15 [0,4194304] 0 2026-03-09T16:15:07.385 INFO:tasks.workunit.client.1.vm05.stdout:1/180: creat d7/dd/d21/d2d/d3a/f41 x:0 0 0 2026-03-09T16:15:07.397 INFO:tasks.workunit.client.1.vm05.stdout:8/133: mknod d4/c35 0 2026-03-09T16:15:07.400 INFO:tasks.workunit.client.1.vm05.stdout:6/162: creat d17/f32 x:0 0 0 2026-03-09T16:15:07.401 INFO:tasks.workunit.client.1.vm05.stdout:6/163: chown d17/d22 80845 1 2026-03-09T16:15:07.402 INFO:tasks.workunit.client.1.vm05.stdout:6/164: write d17/f30 [11698,72678] 0 2026-03-09T16:15:07.405 INFO:tasks.workunit.client.1.vm05.stdout:4/167: creat d5/f3e x:0 0 0 2026-03-09T16:15:07.406 INFO:tasks.workunit.client.1.vm05.stdout:4/168: read d5/f2d [107126,45387] 0 2026-03-09T16:15:07.409 INFO:tasks.workunit.client.1.vm05.stdout:6/165: dwrite d17/f2e [4194304,4194304] 0 2026-03-09T16:15:07.413 INFO:tasks.workunit.client.1.vm05.stdout:5/178: dread d8/d18/f3a [0,4194304] 0 2026-03-09T16:15:07.414 INFO:tasks.workunit.client.1.vm05.stdout:5/179: chown d8/d18/c1a 1 1 2026-03-09T16:15:07.419 INFO:tasks.workunit.client.1.vm05.stdout:4/169: sync 2026-03-09T16:15:07.428 INFO:tasks.workunit.client.1.vm05.stdout:7/238: dwrite d1/d2/d8/dc/f1e [0,4194304] 0 2026-03-09T16:15:07.438 INFO:tasks.workunit.client.1.vm05.stdout:9/183: creat d4/d8/d2b/f34 x:0 0 0 2026-03-09T16:15:07.439 INFO:tasks.workunit.client.1.vm05.stdout:1/181: chown d7/dd/de/l17 2853 1 2026-03-09T16:15:07.443 INFO:tasks.workunit.client.1.vm05.stdout:1/182: dwrite d7/dd/f19 [0,4194304] 0 2026-03-09T16:15:07.460 INFO:tasks.workunit.client.1.vm05.stdout:3/126: getdents d0 0 2026-03-09T16:15:07.461 INFO:tasks.workunit.client.1.vm05.stdout:0/196: dwrite d5/db/f12 [0,4194304] 0 2026-03-09T16:15:07.467 INFO:tasks.workunit.client.1.vm05.stdout:7/239: mknod d1/d2/d8/dc/d15/c4d 0 2026-03-09T16:15:07.483 INFO:tasks.workunit.client.1.vm05.stdout:6/166: link d17/f1c d17/d1d/f33 0 2026-03-09T16:15:07.484 INFO:tasks.workunit.client.1.vm05.stdout:3/127: unlink d0/d9/d22/f21 0 2026-03-09T16:15:07.484 INFO:tasks.workunit.client.1.vm05.stdout:0/197: creat d5/d11/f40 x:0 0 0 2026-03-09T16:15:07.485 INFO:tasks.workunit.client.1.vm05.stdout:4/170: mkdir d5/de/d15/d3f 0 2026-03-09T16:15:07.490 INFO:tasks.workunit.client.1.vm05.stdout:5/180: link c0 d8/d18/c3e 0 2026-03-09T16:15:07.495 INFO:tasks.workunit.client.1.vm05.stdout:4/171: symlink d5/de/d15/d21/d27/d3c/l40 0 2026-03-09T16:15:07.495 INFO:tasks.workunit.client.1.vm05.stdout:4/172: write d5/de/f23 [309170,129421] 0 2026-03-09T16:15:07.495 INFO:tasks.workunit.client.1.vm05.stdout:8/134: link d4/d6/c1a d4/d6/db/df/c36 0 2026-03-09T16:15:07.497 INFO:tasks.workunit.client.1.vm05.stdout:0/198: sync 2026-03-09T16:15:07.501 INFO:tasks.workunit.client.1.vm05.stdout:6/167: unlink c14 0 
2026-03-09T16:15:07.504 INFO:tasks.workunit.client.1.vm05.stdout:5/181: write d8/f13 [4864698,109449] 0 2026-03-09T16:15:07.506 INFO:tasks.workunit.client.1.vm05.stdout:9/184: rename d4/d8 to d4/d10/d35 0 2026-03-09T16:15:07.507 INFO:tasks.workunit.client.1.vm05.stdout:9/185: write d4/d10/d35/d2b/f2f [537845,114515] 0 2026-03-09T16:15:07.508 INFO:tasks.workunit.client.1.vm05.stdout:9/186: truncate d4/d10/f22 4837483 0 2026-03-09T16:15:07.509 INFO:tasks.workunit.client.1.vm05.stdout:9/187: readlink d4/l30 0 2026-03-09T16:15:07.509 INFO:tasks.workunit.client.1.vm05.stdout:5/182: dwrite d8/d18/d1b/f36 [0,4194304] 0 2026-03-09T16:15:07.516 INFO:tasks.workunit.client.1.vm05.stdout:0/199: write d5/d1b/f25 [80015,19457] 0 2026-03-09T16:15:07.517 INFO:tasks.workunit.client.1.vm05.stdout:1/183: getdents d7/dd/d21/d2d 0 2026-03-09T16:15:07.519 INFO:tasks.workunit.client.1.vm05.stdout:2/193: dwrite db/dd/f1b [0,4194304] 0 2026-03-09T16:15:07.524 INFO:tasks.workunit.client.1.vm05.stdout:6/168: rmdir d17/d22/d27 39 2026-03-09T16:15:07.530 INFO:tasks.workunit.client.1.vm05.stdout:7/240: link d1/d2/d8/dc/d15/c3a d1/c4e 0 2026-03-09T16:15:07.537 INFO:tasks.workunit.client.1.vm05.stdout:3/128: rename d0/d9/d22/ce to d0/d9/c26 0 2026-03-09T16:15:07.538 INFO:tasks.workunit.client.1.vm05.stdout:3/129: write d0/d9/f1d [4997133,68898] 0 2026-03-09T16:15:07.538 INFO:tasks.workunit.client.1.vm05.stdout:3/130: chown d0/d9/d22/f23 1 1 2026-03-09T16:15:07.551 INFO:tasks.workunit.client.1.vm05.stdout:8/135: chown d4/d6/db/df/c36 246 1 2026-03-09T16:15:07.551 INFO:tasks.workunit.client.1.vm05.stdout:0/200: stat d5/ce 0 2026-03-09T16:15:07.553 INFO:tasks.workunit.client.1.vm05.stdout:8/136: dread d4/d6/db/dc/f30 [0,4194304] 0 2026-03-09T16:15:07.555 INFO:tasks.workunit.client.1.vm05.stdout:1/184: unlink d7/l25 0 2026-03-09T16:15:07.556 INFO:tasks.workunit.client.1.vm05.stdout:1/185: dread - d7/dd/d21/d2d/d3a/f41 zero size 2026-03-09T16:15:07.560 INFO:tasks.workunit.client.1.vm05.stdout:5/183: dread d8/d18/d1b/d2e/f3c [0,4194304] 0 2026-03-09T16:15:07.561 INFO:tasks.workunit.client.1.vm05.stdout:5/184: chown d8/d18/d1b/f36 0 1 2026-03-09T16:15:07.562 INFO:tasks.workunit.client.1.vm05.stdout:5/185: write d8/d18/d1b/f32 [457142,75621] 0 2026-03-09T16:15:07.564 INFO:tasks.workunit.client.1.vm05.stdout:5/186: dread d8/d18/d1b/d2e/f35 [0,4194304] 0 2026-03-09T16:15:07.564 INFO:tasks.workunit.client.1.vm05.stdout:2/194: write db/dd/d15/d1f/f36 [3295002,34342] 0 2026-03-09T16:15:07.570 INFO:tasks.workunit.client.1.vm05.stdout:3/131: symlink d0/d9/d10/l27 0 2026-03-09T16:15:07.570 INFO:tasks.workunit.client.1.vm05.stdout:3/132: chown d0/d9/d10/l27 458658 1 2026-03-09T16:15:07.572 INFO:tasks.workunit.client.1.vm05.stdout:4/173: link d5/d19/l33 d5/de/d15/d21/d27/l41 0 2026-03-09T16:15:07.573 INFO:tasks.workunit.client.1.vm05.stdout:0/201: creat d5/d2c/f41 x:0 0 0 2026-03-09T16:15:07.578 INFO:tasks.workunit.client.1.vm05.stdout:2/195: mkdir db/dd/d15/d1f/d20/d23/d44 0 2026-03-09T16:15:07.579 INFO:tasks.workunit.client.1.vm05.stdout:6/169: mkdir d17/d22/d27/d34 0 2026-03-09T16:15:07.580 INFO:tasks.workunit.client.1.vm05.stdout:7/241: truncate d1/d2/f22 382223 0 2026-03-09T16:15:07.580 INFO:tasks.workunit.client.1.vm05.stdout:3/133: fsync d0/fd 0 2026-03-09T16:15:07.586 INFO:tasks.workunit.client.1.vm05.stdout:0/202: rmdir d5/d11 39 2026-03-09T16:15:07.589 INFO:tasks.workunit.client.1.vm05.stdout:8/137: symlink d4/d6/db/l37 0 2026-03-09T16:15:07.590 INFO:tasks.workunit.client.1.vm05.stdout:1/186: creat d7/dd/d21/d3b/f42 x:0 0 0 
2026-03-09T16:15:07.590 INFO:tasks.workunit.client.1.vm05.stdout:5/187: creat d8/d3d/f3f x:0 0 0 2026-03-09T16:15:07.590 INFO:tasks.workunit.client.1.vm05.stdout:1/187: chown d7/d15/f22 0 1 2026-03-09T16:15:07.591 INFO:tasks.workunit.client.1.vm05.stdout:2/196: mknod db/dd/d15/d1f/c45 0 2026-03-09T16:15:07.591 INFO:tasks.workunit.client.1.vm05.stdout:2/197: stat db/l13 0 2026-03-09T16:15:07.591 INFO:tasks.workunit.client.1.vm05.stdout:8/138: rename d4/d6/db/df to d4/d6/db/df/d38 22 2026-03-09T16:15:07.592 INFO:tasks.workunit.client.1.vm05.stdout:3/134: unlink d0/d9/d22/f23 0 2026-03-09T16:15:07.592 INFO:tasks.workunit.client.1.vm05.stdout:9/188: getdents d4/d10/d35/d2b 0 2026-03-09T16:15:07.593 INFO:tasks.workunit.client.1.vm05.stdout:3/135: write d0/d9/fa [399170,77337] 0 2026-03-09T16:15:07.594 INFO:tasks.workunit.client.1.vm05.stdout:8/139: chown d4/d6/f29 24947193 1 2026-03-09T16:15:07.594 INFO:tasks.workunit.client.1.vm05.stdout:0/203: readlink d5/d11/l19 0 2026-03-09T16:15:07.594 INFO:tasks.workunit.client.1.vm05.stdout:9/189: dread d4/d10/f1d [0,4194304] 0 2026-03-09T16:15:07.594 INFO:tasks.workunit.client.1.vm05.stdout:6/170: fdatasync d17/d22/d27/f2a 0 2026-03-09T16:15:07.594 INFO:tasks.workunit.client.1.vm05.stdout:5/188: chown d8/d18/c19 37289 1 2026-03-09T16:15:07.596 INFO:tasks.workunit.client.1.vm05.stdout:0/204: truncate d5/d11/f37 1367227 0 2026-03-09T16:15:07.596 INFO:tasks.workunit.client.1.vm05.stdout:0/205: stat d5/db/d1d 0 2026-03-09T16:15:07.601 INFO:tasks.workunit.client.1.vm05.stdout:1/188: fsync d7/f3f 0 2026-03-09T16:15:07.602 INFO:tasks.workunit.client.1.vm05.stdout:2/198: mkdir db/dd/d15/d46 0 2026-03-09T16:15:07.604 INFO:tasks.workunit.client.1.vm05.stdout:3/136: dwrite d0/d9/f1d [0,4194304] 0 2026-03-09T16:15:07.609 INFO:tasks.workunit.client.1.vm05.stdout:8/140: rmdir d4/d6 39 2026-03-09T16:15:07.612 INFO:tasks.workunit.client.1.vm05.stdout:2/199: dwrite db/dd/d15/d1f/f2b [0,4194304] 0 2026-03-09T16:15:07.615 INFO:tasks.workunit.client.1.vm05.stdout:9/190: mkdir d4/d10/d35/d36 0 2026-03-09T16:15:07.616 INFO:tasks.workunit.client.1.vm05.stdout:5/189: mkdir d8/d3d/d40 0 2026-03-09T16:15:07.618 INFO:tasks.workunit.client.1.vm05.stdout:0/206: dread d5/f17 [0,4194304] 0 2026-03-09T16:15:07.620 INFO:tasks.workunit.client.1.vm05.stdout:1/189: mknod d7/dd/d21/d3b/c43 0 2026-03-09T16:15:07.625 INFO:tasks.workunit.client.1.vm05.stdout:4/174: getdents d5/de/d15/d21 0 2026-03-09T16:15:07.636 INFO:tasks.workunit.client.1.vm05.stdout:2/200: unlink db/dd/c3e 0 2026-03-09T16:15:07.637 INFO:tasks.workunit.client.1.vm05.stdout:0/207: creat d5/d1b/d3b/f42 x:0 0 0 2026-03-09T16:15:07.648 INFO:tasks.workunit.client.1.vm05.stdout:7/242: getdents d1/d2/d8 0 2026-03-09T16:15:07.651 INFO:tasks.workunit.client.1.vm05.stdout:7/243: dread d1/d2/d8/dc/d14/f41 [0,4194304] 0 2026-03-09T16:15:07.658 INFO:tasks.workunit.client.1.vm05.stdout:5/190: creat d8/d3d/d40/f41 x:0 0 0 2026-03-09T16:15:07.663 INFO:tasks.workunit.client.1.vm05.stdout:4/175: creat d5/de/d15/d21/d39/f42 x:0 0 0 2026-03-09T16:15:07.663 INFO:tasks.workunit.client.1.vm05.stdout:1/190: rmdir d7 39 2026-03-09T16:15:07.665 INFO:tasks.workunit.client.1.vm05.stdout:5/191: readlink d8/l17 0 2026-03-09T16:15:07.665 INFO:tasks.workunit.client.1.vm05.stdout:3/137: write d0/fd [439244,77450] 0 2026-03-09T16:15:07.665 INFO:tasks.workunit.client.1.vm05.stdout:7/244: read d1/d2/d8/dc/d14/f35 [253578,89068] 0 2026-03-09T16:15:07.670 INFO:tasks.workunit.client.1.vm05.stdout:7/245: read d1/d2/d8/dc/d1b/f42 [2788673,116003] 0 
2026-03-09T16:15:07.670 INFO:tasks.workunit.client.1.vm05.stdout:0/208: truncate d5/f7 7658412 0 2026-03-09T16:15:07.677 INFO:tasks.workunit.client.1.vm05.stdout:9/191: link d4/d10/c1a d4/d10/d35/d2b/d31/c37 0 2026-03-09T16:15:07.679 INFO:tasks.workunit.client.1.vm05.stdout:0/209: sync 2026-03-09T16:15:07.680 INFO:tasks.workunit.client.1.vm05.stdout:6/171: dwrite d17/f1b [0,4194304] 0 2026-03-09T16:15:07.682 INFO:tasks.workunit.client.1.vm05.stdout:9/192: write d4/d10/f15 [2144769,80984] 0 2026-03-09T16:15:07.685 INFO:tasks.workunit.client.1.vm05.stdout:5/192: dwrite d8/d18/d1b/f30 [0,4194304] 0 2026-03-09T16:15:07.691 INFO:tasks.workunit.client.1.vm05.stdout:4/176: dwrite d5/de/d15/d21/d27/d3c/f3d [0,4194304] 0 2026-03-09T16:15:07.695 INFO:tasks.workunit.client.1.vm05.stdout:2/201: truncate db/dd/d15/d1f/f24 496816 0 2026-03-09T16:15:07.697 INFO:tasks.workunit.client.1.vm05.stdout:3/138: dwrite d0/d9/fa [0,4194304] 0 2026-03-09T16:15:07.699 INFO:tasks.workunit.client.1.vm05.stdout:8/141: dwrite d4/d6/db/dc/f30 [0,4194304] 0 2026-03-09T16:15:07.699 INFO:tasks.workunit.client.1.vm05.stdout:3/139: dread d0/fd [0,4194304] 0 2026-03-09T16:15:07.711 INFO:tasks.workunit.client.1.vm05.stdout:8/142: dwrite d4/d6/db/df/d33/d15/f22 [0,4194304] 0 2026-03-09T16:15:07.712 INFO:tasks.workunit.client.1.vm05.stdout:3/140: dwrite d0/d9/f1d [0,4194304] 0 2026-03-09T16:15:07.742 INFO:tasks.workunit.client.1.vm05.stdout:7/246: dread d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:07.772 INFO:tasks.workunit.client.1.vm05.stdout:7/247: dread d1/d2/d8/dc/d14/f35 [0,4194304] 0 2026-03-09T16:15:07.854 INFO:tasks.workunit.client.1.vm05.stdout:1/191: mkdir d7/dd/d21/d44 0 2026-03-09T16:15:07.855 INFO:tasks.workunit.client.1.vm05.stdout:1/192: truncate d7/dd/d21/d2d/d3a/f41 436438 0 2026-03-09T16:15:07.870 INFO:tasks.workunit.client.1.vm05.stdout:0/210: fdatasync d5/f17 0 2026-03-09T16:15:07.871 INFO:tasks.workunit.client.1.vm05.stdout:0/211: dread - d5/d11/f40 zero size 2026-03-09T16:15:07.877 INFO:tasks.workunit.client.1.vm05.stdout:4/177: symlink d5/de/l43 0 2026-03-09T16:15:07.886 INFO:tasks.workunit.client.1.vm05.stdout:2/202: creat db/dd/d15/d1f/d21/f47 x:0 0 0 2026-03-09T16:15:07.901 INFO:tasks.workunit.client.1.vm05.stdout:3/141: symlink d0/l28 0 2026-03-09T16:15:07.904 INFO:tasks.workunit.client.1.vm05.stdout:7/248: unlink d1/d2/d8/dc/d18/f2e 0 2026-03-09T16:15:07.905 INFO:tasks.workunit.client.1.vm05.stdout:1/193: fsync d7/f9 0 2026-03-09T16:15:07.909 INFO:tasks.workunit.client.1.vm05.stdout:0/212: mknod d5/d34/c43 0 2026-03-09T16:15:07.910 INFO:tasks.workunit.client.1.vm05.stdout:6/172: mknod d17/d22/d27/d34/c35 0 2026-03-09T16:15:07.914 INFO:tasks.workunit.client.1.vm05.stdout:6/173: dread d17/f1c [0,4194304] 0 2026-03-09T16:15:07.914 INFO:tasks.workunit.client.1.vm05.stdout:9/193: write d4/f6 [1714653,51259] 0 2026-03-09T16:15:07.915 INFO:tasks.workunit.client.1.vm05.stdout:5/193: write d8/d18/d1b/f36 [4364717,42830] 0 2026-03-09T16:15:07.915 INFO:tasks.workunit.client.1.vm05.stdout:9/194: write d4/f6 [564047,59244] 0 2026-03-09T16:15:07.916 INFO:tasks.workunit.client.1.vm05.stdout:5/194: chown d8/d18/d1b/f32 3 1 2026-03-09T16:15:07.921 INFO:tasks.workunit.client.1.vm05.stdout:8/143: write d4/d6/f9 [1347522,59036] 0 2026-03-09T16:15:07.923 INFO:tasks.workunit.client.1.vm05.stdout:8/144: chown d4/d6/db/df/d33/d15/f22 350 1 2026-03-09T16:15:07.924 INFO:tasks.workunit.client.1.vm05.stdout:8/145: fsync d4/d6/f1f 0 2026-03-09T16:15:07.925 INFO:tasks.workunit.client.1.vm05.stdout:8/146: stat d4/d6/db/dc/d2e 0 
2026-03-09T16:15:07.926 INFO:tasks.workunit.client.1.vm05.stdout:2/203: dwrite db/dd/f32 [0,4194304] 0 2026-03-09T16:15:07.927 INFO:tasks.workunit.client.1.vm05.stdout:2/204: fdatasync db/f12 0 2026-03-09T16:15:07.932 INFO:tasks.workunit.client.1.vm05.stdout:4/178: unlink d5/de/d15/d21/d27/l41 0 2026-03-09T16:15:07.944 INFO:tasks.workunit.client.1.vm05.stdout:9/195: mkdir d4/d10/d35/d2b/d38 0 2026-03-09T16:15:07.946 INFO:tasks.workunit.client.1.vm05.stdout:5/195: write d8/d18/d1b/d2e/f3c [146348,55037] 0 2026-03-09T16:15:07.947 INFO:tasks.workunit.client.1.vm05.stdout:5/196: fdatasync d8/d1d/f21 0 2026-03-09T16:15:07.957 INFO:tasks.workunit.client.1.vm05.stdout:8/147: symlink d4/d6/db/df/d33/d15/l39 0 2026-03-09T16:15:07.971 INFO:tasks.workunit.client.1.vm05.stdout:0/213: dwrite d5/d2c/f28 [0,4194304] 0 2026-03-09T16:15:07.973 INFO:tasks.workunit.client.1.vm05.stdout:1/194: mkdir d7/d15/d45 0 2026-03-09T16:15:07.977 INFO:tasks.workunit.client.1.vm05.stdout:0/214: dwrite d5/d1b/d30/f2f [0,4194304] 0 2026-03-09T16:15:07.978 INFO:tasks.workunit.client.1.vm05.stdout:5/197: symlink d8/d18/d1b/l42 0 2026-03-09T16:15:07.978 INFO:tasks.workunit.client.1.vm05.stdout:0/215: chown d5/d1b 1623 1 2026-03-09T16:15:07.978 INFO:tasks.workunit.client.1.vm05.stdout:3/142: creat d0/d9/d10/f29 x:0 0 0 2026-03-09T16:15:07.980 INFO:tasks.workunit.client.1.vm05.stdout:3/143: truncate d0/d9/f1d 5712737 0 2026-03-09T16:15:07.982 INFO:tasks.workunit.client.1.vm05.stdout:7/249: creat d1/d2/f4f x:0 0 0 2026-03-09T16:15:07.987 INFO:tasks.workunit.client.1.vm05.stdout:8/148: rename d4/d6/db/df/d33 to d4/d6/d3a 0 2026-03-09T16:15:07.987 INFO:tasks.workunit.client.1.vm05.stdout:5/198: dwrite d8/d18/d1b/f30 [0,4194304] 0 2026-03-09T16:15:07.988 INFO:tasks.workunit.client.1.vm05.stdout:8/149: chown d4/d6/db/dc/f17 254225893 1 2026-03-09T16:15:07.994 INFO:tasks.workunit.client.1.vm05.stdout:5/199: write d8/d18/d1b/f31 [2248474,115134] 0 2026-03-09T16:15:08.000 INFO:tasks.workunit.client.1.vm05.stdout:7/250: dread d1/d2/d8/dc/f1a [0,4194304] 0 2026-03-09T16:15:08.019 INFO:tasks.workunit.client.1.vm05.stdout:4/179: link d5/f6 d5/de/d15/d21/d39/f44 0 2026-03-09T16:15:08.021 INFO:tasks.workunit.client.1.vm05.stdout:9/196: fsync d4/f20 0 2026-03-09T16:15:08.032 INFO:tasks.workunit.client.1.vm05.stdout:3/144: creat d0/d9/d22/f2a x:0 0 0 2026-03-09T16:15:08.032 INFO:tasks.workunit.client.1.vm05.stdout:3/145: dread - d0/d9/d10/f29 zero size 2026-03-09T16:15:08.035 INFO:tasks.workunit.client.1.vm05.stdout:8/150: mkdir d4/d6/db/dc/d3b 0 2026-03-09T16:15:08.035 INFO:tasks.workunit.client.1.vm05.stdout:8/151: chown d4/d6/f9 449 1 2026-03-09T16:15:08.036 INFO:tasks.workunit.client.1.vm05.stdout:8/152: read - d4/d6/db/dc/f2a zero size 2026-03-09T16:15:08.040 INFO:tasks.workunit.client.1.vm05.stdout:5/200: rename d8/d3d/d40 to d8/d18/d1b/d2e/d43 0 2026-03-09T16:15:08.041 INFO:tasks.workunit.client.1.vm05.stdout:5/201: write d8/f13 [1163303,114517] 0 2026-03-09T16:15:08.045 INFO:tasks.workunit.client.1.vm05.stdout:7/251: symlink d1/d19/l50 0 2026-03-09T16:15:08.046 INFO:tasks.workunit.client.1.vm05.stdout:1/195: creat d7/dd/d21/d44/f46 x:0 0 0 2026-03-09T16:15:08.052 INFO:tasks.workunit.client.1.vm05.stdout:6/174: getdents d17/d22/d27/d34 0 2026-03-09T16:15:08.058 INFO:tasks.workunit.client.1.vm05.stdout:4/180: dwrite d5/d19/f1f [0,4194304] 0 2026-03-09T16:15:08.059 INFO:tasks.workunit.client.1.vm05.stdout:6/175: dwrite fa [4194304,4194304] 0 2026-03-09T16:15:08.070 INFO:tasks.workunit.client.1.vm05.stdout:2/205: dwrite db/dd/d15/d1f/f25 
[0,4194304] 0 2026-03-09T16:15:08.070 INFO:tasks.workunit.client.1.vm05.stdout:8/153: mkdir d4/d6/d3a/d3c 0 2026-03-09T16:15:08.072 INFO:tasks.workunit.client.1.vm05.stdout:8/154: fdatasync d4/d6/d3a/f28 0 2026-03-09T16:15:08.082 INFO:tasks.workunit.client.1.vm05.stdout:3/146: dwrite d0/d9/d22/f14 [0,4194304] 0 2026-03-09T16:15:08.082 INFO:tasks.workunit.client.1.vm05.stdout:3/147: chown d0 3 1 2026-03-09T16:15:08.087 INFO:tasks.workunit.client.1.vm05.stdout:3/148: chown d0/d9/d22/c24 28617861 1 2026-03-09T16:15:08.096 INFO:tasks.workunit.client.1.vm05.stdout:5/202: creat d8/d1d/f44 x:0 0 0 2026-03-09T16:15:08.099 INFO:tasks.workunit.client.1.vm05.stdout:1/196: unlink d7/dd/d21/d3b/c43 0 2026-03-09T16:15:08.099 INFO:tasks.workunit.client.1.vm05.stdout:9/197: symlink d4/d10/d35/d36/l39 0 2026-03-09T16:15:08.112 INFO:tasks.workunit.client.1.vm05.stdout:6/176: mknod d17/d22/d27/d34/c36 0 2026-03-09T16:15:08.112 INFO:tasks.workunit.client.1.vm05.stdout:2/206: rename db/dd/f43 to db/dd/d15/f48 0 2026-03-09T16:15:08.112 INFO:tasks.workunit.client.1.vm05.stdout:4/181: write d5/de/f24 [1094546,54502] 0 2026-03-09T16:15:08.113 INFO:tasks.workunit.client.1.vm05.stdout:0/216: link d5/c15 d5/d11/c44 0 2026-03-09T16:15:08.113 INFO:tasks.workunit.client.1.vm05.stdout:3/149: creat d0/d9/f2b x:0 0 0 2026-03-09T16:15:08.113 INFO:tasks.workunit.client.1.vm05.stdout:7/252: getdents d1/d2/d8/dc/d1b/d30/d4b 0 2026-03-09T16:15:08.114 INFO:tasks.workunit.client.1.vm05.stdout:1/197: rmdir d7/d15 39 2026-03-09T16:15:08.115 INFO:tasks.workunit.client.1.vm05.stdout:3/150: write d0/d9/d10/f29 [202184,88711] 0 2026-03-09T16:15:08.124 INFO:tasks.workunit.client.1.vm05.stdout:6/177: fsync fa 0 2026-03-09T16:15:08.125 INFO:tasks.workunit.client.1.vm05.stdout:8/155: symlink d4/d6/db/dc/d3b/l3d 0 2026-03-09T16:15:08.125 INFO:tasks.workunit.client.1.vm05.stdout:6/178: fsync d17/f30 0 2026-03-09T16:15:08.125 INFO:tasks.workunit.client.1.vm05.stdout:4/182: creat d5/f45 x:0 0 0 2026-03-09T16:15:08.125 INFO:tasks.workunit.client.1.vm05.stdout:8/156: chown d4/d6/db/dc/l34 1826480 1 2026-03-09T16:15:08.125 INFO:tasks.workunit.client.1.vm05.stdout:6/179: chown cd 117113090 1 2026-03-09T16:15:08.134 INFO:tasks.workunit.client.1.vm05.stdout:2/207: creat db/dd/d15/d1f/f49 x:0 0 0 2026-03-09T16:15:08.134 INFO:tasks.workunit.client.1.vm05.stdout:2/208: write db/f12 [4445626,17155] 0 2026-03-09T16:15:08.137 INFO:tasks.workunit.client.1.vm05.stdout:7/253: creat d1/d2/d8/d31/f51 x:0 0 0 2026-03-09T16:15:08.139 INFO:tasks.workunit.client.1.vm05.stdout:1/198: symlink d7/dd/d21/d44/l47 0 2026-03-09T16:15:08.141 INFO:tasks.workunit.client.1.vm05.stdout:1/199: truncate d7/dd/d21/d2d/d3a/f41 943546 0 2026-03-09T16:15:08.143 INFO:tasks.workunit.client.1.vm05.stdout:0/217: mknod d5/c45 0 2026-03-09T16:15:08.143 INFO:tasks.workunit.client.1.vm05.stdout:0/218: chown d5/d1b/l2b 1026213 1 2026-03-09T16:15:08.144 INFO:tasks.workunit.client.1.vm05.stdout:0/219: write d5/d1b/d30/f29 [47112,54442] 0 2026-03-09T16:15:08.145 INFO:tasks.workunit.client.1.vm05.stdout:4/183: creat d5/de/d15/d21/d39/f46 x:0 0 0 2026-03-09T16:15:08.147 INFO:tasks.workunit.client.1.vm05.stdout:1/200: rmdir d7 39 2026-03-09T16:15:08.149 INFO:tasks.workunit.client.1.vm05.stdout:0/220: dread d5/d11/f23 [0,4194304] 0 2026-03-09T16:15:08.149 INFO:tasks.workunit.client.1.vm05.stdout:0/221: stat d5/d2c 0 2026-03-09T16:15:08.150 INFO:tasks.workunit.client.1.vm05.stdout:6/180: creat d17/d22/f37 x:0 0 0 2026-03-09T16:15:08.151 INFO:tasks.workunit.client.1.vm05.stdout:8/157: creat d4/f3e 
x:0 0 0 2026-03-09T16:15:08.151 INFO:tasks.workunit.client.1.vm05.stdout:6/181: chown d17/d1d/c1f 461898546 1 2026-03-09T16:15:08.152 INFO:tasks.workunit.client.1.vm05.stdout:0/222: rmdir d5/db/d1d 39 2026-03-09T16:15:08.152 INFO:tasks.workunit.client.1.vm05.stdout:8/158: truncate d4/d6/f1b 258540 0 2026-03-09T16:15:08.152 INFO:tasks.workunit.client.1.vm05.stdout:4/184: mknod d5/d19/d37/c47 0 2026-03-09T16:15:08.152 INFO:tasks.workunit.client.1.vm05.stdout:8/159: dread - d4/d6/f29 zero size 2026-03-09T16:15:08.153 INFO:tasks.workunit.client.1.vm05.stdout:2/209: rmdir db/dd/d15/d1f/d20/d23/d44 0 2026-03-09T16:15:08.156 INFO:tasks.workunit.client.1.vm05.stdout:6/182: creat d17/d1d/f38 x:0 0 0 2026-03-09T16:15:08.158 INFO:tasks.workunit.client.1.vm05.stdout:2/210: dread f5 [0,4194304] 0 2026-03-09T16:15:08.160 INFO:tasks.workunit.client.1.vm05.stdout:1/201: sync 2026-03-09T16:15:08.160 INFO:tasks.workunit.client.1.vm05.stdout:7/254: sync 2026-03-09T16:15:08.160 INFO:tasks.workunit.client.1.vm05.stdout:4/185: creat d5/d19/f48 x:0 0 0 2026-03-09T16:15:08.162 INFO:tasks.workunit.client.1.vm05.stdout:0/223: dread d5/d1b/d30/f2a [0,4194304] 0 2026-03-09T16:15:08.163 INFO:tasks.workunit.client.1.vm05.stdout:5/203: rmdir d8/d18/d1b 39 2026-03-09T16:15:08.163 INFO:tasks.workunit.client.1.vm05.stdout:4/186: dread - d5/de/d15/f34 zero size 2026-03-09T16:15:08.163 INFO:tasks.workunit.client.1.vm05.stdout:3/151: dread d0/d9/d10/f29 [0,4194304] 0 2026-03-09T16:15:08.165 INFO:tasks.workunit.client.1.vm05.stdout:4/187: readlink d5/de/d15/d21/d27/d3c/l40 0 2026-03-09T16:15:08.165 INFO:tasks.workunit.client.1.vm05.stdout:2/211: dread - db/dd/d15/d1f/d21/f29 zero size 2026-03-09T16:15:08.165 INFO:tasks.workunit.client.1.vm05.stdout:4/188: stat d5/d19/f32 0 2026-03-09T16:15:08.166 INFO:tasks.workunit.client.1.vm05.stdout:2/212: fdatasync db/dd/f32 0 2026-03-09T16:15:08.169 INFO:tasks.workunit.client.1.vm05.stdout:9/198: write d4/d10/f1d [1536292,92876] 0 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: pgmap v13: 65 pgs: 65 active+clean; 2.1 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 21 MiB/s rd, 51 MiB/s wr, 193 op/s 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 
vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: Upgrade: Setting container_image for all mgr 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm03.gbgzmu"}]: dispatch 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm03.gbgzmu"}]': finished 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm05.dygxfv"}]: dispatch 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm05.dygxfv"}]': finished 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' 
entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:08.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:07 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.184 INFO:tasks.workunit.client.1.vm05.stdout:8/160: dread d4/d6/f9 [0,4194304] 0 2026-03-09T16:15:08.185 INFO:tasks.workunit.client.1.vm05.stdout:8/161: dread - d4/f23 zero size 2026-03-09T16:15:08.187 INFO:tasks.workunit.client.1.vm05.stdout:8/162: chown d4/d6/db/dc/d2e 957 1 2026-03-09T16:15:08.188 INFO:tasks.workunit.client.1.vm05.stdout:8/163: write d4/d6/db/dc/f30 [1142428,56621] 0 2026-03-09T16:15:08.191 INFO:tasks.workunit.client.1.vm05.stdout:8/164: chown d4/d6/d3a/d15 327 1 2026-03-09T16:15:08.197 INFO:tasks.workunit.client.1.vm05.stdout:1/202: dwrite d7/dd/d21/d2d/d3a/f41 [0,4194304] 0 2026-03-09T16:15:08.204 INFO:tasks.workunit.client.1.vm05.stdout:5/204: write d8/d18/d1b/f31 [439473,39344] 0 2026-03-09T16:15:08.206 INFO:tasks.workunit.client.1.vm05.stdout:6/183: write d17/f1c [1881471,124831] 0 2026-03-09T16:15:08.206 INFO:tasks.workunit.client.1.vm05.stdout:5/205: dread d8/d18/d1b/f36 [0,4194304] 0 2026-03-09T16:15:08.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:15:08.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:15:08.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: pgmap v13: 65 pgs: 65 active+clean; 2.1 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 21 MiB/s rd, 51 MiB/s wr, 193 op/s 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: Upgrade: Setting container_image for all mgr 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: 
from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm03.gbgzmu"}]: dispatch 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm03.gbgzmu"}]': finished 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm05.dygxfv"}]: dispatch 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm05.dygxfv"}]': finished 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:08.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:07 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:08.240 INFO:tasks.workunit.client.1.vm05.stdout:8/165: rename d4/d6/db/fd to d4/d6/d3a/d3c/f3f 0 2026-03-09T16:15:08.248 
INFO:tasks.workunit.client.1.vm05.stdout:3/152: link d0/d9/fa d0/d9/f2c 0 2026-03-09T16:15:08.249 INFO:tasks.workunit.client.1.vm05.stdout:3/153: fsync d0/d9/f1d 0 2026-03-09T16:15:08.249 INFO:tasks.workunit.client.1.vm05.stdout:3/154: chown d0/d9/d10/l27 14526 1 2026-03-09T16:15:08.253 INFO:tasks.workunit.client.1.vm05.stdout:2/213: creat db/dd/d15/d3f/f4a x:0 0 0 2026-03-09T16:15:08.253 INFO:tasks.workunit.client.1.vm05.stdout:0/224: dwrite d5/f7 [0,4194304] 0 2026-03-09T16:15:08.255 INFO:tasks.workunit.client.1.vm05.stdout:2/214: chown db/l13 36 1 2026-03-09T16:15:08.260 INFO:tasks.workunit.client.1.vm05.stdout:0/225: dwrite d5/d1b/d30/f2f [0,4194304] 0 2026-03-09T16:15:08.293 INFO:tasks.workunit.client.1.vm05.stdout:4/189: creat d5/de/d15/d21/d31/f49 x:0 0 0 2026-03-09T16:15:08.294 INFO:tasks.workunit.client.1.vm05.stdout:4/190: write d5/f3e [102421,59148] 0 2026-03-09T16:15:08.340 INFO:tasks.workunit.client.1.vm05.stdout:5/206: mkdir d8/d18/d1b/d2e/d43/d45 0 2026-03-09T16:15:08.340 INFO:tasks.workunit.client.1.vm05.stdout:9/199: mknod d4/c3a 0 2026-03-09T16:15:08.341 INFO:tasks.workunit.client.1.vm05.stdout:5/207: chown d8/d18/d1b/d2e/f35 3 1 2026-03-09T16:15:08.344 INFO:tasks.workunit.client.1.vm05.stdout:5/208: dwrite d8/d18/d1b/d2e/d43/f41 [0,4194304] 0 2026-03-09T16:15:08.364 INFO:tasks.workunit.client.1.vm05.stdout:1/203: mkdir d7/dd/d21/d39/d48 0 2026-03-09T16:15:08.371 INFO:tasks.workunit.client.1.vm05.stdout:3/155: write d0/d9/fa [298900,28783] 0 2026-03-09T16:15:08.377 INFO:tasks.workunit.client.1.vm05.stdout:2/215: symlink db/dd/d15/d1f/l4b 0 2026-03-09T16:15:08.381 INFO:tasks.workunit.client.1.vm05.stdout:6/184: symlink d17/l39 0 2026-03-09T16:15:08.383 INFO:tasks.workunit.client.1.vm05.stdout:2/216: dwrite db/dd/d15/d1f/f49 [0,4194304] 0 2026-03-09T16:15:08.385 INFO:tasks.workunit.client.1.vm05.stdout:9/200: mknod d4/d10/d35/d36/c3b 0 2026-03-09T16:15:08.394 INFO:tasks.workunit.client.1.vm05.stdout:6/185: sync 2026-03-09T16:15:08.408 INFO:tasks.workunit.client.1.vm05.stdout:5/209: mknod d8/d1d/c46 0 2026-03-09T16:15:08.423 INFO:tasks.workunit.client.1.vm05.stdout:1/204: rename d7/dd/de/l20 to d7/d27/l49 0 2026-03-09T16:15:08.424 INFO:tasks.workunit.client.1.vm05.stdout:1/205: chown d7/dd/d21/d3b 18535879 1 2026-03-09T16:15:08.442 INFO:tasks.workunit.client.1.vm05.stdout:4/191: dwrite d5/de/d15/d21/d39/f44 [0,4194304] 0 2026-03-09T16:15:08.444 INFO:tasks.workunit.client.1.vm05.stdout:4/192: chown d5/d19/f48 5748 1 2026-03-09T16:15:08.460 INFO:tasks.workunit.client.1.vm05.stdout:7/255: getdents d1/d2/d8/dc/d1b/d30 0 2026-03-09T16:15:08.461 INFO:tasks.workunit.client.1.vm05.stdout:7/256: stat d1/d2/d8/dc/d14/f44 0 2026-03-09T16:15:08.462 INFO:tasks.workunit.client.1.vm05.stdout:7/257: write d1/d2/d8/dc/d15/d3e/f4c [401607,92113] 0 2026-03-09T16:15:08.462 INFO:tasks.workunit.client.1.vm05.stdout:7/258: dread - d1/d2/d8/dc/d15/d3e/f49 zero size 2026-03-09T16:15:08.469 INFO:tasks.workunit.client.1.vm05.stdout:0/226: unlink d5/db/d1d/f39 0 2026-03-09T16:15:08.469 INFO:tasks.workunit.client.1.vm05.stdout:0/227: chown d5/db/l33 2896 1 2026-03-09T16:15:08.469 INFO:tasks.workunit.client.1.vm05.stdout:0/228: chown d5/d11/f1e 178 1 2026-03-09T16:15:08.474 INFO:tasks.workunit.client.1.vm05.stdout:2/217: mkdir db/dd/d15/d4c 0 2026-03-09T16:15:08.482 INFO:tasks.workunit.client.1.vm05.stdout:8/166: truncate d4/d6/db/fe 4060697 0 2026-03-09T16:15:08.484 INFO:tasks.workunit.client.1.vm05.stdout:5/210: write d8/d18/d1b/d2e/f35 [1141945,52971] 0 2026-03-09T16:15:08.487 
INFO:tasks.workunit.client.1.vm05.stdout:6/186: dwrite f5 [0,4194304] 0 2026-03-09T16:15:08.487 INFO:tasks.workunit.client.1.vm05.stdout:4/193: mkdir d5/de/d4a 0 2026-03-09T16:15:08.488 INFO:tasks.workunit.client.1.vm05.stdout:4/194: chown d5/d19/d37/c47 0 1 2026-03-09T16:15:08.491 INFO:tasks.workunit.client.1.vm05.stdout:4/195: dwrite d5/f3e [0,4194304] 0 2026-03-09T16:15:08.495 INFO:tasks.workunit.client.1.vm05.stdout:4/196: write d5/de/d15/d21/d27/f29 [83279,5701] 0 2026-03-09T16:15:08.496 INFO:tasks.workunit.client.1.vm05.stdout:7/259: rename d1/d2/d8/dc/f48 to d1/d2/d8/dc/d18/f52 0 2026-03-09T16:15:08.497 INFO:tasks.workunit.client.1.vm05.stdout:5/211: mkdir d8/d18/d1b/d47 0 2026-03-09T16:15:08.499 INFO:tasks.workunit.client.1.vm05.stdout:0/229: write d5/d11/f1e [968444,61705] 0 2026-03-09T16:15:08.499 INFO:tasks.workunit.client.1.vm05.stdout:4/197: sync 2026-03-09T16:15:08.504 INFO:tasks.workunit.client.1.vm05.stdout:2/218: mknod db/dd/d15/d4c/c4d 0 2026-03-09T16:15:08.505 INFO:tasks.workunit.client.1.vm05.stdout:2/219: readlink db/dd/d15/d1f/d20/d23/l1d 0 2026-03-09T16:15:08.506 INFO:tasks.workunit.client.1.vm05.stdout:8/167: getdents d4/d6/db/dc/d2e 0 2026-03-09T16:15:08.507 INFO:tasks.workunit.client.1.vm05.stdout:8/168: read - d4/f23 zero size 2026-03-09T16:15:08.508 INFO:tasks.workunit.client.1.vm05.stdout:9/201: creat d4/f3c x:0 0 0 2026-03-09T16:15:08.508 INFO:tasks.workunit.client.1.vm05.stdout:5/212: unlink d8/f29 0 2026-03-09T16:15:08.510 INFO:tasks.workunit.client.1.vm05.stdout:1/206: rename d7/dd/d21/d44/l47 to d7/d15/d45/l4a 0 2026-03-09T16:15:08.510 INFO:tasks.workunit.client.1.vm05.stdout:0/230: dwrite d5/db/f12 [4194304,4194304] 0 2026-03-09T16:15:08.512 INFO:tasks.workunit.client.1.vm05.stdout:4/198: fdatasync d5/de/d15/f25 0 2026-03-09T16:15:08.514 INFO:tasks.workunit.client.1.vm05.stdout:9/202: symlink d4/d10/l3d 0 2026-03-09T16:15:08.517 INFO:tasks.workunit.client.1.vm05.stdout:8/169: unlink d4/d6/db/df/f1d 0 2026-03-09T16:15:08.519 INFO:tasks.workunit.client.1.vm05.stdout:8/170: write d4/d6/db/dc/f30 [4026010,118786] 0 2026-03-09T16:15:08.519 INFO:tasks.workunit.client.1.vm05.stdout:0/231: dwrite d5/d1b/d3b/f42 [0,4194304] 0 2026-03-09T16:15:08.523 INFO:tasks.workunit.client.1.vm05.stdout:0/232: fdatasync d5/f8 0 2026-03-09T16:15:08.524 INFO:tasks.workunit.client.1.vm05.stdout:2/220: creat db/dd/d15/d46/f4e x:0 0 0 2026-03-09T16:15:08.526 INFO:tasks.workunit.client.1.vm05.stdout:0/233: truncate d5/d1b/d3b/f3c 52172 0 2026-03-09T16:15:08.533 INFO:tasks.workunit.client.1.vm05.stdout:4/199: rmdir d5/d19/d37 39 2026-03-09T16:15:08.534 INFO:tasks.workunit.client.1.vm05.stdout:4/200: readlink d5/l2b 0 2026-03-09T16:15:08.535 INFO:tasks.workunit.client.1.vm05.stdout:4/201: fsync d5/de/d15/d21/d27/d3c/f3d 0 2026-03-09T16:15:08.536 INFO:tasks.workunit.client.1.vm05.stdout:5/213: dread d8/f13 [0,4194304] 0 2026-03-09T16:15:08.537 INFO:tasks.workunit.client.1.vm05.stdout:4/202: dread d5/d19/f1f [0,4194304] 0 2026-03-09T16:15:08.538 INFO:tasks.workunit.client.1.vm05.stdout:4/203: write d5/de/d15/d21/d31/f49 [281978,83371] 0 2026-03-09T16:15:08.543 INFO:tasks.workunit.client.1.vm05.stdout:4/204: write d5/de/d15/d21/f2a [885836,108143] 0 2026-03-09T16:15:08.543 INFO:tasks.workunit.client.1.vm05.stdout:9/203: symlink d4/d10/l3e 0 2026-03-09T16:15:08.545 INFO:tasks.workunit.client.1.vm05.stdout:4/205: dwrite d5/de/f23 [0,4194304] 0 2026-03-09T16:15:08.546 INFO:tasks.workunit.client.1.vm05.stdout:4/206: chown d5/f2e 1042002023 1 2026-03-09T16:15:08.547 
INFO:tasks.workunit.client.1.vm05.stdout:8/171: mkdir d4/d6/d3a/d40 0 2026-03-09T16:15:08.549 INFO:tasks.workunit.client.1.vm05.stdout:7/260: link d1/d2/cd d1/d2/d8/dc/d1b/c53 0 2026-03-09T16:15:08.549 INFO:tasks.workunit.client.1.vm05.stdout:8/172: chown l3 1936 1 2026-03-09T16:15:08.549 INFO:tasks.workunit.client.1.vm05.stdout:7/261: fdatasync d1/d2/d8/dc/f3b 0 2026-03-09T16:15:08.551 INFO:tasks.workunit.client.1.vm05.stdout:8/173: fsync d4/d6/f24 0 2026-03-09T16:15:08.552 INFO:tasks.workunit.client.1.vm05.stdout:2/221: rmdir db/dd/d15/d1f/d20 39 2026-03-09T16:15:08.553 INFO:tasks.workunit.client.1.vm05.stdout:0/234: symlink d5/d1b/d30/l46 0 2026-03-09T16:15:08.553 INFO:tasks.workunit.client.1.vm05.stdout:2/222: write db/dd/d15/d1f/d21/f47 [623766,2288] 0 2026-03-09T16:15:08.560 INFO:tasks.workunit.client.1.vm05.stdout:1/207: rename d7/dd/f11 to d7/f4b 0 2026-03-09T16:15:08.562 INFO:tasks.workunit.client.1.vm05.stdout:1/208: fdatasync d7/dd/d21/f3d 0 2026-03-09T16:15:08.562 INFO:tasks.workunit.client.1.vm05.stdout:0/235: write d5/d11/f37 [419283,118923] 0 2026-03-09T16:15:08.565 INFO:tasks.workunit.client.1.vm05.stdout:2/223: sync 2026-03-09T16:15:08.566 INFO:tasks.workunit.client.1.vm05.stdout:8/174: dwrite d4/d6/f1b [0,4194304] 0 2026-03-09T16:15:08.570 INFO:tasks.workunit.client.1.vm05.stdout:8/175: dread - d4/f23 zero size 2026-03-09T16:15:08.573 INFO:tasks.workunit.client.1.vm05.stdout:9/204: dwrite d4/f17 [0,4194304] 0 2026-03-09T16:15:08.585 INFO:tasks.workunit.client.1.vm05.stdout:7/262: rename d1/d2/d8/dc/d14/f44 to d1/d2/d11/f54 0 2026-03-09T16:15:08.585 INFO:tasks.workunit.client.1.vm05.stdout:1/209: dread d7/d15/f22 [0,4194304] 0 2026-03-09T16:15:08.591 INFO:tasks.workunit.client.1.vm05.stdout:4/207: mknod d5/de/d4a/c4b 0 2026-03-09T16:15:08.592 INFO:tasks.workunit.client.1.vm05.stdout:0/236: creat d5/d1b/f47 x:0 0 0 2026-03-09T16:15:08.592 INFO:tasks.workunit.client.1.vm05.stdout:5/214: mkdir d8/d18/d1b/d47/d48 0 2026-03-09T16:15:08.592 INFO:tasks.workunit.client.1.vm05.stdout:9/205: creat d4/d10/f3f x:0 0 0 2026-03-09T16:15:08.593 INFO:tasks.workunit.client.1.vm05.stdout:7/263: fsync d1/d2/f5 0 2026-03-09T16:15:08.593 INFO:tasks.workunit.client.1.vm05.stdout:1/210: unlink d7/dd/de/l17 0 2026-03-09T16:15:08.595 INFO:tasks.workunit.client.1.vm05.stdout:2/224: dread db/dd/d15/d1f/f36 [0,4194304] 0 2026-03-09T16:15:08.597 INFO:tasks.workunit.client.1.vm05.stdout:4/208: rmdir d5/de/d15/d21/d39 39 2026-03-09T16:15:08.601 INFO:tasks.workunit.client.1.vm05.stdout:4/209: write d5/f45 [276205,21092] 0 2026-03-09T16:15:08.601 INFO:tasks.workunit.client.1.vm05.stdout:5/215: rename d8/d18/d1b/d2e/c3b to d8/d18/d1b/c49 0 2026-03-09T16:15:08.601 INFO:tasks.workunit.client.1.vm05.stdout:0/237: rmdir d5/d1b/d30 39 2026-03-09T16:15:08.602 INFO:tasks.workunit.client.1.vm05.stdout:1/211: creat d7/d27/f4c x:0 0 0 2026-03-09T16:15:08.604 INFO:tasks.workunit.client.1.vm05.stdout:0/238: dread - d5/d1b/f47 zero size 2026-03-09T16:15:08.605 INFO:tasks.workunit.client.1.vm05.stdout:9/206: creat d4/d10/d35/d36/f40 x:0 0 0 2026-03-09T16:15:08.605 INFO:tasks.workunit.client.1.vm05.stdout:2/225: mknod db/dd/d15/d4c/c4f 0 2026-03-09T16:15:08.605 INFO:tasks.workunit.client.1.vm05.stdout:5/216: symlink d8/d18/d1b/d2e/l4a 0 2026-03-09T16:15:08.609 INFO:tasks.workunit.client.1.vm05.stdout:7/264: unlink d1/d2/d8/dc/d14/f35 0 2026-03-09T16:15:08.610 INFO:tasks.workunit.client.1.vm05.stdout:3/156: truncate d0/d9/d22/f14 2899467 0 2026-03-09T16:15:08.614 INFO:tasks.workunit.client.1.vm05.stdout:1/212: sync 
2026-03-09T16:15:08.616 INFO:tasks.workunit.client.1.vm05.stdout:2/226: dwrite db/dd/d15/d1f/f24 [0,4194304] 0 2026-03-09T16:15:08.622 INFO:tasks.workunit.client.1.vm05.stdout:6/187: dwrite d17/d22/d27/f2a [0,4194304] 0 2026-03-09T16:15:08.628 INFO:tasks.workunit.client.1.vm05.stdout:2/227: dwrite db/dd/d15/d1f/d21/f47 [0,4194304] 0 2026-03-09T16:15:08.628 INFO:tasks.workunit.client.1.vm05.stdout:5/217: mknod d8/d18/d1b/c4b 0 2026-03-09T16:15:08.630 INFO:tasks.workunit.client.1.vm05.stdout:7/265: rename d1/d2/d8/l10 to d1/d2/d8/dc/d18/l55 0 2026-03-09T16:15:08.631 INFO:tasks.workunit.client.1.vm05.stdout:3/157: chown d0/d9/c26 1 1 2026-03-09T16:15:08.632 INFO:tasks.workunit.client.1.vm05.stdout:9/207: symlink d4/l41 0 2026-03-09T16:15:08.632 INFO:tasks.workunit.client.1.vm05.stdout:0/239: mkdir d5/db/d48 0 2026-03-09T16:15:08.633 INFO:tasks.workunit.client.1.vm05.stdout:1/213: creat d7/d27/f4d x:0 0 0 2026-03-09T16:15:08.634 INFO:tasks.workunit.client.1.vm05.stdout:2/228: symlink db/dd/d15/d46/l50 0 2026-03-09T16:15:08.644 INFO:tasks.workunit.client.1.vm05.stdout:1/214: dwrite d7/dd/f19 [4194304,4194304] 0 2026-03-09T16:15:08.644 INFO:tasks.workunit.client.1.vm05.stdout:6/188: mknod d17/c3a 0 2026-03-09T16:15:08.645 INFO:tasks.workunit.client.1.vm05.stdout:0/240: dread d5/d1b/f25 [0,4194304] 0 2026-03-09T16:15:08.647 INFO:tasks.workunit.client.1.vm05.stdout:0/241: chown d5/d11/f23 57 1 2026-03-09T16:15:08.654 INFO:tasks.workunit.client.1.vm05.stdout:9/208: dwrite d4/d10/d35/d2b/f2c [0,4194304] 0 2026-03-09T16:15:08.654 INFO:tasks.workunit.client.1.vm05.stdout:2/229: rename db/dd/d15/d1f/d20/d23/f1a to db/dd/d15/f51 0 2026-03-09T16:15:08.655 INFO:tasks.workunit.client.1.vm05.stdout:0/242: mkdir d5/d2c/d49 0 2026-03-09T16:15:08.655 INFO:tasks.workunit.client.1.vm05.stdout:0/243: readlink d5/d1b/l2b 0 2026-03-09T16:15:08.658 INFO:tasks.workunit.client.1.vm05.stdout:2/230: symlink db/dd/d15/d1f/d20/d23/l52 0 2026-03-09T16:15:08.660 INFO:tasks.workunit.client.1.vm05.stdout:9/209: rename d4/l16 to d4/d10/l42 0 2026-03-09T16:15:08.660 INFO:tasks.workunit.client.1.vm05.stdout:2/231: dread f5 [0,4194304] 0 2026-03-09T16:15:08.663 INFO:tasks.workunit.client.1.vm05.stdout:2/232: creat db/dd/d15/d1f/d20/f53 x:0 0 0 2026-03-09T16:15:08.668 INFO:tasks.workunit.client.1.vm05.stdout:4/210: dread d5/f10 [0,4194304] 0 2026-03-09T16:15:08.668 INFO:tasks.workunit.client.1.vm05.stdout:1/215: dread d7/d15/d16/f1c [0,4194304] 0 2026-03-09T16:15:08.671 INFO:tasks.workunit.client.1.vm05.stdout:1/216: read d7/dd/de/f23 [3745568,114599] 0 2026-03-09T16:15:08.672 INFO:tasks.workunit.client.1.vm05.stdout:2/233: fdatasync db/dd/f1b 0 2026-03-09T16:15:08.674 INFO:tasks.workunit.client.1.vm05.stdout:2/234: stat db/dd/d15/d1f/f49 0 2026-03-09T16:15:08.682 INFO:tasks.workunit.client.1.vm05.stdout:9/210: rename d4/d10/d35/d36/f40 to d4/f43 0 2026-03-09T16:15:08.684 INFO:tasks.workunit.client.1.vm05.stdout:2/235: symlink db/dd/d15/d4c/l54 0 2026-03-09T16:15:08.687 INFO:tasks.workunit.client.1.vm05.stdout:2/236: mkdir db/dd/d15/d3f/d55 0 2026-03-09T16:15:08.690 INFO:tasks.workunit.client.1.vm05.stdout:2/237: mkdir db/dd/d15/d4c/d56 0 2026-03-09T16:15:08.695 INFO:tasks.workunit.client.1.vm05.stdout:4/211: link d5/c20 d5/d19/c4c 0 2026-03-09T16:15:08.696 INFO:tasks.workunit.client.1.vm05.stdout:1/217: rename d7/dd/de/c1e to d7/d15/c4e 0 2026-03-09T16:15:08.700 INFO:tasks.workunit.client.1.vm05.stdout:4/212: creat d5/de/d15/d21/d27/d3c/f4d x:0 0 0 2026-03-09T16:15:08.701 INFO:tasks.workunit.client.1.vm05.stdout:4/213: write 
d5/de/d15/d21/d27/d3c/f4d [942785,110138] 0 2026-03-09T16:15:08.701 INFO:tasks.workunit.client.1.vm05.stdout:4/214: readlink d5/l1d 0 2026-03-09T16:15:08.702 INFO:tasks.workunit.client.1.vm05.stdout:1/218: mknod d7/dd/c4f 0 2026-03-09T16:15:08.703 INFO:tasks.workunit.client.1.vm05.stdout:4/215: mkdir d5/de/d4a/d4e 0 2026-03-09T16:15:08.704 INFO:tasks.workunit.client.1.vm05.stdout:9/211: sync 2026-03-09T16:15:08.704 INFO:tasks.workunit.client.1.vm05.stdout:9/212: stat d4/d10/d35/d2b 0 2026-03-09T16:15:08.707 INFO:tasks.workunit.client.1.vm05.stdout:9/213: creat d4/d10/d35/f44 x:0 0 0 2026-03-09T16:15:08.708 INFO:tasks.workunit.client.1.vm05.stdout:9/214: creat d4/d10/d35/d2b/f45 x:0 0 0 2026-03-09T16:15:08.708 INFO:tasks.workunit.client.1.vm05.stdout:4/216: link c3 d5/de/d2f/c4f 0 2026-03-09T16:15:08.709 INFO:tasks.workunit.client.1.vm05.stdout:4/217: chown d5/de/d4a/d4e 183060173 1 2026-03-09T16:15:08.709 INFO:tasks.workunit.client.1.vm05.stdout:9/215: mknod d4/d10/c46 0 2026-03-09T16:15:08.712 INFO:tasks.workunit.client.1.vm05.stdout:9/216: mknod d4/c47 0 2026-03-09T16:15:08.721 INFO:tasks.workunit.client.1.vm05.stdout:9/217: dread - d4/f3c zero size 2026-03-09T16:15:08.721 INFO:tasks.workunit.client.1.vm05.stdout:4/218: link d5/f3e d5/de/d15/d21/f50 0 2026-03-09T16:15:08.721 INFO:tasks.workunit.client.1.vm05.stdout:9/218: chown d4/d10/d35/c19 1868826 1 2026-03-09T16:15:08.721 INFO:tasks.workunit.client.1.vm05.stdout:9/219: chown d4/d10/d35/d2b 4 1 2026-03-09T16:15:08.721 INFO:tasks.workunit.client.1.vm05.stdout:4/219: dwrite d5/d19/f48 [0,4194304] 0 2026-03-09T16:15:08.721 INFO:tasks.workunit.client.1.vm05.stdout:9/220: mkdir d4/d10/d35/d36/d48 0 2026-03-09T16:15:08.725 INFO:tasks.workunit.client.1.vm05.stdout:9/221: creat d4/d10/d35/d36/f49 x:0 0 0 2026-03-09T16:15:08.746 INFO:tasks.workunit.client.1.vm05.stdout:5/218: read d8/d18/d1b/f2d [1515228,74048] 0 2026-03-09T16:15:08.751 INFO:tasks.workunit.client.1.vm05.stdout:5/219: dwrite d8/d18/d1b/f32 [0,4194304] 0 2026-03-09T16:15:08.798 INFO:tasks.workunit.client.1.vm05.stdout:4/220: creat d5/d19/d37/f51 x:0 0 0 2026-03-09T16:15:08.803 INFO:tasks.workunit.client.1.vm05.stdout:5/220: dread - d8/d1d/f21 zero size 2026-03-09T16:15:08.804 INFO:tasks.workunit.client.1.vm05.stdout:5/221: fdatasync d8/d18/d1b/d2e/f35 0 2026-03-09T16:15:08.804 INFO:tasks.workunit.client.1.vm05.stdout:9/222: rename d4/d10/d35/f28 to d4/f4a 0 2026-03-09T16:15:08.805 INFO:tasks.workunit.client.1.vm05.stdout:5/222: chown d8/d18/d1b/f30 8132 1 2026-03-09T16:15:08.805 INFO:tasks.workunit.client.1.vm05.stdout:9/223: write d4/d10/f3f [624466,67804] 0 2026-03-09T16:15:08.806 INFO:tasks.workunit.client.1.vm05.stdout:9/224: truncate d4/d10/f15 4641517 0 2026-03-09T16:15:08.811 INFO:tasks.workunit.client.1.vm05.stdout:5/223: write f1 [8243724,4876] 0 2026-03-09T16:15:08.811 INFO:tasks.workunit.client.1.vm05.stdout:5/224: read d8/f11 [100185,124772] 0 2026-03-09T16:15:08.813 INFO:tasks.workunit.client.1.vm05.stdout:5/225: fsync d8/d18/d1b/f36 0 2026-03-09T16:15:08.814 INFO:tasks.workunit.client.1.vm05.stdout:5/226: rmdir d8/d18/d1b/d2e/d43 39 2026-03-09T16:15:08.818 INFO:tasks.workunit.client.1.vm05.stdout:9/225: sync 2026-03-09T16:15:08.818 INFO:tasks.workunit.client.1.vm05.stdout:5/227: sync 2026-03-09T16:15:08.822 INFO:tasks.workunit.client.1.vm05.stdout:9/226: dwrite d4/d10/f3f [0,4194304] 0 2026-03-09T16:15:08.826 INFO:tasks.workunit.client.1.vm05.stdout:5/228: rmdir d8/d18/d1b/d47 39 2026-03-09T16:15:08.826 INFO:tasks.workunit.client.1.vm05.stdout:5/229: read - 
d8/d1d/f21 zero size 2026-03-09T16:15:08.826 INFO:tasks.workunit.client.1.vm05.stdout:5/230: chown d8/c23 15080 1 2026-03-09T16:15:08.827 INFO:tasks.workunit.client.1.vm05.stdout:9/227: creat d4/d10/d35/d2b/d38/f4b x:0 0 0 2026-03-09T16:15:08.828 INFO:tasks.workunit.client.1.vm05.stdout:9/228: chown d4/d10/d35/c1b 13 1 2026-03-09T16:15:08.829 INFO:tasks.workunit.client.1.vm05.stdout:5/231: creat d8/d18/d1b/d47/f4c x:0 0 0 2026-03-09T16:15:08.833 INFO:tasks.workunit.client.1.vm05.stdout:5/232: stat c0 0 2026-03-09T16:15:08.835 INFO:tasks.workunit.client.1.vm05.stdout:5/233: creat d8/d18/d1b/d2e/f4d x:0 0 0 2026-03-09T16:15:08.837 INFO:tasks.workunit.client.1.vm05.stdout:5/234: mkdir d8/d18/d1b/d47/d4e 0 2026-03-09T16:15:08.839 INFO:tasks.workunit.client.1.vm05.stdout:5/235: getdents d8/d18/d1b/d2e/d43 0 2026-03-09T16:15:08.841 INFO:tasks.workunit.client.1.vm05.stdout:5/236: creat d8/d18/d1b/d47/d4e/f4f x:0 0 0 2026-03-09T16:15:08.842 INFO:tasks.workunit.client.1.vm05.stdout:5/237: stat d8/d18/f20 0 2026-03-09T16:15:08.844 INFO:tasks.workunit.client.1.vm05.stdout:5/238: rename d8/ce to d8/d18/d1b/d47/d4e/c50 0 2026-03-09T16:15:08.846 INFO:tasks.workunit.client.1.vm05.stdout:5/239: creat d8/d18/d1b/d2e/f51 x:0 0 0 2026-03-09T16:15:08.847 INFO:tasks.workunit.client.1.vm05.stdout:5/240: creat d8/d18/d1b/d2e/f52 x:0 0 0 2026-03-09T16:15:08.887 INFO:tasks.workunit.client.1.vm05.stdout:8/176: truncate d4/d6/db/dc/f17 2069776 0 2026-03-09T16:15:08.901 INFO:tasks.workunit.client.1.vm05.stdout:8/177: creat d4/d6/db/dc/f41 x:0 0 0 2026-03-09T16:15:08.904 INFO:tasks.workunit.client.1.vm05.stdout:8/178: dread d4/d6/d3a/d15/f22 [0,4194304] 0 2026-03-09T16:15:08.925 INFO:tasks.workunit.client.1.vm05.stdout:3/158: truncate d0/d9/f2c 1136856 0 2026-03-09T16:15:08.928 INFO:tasks.workunit.client.1.vm05.stdout:6/189: getdents d17 0 2026-03-09T16:15:08.929 INFO:tasks.workunit.client.1.vm05.stdout:6/190: write d17/d1d/f1e [1387054,17973] 0 2026-03-09T16:15:08.934 INFO:tasks.workunit.client.1.vm05.stdout:3/159: mknod d0/d9/d22/c2d 0 2026-03-09T16:15:08.935 INFO:tasks.workunit.client.1.vm05.stdout:3/160: chown d0/d9/c26 55644749 1 2026-03-09T16:15:08.940 INFO:tasks.workunit.client.1.vm05.stdout:6/191: creat d17/f3b x:0 0 0 2026-03-09T16:15:08.942 INFO:tasks.workunit.client.1.vm05.stdout:0/244: truncate d5/db/f12 5723396 0 2026-03-09T16:15:08.943 INFO:tasks.workunit.client.1.vm05.stdout:3/161: creat d0/d9/d22/f2e x:0 0 0 2026-03-09T16:15:08.946 INFO:tasks.workunit.client.1.vm05.stdout:7/266: link d1/d2/d8/d31/f39 d1/d2/d8/dc/d18/f56 0 2026-03-09T16:15:08.947 INFO:tasks.workunit.client.1.vm05.stdout:9/229: fsync d4/f43 0 2026-03-09T16:15:08.950 INFO:tasks.workunit.client.1.vm05.stdout:0/245: rename d5/d11/f23 to d5/db/d1d/f4a 0 2026-03-09T16:15:08.952 INFO:tasks.workunit.client.1.vm05.stdout:2/238: rmdir db/dd 39 2026-03-09T16:15:08.954 INFO:tasks.workunit.client.1.vm05.stdout:3/162: readlink d0/l7 0 2026-03-09T16:15:08.959 INFO:tasks.workunit.client.1.vm05.stdout:6/192: creat d17/d22/d27/f3c x:0 0 0 2026-03-09T16:15:08.960 INFO:tasks.workunit.client.1.vm05.stdout:6/193: dread - d17/f3b zero size 2026-03-09T16:15:08.961 INFO:tasks.workunit.client.1.vm05.stdout:1/219: write d7/dd/de/f23 [1295571,52202] 0 2026-03-09T16:15:08.969 INFO:tasks.workunit.client.1.vm05.stdout:2/239: chown db/dd/d15/d1f/d21 3298583 1 2026-03-09T16:15:08.973 INFO:tasks.workunit.client.1.vm05.stdout:3/163: creat d0/d9/f2f x:0 0 0 2026-03-09T16:15:08.974 INFO:tasks.workunit.client.1.vm05.stdout:3/164: chown d0/d9/d22/c1a 12 1 
2026-03-09T16:15:08.974 INFO:tasks.workunit.client.1.vm05.stdout:3/165: chown d0/d9/d22/c2d 1812635635 1 2026-03-09T16:15:08.975 INFO:tasks.workunit.client.1.vm05.stdout:9/230: mkdir d4/d10/d35/d36/d48/d4c 0 2026-03-09T16:15:08.976 INFO:tasks.workunit.client.1.vm05.stdout:4/221: truncate d5/f3e 4135043 0 2026-03-09T16:15:08.981 INFO:tasks.workunit.client.1.vm05.stdout:1/220: dwrite d7/dd/d21/f2b [0,4194304] 0 2026-03-09T16:15:08.983 INFO:tasks.workunit.client.1.vm05.stdout:0/246: creat d5/d1b/d30/f4b x:0 0 0 2026-03-09T16:15:08.987 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:08 vm03.local ceph-mon[51019]: Upgrade: Setting container_image for all rgw 2026-03-09T16:15:08.994 INFO:tasks.workunit.client.1.vm05.stdout:2/240: readlink l4 0 2026-03-09T16:15:09.011 INFO:tasks.workunit.client.1.vm05.stdout:9/231: mknod d4/d10/d35/c4d 0 2026-03-09T16:15:09.028 INFO:tasks.workunit.client.1.vm05.stdout:4/222: write d5/de/d15/d21/d39/f46 [8266,994] 0 2026-03-09T16:15:09.074 INFO:tasks.workunit.client.1.vm05.stdout:1/221: readlink d7/d27/l49 0 2026-03-09T16:15:09.077 INFO:tasks.workunit.client.1.vm05.stdout:1/222: dwrite d7/d15/d16/f1c [0,4194304] 0 2026-03-09T16:15:09.080 INFO:tasks.workunit.client.1.vm05.stdout:0/247: mknod d5/d11/c4c 0 2026-03-09T16:15:09.088 INFO:tasks.workunit.client.1.vm05.stdout:1/223: dread d7/f9 [0,4194304] 0 2026-03-09T16:15:09.089 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:08 vm05.local ceph-mon[58702]: Upgrade: Setting container_image for all rgw 2026-03-09T16:15:09.089 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:08 vm05.local ceph-mon[58702]: Upgrade: Setting container_image for all rbd-mirror 2026-03-09T16:15:09.089 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:08 vm05.local ceph-mon[58702]: Upgrade: Setting container_image for all cephfs-mirror 2026-03-09T16:15:09.089 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:08 vm05.local ceph-mon[58702]: Upgrade: Setting container_image for all iscsi 2026-03-09T16:15:09.089 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:08 vm05.local ceph-mon[58702]: Upgrade: Setting container_image for all nfs 2026-03-09T16:15:09.089 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:08 vm05.local ceph-mon[58702]: Upgrade: Setting container_image for all nvmeof 2026-03-09T16:15:09.089 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:08 vm05.local ceph-mon[58702]: Upgrade: Updating node-exporter.vm03 (1/2) 2026-03-09T16:15:09.089 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:08 vm05.local ceph-mon[58702]: Deploying daemon node-exporter.vm03 on vm03 2026-03-09T16:15:09.089 INFO:tasks.workunit.client.1.vm05.stdout:3/166: link d0/d9/f1d d0/d9/d22/f30 0 2026-03-09T16:15:09.103 INFO:tasks.workunit.client.1.vm05.stdout:9/232: symlink d4/d10/d35/d36/d48/l4e 0 2026-03-09T16:15:09.122 INFO:tasks.workunit.client.1.vm05.stdout:2/241: mknod db/dd/d15/d3f/d55/c57 0 2026-03-09T16:15:09.131 INFO:tasks.workunit.client.1.vm05.stdout:1/224: mkdir d7/dd/d21/d2d/d3a/d50 0 2026-03-09T16:15:09.131 INFO:tasks.workunit.client.1.vm05.stdout:1/225: fdatasync d7/d27/f33 0 2026-03-09T16:15:09.137 INFO:tasks.workunit.client.1.vm05.stdout:6/194: rename d17/f2e to d17/d22/f3d 0 2026-03-09T16:15:09.137 INFO:tasks.workunit.client.1.vm05.stdout:0/248: symlink d5/d2c/d49/l4d 0 2026-03-09T16:15:09.143 INFO:tasks.workunit.client.1.vm05.stdout:9/233: symlink d4/d10/d35/d36/d48/d4c/l4f 0 2026-03-09T16:15:09.145 INFO:tasks.workunit.client.1.vm05.stdout:0/249: symlink d5/d2c/l4e 0 2026-03-09T16:15:09.145 
INFO:tasks.workunit.client.1.vm05.stdout:0/250: chown d5/d11/f1e 10273 1 2026-03-09T16:15:09.146 INFO:tasks.workunit.client.1.vm05.stdout:0/251: readlink d5/d1b/d30/l46 0 2026-03-09T16:15:09.146 INFO:tasks.workunit.client.1.vm05.stdout:2/242: fdatasync db/dd/f10 0 2026-03-09T16:15:09.147 INFO:tasks.workunit.client.1.vm05.stdout:1/226: mknod d7/dd/d21/d39/d48/c51 0 2026-03-09T16:15:09.148 INFO:tasks.workunit.client.1.vm05.stdout:3/167: link d0/d9/d22/l20 d0/d9/d10/l31 0 2026-03-09T16:15:09.149 INFO:tasks.workunit.client.1.vm05.stdout:3/168: dread - d0/d9/d22/f2e zero size 2026-03-09T16:15:09.150 INFO:tasks.workunit.client.1.vm05.stdout:1/227: dread d7/dd/de/f3e [0,4194304] 0 2026-03-09T16:15:09.151 INFO:tasks.workunit.client.1.vm05.stdout:6/195: rename d17/d1d/c29 to d17/d1d/c3e 0 2026-03-09T16:15:09.151 INFO:tasks.workunit.client.1.vm05.stdout:6/196: chown d17/d1d 944320 1 2026-03-09T16:15:09.155 INFO:tasks.workunit.client.1.vm05.stdout:9/234: truncate d4/f20 3673802 0 2026-03-09T16:15:09.157 INFO:tasks.workunit.client.1.vm05.stdout:3/169: rename d0/d9/d22/l1f to d0/d9/l32 0 2026-03-09T16:15:09.159 INFO:tasks.workunit.client.1.vm05.stdout:1/228: mkdir d7/dd/de/d52 0 2026-03-09T16:15:09.169 INFO:tasks.workunit.client.1.vm05.stdout:9/235: write d4/d10/d35/f32 [1114271,91149] 0 2026-03-09T16:15:09.173 INFO:tasks.workunit.client.1.vm05.stdout:9/236: dread d4/f4a [0,4194304] 0 2026-03-09T16:15:09.173 INFO:tasks.workunit.client.1.vm05.stdout:6/197: symlink d17/l3f 0 2026-03-09T16:15:09.178 INFO:tasks.workunit.client.1.vm05.stdout:9/237: mkdir d4/d10/d50 0 2026-03-09T16:15:09.179 INFO:tasks.workunit.client.1.vm05.stdout:6/198: mknod d17/d1d/c40 0 2026-03-09T16:15:09.179 INFO:tasks.workunit.client.1.vm05.stdout:9/238: dread d4/f4a [0,4194304] 0 2026-03-09T16:15:09.180 INFO:tasks.workunit.client.1.vm05.stdout:9/239: chown d4/d10/d35/d2b/d31/c37 0 1 2026-03-09T16:15:09.183 INFO:tasks.workunit.client.1.vm05.stdout:0/252: getdents d5/d1b 0 2026-03-09T16:15:09.188 INFO:tasks.workunit.client.1.vm05.stdout:9/240: dread d4/d10/d35/f32 [0,4194304] 0 2026-03-09T16:15:09.188 INFO:tasks.workunit.client.1.vm05.stdout:5/241: rmdir d8/d18/d1b/d2e 39 2026-03-09T16:15:09.189 INFO:tasks.workunit.client.1.vm05.stdout:9/241: write d4/f3c [228595,81566] 0 2026-03-09T16:15:09.193 INFO:tasks.workunit.client.1.vm05.stdout:0/253: mkdir d5/d11/d4f 0 2026-03-09T16:15:09.197 INFO:tasks.workunit.client.1.vm05.stdout:8/179: read d4/d6/db/dc/f17 [1637422,3754] 0 2026-03-09T16:15:09.198 INFO:tasks.workunit.client.1.vm05.stdout:7/267: fdatasync d1/d2/d11/f25 0 2026-03-09T16:15:09.202 INFO:tasks.workunit.client.1.vm05.stdout:3/170: dread d0/d9/f2c [0,4194304] 0 2026-03-09T16:15:09.202 INFO:tasks.workunit.client.1.vm05.stdout:5/242: mkdir d8/d53 0 2026-03-09T16:15:09.203 INFO:tasks.workunit.client.1.vm05.stdout:5/243: write d8/d1d/f44 [43535,63529] 0 2026-03-09T16:15:09.203 INFO:tasks.workunit.client.1.vm05.stdout:5/244: chown d8/d1d/f21 8 1 2026-03-09T16:15:09.205 INFO:tasks.workunit.client.1.vm05.stdout:7/268: readlink d1/d2/la 0 2026-03-09T16:15:09.208 INFO:tasks.workunit.client.1.vm05.stdout:5/245: mknod d8/d18/c54 0 2026-03-09T16:15:09.209 INFO:tasks.workunit.client.1.vm05.stdout:7/269: link d1/d19/f46 d1/d2/d8/dc/d33/f57 0 2026-03-09T16:15:09.213 INFO:tasks.workunit.client.1.vm05.stdout:8/180: link d4/d6/l19 d4/l42 0 2026-03-09T16:15:09.217 INFO:tasks.workunit.client.1.vm05.stdout:9/242: dread d4/fa [0,4194304] 0 2026-03-09T16:15:09.220 INFO:tasks.workunit.client.1.vm05.stdout:5/246: link d8/d18/f20 d8/f55 0 
2026-03-09T16:15:09.221 INFO:tasks.workunit.client.1.vm05.stdout:5/247: readlink d8/d18/d1b/l42 0 2026-03-09T16:15:09.222 INFO:tasks.workunit.client.1.vm05.stdout:8/181: dwrite d4/f1c [0,4194304] 0 2026-03-09T16:15:09.231 INFO:tasks.workunit.client.1.vm05.stdout:9/243: mknod d4/d10/d35/d36/d48/d4c/c51 0 2026-03-09T16:15:09.231 INFO:tasks.workunit.client.1.vm05.stdout:7/270: link d1/d19/l21 d1/d2/d8/dc/d15/l58 0 2026-03-09T16:15:09.232 INFO:tasks.workunit.client.1.vm05.stdout:9/244: readlink d4/d10/l3e 0 2026-03-09T16:15:09.232 INFO:tasks.workunit.client.1.vm05.stdout:9/245: chown d4/d10/d35 146424 1 2026-03-09T16:15:09.233 INFO:tasks.workunit.client.1.vm05.stdout:9/246: fsync d4/d10/f18 0 2026-03-09T16:15:09.240 INFO:tasks.workunit.client.1.vm05.stdout:9/247: rmdir d4/d10/d50 0 2026-03-09T16:15:09.242 INFO:tasks.workunit.client.1.vm05.stdout:9/248: creat d4/d10/f52 x:0 0 0 2026-03-09T16:15:09.243 INFO:tasks.workunit.client.1.vm05.stdout:9/249: symlink d4/d10/d35/d36/l53 0 2026-03-09T16:15:09.252 INFO:tasks.workunit.client.1.vm05.stdout:9/250: dread f2 [0,4194304] 0 2026-03-09T16:15:09.252 INFO:tasks.workunit.client.1.vm05.stdout:6/199: getdents d17/d22 0 2026-03-09T16:15:09.253 INFO:tasks.workunit.client.1.vm05.stdout:6/200: stat d17/d22 0 2026-03-09T16:15:09.255 INFO:tasks.workunit.client.1.vm05.stdout:9/251: mkdir d4/d10/d35/d36/d48/d54 0 2026-03-09T16:15:09.257 INFO:tasks.workunit.client.1.vm05.stdout:9/252: write d4/d10/f18 [516385,62554] 0 2026-03-09T16:15:09.260 INFO:tasks.workunit.client.1.vm05.stdout:6/201: dwrite d17/f1c [0,4194304] 0 2026-03-09T16:15:09.264 INFO:tasks.workunit.client.1.vm05.stdout:1/229: getdents d7/dd/de 0 2026-03-09T16:15:09.268 INFO:tasks.workunit.client.1.vm05.stdout:9/253: creat d4/d10/d35/d2b/d31/f55 x:0 0 0 2026-03-09T16:15:09.268 INFO:tasks.workunit.client.1.vm05.stdout:2/243: truncate db/dd/d15/d1f/f2b 3402198 0 2026-03-09T16:15:09.268 INFO:tasks.workunit.client.1.vm05.stdout:2/244: stat db/dd/f10 0 2026-03-09T16:15:09.270 INFO:tasks.workunit.client.1.vm05.stdout:9/254: fdatasync d4/fa 0 2026-03-09T16:15:09.270 INFO:tasks.workunit.client.1.vm05.stdout:9/255: chown d4/f43 215 1 2026-03-09T16:15:09.271 INFO:tasks.workunit.client.1.vm05.stdout:2/245: creat db/dd/d15/d4c/f58 x:0 0 0 2026-03-09T16:15:09.274 INFO:tasks.workunit.client.1.vm05.stdout:2/246: dread db/dd/d15/d1f/d21/f47 [0,4194304] 0 2026-03-09T16:15:09.279 INFO:tasks.workunit.client.1.vm05.stdout:1/230: getdents d7/d15/d16 0 2026-03-09T16:15:09.282 INFO:tasks.workunit.client.1.vm05.stdout:2/247: unlink db/dd/d15/d1f/d20/d23/l52 0 2026-03-09T16:15:09.283 INFO:tasks.workunit.client.1.vm05.stdout:9/256: dwrite d4/f4a [0,4194304] 0 2026-03-09T16:15:09.287 INFO:tasks.workunit.client.1.vm05.stdout:1/231: creat d7/d15/d16/f53 x:0 0 0 2026-03-09T16:15:09.287 INFO:tasks.workunit.client.1.vm05.stdout:1/232: truncate d7/f34 4835678 0 2026-03-09T16:15:09.290 INFO:tasks.workunit.client.1.vm05.stdout:7/271: unlink d1/d2/cd 0 2026-03-09T16:15:09.294 INFO:tasks.workunit.client.1.vm05.stdout:0/254: write d5/d34/f35 [73168,72428] 0 2026-03-09T16:15:09.294 INFO:tasks.workunit.client.1.vm05.stdout:1/233: creat d7/dd/d21/d2d/d3a/f54 x:0 0 0 2026-03-09T16:15:09.294 INFO:tasks.workunit.client.1.vm05.stdout:1/234: stat d7/dd/de 0 2026-03-09T16:15:09.295 INFO:tasks.workunit.client.1.vm05.stdout:3/171: rename d0/d9/d10 to d0/d33 0 2026-03-09T16:15:09.297 INFO:tasks.workunit.client.1.vm05.stdout:1/235: mkdir d7/dd/d21/d3b/d55 0 2026-03-09T16:15:09.298 INFO:tasks.workunit.client.1.vm05.stdout:1/236: stat d7/d15/d16/f26 0 
2026-03-09T16:15:09.300 INFO:tasks.workunit.client.1.vm05.stdout:8/182: rename d4/d6/d3a/d15/l39 to d4/d6/db/dc/l43 0 2026-03-09T16:15:09.303 INFO:tasks.workunit.client.1.vm05.stdout:1/237: creat d7/dd/de/f56 x:0 0 0 2026-03-09T16:15:09.303 INFO:tasks.workunit.client.1.vm05.stdout:4/223: rmdir d5 39 2026-03-09T16:15:09.304 INFO:tasks.workunit.client.1.vm05.stdout:3/172: dwrite d0/d33/f29 [0,4194304] 0 2026-03-09T16:15:09.306 INFO:tasks.workunit.client.1.vm05.stdout:2/248: getdents db 0 2026-03-09T16:15:09.306 INFO:tasks.workunit.client.1.vm05.stdout:8/183: creat d4/d6/f44 x:0 0 0 2026-03-09T16:15:09.312 INFO:tasks.workunit.client.1.vm05.stdout:5/248: truncate d8/d18/d1b/d2e/f3c 2211390 0 2026-03-09T16:15:09.318 INFO:tasks.workunit.client.1.vm05.stdout:5/249: dread - d8/d1d/f21 zero size 2026-03-09T16:15:09.318 INFO:tasks.workunit.client.1.vm05.stdout:6/202: rename d17/d22/f37 to d17/d1d/f41 0 2026-03-09T16:15:09.318 INFO:tasks.workunit.client.1.vm05.stdout:2/249: truncate f5 43046 0 2026-03-09T16:15:09.318 INFO:tasks.workunit.client.1.vm05.stdout:2/250: chown db/dd/d15/d1f/d21/l31 30 1 2026-03-09T16:15:09.319 INFO:tasks.workunit.client.1.vm05.stdout:7/272: rename d1/d2/d8/dc/d15/d3e/f4c to d1/d19/f59 0 2026-03-09T16:15:09.319 INFO:tasks.workunit.client.1.vm05.stdout:2/251: write db/dd/d15/d4c/f58 [356871,114503] 0 2026-03-09T16:15:09.322 INFO:tasks.workunit.client.1.vm05.stdout:4/224: dwrite d5/de/d15/f25 [0,4194304] 0 2026-03-09T16:15:09.322 INFO:tasks.workunit.client.1.vm05.stdout:2/252: chown db/dd/d15/d3f/d55 215048 1 2026-03-09T16:15:09.322 INFO:tasks.workunit.client.1.vm05.stdout:3/173: symlink d0/l34 0 2026-03-09T16:15:09.324 INFO:tasks.workunit.client.1.vm05.stdout:5/250: mknod d8/d18/c56 0 2026-03-09T16:15:09.328 INFO:tasks.workunit.client.1.vm05.stdout:4/225: write d5/de/f24 [1748278,96548] 0 2026-03-09T16:15:09.328 INFO:tasks.workunit.client.1.vm05.stdout:0/255: sync 2026-03-09T16:15:09.335 INFO:tasks.workunit.client.1.vm05.stdout:4/226: readlink d5/d19/l33 0 2026-03-09T16:15:09.337 INFO:tasks.workunit.client.1.vm05.stdout:8/184: getdents d4/d6/db 0 2026-03-09T16:15:09.338 INFO:tasks.workunit.client.1.vm05.stdout:1/238: sync 2026-03-09T16:15:09.338 INFO:tasks.workunit.client.1.vm05.stdout:3/174: sync 2026-03-09T16:15:09.338 INFO:tasks.workunit.client.1.vm05.stdout:5/251: sync 2026-03-09T16:15:09.344 INFO:tasks.workunit.client.1.vm05.stdout:3/175: truncate d0/d33/f29 5037279 0 2026-03-09T16:15:09.345 INFO:tasks.workunit.client.1.vm05.stdout:7/273: dwrite d1/d19/f59 [0,4194304] 0 2026-03-09T16:15:09.345 INFO:tasks.workunit.client.1.vm05.stdout:4/227: dread d5/d19/f32 [0,4194304] 0 2026-03-09T16:15:09.346 INFO:tasks.workunit.client.1.vm05.stdout:8/185: creat d4/d6/d3a/d3c/f45 x:0 0 0 2026-03-09T16:15:09.349 INFO:tasks.workunit.client.1.vm05.stdout:8/186: dread - d4/d6/d3a/f28 zero size 2026-03-09T16:15:09.353 INFO:tasks.workunit.client.1.vm05.stdout:9/257: fsync d4/f4a 0 2026-03-09T16:15:09.354 INFO:tasks.workunit.client.1.vm05.stdout:5/252: truncate d8/f11 2789678 0 2026-03-09T16:15:09.355 INFO:tasks.workunit.client.1.vm05.stdout:0/256: dwrite d5/db/f12 [4194304,4194304] 0 2026-03-09T16:15:09.357 INFO:tasks.workunit.client.1.vm05.stdout:3/176: chown d0/d9/f2c 2009235201 1 2026-03-09T16:15:09.357 INFO:tasks.workunit.client.1.vm05.stdout:0/257: dread - d5/d1b/f47 zero size 2026-03-09T16:15:09.358 INFO:tasks.workunit.client.1.vm05.stdout:3/177: dread - d0/d9/f2f zero size 2026-03-09T16:15:09.361 INFO:tasks.workunit.client.1.vm05.stdout:0/258: write d5/f8 [2052342,73802] 0 
2026-03-09T16:15:09.361 INFO:tasks.workunit.client.1.vm05.stdout:3/178: chown d0/d9/d22/c1a 37427396 1 2026-03-09T16:15:09.366 INFO:tasks.workunit.client.1.vm05.stdout:7/274: dwrite d1/d2/d8/dc/f1e [0,4194304] 0 2026-03-09T16:15:09.367 INFO:tasks.workunit.client.1.vm05.stdout:3/179: dread - d0/d9/f2b zero size 2026-03-09T16:15:09.368 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:08 vm03.local ceph-mon[51019]: Upgrade: Setting container_image for all rbd-mirror 2026-03-09T16:15:09.368 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:08 vm03.local ceph-mon[51019]: Upgrade: Setting container_image for all cephfs-mirror 2026-03-09T16:15:09.368 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:08 vm03.local ceph-mon[51019]: Upgrade: Setting container_image for all iscsi 2026-03-09T16:15:09.368 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:08 vm03.local ceph-mon[51019]: Upgrade: Setting container_image for all nfs 2026-03-09T16:15:09.368 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:08 vm03.local ceph-mon[51019]: Upgrade: Setting container_image for all nvmeof 2026-03-09T16:15:09.368 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:08 vm03.local ceph-mon[51019]: Upgrade: Updating node-exporter.vm03 (1/2) 2026-03-09T16:15:09.368 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:08 vm03.local ceph-mon[51019]: Deploying daemon node-exporter.vm03 on vm03 2026-03-09T16:15:09.377 INFO:tasks.workunit.client.1.vm05.stdout:1/239: dread d7/f34 [0,4194304] 0 2026-03-09T16:15:09.382 INFO:tasks.workunit.client.1.vm05.stdout:2/253: dread db/f12 [0,4194304] 0 2026-03-09T16:15:09.389 INFO:tasks.workunit.client.1.vm05.stdout:9/258: rename d4/d10/d35/d36/d48/d4c/l4f to d4/d10/d35/l56 0 2026-03-09T16:15:09.389 INFO:tasks.workunit.client.1.vm05.stdout:9/259: truncate d4/d10/d35/d2b/d31/f55 29226 0 2026-03-09T16:15:09.389 INFO:tasks.workunit.client.1.vm05.stdout:7/275: rmdir d1/d2/d8 39 2026-03-09T16:15:09.391 INFO:tasks.workunit.client.1.vm05.stdout:7/276: sync 2026-03-09T16:15:09.399 INFO:tasks.workunit.client.1.vm05.stdout:3/180: symlink d0/d33/l35 0 2026-03-09T16:15:09.399 INFO:tasks.workunit.client.1.vm05.stdout:8/187: creat d4/d6/db/dc/d2e/f46 x:0 0 0 2026-03-09T16:15:09.400 INFO:tasks.workunit.client.1.vm05.stdout:1/240: creat d7/d27/f57 x:0 0 0 2026-03-09T16:15:09.400 INFO:tasks.workunit.client.1.vm05.stdout:2/254: rmdir db/dd/d15/d1f 39 2026-03-09T16:15:09.401 INFO:tasks.workunit.client.1.vm05.stdout:0/259: link d5/d1b/d30/f4b d5/d1b/f50 0 2026-03-09T16:15:09.403 INFO:tasks.workunit.client.1.vm05.stdout:4/228: creat d5/de/d15/f52 x:0 0 0 2026-03-09T16:15:09.404 INFO:tasks.workunit.client.1.vm05.stdout:7/277: read d1/d2/d11/f54 [1431832,4234] 0 2026-03-09T16:15:09.405 INFO:tasks.workunit.client.1.vm05.stdout:3/181: creat d0/d33/f36 x:0 0 0 2026-03-09T16:15:09.405 INFO:tasks.workunit.client.1.vm05.stdout:8/188: dwrite d4/d6/f24 [0,4194304] 0 2026-03-09T16:15:09.410 INFO:tasks.workunit.client.1.vm05.stdout:4/229: dwrite d5/de/f23 [0,4194304] 0 2026-03-09T16:15:09.412 INFO:tasks.workunit.client.1.vm05.stdout:4/230: stat d5/de/d15/f25 0 2026-03-09T16:15:09.412 INFO:tasks.workunit.client.1.vm05.stdout:4/231: fdatasync d5/de/d15/f34 0 2026-03-09T16:15:09.413 INFO:tasks.workunit.client.1.vm05.stdout:4/232: read d5/de/d15/f25 [587756,103020] 0 2026-03-09T16:15:09.414 INFO:tasks.workunit.client.1.vm05.stdout:4/233: read d5/d19/f48 [3050720,122159] 0 2026-03-09T16:15:09.415 INFO:tasks.workunit.client.1.vm05.stdout:2/255: dwrite db/dd/d15/d1f/f49 [4194304,4194304] 0 
2026-03-09T16:15:09.421 INFO:tasks.workunit.client.1.vm05.stdout:6/203: mkdir d17/d22/d27/d34/d42 0 2026-03-09T16:15:09.421 INFO:tasks.workunit.client.1.vm05.stdout:1/241: creat d7/dd/de/d52/f58 x:0 0 0 2026-03-09T16:15:09.421 INFO:tasks.workunit.client.1.vm05.stdout:2/256: write db/dd/d15/d1f/d21/f29 [351726,82285] 0 2026-03-09T16:15:09.423 INFO:tasks.workunit.client.1.vm05.stdout:3/182: creat d0/d9/f37 x:0 0 0 2026-03-09T16:15:09.423 INFO:tasks.workunit.client.1.vm05.stdout:2/257: chown db/dd/d15/d1f/d21/f47 0 1 2026-03-09T16:15:09.428 INFO:tasks.workunit.client.1.vm05.stdout:0/260: symlink d5/d11/d4f/l51 0 2026-03-09T16:15:09.428 INFO:tasks.workunit.client.1.vm05.stdout:6/204: mknod d17/d22/d27/d34/c43 0 2026-03-09T16:15:09.429 INFO:tasks.workunit.client.1.vm05.stdout:3/183: mknod d0/d33/c38 0 2026-03-09T16:15:09.429 INFO:tasks.workunit.client.1.vm05.stdout:8/189: link d4/d6/f9 d4/d6/db/dc/d2e/f47 0 2026-03-09T16:15:09.434 INFO:tasks.workunit.client.1.vm05.stdout:0/261: creat d5/db/d1d/f52 x:0 0 0 2026-03-09T16:15:09.437 INFO:tasks.workunit.client.1.vm05.stdout:4/234: dread f0 [0,4194304] 0 2026-03-09T16:15:09.437 INFO:tasks.workunit.client.1.vm05.stdout:8/190: dread d4/f10 [0,4194304] 0 2026-03-09T16:15:09.439 INFO:tasks.workunit.client.1.vm05.stdout:9/260: dwrite d4/d10/d35/f32 [0,4194304] 0 2026-03-09T16:15:09.439 INFO:tasks.workunit.client.1.vm05.stdout:4/235: fdatasync d5/fb 0 2026-03-09T16:15:09.446 INFO:tasks.workunit.client.1.vm05.stdout:3/184: chown d0/l7 0 1 2026-03-09T16:15:09.448 INFO:tasks.workunit.client.1.vm05.stdout:0/262: symlink d5/d11/d4f/l53 0 2026-03-09T16:15:09.449 INFO:tasks.workunit.client.1.vm05.stdout:6/205: unlink d17/f32 0 2026-03-09T16:15:09.451 INFO:tasks.workunit.client.1.vm05.stdout:0/263: stat d5/d11/f37 0 2026-03-09T16:15:09.451 INFO:tasks.workunit.client.1.vm05.stdout:3/185: truncate d0/d9/f2b 882101 0 2026-03-09T16:15:09.453 INFO:tasks.workunit.client.1.vm05.stdout:2/258: mknod db/dd/c59 0 2026-03-09T16:15:09.453 INFO:tasks.workunit.client.1.vm05.stdout:9/261: mknod d4/d10/d35/d36/d48/d4c/c57 0 2026-03-09T16:15:09.457 INFO:tasks.workunit.client.1.vm05.stdout:4/236: creat d5/de/d15/d21/d39/f53 x:0 0 0 2026-03-09T16:15:09.458 INFO:tasks.workunit.client.1.vm05.stdout:7/278: creat d1/d2/d8/dc/d1b/f5a x:0 0 0 2026-03-09T16:15:09.458 INFO:tasks.workunit.client.1.vm05.stdout:5/253: dread d8/f11 [0,4194304] 0 2026-03-09T16:15:09.458 INFO:tasks.workunit.client.1.vm05.stdout:6/206: mkdir d17/d22/d27/d44 0 2026-03-09T16:15:09.459 INFO:tasks.workunit.client.1.vm05.stdout:5/254: dread - d8/d18/d1b/d2e/f51 zero size 2026-03-09T16:15:09.461 INFO:tasks.workunit.client.1.vm05.stdout:7/279: truncate d1/d2/d11/f25 605872 0 2026-03-09T16:15:09.461 INFO:tasks.workunit.client.1.vm05.stdout:7/280: write d1/d2/f4f [69781,34563] 0 2026-03-09T16:15:09.467 INFO:tasks.workunit.client.1.vm05.stdout:2/259: dwrite db/dd/d15/d1f/d21/f29 [0,4194304] 0 2026-03-09T16:15:09.468 INFO:tasks.workunit.client.1.vm05.stdout:2/260: stat db/c1e 0 2026-03-09T16:15:09.468 INFO:tasks.workunit.client.1.vm05.stdout:9/262: symlink d4/d10/d35/d36/d48/l58 0 2026-03-09T16:15:09.473 INFO:tasks.workunit.client.1.vm05.stdout:1/242: getdents d7/dd/d21 0 2026-03-09T16:15:09.473 INFO:tasks.workunit.client.1.vm05.stdout:3/186: symlink d0/l39 0 2026-03-09T16:15:09.474 INFO:tasks.workunit.client.1.vm05.stdout:5/255: creat d8/d18/d1b/d47/d4e/f57 x:0 0 0 2026-03-09T16:15:09.475 INFO:tasks.workunit.client.1.vm05.stdout:1/243: truncate d7/dd/de/f2e 1705825 0 2026-03-09T16:15:09.476 
INFO:tasks.workunit.client.1.vm05.stdout:1/244: dread - d7/dd/d21/d2d/d3a/f54 zero size 2026-03-09T16:15:09.476 INFO:tasks.workunit.client.1.vm05.stdout:4/237: rename d5/c18 to d5/de/d15/d21/d27/d3c/c54 0 2026-03-09T16:15:09.476 INFO:tasks.workunit.client.1.vm05.stdout:2/261: symlink db/dd/d15/d46/l5a 0 2026-03-09T16:15:09.477 INFO:tasks.workunit.client.1.vm05.stdout:7/281: creat d1/d19/d2a/f5b x:0 0 0 2026-03-09T16:15:09.478 INFO:tasks.workunit.client.1.vm05.stdout:6/207: mknod d17/c45 0 2026-03-09T16:15:09.478 INFO:tasks.workunit.client.1.vm05.stdout:4/238: write d5/de/d15/d21/d27/d3c/f4d [783738,125371] 0 2026-03-09T16:15:09.480 INFO:tasks.workunit.client.1.vm05.stdout:8/191: getdents d4/d6/db/dc/d3b 0 2026-03-09T16:15:09.481 INFO:tasks.workunit.client.1.vm05.stdout:1/245: read d7/dd/de/f23 [1916703,74142] 0 2026-03-09T16:15:09.481 INFO:tasks.workunit.client.1.vm05.stdout:0/264: creat d5/db/f54 x:0 0 0 2026-03-09T16:15:09.482 INFO:tasks.workunit.client.1.vm05.stdout:5/256: rename d8/d18/d1b/d2e/d43/d45 to d8/d53/d58 0 2026-03-09T16:15:09.485 INFO:tasks.workunit.client.1.vm05.stdout:0/265: dread - d5/db/f54 zero size 2026-03-09T16:15:09.485 INFO:tasks.workunit.client.1.vm05.stdout:6/208: mknod d17/d22/d27/c46 0 2026-03-09T16:15:09.490 INFO:tasks.workunit.client.1.vm05.stdout:8/192: symlink d4/d6/db/dc/l48 0 2026-03-09T16:15:09.490 INFO:tasks.workunit.client.1.vm05.stdout:2/262: dwrite db/dd/d15/d3f/f4a [0,4194304] 0 2026-03-09T16:15:09.492 INFO:tasks.workunit.client.1.vm05.stdout:1/246: creat d7/dd/d21/d39/d48/f59 x:0 0 0 2026-03-09T16:15:09.494 INFO:tasks.workunit.client.1.vm05.stdout:0/266: truncate d5/d11/f37 2030346 0 2026-03-09T16:15:09.505 INFO:tasks.workunit.client.1.vm05.stdout:5/257: truncate d8/d18/d1b/f36 1300470 0 2026-03-09T16:15:09.506 INFO:tasks.workunit.client.1.vm05.stdout:2/263: dwrite db/dd/f10 [0,4194304] 0 2026-03-09T16:15:09.506 INFO:tasks.workunit.client.1.vm05.stdout:1/247: rename d7/dd/d21/d2d/d3a to d7/dd/d21/d39/d5a 0 2026-03-09T16:15:09.506 INFO:tasks.workunit.client.1.vm05.stdout:2/264: read - db/dd/d15/d1f/d20/f53 zero size 2026-03-09T16:15:09.508 INFO:tasks.workunit.client.1.vm05.stdout:8/193: fsync d4/f1c 0 2026-03-09T16:15:09.508 INFO:tasks.workunit.client.1.vm05.stdout:6/209: sync 2026-03-09T16:15:09.508 INFO:tasks.workunit.client.1.vm05.stdout:3/187: sync 2026-03-09T16:15:09.508 INFO:tasks.workunit.client.1.vm05.stdout:6/210: fsync f11 0 2026-03-09T16:15:09.509 INFO:tasks.workunit.client.1.vm05.stdout:4/239: dread d5/de/d15/d21/f2a [0,4194304] 0 2026-03-09T16:15:09.510 INFO:tasks.workunit.client.1.vm05.stdout:7/282: getdents d1/d2/d8/dc/d33 0 2026-03-09T16:15:09.518 INFO:tasks.workunit.client.1.vm05.stdout:1/248: mkdir d7/dd/de/d52/d5b 0 2026-03-09T16:15:09.518 INFO:tasks.workunit.client.1.vm05.stdout:2/265: mkdir db/dd/d15/d3f/d5b 0 2026-03-09T16:15:09.518 INFO:tasks.workunit.client.1.vm05.stdout:1/249: chown d7/dd/d21/d2d 14 1 2026-03-09T16:15:09.518 INFO:tasks.workunit.client.1.vm05.stdout:1/250: write d7/dd/de/f2e [2122777,85451] 0 2026-03-09T16:15:09.518 INFO:tasks.workunit.client.1.vm05.stdout:6/211: rename f5 to d17/d22/d27/d34/f47 0 2026-03-09T16:15:09.519 INFO:tasks.workunit.client.1.vm05.stdout:1/251: write d7/d15/d16/f29 [1004106,28939] 0 2026-03-09T16:15:09.519 INFO:tasks.workunit.client.1.vm05.stdout:2/266: read db/dd/f1b [879382,126702] 0 2026-03-09T16:15:09.519 INFO:tasks.workunit.client.1.vm05.stdout:1/252: chown d7/dd/de/d52/f58 221883636 1 2026-03-09T16:15:09.521 INFO:tasks.workunit.client.1.vm05.stdout:7/283: mknod d1/d19/c5c 0 
2026-03-09T16:15:09.521 INFO:tasks.workunit.client.1.vm05.stdout:8/194: creat d4/d6/d3a/f49 x:0 0 0 2026-03-09T16:15:09.521 INFO:tasks.workunit.client.1.vm05.stdout:0/267: creat d5/d1b/d30/f55 x:0 0 0 2026-03-09T16:15:09.523 INFO:tasks.workunit.client.1.vm05.stdout:1/253: unlink d7/d27/f37 0 2026-03-09T16:15:09.526 INFO:tasks.workunit.client.1.vm05.stdout:6/212: creat d17/d22/d27/d44/f48 x:0 0 0 2026-03-09T16:15:09.528 INFO:tasks.workunit.client.1.vm05.stdout:1/254: dwrite d7/dd/d21/f2b [0,4194304] 0 2026-03-09T16:15:09.529 INFO:tasks.workunit.client.1.vm05.stdout:4/240: rmdir d5/de/d15/d3f 0 2026-03-09T16:15:09.531 INFO:tasks.workunit.client.1.vm05.stdout:4/241: rename d5 to d5/de/d4a/d4e/d55 22 2026-03-09T16:15:09.531 INFO:tasks.workunit.client.1.vm05.stdout:1/255: mkdir d7/dd/d21/d44/d5c 0 2026-03-09T16:15:09.532 INFO:tasks.workunit.client.1.vm05.stdout:1/256: write d7/dd/d21/d39/d48/f59 [896008,65856] 0 2026-03-09T16:15:09.537 INFO:tasks.workunit.client.1.vm05.stdout:1/257: dwrite d7/d15/d16/f29 [0,4194304] 0 2026-03-09T16:15:09.538 INFO:tasks.workunit.client.1.vm05.stdout:6/213: symlink d17/l49 0 2026-03-09T16:15:09.541 INFO:tasks.workunit.client.1.vm05.stdout:1/258: mkdir d7/dd/d21/d39/d48/d5d 0 2026-03-09T16:15:09.543 INFO:tasks.workunit.client.1.vm05.stdout:0/268: chown d5/d1b/f47 1822 1 2026-03-09T16:15:09.544 INFO:tasks.workunit.client.1.vm05.stdout:1/259: creat d7/dd/de/d52/d5b/f5e x:0 0 0 2026-03-09T16:15:09.544 INFO:tasks.workunit.client.1.vm05.stdout:6/214: creat d17/f4a x:0 0 0 2026-03-09T16:15:09.545 INFO:tasks.workunit.client.1.vm05.stdout:6/215: truncate d17/f3b 182993 0 2026-03-09T16:15:09.546 INFO:tasks.workunit.client.1.vm05.stdout:8/195: dread d4/d6/d3a/d15/f22 [0,4194304] 0 2026-03-09T16:15:09.550 INFO:tasks.workunit.client.1.vm05.stdout:1/260: creat d7/d15/d16/f5f x:0 0 0 2026-03-09T16:15:09.551 INFO:tasks.workunit.client.1.vm05.stdout:1/261: stat d7/dd/l24 0 2026-03-09T16:15:09.551 INFO:tasks.workunit.client.1.vm05.stdout:0/269: truncate d5/d1b/f47 58754 0 2026-03-09T16:15:09.551 INFO:tasks.workunit.client.1.vm05.stdout:1/262: dwrite d7/dd/f1f [0,4194304] 0 2026-03-09T16:15:09.551 INFO:tasks.workunit.client.1.vm05.stdout:1/263: truncate d7/dd/d21/f3d 620933 0 2026-03-09T16:15:09.556 INFO:tasks.workunit.client.1.vm05.stdout:8/196: write d4/d6/db/dc/f41 [429284,88162] 0 2026-03-09T16:15:09.557 INFO:tasks.workunit.client.1.vm05.stdout:3/188: dread d0/d9/d22/f18 [0,4194304] 0 2026-03-09T16:15:09.559 INFO:tasks.workunit.client.1.vm05.stdout:6/216: dwrite d17/f2d [0,4194304] 0 2026-03-09T16:15:09.568 INFO:tasks.workunit.client.1.vm05.stdout:6/217: dread - d17/d22/d27/f3c zero size 2026-03-09T16:15:09.569 INFO:tasks.workunit.client.1.vm05.stdout:6/218: read d17/d22/d27/d34/f47 [276119,96086] 0 2026-03-09T16:15:09.569 INFO:tasks.workunit.client.1.vm05.stdout:3/189: read - d0/d9/d22/f2a zero size 2026-03-09T16:15:09.569 INFO:tasks.workunit.client.1.vm05.stdout:1/264: dwrite d7/dd/de/d52/d5b/f5e [0,4194304] 0 2026-03-09T16:15:09.569 INFO:tasks.workunit.client.1.vm05.stdout:3/190: dread - d0/d9/f37 zero size 2026-03-09T16:15:09.575 INFO:tasks.workunit.client.1.vm05.stdout:6/219: mkdir d17/d22/d27/d34/d4b 0 2026-03-09T16:15:09.576 INFO:tasks.workunit.client.1.vm05.stdout:6/220: stat d17/d22/d27/d34/c43 0 2026-03-09T16:15:09.577 INFO:tasks.workunit.client.1.vm05.stdout:6/221: truncate d17/d1d/f33 4752595 0 2026-03-09T16:15:09.585 INFO:tasks.workunit.client.1.vm05.stdout:3/191: rmdir d0 39 2026-03-09T16:15:09.585 INFO:tasks.workunit.client.1.vm05.stdout:0/270: creat d5/d1b/f56 
x:0 0 0 2026-03-09T16:15:09.585 INFO:tasks.workunit.client.1.vm05.stdout:8/197: unlink d4/d6/db/df/c36 0 2026-03-09T16:15:09.586 INFO:tasks.workunit.client.1.vm05.stdout:1/265: truncate d7/f4b 209976 0 2026-03-09T16:15:09.587 INFO:tasks.workunit.client.1.vm05.stdout:1/266: write d7/d15/d16/f26 [2645011,39266] 0 2026-03-09T16:15:09.588 INFO:tasks.workunit.client.1.vm05.stdout:1/267: dread - d7/dd/de/f56 zero size 2026-03-09T16:15:09.589 INFO:tasks.workunit.client.1.vm05.stdout:3/192: write d0/d9/d22/f2e [734015,7669] 0 2026-03-09T16:15:09.595 INFO:tasks.workunit.client.1.vm05.stdout:3/193: creat d0/d33/f3a x:0 0 0 2026-03-09T16:15:09.596 INFO:tasks.workunit.client.1.vm05.stdout:3/194: truncate d0/d9/f37 1031774 0 2026-03-09T16:15:09.596 INFO:tasks.workunit.client.1.vm05.stdout:3/195: write d0/d9/f37 [1219009,61143] 0 2026-03-09T16:15:09.599 INFO:tasks.workunit.client.1.vm05.stdout:3/196: mknod d0/d33/c3b 0 2026-03-09T16:15:09.610 INFO:tasks.workunit.client.1.vm05.stdout:6/222: dread d17/f18 [0,4194304] 0 2026-03-09T16:15:09.610 INFO:tasks.workunit.client.1.vm05.stdout:6/223: write d17/d22/d27/f2a [618702,25439] 0 2026-03-09T16:15:09.610 INFO:tasks.workunit.client.1.vm05.stdout:3/197: rename d0/d33/l35 to d0/d33/l3c 0 2026-03-09T16:15:09.610 INFO:tasks.workunit.client.1.vm05.stdout:6/224: mknod d17/d1d/c4c 0 2026-03-09T16:15:09.610 INFO:tasks.workunit.client.1.vm05.stdout:1/268: dread d7/dd/f19 [0,4194304] 0 2026-03-09T16:15:09.610 INFO:tasks.workunit.client.1.vm05.stdout:6/225: chown d17/d1d/f1e 81 1 2026-03-09T16:15:09.610 INFO:tasks.workunit.client.1.vm05.stdout:1/269: mkdir d7/dd/d21/d39/d5a/d50/d60 0 2026-03-09T16:15:09.610 INFO:tasks.workunit.client.1.vm05.stdout:6/226: mknod d17/c4d 0 2026-03-09T16:15:09.612 INFO:tasks.workunit.client.1.vm05.stdout:1/270: dwrite d7/d27/f33 [0,4194304] 0 2026-03-09T16:15:09.614 INFO:tasks.workunit.client.1.vm05.stdout:6/227: creat d17/f4e x:0 0 0 2026-03-09T16:15:09.622 INFO:tasks.workunit.client.1.vm05.stdout:1/271: mknod d7/c61 0 2026-03-09T16:15:09.622 INFO:tasks.workunit.client.1.vm05.stdout:1/272: stat d7/dd/l24 0 2026-03-09T16:15:09.623 INFO:tasks.workunit.client.1.vm05.stdout:1/273: fsync d7/dd/de/f3e 0 2026-03-09T16:15:09.624 INFO:tasks.workunit.client.1.vm05.stdout:1/274: write d7/d15/d16/f1c [2258346,32747] 0 2026-03-09T16:15:09.636 INFO:tasks.workunit.client.1.vm05.stdout:9/263: truncate d4/d10/f15 1973733 0 2026-03-09T16:15:09.637 INFO:tasks.workunit.client.1.vm05.stdout:9/264: readlink d4/l41 0 2026-03-09T16:15:09.637 INFO:tasks.workunit.client.1.vm05.stdout:9/265: fsync d4/d10/f2a 0 2026-03-09T16:15:09.643 INFO:tasks.workunit.client.1.vm05.stdout:8/198: dread d4/d6/db/fe [0,4194304] 0 2026-03-09T16:15:09.648 INFO:tasks.workunit.client.1.vm05.stdout:5/258: write d8/d18/f20 [50052,56837] 0 2026-03-09T16:15:09.649 INFO:tasks.workunit.client.1.vm05.stdout:5/259: chown d8/d18/d1b/d2e/d43/f41 26873 1 2026-03-09T16:15:09.649 INFO:tasks.workunit.client.1.vm05.stdout:5/260: fsync d8/d18/d1b/f2c 0 2026-03-09T16:15:09.649 INFO:tasks.workunit.client.1.vm05.stdout:5/261: write d8/d1d/f21 [1015182,105952] 0 2026-03-09T16:15:09.649 INFO:tasks.workunit.client.1.vm05.stdout:9/266: mkdir d4/d10/d35/d36/d48/d54/d59 0 2026-03-09T16:15:09.653 INFO:tasks.workunit.client.1.vm05.stdout:8/199: link d4/d6/d3a/c2c d4/d6/d3a/d40/c4a 0 2026-03-09T16:15:09.660 INFO:tasks.workunit.client.1.vm05.stdout:2/267: dwrite db/dd/d15/d1f/d21/f47 [0,4194304] 0 2026-03-09T16:15:09.666 INFO:tasks.workunit.client.1.vm05.stdout:7/284: dwrite d1/d2/f22 [0,4194304] 0 
2026-03-09T16:15:09.667 INFO:tasks.workunit.client.1.vm05.stdout:5/262: rename d8/d53/d58 to d8/d59 0 2026-03-09T16:15:09.667 INFO:tasks.workunit.client.1.vm05.stdout:4/242: dwrite f1 [0,4194304] 0 2026-03-09T16:15:09.669 INFO:tasks.workunit.client.1.vm05.stdout:4/243: truncate d5/de/d15/d21/d27/f2c 819120 0 2026-03-09T16:15:09.671 INFO:tasks.workunit.client.1.vm05.stdout:9/267: getdents d4/d10/d35/d2b/d38 0 2026-03-09T16:15:09.681 INFO:tasks.workunit.client.1.vm05.stdout:7/285: mkdir d1/d2/d8/dc/d18/d5d 0 2026-03-09T16:15:09.681 INFO:tasks.workunit.client.1.vm05.stdout:2/268: rename db/f41 to db/dd/d15/d3f/f5c 0 2026-03-09T16:15:09.681 INFO:tasks.workunit.client.1.vm05.stdout:7/286: dread - d1/d2/d8/d31/f39 zero size 2026-03-09T16:15:09.682 INFO:tasks.workunit.client.1.vm05.stdout:7/287: stat d1/d2/d8/dc/d18 0 2026-03-09T16:15:09.682 INFO:tasks.workunit.client.1.vm05.stdout:0/271: write d5/f17 [4839454,51577] 0 2026-03-09T16:15:09.684 INFO:tasks.workunit.client.1.vm05.stdout:0/272: chown d5/d11/f37 3 1 2026-03-09T16:15:09.688 INFO:tasks.workunit.client.1.vm05.stdout:4/244: mknod d5/de/d15/c56 0 2026-03-09T16:15:09.702 INFO:tasks.workunit.client.1.vm05.stdout:6/228: rmdir d17 39 2026-03-09T16:15:09.702 INFO:tasks.workunit.client.1.vm05.stdout:3/198: dwrite d0/d9/fa [0,4194304] 0 2026-03-09T16:15:09.702 INFO:tasks.workunit.client.1.vm05.stdout:4/245: dwrite d5/fd [4194304,4194304] 0 2026-03-09T16:15:09.702 INFO:tasks.workunit.client.1.vm05.stdout:1/275: dwrite d7/d15/f22 [4194304,4194304] 0 2026-03-09T16:15:09.703 INFO:tasks.workunit.client.1.vm05.stdout:3/199: mknod d0/d9/c3d 0 2026-03-09T16:15:09.703 INFO:tasks.workunit.client.1.vm05.stdout:3/200: stat d0/d9/fa 0 2026-03-09T16:15:09.715 INFO:tasks.workunit.client.1.vm05.stdout:6/229: mkdir d17/d4f 0 2026-03-09T16:15:09.715 INFO:tasks.workunit.client.1.vm05.stdout:1/276: mkdir d7/d62 0 2026-03-09T16:15:09.715 INFO:tasks.workunit.client.1.vm05.stdout:6/230: rmdir d17 39 2026-03-09T16:15:09.716 INFO:tasks.workunit.client.1.vm05.stdout:1/277: truncate d7/dd/de/f23 119192 0 2026-03-09T16:15:09.719 INFO:tasks.workunit.client.1.vm05.stdout:5/263: sync 2026-03-09T16:15:09.723 INFO:tasks.workunit.client.1.vm05.stdout:5/264: getdents d8/d18 0 2026-03-09T16:15:09.731 INFO:tasks.workunit.client.1.vm05.stdout:5/265: truncate d8/d18/d1b/d47/f4c 785828 0 2026-03-09T16:15:09.731 INFO:tasks.workunit.client.1.vm05.stdout:5/266: fsync f1 0 2026-03-09T16:15:09.731 INFO:tasks.workunit.client.1.vm05.stdout:5/267: symlink d8/d18/d1b/d47/d48/l5a 0 2026-03-09T16:15:09.745 INFO:tasks.workunit.client.1.vm05.stdout:9/268: sync 2026-03-09T16:15:09.746 INFO:tasks.workunit.client.1.vm05.stdout:7/288: sync 2026-03-09T16:15:09.746 INFO:tasks.workunit.client.1.vm05.stdout:0/273: sync 2026-03-09T16:15:09.749 INFO:tasks.workunit.client.1.vm05.stdout:9/269: creat d4/d10/d35/d36/d48/d54/f5a x:0 0 0 2026-03-09T16:15:09.759 INFO:tasks.workunit.client.1.vm05.stdout:7/289: fdatasync d1/d2/d8/dc/d18/f52 0 2026-03-09T16:15:09.760 INFO:tasks.workunit.client.1.vm05.stdout:9/270: getdents d4 0 2026-03-09T16:15:09.760 INFO:tasks.workunit.client.1.vm05.stdout:7/290: dwrite d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:09.760 INFO:tasks.workunit.client.1.vm05.stdout:9/271: creat d4/f5b x:0 0 0 2026-03-09T16:15:09.762 INFO:tasks.workunit.client.1.vm05.stdout:7/291: mkdir d1/d2/d8/dc/d1b/d30/d5e 0 2026-03-09T16:15:09.764 INFO:tasks.workunit.client.1.vm05.stdout:7/292: link d1/d2/d8/lb d1/d2/d8/dc/d1b/d30/d5e/l5f 0 2026-03-09T16:15:09.771 INFO:tasks.workunit.client.1.vm05.stdout:7/293: dwrite 
d1/d2/d8/dc/f1e [0,4194304] 0 2026-03-09T16:15:09.773 INFO:tasks.workunit.client.1.vm05.stdout:7/294: creat d1/d19/d3c/f60 x:0 0 0 2026-03-09T16:15:09.777 INFO:tasks.workunit.client.1.vm05.stdout:0/274: sync 2026-03-09T16:15:09.778 INFO:tasks.workunit.client.1.vm05.stdout:0/275: read - d5/d1b/d30/f55 zero size 2026-03-09T16:15:09.780 INFO:tasks.workunit.client.1.vm05.stdout:8/200: write f0 [1545320,30571] 0 2026-03-09T16:15:09.787 INFO:tasks.workunit.client.1.vm05.stdout:7/295: dread d1/d2/d8/dc/d14/f41 [0,4194304] 0 2026-03-09T16:15:09.788 INFO:tasks.workunit.client.1.vm05.stdout:0/276: rename d5/d34 to d5/d1b/d57 0 2026-03-09T16:15:09.791 INFO:tasks.workunit.client.1.vm05.stdout:8/201: rename d4/l42 to d4/d6/d3a/l4b 0 2026-03-09T16:15:09.795 INFO:tasks.workunit.client.1.vm05.stdout:0/277: rename d5/d1b/d3b/f42 to d5/db/d48/f58 0 2026-03-09T16:15:09.800 INFO:tasks.workunit.client.1.vm05.stdout:0/278: rename d5/d1b/d30/f4b to d5/db/d1d/f59 0 2026-03-09T16:15:09.800 INFO:tasks.workunit.client.1.vm05.stdout:8/202: dwrite d4/d6/d3a/f25 [0,4194304] 0 2026-03-09T16:15:09.800 INFO:tasks.workunit.client.1.vm05.stdout:8/203: chown d4/d6/d3a/c20 13 1 2026-03-09T16:15:09.801 INFO:tasks.workunit.client.1.vm05.stdout:0/279: dwrite d5/d1b/d57/f35 [0,4194304] 0 2026-03-09T16:15:09.815 INFO:tasks.workunit.client.1.vm05.stdout:8/204: symlink d4/d6/db/l4c 0 2026-03-09T16:15:09.820 INFO:tasks.workunit.client.1.vm05.stdout:0/280: dwrite d5/db/d1d/f4a [0,4194304] 0 2026-03-09T16:15:09.821 INFO:tasks.workunit.client.1.vm05.stdout:4/246: dwrite d5/de/d15/f25 [0,4194304] 0 2026-03-09T16:15:09.822 INFO:tasks.workunit.client.1.vm05.stdout:3/201: fdatasync d0/d9/fa 0 2026-03-09T16:15:09.822 INFO:tasks.workunit.client.1.vm05.stdout:4/247: chown d5/de/l43 6 1 2026-03-09T16:15:09.826 INFO:tasks.workunit.client.1.vm05.stdout:6/231: truncate d17/f30 3516250 0 2026-03-09T16:15:09.827 INFO:tasks.workunit.client.1.vm05.stdout:0/281: dwrite d5/d1b/f47 [0,4194304] 0 2026-03-09T16:15:09.833 INFO:tasks.workunit.client.1.vm05.stdout:3/202: symlink d0/d9/l3e 0 2026-03-09T16:15:09.833 INFO:tasks.workunit.client.1.vm05.stdout:2/269: dread db/dd/d15/d1f/d21/f47 [0,4194304] 0 2026-03-09T16:15:09.833 INFO:tasks.workunit.client.1.vm05.stdout:4/248: creat d5/de/d4a/f57 x:0 0 0 2026-03-09T16:15:09.834 INFO:tasks.workunit.client.1.vm05.stdout:3/203: read d0/d9/f2b [788785,106372] 0 2026-03-09T16:15:09.835 INFO:tasks.workunit.client.1.vm05.stdout:4/249: dread d5/de/d15/d21/d27/f2c [0,4194304] 0 2026-03-09T16:15:09.836 INFO:tasks.workunit.client.1.vm05.stdout:0/282: mknod d5/d11/d4f/c5a 0 2026-03-09T16:15:09.841 INFO:tasks.workunit.client.1.vm05.stdout:6/232: mknod d17/d4f/c50 0 2026-03-09T16:15:09.843 INFO:tasks.workunit.client.1.vm05.stdout:2/270: dwrite db/dd/d15/d1f/f25 [0,4194304] 0 2026-03-09T16:15:09.843 INFO:tasks.workunit.client.1.vm05.stdout:2/271: write db/dd/f1b [3430284,34723] 0 2026-03-09T16:15:09.843 INFO:tasks.workunit.client.1.vm05.stdout:4/250: creat d5/de/d15/d21/d31/f58 x:0 0 0 2026-03-09T16:15:09.847 INFO:tasks.workunit.client.1.vm05.stdout:4/251: creat d5/f59 x:0 0 0 2026-03-09T16:15:09.848 INFO:tasks.workunit.client.1.vm05.stdout:6/233: symlink d17/l51 0 2026-03-09T16:15:09.849 INFO:tasks.workunit.client.1.vm05.stdout:6/234: write d17/f4e [839433,89025] 0 2026-03-09T16:15:09.850 INFO:tasks.workunit.client.1.vm05.stdout:6/235: write d17/d1d/f38 [183022,73193] 0 2026-03-09T16:15:09.853 INFO:tasks.workunit.client.1.vm05.stdout:4/252: dwrite d5/de/d15/d21/d27/f30 [0,4194304] 0 2026-03-09T16:15:09.853 
INFO:tasks.workunit.client.1.vm05.stdout:6/236: mknod d17/d1d/c52 0 2026-03-09T16:15:09.856 INFO:tasks.workunit.client.1.vm05.stdout:3/204: sync 2026-03-09T16:15:09.858 INFO:tasks.workunit.client.1.vm05.stdout:6/237: mkdir d17/d22/d27/d34/d42/d53 0 2026-03-09T16:15:09.862 INFO:tasks.workunit.client.1.vm05.stdout:3/205: link d0/d9/d22/c24 d0/c3f 0 2026-03-09T16:15:09.882 INFO:tasks.workunit.client.1.vm05.stdout:6/238: dwrite d17/f31 [0,4194304] 0 2026-03-09T16:15:09.882 INFO:tasks.workunit.client.1.vm05.stdout:3/206: dwrite d0/d9/d22/f2a [0,4194304] 0 2026-03-09T16:15:09.883 INFO:tasks.workunit.client.1.vm05.stdout:6/239: symlink d17/d22/d27/l54 0 2026-03-09T16:15:09.883 INFO:tasks.workunit.client.1.vm05.stdout:6/240: read d17/d1d/f33 [1390847,128277] 0 2026-03-09T16:15:09.887 INFO:tasks.workunit.client.1.vm05.stdout:1/278: truncate d7/d15/d16/f26 6691004 0 2026-03-09T16:15:09.887 INFO:tasks.workunit.client.1.vm05.stdout:3/207: symlink d0/d9/d22/l40 0 2026-03-09T16:15:09.890 INFO:tasks.workunit.client.1.vm05.stdout:3/208: creat d0/d33/f41 x:0 0 0 2026-03-09T16:15:09.892 INFO:tasks.workunit.client.1.vm05.stdout:1/279: rmdir d7/dd/d21/d39/d5a/d50/d60 0 2026-03-09T16:15:09.893 INFO:tasks.workunit.client.1.vm05.stdout:1/280: mkdir d7/dd/d21/d63 0 2026-03-09T16:15:09.894 INFO:tasks.workunit.client.1.vm05.stdout:1/281: creat d7/d27/f64 x:0 0 0 2026-03-09T16:15:09.895 INFO:tasks.workunit.client.1.vm05.stdout:1/282: creat d7/dd/d21/d3b/f65 x:0 0 0 2026-03-09T16:15:09.895 INFO:tasks.workunit.client.1.vm05.stdout:6/241: dwrite d17/d22/f2c [0,4194304] 0 2026-03-09T16:15:09.897 INFO:tasks.workunit.client.1.vm05.stdout:1/283: fdatasync d7/f3f 0 2026-03-09T16:15:09.900 INFO:tasks.workunit.client.1.vm05.stdout:1/284: creat d7/d15/d16/f66 x:0 0 0 2026-03-09T16:15:09.902 INFO:tasks.workunit.client.1.vm05.stdout:1/285: fdatasync d7/dd/d21/d3b/f42 0 2026-03-09T16:15:09.907 INFO:tasks.workunit.client.1.vm05.stdout:1/286: dwrite d7/dd/de/f32 [0,4194304] 0 2026-03-09T16:15:09.918 INFO:tasks.workunit.client.1.vm05.stdout:3/209: sync 2026-03-09T16:15:09.921 INFO:tasks.workunit.client.1.vm05.stdout:3/210: symlink d0/l42 0 2026-03-09T16:15:09.921 INFO:tasks.workunit.client.1.vm05.stdout:3/211: stat d0/d33 0 2026-03-09T16:15:09.925 INFO:tasks.workunit.client.1.vm05.stdout:3/212: truncate d0/d9/d22/f14 2341092 0 2026-03-09T16:15:09.926 INFO:tasks.workunit.client.1.vm05.stdout:9/272: stat d4/d10/d35/d36/l53 0 2026-03-09T16:15:09.940 INFO:tasks.workunit.client.1.vm05.stdout:9/273: truncate d4/f43 193967 0 2026-03-09T16:15:09.940 INFO:tasks.workunit.client.1.vm05.stdout:0/283: rename d5/d1b/d57 to d5/db/d5b 0 2026-03-09T16:15:09.940 INFO:tasks.workunit.client.1.vm05.stdout:0/284: dread - d5/d1b/f56 zero size 2026-03-09T16:15:09.940 INFO:tasks.workunit.client.1.vm05.stdout:9/274: dwrite d4/d10/f2a [0,4194304] 0 2026-03-09T16:15:09.940 INFO:tasks.workunit.client.1.vm05.stdout:7/296: write d1/d2/d11/f54 [788911,12570] 0 2026-03-09T16:15:09.940 INFO:tasks.workunit.client.1.vm05.stdout:7/297: symlink d1/d2/d8/dc/l61 0 2026-03-09T16:15:09.940 INFO:tasks.workunit.client.1.vm05.stdout:4/253: rename d5/lf to d5/de/d15/l5a 0 2026-03-09T16:15:09.940 INFO:tasks.workunit.client.1.vm05.stdout:7/298: readlink d1/l24 0 2026-03-09T16:15:09.942 INFO:tasks.workunit.client.1.vm05.stdout:6/242: rename fc to d17/d22/d27/d34/d42/d53/f55 0 2026-03-09T16:15:09.945 INFO:tasks.workunit.client.1.vm05.stdout:9/275: getdents d4/d10/d35 0 2026-03-09T16:15:09.947 INFO:tasks.workunit.client.1.vm05.stdout:9/276: chown d4/c11 850805 1 
2026-03-09T16:15:09.951 INFO:tasks.workunit.client.1.vm05.stdout:6/243: dwrite d17/d22/d27/f3c [0,4194304] 0 2026-03-09T16:15:09.952 INFO:tasks.workunit.client.1.vm05.stdout:9/277: creat d4/d10/d35/d36/d48/d54/d59/f5c x:0 0 0 2026-03-09T16:15:09.952 INFO:tasks.workunit.client.1.vm05.stdout:9/278: chown d4/d10/f2a 0 1 2026-03-09T16:15:09.952 INFO:tasks.workunit.client.1.vm05.stdout:9/279: chown d4/d10/d35/d36/d48/l58 14160 1 2026-03-09T16:15:09.956 INFO:tasks.workunit.client.1.vm05.stdout:7/299: sync 2026-03-09T16:15:09.964 INFO:tasks.workunit.client.1.vm05.stdout:6/244: rename d17/d1d/c52 to d17/d22/d27/d34/c56 0 2026-03-09T16:15:09.970 INFO:tasks.workunit.client.1.vm05.stdout:9/280: rename d4/c11 to d4/d10/d35/c5d 0 2026-03-09T16:15:09.971 INFO:tasks.workunit.client.1.vm05.stdout:7/300: link d1/d2/d8/d31/f39 d1/d2/d8/dc/d1b/f62 0 2026-03-09T16:15:09.972 INFO:tasks.workunit.client.1.vm05.stdout:7/301: creat d1/d2/d8/dc/d15/f63 x:0 0 0 2026-03-09T16:15:09.972 INFO:tasks.workunit.client.1.vm05.stdout:7/302: truncate d1/d2/f5 5326444 0 2026-03-09T16:15:09.975 INFO:tasks.workunit.client.1.vm05.stdout:7/303: link d1/d2/d8/c16 d1/d2/d8/dc/d15/d3e/c64 0 2026-03-09T16:15:09.975 INFO:tasks.workunit.client.1.vm05.stdout:7/304: read - d1/d2/d8/dc/d15/f63 zero size 2026-03-09T16:15:10.021 INFO:tasks.workunit.client.1.vm05.stdout:2/272: dread db/dd/f32 [0,4194304] 0 2026-03-09T16:15:10.026 INFO:tasks.workunit.client.1.vm05.stdout:2/273: dwrite db/dd/d15/d1f/d21/f39 [0,4194304] 0 2026-03-09T16:15:10.032 INFO:tasks.workunit.client.1.vm05.stdout:2/274: creat db/dd/d15/d1f/d21/f5d x:0 0 0 2026-03-09T16:15:10.038 INFO:tasks.workunit.client.1.vm05.stdout:2/275: dwrite db/dd/d15/f48 [0,4194304] 0 2026-03-09T16:15:10.125 INFO:tasks.workunit.client.1.vm05.stdout:0/285: fdatasync d5/db/d1d/f4a 0 2026-03-09T16:15:10.128 INFO:tasks.workunit.client.1.vm05.stdout:0/286: unlink d5/ce 0 2026-03-09T16:15:10.134 INFO:tasks.workunit.client.1.vm05.stdout:5/268: write d8/d18/d1b/f36 [1514483,82435] 0 2026-03-09T16:15:10.137 INFO:tasks.workunit.client.1.vm05.stdout:8/205: truncate d4/d6/d3a/d3c/f3f 2626652 0 2026-03-09T16:15:10.137 INFO:tasks.workunit.client.1.vm05.stdout:8/206: stat f0 0 2026-03-09T16:15:10.139 INFO:tasks.workunit.client.1.vm05.stdout:3/213: truncate d0/d9/f2b 147820 0 2026-03-09T16:15:10.142 INFO:tasks.workunit.client.1.vm05.stdout:0/287: unlink d5/d1b/c36 0 2026-03-09T16:15:10.143 INFO:tasks.workunit.client.1.vm05.stdout:8/207: symlink d4/d6/d3a/d40/l4d 0 2026-03-09T16:15:10.144 INFO:tasks.workunit.client.1.vm05.stdout:3/214: dwrite d0/d9/f2c [0,4194304] 0 2026-03-09T16:15:10.145 INFO:tasks.workunit.client.1.vm05.stdout:8/208: creat d4/d6/d3a/d40/f4e x:0 0 0 2026-03-09T16:15:10.145 INFO:tasks.workunit.client.1.vm05.stdout:3/215: stat d0/d9/d22/c2d 0 2026-03-09T16:15:10.146 INFO:tasks.workunit.client.1.vm05.stdout:0/288: creat d5/f5c x:0 0 0 2026-03-09T16:15:10.146 INFO:tasks.workunit.client.1.vm05.stdout:0/289: stat d5/d11/f1e 0 2026-03-09T16:15:10.150 INFO:tasks.workunit.client.1.vm05.stdout:0/290: creat d5/d2c/d49/f5d x:0 0 0 2026-03-09T16:15:10.150 INFO:tasks.workunit.client.1.vm05.stdout:0/291: stat d5/db/c1c 0 2026-03-09T16:15:10.151 INFO:tasks.workunit.client.1.vm05.stdout:0/292: chown d5/d11/d4f/l53 1454 1 2026-03-09T16:15:10.155 INFO:tasks.workunit.client.1.vm05.stdout:0/293: symlink d5/d2c/l5e 0 2026-03-09T16:15:10.159 INFO:tasks.workunit.client.1.vm05.stdout:0/294: dwrite d5/f5c [0,4194304] 0 2026-03-09T16:15:10.162 INFO:tasks.workunit.client.1.vm05.stdout:0/295: chown d5/d11/c4c 1399 1 
2026-03-09T16:15:10.166 INFO:tasks.workunit.client.1.vm05.stdout:0/296: stat d5/d1b/f25 0 2026-03-09T16:15:10.169 INFO:tasks.workunit.client.1.vm05.stdout:0/297: mkdir d5/db/d5f 0 2026-03-09T16:15:10.170 INFO:tasks.workunit.client.1.vm05.stdout:0/298: creat d5/db/d1d/f60 x:0 0 0 2026-03-09T16:15:10.171 INFO:tasks.workunit.client.1.vm05.stdout:0/299: fsync d5/db/d5b/f35 0 2026-03-09T16:15:10.174 INFO:tasks.workunit.client.1.vm05.stdout:0/300: dread d5/d1b/f47 [0,4194304] 0 2026-03-09T16:15:10.176 INFO:tasks.workunit.client.1.vm05.stdout:0/301: write d5/d11/f40 [768982,100145] 0 2026-03-09T16:15:10.181 INFO:tasks.workunit.client.1.vm05.stdout:8/209: dread d4/d6/f1b [0,4194304] 0 2026-03-09T16:15:10.186 INFO:tasks.workunit.client.1.vm05.stdout:1/287: unlink d7/d15/d16/f26 0 2026-03-09T16:15:10.190 INFO:tasks.workunit.client.1.vm05.stdout:3/216: dread d0/d9/d22/f30 [0,4194304] 0 2026-03-09T16:15:10.190 INFO:tasks.workunit.client.1.vm05.stdout:1/288: fsync d7/f34 0 2026-03-09T16:15:10.194 INFO:tasks.workunit.client.1.vm05.stdout:8/210: dread d4/d6/db/df/f18 [0,4194304] 0 2026-03-09T16:15:10.197 INFO:tasks.workunit.client.1.vm05.stdout:1/289: sync 2026-03-09T16:15:10.197 INFO:tasks.workunit.client.1.vm05.stdout:3/217: sync 2026-03-09T16:15:10.199 INFO:tasks.workunit.client.1.vm05.stdout:8/211: mkdir d4/d6/db/df/d4f 0 2026-03-09T16:15:10.201 INFO:tasks.workunit.client.1.vm05.stdout:1/290: creat d7/d15/d45/f67 x:0 0 0 2026-03-09T16:15:10.201 INFO:tasks.workunit.client.1.vm05.stdout:8/212: dread - d4/d6/d3a/d40/f4e zero size 2026-03-09T16:15:10.202 INFO:tasks.workunit.client.1.vm05.stdout:1/291: fsync d7/dd/f1f 0 2026-03-09T16:15:10.205 INFO:tasks.workunit.client.1.vm05.stdout:8/213: readlink d4/d6/db/dc/d3b/l3d 0 2026-03-09T16:15:10.206 INFO:tasks.workunit.client.1.vm05.stdout:3/218: dwrite d0/d9/d22/f2e [0,4194304] 0 2026-03-09T16:15:10.206 INFO:tasks.workunit.client.1.vm05.stdout:4/254: dwrite d5/de/f16 [0,4194304] 0 2026-03-09T16:15:10.207 INFO:tasks.workunit.client.1.vm05.stdout:8/214: dread - d4/d6/d3a/f28 zero size 2026-03-09T16:15:10.208 INFO:tasks.workunit.client.1.vm05.stdout:7/305: rename d1/d2/d8/dc/d15 to d1/d2/d8/dc/d1b/d30/d4b/d65 0 2026-03-09T16:15:10.209 INFO:tasks.workunit.client.1.vm05.stdout:9/281: write d4/f4a [4426118,37902] 0 2026-03-09T16:15:10.212 INFO:tasks.workunit.client.1.vm05.stdout:9/282: sync 2026-03-09T16:15:10.213 INFO:tasks.workunit.client.1.vm05.stdout:8/215: mknod d4/d6/c50 0 2026-03-09T16:15:10.213 INFO:tasks.workunit.client.1.vm05.stdout:2/276: rename db/dd/d15/d1f/c3c to db/dd/d15/d1f/d21/c5e 0 2026-03-09T16:15:10.220 INFO:tasks.workunit.client.1.vm05.stdout:3/219: mknod d0/c43 0 2026-03-09T16:15:10.220 INFO:tasks.workunit.client.1.vm05.stdout:9/283: stat d4/l41 0 2026-03-09T16:15:10.220 INFO:tasks.workunit.client.1.vm05.stdout:6/245: dread d17/f30 [0,4194304] 0 2026-03-09T16:15:10.224 INFO:tasks.workunit.client.1.vm05.stdout:2/277: unlink db/dd/d15/d4c/c4f 0 2026-03-09T16:15:10.224 INFO:tasks.workunit.client.1.vm05.stdout:1/292: link d7/c8 d7/dd/d21/d2d/c68 0 2026-03-09T16:15:10.224 INFO:tasks.workunit.client.1.vm05.stdout:4/255: link d5/de/d15/d21/d27/d3c/f3d d5/de/d15/d21/d27/d3c/f5b 0 2026-03-09T16:15:10.226 INFO:tasks.workunit.client.1.vm05.stdout:3/220: dread d0/d9/f1d [0,4194304] 0 2026-03-09T16:15:10.230 INFO:tasks.workunit.client.1.vm05.stdout:6/246: sync 2026-03-09T16:15:10.231 INFO:tasks.workunit.client.1.vm05.stdout:0/302: rename d5/d1b/d30/f2a to d5/d1b/f61 0 2026-03-09T16:15:10.231 INFO:tasks.workunit.client.1.vm05.stdout:4/256: mkdir 
d5/de/d15/d21/d27/d3c/d5c 0 2026-03-09T16:15:10.231 INFO:tasks.workunit.client.1.vm05.stdout:8/216: rename d4/d6/db to d4/d6/db/df/d4f/d51 22 2026-03-09T16:15:10.231 INFO:tasks.workunit.client.1.vm05.stdout:0/303: stat d5/d1b/f61 0 2026-03-09T16:15:10.232 INFO:tasks.workunit.client.1.vm05.stdout:9/284: creat d4/d10/d35/d2b/d38/f5e x:0 0 0 2026-03-09T16:15:10.232 INFO:tasks.workunit.client.1.vm05.stdout:1/293: creat d7/d62/f69 x:0 0 0 2026-03-09T16:15:10.234 INFO:tasks.workunit.client.1.vm05.stdout:6/247: symlink d17/d22/d27/d44/l57 0 2026-03-09T16:15:10.238 INFO:tasks.workunit.client.1.vm05.stdout:4/257: dread d5/de/d15/d21/d27/f30 [0,4194304] 0 2026-03-09T16:15:10.240 INFO:tasks.workunit.client.1.vm05.stdout:1/294: symlink d7/dd/de/d52/l6a 0 2026-03-09T16:15:10.243 INFO:tasks.workunit.client.1.vm05.stdout:9/285: symlink d4/d10/d35/d36/d48/d4c/l5f 0 2026-03-09T16:15:10.243 INFO:tasks.workunit.client.1.vm05.stdout:3/221: symlink d0/l44 0 2026-03-09T16:15:10.244 INFO:tasks.workunit.client.1.vm05.stdout:6/248: dwrite d17/d1d/f38 [0,4194304] 0 2026-03-09T16:15:10.252 INFO:tasks.workunit.client.1.vm05.stdout:9/286: sync 2026-03-09T16:15:10.256 INFO:tasks.workunit.client.1.vm05.stdout:9/287: dwrite d4/f4a [4194304,4194304] 0 2026-03-09T16:15:10.257 INFO:tasks.workunit.client.1.vm05.stdout:9/288: stat d4/f5b 0 2026-03-09T16:15:10.258 INFO:tasks.workunit.client.1.vm05.stdout:9/289: truncate d4/d10/d35/d2b/f2c 4528414 0 2026-03-09T16:15:10.264 INFO:tasks.workunit.client.1.vm05.stdout:5/269: dwrite d8/d18/d1b/d2e/f3c [0,4194304] 0 2026-03-09T16:15:10.266 INFO:tasks.workunit.client.1.vm05.stdout:4/258: mkdir d5/de/d15/d21/d39/d5d 0 2026-03-09T16:15:10.271 INFO:tasks.workunit.client.1.vm05.stdout:5/270: dwrite d8/d18/d1b/f30 [0,4194304] 0 2026-03-09T16:15:10.275 INFO:tasks.workunit.client.1.vm05.stdout:6/249: fdatasync d17/f2d 0 2026-03-09T16:15:10.277 INFO:tasks.workunit.client.1.vm05.stdout:6/250: chown d17/d22/d27/d34/d42 0 1 2026-03-09T16:15:10.278 INFO:tasks.workunit.client.1.vm05.stdout:6/251: write d17/f4a [617350,95236] 0 2026-03-09T16:15:10.281 INFO:tasks.workunit.client.1.vm05.stdout:9/290: mkdir d4/d10/d35/d36/d48/d60 0 2026-03-09T16:15:10.288 INFO:tasks.workunit.client.1.vm05.stdout:0/304: fsync d5/f5c 0 2026-03-09T16:15:10.289 INFO:tasks.workunit.client.1.vm05.stdout:0/305: dread - d5/d2c/f41 zero size 2026-03-09T16:15:10.289 INFO:tasks.workunit.client.1.vm05.stdout:0/306: read - d5/d1b/d30/f55 zero size 2026-03-09T16:15:10.295 INFO:tasks.workunit.client.1.vm05.stdout:4/259: creat d5/de/d4a/f5e x:0 0 0 2026-03-09T16:15:10.298 INFO:tasks.workunit.client.1.vm05.stdout:4/260: stat d5/f2e 0 2026-03-09T16:15:10.299 INFO:tasks.workunit.client.1.vm05.stdout:5/271: unlink d8/d18/d1b/d2e/f4d 0 2026-03-09T16:15:10.299 INFO:tasks.workunit.client.1.vm05.stdout:5/272: fsync d8/d1d/f21 0 2026-03-09T16:15:10.302 INFO:tasks.workunit.client.1.vm05.stdout:6/252: mkdir d17/d22/d27/d58 0 2026-03-09T16:15:10.303 INFO:tasks.workunit.client.1.vm05.stdout:5/273: dread d8/d18/d1b/f30 [0,4194304] 0 2026-03-09T16:15:10.305 INFO:tasks.workunit.client.1.vm05.stdout:9/291: read d4/f17 [2914911,127513] 0 2026-03-09T16:15:10.307 INFO:tasks.workunit.client.1.vm05.stdout:8/217: rmdir d4/d6 39 2026-03-09T16:15:10.309 INFO:tasks.workunit.client.1.vm05.stdout:7/306: write d1/d2/d8/dc/d33/f57 [696076,51984] 0 2026-03-09T16:15:10.310 INFO:tasks.workunit.client.1.vm05.stdout:2/278: rename f5 to db/f5f 0 2026-03-09T16:15:10.310 INFO:tasks.workunit.client.1.vm05.stdout:4/261: write d5/de/d15/d21/f26 [505781,27834] 0 
2026-03-09T16:15:10.313 INFO:tasks.workunit.client.1.vm05.stdout:5/274: mkdir d8/d59/d5b 0 2026-03-09T16:15:10.318 INFO:tasks.workunit.client.1.vm05.stdout:5/275: creat d8/d59/f5c x:0 0 0 2026-03-09T16:15:10.320 INFO:tasks.workunit.client.1.vm05.stdout:8/218: creat d4/d6/d3a/d40/f52 x:0 0 0 2026-03-09T16:15:10.320 INFO:tasks.workunit.client.1.vm05.stdout:9/292: creat d4/f61 x:0 0 0 2026-03-09T16:15:10.320 INFO:tasks.workunit.client.1.vm05.stdout:6/253: dwrite d17/d22/d27/d34/d42/d53/f55 [4194304,4194304] 0 2026-03-09T16:15:10.320 INFO:tasks.workunit.client.1.vm05.stdout:7/307: link d1/d19/d2a/f5b d1/d2/d8/dc/d1b/f66 0 2026-03-09T16:15:10.322 INFO:tasks.workunit.client.1.vm05.stdout:8/219: chown d4/c16 689345566 1 2026-03-09T16:15:10.323 INFO:tasks.workunit.client.1.vm05.stdout:8/220: readlink d4/d6/db/l4c 0 2026-03-09T16:15:10.323 INFO:tasks.workunit.client.1.vm05.stdout:6/254: readlink d17/l3f 0 2026-03-09T16:15:10.333 INFO:tasks.workunit.client.1.vm05.stdout:5/276: symlink d8/d1d/l5d 0 2026-03-09T16:15:10.333 INFO:tasks.workunit.client.1.vm05.stdout:8/221: stat l3 0 2026-03-09T16:15:10.333 INFO:tasks.workunit.client.1.vm05.stdout:5/277: fsync d8/d3d/f3f 0 2026-03-09T16:15:10.335 INFO:tasks.workunit.client.1.vm05.stdout:8/222: mkdir d4/d6/d53 0 2026-03-09T16:15:10.336 INFO:tasks.workunit.client.1.vm05.stdout:2/279: dread db/f12 [0,4194304] 0 2026-03-09T16:15:10.336 INFO:tasks.workunit.client.1.vm05.stdout:5/278: rmdir d8/d18/d1b 39 2026-03-09T16:15:10.336 INFO:tasks.workunit.client.1.vm05.stdout:7/308: dread d1/d2/d11/f54 [0,4194304] 0 2026-03-09T16:15:10.337 INFO:tasks.workunit.client.1.vm05.stdout:5/279: chown d8/d3d/f3f 0 1 2026-03-09T16:15:10.337 INFO:tasks.workunit.client.1.vm05.stdout:8/223: mkdir d4/d6/db/d54 0 2026-03-09T16:15:10.339 INFO:tasks.workunit.client.1.vm05.stdout:2/280: chown db/dd/d15/d1f/d20/d23/c27 28011270 1 2026-03-09T16:15:10.340 INFO:tasks.workunit.client.1.vm05.stdout:5/280: mkdir d8/d5e 0 2026-03-09T16:15:10.341 INFO:tasks.workunit.client.1.vm05.stdout:8/224: mkdir d4/d55 0 2026-03-09T16:15:10.342 INFO:tasks.workunit.client.1.vm05.stdout:2/281: mkdir db/dd/d15/d3f/d5b/d60 0 2026-03-09T16:15:10.342 INFO:tasks.workunit.client.1.vm05.stdout:7/309: dwrite d1/d2/d11/f1c [0,4194304] 0 2026-03-09T16:15:10.343 INFO:tasks.workunit.client.1.vm05.stdout:8/225: readlink d4/d6/l2f 0 2026-03-09T16:15:10.343 INFO:tasks.workunit.client.1.vm05.stdout:2/282: write db/dd/d15/f48 [2223206,42438] 0 2026-03-09T16:15:10.344 INFO:tasks.workunit.client.1.vm05.stdout:5/281: write d8/d18/d1b/f32 [3984178,26660] 0 2026-03-09T16:15:10.344 INFO:tasks.workunit.client.1.vm05.stdout:8/226: dread - d4/d6/d3a/d40/f52 zero size 2026-03-09T16:15:10.350 INFO:tasks.workunit.client.1.vm05.stdout:7/310: mkdir d1/d2/d8/d67 0 2026-03-09T16:15:10.354 INFO:tasks.workunit.client.1.vm05.stdout:8/227: symlink d4/d6/db/d54/l56 0 2026-03-09T16:15:10.361 INFO:tasks.workunit.client.1.vm05.stdout:5/282: link d8/d18/d1b/d47/f4c d8/d59/f5f 0 2026-03-09T16:15:10.361 INFO:tasks.workunit.client.1.vm05.stdout:7/311: rmdir d1/d2/d8/dc/d1b/d30/d4b/d65 39 2026-03-09T16:15:10.370 INFO:tasks.workunit.client.1.vm05.stdout:8/228: mknod d4/d6/db/df/c57 0 2026-03-09T16:15:10.373 INFO:tasks.workunit.client.1.vm05.stdout:8/229: unlink d4/d6/db/fe 0 2026-03-09T16:15:10.376 INFO:tasks.workunit.client.1.vm05.stdout:7/312: creat d1/d2/d8/dc/d1b/d30/d5e/f68 x:0 0 0 2026-03-09T16:15:10.376 INFO:tasks.workunit.client.1.vm05.stdout:8/230: rename d4/d6/f27 to d4/d6/f58 0 2026-03-09T16:15:10.376 
INFO:tasks.workunit.client.1.vm05.stdout:5/283: truncate d8/d18/f3a 6494417 0 2026-03-09T16:15:10.377 INFO:tasks.workunit.client.1.vm05.stdout:8/231: mkdir d4/d6/db/d59 0 2026-03-09T16:15:10.377 INFO:tasks.workunit.client.1.vm05.stdout:7/313: symlink d1/d2/d8/dc/d18/d5d/l69 0 2026-03-09T16:15:10.379 INFO:tasks.workunit.client.1.vm05.stdout:7/314: stat d1/d2/d8/dc/d1b/d30/d5e 0 2026-03-09T16:15:10.388 INFO:tasks.workunit.client.1.vm05.stdout:7/315: fsync d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/f49 0 2026-03-09T16:15:10.389 INFO:tasks.workunit.client.1.vm05.stdout:5/284: dread d8/f13 [0,4194304] 0 2026-03-09T16:15:10.389 INFO:tasks.workunit.client.1.vm05.stdout:5/285: stat d8/d53 0 2026-03-09T16:15:10.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:10 vm03.local ceph-mon[51019]: pgmap v14: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 112 GiB / 120 GiB avail; 25 MiB/s rd, 113 MiB/s wr, 308 op/s 2026-03-09T16:15:10.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:10 vm03.local ceph-mon[51019]: Standby manager daemon vm05.dygxfv restarted 2026-03-09T16:15:10.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:10 vm03.local ceph-mon[51019]: Standby manager daemon vm05.dygxfv started 2026-03-09T16:15:10.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:10 vm03.local ceph-mon[51019]: from='mgr.? 192.168.123.105:0/3719435524' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/crt"}]: dispatch 2026-03-09T16:15:10.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:10 vm03.local ceph-mon[51019]: from='mgr.? 192.168.123.105:0/3719435524' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T16:15:10.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:10 vm03.local ceph-mon[51019]: from='mgr.? 192.168.123.105:0/3719435524' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/key"}]: dispatch 2026-03-09T16:15:10.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:10 vm03.local ceph-mon[51019]: from='mgr.? 
192.168.123.105:0/3719435524' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T16:15:10.391 INFO:tasks.workunit.client.1.vm05.stdout:5/286: fsync d8/d18/d1b/d47/d4e/f57 0 2026-03-09T16:15:10.394 INFO:tasks.workunit.client.1.vm05.stdout:7/316: dwrite d1/d2/d8/dc/d1b/f5a [0,4194304] 0 2026-03-09T16:15:10.397 INFO:tasks.workunit.client.1.vm05.stdout:5/287: creat d8/d18/d1b/d47/d48/f60 x:0 0 0 2026-03-09T16:15:10.406 INFO:tasks.workunit.client.1.vm05.stdout:7/317: mkdir d1/d2/d8/d6a 0 2026-03-09T16:15:10.406 INFO:tasks.workunit.client.1.vm05.stdout:7/318: chown d1/d2/d8/dc/d18/f52 37954 1 2026-03-09T16:15:10.407 INFO:tasks.workunit.client.1.vm05.stdout:7/319: link d1/d2/d11/f54 d1/d2/d8/d6a/f6b 0 2026-03-09T16:15:10.437 INFO:tasks.workunit.client.1.vm05.stdout:1/295: truncate d7/d15/d16/f1c 2706968 0 2026-03-09T16:15:10.437 INFO:tasks.workunit.client.1.vm05.stdout:1/296: chown d7/d15/d45 5163 1 2026-03-09T16:15:10.438 INFO:tasks.workunit.client.1.vm05.stdout:1/297: write d7/dd/f1f [3535750,61881] 0 2026-03-09T16:15:10.438 INFO:tasks.workunit.client.1.vm05.stdout:4/262: rename d5/de/d4a to d5/de/d15/d21/d27/d3c/d5c/d5f 0 2026-03-09T16:15:10.439 INFO:tasks.workunit.client.1.vm05.stdout:3/222: write d0/fd [1810508,36572] 0 2026-03-09T16:15:10.440 INFO:tasks.workunit.client.1.vm05.stdout:1/298: chown d7/fb 60 1 2026-03-09T16:15:10.442 INFO:tasks.workunit.client.1.vm05.stdout:0/307: dwrite d5/db/d1d/f52 [0,4194304] 0 2026-03-09T16:15:10.449 INFO:tasks.workunit.client.1.vm05.stdout:0/308: dwrite d5/d2c/d49/f5d [0,4194304] 0 2026-03-09T16:15:10.452 INFO:tasks.workunit.client.1.vm05.stdout:0/309: truncate d5/db/d1d/f60 492080 0 2026-03-09T16:15:10.456 INFO:tasks.workunit.client.1.vm05.stdout:2/283: rename db/dd/d15/d4c/c4d to db/dd/d15/d3f/d55/c61 0 2026-03-09T16:15:10.463 INFO:tasks.workunit.client.1.vm05.stdout:9/293: write d4/f20 [2448694,82809] 0 2026-03-09T16:15:10.463 INFO:tasks.workunit.client.1.vm05.stdout:6/255: truncate d17/f1b 4135796 0 2026-03-09T16:15:10.465 INFO:tasks.workunit.client.1.vm05.stdout:4/263: mkdir d5/d19/d37/d60 0 2026-03-09T16:15:10.465 INFO:tasks.workunit.client.1.vm05.stdout:4/264: fdatasync d5/de/f23 0 2026-03-09T16:15:10.465 INFO:tasks.workunit.client.1.vm05.stdout:1/299: creat d7/dd/de/f6b x:0 0 0 2026-03-09T16:15:10.466 INFO:tasks.workunit.client.1.vm05.stdout:4/265: write d5/fb [386840,34970] 0 2026-03-09T16:15:10.469 INFO:tasks.workunit.client.1.vm05.stdout:0/310: rmdir d5/d1b 39 2026-03-09T16:15:10.469 INFO:tasks.workunit.client.1.vm05.stdout:5/288: truncate d8/d59/f5f 162910 0 2026-03-09T16:15:10.470 INFO:tasks.workunit.client.1.vm05.stdout:7/320: rename d1/d2/d8/d31/d4a to d1/d19/d2a/d6c 0 2026-03-09T16:15:10.471 INFO:tasks.workunit.client.1.vm05.stdout:4/266: dread d5/de/d15/d21/f26 [0,4194304] 0 2026-03-09T16:15:10.471 INFO:tasks.workunit.client.1.vm05.stdout:4/267: chown d5/d19 27205 1 2026-03-09T16:15:10.476 INFO:tasks.workunit.client.1.vm05.stdout:6/256: write d17/f18 [4256329,27390] 0 2026-03-09T16:15:10.477 INFO:tasks.workunit.client.1.vm05.stdout:1/300: symlink d7/dd/d21/d2d/l6c 0 2026-03-09T16:15:10.477 INFO:tasks.workunit.client.1.vm05.stdout:4/268: dwrite d5/de/d15/d21/d31/f49 [0,4194304] 0 2026-03-09T16:15:10.486 INFO:tasks.workunit.client.1.vm05.stdout:3/223: creat d0/f45 x:0 0 0 2026-03-09T16:15:10.488 INFO:tasks.workunit.client.1.vm05.stdout:5/289: creat d8/d18/d1b/d47/d48/f61 x:0 0 0 2026-03-09T16:15:10.488 INFO:tasks.workunit.client.1.vm05.stdout:8/232: truncate d4/d6/db/dc/f30 2113457 0 
2026-03-09T16:15:10.495 INFO:tasks.workunit.client.1.vm05.stdout:2/284: creat db/dd/d15/d4c/d56/f62 x:0 0 0 2026-03-09T16:15:10.496 INFO:tasks.workunit.client.1.vm05.stdout:9/294: creat d4/d10/d35/d2b/d38/f62 x:0 0 0 2026-03-09T16:15:10.496 INFO:tasks.workunit.client.1.vm05.stdout:8/233: fsync d4/d6/f44 0 2026-03-09T16:15:10.497 INFO:tasks.workunit.client.1.vm05.stdout:0/311: rmdir d5/d1b/d30 39 2026-03-09T16:15:10.498 INFO:tasks.workunit.client.1.vm05.stdout:5/290: creat d8/d18/d1b/d2e/d43/f62 x:0 0 0 2026-03-09T16:15:10.498 INFO:tasks.workunit.client.1.vm05.stdout:1/301: dwrite d7/fb [0,4194304] 0 2026-03-09T16:15:10.500 INFO:tasks.workunit.client.1.vm05.stdout:2/285: mknod db/dd/d15/d3f/d55/c63 0 2026-03-09T16:15:10.505 INFO:tasks.workunit.client.1.vm05.stdout:0/312: fdatasync d5/d11/f1e 0 2026-03-09T16:15:10.506 INFO:tasks.workunit.client.1.vm05.stdout:6/257: symlink d17/d22/l59 0 2026-03-09T16:15:10.508 INFO:tasks.workunit.client.1.vm05.stdout:9/295: dread - d4/d10/d35/d36/d48/d54/f5a zero size 2026-03-09T16:15:10.511 INFO:tasks.workunit.client.1.vm05.stdout:2/286: mknod db/dd/d15/d1f/d21/c64 0 2026-03-09T16:15:10.513 INFO:tasks.workunit.client.1.vm05.stdout:1/302: mknod d7/dd/d21/c6d 0 2026-03-09T16:15:10.513 INFO:tasks.workunit.client.1.vm05.stdout:5/291: dwrite d8/d18/d1b/f32 [4194304,4194304] 0 2026-03-09T16:15:10.513 INFO:tasks.workunit.client.1.vm05.stdout:0/313: unlink d5/d11/d4f/c5a 0 2026-03-09T16:15:10.521 INFO:tasks.workunit.client.1.vm05.stdout:0/314: read d5/d1b/d3b/f3c [34874,68064] 0 2026-03-09T16:15:10.521 INFO:tasks.workunit.client.1.vm05.stdout:8/234: creat d4/d6/d53/f5a x:0 0 0 2026-03-09T16:15:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:10 vm05.local ceph-mon[58702]: pgmap v14: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 112 GiB / 120 GiB avail; 25 MiB/s rd, 113 MiB/s wr, 308 op/s 2026-03-09T16:15:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:10 vm05.local ceph-mon[58702]: Standby manager daemon vm05.dygxfv restarted 2026-03-09T16:15:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:10 vm05.local ceph-mon[58702]: Standby manager daemon vm05.dygxfv started 2026-03-09T16:15:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:10 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.105:0/3719435524' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/crt"}]: dispatch 2026-03-09T16:15:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:10 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.105:0/3719435524' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T16:15:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:10 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.105:0/3719435524' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/key"}]: dispatch 2026-03-09T16:15:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:10 vm05.local ceph-mon[58702]: from='mgr.? 
192.168.123.105:0/3719435524' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T16:15:10.530 INFO:tasks.workunit.client.1.vm05.stdout:6/258: creat d17/d22/d27/d34/d4b/f5a x:0 0 0 2026-03-09T16:15:10.535 INFO:tasks.workunit.client.1.vm05.stdout:4/269: dread d5/f6 [0,4194304] 0 2026-03-09T16:15:10.536 INFO:tasks.workunit.client.1.vm05.stdout:2/287: symlink db/dd/d15/d1f/d20/l65 0 2026-03-09T16:15:10.540 INFO:tasks.workunit.client.1.vm05.stdout:5/292: dread d8/d18/d1b/f36 [0,4194304] 0 2026-03-09T16:15:10.542 INFO:tasks.workunit.client.1.vm05.stdout:7/321: rename d1/d2/d8/d6a to d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/d6d 0 2026-03-09T16:15:10.552 INFO:tasks.workunit.client.1.vm05.stdout:1/303: mkdir d7/d15/d6e 0 2026-03-09T16:15:10.552 INFO:tasks.workunit.client.1.vm05.stdout:0/315: symlink d5/d1b/l62 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:6/259: write d17/d22/d27/d34/f47 [2656544,115798] 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:4/270: unlink d5/de/d15/d21/d27/f30 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:2/288: fsync db/dd/d15/d3f/f5c 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:2/289: fdatasync db/dd/d15/d3f/f4a 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:4/271: write d5/de/d15/d21/d39/f46 [535604,37798] 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:7/322: creat d1/d19/d3c/f6e x:0 0 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:0/316: chown d5/db/d5b/f35 47824398 1 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:1/304: mknod d7/dd/d21/d3b/c6f 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:0/317: dread - d5/d2c/f41 zero size 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:5/293: symlink d8/d59/d5b/l63 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:2/290: symlink db/dd/d15/d3f/l66 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:8/235: getdents d4/d6/db 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:1/305: fdatasync d7/d27/f57 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:7/323: creat d1/d2/d8/dc/d18/f6f x:0 0 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:0/318: creat d5/d2c/f63 x:0 0 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:1/306: creat d7/d15/d45/f70 x:0 0 0 2026-03-09T16:15:10.553 INFO:tasks.workunit.client.1.vm05.stdout:6/260: creat d17/f5b x:0 0 0 2026-03-09T16:15:10.554 INFO:tasks.workunit.client.1.vm05.stdout:4/272: symlink d5/de/d2f/l61 0 2026-03-09T16:15:10.558 INFO:tasks.workunit.client.1.vm05.stdout:8/236: truncate d4/d6/db/dc/d2e/f47 4237685 0 2026-03-09T16:15:10.558 INFO:tasks.workunit.client.1.vm05.stdout:3/224: rename d0/d9/d22/f14 to d0/f46 0 2026-03-09T16:15:10.559 INFO:tasks.workunit.client.1.vm05.stdout:3/225: fdatasync d0/d9/f2f 0 2026-03-09T16:15:10.559 INFO:tasks.workunit.client.1.vm05.stdout:7/324: rmdir d1/d2/d8/dc/d14 39 2026-03-09T16:15:10.559 INFO:tasks.workunit.client.1.vm05.stdout:0/319: dwrite d5/d2c/f28 [0,4194304] 0 2026-03-09T16:15:10.562 INFO:tasks.workunit.client.1.vm05.stdout:3/226: write d0/d9/f2f [885583,35524] 0 2026-03-09T16:15:10.570 INFO:tasks.workunit.client.1.vm05.stdout:1/307: dread d7/dd/de/f32 [0,4194304] 0 2026-03-09T16:15:10.570 INFO:tasks.workunit.client.1.vm05.stdout:8/237: dread d4/d6/d3a/f25 [0,4194304] 0 2026-03-09T16:15:10.571 
INFO:tasks.workunit.client.1.vm05.stdout:0/320: unlink d5/c3f 0 2026-03-09T16:15:10.571 INFO:tasks.workunit.client.1.vm05.stdout:2/291: sync 2026-03-09T16:15:10.572 INFO:tasks.workunit.client.1.vm05.stdout:1/308: read d7/fc [3127354,86000] 0 2026-03-09T16:15:10.574 INFO:tasks.workunit.client.1.vm05.stdout:3/227: mknod d0/d9/d22/c47 0 2026-03-09T16:15:10.574 INFO:tasks.workunit.client.1.vm05.stdout:4/273: creat d5/d19/d37/d60/f62 x:0 0 0 2026-03-09T16:15:10.575 INFO:tasks.workunit.client.1.vm05.stdout:3/228: chown d0/d9/f1d 14720 1 2026-03-09T16:15:10.576 INFO:tasks.workunit.client.1.vm05.stdout:9/296: rename d4/fa to d4/d10/d35/d2b/f63 0 2026-03-09T16:15:10.577 INFO:tasks.workunit.client.1.vm05.stdout:9/297: stat d4/d10/d35/d36/d48/d4c/c51 0 2026-03-09T16:15:10.578 INFO:tasks.workunit.client.1.vm05.stdout:9/298: write d4/d10/d35/d2b/d38/f5e [288010,127330] 0 2026-03-09T16:15:10.578 INFO:tasks.workunit.client.1.vm05.stdout:2/292: write f7 [2382537,110951] 0 2026-03-09T16:15:10.580 INFO:tasks.workunit.client.1.vm05.stdout:9/299: write d4/d10/d35/d2b/d38/f5e [1220248,37497] 0 2026-03-09T16:15:10.582 INFO:tasks.workunit.client.1.vm05.stdout:2/293: write db/dd/d15/f28 [1372793,49084] 0 2026-03-09T16:15:10.587 INFO:tasks.workunit.client.1.vm05.stdout:2/294: fdatasync db/dd/d15/d1f/d21/f5d 0 2026-03-09T16:15:10.587 INFO:tasks.workunit.client.1.vm05.stdout:2/295: chown db/dd/d15/d1f/f24 78 1 2026-03-09T16:15:10.589 INFO:tasks.workunit.client.1.vm05.stdout:2/296: write db/dd/d15/d4c/d56/f62 [945920,47783] 0 2026-03-09T16:15:10.590 INFO:tasks.workunit.client.1.vm05.stdout:5/294: rename d8/d18/d1b/d2e/f51 to d8/d18/d1b/d47/d4e/f64 0 2026-03-09T16:15:10.590 INFO:tasks.workunit.client.1.vm05.stdout:2/297: chown db/dd/d15/f48 7933771 1 2026-03-09T16:15:10.591 INFO:tasks.workunit.client.1.vm05.stdout:2/298: write db/dd/d15/d46/f4e [773220,122503] 0 2026-03-09T16:15:10.595 INFO:tasks.workunit.client.1.vm05.stdout:0/321: unlink d5/db/d48/f58 0 2026-03-09T16:15:10.595 INFO:tasks.workunit.client.1.vm05.stdout:6/261: link d17/d22/f3d d17/d22/d27/d44/f5c 0 2026-03-09T16:15:10.606 INFO:tasks.workunit.client.1.vm05.stdout:1/309: mkdir d7/dd/d21/d63/d71 0 2026-03-09T16:15:10.610 INFO:tasks.workunit.client.1.vm05.stdout:1/310: dwrite d7/dd/de/d52/f58 [0,4194304] 0 2026-03-09T16:15:10.626 INFO:tasks.workunit.client.1.vm05.stdout:2/299: mkdir db/dd/d15/d46/d67 0 2026-03-09T16:15:10.628 INFO:tasks.workunit.client.1.vm05.stdout:0/322: creat d5/db/d1d/f64 x:0 0 0 2026-03-09T16:15:10.634 INFO:tasks.workunit.client.1.vm05.stdout:1/311: unlink d7/dd/d21/d3b/f42 0 2026-03-09T16:15:10.638 INFO:tasks.workunit.client.1.vm05.stdout:1/312: dwrite d7/fb [8388608,4194304] 0 2026-03-09T16:15:10.646 INFO:tasks.workunit.client.1.vm05.stdout:5/295: rename d8/c15 to d8/d18/d1b/d47/d4e/c65 0 2026-03-09T16:15:10.647 INFO:tasks.workunit.client.1.vm05.stdout:8/238: write d4/f1c [3706299,51637] 0 2026-03-09T16:15:10.647 INFO:tasks.workunit.client.1.vm05.stdout:8/239: write d4/f1c [786890,20233] 0 2026-03-09T16:15:10.648 INFO:tasks.workunit.client.1.vm05.stdout:8/240: stat d4/d6/db/d54/l56 0 2026-03-09T16:15:10.654 INFO:tasks.workunit.client.1.vm05.stdout:6/262: mkdir d17/d5d 0 2026-03-09T16:15:10.655 INFO:tasks.workunit.client.1.vm05.stdout:6/263: dread - d17/d22/d27/d34/d4b/f5a zero size 2026-03-09T16:15:10.657 INFO:tasks.workunit.client.1.vm05.stdout:7/325: getdents d1/d2/d8/dc/d14 0 2026-03-09T16:15:10.657 INFO:tasks.workunit.client.1.vm05.stdout:9/300: creat d4/f64 x:0 0 0 2026-03-09T16:15:10.657 
INFO:tasks.workunit.client.1.vm05.stdout:4/274: link d5/f3e d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/f63 0 2026-03-09T16:15:10.658 INFO:tasks.workunit.client.1.vm05.stdout:9/301: stat d4/d10/l3d 0 2026-03-09T16:15:10.658 INFO:tasks.workunit.client.1.vm05.stdout:4/275: read d5/de/d15/f25 [2563411,95140] 0 2026-03-09T16:15:10.661 INFO:tasks.workunit.client.1.vm05.stdout:4/276: dread d5/de/d15/d21/d31/f49 [0,4194304] 0 2026-03-09T16:15:10.661 INFO:tasks.workunit.client.1.vm05.stdout:4/277: read - d5/f59 zero size 2026-03-09T16:15:10.662 INFO:tasks.workunit.client.1.vm05.stdout:4/278: truncate d5/d19/d37/d60/f62 74242 0 2026-03-09T16:15:10.668 INFO:tasks.workunit.client.1.vm05.stdout:5/296: creat d8/d59/d5b/f66 x:0 0 0 2026-03-09T16:15:10.669 INFO:tasks.workunit.client.1.vm05.stdout:7/326: symlink d1/d19/d2a/l70 0 2026-03-09T16:15:10.669 INFO:tasks.workunit.client.1.vm05.stdout:9/302: mkdir d4/d10/d35/d2b/d38/d65 0 2026-03-09T16:15:10.670 INFO:tasks.workunit.client.1.vm05.stdout:7/327: chown d1/d2/d11 25215338 1 2026-03-09T16:15:10.671 INFO:tasks.workunit.client.1.vm05.stdout:4/279: fdatasync d5/d19/f48 0 2026-03-09T16:15:10.675 INFO:tasks.workunit.client.1.vm05.stdout:1/313: getdents d7/dd/d21/d44/d5c 0 2026-03-09T16:15:10.675 INFO:tasks.workunit.client.1.vm05.stdout:0/323: creat d5/db/d5f/f65 x:0 0 0 2026-03-09T16:15:10.676 INFO:tasks.workunit.client.1.vm05.stdout:1/314: fsync d7/d27/f4c 0 2026-03-09T16:15:10.677 INFO:tasks.workunit.client.1.vm05.stdout:1/315: fdatasync d7/dd/de/d52/f58 0 2026-03-09T16:15:10.677 INFO:tasks.workunit.client.1.vm05.stdout:9/303: dwrite d4/d10/f18 [0,4194304] 0 2026-03-09T16:15:10.679 INFO:tasks.workunit.client.1.vm05.stdout:4/280: creat d5/de/d15/d21/d31/f64 x:0 0 0 2026-03-09T16:15:10.683 INFO:tasks.workunit.client.1.vm05.stdout:2/300: rename db/dd/d15/d1f/d20/d23/c11 to db/dd/d15/c68 0 2026-03-09T16:15:10.683 INFO:tasks.workunit.client.1.vm05.stdout:3/229: write d0/f46 [1544422,95856] 0 2026-03-09T16:15:10.691 INFO:tasks.workunit.client.1.vm05.stdout:3/230: chown d0/l42 482513 1 2026-03-09T16:15:10.692 INFO:tasks.workunit.client.1.vm05.stdout:1/316: mkdir d7/d62/d72 0 2026-03-09T16:15:10.692 INFO:tasks.workunit.client.1.vm05.stdout:5/297: mknod d8/d18/c67 0 2026-03-09T16:15:10.692 INFO:tasks.workunit.client.1.vm05.stdout:9/304: unlink d4/d10/d35/d36/d48/d54/f5a 0 2026-03-09T16:15:10.692 INFO:tasks.workunit.client.1.vm05.stdout:5/298: read d8/d18/d1b/d2e/d43/f41 [3262016,79373] 0 2026-03-09T16:15:10.693 INFO:tasks.workunit.client.1.vm05.stdout:4/281: mknod d5/de/d15/d21/d31/c65 0 2026-03-09T16:15:10.696 INFO:tasks.workunit.client.1.vm05.stdout:8/241: rename d4/d6/d3a/d40/l4d to d4/d6/db/df/d4f/l5b 0 2026-03-09T16:15:10.697 INFO:tasks.workunit.client.1.vm05.stdout:8/242: truncate d4/d6/f24 4762666 0 2026-03-09T16:15:10.697 INFO:tasks.workunit.client.1.vm05.stdout:8/243: chown d4/d6/f9 12 1 2026-03-09T16:15:10.699 INFO:tasks.workunit.client.1.vm05.stdout:8/244: dread d4/d6/db/df/f18 [0,4194304] 0 2026-03-09T16:15:10.699 INFO:tasks.workunit.client.1.vm05.stdout:3/231: chown d0/d9/f2b 1588822 1 2026-03-09T16:15:10.700 INFO:tasks.workunit.client.1.vm05.stdout:6/264: write d17/d22/d27/d44/f5c [3716566,50437] 0 2026-03-09T16:15:10.700 INFO:tasks.workunit.client.1.vm05.stdout:1/317: symlink d7/dd/d21/d39/d5a/l73 0 2026-03-09T16:15:10.704 INFO:tasks.workunit.client.1.vm05.stdout:5/299: mkdir d8/d18/d1b/d47/d68 0 2026-03-09T16:15:10.705 INFO:tasks.workunit.client.1.vm05.stdout:4/282: mknod d5/de/d15/d21/d27/d3c/c66 0 2026-03-09T16:15:10.706 
INFO:tasks.workunit.client.1.vm05.stdout:3/232: dread d0/fd [0,4194304] 0 2026-03-09T16:15:10.706 INFO:tasks.workunit.client.1.vm05.stdout:2/301: unlink db/dd/d15/d1f/d20/c2e 0 2026-03-09T16:15:10.707 INFO:tasks.workunit.client.1.vm05.stdout:8/245: mknod d4/d6/db/df/d4f/c5c 0 2026-03-09T16:15:10.708 INFO:tasks.workunit.client.1.vm05.stdout:8/246: dread - d4/d6/d3a/f49 zero size 2026-03-09T16:15:10.709 INFO:tasks.workunit.client.1.vm05.stdout:8/247: dread - d4/d6/d53/f5a zero size 2026-03-09T16:15:10.710 INFO:tasks.workunit.client.1.vm05.stdout:4/283: dread d5/de/d15/d21/d27/f29 [0,4194304] 0 2026-03-09T16:15:10.716 INFO:tasks.workunit.client.1.vm05.stdout:0/324: write d5/d1b/f25 [574432,61432] 0 2026-03-09T16:15:10.718 INFO:tasks.workunit.client.1.vm05.stdout:3/233: rename d0/d9/d22/c47 to d0/d9/d22/c48 0 2026-03-09T16:15:10.719 INFO:tasks.workunit.client.1.vm05.stdout:5/300: dwrite d8/d18/d1b/d47/d4e/f4f [0,4194304] 0 2026-03-09T16:15:10.720 INFO:tasks.workunit.client.1.vm05.stdout:4/284: dwrite d5/f35 [0,4194304] 0 2026-03-09T16:15:10.725 INFO:tasks.workunit.client.1.vm05.stdout:3/234: chown d0/l44 2073842 1 2026-03-09T16:15:10.727 INFO:tasks.workunit.client.1.vm05.stdout:8/248: mkdir d4/d6/db/dc/d5d 0 2026-03-09T16:15:10.727 INFO:tasks.workunit.client.1.vm05.stdout:2/302: dwrite db/dd/d15/d4c/d56/f62 [0,4194304] 0 2026-03-09T16:15:10.728 INFO:tasks.workunit.client.1.vm05.stdout:8/249: chown d4 184157 1 2026-03-09T16:15:10.739 INFO:tasks.workunit.client.1.vm05.stdout:7/328: rename d1/d19 to d1/d2/d8/dc/d1b/d71 0 2026-03-09T16:15:10.748 INFO:tasks.workunit.client.1.vm05.stdout:0/325: dwrite d5/f7 [0,4194304] 0 2026-03-09T16:15:10.748 INFO:tasks.workunit.client.1.vm05.stdout:6/265: dread d17/f30 [0,4194304] 0 2026-03-09T16:15:10.748 INFO:tasks.workunit.client.1.vm05.stdout:9/305: getdents d4/d10/d35/d2b 0 2026-03-09T16:15:10.748 INFO:tasks.workunit.client.1.vm05.stdout:4/285: creat d5/de/d15/d21/d31/f67 x:0 0 0 2026-03-09T16:15:10.748 INFO:tasks.workunit.client.1.vm05.stdout:8/250: truncate d4/d6/db/dc/d2e/f47 4890915 0 2026-03-09T16:15:10.748 INFO:tasks.workunit.client.1.vm05.stdout:1/318: rename d7/d27/f4c to d7/d15/d16/f74 0 2026-03-09T16:15:10.748 INFO:tasks.workunit.client.1.vm05.stdout:2/303: creat db/dd/d15/d3f/d5b/f69 x:0 0 0 2026-03-09T16:15:10.748 INFO:tasks.workunit.client.1.vm05.stdout:7/329: rmdir d1/d2/d8/dc/d1b/d30/d4b 39 2026-03-09T16:15:10.749 INFO:tasks.workunit.client.1.vm05.stdout:0/326: mkdir d5/db/d48/d66 0 2026-03-09T16:15:10.751 INFO:tasks.workunit.client.1.vm05.stdout:2/304: chown db 202 1 2026-03-09T16:15:10.755 INFO:tasks.workunit.client.1.vm05.stdout:8/251: dwrite d4/f13 [0,4194304] 0 2026-03-09T16:15:10.764 INFO:tasks.workunit.client.1.vm05.stdout:6/266: symlink d17/d22/d27/d44/l5e 0 2026-03-09T16:15:10.764 INFO:tasks.workunit.client.1.vm05.stdout:2/305: dwrite db/dd/d15/d1f/f24 [4194304,4194304] 0 2026-03-09T16:15:10.764 INFO:tasks.workunit.client.1.vm05.stdout:3/235: creat d0/f49 x:0 0 0 2026-03-09T16:15:10.764 INFO:tasks.workunit.client.1.vm05.stdout:0/327: fdatasync d5/db/f12 0 2026-03-09T16:15:10.770 INFO:tasks.workunit.client.1.vm05.stdout:4/286: fsync d5/de/d15/d21/d39/f44 0 2026-03-09T16:15:10.773 INFO:tasks.workunit.client.1.vm05.stdout:9/306: link d4/d10/d35/d2b/d31/f55 d4/f66 0 2026-03-09T16:15:10.773 INFO:tasks.workunit.client.1.vm05.stdout:3/236: rmdir d0/d33 39 2026-03-09T16:15:10.774 INFO:tasks.workunit.client.1.vm05.stdout:6/267: dwrite d17/d22/d27/d34/d4b/f5a [0,4194304] 0 2026-03-09T16:15:10.780 INFO:tasks.workunit.client.1.vm05.stdout:7/330: 
rename d1/d2/d8/dc/d1b/d71/d2a to d1/d2/d8/dc/d72 0 2026-03-09T16:15:10.780 INFO:tasks.workunit.client.1.vm05.stdout:2/306: mkdir db/dd/d15/d3f/d5b/d60/d6a 0 2026-03-09T16:15:10.780 INFO:tasks.workunit.client.1.vm05.stdout:4/287: creat d5/de/d15/d21/d31/f68 x:0 0 0 2026-03-09T16:15:10.780 INFO:tasks.workunit.client.1.vm05.stdout:0/328: symlink d5/db/d48/d66/l67 0 2026-03-09T16:15:10.781 INFO:tasks.workunit.client.1.vm05.stdout:7/331: mknod d1/d2/d8/dc/d72/c73 0 2026-03-09T16:15:10.782 INFO:tasks.workunit.client.1.vm05.stdout:7/332: write d1/d2/f22 [1414186,43661] 0 2026-03-09T16:15:10.783 INFO:tasks.workunit.client.1.vm05.stdout:3/237: rename d0/d33 to d0/d33/d4a 22 2026-03-09T16:15:10.784 INFO:tasks.workunit.client.1.vm05.stdout:6/268: readlink d17/d1d/l25 0 2026-03-09T16:15:10.786 INFO:tasks.workunit.client.1.vm05.stdout:7/333: creat d1/d2/d8/dc/d1b/d71/f74 x:0 0 0 2026-03-09T16:15:10.786 INFO:tasks.workunit.client.1.vm05.stdout:2/307: write db/dd/d15/d1f/f36 [3427054,90569] 0 2026-03-09T16:15:10.787 INFO:tasks.workunit.client.1.vm05.stdout:6/269: read d17/f1c [4245992,28126] 0 2026-03-09T16:15:10.788 INFO:tasks.workunit.client.1.vm05.stdout:4/288: getdents d5/de/d15/d21/d39/d5d 0 2026-03-09T16:15:10.793 INFO:tasks.workunit.client.1.vm05.stdout:3/238: dwrite d0/f49 [0,4194304] 0 2026-03-09T16:15:10.798 INFO:tasks.workunit.client.1.vm05.stdout:3/239: write d0/f49 [3929064,68327] 0 2026-03-09T16:15:10.798 INFO:tasks.workunit.client.1.vm05.stdout:0/329: dwrite d5/db/f12 [4194304,4194304] 0 2026-03-09T16:15:10.798 INFO:tasks.workunit.client.1.vm05.stdout:6/270: dread d17/f30 [0,4194304] 0 2026-03-09T16:15:10.798 INFO:tasks.workunit.client.1.vm05.stdout:3/240: chown d0/f49 20 1 2026-03-09T16:15:10.800 INFO:tasks.workunit.client.1.vm05.stdout:5/301: sync 2026-03-09T16:15:10.801 INFO:tasks.workunit.client.1.vm05.stdout:7/334: dwrite d1/d2/d8/dc/d33/f57 [0,4194304] 0 2026-03-09T16:15:10.801 INFO:tasks.workunit.client.1.vm05.stdout:5/302: write d8/d18/d1b/f2a [5102153,99131] 0 2026-03-09T16:15:10.812 INFO:tasks.workunit.client.1.vm05.stdout:6/271: readlink d17/l26 0 2026-03-09T16:15:10.812 INFO:tasks.workunit.client.1.vm05.stdout:5/303: symlink d8/d53/l69 0 2026-03-09T16:15:10.812 INFO:tasks.workunit.client.1.vm05.stdout:7/335: truncate d1/d2/d8/dc/d14/f41 3037417 0 2026-03-09T16:15:10.813 INFO:tasks.workunit.client.1.vm05.stdout:2/308: getdents db 0 2026-03-09T16:15:10.814 INFO:tasks.workunit.client.1.vm05.stdout:2/309: fsync db/dd/d15/d3f/f4a 0 2026-03-09T16:15:10.815 INFO:tasks.workunit.client.1.vm05.stdout:3/241: link d0/d9/d22/f30 d0/d9/f4b 0 2026-03-09T16:15:10.815 INFO:tasks.workunit.client.1.vm05.stdout:6/272: mknod d17/d22/d27/d58/c5f 0 2026-03-09T16:15:10.816 INFO:tasks.workunit.client.1.vm05.stdout:7/336: dread d1/d2/d8/dc/d33/f57 [0,4194304] 0 2026-03-09T16:15:10.816 INFO:tasks.workunit.client.1.vm05.stdout:5/304: rename d8/d18/d1b/d47/d48/l5a to d8/d18/d1b/l6a 0 2026-03-09T16:15:10.817 INFO:tasks.workunit.client.1.vm05.stdout:2/310: creat db/dd/f6b x:0 0 0 2026-03-09T16:15:10.817 INFO:tasks.workunit.client.1.vm05.stdout:7/337: dread d1/d2/d11/f25 [0,4194304] 0 2026-03-09T16:15:10.817 INFO:tasks.workunit.client.1.vm05.stdout:5/305: dread - d8/d18/d1b/d47/d4e/f57 zero size 2026-03-09T16:15:10.820 INFO:tasks.workunit.client.1.vm05.stdout:3/242: truncate d0/d9/d22/f30 6301225 0 2026-03-09T16:15:10.822 INFO:tasks.workunit.client.1.vm05.stdout:5/306: dwrite d8/d18/d1b/d2e/f3c [0,4194304] 0 2026-03-09T16:15:10.822 INFO:tasks.workunit.client.1.vm05.stdout:6/273: rename d17/d22/f2c to d17/f60 
0 2026-03-09T16:15:10.822 INFO:tasks.workunit.client.1.vm05.stdout:7/338: mknod d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/c75 0 2026-03-09T16:15:10.824 INFO:tasks.workunit.client.1.vm05.stdout:3/243: dread d0/d9/f2c [0,4194304] 0 2026-03-09T16:15:10.825 INFO:tasks.workunit.client.1.vm05.stdout:7/339: mkdir d1/d2/d8/d67/d76 0 2026-03-09T16:15:10.826 INFO:tasks.workunit.client.1.vm05.stdout:6/274: symlink d17/l61 0 2026-03-09T16:15:10.826 INFO:tasks.workunit.client.1.vm05.stdout:2/311: rename db/dd/d15/d1f/d20/l65 to db/dd/l6c 0 2026-03-09T16:15:10.827 INFO:tasks.workunit.client.1.vm05.stdout:7/340: chown d1/d2/d8/dc/d1b/f66 156 1 2026-03-09T16:15:10.828 INFO:tasks.workunit.client.1.vm05.stdout:3/244: dread d0/fd [0,4194304] 0 2026-03-09T16:15:10.830 INFO:tasks.workunit.client.1.vm05.stdout:5/307: dwrite d8/d18/d1b/d2e/d43/f41 [0,4194304] 0 2026-03-09T16:15:10.832 INFO:tasks.workunit.client.1.vm05.stdout:2/312: dwrite db/dd/d15/f51 [0,4194304] 0 2026-03-09T16:15:10.842 INFO:tasks.workunit.client.1.vm05.stdout:6/275: mknod d17/d5d/c62 0 2026-03-09T16:15:10.857 INFO:tasks.workunit.client.1.vm05.stdout:3/245: dwrite d0/d9/f37 [0,4194304] 0 2026-03-09T16:15:10.857 INFO:tasks.workunit.client.1.vm05.stdout:6/276: creat d17/d22/d27/f63 x:0 0 0 2026-03-09T16:15:10.857 INFO:tasks.workunit.client.1.vm05.stdout:5/308: mkdir d8/d18/d1b/d6b 0 2026-03-09T16:15:10.858 INFO:tasks.workunit.client.1.vm05.stdout:7/341: link d1/d2/d11/c43 d1/d2/d8/d31/c77 0 2026-03-09T16:15:10.858 INFO:tasks.workunit.client.1.vm05.stdout:3/246: dwrite d0/fd [0,4194304] 0 2026-03-09T16:15:10.858 INFO:tasks.workunit.client.1.vm05.stdout:7/342: read d1/d2/d8/dc/f1e [621510,88061] 0 2026-03-09T16:15:10.858 INFO:tasks.workunit.client.1.vm05.stdout:5/309: stat d8/d18/c54 0 2026-03-09T16:15:10.858 INFO:tasks.workunit.client.1.vm05.stdout:7/343: dwrite d1/d2/d8/dc/d1b/d71/f74 [0,4194304] 0 2026-03-09T16:15:10.858 INFO:tasks.workunit.client.1.vm05.stdout:7/344: dread - d1/d2/d8/dc/d1b/f66 zero size 2026-03-09T16:15:10.858 INFO:tasks.workunit.client.1.vm05.stdout:5/310: dwrite d8/d18/d1b/d2e/f52 [0,4194304] 0 2026-03-09T16:15:10.871 INFO:tasks.workunit.client.1.vm05.stdout:5/311: mknod d8/d18/d1b/d6b/c6c 0 2026-03-09T16:15:10.903 INFO:tasks.workunit.client.1.vm05.stdout:7/345: link d1/d2/d8/dc/d1b/d30/d4b/d65/c4d d1/d2/d11/c78 0 2026-03-09T16:15:10.903 INFO:tasks.workunit.client.1.vm05.stdout:5/312: stat d8/d18/f20 0 2026-03-09T16:15:10.903 INFO:tasks.workunit.client.1.vm05.stdout:5/313: rename d8/l24 to d8/d18/d1b/d47/d68/l6d 0 2026-03-09T16:15:10.903 INFO:tasks.workunit.client.1.vm05.stdout:5/314: chown d8/c12 11702 1 2026-03-09T16:15:10.903 INFO:tasks.workunit.client.1.vm05.stdout:7/346: rename d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/c64 to d1/d2/d8/dc/d1b/d30/c79 0 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:5/315: mknod d8/d18/d1b/d2e/d43/c6e 0 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:7/347: dread - d1/d2/d8/dc/d1b/f66 zero size 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:5/316: write d8/d18/d1b/f31 [2052188,116710] 0 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:7/348: creat d1/d2/d8/dc/d1b/d30/d4b/f7a x:0 0 0 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:5/317: creat d8/f6f x:0 0 0 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:7/349: dread - d1/d2/d8/dc/d1b/d71/d3c/f60 zero size 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:5/318: creat d8/d18/d1b/d47/d68/f70 x:0 0 0 2026-03-09T16:15:10.904 
INFO:tasks.workunit.client.1.vm05.stdout:7/350: dread - d1/d2/d8/dc/d1b/f62 zero size 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:5/319: mknod d8/d18/c71 0 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:7/351: getdents d1/d2/d8/dc/d18/d5d 0 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:5/320: dwrite d8/d59/d5b/f66 [0,4194304] 0 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:7/352: dread - d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/f49 zero size 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:5/321: rename d8/fd to d8/d5e/f72 0 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:5/322: mkdir d8/d18/d1b/d47/d48/d73 0 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:9/307: sync 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:8/252: sync 2026-03-09T16:15:10.904 INFO:tasks.workunit.client.1.vm05.stdout:9/308: creat d4/d10/d35/d36/f67 x:0 0 0 2026-03-09T16:15:10.906 INFO:tasks.workunit.client.1.vm05.stdout:9/309: creat d4/d10/d35/d36/d48/f68 x:0 0 0 2026-03-09T16:15:10.906 INFO:tasks.workunit.client.1.vm05.stdout:9/310: dread - d4/d10/d35/d36/f67 zero size 2026-03-09T16:15:10.909 INFO:tasks.workunit.client.1.vm05.stdout:9/311: link d4/d10/d35/d36/d48/d4c/c51 d4/d10/d35/d36/c69 0 2026-03-09T16:15:10.911 INFO:tasks.workunit.client.1.vm05.stdout:9/312: rename d4/f64 to d4/d10/d35/d2b/d38/d65/f6a 0 2026-03-09T16:15:10.913 INFO:tasks.workunit.client.1.vm05.stdout:9/313: rename d4/d10/d35/d2b/f34 to d4/f6b 0 2026-03-09T16:15:10.915 INFO:tasks.workunit.client.1.vm05.stdout:9/314: creat d4/d10/d35/d36/d48/d60/f6c x:0 0 0 2026-03-09T16:15:10.918 INFO:tasks.workunit.client.1.vm05.stdout:9/315: dwrite d4/f5b [0,4194304] 0 2026-03-09T16:15:10.927 INFO:tasks.workunit.client.1.vm05.stdout:9/316: truncate d4/f61 1043349 0 2026-03-09T16:15:10.941 INFO:tasks.workunit.client.1.vm05.stdout:9/317: write f2 [1314614,9357] 0 2026-03-09T16:15:10.942 INFO:tasks.workunit.client.1.vm05.stdout:9/318: symlink d4/d10/d35/d36/d48/d54/d59/l6d 0 2026-03-09T16:15:10.942 INFO:tasks.workunit.client.1.vm05.stdout:9/319: chown d4/f66 3408 1 2026-03-09T16:15:10.944 INFO:tasks.workunit.client.1.vm05.stdout:2/313: sync 2026-03-09T16:15:10.947 INFO:tasks.workunit.client.1.vm05.stdout:9/320: link d4/d10/d35/d2b/f45 d4/d10/d35/d36/d48/f6e 0 2026-03-09T16:15:10.953 INFO:tasks.workunit.client.1.vm05.stdout:2/314: getdents db/dd/d15/d4c 0 2026-03-09T16:15:10.971 INFO:tasks.workunit.client.1.vm05.stdout:9/321: truncate d4/d10/f15 2770238 0 2026-03-09T16:15:10.971 INFO:tasks.workunit.client.1.vm05.stdout:9/322: dread - d4/d10/d35/d2b/d38/d65/f6a zero size 2026-03-09T16:15:10.971 INFO:tasks.workunit.client.1.vm05.stdout:9/323: readlink d4/l1f 0 2026-03-09T16:15:10.971 INFO:tasks.workunit.client.1.vm05.stdout:9/324: getdents d4/d10/d35/d36/d48/d54 0 2026-03-09T16:15:10.971 INFO:tasks.workunit.client.1.vm05.stdout:9/325: write d4/d10/d35/d36/d48/d60/f6c [853393,35424] 0 2026-03-09T16:15:10.971 INFO:tasks.workunit.client.1.vm05.stdout:9/326: link d4/d10/c46 d4/d10/d35/d36/d48/d54/c6f 0 2026-03-09T16:15:10.971 INFO:tasks.workunit.client.1.vm05.stdout:9/327: dread - d4/d10/d35/d2b/f45 zero size 2026-03-09T16:15:10.971 INFO:tasks.workunit.client.1.vm05.stdout:9/328: mkdir d4/d70 0 2026-03-09T16:15:10.971 INFO:tasks.workunit.client.1.vm05.stdout:9/329: rename d4/d10/d35/d36/d48/d4c/c51 to d4/d10/d35/d2b/d38/d65/c71 0 2026-03-09T16:15:10.971 INFO:tasks.workunit.client.1.vm05.stdout:9/330: mknod d4/d10/d35/d36/c72 0 
2026-03-09T16:15:10.971 INFO:tasks.workunit.client.1.vm05.stdout:9/331: write d4/d10/d35/d2b/d31/f55 [336999,51725] 0 2026-03-09T16:15:10.971 INFO:tasks.workunit.client.1.vm05.stdout:9/332: unlink d4/d10/d35/d2b/d31/c37 0 2026-03-09T16:15:10.971 INFO:tasks.workunit.client.1.vm05.stdout:9/333: dread d4/f61 [0,4194304] 0 2026-03-09T16:15:10.972 INFO:tasks.workunit.client.1.vm05.stdout:9/334: mknod d4/d10/d35/d36/d48/d60/c73 0 2026-03-09T16:15:10.973 INFO:tasks.workunit.client.1.vm05.stdout:9/335: chown d4/f6 11709 1 2026-03-09T16:15:10.974 INFO:tasks.workunit.client.1.vm05.stdout:9/336: mknod d4/d10/d35/d2b/c74 0 2026-03-09T16:15:10.975 INFO:tasks.workunit.client.1.vm05.stdout:9/337: fdatasync d4/d10/d35/d2b/d38/f62 0 2026-03-09T16:15:10.988 INFO:tasks.workunit.client.1.vm05.stdout:9/338: dwrite d4/d10/d35/d2b/f2c [0,4194304] 0 2026-03-09T16:15:10.991 INFO:tasks.workunit.client.1.vm05.stdout:9/339: creat d4/d10/f75 x:0 0 0 2026-03-09T16:15:10.992 INFO:tasks.workunit.client.1.vm05.stdout:9/340: creat d4/d10/d35/d2b/d31/f76 x:0 0 0 2026-03-09T16:15:10.993 INFO:tasks.workunit.client.1.vm05.stdout:9/341: creat d4/d10/d35/d36/f77 x:0 0 0 2026-03-09T16:15:11.030 INFO:tasks.workunit.client.1.vm05.stdout:4/289: dread d5/de/d15/f38 [0,4194304] 0 2026-03-09T16:15:11.033 INFO:tasks.workunit.client.1.vm05.stdout:9/342: sync 2026-03-09T16:15:11.039 INFO:tasks.workunit.client.1.vm05.stdout:4/290: dwrite d5/f59 [0,4194304] 0 2026-03-09T16:15:11.039 INFO:tasks.workunit.client.1.vm05.stdout:9/343: creat d4/d10/d35/d2b/d38/f78 x:0 0 0 2026-03-09T16:15:11.039 INFO:tasks.workunit.client.1.vm05.stdout:9/344: write d4/d10/d35/d2b/d38/f4b [586294,79558] 0 2026-03-09T16:15:11.048 INFO:tasks.workunit.client.1.vm05.stdout:9/345: dwrite d4/d10/d35/d2b/d38/f62 [0,4194304] 0 2026-03-09T16:15:11.051 INFO:tasks.workunit.client.1.vm05.stdout:9/346: chown d4/d10/f75 83679 1 2026-03-09T16:15:11.093 INFO:tasks.workunit.client.1.vm05.stdout:4/291: sync 2026-03-09T16:15:11.096 INFO:tasks.workunit.client.1.vm05.stdout:4/292: rename d5/d19/l33 to d5/d19/l69 0 2026-03-09T16:15:11.097 INFO:tasks.workunit.client.1.vm05.stdout:4/293: truncate d5/de/f23 1390635 0 2026-03-09T16:15:11.099 INFO:tasks.workunit.client.1.vm05.stdout:4/294: creat d5/de/d15/d21/d39/d5d/f6a x:0 0 0 2026-03-09T16:15:11.103 INFO:tasks.workunit.client.1.vm05.stdout:9/347: sync 2026-03-09T16:15:11.108 INFO:tasks.workunit.client.1.vm05.stdout:9/348: dwrite d4/d10/f52 [0,4194304] 0 2026-03-09T16:15:11.109 INFO:tasks.workunit.client.1.vm05.stdout:9/349: chown d4/d10/d35/d2b/d38/f78 55 1 2026-03-09T16:15:11.109 INFO:tasks.workunit.client.1.vm05.stdout:9/350: stat d4/f5b 0 2026-03-09T16:15:11.110 INFO:tasks.workunit.client.1.vm05.stdout:9/351: write f2 [3996498,47280] 0 2026-03-09T16:15:11.116 INFO:tasks.workunit.client.1.vm05.stdout:9/352: truncate d4/f3c 746197 0 2026-03-09T16:15:11.119 INFO:tasks.workunit.client.1.vm05.stdout:9/353: rename d4/d10/d35/d36/l39 to d4/d10/d35/d36/d48/d54/d59/l79 0 2026-03-09T16:15:11.121 INFO:tasks.workunit.client.1.vm05.stdout:9/354: rmdir d4/d70 0 2026-03-09T16:15:11.126 INFO:tasks.workunit.client.1.vm05.stdout:9/355: dwrite d4/d10/f22 [0,4194304] 0 2026-03-09T16:15:11.130 INFO:tasks.workunit.client.1.vm05.stdout:9/356: creat d4/d10/d35/d36/d48/d54/d59/f7a x:0 0 0 2026-03-09T16:15:11.130 INFO:tasks.workunit.client.1.vm05.stdout:9/357: chown d4 243876 1 2026-03-09T16:15:11.131 INFO:tasks.workunit.client.1.vm05.stdout:9/358: stat d4/c47 0 2026-03-09T16:15:11.132 INFO:tasks.workunit.client.1.vm05.stdout:9/359: symlink 
d4/d10/d35/d36/d48/d54/l7b 0 2026-03-09T16:15:11.133 INFO:tasks.workunit.client.1.vm05.stdout:9/360: chown d4/d10/d35/d2b/f45 12049778 1 2026-03-09T16:15:11.133 INFO:tasks.workunit.client.1.vm05.stdout:9/361: stat d4/d10/d35/d36/d48/l4e 0 2026-03-09T16:15:11.134 INFO:tasks.workunit.client.1.vm05.stdout:9/362: chown d4/d10/d35/d2b/d38/f4b 8 1 2026-03-09T16:15:11.180 INFO:tasks.workunit.client.1.vm05.stdout:8/253: read d4/d6/db/dc/f30 [1525812,15323] 0 2026-03-09T16:15:11.180 INFO:tasks.workunit.client.1.vm05.stdout:4/295: read d5/de/d15/d21/d27/d3c/f4d [236063,10826] 0 2026-03-09T16:15:11.181 INFO:tasks.workunit.client.1.vm05.stdout:8/254: creat d4/d6/db/f5e x:0 0 0 2026-03-09T16:15:11.187 INFO:tasks.workunit.client.1.vm05.stdout:8/255: creat d4/d6/f5f x:0 0 0 2026-03-09T16:15:11.193 INFO:tasks.workunit.client.1.vm05.stdout:4/296: dwrite d5/de/d15/d21/d39/f44 [0,4194304] 0 2026-03-09T16:15:11.194 INFO:tasks.workunit.client.1.vm05.stdout:8/256: creat d4/d6/db/d59/f60 x:0 0 0 2026-03-09T16:15:11.194 INFO:tasks.workunit.client.1.vm05.stdout:8/257: unlink d4/d6/f24 0 2026-03-09T16:15:11.201 INFO:tasks.workunit.client.1.vm05.stdout:5/323: dread d8/d18/d1b/f32 [0,4194304] 0 2026-03-09T16:15:11.203 INFO:tasks.workunit.client.1.vm05.stdout:8/258: getdents d4/d6/db/dc 0 2026-03-09T16:15:11.203 INFO:tasks.workunit.client.1.vm05.stdout:8/259: chown d4/c16 2775 1 2026-03-09T16:15:11.205 INFO:tasks.workunit.client.1.vm05.stdout:5/324: link d8/d18/c27 d8/d59/d5b/c74 0 2026-03-09T16:15:11.205 INFO:tasks.workunit.client.1.vm05.stdout:5/325: chown d8/d18/d1b/l42 520070953 1 2026-03-09T16:15:11.207 INFO:tasks.workunit.client.1.vm05.stdout:5/326: mkdir d8/d59/d75 0 2026-03-09T16:15:11.208 INFO:tasks.workunit.client.1.vm05.stdout:5/327: mkdir d8/d18/d1b/d47/d4e/d76 0 2026-03-09T16:15:11.214 INFO:tasks.workunit.client.1.vm05.stdout:5/328: rename d8/d18/c71 to d8/d18/c77 0 2026-03-09T16:15:11.217 INFO:tasks.workunit.client.1.vm05.stdout:5/329: dread d8/d59/d5b/f66 [0,4194304] 0 2026-03-09T16:15:11.219 INFO:tasks.workunit.client.1.vm05.stdout:5/330: mkdir d8/d18/d1b/d78 0 2026-03-09T16:15:11.236 INFO:tasks.workunit.client.1.vm05.stdout:5/331: dread d8/d18/d1b/d2e/f35 [0,4194304] 0 2026-03-09T16:15:11.245 INFO:tasks.workunit.client.1.vm05.stdout:8/260: sync 2026-03-09T16:15:11.246 INFO:tasks.workunit.client.1.vm05.stdout:8/261: readlink d4/d6/db/df/d4f/l5b 0 2026-03-09T16:15:11.249 INFO:tasks.workunit.client.1.vm05.stdout:8/262: getdents d4/d6/d53 0 2026-03-09T16:15:11.249 INFO:tasks.workunit.client.1.vm05.stdout:8/263: readlink d4/d6/db/dc/l48 0 2026-03-09T16:15:11.251 INFO:tasks.workunit.client.1.vm05.stdout:8/264: symlink d4/d6/db/l61 0 2026-03-09T16:15:11.255 INFO:tasks.workunit.client.1.vm05.stdout:2/315: fdatasync db/dd/d15/d1f/f25 0 2026-03-09T16:15:11.268 INFO:tasks.workunit.client.1.vm05.stdout:1/319: dwrite d7/f9 [0,4194304] 0 2026-03-09T16:15:11.298 INFO:tasks.workunit.client.1.vm05.stdout:1/320: mknod d7/dd/d21/d39/d48/d5d/c75 0 2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:1/321: mknod d7/dd/d21/d44/d5c/c76 0 2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:1/322: symlink d7/d62/d72/l77 0 2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:0/330: dread d5/d1b/d30/f29 [0,4194304] 0 2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:0/331: stat d5/d1b/d3b 0 2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:1/323: chown d7/fc 672 1 2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:0/332: mkdir d5/d11/d4f/d68 0 
2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:0/333: read d5/f5c [1176326,75972] 0 2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:1/324: mkdir d7/dd/d21/d44/d5c/d78 0 2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:5/332: dread d8/d18/d1b/f2a [0,4194304] 0 2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:0/334: creat d5/db/d5b/f69 x:0 0 0 2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:1/325: creat d7/d62/d72/f79 x:0 0 0 2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:1/326: readlink d7/d15/d16/l2c 0 2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:0/335: rename d5/db/d1d/f4a to d5/d1b/f6a 0 2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:5/333: rename d8/l17 to d8/l79 0 2026-03-09T16:15:11.299 INFO:tasks.workunit.client.1.vm05.stdout:5/334: fdatasync d8/d18/d1b/f2d 0 2026-03-09T16:15:11.300 INFO:tasks.workunit.client.1.vm05.stdout:5/335: write d8/d18/d1b/d47/d48/f61 [719442,38495] 0 2026-03-09T16:15:11.302 INFO:tasks.workunit.client.1.vm05.stdout:0/336: unlink d5/d2c/l5e 0 2026-03-09T16:15:11.302 INFO:tasks.workunit.client.1.vm05.stdout:5/336: mkdir d8/d53/d7a 0 2026-03-09T16:15:11.310 INFO:tasks.workunit.client.1.vm05.stdout:5/337: truncate d8/d18/d1b/f2c 1469291 0 2026-03-09T16:15:11.397 INFO:tasks.workunit.client.1.vm05.stdout:0/337: dread d5/db/d1d/f2e [0,4194304] 0 2026-03-09T16:15:11.398 INFO:tasks.workunit.client.1.vm05.stdout:0/338: write d5/db/d1d/f2e [4760807,12574] 0 2026-03-09T16:15:11.398 INFO:tasks.workunit.client.1.vm05.stdout:0/339: dread - d5/d2c/f63 zero size 2026-03-09T16:15:11.400 INFO:tasks.workunit.client.1.vm05.stdout:0/340: creat d5/d11/d4f/d68/f6b x:0 0 0 2026-03-09T16:15:11.402 INFO:tasks.workunit.client.1.vm05.stdout:0/341: rename d5/db/d1d/l22 to d5/db/d48/d66/l6c 0 2026-03-09T16:15:11.405 INFO:tasks.workunit.client.1.vm05.stdout:0/342: truncate d5/db/d5b/f35 3081949 0 2026-03-09T16:15:11.405 INFO:tasks.workunit.client.1.vm05.stdout:0/343: chown d5/db/d1d/f64 12192 1 2026-03-09T16:15:11.405 INFO:tasks.workunit.client.1.vm05.stdout:2/316: getdents db/dd/d15/d1f/d20 0 2026-03-09T16:15:11.408 INFO:tasks.workunit.client.1.vm05.stdout:6/277: write d17/f1a [28806,103753] 0 2026-03-09T16:15:11.412 INFO:tasks.workunit.client.1.vm05.stdout:0/344: dwrite d5/d1b/d30/f55 [0,4194304] 0 2026-03-09T16:15:11.412 INFO:tasks.workunit.client.1.vm05.stdout:6/278: symlink d17/d22/d27/d34/l64 0 2026-03-09T16:15:11.412 INFO:tasks.workunit.client.1.vm05.stdout:0/345: truncate d5/d2c/f3a 337679 0 2026-03-09T16:15:11.415 INFO:tasks.workunit.client.1.vm05.stdout:6/279: unlink d17/d1d/f38 0 2026-03-09T16:15:11.415 INFO:tasks.workunit.client.1.vm05.stdout:2/317: creat db/dd/f6d x:0 0 0 2026-03-09T16:15:11.416 INFO:tasks.workunit.client.1.vm05.stdout:0/346: stat d5/d1b/d3b/f3c 0 2026-03-09T16:15:11.420 INFO:tasks.workunit.client.1.vm05.stdout:6/280: stat d17/d1d/c2b 0 2026-03-09T16:15:11.430 INFO:tasks.workunit.client.1.vm05.stdout:0/347: creat d5/db/d1d/f6d x:0 0 0 2026-03-09T16:15:11.431 INFO:tasks.workunit.client.1.vm05.stdout:7/353: dwrite d1/d2/d8/dc/d1b/f62 [0,4194304] 0 2026-03-09T16:15:11.431 INFO:tasks.workunit.client.1.vm05.stdout:7/354: write d1/d2/d8/dc/d1b/f5a [1973597,43499] 0 2026-03-09T16:15:11.431 INFO:tasks.workunit.client.1.vm05.stdout:7/355: dread - d1/d2/d8/dc/d1b/d71/d3c/f6e zero size 2026-03-09T16:15:11.431 INFO:tasks.workunit.client.1.vm05.stdout:2/318: read db/dd/d15/d3f/f4a [3739600,31548] 0 2026-03-09T16:15:11.436 
INFO:tasks.workunit.client.1.vm05.stdout:6/281: mkdir d17/d22/d27/d34/d42/d65 0 2026-03-09T16:15:11.436 INFO:tasks.workunit.client.1.vm05.stdout:6/282: chown d17/d22/d27 4851 1 2026-03-09T16:15:11.438 INFO:tasks.workunit.client.1.vm05.stdout:0/348: dwrite d5/d2c/f41 [0,4194304] 0 2026-03-09T16:15:11.443 INFO:tasks.workunit.client.1.vm05.stdout:2/319: write db/dd/d15/d4c/f58 [1437206,130514] 0 2026-03-09T16:15:11.448 INFO:tasks.workunit.client.1.vm05.stdout:0/349: chown d5/d1b/d30/f2f 3 1 2026-03-09T16:15:11.448 INFO:tasks.workunit.client.1.vm05.stdout:6/283: dwrite d17/d22/d27/f63 [0,4194304] 0 2026-03-09T16:15:11.448 INFO:tasks.workunit.client.1.vm05.stdout:2/320: write db/dd/d15/d1f/d21/f5d [627449,56553] 0 2026-03-09T16:15:11.457 INFO:tasks.workunit.client.1.vm05.stdout:7/356: mknod d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/c7b 0 2026-03-09T16:15:11.460 INFO:tasks.workunit.client.1.vm05.stdout:6/284: creat d17/d22/d27/d34/f66 x:0 0 0 2026-03-09T16:15:11.463 INFO:tasks.workunit.client.1.vm05.stdout:9/363: rmdir d4/d10/d35/d36/d48 39 2026-03-09T16:15:11.464 INFO:tasks.workunit.client.1.vm05.stdout:6/285: rename d17/d22/d27/f63 to d17/d1d/f67 0 2026-03-09T16:15:11.464 INFO:tasks.workunit.client.1.vm05.stdout:6/286: chown d17/d22/d27/f2a 66243 1 2026-03-09T16:15:11.468 INFO:tasks.workunit.client.1.vm05.stdout:6/287: dwrite d17/d22/d27/d44/f48 [0,4194304] 0 2026-03-09T16:15:11.470 INFO:tasks.workunit.client.1.vm05.stdout:6/288: readlink d17/l51 0 2026-03-09T16:15:11.470 INFO:tasks.workunit.client.1.vm05.stdout:6/289: readlink d17/l26 0 2026-03-09T16:15:11.470 INFO:tasks.workunit.client.1.vm05.stdout:7/357: symlink d1/l7c 0 2026-03-09T16:15:11.471 INFO:tasks.workunit.client.1.vm05.stdout:0/350: link d5/d1b/f47 d5/db/f6e 0 2026-03-09T16:15:11.473 INFO:tasks.workunit.client.1.vm05.stdout:4/297: dwrite d5/de/d15/d21/d27/d3c/f3d [0,4194304] 0 2026-03-09T16:15:11.476 INFO:tasks.workunit.client.1.vm05.stdout:9/364: creat d4/d10/d35/f7c x:0 0 0 2026-03-09T16:15:11.477 INFO:tasks.workunit.client.1.vm05.stdout:6/290: mkdir d17/d22/d27/d34/d42/d68 0 2026-03-09T16:15:11.478 INFO:tasks.workunit.client.1.vm05.stdout:0/351: write d5/db/d5f/f65 [260482,32052] 0 2026-03-09T16:15:11.478 INFO:tasks.workunit.client.1.vm05.stdout:4/298: mknod d5/de/d15/d21/d27/d3c/c6b 0 2026-03-09T16:15:11.480 INFO:tasks.workunit.client.1.vm05.stdout:7/358: mkdir d1/d2/d8/dc/d1b/d30/d7d 0 2026-03-09T16:15:11.481 INFO:tasks.workunit.client.1.vm05.stdout:6/291: creat d17/d22/d27/d34/d42/d53/f69 x:0 0 0 2026-03-09T16:15:11.484 INFO:tasks.workunit.client.1.vm05.stdout:0/352: creat d5/d1b/d3b/f6f x:0 0 0 2026-03-09T16:15:11.485 INFO:tasks.workunit.client.1.vm05.stdout:6/292: truncate d17/d22/d27/d34/d42/d53/f69 337157 0 2026-03-09T16:15:11.486 INFO:tasks.workunit.client.1.vm05.stdout:7/359: chown d1/d2/d8/lb 115032526 1 2026-03-09T16:15:11.486 INFO:tasks.workunit.client.1.vm05.stdout:4/299: creat d5/d19/d37/d60/f6c x:0 0 0 2026-03-09T16:15:11.489 INFO:tasks.workunit.client.1.vm05.stdout:6/293: rename d17/l3f to d17/d22/d27/d34/d42/d53/l6a 0 2026-03-09T16:15:11.491 INFO:tasks.workunit.client.1.vm05.stdout:0/353: dread d5/d2c/f41 [0,4194304] 0 2026-03-09T16:15:11.491 INFO:tasks.workunit.client.1.vm05.stdout:6/294: rename d17/d22/d27/d44/f5c to d17/d22/d27/f6b 0 2026-03-09T16:15:11.492 INFO:tasks.workunit.client.1.vm05.stdout:0/354: write d5/d2c/f28 [1782101,62327] 0 2026-03-09T16:15:11.493 INFO:tasks.workunit.client.1.vm05.stdout:4/300: dwrite d5/d19/d37/d60/f6c [0,4194304] 0 2026-03-09T16:15:11.498 
INFO:tasks.workunit.client.1.vm05.stdout:6/295: creat d17/d22/d27/d34/d4b/f6c x:0 0 0 2026-03-09T16:15:11.500 INFO:tasks.workunit.client.1.vm05.stdout:6/296: creat d17/d22/d27/d34/d4b/f6d x:0 0 0 2026-03-09T16:15:11.501 INFO:tasks.workunit.client.1.vm05.stdout:4/301: creat d5/de/d15/d21/f6d x:0 0 0 2026-03-09T16:15:11.502 INFO:tasks.workunit.client.1.vm05.stdout:6/297: creat d17/d22/d27/d34/f6e x:0 0 0 2026-03-09T16:15:11.502 INFO:tasks.workunit.client.1.vm05.stdout:6/298: chown d17/d22/d27/d44/l5e 1324307 1 2026-03-09T16:15:11.502 INFO:tasks.workunit.client.1.vm05.stdout:4/302: mknod d5/c6e 0 2026-03-09T16:15:11.503 INFO:tasks.workunit.client.1.vm05.stdout:6/299: symlink d17/d4f/l6f 0 2026-03-09T16:15:11.504 INFO:tasks.workunit.client.1.vm05.stdout:6/300: fdatasync d17/d22/f3d 0 2026-03-09T16:15:11.505 INFO:tasks.workunit.client.1.vm05.stdout:4/303: mknod d5/d19/c6f 0 2026-03-09T16:15:11.507 INFO:tasks.workunit.client.1.vm05.stdout:8/265: write d4/d6/db/dc/f17 [2855665,98511] 0 2026-03-09T16:15:11.509 INFO:tasks.workunit.client.1.vm05.stdout:6/301: creat d17/d4f/f70 x:0 0 0 2026-03-09T16:15:11.512 INFO:tasks.workunit.client.1.vm05.stdout:6/302: creat d17/d5d/f71 x:0 0 0 2026-03-09T16:15:11.513 INFO:tasks.workunit.client.1.vm05.stdout:8/266: symlink d4/d6/db/dc/d5d/l62 0 2026-03-09T16:15:11.513 INFO:tasks.workunit.client.1.vm05.stdout:8/267: readlink l3 0 2026-03-09T16:15:11.513 INFO:tasks.workunit.client.1.vm05.stdout:8/268: chown d4/c16 30 1 2026-03-09T16:15:11.514 INFO:tasks.workunit.client.1.vm05.stdout:6/303: creat d17/d22/d27/d58/f72 x:0 0 0 2026-03-09T16:15:11.514 INFO:tasks.workunit.client.1.vm05.stdout:6/304: stat d17/d22/d27/d34/d42 0 2026-03-09T16:15:11.515 INFO:tasks.workunit.client.1.vm05.stdout:8/269: unlink d4/d6/db/dc/c21 0 2026-03-09T16:15:11.518 INFO:tasks.workunit.client.1.vm05.stdout:8/270: creat d4/d6/d3a/d15/f63 x:0 0 0 2026-03-09T16:15:11.573 INFO:tasks.workunit.client.1.vm05.stdout:7/360: dread d1/d2/f4f [0,4194304] 0 2026-03-09T16:15:11.576 INFO:tasks.workunit.client.1.vm05.stdout:1/327: write d7/fc [3087192,47575] 0 2026-03-09T16:15:11.582 INFO:tasks.workunit.client.1.vm05.stdout:1/328: mknod d7/dd/de/c7a 0 2026-03-09T16:15:11.582 INFO:tasks.workunit.client.1.vm05.stdout:5/338: dwrite d8/d18/d1b/d47/f4c [0,4194304] 0 2026-03-09T16:15:11.588 INFO:tasks.workunit.client.1.vm05.stdout:7/361: rename d1/d2/d8/dc/d1b/d30/d4b/d65/l58 to d1/d2/d8/dc/d14/l7e 0 2026-03-09T16:15:11.597 INFO:tasks.workunit.client.1.vm05.stdout:7/362: write d1/d2/d8/dc/f3b [1502816,12953] 0 2026-03-09T16:15:11.597 INFO:tasks.workunit.client.1.vm05.stdout:3/247: dwrite d0/d9/f1d [4194304,4194304] 0 2026-03-09T16:15:11.605 INFO:tasks.workunit.client.1.vm05.stdout:7/363: read d1/d2/d11/f54 [539143,67934] 0 2026-03-09T16:15:11.605 INFO:tasks.workunit.client.1.vm05.stdout:3/248: mkdir d0/d9/d22/d4c 0 2026-03-09T16:15:11.605 INFO:tasks.workunit.client.1.vm05.stdout:1/329: creat d7/dd/d21/d63/d71/f7b x:0 0 0 2026-03-09T16:15:11.606 INFO:tasks.workunit.client.1.vm05.stdout:3/249: write d0/d9/f2f [914684,81245] 0 2026-03-09T16:15:11.607 INFO:tasks.workunit.client.1.vm05.stdout:5/339: creat d8/f7b x:0 0 0 2026-03-09T16:15:11.608 INFO:tasks.workunit.client.1.vm05.stdout:1/330: chown d7/dd/de/d52/f58 144668 1 2026-03-09T16:15:11.613 INFO:tasks.workunit.client.1.vm05.stdout:5/340: dread d8/d18/d1b/f2a [0,4194304] 0 2026-03-09T16:15:11.614 INFO:tasks.workunit.client.1.vm05.stdout:5/341: fsync d8/d18/d1b/d2e/d43/f41 0 2026-03-09T16:15:11.622 INFO:tasks.workunit.client.1.vm05.stdout:1/331: read d7/dd/de/f23 
[106110,93474] 0 2026-03-09T16:15:11.628 INFO:tasks.workunit.client.1.vm05.stdout:7/364: creat d1/d2/d8/dc/d1b/d30/d4b/d65/f7f x:0 0 0 2026-03-09T16:15:11.640 INFO:tasks.workunit.client.1.vm05.stdout:9/365: write d4/d10/f15 [2152516,114165] 0 2026-03-09T16:15:11.640 INFO:tasks.workunit.client.1.vm05.stdout:9/366: chown d4/f61 181619 1 2026-03-09T16:15:11.640 INFO:tasks.workunit.client.1.vm05.stdout:1/332: mknod d7/dd/de/c7c 0 2026-03-09T16:15:11.640 INFO:tasks.workunit.client.1.vm05.stdout:1/333: fdatasync d7/dd/d21/d3b/f65 0 2026-03-09T16:15:11.640 INFO:tasks.workunit.client.1.vm05.stdout:7/365: unlink d1/d2/f4f 0 2026-03-09T16:15:11.640 INFO:tasks.workunit.client.1.vm05.stdout:1/334: chown d7/l1d 7382 1 2026-03-09T16:15:11.640 INFO:tasks.workunit.client.1.vm05.stdout:5/342: mknod d8/c7c 0 2026-03-09T16:15:11.640 INFO:tasks.workunit.client.1.vm05.stdout:5/343: mknod d8/d18/d1b/d47/d4e/c7d 0 2026-03-09T16:15:11.640 INFO:tasks.workunit.client.1.vm05.stdout:1/335: dread - d7/dd/d21/d44/f46 zero size 2026-03-09T16:15:11.640 INFO:tasks.workunit.client.1.vm05.stdout:4/304: truncate d5/fb 350123 0 2026-03-09T16:15:11.641 INFO:tasks.workunit.client.1.vm05.stdout:9/367: getdents d4/d10/d35/d36/d48 0 2026-03-09T16:15:11.645 INFO:tasks.workunit.client.1.vm05.stdout:8/271: truncate d4/d6/f58 3374796 0 2026-03-09T16:15:11.645 INFO:tasks.workunit.client.1.vm05.stdout:7/366: dread d1/d2/d8/dc/d1b/d71/f46 [0,4194304] 0 2026-03-09T16:15:11.647 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:11 vm03.local ceph-mon[51019]: mgrmap e30: vm03.gbgzmu(active, since 23s), standbys: vm05.dygxfv 2026-03-09T16:15:11.647 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:11 vm03.local ceph-mon[51019]: pgmap v15: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 112 GiB / 120 GiB avail; 13 MiB/s rd, 87 MiB/s wr, 225 op/s 2026-03-09T16:15:11.648 INFO:tasks.workunit.client.1.vm05.stdout:0/355: dwrite d5/db/d1d/f59 [0,4194304] 0 2026-03-09T16:15:11.653 INFO:tasks.workunit.client.1.vm05.stdout:6/305: dwrite d17/d1d/f33 [0,4194304] 0 2026-03-09T16:15:11.656 INFO:tasks.workunit.client.1.vm05.stdout:0/356: dread d5/d2c/f41 [0,4194304] 0 2026-03-09T16:15:11.675 INFO:tasks.workunit.client.1.vm05.stdout:5/344: mkdir d8/d53/d7e 0 2026-03-09T16:15:11.676 INFO:tasks.workunit.client.1.vm05.stdout:5/345: write d8/d18/d1b/d2e/f35 [417846,63911] 0 2026-03-09T16:15:11.678 INFO:tasks.workunit.client.1.vm05.stdout:3/250: rmdir d0/d9 39 2026-03-09T16:15:11.679 INFO:tasks.workunit.client.1.vm05.stdout:3/251: readlink d0/l39 0 2026-03-09T16:15:11.679 INFO:tasks.workunit.client.1.vm05.stdout:7/367: dread d1/d2/d11/f25 [0,4194304] 0 2026-03-09T16:15:11.684 INFO:tasks.workunit.client.1.vm05.stdout:8/272: symlink d4/d6/db/d54/l64 0 2026-03-09T16:15:11.685 INFO:tasks.workunit.client.1.vm05.stdout:6/306: mkdir d17/d5d/d73 0 2026-03-09T16:15:11.687 INFO:tasks.workunit.client.1.vm05.stdout:0/357: rmdir d5/d11/d4f 39 2026-03-09T16:15:11.697 INFO:tasks.workunit.client.1.vm05.stdout:6/307: write d17/d22/d27/d34/d4b/f5a [1442558,123627] 0 2026-03-09T16:15:11.698 INFO:tasks.workunit.client.1.vm05.stdout:6/308: truncate d17/d22/d27/d34/d4b/f6c 883422 0 2026-03-09T16:15:11.698 INFO:tasks.workunit.client.1.vm05.stdout:2/321: dread db/dd/d15/d1f/d21/f39 [0,4194304] 0 2026-03-09T16:15:11.698 INFO:tasks.workunit.client.1.vm05.stdout:0/358: dread d5/d1b/d30/f29 [0,4194304] 0 2026-03-09T16:15:11.710 INFO:tasks.workunit.client.1.vm05.stdout:5/346: symlink d8/d59/l7f 0 2026-03-09T16:15:11.710 INFO:tasks.workunit.client.1.vm05.stdout:5/347: chown 
d8/d53 51128 1 2026-03-09T16:15:11.710 INFO:tasks.workunit.client.1.vm05.stdout:5/348: fsync d8/d18/f20 0 2026-03-09T16:15:11.714 INFO:tasks.workunit.client.1.vm05.stdout:7/368: creat d1/d2/d8/dc/d18/f80 x:0 0 0 2026-03-09T16:15:11.720 INFO:tasks.workunit.client.1.vm05.stdout:8/273: creat d4/d6/d3a/d15/f65 x:0 0 0 2026-03-09T16:15:11.735 INFO:tasks.workunit.client.1.vm05.stdout:3/252: dwrite d0/d9/fa [4194304,4194304] 0 2026-03-09T16:15:11.749 INFO:tasks.workunit.client.1.vm05.stdout:8/274: creat d4/d6/d3a/d15/f66 x:0 0 0 2026-03-09T16:15:11.755 INFO:tasks.workunit.client.1.vm05.stdout:2/322: mknod db/dd/d15/d46/d67/c6e 0 2026-03-09T16:15:11.757 INFO:tasks.workunit.client.1.vm05.stdout:8/275: dwrite d4/d6/d3a/d40/f4e [0,4194304] 0 2026-03-09T16:15:11.760 INFO:tasks.workunit.client.1.vm05.stdout:8/276: truncate d4/d6/d3a/f28 683508 0 2026-03-09T16:15:11.767 INFO:tasks.workunit.client.1.vm05.stdout:4/305: truncate d5/d19/d37/d60/f6c 1871196 0 2026-03-09T16:15:11.767 INFO:tasks.workunit.client.1.vm05.stdout:4/306: write d5/de/d15/d21/d39/f42 [297167,41936] 0 2026-03-09T16:15:11.769 INFO:tasks.workunit.client.1.vm05.stdout:0/359: mkdir d5/d11/d4f/d70 0 2026-03-09T16:15:11.769 INFO:tasks.workunit.client.1.vm05.stdout:0/360: chown d5/db/d1d/f2e 3290 1 2026-03-09T16:15:11.770 INFO:tasks.workunit.client.1.vm05.stdout:5/349: mkdir d8/d18/d1b/d47/d48/d73/d80 0 2026-03-09T16:15:11.772 INFO:tasks.workunit.client.1.vm05.stdout:3/253: creat d0/d9/f4d x:0 0 0 2026-03-09T16:15:11.773 INFO:tasks.workunit.client.1.vm05.stdout:3/254: dread - d0/d9/f4d zero size 2026-03-09T16:15:11.773 INFO:tasks.workunit.client.1.vm05.stdout:3/255: chown d0/d33/c3b 66598904 1 2026-03-09T16:15:11.775 INFO:tasks.workunit.client.1.vm05.stdout:8/277: mkdir d4/d6/d3a/d67 0 2026-03-09T16:15:11.775 INFO:tasks.workunit.client.1.vm05.stdout:9/368: getdents d4/d10/d35/d36/d48 0 2026-03-09T16:15:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:11 vm05.local ceph-mon[58702]: mgrmap e30: vm03.gbgzmu(active, since 23s), standbys: vm05.dygxfv 2026-03-09T16:15:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:11 vm05.local ceph-mon[58702]: pgmap v15: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 112 GiB / 120 GiB avail; 13 MiB/s rd, 87 MiB/s wr, 225 op/s 2026-03-09T16:15:11.776 INFO:tasks.workunit.client.1.vm05.stdout:9/369: dread - d4/d10/d35/d36/d48/f6e zero size 2026-03-09T16:15:11.778 INFO:tasks.workunit.client.1.vm05.stdout:0/361: unlink d5/d2c/f3a 0 2026-03-09T16:15:11.780 INFO:tasks.workunit.client.1.vm05.stdout:5/350: write d8/d18/d1b/f2a [3375321,6764] 0 2026-03-09T16:15:11.786 INFO:tasks.workunit.client.1.vm05.stdout:1/336: truncate d7/d15/d16/f29 107106 0 2026-03-09T16:15:11.791 INFO:tasks.workunit.client.1.vm05.stdout:8/278: symlink d4/d6/d3a/d3c/l68 0 2026-03-09T16:15:11.796 INFO:tasks.workunit.client.1.vm05.stdout:9/370: creat d4/d10/d35/d36/d48/d60/f7d x:0 0 0 2026-03-09T16:15:11.797 INFO:tasks.workunit.client.1.vm05.stdout:9/371: write d4/d10/d35/d2b/d38/f4b [346964,107977] 0 2026-03-09T16:15:11.797 INFO:tasks.workunit.client.1.vm05.stdout:2/323: creat db/dd/d15/f6f x:0 0 0 2026-03-09T16:15:11.797 INFO:tasks.workunit.client.1.vm05.stdout:3/256: mkdir d0/d9/d22/d4c/d4e 0 2026-03-09T16:15:11.797 INFO:tasks.workunit.client.1.vm05.stdout:5/351: mknod d8/d59/c81 0 2026-03-09T16:15:11.797 INFO:tasks.workunit.client.1.vm05.stdout:2/324: write db/dd/f32 [1980914,69652] 0 2026-03-09T16:15:11.799 INFO:tasks.workunit.client.1.vm05.stdout:8/279: fsync d4/d6/d3a/f25 0 2026-03-09T16:15:11.800 
INFO:tasks.workunit.client.1.vm05.stdout:0/362: mkdir d5/d11/d71 0 2026-03-09T16:15:11.800 INFO:tasks.workunit.client.1.vm05.stdout:9/372: creat d4/d10/d35/d36/d48/d54/f7e x:0 0 0 2026-03-09T16:15:11.803 INFO:tasks.workunit.client.1.vm05.stdout:3/257: rename d0/d33/l27 to d0/d9/d22/d4c/l4f 0 2026-03-09T16:15:11.806 INFO:tasks.workunit.client.1.vm05.stdout:5/352: unlink d8/d18/d1b/d47/d4e/f4f 0 2026-03-09T16:15:11.809 INFO:tasks.workunit.client.1.vm05.stdout:6/309: truncate f16 800640 0 2026-03-09T16:15:11.811 INFO:tasks.workunit.client.1.vm05.stdout:7/369: dwrite d1/d2/d8/dc/f1a [0,4194304] 0 2026-03-09T16:15:11.813 INFO:tasks.workunit.client.1.vm05.stdout:4/307: truncate d5/de/d15/d21/d39/f44 1044813 0 2026-03-09T16:15:11.832 INFO:tasks.workunit.client.1.vm05.stdout:8/280: unlink d4/d6/d3a/f32 0 2026-03-09T16:15:11.836 INFO:tasks.workunit.client.1.vm05.stdout:8/281: dwrite d4/f13 [0,4194304] 0 2026-03-09T16:15:11.842 INFO:tasks.workunit.client.1.vm05.stdout:8/282: dread d4/d6/db/dc/f17 [0,4194304] 0 2026-03-09T16:15:11.843 INFO:tasks.workunit.client.1.vm05.stdout:8/283: dread - d4/d6/d3a/d40/f52 zero size 2026-03-09T16:15:11.844 INFO:tasks.workunit.client.1.vm05.stdout:8/284: readlink d4/d6/db/dc/d3b/l3d 0 2026-03-09T16:15:11.845 INFO:tasks.workunit.client.1.vm05.stdout:8/285: stat d4/d6/db/dc/d3b/l3d 0 2026-03-09T16:15:11.846 INFO:tasks.workunit.client.1.vm05.stdout:8/286: truncate d4/d6/db/dc/f41 1024044 0 2026-03-09T16:15:11.847 INFO:tasks.workunit.client.1.vm05.stdout:6/310: creat d17/d22/d27/d34/d42/d53/f74 x:0 0 0 2026-03-09T16:15:11.847 INFO:tasks.workunit.client.1.vm05.stdout:7/370: creat d1/d2/d8/dc/d1b/d30/d5e/f81 x:0 0 0 2026-03-09T16:15:11.847 INFO:tasks.workunit.client.1.vm05.stdout:4/308: creat d5/de/d15/d21/d39/d5d/f70 x:0 0 0 2026-03-09T16:15:11.848 INFO:tasks.workunit.client.1.vm05.stdout:8/287: write d4/d6/db/dc/f2a [893802,64418] 0 2026-03-09T16:15:11.850 INFO:tasks.workunit.client.1.vm05.stdout:8/288: write d4/d6/db/dc/f41 [1307681,116177] 0 2026-03-09T16:15:11.855 INFO:tasks.workunit.client.1.vm05.stdout:0/363: creat d5/db/d48/d66/f72 x:0 0 0 2026-03-09T16:15:11.856 INFO:tasks.workunit.client.1.vm05.stdout:0/364: truncate d5/d1b/d3b/f6f 405044 0 2026-03-09T16:15:11.857 INFO:tasks.workunit.client.1.vm05.stdout:1/337: getdents d7/dd/d21/d3b 0 2026-03-09T16:15:11.858 INFO:tasks.workunit.client.1.vm05.stdout:3/258: creat d0/d9/d22/d4c/d4e/f50 x:0 0 0 2026-03-09T16:15:11.862 INFO:tasks.workunit.client.1.vm05.stdout:1/338: write d7/d15/d16/f53 [487670,119395] 0 2026-03-09T16:15:11.862 INFO:tasks.workunit.client.1.vm05.stdout:5/353: creat d8/d53/d7e/f82 x:0 0 0 2026-03-09T16:15:11.863 INFO:tasks.workunit.client.1.vm05.stdout:5/354: chown d8/d18/d1b/d47/d4e/f64 73204 1 2026-03-09T16:15:11.864 INFO:tasks.workunit.client.1.vm05.stdout:2/325: creat db/dd/d15/f70 x:0 0 0 2026-03-09T16:15:11.864 INFO:tasks.workunit.client.1.vm05.stdout:3/259: dwrite d0/d9/d22/f2a [0,4194304] 0 2026-03-09T16:15:11.865 INFO:tasks.workunit.client.1.vm05.stdout:9/373: link d4/d10/d35/d36/c72 d4/d10/d35/c7f 0 2026-03-09T16:15:11.865 INFO:tasks.workunit.client.1.vm05.stdout:8/289: creat d4/d6/d3a/d3c/f69 x:0 0 0 2026-03-09T16:15:11.866 INFO:tasks.workunit.client.1.vm05.stdout:9/374: write d4/f66 [251451,126141] 0 2026-03-09T16:15:11.870 INFO:tasks.workunit.client.1.vm05.stdout:2/326: write db/dd/f32 [1022543,83656] 0 2026-03-09T16:15:11.873 INFO:tasks.workunit.client.1.vm05.stdout:8/290: dread - d4/d6/d3a/d3c/f45 zero size 2026-03-09T16:15:11.875 INFO:tasks.workunit.client.1.vm05.stdout:0/365: dwrite 
d5/db/d5b/f69 [0,4194304] 0 2026-03-09T16:15:11.878 INFO:tasks.workunit.client.1.vm05.stdout:1/339: creat d7/dd/de/d52/f7d x:0 0 0 2026-03-09T16:15:11.878 INFO:tasks.workunit.client.1.vm05.stdout:2/327: dwrite db/dd/d15/d46/f4e [0,4194304] 0 2026-03-09T16:15:11.889 INFO:tasks.workunit.client.1.vm05.stdout:7/371: mknod d1/d2/d11/c82 0 2026-03-09T16:15:11.891 INFO:tasks.workunit.client.1.vm05.stdout:6/311: creat d17/d22/d27/d34/d42/d65/f75 x:0 0 0 2026-03-09T16:15:11.898 INFO:tasks.workunit.client.1.vm05.stdout:3/260: creat d0/d9/f51 x:0 0 0 2026-03-09T16:15:11.904 INFO:tasks.workunit.client.1.vm05.stdout:9/375: creat d4/d10/f80 x:0 0 0 2026-03-09T16:15:11.907 INFO:tasks.workunit.client.1.vm05.stdout:8/291: mkdir d4/d6/d3a/d40/d6a 0 2026-03-09T16:15:11.907 INFO:tasks.workunit.client.1.vm05.stdout:8/292: truncate d4/f13 4389803 0 2026-03-09T16:15:11.909 INFO:tasks.workunit.client.1.vm05.stdout:3/261: mknod d0/d9/d22/d4c/c52 0 2026-03-09T16:15:11.909 INFO:tasks.workunit.client.1.vm05.stdout:9/376: symlink d4/d10/d35/d36/d48/d54/d59/l81 0 2026-03-09T16:15:11.919 INFO:tasks.workunit.client.1.vm05.stdout:1/340: mknod d7/dd/d21/d3b/d55/c7e 0 2026-03-09T16:15:11.920 INFO:tasks.workunit.client.1.vm05.stdout:1/341: chown d7/dd/d21/d39/d48/d5d/c75 991395 1 2026-03-09T16:15:11.921 INFO:tasks.workunit.client.1.vm05.stdout:8/293: mknod d4/d6/d53/c6b 0 2026-03-09T16:15:11.922 INFO:tasks.workunit.client.1.vm05.stdout:8/294: dread - d4/d6/f5f zero size 2026-03-09T16:15:11.928 INFO:tasks.workunit.client.1.vm05.stdout:6/312: mknod d17/d22/d27/d34/d42/d68/c76 0 2026-03-09T16:15:11.931 INFO:tasks.workunit.client.1.vm05.stdout:6/313: chown d17/d22/d27/d34/d42/d68 125384176 1 2026-03-09T16:15:11.931 INFO:tasks.workunit.client.1.vm05.stdout:9/377: mkdir d4/d10/d35/d2b/d31/d82 0 2026-03-09T16:15:11.931 INFO:tasks.workunit.client.1.vm05.stdout:6/314: truncate d17/d22/d27/d58/f72 435059 0 2026-03-09T16:15:11.931 INFO:tasks.workunit.client.1.vm05.stdout:0/366: creat d5/f73 x:0 0 0 2026-03-09T16:15:11.931 INFO:tasks.workunit.client.1.vm05.stdout:2/328: getdents db/dd/d15/d4c 0 2026-03-09T16:15:11.932 INFO:tasks.workunit.client.1.vm05.stdout:2/329: stat db/dd/d15/d1f 0 2026-03-09T16:15:11.932 INFO:tasks.workunit.client.1.vm05.stdout:9/378: mknod d4/d10/d35/d36/d48/d4c/c83 0 2026-03-09T16:15:11.933 INFO:tasks.workunit.client.1.vm05.stdout:1/342: link d7/dd/d21/d44/f46 d7/dd/d21/d39/d5a/d50/f7f 0 2026-03-09T16:15:11.934 INFO:tasks.workunit.client.1.vm05.stdout:6/315: creat d17/d4f/f77 x:0 0 0 2026-03-09T16:15:11.935 INFO:tasks.workunit.client.1.vm05.stdout:0/367: rename d5/d2c/d49/l4d to d5/d2c/d49/l74 0 2026-03-09T16:15:11.935 INFO:tasks.workunit.client.1.vm05.stdout:3/262: fsync d0/d9/d22/d4c/d4e/f50 0 2026-03-09T16:15:11.943 INFO:tasks.workunit.client.1.vm05.stdout:6/316: creat d17/d5d/f78 x:0 0 0 2026-03-09T16:15:11.943 INFO:tasks.workunit.client.1.vm05.stdout:9/379: read d4/d10/d35/d2b/d31/f55 [246137,109577] 0 2026-03-09T16:15:11.944 INFO:tasks.workunit.client.1.vm05.stdout:9/380: write d4/d10/d35/d36/f49 [345017,77008] 0 2026-03-09T16:15:11.946 INFO:tasks.workunit.client.1.vm05.stdout:0/368: rename d5/d11/l19 to d5/d2c/l75 0 2026-03-09T16:15:11.946 INFO:tasks.workunit.client.1.vm05.stdout:4/309: write d5/d19/f48 [2812503,27080] 0 2026-03-09T16:15:11.948 INFO:tasks.workunit.client.1.vm05.stdout:1/343: mknod d7/dd/d21/d44/d5c/d78/c80 0 2026-03-09T16:15:11.948 INFO:tasks.workunit.client.1.vm05.stdout:4/310: stat d5/de/d15/d21 0 2026-03-09T16:15:11.948 INFO:tasks.workunit.client.1.vm05.stdout:0/369: unlink 
d5/db/d1d/f64 0 2026-03-09T16:15:11.949 INFO:tasks.workunit.client.1.vm05.stdout:9/381: symlink d4/d10/d35/d36/d48/d4c/l84 0 2026-03-09T16:15:11.950 INFO:tasks.workunit.client.1.vm05.stdout:9/382: stat d4/d10 0 2026-03-09T16:15:11.950 INFO:tasks.workunit.client.1.vm05.stdout:9/383: write d4/d10/d35/d2b/f2f [417387,48460] 0 2026-03-09T16:15:11.951 INFO:tasks.workunit.client.1.vm05.stdout:9/384: chown d4/d10/f1d 3440730 1 2026-03-09T16:15:11.951 INFO:tasks.workunit.client.1.vm05.stdout:9/385: write d4/d10/d35/f32 [1073847,18390] 0 2026-03-09T16:15:11.954 INFO:tasks.workunit.client.1.vm05.stdout:1/344: chown d7/d15/l1a 936366 1 2026-03-09T16:15:11.954 INFO:tasks.workunit.client.1.vm05.stdout:4/311: mknod d5/d19/d37/d60/c71 0 2026-03-09T16:15:11.956 INFO:tasks.workunit.client.1.vm05.stdout:4/312: truncate d5/f45 989208 0 2026-03-09T16:15:11.956 INFO:tasks.workunit.client.1.vm05.stdout:4/313: write d5/d19/f48 [3402186,96577] 0 2026-03-09T16:15:11.958 INFO:tasks.workunit.client.1.vm05.stdout:2/330: getdents db/dd/d15/d46/d67 0 2026-03-09T16:15:11.964 INFO:tasks.workunit.client.1.vm05.stdout:6/317: link d17/d1d/f33 d17/d22/f79 0 2026-03-09T16:15:11.964 INFO:tasks.workunit.client.1.vm05.stdout:4/314: creat d5/de/d15/d21/d31/f72 x:0 0 0 2026-03-09T16:15:11.964 INFO:tasks.workunit.client.1.vm05.stdout:3/263: link d0/l1c d0/d9/d22/d4c/l53 0 2026-03-09T16:15:11.964 INFO:tasks.workunit.client.1.vm05.stdout:6/318: write d17/d22/d27/d34/d42/d53/f74 [820746,126252] 0 2026-03-09T16:15:11.965 INFO:tasks.workunit.client.1.vm05.stdout:6/319: truncate d17/d4f/f70 545283 0 2026-03-09T16:15:11.965 INFO:tasks.workunit.client.1.vm05.stdout:2/331: dwrite f7 [4194304,4194304] 0 2026-03-09T16:15:11.967 INFO:tasks.workunit.client.1.vm05.stdout:2/332: write db/dd/d15/d1f/d21/f39 [8954212,32768] 0 2026-03-09T16:15:11.968 INFO:tasks.workunit.client.1.vm05.stdout:2/333: write db/dd/d15/d3f/f4a [927322,35530] 0 2026-03-09T16:15:11.968 INFO:tasks.workunit.client.1.vm05.stdout:3/264: rename d0/d9/f1d to d0/d9/d22/f54 0 2026-03-09T16:15:11.975 INFO:tasks.workunit.client.1.vm05.stdout:3/265: dread d0/d9/fa [4194304,4194304] 0 2026-03-09T16:15:11.975 INFO:tasks.workunit.client.1.vm05.stdout:4/315: write d5/d19/f32 [1897441,72922] 0 2026-03-09T16:15:11.980 INFO:tasks.workunit.client.1.vm05.stdout:4/316: dwrite f1 [0,4194304] 0 2026-03-09T16:15:11.985 INFO:tasks.workunit.client.1.vm05.stdout:1/345: link d7/dd/de/c7c d7/dd/d21/d44/d5c/c81 0 2026-03-09T16:15:11.985 INFO:tasks.workunit.client.1.vm05.stdout:1/346: chown d7/dd/d21/d39/d5a 1023129391 1 2026-03-09T16:15:11.985 INFO:tasks.workunit.client.1.vm05.stdout:6/320: creat d17/d22/d27/d44/f7a x:0 0 0 2026-03-09T16:15:11.985 INFO:tasks.workunit.client.1.vm05.stdout:2/334: rmdir db/dd/d15/d1f/d20/d23 39 2026-03-09T16:15:11.985 INFO:tasks.workunit.client.1.vm05.stdout:8/295: sync 2026-03-09T16:15:11.987 INFO:tasks.workunit.client.1.vm05.stdout:5/355: truncate d8/d5e/f72 5232631 0 2026-03-09T16:15:11.989 INFO:tasks.workunit.client.1.vm05.stdout:8/296: stat d4/d6/db/f5e 0 2026-03-09T16:15:11.990 INFO:tasks.workunit.client.1.vm05.stdout:9/386: sync 2026-03-09T16:15:11.995 INFO:tasks.workunit.client.1.vm05.stdout:8/297: chown d4/d6/db/dc/d5d/l62 6410363 1 2026-03-09T16:15:11.995 INFO:tasks.workunit.client.1.vm05.stdout:6/321: dread d17/f30 [0,4194304] 0 2026-03-09T16:15:11.995 INFO:tasks.workunit.client.1.vm05.stdout:1/347: symlink d7/dd/d21/d63/d71/l82 0 2026-03-09T16:15:11.995 INFO:tasks.workunit.client.1.vm05.stdout:7/372: truncate d1/d2/d8/dc/d1b/f5a 859421 0 2026-03-09T16:15:12.000 
INFO:tasks.workunit.client.1.vm05.stdout:3/266: link d0/d9/d22/f30 d0/d9/d22/d4c/d4e/f55 0 2026-03-09T16:15:12.001 INFO:tasks.workunit.client.1.vm05.stdout:5/356: stat d8/l14 0 2026-03-09T16:15:12.001 INFO:tasks.workunit.client.1.vm05.stdout:8/298: creat d4/d6/d53/f6c x:0 0 0 2026-03-09T16:15:12.002 INFO:tasks.workunit.client.1.vm05.stdout:1/348: mknod d7/dd/d21/d39/d48/d5d/c83 0 2026-03-09T16:15:12.002 INFO:tasks.workunit.client.1.vm05.stdout:2/335: dwrite db/f12 [0,4194304] 0 2026-03-09T16:15:12.005 INFO:tasks.workunit.client.1.vm05.stdout:5/357: write d8/f7b [480013,100605] 0 2026-03-09T16:15:12.006 INFO:tasks.workunit.client.1.vm05.stdout:7/373: creat d1/d2/d8/dc/d1b/d30/d4b/d65/f83 x:0 0 0 2026-03-09T16:15:12.006 INFO:tasks.workunit.client.1.vm05.stdout:2/336: write db/dd/d15/f48 [519437,113586] 0 2026-03-09T16:15:12.015 INFO:tasks.workunit.client.1.vm05.stdout:9/387: sync 2026-03-09T16:15:12.015 INFO:tasks.workunit.client.1.vm05.stdout:1/349: chown d7/dd/d21/d39/d48/c51 120131763 1 2026-03-09T16:15:12.019 INFO:tasks.workunit.client.1.vm05.stdout:7/374: dwrite d1/d2/f5 [4194304,4194304] 0 2026-03-09T16:15:12.029 INFO:tasks.workunit.client.1.vm05.stdout:8/299: dread d4/f10 [0,4194304] 0 2026-03-09T16:15:12.033 INFO:tasks.workunit.client.1.vm05.stdout:2/337: symlink db/dd/d15/d3f/d5b/d60/l71 0 2026-03-09T16:15:12.037 INFO:tasks.workunit.client.1.vm05.stdout:7/375: stat d1/d2/d8/dc/d1b/d30/d4b/d65/c3a 0 2026-03-09T16:15:12.038 INFO:tasks.workunit.client.1.vm05.stdout:9/388: unlink d4/d10/d35/c7f 0 2026-03-09T16:15:12.040 INFO:tasks.workunit.client.1.vm05.stdout:8/300: mknod d4/d6/d3a/d67/c6d 0 2026-03-09T16:15:12.042 INFO:tasks.workunit.client.1.vm05.stdout:7/376: sync 2026-03-09T16:15:12.043 INFO:tasks.workunit.client.1.vm05.stdout:8/301: dread - d4/d6/f1f zero size 2026-03-09T16:15:12.044 INFO:tasks.workunit.client.1.vm05.stdout:2/338: symlink db/dd/d15/d3f/d5b/d60/d6a/l72 0 2026-03-09T16:15:12.045 INFO:tasks.workunit.client.1.vm05.stdout:0/370: truncate d5/d1b/d30/f29 1493839 0 2026-03-09T16:15:12.046 INFO:tasks.workunit.client.1.vm05.stdout:0/371: chown d5/c45 7 1 2026-03-09T16:15:12.047 INFO:tasks.workunit.client.1.vm05.stdout:0/372: chown d5/d1b/f61 29937 1 2026-03-09T16:15:12.049 INFO:tasks.workunit.client.1.vm05.stdout:2/339: rename db/dd/d15/d1f/f24 to db/dd/d15/d46/d67/f73 0 2026-03-09T16:15:12.049 INFO:tasks.workunit.client.1.vm05.stdout:1/350: link d7/dd/d21/f2b d7/d27/f84 0 2026-03-09T16:15:12.053 INFO:tasks.workunit.client.1.vm05.stdout:9/389: creat d4/d10/d35/d36/f85 x:0 0 0 2026-03-09T16:15:12.063 INFO:tasks.workunit.client.1.vm05.stdout:4/317: write d5/de/d15/d21/f26 [196633,81078] 0 2026-03-09T16:15:12.063 INFO:tasks.workunit.client.1.vm05.stdout:7/377: creat d1/f84 x:0 0 0 2026-03-09T16:15:12.063 INFO:tasks.workunit.client.1.vm05.stdout:9/390: fdatasync d4/d10/f75 0 2026-03-09T16:15:12.063 INFO:tasks.workunit.client.1.vm05.stdout:4/318: dread - d5/d19/d37/f51 zero size 2026-03-09T16:15:12.063 INFO:tasks.workunit.client.1.vm05.stdout:4/319: chown d5/c20 0 1 2026-03-09T16:15:12.063 INFO:tasks.workunit.client.1.vm05.stdout:0/373: creat d5/f76 x:0 0 0 2026-03-09T16:15:12.063 INFO:tasks.workunit.client.1.vm05.stdout:7/378: dread - d1/d2/d8/dc/d18/f80 zero size 2026-03-09T16:15:12.063 INFO:tasks.workunit.client.1.vm05.stdout:9/391: unlink d4/d10/f52 0 2026-03-09T16:15:12.063 INFO:tasks.workunit.client.1.vm05.stdout:1/351: dwrite d7/dd/de/f35 [0,4194304] 0 2026-03-09T16:15:12.063 INFO:tasks.workunit.client.1.vm05.stdout:9/392: dread - d4/f6b zero size 2026-03-09T16:15:12.063 
INFO:tasks.workunit.client.1.vm05.stdout:9/393: write d4/d10/f22 [556945,73776] 0 2026-03-09T16:15:12.064 INFO:tasks.workunit.client.1.vm05.stdout:2/340: read db/dd/d15/f28 [2066328,54897] 0 2026-03-09T16:15:12.064 INFO:tasks.workunit.client.1.vm05.stdout:9/394: chown d4/d10/d35/d36/f85 122 1 2026-03-09T16:15:12.069 INFO:tasks.workunit.client.1.vm05.stdout:4/320: readlink d5/d19/l69 0 2026-03-09T16:15:12.071 INFO:tasks.workunit.client.1.vm05.stdout:8/302: getdents d4/d6/d3a 0 2026-03-09T16:15:12.073 INFO:tasks.workunit.client.1.vm05.stdout:7/379: dwrite d1/d2/d8/dc/d1b/d30/d4b/d65/f7f [0,4194304] 0 2026-03-09T16:15:12.076 INFO:tasks.workunit.client.1.vm05.stdout:4/321: dwrite d5/de/d15/d21/d31/f64 [0,4194304] 0 2026-03-09T16:15:12.093 INFO:tasks.workunit.client.1.vm05.stdout:2/341: mknod db/dd/d15/d46/d67/c74 0 2026-03-09T16:15:12.093 INFO:tasks.workunit.client.1.vm05.stdout:9/395: stat d4/d10/d35/c19 0 2026-03-09T16:15:12.093 INFO:tasks.workunit.client.1.vm05.stdout:1/352: chown d7/c8 637144929 1 2026-03-09T16:15:12.093 INFO:tasks.workunit.client.1.vm05.stdout:8/303: symlink d4/d6/d53/l6e 0 2026-03-09T16:15:12.093 INFO:tasks.workunit.client.1.vm05.stdout:8/304: write d4/d6/db/dc/d2e/f47 [2436257,108227] 0 2026-03-09T16:15:12.093 INFO:tasks.workunit.client.1.vm05.stdout:0/374: mkdir d5/db/d77 0 2026-03-09T16:15:12.093 INFO:tasks.workunit.client.1.vm05.stdout:4/322: mknod d5/de/d15/d21/d27/c73 0 2026-03-09T16:15:12.093 INFO:tasks.workunit.client.1.vm05.stdout:9/396: write d4/f6b [350968,97302] 0 2026-03-09T16:15:12.096 INFO:tasks.workunit.client.1.vm05.stdout:9/397: write d4/d10/f1d [4266450,6408] 0 2026-03-09T16:15:12.101 INFO:tasks.workunit.client.1.vm05.stdout:9/398: fdatasync d4/f3c 0 2026-03-09T16:15:12.101 INFO:tasks.workunit.client.1.vm05.stdout:8/305: rmdir d4/d6/db/d59 39 2026-03-09T16:15:12.101 INFO:tasks.workunit.client.1.vm05.stdout:8/306: readlink d4/d6/db/l61 0 2026-03-09T16:15:12.101 INFO:tasks.workunit.client.1.vm05.stdout:7/380: creat d1/d2/d8/dc/d1b/d30/f85 x:0 0 0 2026-03-09T16:15:12.101 INFO:tasks.workunit.client.1.vm05.stdout:7/381: chown d1/d2/d8/dc/d1b/d71/c37 225649 1 2026-03-09T16:15:12.101 INFO:tasks.workunit.client.1.vm05.stdout:9/399: symlink d4/d10/d35/d2b/d38/l86 0 2026-03-09T16:15:12.102 INFO:tasks.workunit.client.1.vm05.stdout:8/307: read d4/d6/db/df/f18 [2701235,68295] 0 2026-03-09T16:15:12.103 INFO:tasks.workunit.client.1.vm05.stdout:0/375: creat d5/d1b/f78 x:0 0 0 2026-03-09T16:15:12.104 INFO:tasks.workunit.client.1.vm05.stdout:3/267: truncate d0/d9/d22/f2e 1923731 0 2026-03-09T16:15:12.105 INFO:tasks.workunit.client.1.vm05.stdout:4/323: creat d5/de/d15/f74 x:0 0 0 2026-03-09T16:15:12.107 INFO:tasks.workunit.client.1.vm05.stdout:9/400: creat d4/d10/d35/d36/d48/f87 x:0 0 0 2026-03-09T16:15:12.107 INFO:tasks.workunit.client.1.vm05.stdout:9/401: read - d4/d10/d35/d2b/d38/f78 zero size 2026-03-09T16:15:12.108 INFO:tasks.workunit.client.1.vm05.stdout:8/308: creat d4/d6/d3a/d3c/f6f x:0 0 0 2026-03-09T16:15:12.109 INFO:tasks.workunit.client.1.vm05.stdout:8/309: fsync d4/d6/d3a/d3c/f45 0 2026-03-09T16:15:12.110 INFO:tasks.workunit.client.1.vm05.stdout:7/382: rename d1/d2/d8/dc/d72/d6c to d1/d2/d11/d86 0 2026-03-09T16:15:12.110 INFO:tasks.workunit.client.1.vm05.stdout:8/310: fdatasync d4/d6/f5f 0 2026-03-09T16:15:12.111 INFO:tasks.workunit.client.1.vm05.stdout:9/402: dwrite d4/d10/f80 [0,4194304] 0 2026-03-09T16:15:12.111 INFO:tasks.workunit.client.1.vm05.stdout:8/311: chown d4/d6/db/dc/f2a 16597843 1 2026-03-09T16:15:12.112 
INFO:tasks.workunit.client.1.vm05.stdout:8/312: fdatasync d4/d6/d3a/d15/f66 0 2026-03-09T16:15:12.112 INFO:tasks.workunit.client.1.vm05.stdout:8/313: stat d4/d6/db/d54/l56 0 2026-03-09T16:15:12.115 INFO:tasks.workunit.client.1.vm05.stdout:4/324: symlink d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/l75 0 2026-03-09T16:15:12.137 INFO:tasks.workunit.client.1.vm05.stdout:4/325: write d5/d19/f48 [5027867,107459] 0 2026-03-09T16:15:12.137 INFO:tasks.workunit.client.1.vm05.stdout:7/383: dwrite d1/d2/d11/f1c [0,4194304] 0 2026-03-09T16:15:12.137 INFO:tasks.workunit.client.1.vm05.stdout:4/326: dread d5/de/d15/d21/d27/f2c [0,4194304] 0 2026-03-09T16:15:12.137 INFO:tasks.workunit.client.1.vm05.stdout:9/403: creat d4/d10/d35/d2b/d31/d82/f88 x:0 0 0 2026-03-09T16:15:12.137 INFO:tasks.workunit.client.1.vm05.stdout:4/327: rmdir d5/de/d15/d21/d27/d3c/d5c/d5f 39 2026-03-09T16:15:12.137 INFO:tasks.workunit.client.1.vm05.stdout:9/404: mkdir d4/d10/d35/d36/d48/d4c/d89 0 2026-03-09T16:15:12.137 INFO:tasks.workunit.client.1.vm05.stdout:4/328: write d5/de/d15/d21/d39/d5d/f6a [858422,130593] 0 2026-03-09T16:15:12.137 INFO:tasks.workunit.client.1.vm05.stdout:9/405: readlink d4/d10/d35/d2b/d38/l86 0 2026-03-09T16:15:12.137 INFO:tasks.workunit.client.1.vm05.stdout:7/384: symlink d1/d2/d8/d67/d76/l87 0 2026-03-09T16:15:12.137 INFO:tasks.workunit.client.1.vm05.stdout:4/329: chown d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/l75 6099602 1 2026-03-09T16:15:12.138 INFO:tasks.workunit.client.1.vm05.stdout:9/406: rename d4/d10/f22 to d4/d10/f8a 0 2026-03-09T16:15:12.138 INFO:tasks.workunit.client.1.vm05.stdout:7/385: creat d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/f88 x:0 0 0 2026-03-09T16:15:12.138 INFO:tasks.workunit.client.1.vm05.stdout:7/386: write d1/d2/d8/dc/d1b/d30/d4b/d65/f83 [692462,16581] 0 2026-03-09T16:15:12.138 INFO:tasks.workunit.client.1.vm05.stdout:8/314: read d4/d6/d3a/f25 [1753518,14911] 0 2026-03-09T16:15:12.143 INFO:tasks.workunit.client.1.vm05.stdout:7/387: dwrite d1/d2/d8/dc/f1a [0,4194304] 0 2026-03-09T16:15:12.156 INFO:tasks.workunit.client.1.vm05.stdout:8/315: dwrite d4/d6/db/dc/d2e/f47 [4194304,4194304] 0 2026-03-09T16:15:12.156 INFO:tasks.workunit.client.1.vm05.stdout:4/330: dread d5/de/f16 [0,4194304] 0 2026-03-09T16:15:12.156 INFO:tasks.workunit.client.1.vm05.stdout:4/331: dwrite d5/de/d15/d21/d39/d5d/f6a [0,4194304] 0 2026-03-09T16:15:12.157 INFO:tasks.workunit.client.1.vm05.stdout:3/268: sync 2026-03-09T16:15:12.158 INFO:tasks.workunit.client.1.vm05.stdout:3/269: write d0/d9/d22/d4c/d4e/f50 [75603,42565] 0 2026-03-09T16:15:12.160 INFO:tasks.workunit.client.1.vm05.stdout:3/270: readlink d0/l34 0 2026-03-09T16:15:12.162 INFO:tasks.workunit.client.1.vm05.stdout:9/407: rename d4/l30 to d4/d10/d35/d2b/d31/d82/l8b 0 2026-03-09T16:15:12.182 INFO:tasks.workunit.client.1.vm05.stdout:5/358: write f1 [4980028,125855] 0 2026-03-09T16:15:12.182 INFO:tasks.workunit.client.1.vm05.stdout:5/359: dread - d8/d18/d1b/d47/d4e/f64 zero size 2026-03-09T16:15:12.182 INFO:tasks.workunit.client.1.vm05.stdout:4/332: creat d5/d19/d37/d60/f76 x:0 0 0 2026-03-09T16:15:12.182 INFO:tasks.workunit.client.1.vm05.stdout:9/408: dwrite d4/d10/d35/d36/d48/d54/d59/f5c [0,4194304] 0 2026-03-09T16:15:12.182 INFO:tasks.workunit.client.1.vm05.stdout:9/409: write d4/d10/d35/d2b/f2f [349419,109813] 0 2026-03-09T16:15:12.182 INFO:tasks.workunit.client.1.vm05.stdout:9/410: chown d4/d10/l3e 555 1 2026-03-09T16:15:12.182 INFO:tasks.workunit.client.1.vm05.stdout:5/360: dwrite d8/d59/f5f [4194304,4194304] 0 2026-03-09T16:15:12.182 
INFO:tasks.workunit.client.1.vm05.stdout:5/361: stat d8/d3d/f3f 0 2026-03-09T16:15:12.182 INFO:tasks.workunit.client.1.vm05.stdout:5/362: dread d8/d18/d1b/d2e/d43/f41 [0,4194304] 0 2026-03-09T16:15:12.182 INFO:tasks.workunit.client.1.vm05.stdout:4/333: rename d5/c13 to d5/de/c77 0 2026-03-09T16:15:12.182 INFO:tasks.workunit.client.1.vm05.stdout:4/334: read d5/d19/f32 [605512,123495] 0 2026-03-09T16:15:12.182 INFO:tasks.workunit.client.1.vm05.stdout:4/335: readlink d5/lc 0 2026-03-09T16:15:12.186 INFO:tasks.workunit.client.1.vm05.stdout:3/271: getdents d0/d9 0 2026-03-09T16:15:12.207 INFO:tasks.workunit.client.1.vm05.stdout:9/411: mknod d4/c8c 0 2026-03-09T16:15:12.213 INFO:tasks.workunit.client.1.vm05.stdout:4/336: rmdir d5/d19/d37 39 2026-03-09T16:15:12.216 INFO:tasks.workunit.client.1.vm05.stdout:6/322: truncate f16 14175 0 2026-03-09T16:15:12.218 INFO:tasks.workunit.client.1.vm05.stdout:2/342: rmdir db/dd 39 2026-03-09T16:15:12.219 INFO:tasks.workunit.client.1.vm05.stdout:8/316: getdents d4/d6/d3a 0 2026-03-09T16:15:12.220 INFO:tasks.workunit.client.1.vm05.stdout:9/412: creat d4/d10/f8d x:0 0 0 2026-03-09T16:15:12.229 INFO:tasks.workunit.client.1.vm05.stdout:1/353: dwrite d7/dd/d21/d44/f46 [0,4194304] 0 2026-03-09T16:15:12.229 INFO:tasks.workunit.client.1.vm05.stdout:1/354: dread - d7/d62/f69 zero size 2026-03-09T16:15:12.230 INFO:tasks.workunit.client.1.vm05.stdout:1/355: write d7/d27/f3c [765948,115886] 0 2026-03-09T16:15:12.231 INFO:tasks.workunit.client.1.vm05.stdout:1/356: truncate d7/d15/d16/f66 804468 0 2026-03-09T16:15:12.233 INFO:tasks.workunit.client.1.vm05.stdout:6/323: mkdir d17/d22/d27/d34/d4b/d7b 0 2026-03-09T16:15:12.237 INFO:tasks.workunit.client.1.vm05.stdout:3/272: sync 2026-03-09T16:15:12.238 INFO:tasks.workunit.client.1.vm05.stdout:3/273: write d0/f49 [3778449,46388] 0 2026-03-09T16:15:12.238 INFO:tasks.workunit.client.1.vm05.stdout:3/274: chown d0/d9 110343 1 2026-03-09T16:15:12.247 INFO:tasks.workunit.client.1.vm05.stdout:0/376: truncate d5/db/d1d/f59 415425 0 2026-03-09T16:15:12.258 INFO:tasks.workunit.client.1.vm05.stdout:4/337: write d5/d19/d37/d60/f6c [1675544,60957] 0 2026-03-09T16:15:12.258 INFO:tasks.workunit.client.1.vm05.stdout:4/338: fdatasync d5/d19/d37/d60/f76 0 2026-03-09T16:15:12.265 INFO:tasks.workunit.client.1.vm05.stdout:4/339: creat d5/de/d2f/f78 x:0 0 0 2026-03-09T16:15:12.269 INFO:tasks.workunit.client.1.vm05.stdout:9/413: sync 2026-03-09T16:15:12.274 INFO:tasks.workunit.client.1.vm05.stdout:4/340: dwrite d5/f45 [0,4194304] 0 2026-03-09T16:15:12.278 INFO:tasks.workunit.client.1.vm05.stdout:9/414: write d4/d10/d35/d36/d48/d54/f7e [838053,115904] 0 2026-03-09T16:15:12.294 INFO:tasks.workunit.client.1.vm05.stdout:7/388: truncate d1/d2/d8/d31/f39 812172 0 2026-03-09T16:15:12.295 INFO:tasks.workunit.client.1.vm05.stdout:6/324: getdents d17/d5d/d73 0 2026-03-09T16:15:12.303 INFO:tasks.workunit.client.1.vm05.stdout:7/389: chown d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/f49 173528574 1 2026-03-09T16:15:12.306 INFO:tasks.workunit.client.1.vm05.stdout:3/275: truncate d0/d9/d22/d4c/d4e/f55 6521269 0 2026-03-09T16:15:12.306 INFO:tasks.workunit.client.1.vm05.stdout:3/276: stat d0/l39 0 2026-03-09T16:15:12.308 INFO:tasks.workunit.client.1.vm05.stdout:3/277: dread d0/fd [0,4194304] 0 2026-03-09T16:15:12.317 INFO:tasks.workunit.client.1.vm05.stdout:2/343: rename db/dd/d15/d1f/d21/f39 to db/dd/d15/d3f/f75 0 2026-03-09T16:15:12.319 INFO:tasks.workunit.client.1.vm05.stdout:9/415: creat d4/d10/d35/d36/d48/f8e x:0 0 0 2026-03-09T16:15:12.325 
INFO:tasks.workunit.client.1.vm05.stdout:6/325: dwrite d17/d1d/f67 [4194304,4194304] 0 2026-03-09T16:15:12.326 INFO:tasks.workunit.client.1.vm05.stdout:9/416: read d4/d10/f3f [1610422,60576] 0 2026-03-09T16:15:12.328 INFO:tasks.workunit.client.1.vm05.stdout:6/326: read - d17/d22/d27/d34/d42/d65/f75 zero size 2026-03-09T16:15:12.334 INFO:tasks.workunit.client.1.vm05.stdout:5/363: dwrite d8/f11 [0,4194304] 0 2026-03-09T16:15:12.338 INFO:tasks.workunit.client.1.vm05.stdout:0/377: creat d5/f79 x:0 0 0 2026-03-09T16:15:12.339 INFO:tasks.workunit.client.1.vm05.stdout:0/378: dread - d5/d2c/f63 zero size 2026-03-09T16:15:12.339 INFO:tasks.workunit.client.1.vm05.stdout:0/379: write d5/d11/f40 [960680,127389] 0 2026-03-09T16:15:12.354 INFO:tasks.workunit.client.1.vm05.stdout:2/344: write db/dd/d15/d1f/f25 [6796461,128702] 0 2026-03-09T16:15:12.362 INFO:tasks.workunit.client.1.vm05.stdout:6/327: creat d17/d22/d27/d34/d42/d68/f7c x:0 0 0 2026-03-09T16:15:12.363 INFO:tasks.workunit.client.1.vm05.stdout:5/364: creat d8/d59/f83 x:0 0 0 2026-03-09T16:15:12.363 INFO:tasks.workunit.client.1.vm05.stdout:5/365: chown d8/d53/d7a 115 1 2026-03-09T16:15:12.363 INFO:tasks.workunit.client.1.vm05.stdout:3/278: chown d0/d9/f4b 898201 1 2026-03-09T16:15:12.364 INFO:tasks.workunit.client.1.vm05.stdout:0/380: creat d5/d2c/d49/f7a x:0 0 0 2026-03-09T16:15:12.364 INFO:tasks.workunit.client.1.vm05.stdout:5/366: chown f1 37212959 1 2026-03-09T16:15:12.365 INFO:tasks.workunit.client.1.vm05.stdout:3/279: write d0/d33/f3a [73520,109434] 0 2026-03-09T16:15:12.365 INFO:tasks.workunit.client.1.vm05.stdout:6/328: rename d17/d5d/c62 to d17/d4f/c7d 0 2026-03-09T16:15:12.367 INFO:tasks.workunit.client.1.vm05.stdout:6/329: dread - d17/d4f/f77 zero size 2026-03-09T16:15:12.370 INFO:tasks.workunit.client.1.vm05.stdout:0/381: write d5/d2c/f41 [921659,60930] 0 2026-03-09T16:15:12.370 INFO:tasks.workunit.client.1.vm05.stdout:6/330: dread d17/f3b [0,4194304] 0 2026-03-09T16:15:12.370 INFO:tasks.workunit.client.1.vm05.stdout:7/390: getdents d1/d2/d8/dc/d1b/d71/d3c 0 2026-03-09T16:15:12.370 INFO:tasks.workunit.client.1.vm05.stdout:6/331: write d17/d5d/f78 [808564,102249] 0 2026-03-09T16:15:12.372 INFO:tasks.workunit.client.1.vm05.stdout:6/332: chown d17/d22/d27/d44/f48 358983 1 2026-03-09T16:15:12.379 INFO:tasks.workunit.client.1.vm05.stdout:7/391: mknod d1/d2/d8/d67/d76/c89 0 2026-03-09T16:15:12.392 INFO:tasks.workunit.client.1.vm05.stdout:5/367: creat d8/d53/d7a/f84 x:0 0 0 2026-03-09T16:15:12.393 INFO:tasks.workunit.client.1.vm05.stdout:2/345: getdents db/dd/d15/d1f/d20 0 2026-03-09T16:15:12.393 INFO:tasks.workunit.client.1.vm05.stdout:5/368: dread d8/d18/d1b/f32 [0,4194304] 0 2026-03-09T16:15:12.393 INFO:tasks.workunit.client.1.vm05.stdout:7/392: mkdir d1/d2/d11/d86/d8a 0 2026-03-09T16:15:12.393 INFO:tasks.workunit.client.1.vm05.stdout:7/393: fdatasync d1/d2/f5 0 2026-03-09T16:15:12.393 INFO:tasks.workunit.client.1.vm05.stdout:3/280: creat d0/f56 x:0 0 0 2026-03-09T16:15:12.393 INFO:tasks.workunit.client.1.vm05.stdout:6/333: rename d17/d22/c2f to d17/d1d/c7e 0 2026-03-09T16:15:12.393 INFO:tasks.workunit.client.1.vm05.stdout:6/334: stat d17/d22/d27/d34/d4b/f6c 0 2026-03-09T16:15:12.393 INFO:tasks.workunit.client.1.vm05.stdout:2/346: readlink l4 0 2026-03-09T16:15:12.396 INFO:tasks.workunit.client.1.vm05.stdout:5/369: creat d8/d1d/f85 x:0 0 0 2026-03-09T16:15:12.397 INFO:tasks.workunit.client.1.vm05.stdout:9/417: dread d4/d10/d35/f32 [0,4194304] 0 2026-03-09T16:15:12.406 INFO:tasks.workunit.client.1.vm05.stdout:5/370: dwrite d8/d1d/f44 
[0,4194304] 0 2026-03-09T16:15:12.420 INFO:tasks.workunit.client.1.vm05.stdout:9/418: dread d4/d10/f8a [0,4194304] 0 2026-03-09T16:15:12.421 INFO:tasks.workunit.client.1.vm05.stdout:1/357: truncate d7/dd/de/d52/d5b/f5e 275728 0 2026-03-09T16:15:12.424 INFO:tasks.workunit.client.1.vm05.stdout:8/317: dwrite d4/d6/f58 [0,4194304] 0 2026-03-09T16:15:12.429 INFO:tasks.workunit.client.1.vm05.stdout:8/318: write d4/d6/db/dc/f41 [463340,3710] 0 2026-03-09T16:15:12.429 INFO:tasks.workunit.client.1.vm05.stdout:2/347: dread db/dd/d15/d4c/d56/f62 [0,4194304] 0 2026-03-09T16:15:12.443 INFO:tasks.workunit.client.1.vm05.stdout:0/382: dread d5/f8 [4194304,4194304] 0 2026-03-09T16:15:12.444 INFO:tasks.workunit.client.1.vm05.stdout:6/335: mkdir d17/d22/d27/d34/d4b/d7f 0 2026-03-09T16:15:12.451 INFO:tasks.workunit.client.1.vm05.stdout:7/394: getdents d1/d2/d11/d86/d8a 0 2026-03-09T16:15:12.452 INFO:tasks.workunit.client.1.vm05.stdout:7/395: chown d1/d2/d8/dc/d1b/d71/d3c 1872606 1 2026-03-09T16:15:12.455 INFO:tasks.workunit.client.1.vm05.stdout:1/358: unlink d7/dd/de/f6b 0 2026-03-09T16:15:12.457 INFO:tasks.workunit.client.1.vm05.stdout:1/359: chown d7/d27/f57 9 1 2026-03-09T16:15:12.459 INFO:tasks.workunit.client.1.vm05.stdout:8/319: symlink d4/d6/db/df/d4f/l70 0 2026-03-09T16:15:12.460 INFO:tasks.workunit.client.1.vm05.stdout:6/336: dread d17/d22/d27/d34/f47 [0,4194304] 0 2026-03-09T16:15:12.463 INFO:tasks.workunit.client.1.vm05.stdout:2/348: creat db/dd/d15/d3f/d55/f76 x:0 0 0 2026-03-09T16:15:12.465 INFO:tasks.workunit.client.1.vm05.stdout:8/320: dwrite d4/d6/db/dc/f41 [0,4194304] 0 2026-03-09T16:15:12.466 INFO:tasks.workunit.client.1.vm05.stdout:8/321: chown d4/d6/db/dc/f2a 44 1 2026-03-09T16:15:12.466 INFO:tasks.workunit.client.1.vm05.stdout:8/322: chown d4/d6/db/df/d4f 28778799 1 2026-03-09T16:15:12.487 INFO:tasks.workunit.client.1.vm05.stdout:4/341: truncate d5/de/d15/d21/d39/f42 182476 0 2026-03-09T16:15:12.496 INFO:tasks.workunit.client.1.vm05.stdout:5/371: unlink d8/c2f 0 2026-03-09T16:15:12.497 INFO:tasks.workunit.client.1.vm05.stdout:5/372: truncate d8/d18/d1b/d2e/f35 2586591 0 2026-03-09T16:15:12.498 INFO:tasks.workunit.client.1.vm05.stdout:7/396: fsync d1/d2/d8/dc/d1b/d71/f46 0 2026-03-09T16:15:12.499 INFO:tasks.workunit.client.1.vm05.stdout:9/419: link d4/d10/d35/d2b/d38/f78 d4/d10/d35/d36/d48/d60/f8f 0 2026-03-09T16:15:12.510 INFO:tasks.workunit.client.1.vm05.stdout:1/360: rename d7/d62/d72/l77 to d7/dd/d21/d44/l85 0 2026-03-09T16:15:12.520 INFO:tasks.workunit.client.1.vm05.stdout:8/323: mkdir d4/d6/d3a/d40/d71 0 2026-03-09T16:15:12.525 INFO:tasks.workunit.client.1.vm05.stdout:8/324: dwrite d4/d6/db/dc/f41 [0,4194304] 0 2026-03-09T16:15:12.525 INFO:tasks.workunit.client.1.vm05.stdout:2/349: dread db/f17 [0,4194304] 0 2026-03-09T16:15:12.532 INFO:tasks.workunit.client.1.vm05.stdout:9/420: symlink d4/d10/d35/d36/d48/l90 0 2026-03-09T16:15:12.537 INFO:tasks.workunit.client.1.vm05.stdout:3/281: dwrite d0/d9/fa [4194304,4194304] 0 2026-03-09T16:15:12.539 INFO:tasks.workunit.client.1.vm05.stdout:2/350: dwrite db/dd/d15/d46/d67/f73 [0,4194304] 0 2026-03-09T16:15:12.540 INFO:tasks.workunit.client.1.vm05.stdout:2/351: dread - db/dd/d15/f70 zero size 2026-03-09T16:15:12.546 INFO:tasks.workunit.client.1.vm05.stdout:2/352: dread - db/dd/f6d zero size 2026-03-09T16:15:12.549 INFO:tasks.workunit.client.1.vm05.stdout:3/282: dwrite d0/d9/d22/d4c/d4e/f50 [0,4194304] 0 2026-03-09T16:15:12.550 INFO:tasks.workunit.client.1.vm05.stdout:2/353: chown db/dd/d15/d3f/d5b/d60/d6a/l72 1219 1 2026-03-09T16:15:12.554 
INFO:tasks.workunit.client.1.vm05.stdout:1/361: creat d7/dd/d21/d39/f86 x:0 0 0 2026-03-09T16:15:12.559 INFO:tasks.workunit.client.1.vm05.stdout:2/354: dread db/dd/d15/d46/d67/f73 [0,4194304] 0 2026-03-09T16:15:12.560 INFO:tasks.workunit.client.1.vm05.stdout:2/355: write db/dd/d15/d3f/f4a [3598088,63122] 0 2026-03-09T16:15:12.561 INFO:tasks.workunit.client.1.vm05.stdout:0/383: rmdir d5/d11/d71 0 2026-03-09T16:15:12.562 INFO:tasks.workunit.client.1.vm05.stdout:0/384: dread - d5/d1b/f56 zero size 2026-03-09T16:15:12.566 INFO:tasks.workunit.client.1.vm05.stdout:8/325: mknod d4/d6/db/dc/d5d/c72 0 2026-03-09T16:15:12.567 INFO:tasks.workunit.client.1.vm05.stdout:8/326: fdatasync d4/d6/d3a/f28 0 2026-03-09T16:15:12.569 INFO:tasks.workunit.client.1.vm05.stdout:5/373: rename d8/d18/l37 to d8/d18/d1b/d47/d48/d73/l86 0 2026-03-09T16:15:12.576 INFO:tasks.workunit.client.1.vm05.stdout:2/356: creat db/dd/d15/d46/d67/f77 x:0 0 0 2026-03-09T16:15:12.579 INFO:tasks.workunit.client.1.vm05.stdout:0/385: unlink d5/d1b/d30/f2f 0 2026-03-09T16:15:12.579 INFO:tasks.workunit.client.1.vm05.stdout:2/357: dread db/dd/d15/d1f/f25 [4194304,4194304] 0 2026-03-09T16:15:12.579 INFO:tasks.workunit.client.1.vm05.stdout:0/386: chown d5/d1b/f25 124830373 1 2026-03-09T16:15:12.583 INFO:tasks.workunit.client.1.vm05.stdout:8/327: mkdir d4/d6/db/df/d73 0 2026-03-09T16:15:12.583 INFO:tasks.workunit.client.1.vm05.stdout:8/328: write d4/f23 [748036,105463] 0 2026-03-09T16:15:12.584 INFO:tasks.workunit.client.1.vm05.stdout:8/329: read - d4/d6/d3a/f49 zero size 2026-03-09T16:15:12.586 INFO:tasks.workunit.client.1.vm05.stdout:1/362: rename d7/dd/d21/d44/d5c/d78 to d7/dd/d21/d39/d87 0 2026-03-09T16:15:12.587 INFO:tasks.workunit.client.1.vm05.stdout:2/358: dread db/dd/d15/d1f/f36 [0,4194304] 0 2026-03-09T16:15:12.588 INFO:tasks.workunit.client.1.vm05.stdout:2/359: write db/dd/f1b [4608609,90746] 0 2026-03-09T16:15:12.589 INFO:tasks.workunit.client.1.vm05.stdout:6/337: link d17/d1d/c3e d17/d5d/d73/c80 0 2026-03-09T16:15:12.590 INFO:tasks.workunit.client.1.vm05.stdout:0/387: creat d5/db/d5f/f7b x:0 0 0 2026-03-09T16:15:12.591 INFO:tasks.workunit.client.1.vm05.stdout:4/342: getdents d5/de/d15/d21/d31 0 2026-03-09T16:15:12.593 INFO:tasks.workunit.client.1.vm05.stdout:2/360: dread db/dd/d15/d46/d67/f73 [0,4194304] 0 2026-03-09T16:15:12.594 INFO:tasks.workunit.client.1.vm05.stdout:8/330: symlink d4/d6/d3a/d3c/l74 0 2026-03-09T16:15:12.594 INFO:tasks.workunit.client.1.vm05.stdout:9/421: creat d4/d10/d35/d36/f91 x:0 0 0 2026-03-09T16:15:12.595 INFO:tasks.workunit.client.1.vm05.stdout:0/388: dwrite d5/d2c/f41 [0,4194304] 0 2026-03-09T16:15:12.597 INFO:tasks.workunit.client.1.vm05.stdout:3/283: creat d0/f57 x:0 0 0 2026-03-09T16:15:12.599 INFO:tasks.workunit.client.1.vm05.stdout:2/361: mkdir db/dd/d15/d1f/d20/d23/d78 0 2026-03-09T16:15:12.612 INFO:tasks.workunit.client.1.vm05.stdout:5/374: link d8/d1d/l5d d8/d18/d1b/d47/l87 0 2026-03-09T16:15:12.612 INFO:tasks.workunit.client.1.vm05.stdout:1/363: truncate d7/dd/d21/f2b 2654119 0 2026-03-09T16:15:12.612 INFO:tasks.workunit.client.1.vm05.stdout:1/364: chown d7/dd/f19 338777420 1 2026-03-09T16:15:12.612 INFO:tasks.workunit.client.1.vm05.stdout:9/422: rename d4/d10/d35/d36/d48/d60/c73 to d4/d10/d35/d2b/d31/c92 0 2026-03-09T16:15:12.612 INFO:tasks.workunit.client.1.vm05.stdout:1/365: chown d7/dd/de/d52/d5b 1268 1 2026-03-09T16:15:12.612 INFO:tasks.workunit.client.1.vm05.stdout:3/284: dwrite d0/d33/f29 [0,4194304] 0 2026-03-09T16:15:12.612 INFO:tasks.workunit.client.1.vm05.stdout:3/285: write 
d0/d33/f3a [943430,38918] 0 2026-03-09T16:15:12.612 INFO:tasks.workunit.client.1.vm05.stdout:6/338: rename d17/d22/d27/d34/c56 to d17/d22/d27/d34/c81 0 2026-03-09T16:15:12.612 INFO:tasks.workunit.client.1.vm05.stdout:2/362: rename db/dd/d15/d1f/c45 to db/dd/d15/d4c/d56/c79 0 2026-03-09T16:15:12.612 INFO:tasks.workunit.client.1.vm05.stdout:9/423: creat d4/d10/d35/d36/d48/d4c/f93 x:0 0 0 2026-03-09T16:15:12.612 INFO:tasks.workunit.client.1.vm05.stdout:9/424: write d4/d10/f8d [218754,104389] 0 2026-03-09T16:15:12.617 INFO:tasks.workunit.client.1.vm05.stdout:8/331: mkdir d4/d6/db/d75 0 2026-03-09T16:15:12.620 INFO:tasks.workunit.client.1.vm05.stdout:4/343: creat d5/de/d15/d21/f79 x:0 0 0 2026-03-09T16:15:12.620 INFO:tasks.workunit.client.1.vm05.stdout:0/389: creat d5/db/f7c x:0 0 0 2026-03-09T16:15:12.622 INFO:tasks.workunit.client.1.vm05.stdout:1/366: rename d7/dd/c4f to d7/d62/d72/c88 0 2026-03-09T16:15:12.622 INFO:tasks.workunit.client.1.vm05.stdout:9/425: mkdir d4/d10/d35/d36/d48/d60/d94 0 2026-03-09T16:15:12.622 INFO:tasks.workunit.client.1.vm05.stdout:2/363: chown db/c30 959 1 2026-03-09T16:15:12.623 INFO:tasks.workunit.client.1.vm05.stdout:3/286: mknod d0/c58 0 2026-03-09T16:15:12.623 INFO:tasks.workunit.client.1.vm05.stdout:3/287: readlink d0/l39 0 2026-03-09T16:15:12.624 INFO:tasks.workunit.client.1.vm05.stdout:8/332: creat d4/d6/d3a/d40/f76 x:0 0 0 2026-03-09T16:15:12.625 INFO:tasks.workunit.client.1.vm05.stdout:0/390: stat d5/ca 0 2026-03-09T16:15:12.626 INFO:tasks.workunit.client.1.vm05.stdout:5/375: creat d8/d18/d1b/f88 x:0 0 0 2026-03-09T16:15:12.626 INFO:tasks.workunit.client.1.vm05.stdout:4/344: rename d5/f3b to d5/de/d15/d21/d27/f7a 0 2026-03-09T16:15:12.629 INFO:tasks.workunit.client.1.vm05.stdout:2/364: creat db/dd/d15/d1f/d20/d23/f7a x:0 0 0 2026-03-09T16:15:12.629 INFO:tasks.workunit.client.1.vm05.stdout:3/288: symlink d0/d9/d22/d4c/d4e/l59 0 2026-03-09T16:15:12.630 INFO:tasks.workunit.client.1.vm05.stdout:3/289: readlink d0/l34 0 2026-03-09T16:15:12.630 INFO:tasks.workunit.client.1.vm05.stdout:4/345: dread - d5/de/d15/d21/f3a zero size 2026-03-09T16:15:12.632 INFO:tasks.workunit.client.1.vm05.stdout:5/376: unlink d8/d18/d1b/d47/d4e/f57 0 2026-03-09T16:15:12.645 INFO:tasks.workunit.client.1.vm05.stdout:2/365: mkdir db/dd/d7b 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:4/346: mkdir d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d7b 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:1/367: dread d7/f9 [0,4194304] 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:9/426: dread d4/d10/d35/d2b/d38/f62 [0,4194304] 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:9/427: chown d4/d10/d35/d2b 0 1 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:5/377: creat d8/d3d/f89 x:0 0 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:8/333: creat d4/f77 x:0 0 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:3/290: creat d0/f5a x:0 0 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:3/291: fsync d0/d9/f51 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:8/334: chown d4/d6/db/dc/l48 212 1 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:4/347: mknod d5/de/d15/d21/d39/d5d/c7c 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:8/335: chown d4/d6/db/dc/f17 12977 1 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:1/368: symlink d7/d15/d45/l89 0 2026-03-09T16:15:12.665 
INFO:tasks.workunit.client.1.vm05.stdout:2/366: creat db/dd/d15/d3f/d5b/d60/f7c x:0 0 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:1/369: fsync d7/dd/d21/d3b/f65 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:3/292: rename d0/d9/fa to d0/d9/d22/f5b 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:5/378: dwrite d8/d59/f83 [0,4194304] 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:8/336: symlink d4/d6/db/dc/d5d/l78 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:1/370: rename d7/l1d to d7/dd/d21/d39/d5a/d50/l8a 0 2026-03-09T16:15:12.665 INFO:tasks.workunit.client.1.vm05.stdout:1/371: chown d7/dd/de/f23 2268 1 2026-03-09T16:15:12.669 INFO:tasks.workunit.client.1.vm05.stdout:8/337: truncate d4/d6/db/df/f18 2069646 0 2026-03-09T16:15:12.671 INFO:tasks.workunit.client.1.vm05.stdout:8/338: getdents d4/d6/d3a/d67 0 2026-03-09T16:15:12.672 INFO:tasks.workunit.client.1.vm05.stdout:8/339: dread - d4/d6/d3a/d3c/f6f zero size 2026-03-09T16:15:12.676 INFO:tasks.workunit.client.1.vm05.stdout:8/340: dwrite d4/f23 [0,4194304] 0 2026-03-09T16:15:12.678 INFO:tasks.workunit.client.1.vm05.stdout:8/341: mkdir d4/d6/db/dc/d5d/d79 0 2026-03-09T16:15:12.678 INFO:tasks.workunit.client.1.vm05.stdout:8/342: fsync d4/f3e 0 2026-03-09T16:15:12.680 INFO:tasks.workunit.client.1.vm05.stdout:8/343: link d4/f13 d4/d6/db/dc/d5d/f7a 0 2026-03-09T16:15:12.683 INFO:tasks.workunit.client.1.vm05.stdout:8/344: link d4/d6/d3a/d15/f66 d4/d6/d3a/d40/f7b 0 2026-03-09T16:15:12.687 INFO:tasks.workunit.client.1.vm05.stdout:8/345: chown d4/d6/db/d59 30 1 2026-03-09T16:15:12.688 INFO:tasks.workunit.client.1.vm05.stdout:8/346: mkdir d4/d6/d3a/d7c 0 2026-03-09T16:15:12.689 INFO:tasks.workunit.client.1.vm05.stdout:8/347: mknod d4/d6/d3a/d15/c7d 0 2026-03-09T16:15:12.690 INFO:tasks.workunit.client.1.vm05.stdout:8/348: symlink d4/d6/d3a/d40/d71/l7e 0 2026-03-09T16:15:12.691 INFO:tasks.workunit.client.1.vm05.stdout:8/349: creat d4/d6/d53/f7f x:0 0 0 2026-03-09T16:15:12.733 INFO:tasks.workunit.client.1.vm05.stdout:2/367: sync 2026-03-09T16:15:12.733 INFO:tasks.workunit.client.1.vm05.stdout:1/372: sync 2026-03-09T16:15:12.735 INFO:tasks.workunit.client.1.vm05.stdout:1/373: write d7/d15/d16/f66 [1752572,46336] 0 2026-03-09T16:15:12.735 INFO:tasks.workunit.client.1.vm05.stdout:1/374: fdatasync d7/dd/d21/d39/d48/f59 0 2026-03-09T16:15:12.738 INFO:tasks.workunit.client.1.vm05.stdout:2/368: rename db/dd/f6d to db/dd/d15/d3f/d5b/f7d 0 2026-03-09T16:15:12.739 INFO:tasks.workunit.client.1.vm05.stdout:1/375: creat d7/dd/de/d52/d5b/f8b x:0 0 0 2026-03-09T16:15:12.740 INFO:tasks.workunit.client.1.vm05.stdout:1/376: write d7/d62/d72/f79 [158244,17809] 0 2026-03-09T16:15:12.741 INFO:tasks.workunit.client.1.vm05.stdout:1/377: chown d7/d15 11232 1 2026-03-09T16:15:12.744 INFO:tasks.workunit.client.1.vm05.stdout:1/378: mkdir d7/dd/d21/d39/d48/d8c 0 2026-03-09T16:15:12.760 INFO:tasks.workunit.client.1.vm05.stdout:2/369: dwrite db/dd/d15/d3f/d55/f76 [0,4194304] 0 2026-03-09T16:15:12.760 INFO:tasks.workunit.client.1.vm05.stdout:1/379: write d7/dd/d21/d39/d5a/f41 [2529476,3326] 0 2026-03-09T16:15:12.760 INFO:tasks.workunit.client.1.vm05.stdout:2/370: mkdir db/dd/d15/d3f/d5b/d7e 0 2026-03-09T16:15:12.760 INFO:tasks.workunit.client.1.vm05.stdout:1/380: creat d7/d15/f8d x:0 0 0 2026-03-09T16:15:12.760 INFO:tasks.workunit.client.1.vm05.stdout:2/371: unlink db/l34 0 2026-03-09T16:15:12.760 INFO:tasks.workunit.client.1.vm05.stdout:1/381: dread d7/f3f [0,4194304] 0 
2026-03-09T16:15:12.760 INFO:tasks.workunit.client.1.vm05.stdout:1/382: fsync d7/d15/d16/f5f 0 2026-03-09T16:15:12.760 INFO:tasks.workunit.client.1.vm05.stdout:1/383: creat d7/dd/d21/d44/f8e x:0 0 0 2026-03-09T16:15:12.760 INFO:tasks.workunit.client.1.vm05.stdout:1/384: symlink d7/dd/d21/d63/d71/l8f 0 2026-03-09T16:15:12.763 INFO:tasks.workunit.client.1.vm05.stdout:0/391: truncate d5/d1b/f50 698384 0 2026-03-09T16:15:12.772 INFO:tasks.workunit.client.1.vm05.stdout:7/397: truncate d1/d2/d8/dc/f3b 87484 0 2026-03-09T16:15:12.773 INFO:tasks.workunit.client.1.vm05.stdout:4/348: dwrite d5/de/d15/d21/d39/f42 [0,4194304] 0 2026-03-09T16:15:12.773 INFO:tasks.workunit.client.1.vm05.stdout:4/349: chown d5/l1d 2115 1 2026-03-09T16:15:12.775 INFO:tasks.workunit.client.1.vm05.stdout:4/350: mkdir d5/de/d15/d21/d39/d5d/d7d 0 2026-03-09T16:15:12.775 INFO:tasks.workunit.client.1.vm05.stdout:7/398: getdents d1/d2/d8/dc/d72 0 2026-03-09T16:15:12.776 INFO:tasks.workunit.client.1.vm05.stdout:4/351: dread - d5/de/d15/d21/d27/d3c/d5c/d5f/f57 zero size 2026-03-09T16:15:12.777 INFO:tasks.workunit.client.1.vm05.stdout:4/352: readlink d5/lc 0 2026-03-09T16:15:12.777 INFO:tasks.workunit.client.1.vm05.stdout:5/379: stat d8/d18/d1b/d2e/l39 0 2026-03-09T16:15:12.778 INFO:tasks.workunit.client.1.vm05.stdout:4/353: stat d5/de/d15/d21/d27/d3c/d5c/d5f/c4b 0 2026-03-09T16:15:12.778 INFO:tasks.workunit.client.1.vm05.stdout:7/399: symlink d1/d2/d8/dc/d14/l8b 0 2026-03-09T16:15:12.781 INFO:tasks.workunit.client.1.vm05.stdout:7/400: write d1/d2/d8/dc/d33/f57 [2659186,63531] 0 2026-03-09T16:15:12.782 INFO:tasks.workunit.client.1.vm05.stdout:1/385: read d7/f4b [104167,18526] 0 2026-03-09T16:15:12.784 INFO:tasks.workunit.client.1.vm05.stdout:5/380: creat d8/d53/d7e/f8a x:0 0 0 2026-03-09T16:15:12.785 INFO:tasks.workunit.client.1.vm05.stdout:7/401: read d1/d2/d8/dc/f1e [3081669,48969] 0 2026-03-09T16:15:12.786 INFO:tasks.workunit.client.1.vm05.stdout:1/386: write d7/d15/d16/f74 [422641,95350] 0 2026-03-09T16:15:12.787 INFO:tasks.workunit.client.1.vm05.stdout:4/354: rename d5/de/d15/d21/d27/d3c/d5c/d5f/f5e to d5/de/d15/d21/d39/d5d/d7d/f7e 0 2026-03-09T16:15:12.791 INFO:tasks.workunit.client.1.vm05.stdout:7/402: mknod d1/d2/d8/dc/d33/c8c 0 2026-03-09T16:15:12.792 INFO:tasks.workunit.client.1.vm05.stdout:7/403: chown d1/d2/d8/d31/f51 55 1 2026-03-09T16:15:12.795 INFO:tasks.workunit.client.1.vm05.stdout:4/355: mknod d5/de/d15/d21/d39/d5d/d7d/c7f 0 2026-03-09T16:15:12.801 INFO:tasks.workunit.client.1.vm05.stdout:1/387: rename d7/dd/d21/d39/d5a/d50/f7f to d7/d62/f90 0 2026-03-09T16:15:12.802 INFO:tasks.workunit.client.1.vm05.stdout:1/388: write d7/dd/d21/d63/d71/f7b [207207,83146] 0 2026-03-09T16:15:12.807 INFO:tasks.workunit.client.1.vm05.stdout:4/356: fsync d5/de/d15/f25 0 2026-03-09T16:15:12.808 INFO:tasks.workunit.client.1.vm05.stdout:1/389: unlink d7/dd/f19 0 2026-03-09T16:15:12.815 INFO:tasks.workunit.client.1.vm05.stdout:5/381: sync 2026-03-09T16:15:12.816 INFO:tasks.workunit.client.1.vm05.stdout:5/382: dread - d8/d53/d7a/f84 zero size 2026-03-09T16:15:12.820 INFO:tasks.workunit.client.1.vm05.stdout:4/357: rmdir d5/d19/d37 39 2026-03-09T16:15:12.822 INFO:tasks.workunit.client.1.vm05.stdout:1/390: symlink d7/d27/l91 0 2026-03-09T16:15:12.824 INFO:tasks.workunit.client.1.vm05.stdout:5/383: mkdir d8/d59/d5b/d8b 0 2026-03-09T16:15:12.826 INFO:tasks.workunit.client.1.vm05.stdout:4/358: mknod d5/d19/c80 0 2026-03-09T16:15:12.851 INFO:tasks.workunit.client.1.vm05.stdout:4/359: stat d5/de/d15/d21/d27/d3c/d5c 0 2026-03-09T16:15:12.852 
INFO:tasks.workunit.client.1.vm05.stdout:9/428: getdents d4/d10/d35/d36/d48/d4c 0 2026-03-09T16:15:12.852 INFO:tasks.workunit.client.1.vm05.stdout:9/429: fdatasync d4/d10/d35/d36/d48/d54/d59/f5c 0 2026-03-09T16:15:12.852 INFO:tasks.workunit.client.1.vm05.stdout:5/384: creat d8/d18/d1b/d2e/f8c x:0 0 0 2026-03-09T16:15:12.852 INFO:tasks.workunit.client.1.vm05.stdout:5/385: chown d8/d18/d1b/d47 6971767 1 2026-03-09T16:15:12.852 INFO:tasks.workunit.client.1.vm05.stdout:5/386: fdatasync d8/d18/d1b/f2a 0 2026-03-09T16:15:12.852 INFO:tasks.workunit.client.1.vm05.stdout:1/391: link d7/dd/d21/d44/l85 d7/d15/d45/l92 0 2026-03-09T16:15:12.852 INFO:tasks.workunit.client.1.vm05.stdout:1/392: readlink d7/dd/d21/d63/d71/l82 0 2026-03-09T16:15:12.852 INFO:tasks.workunit.client.1.vm05.stdout:1/393: fdatasync d7/dd/d21/d39/d48/f59 0 2026-03-09T16:15:12.852 INFO:tasks.workunit.client.1.vm05.stdout:1/394: truncate d7/dd/de/d52/f7d 3073 0 2026-03-09T16:15:12.852 INFO:tasks.workunit.client.1.vm05.stdout:3/293: truncate d0/d33/f3a 421394 0 2026-03-09T16:15:12.852 INFO:tasks.workunit.client.1.vm05.stdout:5/387: symlink d8/d18/d1b/d47/l8d 0 2026-03-09T16:15:12.853 INFO:tasks.workunit.client.1.vm05.stdout:8/350: read d4/d6/db/df/f18 [90941,26174] 0 2026-03-09T16:15:12.853 INFO:tasks.workunit.client.1.vm05.stdout:1/395: creat d7/dd/f93 x:0 0 0 2026-03-09T16:15:12.853 INFO:tasks.workunit.client.1.vm05.stdout:9/430: rename d4/c3a to d4/d10/d35/d2b/d38/c95 0 2026-03-09T16:15:12.853 INFO:tasks.workunit.client.1.vm05.stdout:9/431: chown d4/d10/d35/d36/d48/f68 6 1 2026-03-09T16:15:12.853 INFO:tasks.workunit.client.1.vm05.stdout:5/388: unlink d8/d1d/f1e 0 2026-03-09T16:15:12.855 INFO:tasks.workunit.client.1.vm05.stdout:8/351: mkdir d4/d6/db/df/d80 0 2026-03-09T16:15:12.867 INFO:tasks.workunit.client.1.vm05.stdout:9/432: mkdir d4/d10/d35/d2b/d31/d96 0 2026-03-09T16:15:12.867 INFO:tasks.workunit.client.1.vm05.stdout:8/352: write d4/d6/f44 [77490,124291] 0 2026-03-09T16:15:12.867 INFO:tasks.workunit.client.1.vm05.stdout:5/389: mkdir d8/d5e/d8e 0 2026-03-09T16:15:12.867 INFO:tasks.workunit.client.1.vm05.stdout:1/396: dread d7/d27/f3c [0,4194304] 0 2026-03-09T16:15:12.867 INFO:tasks.workunit.client.1.vm05.stdout:2/372: truncate db/dd/d15/d4c/f58 1525922 0 2026-03-09T16:15:12.867 INFO:tasks.workunit.client.1.vm05.stdout:9/433: truncate d4/d10/d35/d36/d48/d54/f7e 1447834 0 2026-03-09T16:15:12.867 INFO:tasks.workunit.client.1.vm05.stdout:1/397: symlink d7/dd/d21/d39/d5a/l94 0 2026-03-09T16:15:12.867 INFO:tasks.workunit.client.1.vm05.stdout:8/353: read d4/d6/f1b [1542459,31778] 0 2026-03-09T16:15:12.867 INFO:tasks.workunit.client.1.vm05.stdout:8/354: stat d4/f77 0 2026-03-09T16:15:12.867 INFO:tasks.workunit.client.1.vm05.stdout:2/373: mknod db/dd/d15/d1f/d21/c7f 0 2026-03-09T16:15:12.867 INFO:tasks.workunit.client.1.vm05.stdout:1/398: stat d7/dd/de/d52/d5b 0 2026-03-09T16:15:12.869 INFO:tasks.workunit.client.1.vm05.stdout:5/390: write d8/d18/d1b/f36 [10860,94509] 0 2026-03-09T16:15:12.870 INFO:tasks.workunit.client.1.vm05.stdout:6/339: dwrite d17/d22/d27/f2a [0,4194304] 0 2026-03-09T16:15:12.870 INFO:tasks.workunit.client.1.vm05.stdout:8/355: write d4/d6/f5f [787247,53826] 0 2026-03-09T16:15:12.873 INFO:tasks.workunit.client.1.vm05.stdout:6/340: write d17/d4f/f70 [210331,39723] 0 2026-03-09T16:15:12.882 INFO:tasks.workunit.client.1.vm05.stdout:5/391: write d8/d18/d1b/f31 [3155545,56660] 0 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:9/434: dwrite d4/d10/d35/d36/d48/d60/f6c [0,4194304] 0 2026-03-09T16:15:12.908 
INFO:tasks.workunit.client.1.vm05.stdout:1/399: mkdir d7/dd/d21/d3b/d55/d95 0 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:5/392: dread - d8/d59/f5c zero size 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:2/374: creat db/dd/d15/d3f/d55/f80 x:0 0 0 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:6/341: dread - d17/d22/d27/d34/d42/d65/f75 zero size 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:7/404: dwrite d1/d2/d11/f54 [0,4194304] 0 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:2/375: chown db/dd/d15/d1f/c3a 13 1 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:7/405: stat d1/d2/d11/c82 0 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:1/400: write d7/d15/d16/f66 [1344878,106318] 0 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:2/376: readlink db/dd/l16 0 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:2/377: chown db/dd/d15/f70 129047071 1 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:7/406: rename d1/d2/d8/dc/d18 to d1/d2/d8/d31/d8d 0 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:6/342: truncate d17/f3b 614993 0 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:1/401: write d7/dd/de/f3e [2073338,12208] 0 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:5/393: mkdir d8/d18/d1b/d47/d4e/d76/d8f 0 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:7/407: creat d1/d2/d8/dc/d1b/d30/d4b/d65/f8e x:0 0 0 2026-03-09T16:15:12.908 INFO:tasks.workunit.client.1.vm05.stdout:8/356: getdents d4 0 2026-03-09T16:15:12.915 INFO:tasks.workunit.client.1.vm05.stdout:8/357: chown d4/d6/f5f 25304705 1 2026-03-09T16:15:12.915 INFO:tasks.workunit.client.1.vm05.stdout:6/343: dwrite d17/d22/d27/d34/d4b/f6c [0,4194304] 0 2026-03-09T16:15:12.916 INFO:tasks.workunit.client.1.vm05.stdout:9/435: rmdir d4/d10/d35/d36/d48/d4c/d89 0 2026-03-09T16:15:12.925 INFO:tasks.workunit.client.1.vm05.stdout:7/408: dwrite d1/d2/d8/dc/d1b/d71/f74 [0,4194304] 0 2026-03-09T16:15:12.928 INFO:tasks.workunit.client.1.vm05.stdout:9/436: chown d4/d10/f80 142964484 1 2026-03-09T16:15:12.931 INFO:tasks.workunit.client.1.vm05.stdout:6/344: mknod d17/d22/d27/d34/d42/c82 0 2026-03-09T16:15:12.931 INFO:tasks.workunit.client.1.vm05.stdout:8/358: rename d4/d6/db/d54/l56 to d4/d6/db/dc/d5d/d79/l81 0 2026-03-09T16:15:12.931 INFO:tasks.workunit.client.1.vm05.stdout:5/394: dread d8/d1d/f44 [0,4194304] 0 2026-03-09T16:15:12.934 INFO:tasks.workunit.client.1.vm05.stdout:8/359: stat d4/d6/d3a/d3c/l68 0 2026-03-09T16:15:12.934 INFO:tasks.workunit.client.1.vm05.stdout:7/409: dwrite d1/d2/d8/dc/d1b/d71/f59 [0,4194304] 0 2026-03-09T16:15:12.936 INFO:tasks.workunit.client.1.vm05.stdout:1/402: dread d7/d27/f33 [0,4194304] 0 2026-03-09T16:15:12.938 INFO:tasks.workunit.client.1.vm05.stdout:6/345: dwrite d17/d4f/f70 [0,4194304] 0 2026-03-09T16:15:12.944 INFO:tasks.workunit.client.1.vm05.stdout:9/437: rename d4/d10/d35/d36/d48/d60/f7d to d4/d10/d35/d2b/d31/d96/f97 0 2026-03-09T16:15:12.944 INFO:tasks.workunit.client.1.vm05.stdout:7/410: dwrite d1/d2/d8/dc/f1a [0,4194304] 0 2026-03-09T16:15:12.947 INFO:tasks.workunit.client.1.vm05.stdout:6/346: dwrite d17/f18 [0,4194304] 0 2026-03-09T16:15:12.948 INFO:tasks.workunit.client.1.vm05.stdout:5/395: mkdir d8/d18/d1b/d78/d90 0 2026-03-09T16:15:12.950 INFO:tasks.workunit.client.1.vm05.stdout:5/396: write d8/d18/d1b/d2e/f8c [690354,61172] 0 2026-03-09T16:15:12.951 
INFO:tasks.workunit.client.1.vm05.stdout:1/403: dread d7/dd/d21/d39/d48/f59 [0,4194304] 0 2026-03-09T16:15:12.970 INFO:tasks.workunit.client.1.vm05.stdout:4/360: dread d5/d19/f1f [0,4194304] 0 2026-03-09T16:15:12.971 INFO:tasks.workunit.client.1.vm05.stdout:9/438: truncate d4/d10/d35/d36/d48/f6e 850986 0 2026-03-09T16:15:12.972 INFO:tasks.workunit.client.1.vm05.stdout:9/439: write d4/d10/d35/d36/f77 [240548,118753] 0 2026-03-09T16:15:12.974 INFO:tasks.workunit.client.1.vm05.stdout:7/411: creat d1/d2/d8/dc/d1b/d30/d4b/d65/f8f x:0 0 0 2026-03-09T16:15:12.977 INFO:tasks.workunit.client.1.vm05.stdout:6/347: mkdir d17/d5d/d73/d83 0 2026-03-09T16:15:12.980 INFO:tasks.workunit.client.1.vm05.stdout:4/361: unlink d5/de/d15/d21/f3a 0 2026-03-09T16:15:12.981 INFO:tasks.workunit.client.1.vm05.stdout:4/362: dread - d5/de/d15/f52 zero size 2026-03-09T16:15:12.981 INFO:tasks.workunit.client.1.vm05.stdout:4/363: stat d5/l8 0 2026-03-09T16:15:12.982 INFO:tasks.workunit.client.1.vm05.stdout:8/360: rmdir d4/d6/db/df/d73 0 2026-03-09T16:15:12.983 INFO:tasks.workunit.client.1.vm05.stdout:8/361: chown d4/d6/d3a/f28 701361528 1 2026-03-09T16:15:12.987 INFO:tasks.workunit.client.1.vm05.stdout:5/397: mkdir d8/d91 0 2026-03-09T16:15:12.988 INFO:tasks.workunit.client.1.vm05.stdout:6/348: creat d17/d5d/f84 x:0 0 0 2026-03-09T16:15:12.991 INFO:tasks.workunit.client.1.vm05.stdout:9/440: dwrite d4/d10/d35/d2b/d38/d65/f6a [0,4194304] 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:4/364: truncate d5/de/d15/f38 2169736 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:4/365: readlink d5/l8 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:9/441: dwrite d4/d10/d35/d36/d48/d54/d59/f7a [0,4194304] 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:7/412: symlink d1/d2/d8/dc/d1b/d30/d4b/l90 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:7/413: chown d1/d2/d8/d67/d76 49056 1 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:5/398: rename d8/d18/d1b/f88 to d8/d53/d7a/f92 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:6/349: creat d17/d22/d27/d34/f85 x:0 0 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:9/442: creat d4/d10/d35/d36/d48/d60/f98 x:0 0 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:6/350: creat d17/d22/d27/d44/f86 x:0 0 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:9/443: creat d4/d10/d35/d2b/d31/f99 x:0 0 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:8/362: link d4/d6/d3a/d40/c4a d4/d6/db/d75/c82 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:6/351: mkdir d17/d22/d27/d34/d42/d53/d87 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:6/352: write d17/d22/d27/d34/d42/d68/f7c [571169,49890] 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:7/414: dread d1/d2/d8/d31/f39 [0,4194304] 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:9/444: fsync d4/f17 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:4/366: link c3 d5/d19/d37/c81 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:8/363: truncate d4/d6/d3a/f28 586 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:9/445: fdatasync d4/d10/f8a 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:9/446: read - d4/d10/d35/d36/f67 zero size 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:7/415: unlink d1/d2/d8/d31/d8d/l55 0 
2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:6/353: creat d17/d22/d27/d34/d4b/d7b/f88 x:0 0 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:8/364: dread d4/d6/db/df/f18 [0,4194304] 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:9/447: mknod d4/d10/d35/d2b/d31/c9a 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:7/416: read - d1/d2/d8/d31/d8d/f52 zero size 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:6/354: write d17/d22/d27/d34/f85 [521380,87430] 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:9/448: read - d4/d10/d35/d2b/d31/f76 zero size 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:7/417: chown d1/d2/d8/dc/d1b/d30/d4b/d65 643001834 1 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:6/355: chown d17/d1d/f1e 8022957 1 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:9/449: chown d4/d10/d35/d2b/d31/c92 9849 1 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:7/418: mkdir d1/d2/d11/d86/d8a/d91 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:7/419: readlink d1/d2/d8/dc/d1b/d71/l50 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:9/450: link d4/d10/d35/f7c d4/d10/d35/d2b/d31/d96/f9b 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:7/420: creat d1/d2/d8/dc/d1b/d30/d5e/f92 x:0 0 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:9/451: mknod d4/d10/d35/d36/c9c 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:7/421: dread d1/d2/d8/dc/d1b/f62 [0,4194304] 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:9/452: rename d4/d10/d35/d2b/f63 to d4/d10/d35/d2b/f9d 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:7/422: dwrite d1/d2/d11/f54 [0,4194304] 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:9/453: chown d4/d10/d35/d36/d48/d54/d59/f7a 117359 1 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:7/423: stat d1/d2/d8/dc/d1b/c2c 0 2026-03-09T16:15:13.032 INFO:tasks.workunit.client.1.vm05.stdout:7/424: rmdir d1/d2/d11 39 2026-03-09T16:15:13.033 INFO:tasks.workunit.client.1.vm05.stdout:7/425: dread d1/d2/d8/dc/d1b/d30/d4b/d65/f7f [0,4194304] 0 2026-03-09T16:15:13.035 INFO:tasks.workunit.client.1.vm05.stdout:7/426: dread d1/d2/d8/dc/d1b/d71/f74 [0,4194304] 0 2026-03-09T16:15:13.036 INFO:tasks.workunit.client.1.vm05.stdout:7/427: dread - d1/d2/d8/dc/d1b/f66 zero size 2026-03-09T16:15:13.082 INFO:tasks.workunit.client.1.vm05.stdout:7/428: sync 2026-03-09T16:15:13.087 INFO:tasks.workunit.client.1.vm05.stdout:7/429: creat d1/d2/d8/dc/d1b/d30/f93 x:0 0 0 2026-03-09T16:15:13.089 INFO:tasks.workunit.client.1.vm05.stdout:5/399: read d8/d18/d1b/f2c [498870,100277] 0 2026-03-09T16:15:13.091 INFO:tasks.workunit.client.1.vm05.stdout:7/430: dwrite d1/d2/d8/dc/d1b/d71/f74 [0,4194304] 0 2026-03-09T16:15:13.093 INFO:tasks.workunit.client.1.vm05.stdout:7/431: chown d1/d2/d8/dc/d33 76426142 1 2026-03-09T16:15:13.094 INFO:tasks.workunit.client.1.vm05.stdout:7/432: symlink d1/d2/d8/d31/d8d/d5d/l94 0 2026-03-09T16:15:13.094 INFO:tasks.workunit.client.1.vm05.stdout:7/433: dread - d1/d2/d8/d31/d8d/f52 zero size 2026-03-09T16:15:13.096 INFO:tasks.workunit.client.1.vm05.stdout:7/434: truncate d1/d2/d8/dc/d1b/d30/d4b/d65/f63 758852 0 2026-03-09T16:15:13.097 INFO:tasks.workunit.client.1.vm05.stdout:7/435: write d1/d2/d11/f54 [768159,26704] 0 2026-03-09T16:15:13.097 
INFO:tasks.workunit.client.1.vm05.stdout:7/436: chown d1/d2/d8/d67/d76 6460203 1 2026-03-09T16:15:13.102 INFO:tasks.workunit.client.1.vm05.stdout:7/437: write d1/d2/d8/dc/d1b/d30/d4b/d65/f7f [1550581,45250] 0 2026-03-09T16:15:13.103 INFO:tasks.workunit.client.1.vm05.stdout:7/438: write d1/d2/d8/dc/d33/f57 [4082478,33456] 0 2026-03-09T16:15:13.104 INFO:tasks.workunit.client.1.vm05.stdout:7/439: write d1/f26 [3821544,64985] 0 2026-03-09T16:15:13.109 INFO:tasks.workunit.client.1.vm05.stdout:7/440: getdents d1/d2/d8/dc/d72 0 2026-03-09T16:15:13.113 INFO:tasks.workunit.client.1.vm05.stdout:7/441: dwrite d1/d2/d11/f54 [0,4194304] 0 2026-03-09T16:15:13.235 INFO:tasks.workunit.client.1.vm05.stdout:4/367: dread d5/de/d15/d21/f50 [0,4194304] 0 2026-03-09T16:15:13.236 INFO:tasks.workunit.client.1.vm05.stdout:4/368: mkdir d5/de/d82 0 2026-03-09T16:15:13.237 INFO:tasks.workunit.client.1.vm05.stdout:4/369: symlink d5/l83 0 2026-03-09T16:15:13.241 INFO:tasks.workunit.client.1.vm05.stdout:4/370: dwrite d5/de/d15/d21/d39/d5d/f70 [0,4194304] 0 2026-03-09T16:15:13.353 INFO:tasks.workunit.client.1.vm05.stdout:6/356: fdatasync f16 0 2026-03-09T16:15:13.397 INFO:tasks.workunit.client.1.vm05.stdout:1/404: getdents d7/d27 0 2026-03-09T16:15:13.402 INFO:tasks.workunit.client.1.vm05.stdout:1/405: stat d7/dd/d21/d39/d48/d5d/c75 0 2026-03-09T16:15:13.403 INFO:tasks.workunit.client.1.vm05.stdout:5/400: getdents d8/d18/d1b/d47 0 2026-03-09T16:15:13.403 INFO:tasks.workunit.client.1.vm05.stdout:3/294: write d0/d9/f2b [1190571,58686] 0 2026-03-09T16:15:13.407 INFO:tasks.workunit.client.1.vm05.stdout:0/392: write d5/d1b/f50 [1220819,21723] 0 2026-03-09T16:15:13.408 INFO:tasks.workunit.client.1.vm05.stdout:0/393: fdatasync d5/d1b/d3b/f6f 0 2026-03-09T16:15:13.410 INFO:tasks.workunit.client.1.vm05.stdout:3/295: chown d0/d9/d22/f54 15 1 2026-03-09T16:15:13.416 INFO:tasks.workunit.client.1.vm05.stdout:0/394: symlink d5/d1b/d3b/l7d 0 2026-03-09T16:15:13.418 INFO:tasks.workunit.client.1.vm05.stdout:3/296: link d0/f56 d0/d9/f5c 0 2026-03-09T16:15:13.450 INFO:tasks.workunit.client.1.vm05.stdout:3/297: read d0/d9/d22/f54 [333540,7696] 0 2026-03-09T16:15:13.450 INFO:tasks.workunit.client.1.vm05.stdout:0/395: dwrite d5/db/d1d/f59 [0,4194304] 0 2026-03-09T16:15:13.450 INFO:tasks.workunit.client.1.vm05.stdout:3/298: creat d0/d9/d22/d4c/d4e/f5d x:0 0 0 2026-03-09T16:15:13.450 INFO:tasks.workunit.client.1.vm05.stdout:3/299: link d0/d9/d22/d4c/d4e/f5d d0/d33/f5e 0 2026-03-09T16:15:13.547 INFO:tasks.workunit.client.1.vm05.stdout:3/300: read d0/d9/d22/f18 [417525,83593] 0 2026-03-09T16:15:13.550 INFO:tasks.workunit.client.1.vm05.stdout:3/301: mkdir d0/d9/d22/d5f 0 2026-03-09T16:15:13.550 INFO:tasks.workunit.client.1.vm05.stdout:3/302: readlink d0/l28 0 2026-03-09T16:15:13.553 INFO:tasks.workunit.client.1.vm05.stdout:3/303: creat d0/f60 x:0 0 0 2026-03-09T16:15:13.555 INFO:tasks.workunit.client.1.vm05.stdout:3/304: dread d0/fd [0,4194304] 0 2026-03-09T16:15:13.556 INFO:tasks.workunit.client.1.vm05.stdout:3/305: mknod d0/d9/d22/d4c/d4e/c61 0 2026-03-09T16:15:13.558 INFO:tasks.workunit.client.1.vm05.stdout:3/306: mknod d0/c62 0 2026-03-09T16:15:13.559 INFO:tasks.workunit.client.1.vm05.stdout:3/307: creat d0/d33/f63 x:0 0 0 2026-03-09T16:15:13.561 INFO:tasks.workunit.client.1.vm05.stdout:3/308: creat d0/d33/f64 x:0 0 0 2026-03-09T16:15:13.568 INFO:tasks.workunit.client.1.vm05.stdout:3/309: dread d0/f49 [0,4194304] 0 2026-03-09T16:15:13.568 INFO:tasks.workunit.client.1.vm05.stdout:3/310: truncate d0/f56 217825 0 2026-03-09T16:15:13.571 
INFO:tasks.workunit.client.1.vm05.stdout:3/311: dread d0/d9/d22/d4c/d4e/f50 [0,4194304] 0 2026-03-09T16:15:13.607 INFO:tasks.workunit.client.1.vm05.stdout:8/365: rename d4/d6/db/d54 to d4/d6/d3a/d15/d83 0 2026-03-09T16:15:13.610 INFO:tasks.workunit.client.1.vm05.stdout:7/442: rename d1/d2/d11/f54 to d1/d2/d8/dc/d1b/d71/d3c/f95 0 2026-03-09T16:15:13.617 INFO:tasks.workunit.client.1.vm05.stdout:7/443: creat d1/d2/d11/d86/f96 x:0 0 0 2026-03-09T16:15:13.618 INFO:tasks.workunit.client.1.vm05.stdout:7/444: fdatasync d1/d2/d11/d86/f96 0 2026-03-09T16:15:13.619 INFO:tasks.workunit.client.1.vm05.stdout:7/445: dread d1/d2/d8/dc/d1b/d30/d4b/d65/f63 [0,4194304] 0 2026-03-09T16:15:13.620 INFO:tasks.workunit.client.1.vm05.stdout:8/366: mkdir d4/d6/db/d75/d84 0 2026-03-09T16:15:13.621 INFO:tasks.workunit.client.1.vm05.stdout:6/357: dread d17/f1c [0,4194304] 0 2026-03-09T16:15:13.628 INFO:tasks.workunit.client.1.vm05.stdout:7/446: creat d1/d2/d8/dc/d1b/d71/f97 x:0 0 0 2026-03-09T16:15:13.639 INFO:tasks.workunit.client.1.vm05.stdout:6/358: dwrite d17/d1d/f41 [0,4194304] 0 2026-03-09T16:15:13.642 INFO:tasks.workunit.client.1.vm05.stdout:7/447: sync 2026-03-09T16:15:13.646 INFO:tasks.workunit.client.1.vm05.stdout:7/448: dread d1/d2/d8/dc/d1b/d71/f59 [0,4194304] 0 2026-03-09T16:15:13.663 INFO:tasks.workunit.client.1.vm05.stdout:9/454: dwrite d4/d10/d35/d36/d48/d60/f8f [0,4194304] 0 2026-03-09T16:15:13.664 INFO:tasks.workunit.client.1.vm05.stdout:7/449: rename d1/d2/d8/d31/d8d/d5d/l69 to d1/d2/d8/dc/d14/l98 0 2026-03-09T16:15:13.673 INFO:tasks.workunit.client.1.vm05.stdout:4/371: truncate d5/f45 2439498 0 2026-03-09T16:15:13.674 INFO:tasks.workunit.client.1.vm05.stdout:4/372: dread - d5/de/d15/d21/f79 zero size 2026-03-09T16:15:13.674 INFO:tasks.workunit.client.1.vm05.stdout:9/455: creat d4/d10/d35/d36/d48/f9e x:0 0 0 2026-03-09T16:15:13.675 INFO:tasks.workunit.client.1.vm05.stdout:6/359: link d17/d5d/d73/c80 d17/d1d/c89 0 2026-03-09T16:15:13.676 INFO:tasks.workunit.client.1.vm05.stdout:6/360: fsync d17/d22/d27/d34/d42/d65/f75 0 2026-03-09T16:15:13.682 INFO:tasks.workunit.client.1.vm05.stdout:9/456: chown d4/d10/d35/c5d 25 1 2026-03-09T16:15:13.684 INFO:tasks.workunit.client.1.vm05.stdout:6/361: rmdir d17/d22/d27/d34/d4b 39 2026-03-09T16:15:13.689 INFO:tasks.workunit.client.1.vm05.stdout:9/457: creat d4/d10/d35/d36/d48/d54/d59/f9f x:0 0 0 2026-03-09T16:15:13.691 INFO:tasks.workunit.client.1.vm05.stdout:1/406: dwrite d7/dd/de/f38 [0,4194304] 0 2026-03-09T16:15:13.692 INFO:tasks.workunit.client.1.vm05.stdout:1/407: chown d7/d62 2340 1 2026-03-09T16:15:13.693 INFO:tasks.workunit.client.1.vm05.stdout:4/373: mknod d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d7b/c84 0 2026-03-09T16:15:13.693 INFO:tasks.workunit.client.1.vm05.stdout:1/408: truncate d7/f3f 1272649 0 2026-03-09T16:15:13.694 INFO:tasks.workunit.client.1.vm05.stdout:2/378: dwrite db/dd/d15/d4c/f58 [0,4194304] 0 2026-03-09T16:15:13.730 INFO:tasks.workunit.client.1.vm05.stdout:0/396: dwrite d5/d1b/d30/f29 [0,4194304] 0 2026-03-09T16:15:13.733 INFO:tasks.workunit.client.1.vm05.stdout:3/312: truncate d0/d9/f37 768678 0 2026-03-09T16:15:13.734 INFO:tasks.workunit.client.1.vm05.stdout:8/367: getdents d4/d6/db 0 2026-03-09T16:15:13.735 INFO:tasks.workunit.client.1.vm05.stdout:9/458: fsync d4/d10/d35/f7c 0 2026-03-09T16:15:13.736 INFO:tasks.workunit.client.1.vm05.stdout:1/409: mkdir d7/dd/de/d96 0 2026-03-09T16:15:13.750 INFO:tasks.workunit.client.1.vm05.stdout:0/397: dread d5/d2c/d49/f5d [0,4194304] 0 2026-03-09T16:15:13.751 
INFO:tasks.workunit.client.1.vm05.stdout:0/398: write d5/d2c/d49/f7a [511278,11253] 0 2026-03-09T16:15:13.751 INFO:tasks.workunit.client.1.vm05.stdout:0/399: write d5/db/d1d/f2e [3485146,11275] 0 2026-03-09T16:15:13.756 INFO:tasks.workunit.client.1.vm05.stdout:8/368: dread d4/d6/db/dc/d2e/f47 [0,4194304] 0 2026-03-09T16:15:13.768 INFO:tasks.workunit.client.1.vm05.stdout:8/369: sync 2026-03-09T16:15:13.770 INFO:tasks.workunit.client.1.vm05.stdout:8/370: sync 2026-03-09T16:15:13.804 INFO:tasks.workunit.client.1.vm05.stdout:3/313: mknod d0/d9/d22/c65 0 2026-03-09T16:15:13.807 INFO:tasks.workunit.client.1.vm05.stdout:1/410: symlink d7/d15/d45/l97 0 2026-03-09T16:15:13.819 INFO:tasks.workunit.client.1.vm05.stdout:1/411: creat d7/dd/d21/d39/d48/d5d/f98 x:0 0 0 2026-03-09T16:15:13.819 INFO:tasks.workunit.client.1.vm05.stdout:1/412: dread - d7/d27/f57 zero size 2026-03-09T16:15:13.822 INFO:tasks.workunit.client.1.vm05.stdout:7/450: truncate d1/d2/d8/dc/d1b/f42 3427438 0 2026-03-09T16:15:13.822 INFO:tasks.workunit.client.1.vm05.stdout:1/413: write d7/d15/f8d [730766,57286] 0 2026-03-09T16:15:13.831 INFO:tasks.workunit.client.1.vm05.stdout:1/414: truncate d7/dd/f93 790539 0 2026-03-09T16:15:13.833 INFO:tasks.workunit.client.1.vm05.stdout:1/415: read d7/f9 [2640120,121176] 0 2026-03-09T16:15:13.836 INFO:tasks.workunit.client.1.vm05.stdout:7/451: creat d1/d2/d8/d67/f99 x:0 0 0 2026-03-09T16:15:13.837 INFO:tasks.workunit.client.1.vm05.stdout:0/400: getdents d5/d2c 0 2026-03-09T16:15:13.838 INFO:tasks.workunit.client.1.vm05.stdout:0/401: readlink d5/d11/d4f/l53 0 2026-03-09T16:15:13.844 INFO:tasks.workunit.client.1.vm05.stdout:7/452: rmdir d1/d2/d11/d86/d8a 39 2026-03-09T16:15:13.847 INFO:tasks.workunit.client.1.vm05.stdout:0/402: dwrite d5/d11/f1e [0,4194304] 0 2026-03-09T16:15:13.847 INFO:tasks.workunit.client.1.vm05.stdout:1/416: creat d7/dd/d21/d3b/d55/d95/f99 x:0 0 0 2026-03-09T16:15:13.852 INFO:tasks.workunit.client.1.vm05.stdout:0/403: readlink d5/d11/d4f/l53 0 2026-03-09T16:15:13.852 INFO:tasks.workunit.client.1.vm05.stdout:4/374: write d5/de/d15/f1b [7903453,57154] 0 2026-03-09T16:15:13.859 INFO:tasks.workunit.client.1.vm05.stdout:0/404: symlink d5/d2c/l7e 0 2026-03-09T16:15:13.860 INFO:tasks.workunit.client.1.vm05.stdout:4/375: fdatasync d5/de/d15/d21/f2a 0 2026-03-09T16:15:13.860 INFO:tasks.workunit.client.1.vm05.stdout:1/417: fsync d7/f4b 0 2026-03-09T16:15:13.868 INFO:tasks.workunit.client.1.vm05.stdout:0/405: creat d5/d2c/f7f x:0 0 0 2026-03-09T16:15:13.872 INFO:tasks.workunit.client.1.vm05.stdout:0/406: creat d5/d2c/d49/f80 x:0 0 0 2026-03-09T16:15:13.873 INFO:tasks.workunit.client.1.vm05.stdout:1/418: symlink d7/l9a 0 2026-03-09T16:15:13.876 INFO:tasks.workunit.client.1.vm05.stdout:0/407: creat d5/d11/d4f/f81 x:0 0 0 2026-03-09T16:15:13.881 INFO:tasks.workunit.client.1.vm05.stdout:0/408: mkdir d5/db/d5b/d82 0 2026-03-09T16:15:13.884 INFO:tasks.workunit.client.1.vm05.stdout:1/419: dread d7/dd/de/d52/d5b/f5e [0,4194304] 0 2026-03-09T16:15:13.886 INFO:tasks.workunit.client.1.vm05.stdout:1/420: chown d7/dd/d21/d3b 2673980 1 2026-03-09T16:15:13.888 INFO:tasks.workunit.client.1.vm05.stdout:1/421: chown d7/dd/d21/d3b/d55/c7e 6 1 2026-03-09T16:15:13.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:13 vm03.local ceph-mon[51019]: pgmap v16: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 112 GiB / 120 GiB avail; 13 MiB/s rd, 87 MiB/s wr, 225 op/s 2026-03-09T16:15:13.903 INFO:tasks.workunit.client.1.vm05.stdout:1/422: getdents d7/dd/d21/d39 0 2026-03-09T16:15:13.913 
INFO:tasks.workunit.client.1.vm05.stdout:6/362: rename d17/d22/d27/d34/d4b/d7b to d17/d22/d27/d8a 0 2026-03-09T16:15:13.913 INFO:tasks.workunit.client.1.vm05.stdout:6/363: fdatasync d17/f31 0 2026-03-09T16:15:13.914 INFO:tasks.workunit.client.1.vm05.stdout:6/364: write d17/d22/d27/f6b [4653773,2641] 0 2026-03-09T16:15:13.917 INFO:tasks.workunit.client.1.vm05.stdout:6/365: mkdir d17/d22/d27/d8a/d8b 0 2026-03-09T16:15:13.918 INFO:tasks.workunit.client.1.vm05.stdout:6/366: stat d17/f5b 0 2026-03-09T16:15:13.918 INFO:tasks.workunit.client.1.vm05.stdout:6/367: fdatasync f16 0 2026-03-09T16:15:13.920 INFO:tasks.workunit.client.1.vm05.stdout:6/368: write d17/d5d/f71 [64059,69823] 0 2026-03-09T16:15:13.922 INFO:tasks.workunit.client.1.vm05.stdout:3/314: write d0/d9/d22/f2e [2003790,26159] 0 2026-03-09T16:15:13.923 INFO:tasks.workunit.client.1.vm05.stdout:6/369: creat d17/d4f/f8c x:0 0 0 2026-03-09T16:15:13.925 INFO:tasks.workunit.client.1.vm05.stdout:8/371: dwrite d4/d6/db/dc/f30 [0,4194304] 0 2026-03-09T16:15:13.926 INFO:tasks.workunit.client.1.vm05.stdout:3/315: creat d0/d9/d22/d5f/f66 x:0 0 0 2026-03-09T16:15:13.932 INFO:tasks.workunit.client.1.vm05.stdout:8/372: readlink d4/d6/d3a/d40/d71/l7e 0 2026-03-09T16:15:13.937 INFO:tasks.workunit.client.1.vm05.stdout:3/316: mkdir d0/d9/d22/d4c/d67 0 2026-03-09T16:15:13.937 INFO:tasks.workunit.client.1.vm05.stdout:6/370: dwrite d17/d5d/f84 [0,4194304] 0 2026-03-09T16:15:13.937 INFO:tasks.workunit.client.1.vm05.stdout:3/317: chown d0/d9/f2f 1996905930 1 2026-03-09T16:15:13.939 INFO:tasks.workunit.client.1.vm05.stdout:3/318: read d0/d9/f5c [33001,71058] 0 2026-03-09T16:15:13.941 INFO:tasks.workunit.client.1.vm05.stdout:3/319: write d0/f57 [588696,100434] 0 2026-03-09T16:15:13.946 INFO:tasks.workunit.client.1.vm05.stdout:3/320: symlink d0/d33/l68 0 2026-03-09T16:15:13.946 INFO:tasks.workunit.client.1.vm05.stdout:6/371: link d17/c45 d17/d5d/c8d 0 2026-03-09T16:15:13.948 INFO:tasks.workunit.client.1.vm05.stdout:8/373: dread d4/d6/db/dc/d2e/f47 [4194304,4194304] 0 2026-03-09T16:15:13.950 INFO:tasks.workunit.client.1.vm05.stdout:8/374: mkdir d4/d6/db/dc/d2e/d85 0 2026-03-09T16:15:13.951 INFO:tasks.workunit.client.1.vm05.stdout:6/372: truncate d17/d22/d27/d34/d42/d65/f75 461992 0 2026-03-09T16:15:13.957 INFO:tasks.workunit.client.1.vm05.stdout:6/373: write d17/d22/d27/d34/f6e [360801,70121] 0 2026-03-09T16:15:13.957 INFO:tasks.workunit.client.1.vm05.stdout:0/409: rmdir d5/d2c/d49 39 2026-03-09T16:15:13.957 INFO:tasks.workunit.client.1.vm05.stdout:8/375: mkdir d4/d6/db/df/d80/d86 0 2026-03-09T16:15:13.959 INFO:tasks.workunit.client.1.vm05.stdout:3/321: dwrite d0/d9/f2b [0,4194304] 0 2026-03-09T16:15:13.964 INFO:tasks.workunit.client.1.vm05.stdout:6/374: creat d17/d5d/f8e x:0 0 0 2026-03-09T16:15:13.968 INFO:tasks.workunit.client.1.vm05.stdout:0/410: dread d5/db/d1d/f59 [0,4194304] 0 2026-03-09T16:15:13.970 INFO:tasks.workunit.client.1.vm05.stdout:8/376: mkdir d4/d6/db/d59/d87 0 2026-03-09T16:15:13.973 INFO:tasks.workunit.client.1.vm05.stdout:6/375: dwrite d17/d5d/f78 [0,4194304] 0 2026-03-09T16:15:13.974 INFO:tasks.workunit.client.1.vm05.stdout:3/322: creat d0/f69 x:0 0 0 2026-03-09T16:15:13.975 INFO:tasks.workunit.client.1.vm05.stdout:6/376: stat d17/d5d/d73/d83 0 2026-03-09T16:15:13.978 INFO:tasks.workunit.client.1.vm05.stdout:1/423: dwrite d7/dd/de/d52/d5b/f5e [0,4194304] 0 2026-03-09T16:15:13.982 INFO:tasks.workunit.client.1.vm05.stdout:8/377: dwrite d4/d6/d3a/d40/f4e [0,4194304] 0 2026-03-09T16:15:13.982 INFO:tasks.workunit.client.1.vm05.stdout:0/411: 
dwrite d5/d11/f40 [0,4194304] 0 2026-03-09T16:15:13.987 INFO:tasks.workunit.client.1.vm05.stdout:8/378: stat d4/d6/d3a/f25 0 2026-03-09T16:15:13.988 INFO:tasks.workunit.client.1.vm05.stdout:6/377: symlink d17/d22/d27/d34/d42/d65/l8f 0 2026-03-09T16:15:13.988 INFO:tasks.workunit.client.1.vm05.stdout:7/453: creat d1/d2/d8/f9a x:0 0 0 2026-03-09T16:15:13.995 INFO:tasks.workunit.client.1.vm05.stdout:6/378: read d17/f4e [304356,40299] 0 2026-03-09T16:15:14.000 INFO:tasks.workunit.client.1.vm05.stdout:8/379: creat d4/d6/d3a/f88 x:0 0 0 2026-03-09T16:15:14.000 INFO:tasks.workunit.client.1.vm05.stdout:8/380: write d4/d6/db/dc/f41 [1807135,26676] 0 2026-03-09T16:15:14.001 INFO:tasks.workunit.client.1.vm05.stdout:1/424: dwrite d7/dd/d21/d39/d5a/f41 [0,4194304] 0 2026-03-09T16:15:14.003 INFO:tasks.workunit.client.1.vm05.stdout:8/381: dread - d4/d6/d3a/f88 zero size 2026-03-09T16:15:14.007 INFO:tasks.workunit.client.1.vm05.stdout:1/425: chown d7/dd/d21/d39/d5a/f54 58540 1 2026-03-09T16:15:14.014 INFO:tasks.workunit.client.1.vm05.stdout:7/454: dwrite d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/d6d/f6b [0,4194304] 0 2026-03-09T16:15:14.014 INFO:tasks.workunit.client.1.vm05.stdout:6/379: dwrite f16 [0,4194304] 0 2026-03-09T16:15:14.017 INFO:tasks.workunit.client.1.vm05.stdout:1/426: chown d7/dd/d21/d39/d5a/f41 6530565 1 2026-03-09T16:15:14.022 INFO:tasks.workunit.client.1.vm05.stdout:8/382: write d4/d6/db/dc/d2e/f46 [442134,50690] 0 2026-03-09T16:15:14.022 INFO:tasks.workunit.client.1.vm05.stdout:8/383: dread - d4/d6/d3a/d40/f52 zero size 2026-03-09T16:15:14.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:13 vm05.local ceph-mon[58702]: pgmap v16: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 112 GiB / 120 GiB avail; 13 MiB/s rd, 87 MiB/s wr, 225 op/s 2026-03-09T16:15:14.037 INFO:tasks.workunit.client.1.vm05.stdout:1/427: symlink d7/dd/d21/d3b/l9b 0 2026-03-09T16:15:14.037 INFO:tasks.workunit.client.1.vm05.stdout:1/428: truncate d7/dd/de/d52/d5b/f8b 467438 0 2026-03-09T16:15:14.039 INFO:tasks.workunit.client.1.vm05.stdout:8/384: write d4/d6/f1b [4457195,33281] 0 2026-03-09T16:15:14.042 INFO:tasks.workunit.client.1.vm05.stdout:7/455: creat d1/d2/d8/dc/d1b/d71/d3c/f9b x:0 0 0 2026-03-09T16:15:14.047 INFO:tasks.workunit.client.1.vm05.stdout:1/429: symlink d7/dd/d21/d39/d5a/l9c 0 2026-03-09T16:15:14.050 INFO:tasks.workunit.client.1.vm05.stdout:1/430: dread d7/f4b [0,4194304] 0 2026-03-09T16:15:14.053 INFO:tasks.workunit.client.1.vm05.stdout:8/385: link d4/d6/d3a/d40/f52 d4/d6/d53/f89 0 2026-03-09T16:15:14.059 INFO:tasks.workunit.client.1.vm05.stdout:8/386: mknod d4/d6/db/df/d80/c8a 0 2026-03-09T16:15:14.063 INFO:tasks.workunit.client.1.vm05.stdout:4/376: mknod d5/c85 0 2026-03-09T16:15:14.063 INFO:tasks.workunit.client.1.vm05.stdout:2/379: rename db/f5f to db/dd/d15/f81 0 2026-03-09T16:15:14.067 INFO:tasks.workunit.client.1.vm05.stdout:9/459: rename d4/d10/d35/d36/d48/d54/f7e to d4/d10/d35/d2b/d38/fa0 0 2026-03-09T16:15:14.068 INFO:tasks.workunit.client.1.vm05.stdout:4/377: creat d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/f86 x:0 0 0 2026-03-09T16:15:14.079 INFO:tasks.workunit.client.1.vm05.stdout:6/380: dread d17/d22/d27/d44/f48 [0,4194304] 0 2026-03-09T16:15:14.080 INFO:tasks.workunit.client.1.vm05.stdout:5/401: rename d8/d18/d1b/d2e/d43/f62 to d8/d18/d1b/d6b/f93 0 2026-03-09T16:15:14.082 INFO:tasks.workunit.client.1.vm05.stdout:5/402: read f1 [1675758,85921] 0 2026-03-09T16:15:14.100 INFO:tasks.workunit.client.1.vm05.stdout:4/378: dread d5/de/d15/d21/d27/f29 [0,4194304] 0 2026-03-09T16:15:14.101 
INFO:tasks.workunit.client.1.vm05.stdout:2/380: link db/dd/d15/d1f/c3a db/dd/d15/d1f/c82 0 2026-03-09T16:15:14.101 INFO:tasks.workunit.client.1.vm05.stdout:5/403: chown d8/d18/d1b/c33 100191754 1 2026-03-09T16:15:14.101 INFO:tasks.workunit.client.1.vm05.stdout:2/381: chown db/dd/d15/d3f/d55/f80 15687277 1 2026-03-09T16:15:14.101 INFO:tasks.workunit.client.1.vm05.stdout:2/382: read - db/dd/d15/d1f/d20/d23/f7a zero size 2026-03-09T16:15:14.101 INFO:tasks.workunit.client.1.vm05.stdout:1/431: dread d7/dd/de/f32 [0,4194304] 0 2026-03-09T16:15:14.101 INFO:tasks.workunit.client.1.vm05.stdout:2/383: dread - db/dd/d15/d1f/d20/d23/f7a zero size 2026-03-09T16:15:14.101 INFO:tasks.workunit.client.1.vm05.stdout:1/432: fdatasync d7/dd/d21/d63/d71/f7b 0 2026-03-09T16:15:14.101 INFO:tasks.workunit.client.1.vm05.stdout:7/456: rename d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/d6d to d1/d2/d8/dc/d9c 0 2026-03-09T16:15:14.101 INFO:tasks.workunit.client.1.vm05.stdout:7/457: readlink d1/d2/la 0 2026-03-09T16:15:14.101 INFO:tasks.workunit.client.1.vm05.stdout:6/381: rename d17/f1a to d17/d22/d27/d34/d42/d53/f90 0 2026-03-09T16:15:14.101 INFO:tasks.workunit.client.1.vm05.stdout:1/433: dwrite d7/fb [8388608,4194304] 0 2026-03-09T16:15:14.111 INFO:tasks.workunit.client.1.vm05.stdout:1/434: dwrite d7/dd/d21/f3d [0,4194304] 0 2026-03-09T16:15:14.113 INFO:tasks.workunit.client.1.vm05.stdout:2/384: link db/dd/d15/d1f/c82 db/dd/d15/d1f/c83 0 2026-03-09T16:15:14.114 INFO:tasks.workunit.client.1.vm05.stdout:2/385: write db/dd/d15/f51 [3171006,86440] 0 2026-03-09T16:15:14.115 INFO:tasks.workunit.client.1.vm05.stdout:1/435: truncate d7/f9 1787829 0 2026-03-09T16:15:14.115 INFO:tasks.workunit.client.1.vm05.stdout:2/386: write db/dd/d15/f48 [5131111,126387] 0 2026-03-09T16:15:14.116 INFO:tasks.workunit.client.1.vm05.stdout:1/436: write d7/d62/d72/f79 [691664,72803] 0 2026-03-09T16:15:14.118 INFO:tasks.workunit.client.1.vm05.stdout:9/460: sync 2026-03-09T16:15:14.119 INFO:tasks.workunit.client.1.vm05.stdout:5/404: dread d8/f55 [0,4194304] 0 2026-03-09T16:15:14.120 INFO:tasks.workunit.client.1.vm05.stdout:1/437: read d7/d15/f8d [770153,121833] 0 2026-03-09T16:15:14.122 INFO:tasks.workunit.client.1.vm05.stdout:2/387: unlink db/dd/d15/d1f/d20/d23/c27 0 2026-03-09T16:15:14.125 INFO:tasks.workunit.client.1.vm05.stdout:9/461: creat d4/d10/d35/d2b/fa1 x:0 0 0 2026-03-09T16:15:14.126 INFO:tasks.workunit.client.1.vm05.stdout:4/379: sync 2026-03-09T16:15:14.128 INFO:tasks.workunit.client.1.vm05.stdout:6/382: sync 2026-03-09T16:15:14.129 INFO:tasks.workunit.client.1.vm05.stdout:1/438: mknod d7/d27/c9d 0 2026-03-09T16:15:14.130 INFO:tasks.workunit.client.1.vm05.stdout:6/383: read - d17/d22/d27/d34/d4b/f6d zero size 2026-03-09T16:15:14.132 INFO:tasks.workunit.client.1.vm05.stdout:2/388: symlink db/dd/d15/d3f/d5b/d60/l84 0 2026-03-09T16:15:14.133 INFO:tasks.workunit.client.1.vm05.stdout:5/405: write d8/d18/d1b/f31 [2069117,52116] 0 2026-03-09T16:15:14.135 INFO:tasks.workunit.client.1.vm05.stdout:4/380: unlink d5/d19/l69 0 2026-03-09T16:15:14.135 INFO:tasks.workunit.client.1.vm05.stdout:7/458: dread d1/d2/d11/f1c [0,4194304] 0 2026-03-09T16:15:14.136 INFO:tasks.workunit.client.1.vm05.stdout:1/439: chown d7/dd/d21/f3d 0 1 2026-03-09T16:15:14.142 INFO:tasks.workunit.client.1.vm05.stdout:2/389: sync 2026-03-09T16:15:14.146 INFO:tasks.workunit.client.1.vm05.stdout:9/462: dwrite d4/f5b [4194304,4194304] 0 2026-03-09T16:15:14.148 INFO:tasks.workunit.client.1.vm05.stdout:7/459: write d1/d2/d8/dc/f1a [3712584,99213] 0 2026-03-09T16:15:14.150 
INFO:tasks.workunit.client.1.vm05.stdout:6/384: dwrite f11 [0,4194304] 0 2026-03-09T16:15:14.151 INFO:tasks.workunit.client.1.vm05.stdout:9/463: write d4/d10/d35/d2b/f2c [3681298,63631] 0 2026-03-09T16:15:14.163 INFO:tasks.workunit.client.1.vm05.stdout:3/323: dwrite d0/d9/d22/f30 [4194304,4194304] 0 2026-03-09T16:15:14.163 INFO:tasks.workunit.client.1.vm05.stdout:3/324: fsync d0/d9/f2b 0 2026-03-09T16:15:14.163 INFO:tasks.workunit.client.1.vm05.stdout:3/325: stat d0/f56 0 2026-03-09T16:15:14.163 INFO:tasks.workunit.client.1.vm05.stdout:1/440: rmdir d7/dd/de/d52/d5b 39 2026-03-09T16:15:14.164 INFO:tasks.workunit.client.1.vm05.stdout:2/390: symlink db/dd/d15/d4c/l85 0 2026-03-09T16:15:14.168 INFO:tasks.workunit.client.1.vm05.stdout:0/412: write d5/db/f6e [2087080,39316] 0 2026-03-09T16:15:14.169 INFO:tasks.workunit.client.1.vm05.stdout:6/385: symlink d17/d5d/d73/d83/l91 0 2026-03-09T16:15:14.170 INFO:tasks.workunit.client.1.vm05.stdout:9/464: symlink d4/d10/d35/d36/d48/d60/d94/la2 0 2026-03-09T16:15:14.187 INFO:tasks.workunit.client.1.vm05.stdout:0/413: dwrite d5/d2c/f28 [4194304,4194304] 0 2026-03-09T16:15:14.187 INFO:tasks.workunit.client.1.vm05.stdout:0/414: chown d5/d1b/d30/c26 3 1 2026-03-09T16:15:14.188 INFO:tasks.workunit.client.1.vm05.stdout:3/326: dwrite d0/f60 [0,4194304] 0 2026-03-09T16:15:14.192 INFO:tasks.workunit.client.1.vm05.stdout:1/441: read d7/dd/de/f23 [21206,129633] 0 2026-03-09T16:15:14.194 INFO:tasks.workunit.client.1.vm05.stdout:6/386: dwrite d17/f30 [0,4194304] 0 2026-03-09T16:15:14.202 INFO:tasks.workunit.client.1.vm05.stdout:9/465: getdents d4/d10/d35 0 2026-03-09T16:15:14.202 INFO:tasks.workunit.client.1.vm05.stdout:9/466: readlink d4/l1f 0 2026-03-09T16:15:14.202 INFO:tasks.workunit.client.1.vm05.stdout:1/442: write d7/dd/d21/d63/d71/f7b [1276043,43001] 0 2026-03-09T16:15:14.203 INFO:tasks.workunit.client.1.vm05.stdout:1/443: fdatasync d7/d62/f69 0 2026-03-09T16:15:14.214 INFO:tasks.workunit.client.1.vm05.stdout:6/387: symlink d17/d22/d27/l92 0 2026-03-09T16:15:14.215 INFO:tasks.workunit.client.1.vm05.stdout:0/415: getdents d5/db/d5f 0 2026-03-09T16:15:14.216 INFO:tasks.workunit.client.1.vm05.stdout:6/388: creat d17/d4f/f93 x:0 0 0 2026-03-09T16:15:14.217 INFO:tasks.workunit.client.1.vm05.stdout:5/406: dread d8/d18/d1b/f30 [0,4194304] 0 2026-03-09T16:15:14.223 INFO:tasks.workunit.client.1.vm05.stdout:3/327: dread d0/f57 [0,4194304] 0 2026-03-09T16:15:14.223 INFO:tasks.workunit.client.1.vm05.stdout:9/467: getdents d4/d10/d35/d2b/d31 0 2026-03-09T16:15:14.224 INFO:tasks.workunit.client.1.vm05.stdout:1/444: dwrite d7/dd/d21/d44/f46 [4194304,4194304] 0 2026-03-09T16:15:14.226 INFO:tasks.workunit.client.1.vm05.stdout:9/468: chown d4/d10/d35/d2b/d31/f99 12 1 2026-03-09T16:15:14.230 INFO:tasks.workunit.client.1.vm05.stdout:5/407: write d8/d53/d7a/f92 [871807,12147] 0 2026-03-09T16:15:14.232 INFO:tasks.workunit.client.1.vm05.stdout:3/328: symlink d0/l6a 0 2026-03-09T16:15:14.244 INFO:tasks.workunit.client.1.vm05.stdout:1/445: getdents d7/dd/de/d96 0 2026-03-09T16:15:14.244 INFO:tasks.workunit.client.1.vm05.stdout:9/469: dwrite d4/d10/f8a [0,4194304] 0 2026-03-09T16:15:14.244 INFO:tasks.workunit.client.1.vm05.stdout:8/387: write d4/d6/d3a/f28 [840977,128341] 0 2026-03-09T16:15:14.246 INFO:tasks.workunit.client.1.vm05.stdout:8/388: read - d4/d6/db/d59/f60 zero size 2026-03-09T16:15:14.252 INFO:tasks.workunit.client.1.vm05.stdout:8/389: chown d4/f1c 0 1 2026-03-09T16:15:14.252 INFO:tasks.workunit.client.1.vm05.stdout:1/446: rename d7/dd/d21/d2d/l2f to 
d7/dd/d21/d3b/d55/d95/l9e 0 2026-03-09T16:15:14.252 INFO:tasks.workunit.client.1.vm05.stdout:9/470: mkdir d4/d10/d35/d36/d48/d54/da3 0 2026-03-09T16:15:14.252 INFO:tasks.workunit.client.1.vm05.stdout:0/416: sync 2026-03-09T16:15:14.254 INFO:tasks.workunit.client.1.vm05.stdout:7/460: write d1/d2/d8/d31/d8d/f56 [1379406,125220] 0 2026-03-09T16:15:14.260 INFO:tasks.workunit.client.1.vm05.stdout:8/390: read d4/d6/d3a/f25 [2800341,75024] 0 2026-03-09T16:15:14.260 INFO:tasks.workunit.client.1.vm05.stdout:4/381: dwrite d5/f3e [0,4194304] 0 2026-03-09T16:15:14.266 INFO:tasks.workunit.client.1.vm05.stdout:1/447: creat d7/d62/d72/f9f x:0 0 0 2026-03-09T16:15:14.267 INFO:tasks.workunit.client.1.vm05.stdout:7/461: chown d1/d2/d11/d86/f96 37143754 1 2026-03-09T16:15:14.267 INFO:tasks.workunit.client.1.vm05.stdout:8/391: creat d4/d6/d3a/f8b x:0 0 0 2026-03-09T16:15:14.270 INFO:tasks.workunit.client.1.vm05.stdout:4/382: rmdir d5/de/d15 39 2026-03-09T16:15:14.271 INFO:tasks.workunit.client.1.vm05.stdout:4/383: stat d5/d19/d37/d60 0 2026-03-09T16:15:14.280 INFO:tasks.workunit.client.1.vm05.stdout:0/417: mkdir d5/d2c/d49/d83 0 2026-03-09T16:15:14.280 INFO:tasks.workunit.client.1.vm05.stdout:0/418: chown d5/d11/d4f/d70 65 1 2026-03-09T16:15:14.280 INFO:tasks.workunit.client.1.vm05.stdout:7/462: dwrite d1/d2/f22 [4194304,4194304] 0 2026-03-09T16:15:14.280 INFO:tasks.workunit.client.1.vm05.stdout:8/392: dwrite d4/d6/d3a/d3c/f45 [0,4194304] 0 2026-03-09T16:15:14.280 INFO:tasks.workunit.client.1.vm05.stdout:2/391: truncate db/dd/f32 2126917 0 2026-03-09T16:15:14.287 INFO:tasks.workunit.client.1.vm05.stdout:4/384: dwrite d5/de/d15/d21/d31/f64 [4194304,4194304] 0 2026-03-09T16:15:14.291 INFO:tasks.workunit.client.1.vm05.stdout:2/392: unlink db/dd/d15/c2f 0 2026-03-09T16:15:14.293 INFO:tasks.workunit.client.1.vm05.stdout:7/463: creat d1/d2/d8/dc/d33/f9d x:0 0 0 2026-03-09T16:15:14.297 INFO:tasks.workunit.client.1.vm05.stdout:2/393: mkdir db/dd/d15/d1f/d20/d86 0 2026-03-09T16:15:14.304 INFO:tasks.workunit.client.1.vm05.stdout:0/419: creat d5/d2c/f84 x:0 0 0 2026-03-09T16:15:14.304 INFO:tasks.workunit.client.1.vm05.stdout:2/394: mkdir db/dd/d15/d1f/d21/d87 0 2026-03-09T16:15:14.304 INFO:tasks.workunit.client.1.vm05.stdout:4/385: creat d5/de/f87 x:0 0 0 2026-03-09T16:15:14.304 INFO:tasks.workunit.client.1.vm05.stdout:8/393: dwrite d4/d6/f5f [0,4194304] 0 2026-03-09T16:15:14.304 INFO:tasks.workunit.client.1.vm05.stdout:0/420: creat d5/db/d5f/f85 x:0 0 0 2026-03-09T16:15:14.304 INFO:tasks.workunit.client.1.vm05.stdout:8/394: chown d4/d6/f29 2493379 1 2026-03-09T16:15:14.305 INFO:tasks.workunit.client.1.vm05.stdout:4/386: symlink d5/de/d15/d21/d27/l88 0 2026-03-09T16:15:14.307 INFO:tasks.workunit.client.1.vm05.stdout:0/421: symlink d5/d1b/l86 0 2026-03-09T16:15:14.308 INFO:tasks.workunit.client.1.vm05.stdout:8/395: write d4/d6/d3a/d40/f4e [44512,42719] 0 2026-03-09T16:15:14.312 INFO:tasks.workunit.client.1.vm05.stdout:4/387: fdatasync d5/de/d15/f25 0 2026-03-09T16:15:14.313 INFO:tasks.workunit.client.1.vm05.stdout:4/388: dread - d5/de/d15/d21/d31/f72 zero size 2026-03-09T16:15:14.317 INFO:tasks.workunit.client.1.vm05.stdout:0/422: rename d5/db/d5f/f65 to d5/db/d5b/f87 0 2026-03-09T16:15:14.319 INFO:tasks.workunit.client.1.vm05.stdout:8/396: symlink d4/d55/l8c 0 2026-03-09T16:15:14.319 INFO:tasks.workunit.client.1.vm05.stdout:2/395: dwrite db/dd/d15/d3f/d5b/f7d [0,4194304] 0 2026-03-09T16:15:14.319 INFO:tasks.workunit.client.1.vm05.stdout:6/389: truncate d17/d22/d27/d34/d4b/f6c 315795 0 2026-03-09T16:15:14.320 
INFO:tasks.workunit.client.1.vm05.stdout:3/329: write d0/f56 [962333,127425] 0 2026-03-09T16:15:14.327 INFO:tasks.workunit.client.1.vm05.stdout:0/423: mknod d5/d11/d4f/d70/c88 0 2026-03-09T16:15:14.336 INFO:tasks.workunit.client.1.vm05.stdout:8/397: rmdir d4/d55 39 2026-03-09T16:15:14.337 INFO:tasks.workunit.client.1.vm05.stdout:3/330: unlink d0/d9/c3d 0 2026-03-09T16:15:14.338 INFO:tasks.workunit.client.1.vm05.stdout:6/390: mknod d17/d22/c94 0 2026-03-09T16:15:14.348 INFO:tasks.workunit.client.1.vm05.stdout:6/391: fsync d17/d4f/f8c 0 2026-03-09T16:15:14.352 INFO:tasks.workunit.client.1.vm05.stdout:4/389: dread d5/de/d15/d21/d39/f44 [0,4194304] 0 2026-03-09T16:15:14.355 INFO:tasks.workunit.client.1.vm05.stdout:3/331: mkdir d0/d9/d22/d6b 0 2026-03-09T16:15:14.356 INFO:tasks.workunit.client.1.vm05.stdout:2/396: dwrite db/dd/d15/d3f/d55/f76 [0,4194304] 0 2026-03-09T16:15:14.358 INFO:tasks.workunit.client.1.vm05.stdout:8/398: write d4/d6/db/dc/f2a [1447204,107342] 0 2026-03-09T16:15:14.359 INFO:tasks.workunit.client.1.vm05.stdout:0/424: getdents d5/db/d48 0 2026-03-09T16:15:14.363 INFO:tasks.workunit.client.1.vm05.stdout:3/332: mknod d0/d33/c6c 0 2026-03-09T16:15:14.364 INFO:tasks.workunit.client.1.vm05.stdout:9/471: getdents d4/d10/d35/d36/d48/d54 0 2026-03-09T16:15:14.373 INFO:tasks.workunit.client.1.vm05.stdout:2/397: creat db/dd/d15/d46/f88 x:0 0 0 2026-03-09T16:15:14.373 INFO:tasks.workunit.client.1.vm05.stdout:7/464: write d1/d2/d11/f25 [976044,59658] 0 2026-03-09T16:15:14.376 INFO:tasks.workunit.client.1.vm05.stdout:4/390: dread d5/de/d15/f25 [0,4194304] 0 2026-03-09T16:15:14.382 INFO:tasks.workunit.client.1.vm05.stdout:0/425: read d5/db/f12 [213521,48660] 0 2026-03-09T16:15:14.384 INFO:tasks.workunit.client.1.vm05.stdout:9/472: dwrite d4/d10/d35/d36/d48/d54/d59/f5c [0,4194304] 0 2026-03-09T16:15:14.385 INFO:tasks.workunit.client.1.vm05.stdout:2/398: write db/dd/d15/d3f/d55/f76 [1039658,78964] 0 2026-03-09T16:15:14.387 INFO:tasks.workunit.client.1.vm05.stdout:2/399: fdatasync db/dd/f10 0 2026-03-09T16:15:14.388 INFO:tasks.workunit.client.1.vm05.stdout:3/333: rmdir d0/d9/d22/d4c/d67 0 2026-03-09T16:15:14.395 INFO:tasks.workunit.client.1.vm05.stdout:4/391: creat d5/de/d15/d21/d39/d5d/d7d/f89 x:0 0 0 2026-03-09T16:15:14.395 INFO:tasks.workunit.client.1.vm05.stdout:0/426: creat d5/db/d5b/d82/f89 x:0 0 0 2026-03-09T16:15:14.400 INFO:tasks.workunit.client.1.vm05.stdout:7/465: dwrite d1/d2/d8/dc/d1b/d71/f74 [0,4194304] 0 2026-03-09T16:15:14.404 INFO:tasks.workunit.client.1.vm05.stdout:9/473: unlink d4/d10/f3f 0 2026-03-09T16:15:14.409 INFO:tasks.workunit.client.1.vm05.stdout:2/400: symlink db/dd/d15/d3f/l89 0 2026-03-09T16:15:14.411 INFO:tasks.workunit.client.1.vm05.stdout:2/401: chown db/dd/d15/d3f 10018 1 2026-03-09T16:15:14.413 INFO:tasks.workunit.client.1.vm05.stdout:4/392: mkdir d5/de/d2f/d8a 0 2026-03-09T16:15:14.416 INFO:tasks.workunit.client.1.vm05.stdout:2/402: chown db/dd/d15/d46 33 1 2026-03-09T16:15:14.421 INFO:tasks.workunit.client.1.vm05.stdout:7/466: symlink d1/d2/d8/d67/l9e 0 2026-03-09T16:15:14.421 INFO:tasks.workunit.client.1.vm05.stdout:0/427: creat d5/d11/f8a x:0 0 0 2026-03-09T16:15:14.423 INFO:tasks.workunit.client.1.vm05.stdout:9/474: symlink d4/la4 0 2026-03-09T16:15:14.424 INFO:tasks.workunit.client.1.vm05.stdout:3/334: dwrite d0/f49 [4194304,4194304] 0 2026-03-09T16:15:14.426 INFO:tasks.workunit.client.1.vm05.stdout:0/428: chown d5/d2c/d49/d83 122 1 2026-03-09T16:15:14.432 INFO:tasks.workunit.client.1.vm05.stdout:7/467: creat d1/d2/d8/dc/d33/f9f x:0 0 0 
2026-03-09T16:15:14.441 INFO:tasks.workunit.client.1.vm05.stdout:2/403: dwrite db/dd/d15/d1f/f49 [8388608,4194304] 0 2026-03-09T16:15:14.441 INFO:tasks.workunit.client.1.vm05.stdout:6/392: truncate d17/d4f/f70 1158815 0 2026-03-09T16:15:14.441 INFO:tasks.workunit.client.1.vm05.stdout:3/335: dread d0/fd [0,4194304] 0 2026-03-09T16:15:14.441 INFO:tasks.workunit.client.1.vm05.stdout:2/404: dread - db/dd/f6b zero size 2026-03-09T16:15:14.441 INFO:tasks.workunit.client.1.vm05.stdout:4/393: getdents d5/de/d15/d21/d27 0 2026-03-09T16:15:14.442 INFO:tasks.workunit.client.1.vm05.stdout:6/393: dwrite d17/d5d/f71 [0,4194304] 0 2026-03-09T16:15:14.442 INFO:tasks.workunit.client.1.vm05.stdout:7/468: fdatasync d1/d2/d8/dc/d1b/f66 0 2026-03-09T16:15:14.458 INFO:tasks.workunit.client.1.vm05.stdout:3/336: symlink d0/d9/d22/d6b/l6d 0 2026-03-09T16:15:14.461 INFO:tasks.workunit.client.1.vm05.stdout:7/469: dwrite d1/d2/d8/dc/d1b/d71/d3c/f6e [0,4194304] 0 2026-03-09T16:15:14.461 INFO:tasks.workunit.client.1.vm05.stdout:4/394: link d5/l8 d5/l8b 0 2026-03-09T16:15:14.461 INFO:tasks.workunit.client.1.vm05.stdout:2/405: stat db/dd/d15/d1f/d20/f53 0 2026-03-09T16:15:14.462 INFO:tasks.workunit.client.1.vm05.stdout:4/395: chown d5/de/d15/d21/d27 905 1 2026-03-09T16:15:14.463 INFO:tasks.workunit.client.1.vm05.stdout:4/396: fdatasync d5/d19/f48 0 2026-03-09T16:15:14.464 INFO:tasks.workunit.client.1.vm05.stdout:6/394: rename d17/d1d/f24 to d17/f95 0 2026-03-09T16:15:14.464 INFO:tasks.workunit.client.1.vm05.stdout:3/337: fdatasync d0/d9/f2c 0 2026-03-09T16:15:14.466 INFO:tasks.workunit.client.1.vm05.stdout:2/406: dread db/dd/d15/d1f/d20/f3d [0,4194304] 0 2026-03-09T16:15:14.469 INFO:tasks.workunit.client.1.vm05.stdout:4/397: mkdir d5/de/d15/d21/d39/d5d/d8c 0 2026-03-09T16:15:14.470 INFO:tasks.workunit.client.1.vm05.stdout:7/470: getdents d1/d2/d11/d86/d8a/d91 0 2026-03-09T16:15:14.470 INFO:tasks.workunit.client.1.vm05.stdout:3/338: dread - d0/d9/d22/d5f/f66 zero size 2026-03-09T16:15:14.470 INFO:tasks.workunit.client.1.vm05.stdout:2/407: creat db/dd/d15/d3f/d5b/d60/d6a/f8a x:0 0 0 2026-03-09T16:15:14.470 INFO:tasks.workunit.client.1.vm05.stdout:6/395: dread d17/f2d [0,4194304] 0 2026-03-09T16:15:14.477 INFO:tasks.workunit.client.1.vm05.stdout:3/339: creat d0/d9/f6e x:0 0 0 2026-03-09T16:15:14.482 INFO:tasks.workunit.client.1.vm05.stdout:3/340: chown d0/d9/d22/f2e 1 1 2026-03-09T16:15:14.482 INFO:tasks.workunit.client.1.vm05.stdout:6/396: read d17/d22/d27/d34/d42/d53/f90 [33719,95408] 0 2026-03-09T16:15:14.482 INFO:tasks.workunit.client.1.vm05.stdout:3/341: write d0/f5a [675376,95888] 0 2026-03-09T16:15:14.482 INFO:tasks.workunit.client.1.vm05.stdout:6/397: chown d17/d4f/l6f 30286939 1 2026-03-09T16:15:14.483 INFO:tasks.workunit.client.1.vm05.stdout:2/408: read db/dd/d15/d1f/f2b [315984,47237] 0 2026-03-09T16:15:14.483 INFO:tasks.workunit.client.1.vm05.stdout:7/471: dwrite d1/d2/d8/d31/d8d/f80 [0,4194304] 0 2026-03-09T16:15:14.484 INFO:tasks.workunit.client.1.vm05.stdout:6/398: creat d17/d4f/f96 x:0 0 0 2026-03-09T16:15:14.485 INFO:tasks.workunit.client.1.vm05.stdout:6/399: rmdir d17/d5d/d73 39 2026-03-09T16:15:14.486 INFO:tasks.workunit.client.1.vm05.stdout:7/472: creat d1/fa0 x:0 0 0 2026-03-09T16:15:14.486 INFO:tasks.workunit.client.1.vm05.stdout:6/400: chown d17/d22/d27/d34/d42/d68/f7c 229623 1 2026-03-09T16:15:14.487 INFO:tasks.workunit.client.1.vm05.stdout:7/473: stat d1/d2/d8/dc/d1b/d30/d4b/d65/f7f 0 2026-03-09T16:15:14.489 INFO:tasks.workunit.client.1.vm05.stdout:6/401: rename d17/d22/d27/d34/f6e to 
d17/d22/d27/d58/f97 0 2026-03-09T16:15:14.489 INFO:tasks.workunit.client.1.vm05.stdout:7/474: mknod d1/d2/d8/dc/d1b/d71/d3c/ca1 0 2026-03-09T16:15:14.498 INFO:tasks.workunit.client.1.vm05.stdout:5/408: fdatasync d8/d53/d7a/f92 0 2026-03-09T16:15:14.499 INFO:tasks.workunit.client.1.vm05.stdout:3/342: dread d0/d9/d22/f2a [0,4194304] 0 2026-03-09T16:15:14.500 INFO:tasks.workunit.client.1.vm05.stdout:5/409: stat d8/d18/d1b/d6b 0 2026-03-09T16:15:14.500 INFO:tasks.workunit.client.1.vm05.stdout:6/402: write d17/d22/d27/d8a/f88 [138129,116613] 0 2026-03-09T16:15:14.505 INFO:tasks.workunit.client.1.vm05.stdout:5/410: creat d8/d3d/f94 x:0 0 0 2026-03-09T16:15:14.505 INFO:tasks.workunit.client.1.vm05.stdout:2/409: dwrite db/dd/d15/d3f/d5b/f69 [0,4194304] 0 2026-03-09T16:15:14.512 INFO:tasks.workunit.client.1.vm05.stdout:3/343: rename d0/l34 to d0/d9/d22/d4c/d4e/l6f 0 2026-03-09T16:15:14.514 INFO:tasks.workunit.client.1.vm05.stdout:3/344: read d0/d9/d22/d4c/d4e/f55 [4823794,59746] 0 2026-03-09T16:15:14.514 INFO:tasks.workunit.client.1.vm05.stdout:5/411: fsync d8/d18/d1b/d47/d4e/f64 0 2026-03-09T16:15:14.514 INFO:tasks.workunit.client.1.vm05.stdout:2/410: write db/f12 [1690782,49244] 0 2026-03-09T16:15:14.519 INFO:tasks.workunit.client.1.vm05.stdout:6/403: link d17/d22/d27/d34/d42/d53/f55 d17/d22/d27/d34/d4b/f98 0 2026-03-09T16:15:14.519 INFO:tasks.workunit.client.1.vm05.stdout:7/475: mkdir d1/d2/d11/d86/da2 0 2026-03-09T16:15:14.520 INFO:tasks.workunit.client.1.vm05.stdout:5/412: dread - d8/d53/d7e/f8a zero size 2026-03-09T16:15:14.520 INFO:tasks.workunit.client.1.vm05.stdout:3/345: chown d0/d9/d22/f18 177 1 2026-03-09T16:15:14.521 INFO:tasks.workunit.client.1.vm05.stdout:6/404: write d17/f5b [872785,64371] 0 2026-03-09T16:15:14.522 INFO:tasks.workunit.client.1.vm05.stdout:7/476: dread - d1/d2/d8/dc/d1b/d30/d5e/f92 zero size 2026-03-09T16:15:14.522 INFO:tasks.workunit.client.1.vm05.stdout:6/405: chown d17/d22/d27/f6b 421921 1 2026-03-09T16:15:14.524 INFO:tasks.workunit.client.1.vm05.stdout:6/406: chown d17/d22/d27/l92 18 1 2026-03-09T16:15:14.525 INFO:tasks.workunit.client.1.vm05.stdout:7/477: truncate d1/d2/d11/d86/f96 890392 0 2026-03-09T16:15:14.529 INFO:tasks.workunit.client.1.vm05.stdout:2/411: rmdir db/dd/d15/d1f 39 2026-03-09T16:15:14.530 INFO:tasks.workunit.client.1.vm05.stdout:7/478: creat d1/d2/d11/d86/d8a/fa3 x:0 0 0 2026-03-09T16:15:14.535 INFO:tasks.workunit.client.1.vm05.stdout:6/407: fdatasync d17/d22/d27/d34/d42/d53/f55 0 2026-03-09T16:15:14.546 INFO:tasks.workunit.client.1.vm05.stdout:5/413: mkdir d8/d95 0 2026-03-09T16:15:14.546 INFO:tasks.workunit.client.1.vm05.stdout:6/408: chown d17/d1d/c1f 245222 1 2026-03-09T16:15:14.546 INFO:tasks.workunit.client.1.vm05.stdout:6/409: readlink d17/d22/l59 0 2026-03-09T16:15:14.546 INFO:tasks.workunit.client.1.vm05.stdout:6/410: readlink d17/d22/d27/l92 0 2026-03-09T16:15:14.547 INFO:tasks.workunit.client.1.vm05.stdout:3/346: symlink d0/l70 0 2026-03-09T16:15:14.547 INFO:tasks.workunit.client.1.vm05.stdout:4/398: fsync d5/de/d15/d21/d39/d5d/d7d/f89 0 2026-03-09T16:15:14.547 INFO:tasks.workunit.client.1.vm05.stdout:7/479: symlink d1/d2/d8/dc/d72/la4 0 2026-03-09T16:15:14.547 INFO:tasks.workunit.client.1.vm05.stdout:2/412: readlink db/dd/d15/d1f/d21/l33 0 2026-03-09T16:15:14.547 INFO:tasks.workunit.client.1.vm05.stdout:6/411: unlink d17/d22/d27/d44/l57 0 2026-03-09T16:15:14.547 INFO:tasks.workunit.client.1.vm05.stdout:3/347: mknod d0/d9/c71 0 2026-03-09T16:15:14.547 INFO:tasks.workunit.client.1.vm05.stdout:7/480: truncate d1/d2/d8/dc/d1b/d71/f97 
186796 0 2026-03-09T16:15:14.553 INFO:tasks.workunit.client.1.vm05.stdout:4/399: creat d5/de/d15/d21/d27/d3c/d5c/d5f/f8d x:0 0 0 2026-03-09T16:15:14.562 INFO:tasks.workunit.client.1.vm05.stdout:9/475: truncate d4/d10/d35/d2b/d38/f78 128043 0 2026-03-09T16:15:14.563 INFO:tasks.workunit.client.1.vm05.stdout:4/400: write d5/de/d15/d21/d39/d5d/f6a [1289576,34775] 0 2026-03-09T16:15:14.567 INFO:tasks.workunit.client.1.vm05.stdout:4/401: chown d5/de/d15/d21/f79 8 1 2026-03-09T16:15:14.571 INFO:tasks.workunit.client.1.vm05.stdout:2/413: read db/dd/d15/d1f/d21/f29 [528421,100301] 0 2026-03-09T16:15:14.572 INFO:tasks.workunit.client.1.vm05.stdout:9/476: creat d4/d10/d35/d2b/d38/d65/fa5 x:0 0 0 2026-03-09T16:15:14.576 INFO:tasks.workunit.client.1.vm05.stdout:0/429: truncate d5/d2c/f41 2636661 0 2026-03-09T16:15:14.579 INFO:tasks.workunit.client.1.vm05.stdout:5/414: dread d8/f11 [0,4194304] 0 2026-03-09T16:15:14.579 INFO:tasks.workunit.client.1.vm05.stdout:6/412: truncate d17/f1b 921439 0 2026-03-09T16:15:14.582 INFO:tasks.workunit.client.1.vm05.stdout:3/348: dwrite d0/d9/d22/d4c/d4e/f55 [4194304,4194304] 0 2026-03-09T16:15:14.591 INFO:tasks.workunit.client.1.vm05.stdout:4/402: creat d5/de/d2f/f8e x:0 0 0 2026-03-09T16:15:14.591 INFO:tasks.workunit.client.1.vm05.stdout:1/448: truncate d7/f9 1042232 0 2026-03-09T16:15:14.592 INFO:tasks.workunit.client.1.vm05.stdout:9/477: creat d4/d10/d35/d2b/d38/fa6 x:0 0 0 2026-03-09T16:15:14.601 INFO:tasks.workunit.client.1.vm05.stdout:3/349: chown d0/d9/d22/d4c/d4e/f50 6429239 1 2026-03-09T16:15:14.604 INFO:tasks.workunit.client.1.vm05.stdout:3/350: chown d0/d9/f51 1965 1 2026-03-09T16:15:14.604 INFO:tasks.workunit.client.1.vm05.stdout:0/430: write d5/d1b/f6a [4340276,101851] 0 2026-03-09T16:15:14.605 INFO:tasks.workunit.client.1.vm05.stdout:4/403: chown d5/de/d15/d21/d27/d3c/d5c/d5f/c4b 7 1 2026-03-09T16:15:14.605 INFO:tasks.workunit.client.1.vm05.stdout:9/478: truncate d4/d10/d35/d36/d48/d60/f98 1040026 0 2026-03-09T16:15:14.609 INFO:tasks.workunit.client.1.vm05.stdout:9/479: stat d4/d10/d35/f44 0 2026-03-09T16:15:14.609 INFO:tasks.workunit.client.1.vm05.stdout:4/404: dread - d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/f86 zero size 2026-03-09T16:15:14.609 INFO:tasks.workunit.client.1.vm05.stdout:5/415: write d8/d59/d5b/f66 [483761,25627] 0 2026-03-09T16:15:14.610 INFO:tasks.workunit.client.1.vm05.stdout:6/413: symlink d17/d4f/l99 0 2026-03-09T16:15:14.614 INFO:tasks.workunit.client.1.vm05.stdout:9/480: fdatasync d4/d10/d35/d36/d48/f9e 0 2026-03-09T16:15:14.615 INFO:tasks.workunit.client.1.vm05.stdout:1/449: creat d7/dd/d21/d39/d48/d5d/fa0 x:0 0 0 2026-03-09T16:15:14.615 INFO:tasks.workunit.client.1.vm05.stdout:7/481: getdents d1/d2/d8/dc/d1b/d71/d3c 0 2026-03-09T16:15:14.625 INFO:tasks.workunit.client.1.vm05.stdout:9/481: fdatasync d4/d10/f15 0 2026-03-09T16:15:14.630 INFO:tasks.workunit.client.1.vm05.stdout:0/431: mkdir d5/d2c/d49/d83/d8b 0 2026-03-09T16:15:14.632 INFO:tasks.workunit.client.1.vm05.stdout:1/450: read d7/d15/d16/f74 [337482,22906] 0 2026-03-09T16:15:14.637 INFO:tasks.workunit.client.1.vm05.stdout:6/414: creat d17/d5d/d73/d83/f9a x:0 0 0 2026-03-09T16:15:14.645 INFO:tasks.workunit.client.1.vm05.stdout:5/416: mknod d8/d18/d1b/d47/d4e/d76/d8f/c96 0 2026-03-09T16:15:14.645 INFO:tasks.workunit.client.1.vm05.stdout:0/432: chown d5/db/f12 11097 1 2026-03-09T16:15:14.645 INFO:tasks.workunit.client.1.vm05.stdout:6/415: dread d17/d5d/f78 [0,4194304] 0 2026-03-09T16:15:14.645 INFO:tasks.workunit.client.1.vm05.stdout:6/416: read d17/d5d/f78 [2145526,34668] 0 
2026-03-09T16:15:14.645 INFO:tasks.workunit.client.1.vm05.stdout:7/482: creat d1/d2/d8/dc/d1b/d30/d7d/fa5 x:0 0 0 2026-03-09T16:15:14.645 INFO:tasks.workunit.client.1.vm05.stdout:4/405: dwrite d5/de/f16 [0,4194304] 0 2026-03-09T16:15:14.649 INFO:tasks.workunit.client.1.vm05.stdout:6/417: chown d17/d22/d27/d34/d4b 236518 1 2026-03-09T16:15:14.652 INFO:tasks.workunit.client.1.vm05.stdout:3/351: dwrite d0/d33/f5e [0,4194304] 0 2026-03-09T16:15:14.657 INFO:tasks.workunit.client.1.vm05.stdout:9/482: sync 2026-03-09T16:15:14.664 INFO:tasks.workunit.client.1.vm05.stdout:4/406: creat d5/de/d15/d21/d27/f8f x:0 0 0 2026-03-09T16:15:14.666 INFO:tasks.workunit.client.1.vm05.stdout:3/352: symlink d0/d9/l72 0 2026-03-09T16:15:14.666 INFO:tasks.workunit.client.1.vm05.stdout:2/414: truncate db/dd/d15/d3f/d5b/f69 4057741 0 2026-03-09T16:15:14.667 INFO:tasks.workunit.client.1.vm05.stdout:1/451: mknod d7/dd/de/d52/ca1 0 2026-03-09T16:15:14.670 INFO:tasks.workunit.client.1.vm05.stdout:4/407: write d5/de/d15/d21/d39/f53 [626737,102163] 0 2026-03-09T16:15:14.670 INFO:tasks.workunit.client.1.vm05.stdout:9/483: creat d4/d10/d35/d2b/d31/d96/fa7 x:0 0 0 2026-03-09T16:15:14.670 INFO:tasks.workunit.client.1.vm05.stdout:6/418: link d17/f60 d17/d22/d27/d34/d42/f9b 0 2026-03-09T16:15:14.671 INFO:tasks.workunit.client.1.vm05.stdout:3/353: creat d0/d9/f73 x:0 0 0 2026-03-09T16:15:14.671 INFO:tasks.workunit.client.1.vm05.stdout:4/408: chown d5/de/d15/d21/d27/d3c/f3d 443 1 2026-03-09T16:15:14.671 INFO:tasks.workunit.client.1.vm05.stdout:2/415: mknod db/dd/d15/d3f/d5b/d60/c8b 0 2026-03-09T16:15:14.671 INFO:tasks.workunit.client.1.vm05.stdout:9/484: dread - d4/d10/d35/d36/d48/d54/d59/f9f zero size 2026-03-09T16:15:14.672 INFO:tasks.workunit.client.1.vm05.stdout:7/483: creat d1/d2/d11/d86/fa6 x:0 0 0 2026-03-09T16:15:14.673 INFO:tasks.workunit.client.1.vm05.stdout:6/419: fsync f11 0 2026-03-09T16:15:14.679 INFO:tasks.workunit.client.1.vm05.stdout:2/416: fdatasync db/dd/d15/d1f/f36 0 2026-03-09T16:15:14.680 INFO:tasks.workunit.client.1.vm05.stdout:1/452: symlink d7/dd/de/d52/d5b/la2 0 2026-03-09T16:15:14.688 INFO:tasks.workunit.client.1.vm05.stdout:7/484: truncate d1/d2/d8/dc/d1b/d30/d5e/f81 344933 0 2026-03-09T16:15:14.701 INFO:tasks.workunit.client.1.vm05.stdout:2/417: dwrite db/dd/d15/f70 [0,4194304] 0 2026-03-09T16:15:14.702 INFO:tasks.workunit.client.1.vm05.stdout:5/417: getdents d8/d53/d7a 0 2026-03-09T16:15:14.702 INFO:tasks.workunit.client.1.vm05.stdout:6/420: rmdir d17/d22/d27 39 2026-03-09T16:15:14.706 INFO:tasks.workunit.client.1.vm05.stdout:2/418: fsync db/dd/f1b 0 2026-03-09T16:15:14.706 INFO:tasks.workunit.client.1.vm05.stdout:3/354: rename d0/d9/d22/c24 to d0/d9/c74 0 2026-03-09T16:15:14.708 INFO:tasks.workunit.client.1.vm05.stdout:1/453: mkdir d7/d62/da3 0 2026-03-09T16:15:14.708 INFO:tasks.workunit.client.1.vm05.stdout:1/454: chown d7/dd/de/d96 274062 1 2026-03-09T16:15:14.719 INFO:tasks.workunit.client.1.vm05.stdout:4/409: dwrite d5/de/d15/f34 [0,4194304] 0 2026-03-09T16:15:14.729 INFO:tasks.workunit.client.1.vm05.stdout:1/455: dwrite d7/dd/de/d52/f7d [0,4194304] 0 2026-03-09T16:15:14.736 INFO:tasks.workunit.client.1.vm05.stdout:1/456: chown d7/dd/d21/d39/d5a/f54 6 1 2026-03-09T16:15:14.737 INFO:tasks.workunit.client.1.vm05.stdout:9/485: rename d4/d10/f2a to d4/d10/d35/d2b/d31/fa8 0 2026-03-09T16:15:14.737 INFO:tasks.workunit.client.1.vm05.stdout:6/421: write d17/d22/d27/d34/d42/d53/f74 [1824259,129208] 0 2026-03-09T16:15:14.737 INFO:tasks.workunit.client.1.vm05.stdout:5/418: symlink d8/d18/d1b/d47/d4e/l97 0 
2026-03-09T16:15:14.744 INFO:tasks.workunit.client.1.vm05.stdout:6/422: fsync d17/d1d/f41 0 2026-03-09T16:15:14.752 INFO:tasks.workunit.client.1.vm05.stdout:3/355: fsync d0/d9/f37 0 2026-03-09T16:15:14.755 INFO:tasks.workunit.client.1.vm05.stdout:2/419: mknod db/dd/d15/d1f/d20/d23/c8c 0 2026-03-09T16:15:14.765 INFO:tasks.workunit.client.1.vm05.stdout:2/420: write db/dd/f10 [3166819,45380] 0 2026-03-09T16:15:14.765 INFO:tasks.workunit.client.1.vm05.stdout:3/356: write d0/d9/d22/f30 [4364252,13769] 0 2026-03-09T16:15:14.765 INFO:tasks.workunit.client.1.vm05.stdout:1/457: creat d7/dd/d21/d39/fa4 x:0 0 0 2026-03-09T16:15:14.765 INFO:tasks.workunit.client.1.vm05.stdout:2/421: dread db/dd/d15/d1f/d20/f3d [0,4194304] 0 2026-03-09T16:15:14.765 INFO:tasks.workunit.client.1.vm05.stdout:1/458: stat d7/dd/d21/d39/d48 0 2026-03-09T16:15:14.766 INFO:tasks.workunit.client.1.vm05.stdout:1/459: write d7/dd/de/d52/f7d [4029681,59147] 0 2026-03-09T16:15:14.766 INFO:tasks.workunit.client.1.vm05.stdout:9/486: rename d4/d10/l42 to d4/d10/d35/d36/d48/d54/la9 0 2026-03-09T16:15:14.766 INFO:tasks.workunit.client.1.vm05.stdout:5/419: creat d8/d95/f98 x:0 0 0 2026-03-09T16:15:14.771 INFO:tasks.workunit.client.1.vm05.stdout:6/423: unlink d17/d22/d27/d34/d4b/f6c 0 2026-03-09T16:15:14.771 INFO:tasks.workunit.client.1.vm05.stdout:2/422: mkdir db/dd/d15/d46/d8d 0 2026-03-09T16:15:14.771 INFO:tasks.workunit.client.1.vm05.stdout:1/460: rmdir d7/dd/d21/d44 39 2026-03-09T16:15:14.772 INFO:tasks.workunit.client.1.vm05.stdout:1/461: chown d7/d27/l91 98 1 2026-03-09T16:15:14.772 INFO:tasks.workunit.client.1.vm05.stdout:4/410: getdents d5/de/d15/d21/d27/d3c 0 2026-03-09T16:15:14.775 INFO:tasks.workunit.client.1.vm05.stdout:5/420: creat d8/d18/d1b/d47/d48/f99 x:0 0 0 2026-03-09T16:15:14.776 INFO:tasks.workunit.client.1.vm05.stdout:9/487: creat d4/d10/faa x:0 0 0 2026-03-09T16:15:14.776 INFO:tasks.workunit.client.1.vm05.stdout:9/488: read - d4/d10/faa zero size 2026-03-09T16:15:14.777 INFO:tasks.workunit.client.1.vm05.stdout:4/411: chown d5/de/d15/d21/d27/d3c/d5c/d5f/d4e 1 1 2026-03-09T16:15:14.778 INFO:tasks.workunit.client.1.vm05.stdout:2/423: mknod db/dd/d15/d4c/d56/c8e 0 2026-03-09T16:15:14.779 INFO:tasks.workunit.client.1.vm05.stdout:9/489: unlink d4/l1f 0 2026-03-09T16:15:14.780 INFO:tasks.workunit.client.1.vm05.stdout:6/424: creat d17/d22/d27/d8a/d8b/f9c x:0 0 0 2026-03-09T16:15:14.782 INFO:tasks.workunit.client.1.vm05.stdout:9/490: truncate d4/d10/f1d 5160963 0 2026-03-09T16:15:14.782 INFO:tasks.workunit.client.1.vm05.stdout:4/412: fsync d5/de/f23 0 2026-03-09T16:15:14.782 INFO:tasks.workunit.client.1.vm05.stdout:5/421: symlink d8/d18/d1b/d2e/l9a 0 2026-03-09T16:15:14.787 INFO:tasks.workunit.client.1.vm05.stdout:7/485: dwrite d1/d2/d8/dc/d1b/f5a [0,4194304] 0 2026-03-09T16:15:14.787 INFO:tasks.workunit.client.1.vm05.stdout:1/462: sync 2026-03-09T16:15:14.788 INFO:tasks.workunit.client.1.vm05.stdout:6/425: fsync d17/d22/d27/d34/d42/d53/f55 0 2026-03-09T16:15:14.791 INFO:tasks.workunit.client.1.vm05.stdout:2/424: creat db/dd/d15/d1f/d20/d86/f8f x:0 0 0 2026-03-09T16:15:14.791 INFO:tasks.workunit.client.1.vm05.stdout:7/486: stat d1/d2/d11/c12 0 2026-03-09T16:15:14.791 INFO:tasks.workunit.client.1.vm05.stdout:1/463: mkdir d7/d62/da5 0 2026-03-09T16:15:14.791 INFO:tasks.workunit.client.1.vm05.stdout:7/487: readlink d1/l7c 0 2026-03-09T16:15:14.796 INFO:tasks.workunit.client.1.vm05.stdout:5/422: symlink d8/d18/d1b/d2e/l9b 0 2026-03-09T16:15:14.796 INFO:tasks.workunit.client.1.vm05.stdout:7/488: fdatasync d1/fa0 0 
2026-03-09T16:15:14.800 INFO:tasks.workunit.client.1.vm05.stdout:9/491: link d4/d10/d35/cb d4/d10/d35/d36/d48/d54/cab 0 2026-03-09T16:15:14.805 INFO:tasks.workunit.client.1.vm05.stdout:6/426: rename d17/d22/d27/d34/d42/d68 to d17/d22/d9d 0 2026-03-09T16:15:14.805 INFO:tasks.workunit.client.1.vm05.stdout:6/427: readlink d17/d22/d27/l54 0 2026-03-09T16:15:14.805 INFO:tasks.workunit.client.1.vm05.stdout:2/425: creat db/dd/d15/f90 x:0 0 0 2026-03-09T16:15:14.805 INFO:tasks.workunit.client.1.vm05.stdout:2/426: readlink db/dd/d15/l42 0 2026-03-09T16:15:14.805 INFO:tasks.workunit.client.1.vm05.stdout:7/489: chown d1/d2/d8/c16 3313648 1 2026-03-09T16:15:14.807 INFO:tasks.workunit.client.1.vm05.stdout:2/427: fdatasync db/dd/d15/d3f/d55/f80 0 2026-03-09T16:15:14.811 INFO:tasks.workunit.client.1.vm05.stdout:1/464: link d7/dd/d21/d39/d48/d5d/fa0 d7/dd/d21/d39/d87/fa6 0 2026-03-09T16:15:14.813 INFO:tasks.workunit.client.1.vm05.stdout:5/423: rename d8/d18/d1b/d47/d4e/c65 to d8/d18/d1b/d47/c9c 0 2026-03-09T16:15:14.814 INFO:tasks.workunit.client.1.vm05.stdout:3/357: truncate d0/d9/d22/f2e 514027 0 2026-03-09T16:15:14.817 INFO:tasks.workunit.client.1.vm05.stdout:6/428: creat d17/d5d/d73/f9e x:0 0 0 2026-03-09T16:15:14.817 INFO:tasks.workunit.client.1.vm05.stdout:4/413: mkdir d5/d19/d90 0 2026-03-09T16:15:14.827 INFO:tasks.workunit.client.1.vm05.stdout:7/490: creat d1/d2/d8/dc/d1b/d30/d7d/fa7 x:0 0 0 2026-03-09T16:15:14.828 INFO:tasks.workunit.client.1.vm05.stdout:3/358: mkdir d0/d9/d22/d5f/d75 0 2026-03-09T16:15:14.828 INFO:tasks.workunit.client.1.vm05.stdout:6/429: rmdir d17/d22/d27 39 2026-03-09T16:15:14.828 INFO:tasks.workunit.client.1.vm05.stdout:6/430: readlink d17/d22/l59 0 2026-03-09T16:15:14.830 INFO:tasks.workunit.client.1.vm05.stdout:4/414: fdatasync d5/de/d15/d21/d39/d5d/d7d/f7e 0 2026-03-09T16:15:14.830 INFO:tasks.workunit.client.1.vm05.stdout:1/465: mkdir d7/dd/d21/d39/d48/da7 0 2026-03-09T16:15:14.831 INFO:tasks.workunit.client.1.vm05.stdout:1/466: readlink d7/d27/l91 0 2026-03-09T16:15:14.839 INFO:tasks.workunit.client.1.vm05.stdout:6/431: mkdir d17/d22/d27/d34/d42/d53/d9f 0 2026-03-09T16:15:14.845 INFO:tasks.workunit.client.1.vm05.stdout:1/467: sync 2026-03-09T16:15:14.845 INFO:tasks.workunit.client.1.vm05.stdout:5/424: sync 2026-03-09T16:15:14.847 INFO:tasks.workunit.client.1.vm05.stdout:6/432: rename d17/d4f/f96 to d17/d22/d27/d34/d42/d53/d9f/fa0 0 2026-03-09T16:15:14.847 INFO:tasks.workunit.client.1.vm05.stdout:9/492: write d4/f61 [654933,124566] 0 2026-03-09T16:15:14.848 INFO:tasks.workunit.client.1.vm05.stdout:7/491: dwrite d1/d2/d8/dc/d1b/f5a [4194304,4194304] 0 2026-03-09T16:15:14.852 INFO:tasks.workunit.client.1.vm05.stdout:4/415: dread d5/d19/f1f [0,4194304] 0 2026-03-09T16:15:14.852 INFO:tasks.workunit.client.1.vm05.stdout:9/493: chown d4/d10/d35/d2b/d38/fa6 13772 1 2026-03-09T16:15:14.854 INFO:tasks.workunit.client.1.vm05.stdout:4/416: chown d5/d19/f1f 44630700 1 2026-03-09T16:15:14.856 INFO:tasks.workunit.client.1.vm05.stdout:8/399: dread f0 [0,4194304] 0 2026-03-09T16:15:14.868 INFO:tasks.workunit.client.1.vm05.stdout:8/400: truncate d4/d6/d53/f7f 812994 0 2026-03-09T16:15:14.868 INFO:tasks.workunit.client.1.vm05.stdout:2/428: write db/dd/d15/d3f/d5b/f69 [16150,54875] 0 2026-03-09T16:15:14.868 INFO:tasks.workunit.client.1.vm05.stdout:6/433: creat d17/d22/d27/d8a/fa1 x:0 0 0 2026-03-09T16:15:14.869 INFO:tasks.workunit.client.1.vm05.stdout:8/401: dread - d4/d6/f1f zero size 2026-03-09T16:15:14.874 INFO:tasks.workunit.client.1.vm05.stdout:9/494: stat d4/d10/d35/d36/d48/d54/la9 0 
2026-03-09T16:15:14.875 INFO:tasks.workunit.client.1.vm05.stdout:7/492: mkdir d1/d2/d8/dc/d72/da8 0 2026-03-09T16:15:14.877 INFO:tasks.workunit.client.1.vm05.stdout:4/417: mkdir d5/de/d15/d21/d39/d91 0 2026-03-09T16:15:14.879 INFO:tasks.workunit.client.1.vm05.stdout:7/493: dread - d1/d2/d11/d86/d8a/fa3 zero size 2026-03-09T16:15:14.880 INFO:tasks.workunit.client.1.vm05.stdout:3/359: dwrite d0/d33/f36 [0,4194304] 0 2026-03-09T16:15:14.880 INFO:tasks.workunit.client.1.vm05.stdout:8/402: chown d4/d55 16780 1 2026-03-09T16:15:14.886 INFO:tasks.workunit.client.1.vm05.stdout:8/403: chown d4/d6/db/df/c57 4048439 1 2026-03-09T16:15:14.887 INFO:tasks.workunit.client.1.vm05.stdout:6/434: mknod d17/d22/d27/d8a/d8b/ca2 0 2026-03-09T16:15:14.888 INFO:tasks.workunit.client.1.vm05.stdout:0/433: dread d5/db/f6e [0,4194304] 0 2026-03-09T16:15:14.891 INFO:tasks.workunit.client.1.vm05.stdout:2/429: write db/f17 [1408915,123213] 0 2026-03-09T16:15:14.893 INFO:tasks.workunit.client.1.vm05.stdout:2/430: truncate db/dd/d15/d3f/d5b/d60/f7c 463993 0 2026-03-09T16:15:14.893 INFO:tasks.workunit.client.1.vm05.stdout:1/468: dwrite d7/d15/d16/f1c [0,4194304] 0 2026-03-09T16:15:14.895 INFO:tasks.workunit.client.1.vm05.stdout:5/425: rename d8/d18/d1b/f2c to d8/d18/d1b/d2e/f9d 0 2026-03-09T16:15:14.895 INFO:tasks.workunit.client.1.vm05.stdout:4/418: creat d5/de/d15/d21/d27/d3c/f92 x:0 0 0 2026-03-09T16:15:14.896 INFO:tasks.workunit.client.1.vm05.stdout:6/435: mknod d17/d22/d9d/ca3 0 2026-03-09T16:15:14.898 INFO:tasks.workunit.client.1.vm05.stdout:2/431: stat db/dd/d15/d3f/d5b/d60/d6a/f8a 0 2026-03-09T16:15:14.900 INFO:tasks.workunit.client.1.vm05.stdout:7/494: link d1/d2/d8/dc/d33/f9f d1/d2/d8/d31/d8d/fa9 0 2026-03-09T16:15:14.901 INFO:tasks.workunit.client.1.vm05.stdout:0/434: write d5/db/d5b/d82/f89 [762415,42514] 0 2026-03-09T16:15:14.903 INFO:tasks.workunit.client.1.vm05.stdout:1/469: unlink d7/dd/de/d52/d5b/f8b 0 2026-03-09T16:15:14.903 INFO:tasks.workunit.client.1.vm05.stdout:5/426: sync 2026-03-09T16:15:14.904 INFO:tasks.workunit.client.1.vm05.stdout:7/495: chown d1/d2/d8/dc/d1b/d30/d4b/l90 58230614 1 2026-03-09T16:15:14.904 INFO:tasks.workunit.client.1.vm05.stdout:1/470: dread - d7/dd/d21/d3b/d55/d95/f99 zero size 2026-03-09T16:15:14.906 INFO:tasks.workunit.client.1.vm05.stdout:0/435: dread - d5/db/d1d/f6d zero size 2026-03-09T16:15:14.906 INFO:tasks.workunit.client.1.vm05.stdout:7/496: chown d1/d2/d8/dc/d33/f9d 154134 1 2026-03-09T16:15:14.906 INFO:tasks.workunit.client.1.vm05.stdout:2/432: creat db/dd/d15/d46/f91 x:0 0 0 2026-03-09T16:15:14.907 INFO:tasks.workunit.client.1.vm05.stdout:2/433: chown db/dd/d15/f90 317194568 1 2026-03-09T16:15:14.912 INFO:tasks.workunit.client.1.vm05.stdout:0/436: chown d5/d2c/d49/f80 388479627 1 2026-03-09T16:15:14.915 INFO:tasks.workunit.client.1.vm05.stdout:0/437: unlink d5/db/d5b/f87 0 2026-03-09T16:15:14.915 INFO:tasks.workunit.client.1.vm05.stdout:9/495: dread d4/d10/d35/d2b/f2c [4194304,4194304] 0 2026-03-09T16:15:14.916 INFO:tasks.workunit.client.1.vm05.stdout:1/471: dread d7/d15/d16/f74 [0,4194304] 0 2026-03-09T16:15:14.918 INFO:tasks.workunit.client.1.vm05.stdout:9/496: chown d4/d10/c46 3436 1 2026-03-09T16:15:14.922 INFO:tasks.workunit.client.1.vm05.stdout:1/472: link d7/l9a d7/dd/d21/d63/d71/la8 0 2026-03-09T16:15:14.922 INFO:tasks.workunit.client.1.vm05.stdout:7/497: dread d1/d2/d8/dc/d33/f57 [0,4194304] 0 2026-03-09T16:15:14.926 INFO:tasks.workunit.client.1.vm05.stdout:1/473: chown d7/dd/d21/d39/fa4 3784 1 2026-03-09T16:15:14.926 
INFO:tasks.workunit.client.1.vm05.stdout:7/498: mknod d1/d2/d8/dc/caa 0 2026-03-09T16:15:14.927 INFO:tasks.workunit.client.1.vm05.stdout:7/499: dwrite d1/d2/d8/dc/d9c/f6b [0,4194304] 0 2026-03-09T16:15:14.929 INFO:tasks.workunit.client.1.vm05.stdout:7/500: dread - d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/f49 zero size 2026-03-09T16:15:14.938 INFO:tasks.workunit.client.1.vm05.stdout:5/427: stat d8/d18/d1b/l42 0 2026-03-09T16:15:14.949 INFO:tasks.workunit.client.1.vm05.stdout:8/404: truncate d4/d6/d3a/d3c/f45 4160923 0 2026-03-09T16:15:14.950 INFO:tasks.workunit.client.1.vm05.stdout:7/501: dread d1/d2/d8/dc/d1b/d71/f59 [0,4194304] 0 2026-03-09T16:15:14.956 INFO:tasks.workunit.client.1.vm05.stdout:3/360: truncate d0/d9/d22/f2a 3421485 0 2026-03-09T16:15:14.956 INFO:tasks.workunit.client.1.vm05.stdout:6/436: write d17/d1d/f1e [1694956,115621] 0 2026-03-09T16:15:14.956 INFO:tasks.workunit.client.1.vm05.stdout:3/361: read - d0/f69 zero size 2026-03-09T16:15:14.956 INFO:tasks.workunit.client.1.vm05.stdout:6/437: chown d17/d22/d27/d44/l5e 34199917 1 2026-03-09T16:15:14.956 INFO:tasks.workunit.client.1.vm05.stdout:7/502: dread - d1/d2/d8/dc/d1b/f66 zero size 2026-03-09T16:15:14.956 INFO:tasks.workunit.client.1.vm05.stdout:8/405: rename d4/d6/d3a/d40/f52 to d4/d6/d3a/d3c/f8d 0 2026-03-09T16:15:14.956 INFO:tasks.workunit.client.1.vm05.stdout:8/406: readlink d4/d6/db/df/d4f/l5b 0 2026-03-09T16:15:14.956 INFO:tasks.workunit.client.1.vm05.stdout:4/419: truncate d5/de/d15/d21/d39/d5d/f70 468185 0 2026-03-09T16:15:14.957 INFO:tasks.workunit.client.1.vm05.stdout:2/434: truncate db/dd/d15/d1f/f49 11526803 0 2026-03-09T16:15:14.959 INFO:tasks.workunit.client.1.vm05.stdout:0/438: truncate d5/db/d1d/f52 1107270 0 2026-03-09T16:15:14.960 INFO:tasks.workunit.client.1.vm05.stdout:9/497: write d4/d10/d35/d2b/d31/d96/f9b [481970,112639] 0 2026-03-09T16:15:14.963 INFO:tasks.workunit.client.1.vm05.stdout:3/362: sync 2026-03-09T16:15:14.964 INFO:tasks.workunit.client.1.vm05.stdout:9/498: sync 2026-03-09T16:15:14.964 INFO:tasks.workunit.client.1.vm05.stdout:7/503: mknod d1/d2/d8/dc/d1b/d71/cab 0 2026-03-09T16:15:14.965 INFO:tasks.workunit.client.1.vm05.stdout:5/428: rename d8/d18/d1b/d47/d48/d73/l86 to d8/d18/d1b/d2e/d43/l9e 0 2026-03-09T16:15:14.968 INFO:tasks.workunit.client.1.vm05.stdout:1/474: write d7/d15/f8d [998647,45790] 0 2026-03-09T16:15:14.971 INFO:tasks.workunit.client.1.vm05.stdout:8/407: dread f0 [0,4194304] 0 2026-03-09T16:15:14.972 INFO:tasks.workunit.client.1.vm05.stdout:3/363: write d0/f5a [916038,58523] 0 2026-03-09T16:15:14.973 INFO:tasks.workunit.client.1.vm05.stdout:8/408: sync 2026-03-09T16:15:14.977 INFO:tasks.workunit.client.1.vm05.stdout:5/429: symlink d8/d53/l9f 0 2026-03-09T16:15:14.980 INFO:tasks.workunit.client.1.vm05.stdout:2/435: creat db/dd/d15/d1f/d20/d23/d78/f92 x:0 0 0 2026-03-09T16:15:14.981 INFO:tasks.workunit.client.1.vm05.stdout:5/430: dread - d8/d1d/f85 zero size 2026-03-09T16:15:14.981 INFO:tasks.workunit.client.1.vm05.stdout:2/436: fdatasync db/dd/f1b 0 2026-03-09T16:15:14.982 INFO:tasks.workunit.client.1.vm05.stdout:4/420: dwrite f1 [4194304,4194304] 0 2026-03-09T16:15:14.988 INFO:tasks.workunit.client.1.vm05.stdout:6/438: dwrite d17/d22/d27/d44/f48 [4194304,4194304] 0 2026-03-09T16:15:14.988 INFO:tasks.workunit.client.1.vm05.stdout:4/421: dread - d5/de/f87 zero size 2026-03-09T16:15:14.990 INFO:tasks.workunit.client.1.vm05.stdout:4/422: chown d5/de/d15/d21/d27/d3c/c6b 62 1 2026-03-09T16:15:14.990 INFO:tasks.workunit.client.1.vm05.stdout:2/437: dread - db/dd/d15/d3f/f5c zero size 
2026-03-09T16:15:14.991 INFO:tasks.workunit.client.1.vm05.stdout:5/431: read d8/d1d/f21 [18405,28767] 0 2026-03-09T16:15:14.998 INFO:tasks.workunit.client.1.vm05.stdout:3/364: dwrite d0/f45 [0,4194304] 0 2026-03-09T16:15:15.000 INFO:tasks.workunit.client.1.vm05.stdout:2/438: write db/dd/d15/d3f/d5b/f7d [4632772,122204] 0 2026-03-09T16:15:15.008 INFO:tasks.workunit.client.1.vm05.stdout:7/504: mknod d1/d2/d8/d67/d76/cac 0 2026-03-09T16:15:15.009 INFO:tasks.workunit.client.1.vm05.stdout:7/505: write d1/d2/d8/d31/f39 [2006263,7737] 0 2026-03-09T16:15:15.009 INFO:tasks.workunit.client.1.vm05.stdout:8/409: creat d4/d6/d3a/f8e x:0 0 0 2026-03-09T16:15:15.012 INFO:tasks.workunit.client.1.vm05.stdout:2/439: mknod db/dd/d15/d46/d67/c93 0 2026-03-09T16:15:15.015 INFO:tasks.workunit.client.1.vm05.stdout:0/439: link d5/d1b/l86 d5/db/d5b/d82/l8c 0 2026-03-09T16:15:15.015 INFO:tasks.workunit.client.1.vm05.stdout:4/423: getdents d5/de/d15/d21/d39/d91 0 2026-03-09T16:15:15.015 INFO:tasks.workunit.client.1.vm05.stdout:0/440: write d5/db/d1d/f60 [1290231,129005] 0 2026-03-09T16:15:15.018 INFO:tasks.workunit.client.1.vm05.stdout:0/441: chown d5/f17 112863 1 2026-03-09T16:15:15.019 INFO:tasks.workunit.client.1.vm05.stdout:9/499: rename d4/d10/d35/d36/d48/d54/cab to d4/d10/d35/d2b/d31/d96/cac 0 2026-03-09T16:15:15.023 INFO:tasks.workunit.client.1.vm05.stdout:5/432: mkdir d8/d59/d5b/d8b/da0 0 2026-03-09T16:15:15.029 INFO:tasks.workunit.client.1.vm05.stdout:5/433: truncate d8/d18/d1b/d2e/f8c 762679 0 2026-03-09T16:15:15.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:14 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:15.029 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:14 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:15.029 INFO:tasks.workunit.client.1.vm05.stdout:6/439: link d17/d5d/f84 d17/d22/d27/d34/d4b/fa4 0 2026-03-09T16:15:15.029 INFO:tasks.workunit.client.1.vm05.stdout:3/365: mkdir d0/d9/d22/d5f/d75/d76 0 2026-03-09T16:15:15.033 INFO:tasks.workunit.client.1.vm05.stdout:3/366: dwrite d0/d9/f5c [0,4194304] 0 2026-03-09T16:15:15.039 INFO:tasks.workunit.client.1.vm05.stdout:3/367: read - d0/d33/f41 zero size 2026-03-09T16:15:15.039 INFO:tasks.workunit.client.1.vm05.stdout:4/424: creat d5/d19/d37/d60/f93 x:0 0 0 2026-03-09T16:15:15.040 INFO:tasks.workunit.client.1.vm05.stdout:4/425: truncate d5/de/d2f/f8e 634167 0 2026-03-09T16:15:15.040 INFO:tasks.workunit.client.1.vm05.stdout:4/426: chown d5/de/d15/d21/d39/d5d/f6a 3 1 2026-03-09T16:15:15.041 INFO:tasks.workunit.client.1.vm05.stdout:4/427: truncate d5/f35 5177607 0 2026-03-09T16:15:15.043 INFO:tasks.workunit.client.1.vm05.stdout:4/428: truncate d5/de/d15/d21/d39/d5d/d7d/f89 637641 0 2026-03-09T16:15:15.046 INFO:tasks.workunit.client.1.vm05.stdout:6/440: mkdir d17/d22/d9d/da5 0 2026-03-09T16:15:15.049 INFO:tasks.workunit.client.1.vm05.stdout:0/442: symlink d5/db/l8d 0 2026-03-09T16:15:15.052 INFO:tasks.workunit.client.1.vm05.stdout:6/441: creat d17/d22/d27/d34/d42/d65/fa6 x:0 0 0 2026-03-09T16:15:15.053 INFO:tasks.workunit.client.1.vm05.stdout:3/368: link d0/d33/f3a d0/d33/f77 0 2026-03-09T16:15:15.053 INFO:tasks.workunit.client.1.vm05.stdout:3/369: stat d0/d9/f4d 0 2026-03-09T16:15:15.057 INFO:tasks.workunit.client.1.vm05.stdout:2/440: rename db/dd/d15/d3f/d5b/d60/l84 to db/dd/l94 0 2026-03-09T16:15:15.065 INFO:tasks.workunit.client.1.vm05.stdout:0/443: mknod d5/d11/d4f/c8e 0 2026-03-09T16:15:15.065 
INFO:tasks.workunit.client.1.vm05.stdout:4/429: symlink d5/de/d15/l94 0 2026-03-09T16:15:15.065 INFO:tasks.workunit.client.1.vm05.stdout:2/441: dread db/f17 [0,4194304] 0 2026-03-09T16:15:15.065 INFO:tasks.workunit.client.1.vm05.stdout:2/442: write db/dd/d15/f48 [5920901,46841] 0 2026-03-09T16:15:15.065 INFO:tasks.workunit.client.1.vm05.stdout:0/444: read d5/d1b/d30/f29 [3956500,29007] 0 2026-03-09T16:15:15.072 INFO:tasks.workunit.client.1.vm05.stdout:1/475: write d7/d27/f4d [774765,77089] 0 2026-03-09T16:15:15.072 INFO:tasks.workunit.client.1.vm05.stdout:1/476: chown d7/dd/d21/d39/d48/c51 91 1 2026-03-09T16:15:15.077 INFO:tasks.workunit.client.1.vm05.stdout:2/443: dread - db/f2d zero size 2026-03-09T16:15:15.079 INFO:tasks.workunit.client.1.vm05.stdout:4/430: link d5/de/d15/d21/f6d d5/f95 0 2026-03-09T16:15:15.082 INFO:tasks.workunit.client.1.vm05.stdout:1/477: symlink d7/dd/d21/d3b/d55/d95/la9 0 2026-03-09T16:15:15.083 INFO:tasks.workunit.client.1.vm05.stdout:6/442: getdents d17/d22/d27 0 2026-03-09T16:15:15.084 INFO:tasks.workunit.client.1.vm05.stdout:6/443: read - d17/d22/d27/d34/d4b/f6d zero size 2026-03-09T16:15:15.087 INFO:tasks.workunit.client.1.vm05.stdout:8/410: write d4/d6/db/dc/f17 [1137873,80129] 0 2026-03-09T16:15:15.090 INFO:tasks.workunit.client.1.vm05.stdout:7/506: dwrite d1/d2/d8/dc/f45 [0,4194304] 0 2026-03-09T16:15:15.091 INFO:tasks.workunit.client.1.vm05.stdout:4/431: chown d5/l8b 436786533 1 2026-03-09T16:15:15.091 INFO:tasks.workunit.client.1.vm05.stdout:7/507: stat d1/d2/d8/dc/d1b/d30/d4b/d65/f8f 0 2026-03-09T16:15:15.091 INFO:tasks.workunit.client.1.vm05.stdout:8/411: write d4/d6/f58 [2532715,5968] 0 2026-03-09T16:15:15.093 INFO:tasks.workunit.client.1.vm05.stdout:7/508: stat d1/d2/c23 0 2026-03-09T16:15:15.100 INFO:tasks.workunit.client.1.vm05.stdout:8/412: read - d4/d6/d3a/d3c/f8d zero size 2026-03-09T16:15:15.100 INFO:tasks.workunit.client.1.vm05.stdout:0/445: creat d5/d2c/d49/d83/d8b/f8f x:0 0 0 2026-03-09T16:15:15.102 INFO:tasks.workunit.client.1.vm05.stdout:3/370: rmdir d0/d9 39 2026-03-09T16:15:15.116 INFO:tasks.workunit.client.1.vm05.stdout:5/434: write d8/d18/d1b/d2e/f9d [2360127,103257] 0 2026-03-09T16:15:15.120 INFO:tasks.workunit.client.1.vm05.stdout:9/500: write d4/d10/d35/d2b/f2c [4743170,111397] 0 2026-03-09T16:15:15.120 INFO:tasks.workunit.client.1.vm05.stdout:7/509: dread d1/d2/d8/d31/d8d/f56 [0,4194304] 0 2026-03-09T16:15:15.127 INFO:tasks.workunit.client.1.vm05.stdout:0/446: dwrite d5/d2c/d49/f80 [0,4194304] 0 2026-03-09T16:15:15.133 INFO:tasks.workunit.client.1.vm05.stdout:3/371: dwrite d0/f5a [0,4194304] 0 2026-03-09T16:15:15.134 INFO:tasks.workunit.client.1.vm05.stdout:4/432: mknod d5/de/d15/d21/d39/d5d/d8c/c96 0 2026-03-09T16:15:15.139 INFO:tasks.workunit.client.1.vm05.stdout:5/435: dwrite d8/d18/d1b/d2e/f8c [0,4194304] 0 2026-03-09T16:15:15.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:14 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:15.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:14 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:15.148 INFO:tasks.workunit.client.1.vm05.stdout:1/478: write d7/f34 [3159521,10898] 0 2026-03-09T16:15:15.150 INFO:tasks.workunit.client.1.vm05.stdout:2/444: dwrite db/dd/d15/d46/f4e [4194304,4194304] 0 2026-03-09T16:15:15.154 INFO:tasks.workunit.client.1.vm05.stdout:6/444: dwrite d17/f1c [0,4194304] 0 2026-03-09T16:15:15.158 
INFO:tasks.workunit.client.1.vm05.stdout:9/501: creat d4/d10/d35/d36/d48/d60/fad x:0 0 0 2026-03-09T16:15:15.159 INFO:tasks.workunit.client.1.vm05.stdout:1/479: write d7/f3f [873876,23251] 0 2026-03-09T16:15:15.161 INFO:tasks.workunit.client.1.vm05.stdout:9/502: chown d4/d10/d35/c13 5456122 1 2026-03-09T16:15:15.162 INFO:tasks.workunit.client.1.vm05.stdout:8/413: symlink d4/d6/d3a/d40/d6a/l8f 0 2026-03-09T16:15:15.166 INFO:tasks.workunit.client.1.vm05.stdout:8/414: fsync d4/d6/d3a/f88 0 2026-03-09T16:15:15.166 INFO:tasks.workunit.client.1.vm05.stdout:8/415: readlink d4/d6/db/dc/d5d/l78 0 2026-03-09T16:15:15.168 INFO:tasks.workunit.client.1.vm05.stdout:8/416: write d4/d6/db/f5e [107686,69526] 0 2026-03-09T16:15:15.170 INFO:tasks.workunit.client.1.vm05.stdout:5/436: symlink d8/d59/la1 0 2026-03-09T16:15:15.174 INFO:tasks.workunit.client.1.vm05.stdout:6/445: dwrite d17/d1d/f1e [0,4194304] 0 2026-03-09T16:15:15.185 INFO:tasks.workunit.client.1.vm05.stdout:9/503: dwrite d4/d10/d35/d2b/f2f [0,4194304] 0 2026-03-09T16:15:15.189 INFO:tasks.workunit.client.1.vm05.stdout:2/445: dread db/f17 [0,4194304] 0 2026-03-09T16:15:15.189 INFO:tasks.workunit.client.1.vm05.stdout:2/446: write db/dd/d15/f70 [1554442,40515] 0 2026-03-09T16:15:15.193 INFO:tasks.workunit.client.1.vm05.stdout:3/372: creat d0/d9/d22/d4c/f78 x:0 0 0 2026-03-09T16:15:15.193 INFO:tasks.workunit.client.1.vm05.stdout:2/447: fdatasync db/dd/d15/d46/f91 0 2026-03-09T16:15:15.194 INFO:tasks.workunit.client.1.vm05.stdout:6/446: creat d17/d22/d27/d8a/fa7 x:0 0 0 2026-03-09T16:15:15.194 INFO:tasks.workunit.client.1.vm05.stdout:1/480: mkdir d7/daa 0 2026-03-09T16:15:15.199 INFO:tasks.workunit.client.1.vm05.stdout:0/447: creat d5/d11/f90 x:0 0 0 2026-03-09T16:15:15.199 INFO:tasks.workunit.client.1.vm05.stdout:3/373: mknod d0/d33/c79 0 2026-03-09T16:15:15.200 INFO:tasks.workunit.client.1.vm05.stdout:2/448: rmdir db 39 2026-03-09T16:15:15.201 INFO:tasks.workunit.client.1.vm05.stdout:0/448: creat d5/db/d48/d66/f91 x:0 0 0 2026-03-09T16:15:15.202 INFO:tasks.workunit.client.1.vm05.stdout:5/437: getdents d8/d59/d5b/d8b 0 2026-03-09T16:15:15.203 INFO:tasks.workunit.client.1.vm05.stdout:2/449: chown db/dd/d15/d3f/f4a 9593 1 2026-03-09T16:15:15.204 INFO:tasks.workunit.client.1.vm05.stdout:1/481: rename d7/d15/d45/f70 to d7/dd/d21/d39/d5a/d50/fab 0 2026-03-09T16:15:15.206 INFO:tasks.workunit.client.1.vm05.stdout:1/482: write d7/d62/d72/f9f [732718,120991] 0 2026-03-09T16:15:15.206 INFO:tasks.workunit.client.1.vm05.stdout:5/438: creat d8/d18/d1b/d47/d48/fa2 x:0 0 0 2026-03-09T16:15:15.207 INFO:tasks.workunit.client.1.vm05.stdout:2/450: rename db/dd/d15/d3f/d55 to db/dd/d15/d3f/d5b/d60/d95 0 2026-03-09T16:15:15.222 INFO:tasks.workunit.client.1.vm05.stdout:5/439: mknod d8/d3d/ca3 0 2026-03-09T16:15:15.222 INFO:tasks.workunit.client.1.vm05.stdout:1/483: unlink d7/dd/d21/d3b/d55/d95/la9 0 2026-03-09T16:15:15.222 INFO:tasks.workunit.client.1.vm05.stdout:6/447: dwrite d17/d22/d27/d34/d42/d53/f90 [0,4194304] 0 2026-03-09T16:15:15.223 INFO:tasks.workunit.client.1.vm05.stdout:1/484: fsync d7/d15/f22 0 2026-03-09T16:15:15.223 INFO:tasks.workunit.client.1.vm05.stdout:2/451: rename db/dd/d15/d1f/d21/c7f to db/dd/d15/d3f/c96 0 2026-03-09T16:15:15.223 INFO:tasks.workunit.client.1.vm05.stdout:5/440: mknod d8/d95/ca4 0 2026-03-09T16:15:15.223 INFO:tasks.workunit.client.1.vm05.stdout:6/448: creat d17/d1d/fa8 x:0 0 0 2026-03-09T16:15:15.223 INFO:tasks.workunit.client.1.vm05.stdout:0/449: getdents d5/d11/d4f/d68 0 2026-03-09T16:15:15.223 
INFO:tasks.workunit.client.1.vm05.stdout:2/452: readlink db/dd/l6c 0 2026-03-09T16:15:15.223 INFO:tasks.workunit.client.1.vm05.stdout:6/449: fsync d17/d5d/f84 0 2026-03-09T16:15:15.223 INFO:tasks.workunit.client.1.vm05.stdout:0/450: creat d5/d1b/d3b/f92 x:0 0 0 2026-03-09T16:15:15.223 INFO:tasks.workunit.client.1.vm05.stdout:2/453: dread - db/dd/d15/d3f/d5b/d60/d6a/f8a zero size 2026-03-09T16:15:15.223 INFO:tasks.workunit.client.1.vm05.stdout:5/441: dwrite d8/d18/d1b/d2e/f52 [0,4194304] 0 2026-03-09T16:15:15.227 INFO:tasks.workunit.client.1.vm05.stdout:2/454: creat db/dd/d15/d3f/d5b/f97 x:0 0 0 2026-03-09T16:15:15.228 INFO:tasks.workunit.client.1.vm05.stdout:5/442: write d8/d1d/f44 [1246432,68733] 0 2026-03-09T16:15:15.228 INFO:tasks.workunit.client.1.vm05.stdout:1/485: link d7/dd/d21/d39/d5a/l94 d7/dd/d21/d39/d48/lac 0 2026-03-09T16:15:15.228 INFO:tasks.workunit.client.1.vm05.stdout:0/451: write d5/d2c/f28 [8590748,32224] 0 2026-03-09T16:15:15.228 INFO:tasks.workunit.client.1.vm05.stdout:2/455: dread - db/f2d zero size 2026-03-09T16:15:15.231 INFO:tasks.workunit.client.1.vm05.stdout:5/443: chown d8/d18/d1b/d2e/f9d 198 1 2026-03-09T16:15:15.233 INFO:tasks.workunit.client.1.vm05.stdout:0/452: dread d5/d1b/d3b/f6f [0,4194304] 0 2026-03-09T16:15:15.233 INFO:tasks.workunit.client.1.vm05.stdout:6/450: mkdir d17/d22/d9d/da9 0 2026-03-09T16:15:15.237 INFO:tasks.workunit.client.1.vm05.stdout:0/453: chown d5/d11/d4f/d68/f6b 11217119 1 2026-03-09T16:15:15.237 INFO:tasks.workunit.client.1.vm05.stdout:1/486: rename d7/dd/de/c7a to d7/dd/de/d52/cad 0 2026-03-09T16:15:15.238 INFO:tasks.workunit.client.1.vm05.stdout:6/451: symlink d17/d22/d27/d34/d4b/laa 0 2026-03-09T16:15:15.241 INFO:tasks.workunit.client.1.vm05.stdout:1/487: symlink d7/d15/d16/lae 0 2026-03-09T16:15:15.244 INFO:tasks.workunit.client.1.vm05.stdout:6/452: symlink d17/d22/d27/d8a/d8b/lab 0 2026-03-09T16:15:15.244 INFO:tasks.workunit.client.1.vm05.stdout:6/453: chown d17/d22/d27/d44 47360836 1 2026-03-09T16:15:15.244 INFO:tasks.workunit.client.1.vm05.stdout:5/444: creat d8/d18/d1b/d2e/fa5 x:0 0 0 2026-03-09T16:15:15.244 INFO:tasks.workunit.client.1.vm05.stdout:4/433: write d5/d19/f1f [4034255,117794] 0 2026-03-09T16:15:15.256 INFO:tasks.workunit.client.1.vm05.stdout:7/510: dwrite d1/d2/d8/dc/d33/f9f [0,4194304] 0 2026-03-09T16:15:15.256 INFO:tasks.workunit.client.1.vm05.stdout:7/511: chown d1/d2/d8/d31/d8d/l3f 0 1 2026-03-09T16:15:15.266 INFO:tasks.workunit.client.1.vm05.stdout:9/504: sync 2026-03-09T16:15:15.267 INFO:tasks.workunit.client.1.vm05.stdout:3/374: sync 2026-03-09T16:15:15.267 INFO:tasks.workunit.client.1.vm05.stdout:4/434: symlink d5/de/d15/d21/d31/l97 0 2026-03-09T16:15:15.268 INFO:tasks.workunit.client.1.vm05.stdout:1/488: symlink d7/dd/d21/d63/d71/laf 0 2026-03-09T16:15:15.271 INFO:tasks.workunit.client.1.vm05.stdout:0/454: dwrite d5/d2c/d49/f7a [0,4194304] 0 2026-03-09T16:15:15.271 INFO:tasks.workunit.client.1.vm05.stdout:2/456: dread db/dd/d15/d1f/d21/f47 [0,4194304] 0 2026-03-09T16:15:15.280 INFO:tasks.workunit.client.1.vm05.stdout:5/445: rename d8/d18/d1b/d47/d4e/d76/d8f/c96 to d8/d59/d5b/d8b/da0/ca6 0 2026-03-09T16:15:15.281 INFO:tasks.workunit.client.1.vm05.stdout:9/505: dread d4/d10/d35/f7c [0,4194304] 0 2026-03-09T16:15:15.283 INFO:tasks.workunit.client.1.vm05.stdout:0/455: readlink d5/d2c/d49/l74 0 2026-03-09T16:15:15.283 INFO:tasks.workunit.client.1.vm05.stdout:9/506: stat d4/d10/d35/d2b/d31/d96/f9b 0 2026-03-09T16:15:15.285 INFO:tasks.workunit.client.1.vm05.stdout:8/417: dwrite d4/f13 [0,4194304] 0 
2026-03-09T16:15:15.285 INFO:tasks.workunit.client.1.vm05.stdout:6/454: link d17/d4f/l6f d17/d22/d27/d34/d42/d53/d87/lac 0 2026-03-09T16:15:15.285 INFO:tasks.workunit.client.1.vm05.stdout:1/489: rmdir d7/dd/d21/d63 39 2026-03-09T16:15:15.286 INFO:tasks.workunit.client.1.vm05.stdout:1/490: chown d7/dd/d21/d2d/c68 739119 1 2026-03-09T16:15:15.287 INFO:tasks.workunit.client.1.vm05.stdout:4/435: creat d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/f98 x:0 0 0 2026-03-09T16:15:15.288 INFO:tasks.workunit.client.1.vm05.stdout:8/418: write d4/d6/d3a/d40/f76 [808292,48931] 0 2026-03-09T16:15:15.288 INFO:tasks.workunit.client.1.vm05.stdout:2/457: mkdir db/dd/d98 0 2026-03-09T16:15:15.288 INFO:tasks.workunit.client.1.vm05.stdout:9/507: mkdir d4/d10/d35/d36/d48/d60/dae 0 2026-03-09T16:15:15.291 INFO:tasks.workunit.client.1.vm05.stdout:5/446: rename d8/d18/d1b/d2e/l39 to d8/d18/d1b/d47/d4e/la7 0 2026-03-09T16:15:15.292 INFO:tasks.workunit.client.1.vm05.stdout:9/508: symlink d4/d10/laf 0 2026-03-09T16:15:15.292 INFO:tasks.workunit.client.1.vm05.stdout:4/436: dread - d5/de/d15/d21/d31/f58 zero size 2026-03-09T16:15:15.292 INFO:tasks.workunit.client.1.vm05.stdout:5/447: write d8/d3d/f94 [443036,44787] 0 2026-03-09T16:15:15.292 INFO:tasks.workunit.client.1.vm05.stdout:6/455: symlink d17/d22/d9d/lad 0 2026-03-09T16:15:15.296 INFO:tasks.workunit.client.1.vm05.stdout:8/419: symlink d4/d6/db/df/l90 0 2026-03-09T16:15:15.297 INFO:tasks.workunit.client.1.vm05.stdout:4/437: fsync d5/de/d15/d21/d31/f58 0 2026-03-09T16:15:15.298 INFO:tasks.workunit.client.1.vm05.stdout:5/448: mknod d8/d53/ca8 0 2026-03-09T16:15:15.299 INFO:tasks.workunit.client.1.vm05.stdout:3/375: dread d0/d9/f2f [0,4194304] 0 2026-03-09T16:15:15.304 INFO:tasks.workunit.client.1.vm05.stdout:8/420: chown d4/d6/d3a/l4b 8193992 1 2026-03-09T16:15:15.305 INFO:tasks.workunit.client.1.vm05.stdout:0/456: getdents d5/db/d48/d66 0 2026-03-09T16:15:15.305 INFO:tasks.workunit.client.1.vm05.stdout:4/438: creat d5/de/d2f/f99 x:0 0 0 2026-03-09T16:15:15.305 INFO:tasks.workunit.client.1.vm05.stdout:8/421: write d4/d6/db/dc/f2a [2589486,19491] 0 2026-03-09T16:15:15.305 INFO:tasks.workunit.client.1.vm05.stdout:0/457: mknod d5/db/d5b/c93 0 2026-03-09T16:15:15.305 INFO:tasks.workunit.client.1.vm05.stdout:5/449: write d8/d18/d1b/d47/d4e/f64 [522138,41376] 0 2026-03-09T16:15:15.307 INFO:tasks.workunit.client.1.vm05.stdout:2/458: dwrite db/dd/d15/d1f/f25 [4194304,4194304] 0 2026-03-09T16:15:15.307 INFO:tasks.workunit.client.1.vm05.stdout:6/456: creat d17/d5d/fae x:0 0 0 2026-03-09T16:15:15.309 INFO:tasks.workunit.client.1.vm05.stdout:4/439: mknod d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/c9a 0 2026-03-09T16:15:15.309 INFO:tasks.workunit.client.1.vm05.stdout:8/422: dread - d4/d6/d53/f5a zero size 2026-03-09T16:15:15.317 INFO:tasks.workunit.client.1.vm05.stdout:5/450: symlink d8/d53/d7e/la9 0 2026-03-09T16:15:15.317 INFO:tasks.workunit.client.1.vm05.stdout:4/440: chown d5/l8b 11855 1 2026-03-09T16:15:15.317 INFO:tasks.workunit.client.1.vm05.stdout:2/459: fdatasync db/dd/d15/f48 0 2026-03-09T16:15:15.319 INFO:tasks.workunit.client.1.vm05.stdout:9/509: dwrite d4/d10/d35/d2b/d38/fa6 [0,4194304] 0 2026-03-09T16:15:15.323 INFO:tasks.workunit.client.1.vm05.stdout:5/451: write d8/d18/d1b/f36 [1583911,115495] 0 2026-03-09T16:15:15.328 INFO:tasks.workunit.client.1.vm05.stdout:9/510: dread - d4/d10/d35/d2b/d31/f99 zero size 2026-03-09T16:15:15.328 INFO:tasks.workunit.client.1.vm05.stdout:2/460: readlink db/dd/d15/d4c/l85 0 2026-03-09T16:15:15.328 INFO:tasks.workunit.client.1.vm05.stdout:8/423: unlink 
d4/d6/d3a/f8b 0 2026-03-09T16:15:15.333 INFO:tasks.workunit.client.1.vm05.stdout:6/457: mknod d17/d22/d9d/da9/caf 0 2026-03-09T16:15:15.335 INFO:tasks.workunit.client.1.vm05.stdout:6/458: write d17/d22/d27/d58/f72 [825939,92313] 0 2026-03-09T16:15:15.335 INFO:tasks.workunit.client.1.vm05.stdout:6/459: chown d17/d22/d27/d34 617 1 2026-03-09T16:15:15.338 INFO:tasks.workunit.client.1.vm05.stdout:7/512: write d1/d2/d8/dc/d1b/d30/d4b/d65/f63 [482206,21132] 0 2026-03-09T16:15:15.339 INFO:tasks.workunit.client.1.vm05.stdout:7/513: chown d1/d2/d8/dc/d1b/d71/d3c/f6e 3 1 2026-03-09T16:15:15.340 INFO:tasks.workunit.client.1.vm05.stdout:9/511: mkdir d4/d10/d35/d36/d48/d54/db0 0 2026-03-09T16:15:15.341 INFO:tasks.workunit.client.1.vm05.stdout:4/441: mknod d5/de/d15/d21/d39/d5d/c9b 0 2026-03-09T16:15:15.343 INFO:tasks.workunit.client.1.vm05.stdout:0/458: link d5/d2c/f41 d5/d11/d4f/d68/f94 0 2026-03-09T16:15:15.343 INFO:tasks.workunit.client.1.vm05.stdout:6/460: symlink d17/d22/d27/d8a/lb0 0 2026-03-09T16:15:15.347 INFO:tasks.workunit.client.1.vm05.stdout:2/461: rename db/dd/d15/f81 to db/dd/d15/d1f/d21/d87/f99 0 2026-03-09T16:15:15.349 INFO:tasks.workunit.client.1.vm05.stdout:7/514: symlink d1/d2/d8/dc/d72/da8/lad 0 2026-03-09T16:15:15.351 INFO:tasks.workunit.client.1.vm05.stdout:6/461: creat d17/d22/d27/d58/fb1 x:0 0 0 2026-03-09T16:15:15.352 INFO:tasks.workunit.client.1.vm05.stdout:4/442: unlink d5/de/d15/f38 0 2026-03-09T16:15:15.357 INFO:tasks.workunit.client.1.vm05.stdout:1/491: dwrite d7/f4b [0,4194304] 0 2026-03-09T16:15:15.365 INFO:tasks.workunit.client.1.vm05.stdout:2/462: creat db/dd/d15/d46/d67/f9a x:0 0 0 2026-03-09T16:15:15.365 INFO:tasks.workunit.client.1.vm05.stdout:6/462: unlink d17/l51 0 2026-03-09T16:15:15.366 INFO:tasks.workunit.client.1.vm05.stdout:3/376: truncate d0/f49 131792 0 2026-03-09T16:15:15.367 INFO:tasks.workunit.client.1.vm05.stdout:8/424: rename d4/f10 to d4/d6/db/dc/d5d/d79/f91 0 2026-03-09T16:15:15.368 INFO:tasks.workunit.client.1.vm05.stdout:7/515: dwrite d1/d2/d8/dc/d1b/d71/d3c/f95 [0,4194304] 0 2026-03-09T16:15:15.370 INFO:tasks.workunit.client.1.vm05.stdout:8/425: read d4/d6/f5f [2277933,67336] 0 2026-03-09T16:15:15.372 INFO:tasks.workunit.client.1.vm05.stdout:7/516: readlink d1/d2/d8/dc/d72/da8/lad 0 2026-03-09T16:15:15.376 INFO:tasks.workunit.client.1.vm05.stdout:4/443: truncate d5/de/d15/d21/d27/f8f 831755 0 2026-03-09T16:15:15.384 INFO:tasks.workunit.client.1.vm05.stdout:6/463: chown d17/d5d/d73/d83 1197 1 2026-03-09T16:15:15.384 INFO:tasks.workunit.client.1.vm05.stdout:2/463: truncate db/dd/d15/d1f/d20/f3d 1077961 0 2026-03-09T16:15:15.384 INFO:tasks.workunit.client.1.vm05.stdout:9/512: rmdir d4 39 2026-03-09T16:15:15.384 INFO:tasks.workunit.client.1.vm05.stdout:7/517: symlink d1/d2/d8/dc/d33/lae 0 2026-03-09T16:15:15.384 INFO:tasks.workunit.client.1.vm05.stdout:7/518: truncate d1/d2/d8/dc/d1b/d30/d5e/f81 531843 0 2026-03-09T16:15:15.385 INFO:tasks.workunit.client.1.vm05.stdout:2/464: stat c8 0 2026-03-09T16:15:15.386 INFO:tasks.workunit.client.1.vm05.stdout:2/465: chown db/dd/d15/d3f/d5b/f97 237 1 2026-03-09T16:15:15.386 INFO:tasks.workunit.client.1.vm05.stdout:4/444: rmdir d5/de/d15 39 2026-03-09T16:15:15.387 INFO:tasks.workunit.client.1.vm05.stdout:2/466: chown db/dd/d15/d1f/d21/c64 2 1 2026-03-09T16:15:15.388 INFO:tasks.workunit.client.1.vm05.stdout:6/464: rename d17/d22/d27/f3c to d17/d22/d9d/fb2 0 2026-03-09T16:15:15.393 INFO:tasks.workunit.client.1.vm05.stdout:4/445: mkdir d5/d9c 0 2026-03-09T16:15:15.395 INFO:tasks.workunit.client.1.vm05.stdout:2/467: 
creat db/dd/d15/d1f/d20/d23/f9b x:0 0 0 2026-03-09T16:15:15.401 INFO:tasks.workunit.client.1.vm05.stdout:6/465: unlink d17/d22/f3d 0 2026-03-09T16:15:15.401 INFO:tasks.workunit.client.1.vm05.stdout:8/426: rename d4/d55 to d4/d92 0 2026-03-09T16:15:15.401 INFO:tasks.workunit.client.1.vm05.stdout:7/519: dwrite d1/d2/d8/dc/d33/f57 [0,4194304] 0 2026-03-09T16:15:15.401 INFO:tasks.workunit.client.1.vm05.stdout:9/513: rmdir d4/d10/d35/d36/d48/d4c 39 2026-03-09T16:15:15.401 INFO:tasks.workunit.client.1.vm05.stdout:2/468: dread db/dd/d15/d3f/d5b/d60/f7c [0,4194304] 0 2026-03-09T16:15:15.401 INFO:tasks.workunit.client.1.vm05.stdout:3/377: link d0/d9/d22/d4c/l53 d0/d9/d22/d4c/l7a 0 2026-03-09T16:15:15.407 INFO:tasks.workunit.client.1.vm05.stdout:1/492: sync 2026-03-09T16:15:15.411 INFO:tasks.workunit.client.1.vm05.stdout:9/514: chown d4/d10/d35/d2b/d38/f5e 0 1 2026-03-09T16:15:15.414 INFO:tasks.workunit.client.1.vm05.stdout:8/427: creat d4/d6/d3a/d15/f93 x:0 0 0 2026-03-09T16:15:15.425 INFO:tasks.workunit.client.1.vm05.stdout:6/466: write d17/d22/d9d/fb2 [64174,89790] 0 2026-03-09T16:15:15.425 INFO:tasks.workunit.client.1.vm05.stdout:3/378: mkdir d0/d9/d22/d5f/d7b 0 2026-03-09T16:15:15.425 INFO:tasks.workunit.client.1.vm05.stdout:9/515: mknod d4/d10/d35/d36/d48/d60/cb1 0 2026-03-09T16:15:15.425 INFO:tasks.workunit.client.1.vm05.stdout:4/446: creat d5/de/f9d x:0 0 0 2026-03-09T16:15:15.425 INFO:tasks.workunit.client.1.vm05.stdout:7/520: dwrite d1/d2/d8/dc/d1b/d71/d3c/f6e [0,4194304] 0 2026-03-09T16:15:15.425 INFO:tasks.workunit.client.1.vm05.stdout:0/459: dwrite d5/d2c/d49/f5d [0,4194304] 0 2026-03-09T16:15:15.427 INFO:tasks.workunit.client.1.vm05.stdout:9/516: truncate d4/d10/f8d 817497 0 2026-03-09T16:15:15.428 INFO:tasks.workunit.client.1.vm05.stdout:8/428: sync 2026-03-09T16:15:15.433 INFO:tasks.workunit.client.1.vm05.stdout:0/460: readlink d5/db/d48/d66/l67 0 2026-03-09T16:15:15.438 INFO:tasks.workunit.client.1.vm05.stdout:6/467: dwrite d17/d22/d27/d34/d42/d65/f75 [0,4194304] 0 2026-03-09T16:15:15.441 INFO:tasks.workunit.client.1.vm05.stdout:8/429: dwrite d4/d6/f44 [0,4194304] 0 2026-03-09T16:15:15.453 INFO:tasks.workunit.client.1.vm05.stdout:2/469: rename db/dd/d15/d46/f88 to db/dd/d15/d1f/f9c 0 2026-03-09T16:15:15.461 INFO:tasks.workunit.client.1.vm05.stdout:2/470: chown db/dd/d15/d3f 530815 1 2026-03-09T16:15:15.461 INFO:tasks.workunit.client.1.vm05.stdout:1/493: creat d7/fb0 x:0 0 0 2026-03-09T16:15:15.461 INFO:tasks.workunit.client.1.vm05.stdout:1/494: chown d7/f4b 5385 1 2026-03-09T16:15:15.461 INFO:tasks.workunit.client.1.vm05.stdout:7/521: symlink d1/d2/d8/dc/d72/laf 0 2026-03-09T16:15:15.461 INFO:tasks.workunit.client.1.vm05.stdout:1/495: chown d7/dd/d21/d3b/f65 0 1 2026-03-09T16:15:15.466 INFO:tasks.workunit.client.1.vm05.stdout:6/468: mknod d17/d4f/cb3 0 2026-03-09T16:15:15.468 INFO:tasks.workunit.client.1.vm05.stdout:8/430: symlink d4/d6/db/dc/d5d/l94 0 2026-03-09T16:15:15.468 INFO:tasks.workunit.client.1.vm05.stdout:0/461: rename d5/db/d1d to d5/d2c/d49/d83/d8b/d95 0 2026-03-09T16:15:15.469 INFO:tasks.workunit.client.1.vm05.stdout:8/431: chown d4/d6/d3a/d3c/l68 483 1 2026-03-09T16:15:15.470 INFO:tasks.workunit.client.1.vm05.stdout:2/471: mknod db/dd/d15/d1f/d20/d23/d78/c9d 0 2026-03-09T16:15:15.471 INFO:tasks.workunit.client.1.vm05.stdout:7/522: unlink d1/c28 0 2026-03-09T16:15:15.472 INFO:tasks.workunit.client.1.vm05.stdout:3/379: dwrite d0/d9/f37 [0,4194304] 0 2026-03-09T16:15:15.474 INFO:tasks.workunit.client.1.vm05.stdout:1/496: chown d7/dd/d21/d39/d48/lac 236 1 
2026-03-09T16:15:15.474 INFO:tasks.workunit.client.1.vm05.stdout:3/380: stat d0/d9/l25 0 2026-03-09T16:15:15.474 INFO:tasks.workunit.client.1.vm05.stdout:1/497: stat d7/dd/de/d52/l6a 0 2026-03-09T16:15:15.477 INFO:tasks.workunit.client.1.vm05.stdout:3/381: dwrite d0/d9/f4b [4194304,4194304] 0 2026-03-09T16:15:15.491 INFO:tasks.workunit.client.1.vm05.stdout:6/469: mkdir d17/d22/d9d/db4 0 2026-03-09T16:15:15.491 INFO:tasks.workunit.client.1.vm05.stdout:9/517: rename d4/d10/l29 to d4/d10/d35/lb2 0 2026-03-09T16:15:15.492 INFO:tasks.workunit.client.1.vm05.stdout:8/432: write d4/f3e [171713,69397] 0 2026-03-09T16:15:15.492 INFO:tasks.workunit.client.1.vm05.stdout:2/472: rmdir db/dd/d15/d4c 39 2026-03-09T16:15:15.493 INFO:tasks.workunit.client.1.vm05.stdout:2/473: fdatasync db/dd/d15/d1f/f25 0 2026-03-09T16:15:15.494 INFO:tasks.workunit.client.1.vm05.stdout:4/447: link d5/d19/f32 d5/de/d82/f9e 0 2026-03-09T16:15:15.505 INFO:tasks.workunit.client.1.vm05.stdout:8/433: stat d4/c16 0 2026-03-09T16:15:15.505 INFO:tasks.workunit.client.1.vm05.stdout:2/474: dwrite db/dd/d15/d1f/f25 [4194304,4194304] 0 2026-03-09T16:15:15.511 INFO:tasks.workunit.client.1.vm05.stdout:7/523: creat d1/d2/d11/d86/da2/fb0 x:0 0 0 2026-03-09T16:15:15.512 INFO:tasks.workunit.client.1.vm05.stdout:7/524: dread - d1/d2/d8/d31/d8d/f6f zero size 2026-03-09T16:15:15.514 INFO:tasks.workunit.client.1.vm05.stdout:4/448: dread - d5/de/d15/d21/d31/f72 zero size 2026-03-09T16:15:15.514 INFO:tasks.workunit.client.1.vm05.stdout:7/525: write d1/d2/d8/dc/d1b/d30/f93 [410510,11705] 0 2026-03-09T16:15:15.516 INFO:tasks.workunit.client.1.vm05.stdout:8/434: fdatasync d4/f77 0 2026-03-09T16:15:15.519 INFO:tasks.workunit.client.1.vm05.stdout:8/435: chown d4/d6/db/dc/f30 10992588 1 2026-03-09T16:15:15.519 INFO:tasks.workunit.client.1.vm05.stdout:9/518: creat d4/d10/d35/d36/fb3 x:0 0 0 2026-03-09T16:15:15.522 INFO:tasks.workunit.client.1.vm05.stdout:7/526: mkdir d1/d2/d8/dc/d1b/d30/d4b/d65/db1 0 2026-03-09T16:15:15.523 INFO:tasks.workunit.client.1.vm05.stdout:4/449: truncate d5/de/d15/f25 3994845 0 2026-03-09T16:15:15.524 INFO:tasks.workunit.client.1.vm05.stdout:2/475: dwrite f7 [4194304,4194304] 0 2026-03-09T16:15:15.524 INFO:tasks.workunit.client.1.vm05.stdout:9/519: stat d4/d10/d35/d36/d48/d4c/c83 0 2026-03-09T16:15:15.524 INFO:tasks.workunit.client.1.vm05.stdout:4/450: chown d5/de 206 1 2026-03-09T16:15:15.537 INFO:tasks.workunit.client.1.vm05.stdout:7/527: sync 2026-03-09T16:15:15.537 INFO:tasks.workunit.client.1.vm05.stdout:8/436: mkdir d4/d6/db/d75/d84/d95 0 2026-03-09T16:15:15.538 INFO:tasks.workunit.client.1.vm05.stdout:1/498: read d7/d27/f33 [1952653,13304] 0 2026-03-09T16:15:15.539 INFO:tasks.workunit.client.1.vm05.stdout:4/451: rename d5/de/d2f/f8e to d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d7b/f9f 0 2026-03-09T16:15:15.541 INFO:tasks.workunit.client.1.vm05.stdout:4/452: write d5/d19/d37/f51 [238945,43222] 0 2026-03-09T16:15:15.542 INFO:tasks.workunit.client.1.vm05.stdout:8/437: mknod d4/d6/db/dc/d5d/c96 0 2026-03-09T16:15:15.542 INFO:tasks.workunit.client.1.vm05.stdout:7/528: fsync d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/f88 0 2026-03-09T16:15:15.544 INFO:tasks.workunit.client.1.vm05.stdout:2/476: symlink db/dd/d15/d4c/d56/l9e 0 2026-03-09T16:15:15.544 INFO:tasks.workunit.client.1.vm05.stdout:9/520: rmdir d4/d10/d35/d36/d48/d54/da3 0 2026-03-09T16:15:15.545 INFO:tasks.workunit.client.1.vm05.stdout:8/438: rmdir d4/d6/db/df 39 2026-03-09T16:15:15.547 INFO:tasks.workunit.client.1.vm05.stdout:9/521: unlink d4/d10/d35/d2b/d31/d82/f88 0 
2026-03-09T16:15:15.558 INFO:tasks.workunit.client.1.vm05.stdout:1/499: dwrite d7/fc [4194304,4194304] 0 2026-03-09T16:15:15.558 INFO:tasks.workunit.client.1.vm05.stdout:7/529: mkdir d1/d2/d8/dc/d1b/d30/d4b/db2 0 2026-03-09T16:15:15.558 INFO:tasks.workunit.client.1.vm05.stdout:8/439: mkdir d4/d6/d3a/d40/d6a/d97 0 2026-03-09T16:15:15.558 INFO:tasks.workunit.client.1.vm05.stdout:2/477: chown db/dd/l94 89901 1 2026-03-09T16:15:15.558 INFO:tasks.workunit.client.1.vm05.stdout:4/453: getdents d5/de/d2f/d8a 0 2026-03-09T16:15:15.558 INFO:tasks.workunit.client.1.vm05.stdout:1/500: creat d7/dd/d21/d3b/d55/fb1 x:0 0 0 2026-03-09T16:15:15.565 INFO:tasks.workunit.client.1.vm05.stdout:8/440: stat d4/d6/d3a/d3c/f3f 0 2026-03-09T16:15:15.570 INFO:tasks.workunit.client.1.vm05.stdout:9/522: rename d4/d10/d35/d36/d48/l90 to d4/d10/d35/d2b/d38/d65/lb4 0 2026-03-09T16:15:15.575 INFO:tasks.workunit.client.1.vm05.stdout:7/530: dwrite d1/d2/d8/dc/d1b/d71/d3c/f95 [0,4194304] 0 2026-03-09T16:15:15.586 INFO:tasks.workunit.client.1.vm05.stdout:1/501: dread d7/d27/f33 [0,4194304] 0 2026-03-09T16:15:15.586 INFO:tasks.workunit.client.1.vm05.stdout:8/441: rmdir d4/d6/db/d75/d84/d95 0 2026-03-09T16:15:15.586 INFO:tasks.workunit.client.1.vm05.stdout:9/523: fdatasync d4/d10/d35/d36/d48/d60/f6c 0 2026-03-09T16:15:15.586 INFO:tasks.workunit.client.1.vm05.stdout:5/452: dread d8/d18/d1b/f32 [4194304,4194304] 0 2026-03-09T16:15:15.590 INFO:tasks.workunit.client.1.vm05.stdout:9/524: fsync d4/d10/d35/d2b/fa1 0 2026-03-09T16:15:15.591 INFO:tasks.workunit.client.1.vm05.stdout:2/478: dread db/dd/d15/d4c/d56/f62 [0,4194304] 0 2026-03-09T16:15:15.594 INFO:tasks.workunit.client.1.vm05.stdout:4/454: dread d5/d19/f48 [0,4194304] 0 2026-03-09T16:15:15.594 INFO:tasks.workunit.client.1.vm05.stdout:9/525: rmdir d4 39 2026-03-09T16:15:15.603 INFO:tasks.workunit.client.1.vm05.stdout:2/479: creat db/dd/d15/d3f/d5b/f9f x:0 0 0 2026-03-09T16:15:15.604 INFO:tasks.workunit.client.1.vm05.stdout:7/531: creat d1/d2/d8/dc/d1b/d30/d4b/d65/db1/fb3 x:0 0 0 2026-03-09T16:15:15.604 INFO:tasks.workunit.client.1.vm05.stdout:1/502: dread d7/dd/de/f3e [0,4194304] 0 2026-03-09T16:15:15.604 INFO:tasks.workunit.client.1.vm05.stdout:1/503: readlink d7/d15/l1a 0 2026-03-09T16:15:15.604 INFO:tasks.workunit.client.1.vm05.stdout:9/526: truncate d4/d10/d35/d36/fb3 571933 0 2026-03-09T16:15:15.604 INFO:tasks.workunit.client.1.vm05.stdout:7/532: rmdir d1/d2/d8/dc/d1b/d71 39 2026-03-09T16:15:15.604 INFO:tasks.workunit.client.1.vm05.stdout:4/455: mkdir d5/de/d15/d21/da0 0 2026-03-09T16:15:15.604 INFO:tasks.workunit.client.1.vm05.stdout:7/533: creat d1/d2/d8/dc/d33/fb4 x:0 0 0 2026-03-09T16:15:15.604 INFO:tasks.workunit.client.1.vm05.stdout:4/456: fdatasync d5/f6 0 2026-03-09T16:15:15.604 INFO:tasks.workunit.client.1.vm05.stdout:1/504: mknod d7/daa/cb2 0 2026-03-09T16:15:15.604 INFO:tasks.workunit.client.1.vm05.stdout:7/534: stat d1/d2/d8/d31/d8d/f6f 0 2026-03-09T16:15:15.604 INFO:tasks.workunit.client.1.vm05.stdout:9/527: rename d4/d10/d35/d36/l53 to d4/lb5 0 2026-03-09T16:15:15.606 INFO:tasks.workunit.client.1.vm05.stdout:8/442: sync 2026-03-09T16:15:15.608 INFO:tasks.workunit.client.1.vm05.stdout:8/443: dread - d4/d6/d53/f89 zero size 2026-03-09T16:15:15.621 INFO:tasks.workunit.client.1.vm05.stdout:5/453: dread d8/d18/d1b/f28 [0,4194304] 0 2026-03-09T16:15:15.622 INFO:tasks.workunit.client.1.vm05.stdout:5/454: write f5 [2386386,127327] 0 2026-03-09T16:15:15.627 INFO:tasks.workunit.client.1.vm05.stdout:4/457: creat d5/de/d15/d21/d27/d3c/d5c/fa1 x:0 0 0 
2026-03-09T16:15:15.629 INFO:tasks.workunit.client.1.vm05.stdout:4/458: write d5/f3e [2117067,37859] 0 2026-03-09T16:15:15.629 INFO:tasks.workunit.client.1.vm05.stdout:8/444: dwrite d4/d6/db/dc/d2e/f46 [0,4194304] 0 2026-03-09T16:15:15.642 INFO:tasks.workunit.client.1.vm05.stdout:8/445: mknod d4/d6/db/dc/c98 0 2026-03-09T16:15:15.644 INFO:tasks.workunit.client.1.vm05.stdout:8/446: write d4/d6/f44 [3599534,13709] 0 2026-03-09T16:15:15.645 INFO:tasks.workunit.client.1.vm05.stdout:8/447: truncate d4/d6/d3a/d15/f93 835623 0 2026-03-09T16:15:15.647 INFO:tasks.workunit.client.1.vm05.stdout:4/459: mkdir d5/de/d15/d21/d27/d3c/d5c/da2 0 2026-03-09T16:15:15.647 INFO:tasks.workunit.client.1.vm05.stdout:6/470: mknod d17/d22/d27/d8a/cb5 0 2026-03-09T16:15:15.649 INFO:tasks.workunit.client.1.vm05.stdout:5/455: sync 2026-03-09T16:15:15.651 INFO:tasks.workunit.client.1.vm05.stdout:9/528: dread d4/d10/f80 [0,4194304] 0 2026-03-09T16:15:15.653 INFO:tasks.workunit.client.1.vm05.stdout:8/448: dwrite d4/d6/d3a/f8e [0,4194304] 0 2026-03-09T16:15:15.653 INFO:tasks.workunit.client.1.vm05.stdout:5/456: rmdir d8/d59/d5b 39 2026-03-09T16:15:15.660 INFO:tasks.workunit.client.1.vm05.stdout:1/505: dread d7/dd/de/d52/d5b/f5e [0,4194304] 0 2026-03-09T16:15:15.662 INFO:tasks.workunit.client.1.vm05.stdout:9/529: dread d4/d10/d35/f32 [0,4194304] 0 2026-03-09T16:15:15.669 INFO:tasks.workunit.client.1.vm05.stdout:6/471: chown d17/d22/d27/d34/d42/d65/fa6 178574837 1 2026-03-09T16:15:15.673 INFO:tasks.workunit.client.1.vm05.stdout:0/462: dwrite d5/db/f54 [0,4194304] 0 2026-03-09T16:15:15.675 INFO:tasks.workunit.client.1.vm05.stdout:8/449: stat d4/d6/d3a/c2c 0 2026-03-09T16:15:15.675 INFO:tasks.workunit.client.1.vm05.stdout:3/382: dwrite d0/d9/d22/f2e [0,4194304] 0 2026-03-09T16:15:15.678 INFO:tasks.workunit.client.1.vm05.stdout:8/450: readlink d4/d6/d3a/d15/d83/l64 0 2026-03-09T16:15:15.682 INFO:tasks.workunit.client.1.vm05.stdout:1/506: truncate d7/dd/de/f2e 3148735 0 2026-03-09T16:15:15.683 INFO:tasks.workunit.client.1.vm05.stdout:6/472: dwrite d17/f5b [0,4194304] 0 2026-03-09T16:15:15.684 INFO:tasks.workunit.client.1.vm05.stdout:4/460: read d5/f59 [4045363,51200] 0 2026-03-09T16:15:15.691 INFO:tasks.workunit.client.1.vm05.stdout:6/473: truncate d17/d4f/f8c 68663 0 2026-03-09T16:15:15.710 INFO:tasks.workunit.client.1.vm05.stdout:8/451: rmdir d4/d6/d3a/d40/d71 39 2026-03-09T16:15:15.716 INFO:tasks.workunit.client.1.vm05.stdout:2/480: truncate f7 7543544 0 2026-03-09T16:15:15.716 INFO:tasks.workunit.client.1.vm05.stdout:1/507: mkdir d7/dd/db3 0 2026-03-09T16:15:15.718 INFO:tasks.workunit.client.1.vm05.stdout:4/461: fdatasync d5/f10 0 2026-03-09T16:15:15.718 INFO:tasks.workunit.client.1.vm05.stdout:7/535: write d1/d2/d8/dc/d14/f41 [2505458,39453] 0 2026-03-09T16:15:15.720 INFO:tasks.workunit.client.1.vm05.stdout:2/481: truncate db/dd/d15/d3f/d5b/f9f 239117 0 2026-03-09T16:15:15.723 INFO:tasks.workunit.client.1.vm05.stdout:0/463: mknod d5/db/c96 0 2026-03-09T16:15:15.726 INFO:tasks.workunit.client.1.vm05.stdout:8/452: unlink d4/d6/d3a/l4b 0 2026-03-09T16:15:15.730 INFO:tasks.workunit.client.1.vm05.stdout:8/453: truncate d4/f13 4916991 0 2026-03-09T16:15:15.732 INFO:tasks.workunit.client.1.vm05.stdout:8/454: chown d4/d6/d3a/d40/f4e 398847057 1 2026-03-09T16:15:15.733 INFO:tasks.workunit.client.1.vm05.stdout:8/455: chown d4/d6/f58 184626 1 2026-03-09T16:15:15.737 INFO:tasks.workunit.client.1.vm05.stdout:0/464: dwrite d5/d2c/f7f [0,4194304] 0 2026-03-09T16:15:15.745 INFO:tasks.workunit.client.1.vm05.stdout:2/482: dread 
db/dd/d15/d3f/f4a [0,4194304] 0 2026-03-09T16:15:15.745 INFO:tasks.workunit.client.1.vm05.stdout:3/383: rename d0/d9/f4b to d0/f7c 0 2026-03-09T16:15:15.747 INFO:tasks.workunit.client.1.vm05.stdout:0/465: rmdir d5/d11 39 2026-03-09T16:15:15.748 INFO:tasks.workunit.client.1.vm05.stdout:1/508: dwrite d7/d15/d16/f74 [0,4194304] 0 2026-03-09T16:15:15.752 INFO:tasks.workunit.client.1.vm05.stdout:8/456: rename d4/d6/d3a/d40/c4a to d4/d6/db/dc/d2e/c99 0 2026-03-09T16:15:15.758 INFO:tasks.workunit.client.1.vm05.stdout:5/457: write d8/d18/f3a [3790740,82807] 0 2026-03-09T16:15:15.758 INFO:tasks.workunit.client.1.vm05.stdout:4/462: creat d5/de/d15/fa3 x:0 0 0 2026-03-09T16:15:15.758 INFO:tasks.workunit.client.1.vm05.stdout:3/384: creat d0/d33/f7d x:0 0 0 2026-03-09T16:15:15.758 INFO:tasks.workunit.client.1.vm05.stdout:2/483: unlink db/dd/d15/f51 0 2026-03-09T16:15:15.767 INFO:tasks.workunit.client.1.vm05.stdout:1/509: dwrite d7/dd/de/f32 [0,4194304] 0 2026-03-09T16:15:15.771 INFO:tasks.workunit.client.1.vm05.stdout:5/458: rename d8/d18/d1b/d2e to d8/d3d/daa 0 2026-03-09T16:15:15.771 INFO:tasks.workunit.client.1.vm05.stdout:0/466: unlink d5/db/d5b/d82/l8c 0 2026-03-09T16:15:15.771 INFO:tasks.workunit.client.1.vm05.stdout:7/536: creat d1/d2/d8/dc/d33/fb5 x:0 0 0 2026-03-09T16:15:15.771 INFO:tasks.workunit.client.1.vm05.stdout:6/474: link d17/d1d/c89 d17/d22/cb6 0 2026-03-09T16:15:15.773 INFO:tasks.workunit.client.1.vm05.stdout:7/537: stat d1/d2/d8/dc/d1b/d30/d4b/d65/l29 0 2026-03-09T16:15:15.774 INFO:tasks.workunit.client.1.vm05.stdout:3/385: dread d0/d9/f5c [0,4194304] 0 2026-03-09T16:15:15.779 INFO:tasks.workunit.client.1.vm05.stdout:4/463: symlink d5/d19/la4 0 2026-03-09T16:15:15.781 INFO:tasks.workunit.client.1.vm05.stdout:1/510: mkdir d7/dd/d21/d3b/db4 0 2026-03-09T16:15:15.781 INFO:tasks.workunit.client.1.vm05.stdout:1/511: dread - d7/d27/f64 zero size 2026-03-09T16:15:15.781 INFO:tasks.workunit.client.1.vm05.stdout:0/467: mkdir d5/d97 0 2026-03-09T16:15:15.785 INFO:tasks.workunit.client.1.vm05.stdout:8/457: sync 2026-03-09T16:15:15.786 INFO:tasks.workunit.client.1.vm05.stdout:2/484: dread db/dd/d15/f48 [4194304,4194304] 0 2026-03-09T16:15:15.789 INFO:tasks.workunit.client.1.vm05.stdout:8/458: read - d4/d6/d3a/d3c/f8d zero size 2026-03-09T16:15:15.794 INFO:tasks.workunit.client.1.vm05.stdout:4/464: unlink d5/de/d15/c28 0 2026-03-09T16:15:15.795 INFO:tasks.workunit.client.1.vm05.stdout:7/538: dwrite d1/d2/d8/dc/d1b/f5a [0,4194304] 0 2026-03-09T16:15:15.795 INFO:tasks.workunit.client.1.vm05.stdout:4/465: dread f1 [4194304,4194304] 0 2026-03-09T16:15:15.795 INFO:tasks.workunit.client.1.vm05.stdout:8/459: mkdir d4/d6/d9a 0 2026-03-09T16:15:15.797 INFO:tasks.workunit.client.1.vm05.stdout:1/512: mkdir d7/dd/d21/d39/d48/da7/db5 0 2026-03-09T16:15:15.797 INFO:tasks.workunit.client.1.vm05.stdout:6/475: link d17/d22/d27/d34/d4b/fa4 d17/fb7 0 2026-03-09T16:15:15.798 INFO:tasks.workunit.client.1.vm05.stdout:4/466: write d5/de/d15/d21/d27/d3c/f3d [4285768,49926] 0 2026-03-09T16:15:15.798 INFO:tasks.workunit.client.1.vm05.stdout:4/467: readlink d5/l8b 0 2026-03-09T16:15:15.798 INFO:tasks.workunit.client.1.vm05.stdout:3/386: creat d0/d9/d22/d5f/d75/d76/f7e x:0 0 0 2026-03-09T16:15:15.798 INFO:tasks.workunit.client.1.vm05.stdout:4/468: stat d5/d19/d37/d60/f93 0 2026-03-09T16:15:15.802 INFO:tasks.workunit.client.1.vm05.stdout:9/530: truncate d4/d10/d35/d2b/f2c 873207 0 2026-03-09T16:15:15.803 INFO:tasks.workunit.client.1.vm05.stdout:8/460: readlink d4/d6/l19 0 2026-03-09T16:15:15.803 
INFO:tasks.workunit.client.1.vm05.stdout:6/476: chown d17/d1d/f1e 41 1 2026-03-09T16:15:15.809 INFO:tasks.workunit.client.1.vm05.stdout:2/485: mkdir db/dd/d15/d46/d8d/da0 0 2026-03-09T16:15:15.812 INFO:tasks.workunit.client.1.vm05.stdout:2/486: chown db/dd/d15/d1f/d21 1 1 2026-03-09T16:15:15.820 INFO:tasks.workunit.client.1.vm05.stdout:4/469: creat d5/d19/d37/fa5 x:0 0 0 2026-03-09T16:15:15.820 INFO:tasks.workunit.client.1.vm05.stdout:0/468: dread d5/db/f12 [0,4194304] 0 2026-03-09T16:15:15.821 INFO:tasks.workunit.client.1.vm05.stdout:3/387: rename d0/d9/d22/d4c/d4e/f55 to d0/d9/d22/d4c/f7f 0 2026-03-09T16:15:15.832 INFO:tasks.workunit.client.1.vm05.stdout:2/487: dwrite f7 [0,4194304] 0 2026-03-09T16:15:15.832 INFO:tasks.workunit.client.1.vm05.stdout:4/470: creat d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d7b/fa6 x:0 0 0 2026-03-09T16:15:15.837 INFO:tasks.workunit.client.1.vm05.stdout:7/539: dwrite d1/d2/d8/dc/d1b/d30/d4b/d65/db1/fb3 [0,4194304] 0 2026-03-09T16:15:15.843 INFO:tasks.workunit.client.1.vm05.stdout:9/531: unlink d4/c12 0 2026-03-09T16:15:15.843 INFO:tasks.workunit.client.1.vm05.stdout:8/461: mkdir d4/d6/db/d9b 0 2026-03-09T16:15:15.844 INFO:tasks.workunit.client.1.vm05.stdout:3/388: mkdir d0/d9/d22/d4c/d80 0 2026-03-09T16:15:15.845 INFO:tasks.workunit.client.1.vm05.stdout:6/477: dwrite d17/d22/d27/d34/d42/d53/f55 [4194304,4194304] 0 2026-03-09T16:15:15.850 INFO:tasks.workunit.client.1.vm05.stdout:5/459: readlink d8/d18/d1b/d47/d4e/l97 0 2026-03-09T16:15:15.851 INFO:tasks.workunit.client.1.vm05.stdout:5/460: truncate d8/d18/d1b/d47/d48/f61 1449386 0 2026-03-09T16:15:15.860 INFO:tasks.workunit.client.1.vm05.stdout:4/471: creat d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d7b/fa7 x:0 0 0 2026-03-09T16:15:15.861 INFO:tasks.workunit.client.1.vm05.stdout:0/469: mknod d5/d97/c98 0 2026-03-09T16:15:15.869 INFO:tasks.workunit.client.1.vm05.stdout:8/462: chown d4/d6/c1a 0 1 2026-03-09T16:15:15.869 INFO:tasks.workunit.client.1.vm05.stdout:3/389: unlink d0/d9/d22/c2d 0 2026-03-09T16:15:15.870 INFO:tasks.workunit.client.1.vm05.stdout:9/532: dwrite d4/d10/d35/d36/d48/d60/f6c [0,4194304] 0 2026-03-09T16:15:15.874 INFO:tasks.workunit.client.1.vm05.stdout:5/461: mkdir d8/d18/d1b/d47/d4e/d76/d8f/dab 0 2026-03-09T16:15:15.875 INFO:tasks.workunit.client.1.vm05.stdout:9/533: fsync d4/d10/d35/d36/d48/d60/fad 0 2026-03-09T16:15:15.875 INFO:tasks.workunit.client.1.vm05.stdout:9/534: chown d4/d10/d35/d2b/d31/d96/fa7 20524254 1 2026-03-09T16:15:15.876 INFO:tasks.workunit.client.1.vm05.stdout:9/535: readlink d4/d10/l33 0 2026-03-09T16:15:15.880 INFO:tasks.workunit.client.1.vm05.stdout:7/540: mkdir d1/d2/d11/d86/da2/db6 0 2026-03-09T16:15:15.889 INFO:tasks.workunit.client.1.vm05.stdout:0/470: dwrite d5/db/d48/d66/f72 [0,4194304] 0 2026-03-09T16:15:15.890 INFO:tasks.workunit.client.1.vm05.stdout:0/471: stat d5/d1b/d3b 0 2026-03-09T16:15:15.890 INFO:tasks.workunit.client.1.vm05.stdout:0/472: stat d5/db/d77 0 2026-03-09T16:15:15.890 INFO:tasks.workunit.client.1.vm05.stdout:3/390: creat d0/d33/f81 x:0 0 0 2026-03-09T16:15:15.890 INFO:tasks.workunit.client.1.vm05.stdout:3/391: truncate d0/d9/f51 696246 0 2026-03-09T16:15:15.895 INFO:tasks.workunit.client.1.vm05.stdout:6/478: mkdir d17/d22/d27/d58/db8 0 2026-03-09T16:15:15.900 INFO:tasks.workunit.client.1.vm05.stdout:4/472: creat d5/d9c/fa8 x:0 0 0 2026-03-09T16:15:15.900 INFO:tasks.workunit.client.1.vm05.stdout:8/463: creat d4/d6/db/df/d80/f9c x:0 0 0 2026-03-09T16:15:15.900 INFO:tasks.workunit.client.1.vm05.stdout:7/541: mknod d1/d2/d8/d31/d8d/d5d/cb7 0 2026-03-09T16:15:15.900 
INFO:tasks.workunit.client.1.vm05.stdout:0/473: chown d5/db/d5b/f35 596497680 1 2026-03-09T16:15:15.901 INFO:tasks.workunit.client.1.vm05.stdout:0/474: chown d5/d2c/d49 986445 1 2026-03-09T16:15:15.905 INFO:tasks.workunit.client.1.vm05.stdout:8/464: fdatasync d4/d6/d3a/d15/f22 0 2026-03-09T16:15:15.910 INFO:tasks.workunit.client.1.vm05.stdout:3/392: rename d0/d9/d22/l20 to d0/d9/l82 0 2026-03-09T16:15:15.911 INFO:tasks.workunit.client.1.vm05.stdout:3/393: chown d0/d33/f81 58312219 1 2026-03-09T16:15:15.911 INFO:tasks.workunit.client.1.vm05.stdout:3/394: dread - d0/d33/f64 zero size 2026-03-09T16:15:15.911 INFO:tasks.workunit.client.1.vm05.stdout:4/473: mkdir d5/de/d15/da9 0 2026-03-09T16:15:15.913 INFO:tasks.workunit.client.1.vm05.stdout:5/462: dwrite d8/d18/d1b/f28 [4194304,4194304] 0 2026-03-09T16:15:15.915 INFO:tasks.workunit.client.1.vm05.stdout:9/536: link d4/d10/f18 d4/d10/d35/d36/d48/d54/d59/fb6 0 2026-03-09T16:15:15.916 INFO:tasks.workunit.client.1.vm05.stdout:0/475: creat d5/db/d48/d66/f99 x:0 0 0 2026-03-09T16:15:15.916 INFO:tasks.workunit.client.1.vm05.stdout:4/474: chown d5/de/d15/d21/d27/d3c/c54 14099076 1 2026-03-09T16:15:15.919 INFO:tasks.workunit.client.1.vm05.stdout:1/513: dread d7/dd/f1f [0,4194304] 0 2026-03-09T16:15:15.920 INFO:tasks.workunit.client.1.vm05.stdout:0/476: stat d5/db/d48/d66/f72 0 2026-03-09T16:15:15.933 INFO:tasks.workunit.client.1.vm05.stdout:8/465: creat d4/d6/d3a/d15/d83/f9d x:0 0 0 2026-03-09T16:15:15.934 INFO:tasks.workunit.client.1.vm05.stdout:3/395: mknod d0/d9/d22/d4c/d4e/c83 0 2026-03-09T16:15:15.937 INFO:tasks.workunit.client.1.vm05.stdout:1/514: mkdir d7/d62/db6 0 2026-03-09T16:15:15.940 INFO:tasks.workunit.client.1.vm05.stdout:2/488: dread - db/dd/d15/f90 zero size 2026-03-09T16:15:15.940 INFO:tasks.workunit.client.1.vm05.stdout:7/542: creat d1/d2/d8/dc/d1b/d30/fb8 x:0 0 0 2026-03-09T16:15:15.941 INFO:tasks.workunit.client.1.vm05.stdout:9/537: fsync d4/d10/d35/d36/d48/d60/f8f 0 2026-03-09T16:15:15.945 INFO:tasks.workunit.client.1.vm05.stdout:6/479: dread d17/f18 [0,4194304] 0 2026-03-09T16:15:15.945 INFO:tasks.workunit.client.1.vm05.stdout:7/543: dread - d1/d2/d11/d86/d8a/fa3 zero size 2026-03-09T16:15:15.945 INFO:tasks.workunit.client.1.vm05.stdout:5/463: link d8/f55 d8/d18/d1b/d47/d48/d73/d80/fac 0 2026-03-09T16:15:15.947 INFO:tasks.workunit.client.1.vm05.stdout:1/515: write d7/dd/d21/d44/f46 [344762,80955] 0 2026-03-09T16:15:15.949 INFO:tasks.workunit.client.1.vm05.stdout:2/489: symlink db/dd/d15/d1f/d20/la1 0 2026-03-09T16:15:15.950 INFO:tasks.workunit.client.1.vm05.stdout:5/464: write d8/d53/d7a/f92 [770939,40156] 0 2026-03-09T16:15:15.954 INFO:tasks.workunit.client.1.vm05.stdout:9/538: chown d4/f20 193 1 2026-03-09T16:15:15.958 INFO:tasks.workunit.client.1.vm05.stdout:0/477: sync 2026-03-09T16:15:15.960 INFO:tasks.workunit.client.1.vm05.stdout:8/466: dwrite d4/d6/db/dc/d5d/f7a [0,4194304] 0 2026-03-09T16:15:15.964 INFO:tasks.workunit.client.1.vm05.stdout:3/396: dwrite d0/d9/f4d [0,4194304] 0 2026-03-09T16:15:15.966 INFO:tasks.workunit.client.1.vm05.stdout:0/478: fdatasync d5/d2c/d49/d83/d8b/d95/f6d 0 2026-03-09T16:15:15.978 INFO:tasks.workunit.client.1.vm05.stdout:6/480: unlink d17/d22/d27/d34/d42/d53/f55 0 2026-03-09T16:15:15.978 INFO:tasks.workunit.client.1.vm05.stdout:4/475: truncate d5/de/d15/d21/f50 1092847 0 2026-03-09T16:15:15.984 INFO:tasks.workunit.client.1.vm05.stdout:9/539: truncate d4/d10/d35/d36/f67 291585 0 2026-03-09T16:15:15.985 INFO:tasks.workunit.client.1.vm05.stdout:9/540: chown d4/d10/l3d 44736 1 
2026-03-09T16:15:15.986 INFO:tasks.workunit.client.1.vm05.stdout:0/479: fsync d5/d11/f40 0 2026-03-09T16:15:15.991 INFO:tasks.workunit.client.1.vm05.stdout:5/465: dwrite d8/d5e/f72 [0,4194304] 0 2026-03-09T16:15:15.991 INFO:tasks.workunit.client.1.vm05.stdout:0/480: write d5/d1b/f78 [644278,52830] 0 2026-03-09T16:15:15.991 INFO:tasks.workunit.client.1.vm05.stdout:0/481: write d5/db/d5f/f85 [465255,88779] 0 2026-03-09T16:15:15.999 INFO:tasks.workunit.client.1.vm05.stdout:3/397: rmdir d0/d9/d22/d5f/d75 39 2026-03-09T16:15:16.000 INFO:tasks.workunit.client.1.vm05.stdout:8/467: creat d4/d6/db/d9b/f9e x:0 0 0 2026-03-09T16:15:16.001 INFO:tasks.workunit.client.1.vm05.stdout:3/398: chown d0/d9/l25 191258 1 2026-03-09T16:15:16.001 INFO:tasks.workunit.client.1.vm05.stdout:8/468: fdatasync d4/d6/d3a/f28 0 2026-03-09T16:15:16.003 INFO:tasks.workunit.client.1.vm05.stdout:0/482: mknod d5/d1b/d30/c9a 0 2026-03-09T16:15:16.007 INFO:tasks.workunit.client.1.vm05.stdout:4/476: link d5/de/d15/d21/d31/f64 d5/de/d15/d21/d39/d91/faa 0 2026-03-09T16:15:16.018 INFO:tasks.workunit.client.1.vm05.stdout:0/483: dwrite d5/d1b/d30/f55 [0,4194304] 0 2026-03-09T16:15:16.019 INFO:tasks.workunit.client.1.vm05.stdout:7/544: link d1/d2/d8/dc/d1b/c53 d1/d2/d8/dc/d1b/d71/d3c/cb9 0 2026-03-09T16:15:16.020 INFO:tasks.workunit.client.1.vm05.stdout:3/399: sync 2026-03-09T16:15:16.022 INFO:tasks.workunit.client.1.vm05.stdout:8/469: truncate d4/d6/db/d59/f60 676695 0 2026-03-09T16:15:16.025 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:15 vm05.local ceph-mon[58702]: Upgrade: Updating node-exporter.vm05 (2/2) 2026-03-09T16:15:16.025 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:15 vm05.local ceph-mon[58702]: Deploying daemon node-exporter.vm05 on vm05 2026-03-09T16:15:16.025 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:15 vm05.local ceph-mon[58702]: pgmap v17: 65 pgs: 65 active+clean; 2.5 GiB data, 8.5 GiB used, 111 GiB / 120 GiB avail; 28 MiB/s rd, 143 MiB/s wr, 336 op/s 2026-03-09T16:15:16.028 INFO:tasks.workunit.client.1.vm05.stdout:2/490: write db/dd/d15/d3f/d5b/d60/f7c [421296,70843] 0 2026-03-09T16:15:16.033 INFO:tasks.workunit.client.1.vm05.stdout:2/491: write db/dd/d15/d3f/d5b/f9f [878197,108632] 0 2026-03-09T16:15:16.033 INFO:tasks.workunit.client.1.vm05.stdout:5/466: link d8/d18/d1b/d47/d4e/f64 d8/fad 0 2026-03-09T16:15:16.034 INFO:tasks.workunit.client.1.vm05.stdout:4/477: creat d5/de/d82/fab x:0 0 0 2026-03-09T16:15:16.034 INFO:tasks.workunit.client.1.vm05.stdout:7/545: unlink d1/fa0 0 2026-03-09T16:15:16.036 INFO:tasks.workunit.client.1.vm05.stdout:8/470: mkdir d4/d6/db/df/d4f/d9f 0 2026-03-09T16:15:16.038 INFO:tasks.workunit.client.1.vm05.stdout:2/492: mkdir db/dd/d15/d3f/d5b/d60/da2 0 2026-03-09T16:15:16.042 INFO:tasks.workunit.client.1.vm05.stdout:8/471: chown d4/d6/db/df/d80/f9c 845564 1 2026-03-09T16:15:16.044 INFO:tasks.workunit.client.1.vm05.stdout:6/481: dwrite d17/f60 [4194304,4194304] 0 2026-03-09T16:15:16.045 INFO:tasks.workunit.client.1.vm05.stdout:3/400: symlink d0/d9/d22/d4c/d80/l84 0 2026-03-09T16:15:16.048 INFO:tasks.workunit.client.1.vm05.stdout:7/546: mknod d1/d2/d8/d31/d8d/d5d/cba 0 2026-03-09T16:15:16.049 INFO:tasks.workunit.client.1.vm05.stdout:8/472: mkdir d4/d6/db/dc/d5d/da0 0 2026-03-09T16:15:16.051 INFO:tasks.workunit.client.1.vm05.stdout:1/516: write d7/d15/d16/f29 [239922,125391] 0 2026-03-09T16:15:16.051 INFO:tasks.workunit.client.1.vm05.stdout:4/478: mknod d5/de/d15/d21/d27/d3c/d5c/da2/cac 0 2026-03-09T16:15:16.051 INFO:tasks.workunit.client.1.vm05.stdout:5/467: 
dwrite d8/d18/d1b/d47/d48/fa2 [0,4194304] 0 2026-03-09T16:15:16.064 INFO:tasks.workunit.client.1.vm05.stdout:7/547: readlink d1/d2/d8/lb 0 2026-03-09T16:15:16.072 INFO:tasks.workunit.client.1.vm05.stdout:5/468: chown d8/d53/d7e/f82 30137 1 2026-03-09T16:15:16.078 INFO:tasks.workunit.client.1.vm05.stdout:7/548: dwrite d1/d2/d8/dc/d1b/d71/d3c/f60 [0,4194304] 0 2026-03-09T16:15:16.082 INFO:tasks.workunit.client.1.vm05.stdout:2/493: dwrite db/dd/d15/d3f/d5b/d60/d6a/f8a [0,4194304] 0 2026-03-09T16:15:16.086 INFO:tasks.workunit.client.1.vm05.stdout:3/401: dwrite d0/d9/d22/f2e [0,4194304] 0 2026-03-09T16:15:16.090 INFO:tasks.workunit.client.1.vm05.stdout:7/549: write d1/d2/d8/dc/d1b/d30/d7d/fa7 [434095,49872] 0 2026-03-09T16:15:16.091 INFO:tasks.workunit.client.1.vm05.stdout:0/484: truncate d5/d2c/f28 2463796 0 2026-03-09T16:15:16.091 INFO:tasks.workunit.client.1.vm05.stdout:1/517: symlink d7/daa/lb7 0 2026-03-09T16:15:16.092 INFO:tasks.workunit.client.1.vm05.stdout:6/482: mknod d17/d22/d9d/db4/cb9 0 2026-03-09T16:15:16.093 INFO:tasks.workunit.client.1.vm05.stdout:9/541: dwrite d4/d10/d35/d2b/f2c [0,4194304] 0 2026-03-09T16:15:16.094 INFO:tasks.workunit.client.1.vm05.stdout:1/518: chown d7/dd/d21/d44/d5c 1016195596 1 2026-03-09T16:15:16.099 INFO:tasks.workunit.client.1.vm05.stdout:4/479: rename d5/d19 to d5/de/d15/d21/d31/dad 0 2026-03-09T16:15:16.100 INFO:tasks.workunit.client.1.vm05.stdout:7/550: chown d1/d2/d8/dc/d1b/d30/d7d/fa7 13219 1 2026-03-09T16:15:16.105 INFO:tasks.workunit.client.1.vm05.stdout:5/469: read - d8/d18/d1b/d6b/f93 zero size 2026-03-09T16:15:16.105 INFO:tasks.workunit.client.1.vm05.stdout:2/494: rmdir db/dd/d15/d1f/d20/d23 39 2026-03-09T16:15:16.113 INFO:tasks.workunit.client.1.vm05.stdout:2/495: write db/dd/d15/d46/d67/f9a [281864,55058] 0 2026-03-09T16:15:16.114 INFO:tasks.workunit.client.1.vm05.stdout:3/402: creat d0/d33/f85 x:0 0 0 2026-03-09T16:15:16.114 INFO:tasks.workunit.client.1.vm05.stdout:7/551: dwrite d1/d2/d8/dc/d33/f9d [0,4194304] 0 2026-03-09T16:15:16.114 INFO:tasks.workunit.client.1.vm05.stdout:8/473: creat d4/d6/db/dc/d5d/da0/fa1 x:0 0 0 2026-03-09T16:15:16.115 INFO:tasks.workunit.client.1.vm05.stdout:3/403: chown d0/l7 843 1 2026-03-09T16:15:16.116 INFO:tasks.workunit.client.1.vm05.stdout:7/552: stat d1/d2/d8/dc/d33/fb5 0 2026-03-09T16:15:16.123 INFO:tasks.workunit.client.1.vm05.stdout:6/483: unlink d17/d22/d27/d34/c81 0 2026-03-09T16:15:16.123 INFO:tasks.workunit.client.1.vm05.stdout:1/519: link d7/d15/d16/f53 d7/dd/de/d52/d5b/fb8 0 2026-03-09T16:15:16.124 INFO:tasks.workunit.client.1.vm05.stdout:1/520: stat d7/dd/d21/f3d 0 2026-03-09T16:15:16.128 INFO:tasks.workunit.client.1.vm05.stdout:9/542: rename d4/f6b to d4/d10/d35/d36/d48/fb7 0 2026-03-09T16:15:16.128 INFO:tasks.workunit.client.1.vm05.stdout:6/484: creat d17/d22/d27/d34/d42/d53/fba x:0 0 0 2026-03-09T16:15:16.129 INFO:tasks.workunit.client.1.vm05.stdout:1/521: mkdir d7/dd/d21/d39/d87/db9 0 2026-03-09T16:15:16.131 INFO:tasks.workunit.client.1.vm05.stdout:8/474: rename d4/d6/d3a/d3c/f69 to d4/d6/db/dc/fa2 0 2026-03-09T16:15:16.140 INFO:tasks.workunit.client.1.vm05.stdout:6/485: symlink d17/d1d/lbb 0 2026-03-09T16:15:16.140 INFO:tasks.workunit.client.1.vm05.stdout:9/543: rmdir d4/d10/d35/d36/d48/d4c 39 2026-03-09T16:15:16.140 INFO:tasks.workunit.client.1.vm05.stdout:3/404: getdents d0/d9/d22/d5f 0 2026-03-09T16:15:16.140 INFO:tasks.workunit.client.1.vm05.stdout:3/405: write d0/f60 [1729707,8216] 0 2026-03-09T16:15:16.140 INFO:tasks.workunit.client.1.vm05.stdout:3/406: dread - d0/d33/f85 zero size 
2026-03-09T16:15:16.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:15 vm03.local ceph-mon[51019]: Upgrade: Updating node-exporter.vm05 (2/2) 2026-03-09T16:15:16.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:15 vm03.local ceph-mon[51019]: Deploying daemon node-exporter.vm05 on vm05 2026-03-09T16:15:16.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:15 vm03.local ceph-mon[51019]: pgmap v17: 65 pgs: 65 active+clean; 2.5 GiB data, 8.5 GiB used, 111 GiB / 120 GiB avail; 28 MiB/s rd, 143 MiB/s wr, 336 op/s 2026-03-09T16:15:16.140 INFO:tasks.workunit.client.1.vm05.stdout:9/544: chown d4/d10/d35/d2b/c74 1872865137 1 2026-03-09T16:15:16.140 INFO:tasks.workunit.client.1.vm05.stdout:4/480: sync 2026-03-09T16:15:16.141 INFO:tasks.workunit.client.1.vm05.stdout:5/470: rename d8/d3d to d8/d59/d5b/d8b/da0/dae 0 2026-03-09T16:15:16.141 INFO:tasks.workunit.client.1.vm05.stdout:0/485: dread d5/d2c/d49/d83/d8b/d95/f59 [0,4194304] 0 2026-03-09T16:15:16.142 INFO:tasks.workunit.client.1.vm05.stdout:6/486: rename d17/d22/d27/d34/d42/d65/l8f to d17/d22/d27/d34/d42/d65/lbc 0 2026-03-09T16:15:16.146 INFO:tasks.workunit.client.1.vm05.stdout:8/475: symlink d4/la3 0 2026-03-09T16:15:16.147 INFO:tasks.workunit.client.1.vm05.stdout:1/522: link d7/dd/d21/d39/d5a/d50/fab d7/dd/d21/fba 0 2026-03-09T16:15:16.148 INFO:tasks.workunit.client.1.vm05.stdout:5/471: fsync d8/d1d/f21 0 2026-03-09T16:15:16.148 INFO:tasks.workunit.client.1.vm05.stdout:4/481: rename d5/de/d15/d21/d27/d3c/d5c/fa1 to d5/de/d15/d21/d39/d91/fae 0 2026-03-09T16:15:16.149 INFO:tasks.workunit.client.1.vm05.stdout:0/486: symlink d5/d2c/d49/l9b 0 2026-03-09T16:15:16.149 INFO:tasks.workunit.client.1.vm05.stdout:6/487: chown d17/d22/d27/d34/d42/d53/l6a 1018 1 2026-03-09T16:15:16.151 INFO:tasks.workunit.client.1.vm05.stdout:9/545: dread d4/d10/d35/d2b/d31/d96/f9b [0,4194304] 0 2026-03-09T16:15:16.154 INFO:tasks.workunit.client.1.vm05.stdout:8/476: truncate d4/d6/d53/f89 837944 0 2026-03-09T16:15:16.155 INFO:tasks.workunit.client.1.vm05.stdout:5/472: readlink d8/d59/d5b/d8b/da0/dae/daa/l9a 0 2026-03-09T16:15:16.157 INFO:tasks.workunit.client.1.vm05.stdout:0/487: creat d5/d2c/d49/d83/f9c x:0 0 0 2026-03-09T16:15:16.158 INFO:tasks.workunit.client.1.vm05.stdout:1/523: creat d7/d62/da3/fbb x:0 0 0 2026-03-09T16:15:16.163 INFO:tasks.workunit.client.1.vm05.stdout:9/546: rmdir d4/d10/d35/d2b/d38/d65 39 2026-03-09T16:15:16.164 INFO:tasks.workunit.client.1.vm05.stdout:5/473: dwrite d8/d59/d5b/d8b/da0/dae/daa/f9d [0,4194304] 0 2026-03-09T16:15:16.164 INFO:tasks.workunit.client.1.vm05.stdout:4/482: mknod d5/de/d15/d21/d27/d3c/d5c/caf 0 2026-03-09T16:15:16.165 INFO:tasks.workunit.client.1.vm05.stdout:9/547: chown d4/d10/d35/d36/d48/d54/db0 3826 1 2026-03-09T16:15:16.168 INFO:tasks.workunit.client.1.vm05.stdout:1/524: read d7/fb [12367977,70468] 0 2026-03-09T16:15:16.170 INFO:tasks.workunit.client.1.vm05.stdout:9/548: chown d4/d10/d35/d36/c3b 4498 1 2026-03-09T16:15:16.172 INFO:tasks.workunit.client.1.vm05.stdout:0/488: dwrite d5/f76 [0,4194304] 0 2026-03-09T16:15:16.172 INFO:tasks.workunit.client.1.vm05.stdout:1/525: write d7/d15/d16/f74 [851858,107847] 0 2026-03-09T16:15:16.175 INFO:tasks.workunit.client.1.vm05.stdout:3/407: link d0/d9/d22/f2a d0/f86 0 2026-03-09T16:15:16.179 INFO:tasks.workunit.client.1.vm05.stdout:4/483: creat d5/de/d2f/d8a/fb0 x:0 0 0 2026-03-09T16:15:16.179 INFO:tasks.workunit.client.1.vm05.stdout:3/408: write d0/d33/f64 [993556,55582] 0 2026-03-09T16:15:16.180 INFO:tasks.workunit.client.1.vm05.stdout:1/526: mkdir 
d7/d15/d6e/dbc 0 2026-03-09T16:15:16.181 INFO:tasks.workunit.client.1.vm05.stdout:4/484: truncate d5/de/d15/d21/d27/f7a 43653 0 2026-03-09T16:15:16.184 INFO:tasks.workunit.client.1.vm05.stdout:4/485: truncate d5/de/d15/d21/d27/d3c/f92 263314 0 2026-03-09T16:15:16.184 INFO:tasks.workunit.client.1.vm05.stdout:4/486: stat d5/de/d15/d21/d31/f67 0 2026-03-09T16:15:16.184 INFO:tasks.workunit.client.1.vm05.stdout:4/487: write d5/de/f9d [962047,16096] 0 2026-03-09T16:15:16.186 INFO:tasks.workunit.client.1.vm05.stdout:4/488: truncate d5/de/d15/d21/d39/d5d/d7d/f7e 876545 0 2026-03-09T16:15:16.186 INFO:tasks.workunit.client.1.vm05.stdout:3/409: symlink d0/d9/d22/d4c/l87 0 2026-03-09T16:15:16.189 INFO:tasks.workunit.client.1.vm05.stdout:4/489: chown d5/de/d15/d21/d31/dad/f48 24 1 2026-03-09T16:15:16.189 INFO:tasks.workunit.client.1.vm05.stdout:3/410: chown d0/d9/f2b 13381830 1 2026-03-09T16:15:16.189 INFO:tasks.workunit.client.1.vm05.stdout:5/474: link d8/c12 d8/caf 0 2026-03-09T16:15:16.190 INFO:tasks.workunit.client.1.vm05.stdout:1/527: sync 2026-03-09T16:15:16.191 INFO:tasks.workunit.client.1.vm05.stdout:1/528: read - d7/dd/d21/d39/f86 zero size 2026-03-09T16:15:16.192 INFO:tasks.workunit.client.1.vm05.stdout:9/549: creat d4/d10/d35/d36/d48/fb8 x:0 0 0 2026-03-09T16:15:16.194 INFO:tasks.workunit.client.1.vm05.stdout:1/529: chown d7/dd/d21/d63/d71 2 1 2026-03-09T16:15:16.194 INFO:tasks.workunit.client.1.vm05.stdout:1/530: dread - d7/dd/d21/d3b/f65 zero size 2026-03-09T16:15:16.195 INFO:tasks.workunit.client.1.vm05.stdout:1/531: write d7/d15/f22 [340348,17997] 0 2026-03-09T16:15:16.195 INFO:tasks.workunit.client.1.vm05.stdout:3/411: write d0/d9/f2c [1673755,43910] 0 2026-03-09T16:15:16.196 INFO:tasks.workunit.client.1.vm05.stdout:0/489: dwrite d5/f7 [4194304,4194304] 0 2026-03-09T16:15:16.197 INFO:tasks.workunit.client.1.vm05.stdout:1/532: sync 2026-03-09T16:15:16.198 INFO:tasks.workunit.client.1.vm05.stdout:0/490: sync 2026-03-09T16:15:16.203 INFO:tasks.workunit.client.1.vm05.stdout:9/550: unlink d4/d10/d35/d2b/d31/f76 0 2026-03-09T16:15:16.209 INFO:tasks.workunit.client.1.vm05.stdout:1/533: symlink d7/dd/de/lbd 0 2026-03-09T16:15:16.209 INFO:tasks.workunit.client.1.vm05.stdout:3/412: mkdir d0/d9/d22/d5f/d75/d76/d88 0 2026-03-09T16:15:16.211 INFO:tasks.workunit.client.1.vm05.stdout:0/491: mknod d5/d11/c9d 0 2026-03-09T16:15:16.221 INFO:tasks.workunit.client.1.vm05.stdout:0/492: dread - d5/f73 zero size 2026-03-09T16:15:16.222 INFO:tasks.workunit.client.1.vm05.stdout:1/534: mkdir d7/dbe 0 2026-03-09T16:15:16.222 INFO:tasks.workunit.client.1.vm05.stdout:0/493: mkdir d5/d9e 0 2026-03-09T16:15:16.222 INFO:tasks.workunit.client.1.vm05.stdout:1/535: write d7/dd/d21/d39/d5a/f41 [4334786,26806] 0 2026-03-09T16:15:16.222 INFO:tasks.workunit.client.1.vm05.stdout:0/494: fsync d5/d1b/d30/f55 0 2026-03-09T16:15:16.222 INFO:tasks.workunit.client.1.vm05.stdout:9/551: dwrite d4/d10/d35/d36/f85 [0,4194304] 0 2026-03-09T16:15:16.222 INFO:tasks.workunit.client.1.vm05.stdout:1/536: mkdir d7/d62/d72/dbf 0 2026-03-09T16:15:16.222 INFO:tasks.workunit.client.1.vm05.stdout:0/495: fsync d5/d2c/f7f 0 2026-03-09T16:15:16.232 INFO:tasks.workunit.client.1.vm05.stdout:0/496: dread d5/d2c/d49/d83/d8b/d95/f59 [0,4194304] 0 2026-03-09T16:15:16.236 INFO:tasks.workunit.client.1.vm05.stdout:9/552: dwrite d4/d10/d35/d2b/fa1 [0,4194304] 0 2026-03-09T16:15:16.238 INFO:tasks.workunit.client.1.vm05.stdout:0/497: rmdir d5/db/d5b 39 2026-03-09T16:15:16.240 INFO:tasks.workunit.client.1.vm05.stdout:9/553: chown d4/d10/f8d 14826 1 
2026-03-09T16:15:16.248 INFO:tasks.workunit.client.1.vm05.stdout:4/490: rename d5/de/d15/d21/d31 to d5/de/d15/da9/db1 0 2026-03-09T16:15:16.249 INFO:tasks.workunit.client.1.vm05.stdout:9/554: write d4/d10/d35/d2b/d38/d65/fa5 [828916,94998] 0 2026-03-09T16:15:16.253 INFO:tasks.workunit.client.1.vm05.stdout:4/491: rmdir d5/de/d15/d21/d27/d3c/d5c/d5f/d4e 39 2026-03-09T16:15:16.254 INFO:tasks.workunit.client.1.vm05.stdout:0/498: creat d5/d11/f9f x:0 0 0 2026-03-09T16:15:16.255 INFO:tasks.workunit.client.1.vm05.stdout:5/475: rename d8/caf to d8/d18/d1b/d78/cb0 0 2026-03-09T16:15:16.256 INFO:tasks.workunit.client.1.vm05.stdout:4/492: read d5/de/d15/da9/db1/dad/d37/d60/f62 [28207,42658] 0 2026-03-09T16:15:16.256 INFO:tasks.workunit.client.1.vm05.stdout:0/499: mknod d5/d1b/d30/ca0 0 2026-03-09T16:15:16.258 INFO:tasks.workunit.client.1.vm05.stdout:0/500: write d5/db/d48/d66/f72 [3352144,3731] 0 2026-03-09T16:15:16.260 INFO:tasks.workunit.client.1.vm05.stdout:5/476: getdents d8/d59 0 2026-03-09T16:15:16.261 INFO:tasks.workunit.client.1.vm05.stdout:0/501: symlink d5/db/d5f/la1 0 2026-03-09T16:15:16.262 INFO:tasks.workunit.client.1.vm05.stdout:4/493: dwrite d5/de/f9d [0,4194304] 0 2026-03-09T16:15:16.264 INFO:tasks.workunit.client.1.vm05.stdout:4/494: dread d5/de/d15/d21/d39/d5d/d7d/f89 [0,4194304] 0 2026-03-09T16:15:16.269 INFO:tasks.workunit.client.1.vm05.stdout:0/502: dwrite d5/d2c/d49/d83/d8b/d95/f59 [0,4194304] 0 2026-03-09T16:15:16.271 INFO:tasks.workunit.client.1.vm05.stdout:4/495: stat d5/l83 0 2026-03-09T16:15:16.275 INFO:tasks.workunit.client.1.vm05.stdout:4/496: mknod d5/de/d15/d21/d27/d3c/d5c/da2/cb2 0 2026-03-09T16:15:16.278 INFO:tasks.workunit.client.1.vm05.stdout:5/477: rename d8/d18/d1b/d47/d48/f99 to d8/d59/d5b/d8b/da0/dae/daa/fb1 0 2026-03-09T16:15:16.281 INFO:tasks.workunit.client.1.vm05.stdout:4/497: write d5/de/d15/d21/d39/d91/fae [166022,84553] 0 2026-03-09T16:15:16.283 INFO:tasks.workunit.client.1.vm05.stdout:4/498: chown d5/de/d15/d21/d39/d5d/c7c 981798100 1 2026-03-09T16:15:16.283 INFO:tasks.workunit.client.1.vm05.stdout:5/478: dread d8/fad [0,4194304] 0 2026-03-09T16:15:16.284 INFO:tasks.workunit.client.1.vm05.stdout:5/479: readlink d8/d1d/l5d 0 2026-03-09T16:15:16.287 INFO:tasks.workunit.client.1.vm05.stdout:4/499: getdents d5/de/d15/d21/d39 0 2026-03-09T16:15:16.297 INFO:tasks.workunit.client.1.vm05.stdout:4/500: read d5/de/f9d [1285782,47789] 0 2026-03-09T16:15:16.297 INFO:tasks.workunit.client.1.vm05.stdout:4/501: mknod d5/d9c/cb3 0 2026-03-09T16:15:16.297 INFO:tasks.workunit.client.1.vm05.stdout:4/502: unlink d5/de/d2f/l61 0 2026-03-09T16:15:16.297 INFO:tasks.workunit.client.1.vm05.stdout:4/503: truncate f0 1277546 0 2026-03-09T16:15:16.297 INFO:tasks.workunit.client.1.vm05.stdout:4/504: symlink d5/de/d2f/lb4 0 2026-03-09T16:15:16.306 INFO:tasks.workunit.client.1.vm05.stdout:4/505: dread d5/f35 [0,4194304] 0 2026-03-09T16:15:16.308 INFO:tasks.workunit.client.1.vm05.stdout:5/480: sync 2026-03-09T16:15:16.308 INFO:tasks.workunit.client.1.vm05.stdout:2/496: dread db/dd/d15/d3f/d5b/d60/f7c [0,4194304] 0 2026-03-09T16:15:16.309 INFO:tasks.workunit.client.1.vm05.stdout:4/506: getdents d5/de/d15/da9/db1/dad/d37 0 2026-03-09T16:15:16.310 INFO:tasks.workunit.client.1.vm05.stdout:5/481: creat d8/d18/fb2 x:0 0 0 2026-03-09T16:15:16.313 INFO:tasks.workunit.client.1.vm05.stdout:2/497: symlink db/dd/d15/d1f/la3 0 2026-03-09T16:15:16.313 INFO:tasks.workunit.client.1.vm05.stdout:4/507: symlink d5/de/d15/d21/d39/d5d/d8c/lb5 0 2026-03-09T16:15:16.318 
INFO:tasks.workunit.client.1.vm05.stdout:2/498: truncate db/dd/d15/d1f/d20/d23/f9b 893057 0 2026-03-09T16:15:16.321 INFO:tasks.workunit.client.1.vm05.stdout:4/508: dwrite d5/de/d15/d21/d27/f7a [0,4194304] 0 2026-03-09T16:15:16.328 INFO:tasks.workunit.client.1.vm05.stdout:4/509: chown d5/de/d15/d21/d27/c73 256923 1 2026-03-09T16:15:16.330 INFO:tasks.workunit.client.1.vm05.stdout:4/510: chown d5/de/d15/da9/db1/dad/c4c 75690056 1 2026-03-09T16:15:16.332 INFO:tasks.workunit.client.1.vm05.stdout:4/511: write d5/de/d15/da9/db1/dad/f1f [3749116,123308] 0 2026-03-09T16:15:16.335 INFO:tasks.workunit.client.1.vm05.stdout:9/555: fdatasync d4/d10/f1d 0 2026-03-09T16:15:16.347 INFO:tasks.workunit.client.1.vm05.stdout:4/512: mkdir d5/de/d15/d21/d27/d3c/d5c/d5f/db6 0 2026-03-09T16:15:16.347 INFO:tasks.workunit.client.1.vm05.stdout:4/513: mknod d5/de/d15/d21/d27/d3c/cb7 0 2026-03-09T16:15:16.347 INFO:tasks.workunit.client.1.vm05.stdout:4/514: read - d5/de/d15/da9/db1/f58 zero size 2026-03-09T16:15:16.347 INFO:tasks.workunit.client.1.vm05.stdout:9/556: dwrite d4/d10/d35/d2b/d31/f55 [0,4194304] 0 2026-03-09T16:15:16.347 INFO:tasks.workunit.client.1.vm05.stdout:4/515: truncate d5/de/d15/d21/d39/d5d/f70 242328 0 2026-03-09T16:15:16.347 INFO:tasks.workunit.client.1.vm05.stdout:4/516: chown d5/de/d15/d21/d27/d3c/d5c/d5f/f57 2657 1 2026-03-09T16:15:16.347 INFO:tasks.workunit.client.1.vm05.stdout:4/517: dread - d5/de/d15/da9/db1/f67 zero size 2026-03-09T16:15:16.350 INFO:tasks.workunit.client.1.vm05.stdout:4/518: read d5/f2d [197392,31001] 0 2026-03-09T16:15:16.358 INFO:tasks.workunit.client.1.vm05.stdout:7/553: dread d1/d2/d8/d31/d8d/fa9 [0,4194304] 0 2026-03-09T16:15:16.360 INFO:tasks.workunit.client.1.vm05.stdout:9/557: rename d4/d10/d35/d36/c72 to d4/d10/d35/d36/d48/cb9 0 2026-03-09T16:15:16.360 INFO:tasks.workunit.client.1.vm05.stdout:4/519: link d5/de/d15/f25 d5/de/d15/d21/d27/d3c/d5c/d5f/db6/fb8 0 2026-03-09T16:15:16.363 INFO:tasks.workunit.client.1.vm05.stdout:9/558: mkdir d4/d10/d35/d2b/dba 0 2026-03-09T16:15:16.365 INFO:tasks.workunit.client.1.vm05.stdout:9/559: write d4/d10/d35/d2b/d38/f62 [3801882,55499] 0 2026-03-09T16:15:16.367 INFO:tasks.workunit.client.1.vm05.stdout:4/520: dwrite d5/de/f87 [0,4194304] 0 2026-03-09T16:15:16.368 INFO:tasks.workunit.client.1.vm05.stdout:7/554: dwrite d1/f26 [0,4194304] 0 2026-03-09T16:15:16.369 INFO:tasks.workunit.client.1.vm05.stdout:7/555: write d1/d2/d8/d31/d8d/f52 [477832,51235] 0 2026-03-09T16:15:16.385 INFO:tasks.workunit.client.1.vm05.stdout:9/560: rename d4/d10/d35/c5d to d4/d10/d35/d36/d48/d60/d94/cbb 0 2026-03-09T16:15:16.392 INFO:tasks.workunit.client.1.vm05.stdout:7/556: mknod d1/d2/cbb 0 2026-03-09T16:15:16.393 INFO:tasks.workunit.client.1.vm05.stdout:8/477: dread d4/d6/d53/f89 [0,4194304] 0 2026-03-09T16:15:16.393 INFO:tasks.workunit.client.1.vm05.stdout:9/561: symlink d4/d10/lbc 0 2026-03-09T16:15:16.395 INFO:tasks.workunit.client.1.vm05.stdout:7/557: write d1/d2/d8/dc/d1b/d30/f85 [330090,29911] 0 2026-03-09T16:15:16.397 INFO:tasks.workunit.client.1.vm05.stdout:4/521: rename d5/de/d15/da9/db1/dad/c80 to d5/de/cb9 0 2026-03-09T16:15:16.397 INFO:tasks.workunit.client.1.vm05.stdout:9/562: chown d4/f2e 44526355 1 2026-03-09T16:15:16.397 INFO:tasks.workunit.client.1.vm05.stdout:8/478: write d4/d6/db/dc/fa2 [985023,80807] 0 2026-03-09T16:15:16.403 INFO:tasks.workunit.client.1.vm05.stdout:7/558: mknod d1/d2/d8/dc/d72/da8/cbc 0 2026-03-09T16:15:16.403 INFO:tasks.workunit.client.1.vm05.stdout:9/563: truncate d4/d10/f8d 1556164 0 2026-03-09T16:15:16.406 
INFO:tasks.workunit.client.1.vm05.stdout:7/559: write d1/d2/d8/dc/d1b/d30/d4b/d65/f63 [472319,26372] 0 2026-03-09T16:15:16.409 INFO:tasks.workunit.client.1.vm05.stdout:6/488: dwrite d17/d22/d27/f6b [0,4194304] 0 2026-03-09T16:15:16.410 INFO:tasks.workunit.client.1.vm05.stdout:7/560: rename d1/f84 to d1/d2/d8/dc/d1b/d71/fbd 0 2026-03-09T16:15:16.413 INFO:tasks.workunit.client.1.vm05.stdout:6/489: truncate d17/d1d/f67 9321370 0 2026-03-09T16:15:16.415 INFO:tasks.workunit.client.1.vm05.stdout:7/561: mknod d1/d2/d11/d86/d8a/d91/cbe 0 2026-03-09T16:15:16.420 INFO:tasks.workunit.client.1.vm05.stdout:7/562: creat d1/d2/d8/dc/fbf x:0 0 0 2026-03-09T16:15:16.420 INFO:tasks.workunit.client.1.vm05.stdout:7/563: read d1/d2/d8/dc/d1b/d30/d4b/d65/db1/fb3 [350098,619] 0 2026-03-09T16:15:16.423 INFO:tasks.workunit.client.1.vm05.stdout:7/564: read - d1/d2/d8/d31/f51 zero size 2026-03-09T16:15:16.426 INFO:tasks.workunit.client.1.vm05.stdout:7/565: mknod d1/d2/d8/dc/d1b/d71/cc0 0 2026-03-09T16:15:16.430 INFO:tasks.workunit.client.1.vm05.stdout:7/566: dwrite d1/d2/d8/d31/d8d/fa9 [0,4194304] 0 2026-03-09T16:15:16.435 INFO:tasks.workunit.client.1.vm05.stdout:7/567: creat d1/d2/d8/dc/d1b/d30/d7d/fc1 x:0 0 0 2026-03-09T16:15:16.437 INFO:tasks.workunit.client.1.vm05.stdout:7/568: mknod d1/d2/d8/dc/d72/da8/cc2 0 2026-03-09T16:15:16.443 INFO:tasks.workunit.client.1.vm05.stdout:7/569: creat d1/d2/d8/d67/d76/fc3 x:0 0 0 2026-03-09T16:15:16.445 INFO:tasks.workunit.client.1.vm05.stdout:7/570: creat d1/d2/d8/dc/d33/fc4 x:0 0 0 2026-03-09T16:15:16.456 INFO:tasks.workunit.client.1.vm05.stdout:3/413: dread d0/d33/f64 [0,4194304] 0 2026-03-09T16:15:16.465 INFO:tasks.workunit.client.1.vm05.stdout:3/414: rmdir d0/d9/d22/d4c 39 2026-03-09T16:15:16.465 INFO:tasks.workunit.client.1.vm05.stdout:3/415: mkdir d0/d9/d22/d5f/d75/d76/d88/d89 0 2026-03-09T16:15:16.465 INFO:tasks.workunit.client.1.vm05.stdout:3/416: write d0/d33/f36 [1266438,59421] 0 2026-03-09T16:15:16.465 INFO:tasks.workunit.client.1.vm05.stdout:3/417: write d0/d9/d22/d5f/f66 [825762,128231] 0 2026-03-09T16:15:16.465 INFO:tasks.workunit.client.1.vm05.stdout:3/418: chown d0/d9/d22/l40 7080898 1 2026-03-09T16:15:16.465 INFO:tasks.workunit.client.1.vm05.stdout:3/419: symlink d0/d9/d22/d4c/d80/l8a 0 2026-03-09T16:15:16.466 INFO:tasks.workunit.client.1.vm05.stdout:3/420: mkdir d0/d9/d8b 0 2026-03-09T16:15:16.467 INFO:tasks.workunit.client.1.vm05.stdout:3/421: rename d0/d9/d22 to d0/d9/d22/d5f/d7b/d8c 22 2026-03-09T16:15:16.469 INFO:tasks.workunit.client.1.vm05.stdout:3/422: unlink d0/d9/f37 0 2026-03-09T16:15:16.470 INFO:tasks.workunit.client.1.vm05.stdout:3/423: mknod d0/d9/c8d 0 2026-03-09T16:15:16.473 INFO:tasks.workunit.client.1.vm05.stdout:3/424: symlink d0/d9/d22/d5f/d75/d76/l8e 0 2026-03-09T16:15:16.475 INFO:tasks.workunit.client.1.vm05.stdout:3/425: dread - d0/d33/f41 zero size 2026-03-09T16:15:16.487 INFO:tasks.workunit.client.1.vm05.stdout:0/503: dread d5/d1b/f61 [0,4194304] 0 2026-03-09T16:15:16.488 INFO:tasks.workunit.client.1.vm05.stdout:1/537: write d7/dd/de/d52/f58 [1646133,21411] 0 2026-03-09T16:15:16.499 INFO:tasks.workunit.client.1.vm05.stdout:1/538: dread d7/dd/d21/d63/d71/f7b [0,4194304] 0 2026-03-09T16:15:16.501 INFO:tasks.workunit.client.1.vm05.stdout:1/539: symlink d7/dd/d21/d39/d87/lc0 0 2026-03-09T16:15:16.505 INFO:tasks.workunit.client.1.vm05.stdout:1/540: dread d7/f3f [0,4194304] 0 2026-03-09T16:15:16.506 INFO:tasks.workunit.client.1.vm05.stdout:1/541: dread - d7/dd/d21/d39/d87/fa6 zero size 2026-03-09T16:15:16.509 
INFO:tasks.workunit.client.1.vm05.stdout:1/542: creat d7/d62/db6/fc1 x:0 0 0 2026-03-09T16:15:16.510 INFO:tasks.workunit.client.1.vm05.stdout:1/543: truncate d7/d62/d72/f9f 1682249 0 2026-03-09T16:15:16.511 INFO:tasks.workunit.client.1.vm05.stdout:1/544: mkdir d7/d15/d16/dc2 0 2026-03-09T16:15:16.520 INFO:tasks.workunit.client.1.vm05.stdout:1/545: dread d7/dd/d21/f2b [0,4194304] 0 2026-03-09T16:15:16.522 INFO:tasks.workunit.client.1.vm05.stdout:0/504: dread d5/f5c [0,4194304] 0 2026-03-09T16:15:16.523 INFO:tasks.workunit.client.1.vm05.stdout:1/546: mknod d7/dd/d21/d3b/cc3 0 2026-03-09T16:15:16.525 INFO:tasks.workunit.client.1.vm05.stdout:0/505: rmdir d5/db/d48/d66 39 2026-03-09T16:15:16.527 INFO:tasks.workunit.client.1.vm05.stdout:0/506: creat d5/d9e/fa2 x:0 0 0 2026-03-09T16:15:16.527 INFO:tasks.workunit.client.1.vm05.stdout:1/547: link d7/dd/d21/d63/d71/laf d7/dd/d21/d3b/d55/d95/lc4 0 2026-03-09T16:15:16.528 INFO:tasks.workunit.client.1.vm05.stdout:0/507: mkdir d5/db/d5f/da3 0 2026-03-09T16:15:16.532 INFO:tasks.workunit.client.1.vm05.stdout:0/508: mkdir d5/db/d5f/da3/da4 0 2026-03-09T16:15:16.533 INFO:tasks.workunit.client.1.vm05.stdout:1/548: dwrite d7/dd/d21/d63/d71/f7b [0,4194304] 0 2026-03-09T16:15:16.534 INFO:tasks.workunit.client.1.vm05.stdout:1/549: stat d7/dd/de 0 2026-03-09T16:15:16.535 INFO:tasks.workunit.client.1.vm05.stdout:1/550: dread - d7/fb0 zero size 2026-03-09T16:15:16.536 INFO:tasks.workunit.client.1.vm05.stdout:0/509: mkdir d5/db/d5b/da5 0 2026-03-09T16:15:16.541 INFO:tasks.workunit.client.1.vm05.stdout:1/551: truncate d7/dd/de/f3e 1521028 0 2026-03-09T16:15:16.543 INFO:tasks.workunit.client.1.vm05.stdout:0/510: dwrite d5/f7 [4194304,4194304] 0 2026-03-09T16:15:16.545 INFO:tasks.workunit.client.1.vm05.stdout:1/552: dread - d7/dd/d21/d39/d48/d5d/f98 zero size 2026-03-09T16:15:16.548 INFO:tasks.workunit.client.1.vm05.stdout:1/553: stat d7/dd/de/f56 0 2026-03-09T16:15:16.550 INFO:tasks.workunit.client.1.vm05.stdout:0/511: write d5/d1b/f50 [4818197,38214] 0 2026-03-09T16:15:16.550 INFO:tasks.workunit.client.1.vm05.stdout:0/512: write d5/db/f7c [560163,14203] 0 2026-03-09T16:15:16.550 INFO:tasks.workunit.client.1.vm05.stdout:4/522: dread d5/de/d15/d21/d39/f42 [0,4194304] 0 2026-03-09T16:15:16.552 INFO:tasks.workunit.client.1.vm05.stdout:4/523: readlink d5/lc 0 2026-03-09T16:15:16.556 INFO:tasks.workunit.client.1.vm05.stdout:5/482: getdents d8/d18/d1b/d78 0 2026-03-09T16:15:16.560 INFO:tasks.workunit.client.1.vm05.stdout:1/554: creat d7/d62/da3/fc5 x:0 0 0 2026-03-09T16:15:16.576 INFO:tasks.workunit.client.1.vm05.stdout:1/555: chown d7/dd/d21/d2d 1742066 1 2026-03-09T16:15:16.576 INFO:tasks.workunit.client.1.vm05.stdout:0/513: symlink d5/db/d5f/da3/la6 0 2026-03-09T16:15:16.576 INFO:tasks.workunit.client.1.vm05.stdout:0/514: readlink d5/d2c/l75 0 2026-03-09T16:15:16.576 INFO:tasks.workunit.client.1.vm05.stdout:2/499: dwrite db/dd/d15/d1f/f49 [4194304,4194304] 0 2026-03-09T16:15:16.576 INFO:tasks.workunit.client.1.vm05.stdout:1/556: dwrite d7/dd/d21/d3b/d55/d95/f99 [0,4194304] 0 2026-03-09T16:15:16.576 INFO:tasks.workunit.client.1.vm05.stdout:0/515: symlink d5/la7 0 2026-03-09T16:15:16.576 INFO:tasks.workunit.client.1.vm05.stdout:8/479: write d4/d6/d3a/d3c/f45 [3456956,90842] 0 2026-03-09T16:15:16.576 INFO:tasks.workunit.client.1.vm05.stdout:7/571: getdents d1/d2/d11/d86/d8a/d91 0 2026-03-09T16:15:16.577 INFO:tasks.workunit.client.1.vm05.stdout:2/500: unlink db/dd/d15/d3f/d5b/d60/d95/c63 0 2026-03-09T16:15:16.579 INFO:tasks.workunit.client.1.vm05.stdout:4/524: sync 
2026-03-09T16:15:16.579 INFO:tasks.workunit.client.1.vm05.stdout:5/483: getdents d8/d1d 0 2026-03-09T16:15:16.584 INFO:tasks.workunit.client.1.vm05.stdout:8/480: dread d4/d6/d3a/d40/f76 [0,4194304] 0 2026-03-09T16:15:16.586 INFO:tasks.workunit.client.1.vm05.stdout:7/572: creat d1/d2/d8/d31/fc5 x:0 0 0 2026-03-09T16:15:16.586 INFO:tasks.workunit.client.1.vm05.stdout:1/557: mknod d7/d15/cc6 0 2026-03-09T16:15:16.589 INFO:tasks.workunit.client.1.vm05.stdout:2/501: dwrite db/dd/d15/d3f/d5b/d60/d95/f76 [0,4194304] 0 2026-03-09T16:15:16.592 INFO:tasks.workunit.client.1.vm05.stdout:4/525: rename d5/f2e to d5/de/d2f/d8a/fba 0 2026-03-09T16:15:16.599 INFO:tasks.workunit.client.1.vm05.stdout:5/484: mkdir d8/d59/d5b/db3 0 2026-03-09T16:15:16.599 INFO:tasks.workunit.client.1.vm05.stdout:1/558: dread d7/dd/de/d52/d5b/f5e [0,4194304] 0 2026-03-09T16:15:16.599 INFO:tasks.workunit.client.1.vm05.stdout:9/564: symlink d4/d10/d35/d36/d48/d4c/lbd 0 2026-03-09T16:15:16.599 INFO:tasks.workunit.client.1.vm05.stdout:3/426: dwrite d0/d33/f77 [0,4194304] 0 2026-03-09T16:15:16.599 INFO:tasks.workunit.client.1.vm05.stdout:7/573: creat d1/d2/d11/d86/d8a/fc6 x:0 0 0 2026-03-09T16:15:16.604 INFO:tasks.workunit.client.1.vm05.stdout:3/427: read d0/f7c [4097905,56917] 0 2026-03-09T16:15:16.616 INFO:tasks.workunit.client.1.vm05.stdout:6/490: dread d17/d5d/f84 [0,4194304] 0 2026-03-09T16:15:16.624 INFO:tasks.workunit.client.1.vm05.stdout:1/559: rename d7/dd/de/d52/d5b/fb8 to d7/d15/d16/dc2/fc7 0 2026-03-09T16:15:16.626 INFO:tasks.workunit.client.1.vm05.stdout:4/526: mkdir d5/de/d15/da9/db1/dad/d90/dbb 0 2026-03-09T16:15:16.627 INFO:tasks.workunit.client.1.vm05.stdout:6/491: write d17/f4e [1653125,54341] 0 2026-03-09T16:15:16.637 INFO:tasks.workunit.client.1.vm05.stdout:7/574: symlink d1/d2/d8/dc/d1b/d30/d4b/lc7 0 2026-03-09T16:15:16.637 INFO:tasks.workunit.client.1.vm05.stdout:2/502: creat db/fa4 x:0 0 0 2026-03-09T16:15:16.637 INFO:tasks.workunit.client.1.vm05.stdout:4/527: rename d5/de/d15/l94 to d5/de/d82/lbc 0 2026-03-09T16:15:16.637 INFO:tasks.workunit.client.1.vm05.stdout:2/503: write db/dd/d15/d46/f91 [950933,86780] 0 2026-03-09T16:15:16.637 INFO:tasks.workunit.client.1.vm05.stdout:1/560: mknod d7/d15/d6e/dbc/cc8 0 2026-03-09T16:15:16.637 INFO:tasks.workunit.client.1.vm05.stdout:1/561: write d7/d15/d16/f29 [1025249,79984] 0 2026-03-09T16:15:16.637 INFO:tasks.workunit.client.1.vm05.stdout:7/575: creat d1/d2/d8/d31/d8d/d5d/fc8 x:0 0 0 2026-03-09T16:15:16.637 INFO:tasks.workunit.client.1.vm05.stdout:1/562: readlink d7/dd/de/d52/d5b/la2 0 2026-03-09T16:15:16.637 INFO:tasks.workunit.client.1.vm05.stdout:2/504: dwrite f7 [0,4194304] 0 2026-03-09T16:15:16.646 INFO:tasks.workunit.client.1.vm05.stdout:1/563: dwrite d7/fc [4194304,4194304] 0 2026-03-09T16:15:16.649 INFO:tasks.workunit.client.1.vm05.stdout:4/528: rmdir d5/de/d15/d21/d27/d3c/d5c/d5f/d4e 39 2026-03-09T16:15:16.650 INFO:tasks.workunit.client.1.vm05.stdout:1/564: mknod d7/d15/d16/cc9 0 2026-03-09T16:15:16.651 INFO:tasks.workunit.client.1.vm05.stdout:7/576: creat d1/d2/d11/d86/fc9 x:0 0 0 2026-03-09T16:15:16.652 INFO:tasks.workunit.client.1.vm05.stdout:2/505: creat db/fa5 x:0 0 0 2026-03-09T16:15:16.653 INFO:tasks.workunit.client.1.vm05.stdout:7/577: read d1/d2/d8/dc/f1a [626332,83365] 0 2026-03-09T16:15:16.653 INFO:tasks.workunit.client.1.vm05.stdout:2/506: stat db/dd/d15/d46/d8d 0 2026-03-09T16:15:16.654 INFO:tasks.workunit.client.1.vm05.stdout:7/578: readlink d1/d2/d8/d31/d8d/l3f 0 2026-03-09T16:15:16.657 INFO:tasks.workunit.client.1.vm05.stdout:3/428: dread 
d0/d9/d22/f5b [0,4194304] 0 2026-03-09T16:15:16.659 INFO:tasks.workunit.client.1.vm05.stdout:7/579: symlink d1/d2/d11/d86/lca 0 2026-03-09T16:15:16.662 INFO:tasks.workunit.client.1.vm05.stdout:3/429: mknod d0/d9/d22/d5f/c8f 0 2026-03-09T16:15:16.663 INFO:tasks.workunit.client.1.vm05.stdout:3/430: dread d0/d9/f51 [0,4194304] 0 2026-03-09T16:15:16.664 INFO:tasks.workunit.client.1.vm05.stdout:7/580: rename d1/d2/d11/d86/fa6 to d1/d2/d8/fcb 0 2026-03-09T16:15:16.666 INFO:tasks.workunit.client.1.vm05.stdout:7/581: creat d1/d2/d8/dc/d1b/d30/d4b/fcc x:0 0 0 2026-03-09T16:15:16.672 INFO:tasks.workunit.client.1.vm05.stdout:7/582: dwrite d1/d2/d8/dc/d1b/d71/f46 [0,4194304] 0 2026-03-09T16:15:16.680 INFO:tasks.workunit.client.1.vm05.stdout:7/583: creat d1/d2/d8/dc/d33/fcd x:0 0 0 2026-03-09T16:15:16.682 INFO:tasks.workunit.client.1.vm05.stdout:3/431: sync 2026-03-09T16:15:16.685 INFO:tasks.workunit.client.1.vm05.stdout:7/584: dwrite d1/d2/d8/dc/d1b/d71/d3c/f60 [0,4194304] 0 2026-03-09T16:15:16.689 INFO:tasks.workunit.client.1.vm05.stdout:3/432: sync 2026-03-09T16:15:16.689 INFO:tasks.workunit.client.1.vm05.stdout:3/433: write d0/d33/f3a [1280918,61092] 0 2026-03-09T16:15:16.690 INFO:tasks.workunit.client.1.vm05.stdout:3/434: read - d0/d33/f41 zero size 2026-03-09T16:15:16.691 INFO:tasks.workunit.client.1.vm05.stdout:3/435: chown d0/d33/c6c 4592863 1 2026-03-09T16:15:16.694 INFO:tasks.workunit.client.1.vm05.stdout:7/585: mknod d1/d2/d8/dc/d14/cce 0 2026-03-09T16:15:16.696 INFO:tasks.workunit.client.1.vm05.stdout:7/586: unlink d1/d2/d11/d86/d8a/fc6 0 2026-03-09T16:15:16.704 INFO:tasks.workunit.client.1.vm05.stdout:8/481: write d4/d6/f9 [6330156,130139] 0 2026-03-09T16:15:16.706 INFO:tasks.workunit.client.1.vm05.stdout:0/516: truncate d5/d11/f1e 3035910 0 2026-03-09T16:15:16.708 INFO:tasks.workunit.client.1.vm05.stdout:5/485: write d8/d18/f20 [121184,35424] 0 2026-03-09T16:15:16.714 INFO:tasks.workunit.client.1.vm05.stdout:7/587: mknod d1/d2/d8/dc/d1b/d71/ccf 0 2026-03-09T16:15:16.715 INFO:tasks.workunit.client.1.vm05.stdout:5/486: dwrite d8/d59/f5c [0,4194304] 0 2026-03-09T16:15:16.717 INFO:tasks.workunit.client.1.vm05.stdout:8/482: rmdir d4/d6/db/dc 39 2026-03-09T16:15:16.717 INFO:tasks.workunit.client.1.vm05.stdout:0/517: symlink d5/d2c/d49/la8 0 2026-03-09T16:15:16.719 INFO:tasks.workunit.client.1.vm05.stdout:8/483: write d4/d6/f9 [8158880,128129] 0 2026-03-09T16:15:16.721 INFO:tasks.workunit.client.1.vm05.stdout:8/484: fsync d4/d6/d3a/f8e 0 2026-03-09T16:15:16.726 INFO:tasks.workunit.client.1.vm05.stdout:8/485: fdatasync d4/d6/f5f 0 2026-03-09T16:15:16.728 INFO:tasks.workunit.client.1.vm05.stdout:8/486: symlink d4/d6/d3a/d15/d83/la4 0 2026-03-09T16:15:16.728 INFO:tasks.workunit.client.1.vm05.stdout:0/518: dwrite d5/d2c/d49/d83/d8b/d95/f60 [0,4194304] 0 2026-03-09T16:15:16.728 INFO:tasks.workunit.client.1.vm05.stdout:8/487: fdatasync d4/d6/d53/f5a 0 2026-03-09T16:15:16.730 INFO:tasks.workunit.client.1.vm05.stdout:8/488: rmdir d4/d6/db/d9b 39 2026-03-09T16:15:16.732 INFO:tasks.workunit.client.1.vm05.stdout:5/487: sync 2026-03-09T16:15:16.735 INFO:tasks.workunit.client.1.vm05.stdout:4/529: dread d5/de/d15/d21/d39/d91/faa [0,4194304] 0 2026-03-09T16:15:16.741 INFO:tasks.workunit.client.1.vm05.stdout:5/488: fsync d8/d59/d5b/d8b/da0/dae/daa/f35 0 2026-03-09T16:15:16.742 INFO:tasks.workunit.client.1.vm05.stdout:4/530: mkdir d5/d9c/dbd 0 2026-03-09T16:15:16.742 INFO:tasks.workunit.client.1.vm05.stdout:0/519: link d5/db/d48/d66/l6c d5/d9e/la9 0 2026-03-09T16:15:16.745 
INFO:tasks.workunit.client.1.vm05.stdout:0/520: rename d5/c45 to d5/d11/d4f/d68/caa 0 2026-03-09T16:15:16.747 INFO:tasks.workunit.client.1.vm05.stdout:4/531: rename d5/de/d15/d21/d39/d5d/d7d/f89 to d5/de/d82/fbe 0 2026-03-09T16:15:16.748 INFO:tasks.workunit.client.1.vm05.stdout:0/521: mknod d5/db/d5f/da3/cab 0 2026-03-09T16:15:16.750 INFO:tasks.workunit.client.1.vm05.stdout:0/522: creat d5/db/d5b/d82/fac x:0 0 0 2026-03-09T16:15:16.751 INFO:tasks.workunit.client.1.vm05.stdout:5/489: dread d8/d18/d1b/f32 [4194304,4194304] 0 2026-03-09T16:15:16.751 INFO:tasks.workunit.client.1.vm05.stdout:4/532: truncate d5/de/d15/f25 1145556 0 2026-03-09T16:15:16.752 INFO:tasks.workunit.client.1.vm05.stdout:5/490: fdatasync d8/d18/f3a 0 2026-03-09T16:15:16.753 INFO:tasks.workunit.client.1.vm05.stdout:0/523: creat d5/db/d48/fad x:0 0 0 2026-03-09T16:15:16.754 INFO:tasks.workunit.client.1.vm05.stdout:5/491: chown d8/d53/d7e/la9 79571898 1 2026-03-09T16:15:16.756 INFO:tasks.workunit.client.1.vm05.stdout:5/492: mknod d8/d59/d5b/d8b/da0/dae/daa/d43/cb4 0 2026-03-09T16:15:16.757 INFO:tasks.workunit.client.1.vm05.stdout:0/524: creat d5/db/d77/fae x:0 0 0 2026-03-09T16:15:16.758 INFO:tasks.workunit.client.1.vm05.stdout:0/525: dread - d5/d2c/f63 zero size 2026-03-09T16:15:16.761 INFO:tasks.workunit.client.1.vm05.stdout:0/526: chown d5/d2c/d49/d83/d8b/d95/f6d 12321827 1 2026-03-09T16:15:16.761 INFO:tasks.workunit.client.1.vm05.stdout:5/493: symlink d8/d18/d1b/d47/d4e/d76/d8f/dab/lb5 0 2026-03-09T16:15:16.767 INFO:tasks.workunit.client.1.vm05.stdout:0/527: mkdir d5/d2c/d49/d83/d8b/daf 0 2026-03-09T16:15:16.767 INFO:tasks.workunit.client.1.vm05.stdout:9/565: dwrite d4/f2e [0,4194304] 0 2026-03-09T16:15:16.771 INFO:tasks.workunit.client.1.vm05.stdout:6/492: truncate d17/d22/d27/f6b 3751717 0 2026-03-09T16:15:16.772 INFO:tasks.workunit.client.1.vm05.stdout:1/565: write d7/dd/d21/d39/d87/fa6 [336826,110571] 0 2026-03-09T16:15:16.775 INFO:tasks.workunit.client.1.vm05.stdout:5/494: chown d8/d18/c56 3961 1 2026-03-09T16:15:16.778 INFO:tasks.workunit.client.1.vm05.stdout:0/528: dwrite d5/d1b/d3b/f6f [0,4194304] 0 2026-03-09T16:15:16.779 INFO:tasks.workunit.client.1.vm05.stdout:6/493: rmdir d17/d1d 39 2026-03-09T16:15:16.782 INFO:tasks.workunit.client.1.vm05.stdout:9/566: read d4/d10/d35/d2b/d38/d65/fa5 [492535,43531] 0 2026-03-09T16:15:16.787 INFO:tasks.workunit.client.1.vm05.stdout:0/529: chown d5/db/c38 29793696 1 2026-03-09T16:15:16.790 INFO:tasks.workunit.client.1.vm05.stdout:9/567: link d4/d10/d35/c13 d4/d10/d35/d2b/d31/cbe 0 2026-03-09T16:15:16.793 INFO:tasks.workunit.client.1.vm05.stdout:9/568: dread d4/d10/d35/d2b/f2c [0,4194304] 0 2026-03-09T16:15:16.795 INFO:tasks.workunit.client.1.vm05.stdout:9/569: dread - d4/d10/d35/d36/d48/d54/d59/f9f zero size 2026-03-09T16:15:16.796 INFO:tasks.workunit.client.1.vm05.stdout:9/570: fdatasync d4/d10/d35/d2b/d38/f62 0 2026-03-09T16:15:16.801 INFO:tasks.workunit.client.1.vm05.stdout:0/530: sync 2026-03-09T16:15:16.805 INFO:tasks.workunit.client.1.vm05.stdout:0/531: symlink d5/d1b/d30/lb0 0 2026-03-09T16:15:16.806 INFO:tasks.workunit.client.1.vm05.stdout:2/507: write db/dd/d15/d1f/f36 [2549354,90812] 0 2026-03-09T16:15:16.808 INFO:tasks.workunit.client.1.vm05.stdout:0/532: link d5/f5c d5/db/d5f/da3/fb1 0 2026-03-09T16:15:16.809 INFO:tasks.workunit.client.1.vm05.stdout:2/508: write db/dd/d15/d3f/d5b/d60/d95/f80 [788012,57520] 0 2026-03-09T16:15:16.811 INFO:tasks.workunit.client.1.vm05.stdout:2/509: rmdir db/dd/d15/d4c/d56 39 2026-03-09T16:15:16.814 
INFO:tasks.workunit.client.1.vm05.stdout:2/510: rename db/dd/d15/d46/f91 to db/dd/d15/d46/fa6 0 2026-03-09T16:15:16.818 INFO:tasks.workunit.client.1.vm05.stdout:0/533: dwrite d5/f79 [0,4194304] 0 2026-03-09T16:15:16.818 INFO:tasks.workunit.client.1.vm05.stdout:1/566: dread d7/dd/d21/d39/d5a/f41 [0,4194304] 0 2026-03-09T16:15:16.819 INFO:tasks.workunit.client.1.vm05.stdout:0/534: chown d5/db/d48/d66/f99 376272 1 2026-03-09T16:15:16.821 INFO:tasks.workunit.client.1.vm05.stdout:0/535: creat d5/db/d48/fb2 x:0 0 0 2026-03-09T16:15:16.827 INFO:tasks.workunit.client.1.vm05.stdout:2/511: sync 2026-03-09T16:15:16.833 INFO:tasks.workunit.client.1.vm05.stdout:0/536: rename d5/db/d5b/c93 to d5/db/cb3 0 2026-03-09T16:15:16.840 INFO:tasks.workunit.client.1.vm05.stdout:0/537: chown d5/d9e 1 1 2026-03-09T16:15:16.852 INFO:tasks.workunit.client.1.vm05.stdout:7/588: truncate d1/d2/d8/dc/d1b/d30/f93 61052 0 2026-03-09T16:15:16.852 INFO:tasks.workunit.client.1.vm05.stdout:2/512: creat db/dd/d15/d4c/d56/fa7 x:0 0 0 2026-03-09T16:15:16.856 INFO:tasks.workunit.client.1.vm05.stdout:5/495: read d8/d59/d5b/d8b/da0/dae/daa/f3c [342812,78568] 0 2026-03-09T16:15:16.858 INFO:tasks.workunit.client.1.vm05.stdout:8/489: getdents d4/d6/d3a/d15/d83 0 2026-03-09T16:15:16.863 INFO:tasks.workunit.client.1.vm05.stdout:7/589: rename d1/d2/d8/d67/d76/c89 to d1/d2/d8/d31/d8d/cd0 0 2026-03-09T16:15:16.863 INFO:tasks.workunit.client.1.vm05.stdout:3/436: mkdir d0/d9/d22/d5f/d90 0 2026-03-09T16:15:16.863 INFO:tasks.workunit.client.1.vm05.stdout:8/490: creat d4/d6/d3a/d15/d83/fa5 x:0 0 0 2026-03-09T16:15:16.863 INFO:tasks.workunit.client.1.vm05.stdout:0/538: dwrite d5/d2c/d49/d83/d8b/d95/f59 [4194304,4194304] 0 2026-03-09T16:15:16.869 INFO:tasks.workunit.client.1.vm05.stdout:7/590: mknod d1/d2/d8/dc/d1b/d30/d4b/d65/db1/cd1 0 2026-03-09T16:15:16.875 INFO:tasks.workunit.client.1.vm05.stdout:4/533: write d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/f63 [826365,27025] 0 2026-03-09T16:15:16.875 INFO:tasks.workunit.client.1.vm05.stdout:3/437: dwrite d0/d9/f6e [0,4194304] 0 2026-03-09T16:15:16.876 INFO:tasks.workunit.client.1.vm05.stdout:4/534: chown d5/de/d15/d21/da0 2637 1 2026-03-09T16:15:16.880 INFO:tasks.workunit.client.1.vm05.stdout:8/491: mkdir d4/d6/db/da6 0 2026-03-09T16:15:16.885 INFO:tasks.workunit.client.1.vm05.stdout:8/492: readlink d4/d6/d3a/d3c/l74 0 2026-03-09T16:15:16.886 INFO:tasks.workunit.client.1.vm05.stdout:7/591: chown d1/d2/c23 504775095 1 2026-03-09T16:15:16.892 INFO:tasks.workunit.client.1.vm05.stdout:3/438: creat d0/d9/d8b/f91 x:0 0 0 2026-03-09T16:15:16.893 INFO:tasks.workunit.client.1.vm05.stdout:7/592: symlink d1/d2/d8/dc/d1b/d71/d3c/ld2 0 2026-03-09T16:15:16.899 INFO:tasks.workunit.client.1.vm05.stdout:4/535: dwrite d5/de/d15/d21/d39/d5d/d7d/f7e [0,4194304] 0 2026-03-09T16:15:16.900 INFO:tasks.workunit.client.1.vm05.stdout:9/571: dwrite d4/f6 [0,4194304] 0 2026-03-09T16:15:16.911 INFO:tasks.workunit.client.1.vm05.stdout:5/496: dread d8/d18/d1b/f32 [0,4194304] 0 2026-03-09T16:15:16.913 INFO:tasks.workunit.client.1.vm05.stdout:3/439: dwrite d0/f7c [4194304,4194304] 0 2026-03-09T16:15:16.918 INFO:tasks.workunit.client.1.vm05.stdout:8/493: creat d4/fa7 x:0 0 0 2026-03-09T16:15:16.924 INFO:tasks.workunit.client.1.vm05.stdout:9/572: unlink d4/l41 0 2026-03-09T16:15:16.924 INFO:tasks.workunit.client.1.vm05.stdout:9/573: chown d4/d10/laf 126 1 2026-03-09T16:15:16.925 INFO:tasks.workunit.client.1.vm05.stdout:9/574: write d4/d10/f8d [2402036,10766] 0 2026-03-09T16:15:16.931 INFO:tasks.workunit.client.1.vm05.stdout:3/440: creat 
d0/d33/f92 x:0 0 0 2026-03-09T16:15:16.935 INFO:tasks.workunit.client.1.vm05.stdout:5/497: creat d8/d5e/fb6 x:0 0 0 2026-03-09T16:15:16.940 INFO:tasks.workunit.client.1.vm05.stdout:5/498: fsync d8/d1d/f44 0 2026-03-09T16:15:16.940 INFO:tasks.workunit.client.1.vm05.stdout:5/499: chown d8/d18/d1b/d78/d90 0 1 2026-03-09T16:15:16.941 INFO:tasks.workunit.client.1.vm05.stdout:3/441: creat d0/d9/f93 x:0 0 0 2026-03-09T16:15:16.948 INFO:tasks.workunit.client.1.vm05.stdout:2/513: dread db/dd/d15/d46/fa6 [0,4194304] 0 2026-03-09T16:15:16.949 INFO:tasks.workunit.client.1.vm05.stdout:3/442: dwrite d0/d9/d22/f2e [0,4194304] 0 2026-03-09T16:15:16.951 INFO:tasks.workunit.client.1.vm05.stdout:0/539: truncate d5/db/f54 1543617 0 2026-03-09T16:15:16.958 INFO:tasks.workunit.client.1.vm05.stdout:9/575: symlink d4/d10/d35/lbf 0 2026-03-09T16:15:16.958 INFO:tasks.workunit.client.1.vm05.stdout:5/500: getdents d8/d5e/d8e 0 2026-03-09T16:15:16.963 INFO:tasks.workunit.client.1.vm05.stdout:8/494: fdatasync d4/d6/db/dc/d5d/d79/f91 0 2026-03-09T16:15:16.966 INFO:tasks.workunit.client.1.vm05.stdout:1/567: dread d7/d62/f90 [4194304,4194304] 0 2026-03-09T16:15:16.966 INFO:tasks.workunit.client.1.vm05.stdout:2/514: write db/dd/d15/d3f/f4a [2789155,119329] 0 2026-03-09T16:15:16.968 INFO:tasks.workunit.client.1.vm05.stdout:4/536: dread d5/de/d15/f25 [0,4194304] 0 2026-03-09T16:15:16.970 INFO:tasks.workunit.client.1.vm05.stdout:5/501: mknod d8/d53/d7e/cb7 0 2026-03-09T16:15:16.971 INFO:tasks.workunit.client.1.vm05.stdout:9/576: write d4/d10/d35/d36/f67 [342901,24962] 0 2026-03-09T16:15:16.971 INFO:tasks.workunit.client.1.vm05.stdout:2/515: unlink db/dd/d15/d4c/d56/fa7 0 2026-03-09T16:15:16.972 INFO:tasks.workunit.client.1.vm05.stdout:8/495: fsync d4/d6/db/dc/d5d/d79/f91 0 2026-03-09T16:15:16.979 INFO:tasks.workunit.client.1.vm05.stdout:8/496: chown d4/d6/d3a/f28 266951 1 2026-03-09T16:15:16.985 INFO:tasks.workunit.client.1.vm05.stdout:2/516: mknod db/dd/d15/d3f/ca8 0 2026-03-09T16:15:16.988 INFO:tasks.workunit.client.1.vm05.stdout:8/497: dwrite d4/d6/db/dc/fa2 [0,4194304] 0 2026-03-09T16:15:16.995 INFO:tasks.workunit.client.1.vm05.stdout:4/537: fdatasync d5/de/d82/fbe 0 2026-03-09T16:15:17.002 INFO:tasks.workunit.client.1.vm05.stdout:5/502: symlink d8/lb8 0 2026-03-09T16:15:17.013 INFO:tasks.workunit.client.1.vm05.stdout:4/538: rename d5/de/d15/d21/d39/d5d to d5/de/d15/da9/db1/dad/d37/d60/dbf 0 2026-03-09T16:15:17.016 INFO:tasks.workunit.client.1.vm05.stdout:2/517: creat db/dd/d15/d3f/d5b/d60/da2/fa9 x:0 0 0 2026-03-09T16:15:17.020 INFO:tasks.workunit.client.1.vm05.stdout:1/568: write d7/dd/f1f [4510249,89182] 0 2026-03-09T16:15:17.021 INFO:tasks.workunit.client.1.vm05.stdout:3/443: write d0/d9/d22/f2a [3462509,71631] 0 2026-03-09T16:15:17.025 INFO:tasks.workunit.client.1.vm05.stdout:6/494: dread d17/d22/d27/d34/d4b/f5a [0,4194304] 0 2026-03-09T16:15:17.027 INFO:tasks.workunit.client.1.vm05.stdout:1/569: rename d7/dd/d21/d3b/d55/d95 to d7/dbe/dca 0 2026-03-09T16:15:17.027 INFO:tasks.workunit.client.1.vm05.stdout:1/570: dread - d7/d27/f64 zero size 2026-03-09T16:15:17.028 INFO:tasks.workunit.client.1.vm05.stdout:1/571: chown d7/d15/d16/f53 8778 1 2026-03-09T16:15:17.028 INFO:tasks.workunit.client.1.vm05.stdout:8/498: link d4/d6/db/dc/d2e/c99 d4/d6/db/d75/ca8 0 2026-03-09T16:15:17.030 INFO:tasks.workunit.client.1.vm05.stdout:8/499: dread d4/d6/db/d59/f60 [0,4194304] 0 2026-03-09T16:15:17.031 INFO:tasks.workunit.client.1.vm05.stdout:1/572: read d7/f3f [45504,58988] 0 2026-03-09T16:15:17.033 
INFO:tasks.workunit.client.1.vm05.stdout:6/495: creat d17/d4f/fbd x:0 0 0 2026-03-09T16:15:17.034 INFO:tasks.workunit.client.1.vm05.stdout:6/496: write d17/d22/d27/d8a/f88 [1245751,3710] 0 2026-03-09T16:15:17.037 INFO:tasks.workunit.client.1.vm05.stdout:3/444: rename d0/d9/c8d to d0/d9/d22/d6b/c94 0 2026-03-09T16:15:17.038 INFO:tasks.workunit.client.1.vm05.stdout:1/573: mknod d7/d62/da3/ccb 0 2026-03-09T16:15:17.042 INFO:tasks.workunit.client.1.vm05.stdout:6/497: dwrite d17/d4f/fbd [0,4194304] 0 2026-03-09T16:15:17.046 INFO:tasks.workunit.client.1.vm05.stdout:1/574: stat d7/dd/d21/d39/d48/f59 0 2026-03-09T16:15:17.055 INFO:tasks.workunit.client.1.vm05.stdout:5/503: write d8/d59/d5b/d8b/da0/dae/daa/f3c [1918827,75778] 0 2026-03-09T16:15:17.060 INFO:tasks.workunit.client.1.vm05.stdout:4/539: dwrite d5/f6 [0,4194304] 0 2026-03-09T16:15:17.066 INFO:tasks.workunit.client.1.vm05.stdout:3/445: unlink d0/d9/f6e 0 2026-03-09T16:15:17.068 INFO:tasks.workunit.client.1.vm05.stdout:1/575: mkdir d7/dd/d21/d44/dcc 0 2026-03-09T16:15:17.069 INFO:tasks.workunit.client.1.vm05.stdout:8/500: rename d4/d6/d3a/d3c/f6f to d4/d6/db/dc/fa9 0 2026-03-09T16:15:17.072 INFO:tasks.workunit.client.1.vm05.stdout:8/501: chown d4/d6/d53 62 1 2026-03-09T16:15:17.079 INFO:tasks.workunit.client.1.vm05.stdout:1/576: dwrite d7/fc [4194304,4194304] 0 2026-03-09T16:15:17.082 INFO:tasks.workunit.client.1.vm05.stdout:5/504: mknod d8/d18/d1b/d47/d48/d73/cb9 0 2026-03-09T16:15:17.086 INFO:tasks.workunit.client.1.vm05.stdout:4/540: truncate d5/f2d 706136 0 2026-03-09T16:15:17.089 INFO:tasks.workunit.client.1.vm05.stdout:4/541: fdatasync d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/f86 0 2026-03-09T16:15:17.093 INFO:tasks.workunit.client.1.vm05.stdout:8/502: symlink d4/d6/d3a/laa 0 2026-03-09T16:15:17.093 INFO:tasks.workunit.client.1.vm05.stdout:8/503: write d4/d6/f1b [3703024,123993] 0 2026-03-09T16:15:17.098 INFO:tasks.workunit.client.1.vm05.stdout:3/446: mknod d0/d9/d22/c95 0 2026-03-09T16:15:17.102 INFO:tasks.workunit.client.1.vm05.stdout:8/504: dread d4/d6/db/dc/d5d/d79/f91 [0,4194304] 0 2026-03-09T16:15:17.104 INFO:tasks.workunit.client.1.vm05.stdout:8/505: rename d4/d6/l2f to d4/d6/d9a/lab 0 2026-03-09T16:15:17.106 INFO:tasks.workunit.client.1.vm05.stdout:5/505: link d8/d53/d7e/cb7 d8/d59/d5b/cba 0 2026-03-09T16:15:17.113 INFO:tasks.workunit.client.1.vm05.stdout:4/542: getdents d5/de/d15 0 2026-03-09T16:15:17.114 INFO:tasks.workunit.client.1.vm05.stdout:4/543: chown d5/de/d15/da9/db1/dad/c4c 16 1 2026-03-09T16:15:17.115 INFO:tasks.workunit.client.1.vm05.stdout:4/544: write d5/de/d15/d21/d27/f7a [977766,48957] 0 2026-03-09T16:15:17.115 INFO:tasks.workunit.client.1.vm05.stdout:4/545: dread - d5/de/d15/d21/f79 zero size 2026-03-09T16:15:17.116 INFO:tasks.workunit.client.1.vm05.stdout:4/546: stat d5/de/d15/d21/d27/d3c/d5c/d5f/d4e 0 2026-03-09T16:15:17.117 INFO:tasks.workunit.client.1.vm05.stdout:8/506: unlink d4/d6/db/d9b/f9e 0 2026-03-09T16:15:17.123 INFO:tasks.workunit.client.1.vm05.stdout:6/498: dwrite d17/d5d/f71 [4194304,4194304] 0 2026-03-09T16:15:17.126 INFO:tasks.workunit.client.1.vm05.stdout:4/547: symlink d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d7b/lc0 0 2026-03-09T16:15:17.127 INFO:tasks.workunit.client.1.vm05.stdout:6/499: truncate d17/d22/d27/d8a/fa1 666291 0 2026-03-09T16:15:17.132 INFO:tasks.workunit.client.1.vm05.stdout:8/507: mknod d4/d6/db/df/d4f/cac 0 2026-03-09T16:15:17.139 INFO:tasks.workunit.client.1.vm05.stdout:4/548: fsync d5/de/d15/d21/d27/f29 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:3/447: dread 
d0/d9/d22/f54 [0,4194304] 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:8/508: truncate d4/d6/db/dc/f26 359792 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:8/509: fsync d4/d6/d3a/d3c/f45 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:8/510: chown d4/d6/db/dc/l48 7091 1 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:4/549: fdatasync d5/de/d15/da9/db1/dad/f32 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:6/500: dread d17/f1c [0,4194304] 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:4/550: readlink d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d7b/lc0 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:6/501: mkdir d17/d22/d27/d34/d42/d65/dbe 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:7/593: dread d1/d2/f22 [0,4194304] 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:6/502: symlink d17/d22/d27/d58/lbf 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:4/551: getdents d5/de/d15/d21/da0 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:6/503: write d17/d4f/f93 [310530,101684] 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:7/594: write d1/d2/d8/dc/d1b/f5a [925713,77195] 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:4/552: stat d5/de/d15/da9/db1/dad/d90/dbb 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:6/504: dread d17/f4a [0,4194304] 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:7/595: dwrite d1/d2/d8/dc/d1b/d71/d3c/f95 [0,4194304] 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:6/505: rmdir d17/d5d/d73/d83 39 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:6/506: fsync d17/f5b 0 2026-03-09T16:15:17.179 INFO:tasks.workunit.client.1.vm05.stdout:7/596: mkdir d1/d2/d8/dc/d1b/d71/d3c/dd3 0 2026-03-09T16:15:17.184 INFO:tasks.workunit.client.1.vm05.stdout:7/597: rename d1/d2/d8/dc/d72 to d1/d2/d8/dc/dd4 0 2026-03-09T16:15:17.198 INFO:tasks.workunit.client.1.vm05.stdout:4/553: dread d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/f63 [0,4194304] 0 2026-03-09T16:15:17.201 INFO:tasks.workunit.client.1.vm05.stdout:4/554: mkdir d5/de/d82/dc1 0 2026-03-09T16:15:17.201 INFO:tasks.workunit.client.1.vm05.stdout:4/555: dread - d5/d9c/fa8 zero size 2026-03-09T16:15:17.204 INFO:tasks.workunit.client.1.vm05.stdout:4/556: mknod d5/de/d15/cc2 0 2026-03-09T16:15:17.207 INFO:tasks.workunit.client.1.vm05.stdout:7/598: dwrite d1/d2/d8/dc/d1b/d30/d4b/d65/f8f [0,4194304] 0 2026-03-09T16:15:17.214 INFO:tasks.workunit.client.1.vm05.stdout:7/599: dwrite d1/d2/d8/dc/d9c/f6b [0,4194304] 0 2026-03-09T16:15:17.226 INFO:tasks.workunit.client.1.vm05.stdout:7/600: symlink d1/d2/d8/d31/d8d/ld5 0 2026-03-09T16:15:17.226 INFO:tasks.workunit.client.1.vm05.stdout:7/601: creat d1/d2/d8/dc/dd4/fd6 x:0 0 0 2026-03-09T16:15:17.226 INFO:tasks.workunit.client.1.vm05.stdout:7/602: dread - d1/d2/d8/f9a zero size 2026-03-09T16:15:17.226 INFO:tasks.workunit.client.1.vm05.stdout:7/603: creat d1/d2/d8/dc/d1b/d71/d3c/dd3/fd7 x:0 0 0 2026-03-09T16:15:17.226 INFO:tasks.workunit.client.1.vm05.stdout:7/604: fsync d1/d2/d8/f9a 0 2026-03-09T16:15:17.230 INFO:tasks.workunit.client.1.vm05.stdout:7/605: dwrite d1/d2/d8/dc/d1b/d30/d7d/fa7 [0,4194304] 0 2026-03-09T16:15:17.240 INFO:tasks.workunit.client.1.vm05.stdout:7/606: unlink d1/d2/c2f 0 2026-03-09T16:15:17.242 INFO:tasks.workunit.client.1.vm05.stdout:1/577: write d7/dd/de/f56 [778647,74817] 0 
2026-03-09T16:15:17.245 INFO:tasks.workunit.client.1.vm05.stdout:1/578: read d7/f4b [38689,77623] 0 2026-03-09T16:15:17.248 INFO:tasks.workunit.client.1.vm05.stdout:1/579: creat d7/dd/d21/d44/dcc/fcd x:0 0 0 2026-03-09T16:15:17.251 INFO:tasks.workunit.client.1.vm05.stdout:1/580: link d7/d15/d45/l92 d7/d62/da5/lce 0 2026-03-09T16:15:17.253 INFO:tasks.workunit.client.1.vm05.stdout:1/581: symlink d7/dd/d21/d39/d5a/d50/lcf 0 2026-03-09T16:15:17.254 INFO:tasks.workunit.client.1.vm05.stdout:5/506: rmdir d8/d53/d7e 39 2026-03-09T16:15:17.256 INFO:tasks.workunit.client.1.vm05.stdout:8/511: sync 2026-03-09T16:15:17.257 INFO:tasks.workunit.client.1.vm05.stdout:1/582: creat d7/dd/d21/d63/d71/fd0 x:0 0 0 2026-03-09T16:15:17.259 INFO:tasks.workunit.client.1.vm05.stdout:1/583: truncate d7/d27/f57 348357 0 2026-03-09T16:15:17.262 INFO:tasks.workunit.client.1.vm05.stdout:8/512: rename d4/d6/db/dc/c98 to d4/d6/db/dc/d2e/cad 0 2026-03-09T16:15:17.270 INFO:tasks.workunit.client.1.vm05.stdout:8/513: dwrite d4/d6/d3a/d15/d83/f9d [0,4194304] 0 2026-03-09T16:15:17.283 INFO:tasks.workunit.client.1.vm05.stdout:8/514: dwrite d4/f3e [0,4194304] 0 2026-03-09T16:15:17.285 INFO:tasks.workunit.client.1.vm05.stdout:8/515: chown d4/d6/d3a/d40/f4e 0 1 2026-03-09T16:15:17.292 INFO:tasks.workunit.client.1.vm05.stdout:6/507: truncate d17/d1d/f67 3041807 0 2026-03-09T16:15:17.296 INFO:tasks.workunit.client.1.vm05.stdout:8/516: creat d4/d6/fae x:0 0 0 2026-03-09T16:15:17.296 INFO:tasks.workunit.client.1.vm05.stdout:6/508: rmdir d17/d22/d9d/da9 39 2026-03-09T16:15:17.296 INFO:tasks.workunit.client.1.vm05.stdout:8/517: chown d4/d6/db/dc 22700 1 2026-03-09T16:15:17.299 INFO:tasks.workunit.client.1.vm05.stdout:6/509: symlink d17/d22/d27/d58/lc0 0 2026-03-09T16:15:17.300 INFO:tasks.workunit.client.1.vm05.stdout:8/518: symlink d4/d6/db/d9b/laf 0 2026-03-09T16:15:17.301 INFO:tasks.workunit.client.1.vm05.stdout:6/510: chown d17/d22/d27/d34/d42/d53 2841973 1 2026-03-09T16:15:17.301 INFO:tasks.workunit.client.1.vm05.stdout:8/519: write d4/f77 [207447,74476] 0 2026-03-09T16:15:17.303 INFO:tasks.workunit.client.1.vm05.stdout:7/607: dwrite d1/d2/d8/d31/f51 [0,4194304] 0 2026-03-09T16:15:17.304 INFO:tasks.workunit.client.1.vm05.stdout:6/511: truncate d17/d22/d9d/fb2 4279382 0 2026-03-09T16:15:17.304 INFO:tasks.workunit.client.1.vm05.stdout:8/520: write d4/d6/f58 [4297450,38948] 0 2026-03-09T16:15:17.312 INFO:tasks.workunit.client.1.vm05.stdout:8/521: rmdir d4/d92 39 2026-03-09T16:15:17.312 INFO:tasks.workunit.client.1.vm05.stdout:5/507: dwrite d8/d18/d1b/d47/d68/f70 [0,4194304] 0 2026-03-09T16:15:17.313 INFO:tasks.workunit.client.1.vm05.stdout:7/608: symlink d1/d2/d8/dc/d1b/d71/ld8 0 2026-03-09T16:15:17.313 INFO:tasks.workunit.client.1.vm05.stdout:6/512: link d17/d5d/f71 d17/d1d/fc1 0 2026-03-09T16:15:17.320 INFO:tasks.workunit.client.1.vm05.stdout:5/508: rmdir d8/d59/d5b/d8b/da0/dae/daa 39 2026-03-09T16:15:17.320 INFO:tasks.workunit.client.1.vm05.stdout:4/557: rename d5/de/d15/da9/db1/f67 to d5/de/d15/d21/d39/d91/fc3 0 2026-03-09T16:15:17.323 INFO:tasks.workunit.client.1.vm05.stdout:5/509: write d8/f55 [768408,7830] 0 2026-03-09T16:15:17.327 INFO:tasks.workunit.client.1.vm05.stdout:6/513: mknod d17/d22/d27/d34/d42/d65/dbe/cc2 0 2026-03-09T16:15:17.327 INFO:tasks.workunit.client.1.vm05.stdout:4/558: mknod d5/d9c/dbd/cc4 0 2026-03-09T16:15:17.328 INFO:tasks.workunit.client.1.vm05.stdout:4/559: creat d5/de/d2f/fc5 x:0 0 0 2026-03-09T16:15:17.329 INFO:tasks.workunit.client.1.vm05.stdout:4/560: stat d5/de/d15/d21/f26 0 2026-03-09T16:15:17.330 
INFO:tasks.workunit.client.1.vm05.stdout:6/514: mknod d17/d22/d27/d58/cc3 0 2026-03-09T16:15:17.331 INFO:tasks.workunit.client.1.vm05.stdout:6/515: fdatasync d17/d1d/f67 0 2026-03-09T16:15:17.336 INFO:tasks.workunit.client.1.vm05.stdout:6/516: chown d17/d22/d27/d8a/fa1 3 1 2026-03-09T16:15:17.336 INFO:tasks.workunit.client.1.vm05.stdout:6/517: rmdir d17/d22/d27/d8a 39 2026-03-09T16:15:17.339 INFO:tasks.workunit.client.1.vm05.stdout:6/518: dwrite d17/d1d/fc1 [0,4194304] 0 2026-03-09T16:15:17.341 INFO:tasks.workunit.client.1.vm05.stdout:1/584: dwrite d7/d15/d45/f67 [0,4194304] 0 2026-03-09T16:15:17.348 INFO:tasks.workunit.client.1.vm05.stdout:1/585: symlink d7/d15/d16/dc2/ld1 0 2026-03-09T16:15:17.348 INFO:tasks.workunit.client.1.vm05.stdout:7/609: sync 2026-03-09T16:15:17.349 INFO:tasks.workunit.client.1.vm05.stdout:7/610: fsync d1/d2/d8/d31/d8d/f6f 0 2026-03-09T16:15:17.350 INFO:tasks.workunit.client.1.vm05.stdout:7/611: readlink d1/d2/d8/d31/d8d/ld5 0 2026-03-09T16:15:17.356 INFO:tasks.workunit.client.1.vm05.stdout:1/586: creat d7/d62/fd2 x:0 0 0 2026-03-09T16:15:17.356 INFO:tasks.workunit.client.1.vm05.stdout:7/612: dwrite d1/d2/d8/d31/fc5 [0,4194304] 0 2026-03-09T16:15:17.358 INFO:tasks.workunit.client.1.vm05.stdout:6/519: dwrite d17/d22/d27/d34/f85 [0,4194304] 0 2026-03-09T16:15:17.363 INFO:tasks.workunit.client.1.vm05.stdout:6/520: stat d17/d1d/fc1 0 2026-03-09T16:15:17.368 INFO:tasks.workunit.client.1.vm05.stdout:1/587: unlink d7/dd/d21/d39/d87/lc0 0 2026-03-09T16:15:17.368 INFO:tasks.workunit.client.1.vm05.stdout:1/588: write d7/dd/d21/f3d [1071032,10843] 0 2026-03-09T16:15:17.369 INFO:tasks.workunit.client.1.vm05.stdout:7/613: creat d1/d2/d8/dc/d1b/d30/d4b/db2/fd9 x:0 0 0 2026-03-09T16:15:17.372 INFO:tasks.workunit.client.1.vm05.stdout:7/614: rmdir d1/d2/d8/d31/d8d/d5d 39 2026-03-09T16:15:17.373 INFO:tasks.workunit.client.1.vm05.stdout:7/615: dread - d1/d2/d11/d86/d8a/fa3 zero size 2026-03-09T16:15:17.374 INFO:tasks.workunit.client.1.vm05.stdout:7/616: unlink d1/d2/d8/d31/d8d/fa9 0 2026-03-09T16:15:17.375 INFO:tasks.workunit.client.1.vm05.stdout:1/589: mknod d7/dd/d21/d39/d87/db9/cd3 0 2026-03-09T16:15:17.380 INFO:tasks.workunit.client.1.vm05.stdout:4/561: dread d5/de/d2f/d8a/fba [0,4194304] 0 2026-03-09T16:15:17.381 INFO:tasks.workunit.client.1.vm05.stdout:4/562: chown d5/de/l43 2076851 1 2026-03-09T16:15:17.384 INFO:tasks.workunit.client.1.vm05.stdout:1/590: rmdir d7/dd/db3 0 2026-03-09T16:15:17.387 INFO:tasks.workunit.client.1.vm05.stdout:4/563: dwrite d5/de/d15/f34 [0,4194304] 0 2026-03-09T16:15:17.390 INFO:tasks.workunit.client.1.vm05.stdout:6/521: sync 2026-03-09T16:15:17.390 INFO:tasks.workunit.client.1.vm05.stdout:4/564: stat d5/de/d15/d21/d27/d3c/f3d 0 2026-03-09T16:15:17.405 INFO:tasks.workunit.client.1.vm05.stdout:4/565: unlink d5/de/d15/d21/d27/d3c/f4d 0 2026-03-09T16:15:17.406 INFO:tasks.workunit.client.1.vm05.stdout:6/522: mknod d17/d22/d27/d34/d4b/d7f/cc4 0 2026-03-09T16:15:17.407 INFO:tasks.workunit.client.1.vm05.stdout:4/566: rename d5/de/d15/da9/db1/dad/d37/d60/dbf/f6a to d5/de/d15/da9/fc6 0 2026-03-09T16:15:17.408 INFO:tasks.workunit.client.1.vm05.stdout:6/523: rmdir d17/d22/d9d/da9 39 2026-03-09T16:15:17.410 INFO:tasks.workunit.client.1.vm05.stdout:4/567: truncate d5/de/d15/da9/db1/dad/f48 3576357 0 2026-03-09T16:15:17.411 INFO:tasks.workunit.client.1.vm05.stdout:6/524: symlink d17/d22/d9d/db4/lc5 0 2026-03-09T16:15:17.413 INFO:tasks.workunit.client.1.vm05.stdout:6/525: chown d17/f4e 55 1 2026-03-09T16:15:17.414 INFO:tasks.workunit.client.1.vm05.stdout:4/568: 
creat d5/de/d15/da9/db1/dad/d37/d60/dbf/fc7 x:0 0 0 2026-03-09T16:15:17.415 INFO:tasks.workunit.client.1.vm05.stdout:4/569: fsync d5/de/d15/da9/db1/dad/d37/fa5 0 2026-03-09T16:15:17.418 INFO:tasks.workunit.client.1.vm05.stdout:6/526: rename d17/d22/d9d/db4/lc5 to d17/d22/d27/d34/d4b/d7f/lc6 0 2026-03-09T16:15:17.419 INFO:tasks.workunit.client.1.vm05.stdout:4/570: rename d5/de/d15/da9/db1/dad/d37/d60/f93 to d5/de/d15/d21/d27/fc8 0 2026-03-09T16:15:17.420 INFO:tasks.workunit.client.1.vm05.stdout:6/527: mknod d17/d22/d9d/cc7 0 2026-03-09T16:15:17.421 INFO:tasks.workunit.client.1.vm05.stdout:4/571: mkdir d5/de/d15/d21/d27/d3c/d5c/da2/dc9 0 2026-03-09T16:15:17.445 INFO:tasks.workunit.client.1.vm05.stdout:4/572: sync 2026-03-09T16:15:17.450 INFO:tasks.workunit.client.1.vm05.stdout:4/573: truncate d5/de/f24 46338 0 2026-03-09T16:15:17.451 INFO:tasks.workunit.client.1.vm05.stdout:4/574: stat d5/de 0 2026-03-09T16:15:17.451 INFO:tasks.workunit.client.1.vm05.stdout:5/510: truncate d8/d18/d1b/f36 1669386 0 2026-03-09T16:15:17.454 INFO:tasks.workunit.client.1.vm05.stdout:8/522: dwrite d4/d6/d3a/d40/f7b [0,4194304] 0 2026-03-09T16:15:17.456 INFO:tasks.workunit.client.1.vm05.stdout:5/511: dwrite d8/d18/d1b/d47/d68/f70 [0,4194304] 0 2026-03-09T16:15:17.469 INFO:tasks.workunit.client.1.vm05.stdout:8/523: dread d4/d6/db/d59/f60 [0,4194304] 0 2026-03-09T16:15:17.469 INFO:tasks.workunit.client.1.vm05.stdout:5/512: unlink d8/d18/d1b/c49 0 2026-03-09T16:15:17.469 INFO:tasks.workunit.client.1.vm05.stdout:8/524: write d4/d6/db/dc/d2e/f46 [3751312,13444] 0 2026-03-09T16:15:17.469 INFO:tasks.workunit.client.1.vm05.stdout:8/525: write d4/f1c [3221350,29493] 0 2026-03-09T16:15:17.469 INFO:tasks.workunit.client.1.vm05.stdout:8/526: mkdir d4/d6/db/d59/db0 0 2026-03-09T16:15:17.470 INFO:tasks.workunit.client.1.vm05.stdout:5/513: dread - d8/d18/d1b/d47/d48/f60 zero size 2026-03-09T16:15:17.471 INFO:tasks.workunit.client.1.vm05.stdout:8/527: creat d4/d6/d53/fb1 x:0 0 0 2026-03-09T16:15:17.474 INFO:tasks.workunit.client.1.vm05.stdout:5/514: symlink d8/d53/d7a/lbb 0 2026-03-09T16:15:17.475 INFO:tasks.workunit.client.1.vm05.stdout:5/515: write d8/d59/f5c [3318640,118578] 0 2026-03-09T16:15:17.478 INFO:tasks.workunit.client.1.vm05.stdout:5/516: readlink d8/d59/d5b/d8b/da0/dae/daa/l9a 0 2026-03-09T16:15:17.480 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:17 vm05.local ceph-mon[58702]: pgmap v18: 65 pgs: 65 active+clean; 2.5 GiB data, 8.5 GiB used, 111 GiB / 120 GiB avail; 19 MiB/s rd, 117 MiB/s wr, 226 op/s 2026-03-09T16:15:17.483 INFO:tasks.workunit.client.1.vm05.stdout:5/517: mkdir d8/d18/dbc 0 2026-03-09T16:15:17.485 INFO:tasks.workunit.client.1.vm05.stdout:7/617: truncate d1/d2/d8/dc/d1b/d71/f46 3352927 0 2026-03-09T16:15:17.486 INFO:tasks.workunit.client.1.vm05.stdout:7/618: write d1/d2/d8/dc/d1b/d71/d3c/f60 [4148199,97968] 0 2026-03-09T16:15:17.486 INFO:tasks.workunit.client.1.vm05.stdout:5/518: rmdir d8/d91 0 2026-03-09T16:15:17.488 INFO:tasks.workunit.client.1.vm05.stdout:7/619: chown d1/d2/d8/dc/d1b/d30/d4b 14 1 2026-03-09T16:15:17.489 INFO:tasks.workunit.client.1.vm05.stdout:1/591: truncate d7/d62/f90 5155260 0 2026-03-09T16:15:17.489 INFO:tasks.workunit.client.1.vm05.stdout:5/519: symlink d8/d59/d5b/lbd 0 2026-03-09T16:15:17.491 INFO:tasks.workunit.client.1.vm05.stdout:7/620: creat d1/d2/d8/dc/d9c/fda x:0 0 0 2026-03-09T16:15:17.493 INFO:tasks.workunit.client.1.vm05.stdout:1/592: dwrite d7/d27/f57 [0,4194304] 0 2026-03-09T16:15:17.504 INFO:tasks.workunit.client.1.vm05.stdout:4/575: getdents 
d5/de/d15/da9/db1/dad/d37/d60 0 2026-03-09T16:15:17.510 INFO:tasks.workunit.client.1.vm05.stdout:4/576: dwrite d5/f6 [4194304,4194304] 0 2026-03-09T16:15:17.514 INFO:tasks.workunit.client.1.vm05.stdout:1/593: chown d7/dd/de/d52/cad 111540 1 2026-03-09T16:15:17.521 INFO:tasks.workunit.client.1.vm05.stdout:5/520: link d8/d18/d1b/d47/d4e/c7d d8/d59/d5b/d8b/da0/cbe 0 2026-03-09T16:15:17.521 INFO:tasks.workunit.client.1.vm05.stdout:4/577: chown d5/de/d15/da9/db1/f68 1804 1 2026-03-09T16:15:17.521 INFO:tasks.workunit.client.1.vm05.stdout:5/521: symlink d8/d18/d1b/d78/lbf 0 2026-03-09T16:15:17.521 INFO:tasks.workunit.client.1.vm05.stdout:4/578: write d5/de/d15/da9/db1/dad/d37/f51 [1317297,46223] 0 2026-03-09T16:15:17.521 INFO:tasks.workunit.client.1.vm05.stdout:1/594: dread d7/dd/d21/d39/d5a/f41 [0,4194304] 0 2026-03-09T16:15:17.527 INFO:tasks.workunit.client.1.vm05.stdout:8/528: truncate d4/d6/db/dc/f41 3316606 0 2026-03-09T16:15:17.534 INFO:tasks.workunit.client.1.vm05.stdout:8/529: dread d4/d6/db/dc/d2e/f47 [4194304,4194304] 0 2026-03-09T16:15:17.536 INFO:tasks.workunit.client.1.vm05.stdout:5/522: mknod d8/d18/d1b/d47/d68/cc0 0 2026-03-09T16:15:17.537 INFO:tasks.workunit.client.1.vm05.stdout:1/595: symlink d7/d62/db6/ld4 0 2026-03-09T16:15:17.539 INFO:tasks.workunit.client.1.vm05.stdout:8/530: mknod d4/d6/d3a/d15/d83/cb2 0 2026-03-09T16:15:17.541 INFO:tasks.workunit.client.1.vm05.stdout:5/523: dwrite d8/d59/d5b/d8b/da0/dae/daa/d43/f41 [0,4194304] 0 2026-03-09T16:15:17.542 INFO:tasks.workunit.client.1.vm05.stdout:1/596: symlink d7/d15/d16/dc2/ld5 0 2026-03-09T16:15:17.550 INFO:tasks.workunit.client.1.vm05.stdout:1/597: mkdir d7/d15/d6e/dbc/dd6 0 2026-03-09T16:15:17.550 INFO:tasks.workunit.client.1.vm05.stdout:5/524: creat d8/d59/d5b/d8b/da0/fc1 x:0 0 0 2026-03-09T16:15:17.552 INFO:tasks.workunit.client.1.vm05.stdout:1/598: read d7/d62/d72/f9f [460651,79781] 0 2026-03-09T16:15:17.558 INFO:tasks.workunit.client.1.vm05.stdout:4/579: write d5/de/d82/fbe [362473,129776] 0 2026-03-09T16:15:17.562 INFO:tasks.workunit.client.1.vm05.stdout:4/580: readlink d5/de/d15/da9/db1/dad/la4 0 2026-03-09T16:15:17.563 INFO:tasks.workunit.client.1.vm05.stdout:1/599: unlink d7/dd/d21/d39/d48/d5d/fa0 0 2026-03-09T16:15:17.563 INFO:tasks.workunit.client.1.vm05.stdout:5/525: mknod d8/d18/d1b/cc2 0 2026-03-09T16:15:17.572 INFO:tasks.workunit.client.1.vm05.stdout:1/600: dwrite d7/d62/fd2 [0,4194304] 0 2026-03-09T16:15:17.585 INFO:tasks.workunit.client.1.vm05.stdout:8/531: rename d4/d6/d3a/d15/d83 to d4/d6/d9a/db3 0 2026-03-09T16:15:17.590 INFO:tasks.workunit.client.1.vm05.stdout:1/601: getdents d7/d62/db6 0 2026-03-09T16:15:17.593 INFO:tasks.workunit.client.1.vm05.stdout:1/602: chown d7/dd/d21/d63/d71/l82 48 1 2026-03-09T16:15:17.595 INFO:tasks.workunit.client.1.vm05.stdout:4/581: write d5/de/d15/d21/d27/f36 [1297228,124461] 0 2026-03-09T16:15:17.598 INFO:tasks.workunit.client.1.vm05.stdout:5/526: dread d8/d59/d5b/d8b/da0/dae/daa/f9d [0,4194304] 0 2026-03-09T16:15:17.598 INFO:tasks.workunit.client.1.vm05.stdout:8/532: mknod d4/d6/db/dc/d3b/cb4 0 2026-03-09T16:15:17.599 INFO:tasks.workunit.client.1.vm05.stdout:5/527: chown d8/d5e/fb6 2106456 1 2026-03-09T16:15:17.601 INFO:tasks.workunit.client.1.vm05.stdout:5/528: write d8/d59/d5b/d8b/da0/dae/daa/f35 [516131,52527] 0 2026-03-09T16:15:17.603 INFO:tasks.workunit.client.1.vm05.stdout:8/533: dwrite d4/d6/d3a/d40/f7b [0,4194304] 0 2026-03-09T16:15:17.613 INFO:tasks.workunit.client.1.vm05.stdout:4/582: creat d5/de/d15/d21/d27/d3c/fca x:0 0 0 2026-03-09T16:15:17.621 
INFO:tasks.workunit.client.1.vm05.stdout:4/583: write d5/de/d15/da9/db1/f64 [6624995,13130] 0 2026-03-09T16:15:17.621 INFO:tasks.workunit.client.1.vm05.stdout:5/529: creat d8/d18/dbc/fc3 x:0 0 0 2026-03-09T16:15:17.622 INFO:tasks.workunit.client.1.vm05.stdout:5/530: write d8/d1d/f21 [288265,55004] 0 2026-03-09T16:15:17.622 INFO:tasks.workunit.client.1.vm05.stdout:5/531: chown d8 62 1 2026-03-09T16:15:17.623 INFO:tasks.workunit.client.1.vm05.stdout:1/603: creat d7/dd/de/d52/fd7 x:0 0 0 2026-03-09T16:15:17.623 INFO:tasks.workunit.client.1.vm05.stdout:8/534: symlink d4/d6/d3a/d40/d6a/d97/lb5 0 2026-03-09T16:15:17.628 INFO:tasks.workunit.client.1.vm05.stdout:5/532: creat d8/d18/d1b/d78/d90/fc4 x:0 0 0 2026-03-09T16:15:17.631 INFO:tasks.workunit.client.1.vm05.stdout:1/604: mkdir d7/dd/d21/d39/d48/d8c/dd8 0 2026-03-09T16:15:17.632 INFO:tasks.workunit.client.1.vm05.stdout:8/535: dwrite d4/d6/d3a/d3c/f45 [0,4194304] 0 2026-03-09T16:15:17.632 INFO:tasks.workunit.client.1.vm05.stdout:1/605: truncate d7/dd/de/d52/fd7 347986 0 2026-03-09T16:15:17.633 INFO:tasks.workunit.client.1.vm05.stdout:5/533: unlink d8/d59/d5b/lbd 0 2026-03-09T16:15:17.635 INFO:tasks.workunit.client.1.vm05.stdout:1/606: symlink d7/ld9 0 2026-03-09T16:15:17.636 INFO:tasks.workunit.client.1.vm05.stdout:1/607: write d7/d62/d72/f79 [1451953,64956] 0 2026-03-09T16:15:17.637 INFO:tasks.workunit.client.1.vm05.stdout:8/536: link d4/d6/db/d75/c82 d4/d6/db/d75/cb6 0 2026-03-09T16:15:17.639 INFO:tasks.workunit.client.1.vm05.stdout:1/608: mkdir d7/d15/d6e/dbc/dd6/dda 0 2026-03-09T16:15:17.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:17 vm03.local ceph-mon[51019]: pgmap v18: 65 pgs: 65 active+clean; 2.5 GiB data, 8.5 GiB used, 111 GiB / 120 GiB avail; 19 MiB/s rd, 117 MiB/s wr, 226 op/s 2026-03-09T16:15:17.646 INFO:tasks.workunit.client.1.vm05.stdout:5/534: sync 2026-03-09T16:15:17.646 INFO:tasks.workunit.client.1.vm05.stdout:5/535: dread - d8/d5e/fb6 zero size 2026-03-09T16:15:17.650 INFO:tasks.workunit.client.1.vm05.stdout:5/536: creat d8/d18/fc5 x:0 0 0 2026-03-09T16:15:17.650 INFO:tasks.workunit.client.1.vm05.stdout:5/537: stat d8/d18/d1b/d78/lbf 0 2026-03-09T16:15:17.651 INFO:tasks.workunit.client.1.vm05.stdout:1/609: stat d7/d62/f90 0 2026-03-09T16:15:17.655 INFO:tasks.workunit.client.1.vm05.stdout:1/610: creat d7/dd/d21/d39/fdb x:0 0 0 2026-03-09T16:15:17.657 INFO:tasks.workunit.client.1.vm05.stdout:1/611: chown d7/f4b 127 1 2026-03-09T16:15:17.657 INFO:tasks.workunit.client.1.vm05.stdout:1/612: chown d7/dd/d21/d39/d48/d5d 127999 1 2026-03-09T16:15:17.661 INFO:tasks.workunit.client.1.vm05.stdout:5/538: rmdir d8/d59/d5b/db3 0 2026-03-09T16:15:17.662 INFO:tasks.workunit.client.1.vm05.stdout:5/539: write d8/d18/d1b/d47/d48/d73/d80/fac [149077,99775] 0 2026-03-09T16:15:17.666 INFO:tasks.workunit.client.1.vm05.stdout:8/537: write d4/d6/db/dc/f26 [861884,32745] 0 2026-03-09T16:15:17.667 INFO:tasks.workunit.client.1.vm05.stdout:1/613: truncate d7/d15/d16/dc2/fc7 52516 0 2026-03-09T16:15:17.669 INFO:tasks.workunit.client.1.vm05.stdout:1/614: dread - d7/dd/d21/d3b/f65 zero size 2026-03-09T16:15:17.670 INFO:tasks.workunit.client.1.vm05.stdout:8/538: symlink d4/d6/d3a/d15/lb7 0 2026-03-09T16:15:17.674 INFO:tasks.workunit.client.1.vm05.stdout:4/584: dwrite f0 [0,4194304] 0 2026-03-09T16:15:17.675 INFO:tasks.workunit.client.1.vm05.stdout:5/540: dwrite d8/d5e/f72 [0,4194304] 0 2026-03-09T16:15:17.685 INFO:tasks.workunit.client.1.vm05.stdout:4/585: chown d5/de/d15/d21/d27/d3c/d5c/d5f/db6 32810604 1 2026-03-09T16:15:17.685 
INFO:tasks.workunit.client.1.vm05.stdout:8/539: dwrite d4/d6/db/dc/f30 [4194304,4194304] 0 2026-03-09T16:15:17.690 INFO:tasks.workunit.client.1.vm05.stdout:8/540: write d4/d6/d3a/d40/f7b [2733038,34631] 0 2026-03-09T16:15:17.696 INFO:tasks.workunit.client.1.vm05.stdout:1/615: dwrite d7/d15/d16/f66 [0,4194304] 0 2026-03-09T16:15:17.696 INFO:tasks.workunit.client.1.vm05.stdout:8/541: creat d4/d6/d3a/fb8 x:0 0 0 2026-03-09T16:15:17.697 INFO:tasks.workunit.client.1.vm05.stdout:4/586: creat d5/de/d15/da9/db1/dad/d90/dbb/fcb x:0 0 0 2026-03-09T16:15:17.697 INFO:tasks.workunit.client.1.vm05.stdout:4/587: dwrite d5/de/d15/d21/d27/d3c/f92 [0,4194304] 0 2026-03-09T16:15:17.697 INFO:tasks.workunit.client.1.vm05.stdout:8/542: creat d4/d6/db/df/d80/fb9 x:0 0 0 2026-03-09T16:15:17.704 INFO:tasks.workunit.client.1.vm05.stdout:4/588: creat d5/de/d15/d21/d27/fcc x:0 0 0 2026-03-09T16:15:17.707 INFO:tasks.workunit.client.1.vm05.stdout:1/616: sync 2026-03-09T16:15:17.715 INFO:tasks.workunit.client.1.vm05.stdout:5/541: dread d8/f13 [0,4194304] 0 2026-03-09T16:15:17.717 INFO:tasks.workunit.client.1.vm05.stdout:4/589: creat d5/de/d82/fcd x:0 0 0 2026-03-09T16:15:17.717 INFO:tasks.workunit.client.1.vm05.stdout:1/617: mkdir d7/dd/d21/d63/d71/ddc 0 2026-03-09T16:15:17.718 INFO:tasks.workunit.client.1.vm05.stdout:5/542: mknod d8/d18/d1b/d6b/cc6 0 2026-03-09T16:15:17.718 INFO:tasks.workunit.client.1.vm05.stdout:4/590: chown d5/de/d15/d21/d27/d3c/d5c/d5f/f57 95382 1 2026-03-09T16:15:17.719 INFO:tasks.workunit.client.1.vm05.stdout:1/618: unlink d7/d62/d72/f9f 0 2026-03-09T16:15:17.722 INFO:tasks.workunit.client.1.vm05.stdout:4/591: creat d5/fce x:0 0 0 2026-03-09T16:15:17.725 INFO:tasks.workunit.client.1.vm05.stdout:4/592: rmdir d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d7b 39 2026-03-09T16:15:17.730 INFO:tasks.workunit.client.1.vm05.stdout:5/543: link d8/d59/d5b/d8b/da0/dae/daa/f8c d8/d53/d7a/fc7 0 2026-03-09T16:15:17.730 INFO:tasks.workunit.client.1.vm05.stdout:5/544: mkdir d8/dc8 0 2026-03-09T16:15:17.730 INFO:tasks.workunit.client.1.vm05.stdout:4/593: mknod d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/ccf 0 2026-03-09T16:15:17.732 INFO:tasks.workunit.client.1.vm05.stdout:1/619: sync 2026-03-09T16:15:17.736 INFO:tasks.workunit.client.1.vm05.stdout:4/594: mknod d5/de/cd0 0 2026-03-09T16:15:17.738 INFO:tasks.workunit.client.1.vm05.stdout:5/545: dread d8/d59/d5b/d8b/da0/dae/daa/d43/f41 [0,4194304] 0 2026-03-09T16:15:17.740 INFO:tasks.workunit.client.1.vm05.stdout:1/620: sync 2026-03-09T16:15:17.742 INFO:tasks.workunit.client.1.vm05.stdout:5/546: readlink d8/d59/d5b/d8b/da0/dae/daa/d43/l9e 0 2026-03-09T16:15:17.743 INFO:tasks.workunit.client.1.vm05.stdout:1/621: symlink d7/daa/ldd 0 2026-03-09T16:15:17.743 INFO:tasks.workunit.client.1.vm05.stdout:4/595: dwrite d5/de/d15/da9/db1/dad/d90/dbb/fcb [0,4194304] 0 2026-03-09T16:15:17.746 INFO:tasks.workunit.client.1.vm05.stdout:8/543: write d4/d6/db/dc/fa2 [5161305,68072] 0 2026-03-09T16:15:17.747 INFO:tasks.workunit.client.1.vm05.stdout:8/544: readlink d4/d6/db/df/d4f/l70 0 2026-03-09T16:15:17.756 INFO:tasks.workunit.client.1.vm05.stdout:5/547: dwrite d8/d59/d5b/d8b/da0/dae/daa/f35 [0,4194304] 0 2026-03-09T16:15:17.762 INFO:tasks.workunit.client.1.vm05.stdout:5/548: stat d8/d18/d1b/d47/d68/l6d 0 2026-03-09T16:15:17.765 INFO:tasks.workunit.client.1.vm05.stdout:5/549: chown d8/d18 23135 1 2026-03-09T16:15:17.765 INFO:tasks.workunit.client.1.vm05.stdout:1/622: creat d7/dd/d21/fde x:0 0 0 2026-03-09T16:15:17.765 INFO:tasks.workunit.client.1.vm05.stdout:1/623: symlink d7/dd/de/d52/ldf 0 
2026-03-09T16:15:17.767 INFO:tasks.workunit.client.1.vm05.stdout:1/624: creat d7/dd/d21/d44/fe0 x:0 0 0 2026-03-09T16:15:17.773 INFO:tasks.workunit.client.1.vm05.stdout:1/625: rename d7/d15/d16/f29 to d7/dd/d21/d39/d87/db9/fe1 0 2026-03-09T16:15:17.780 INFO:tasks.workunit.client.1.vm05.stdout:5/550: sync 2026-03-09T16:15:17.781 INFO:tasks.workunit.client.1.vm05.stdout:1/626: chown d7/l9a 41319812 1 2026-03-09T16:15:17.781 INFO:tasks.workunit.client.1.vm05.stdout:1/627: chown d7/d62/f90 0 1 2026-03-09T16:15:17.783 INFO:tasks.workunit.client.1.vm05.stdout:5/551: readlink d8/d53/d7e/la9 0 2026-03-09T16:15:17.808 INFO:tasks.workunit.client.1.vm05.stdout:5/552: sync 2026-03-09T16:15:17.811 INFO:tasks.workunit.client.1.vm05.stdout:5/553: unlink d8/d18/c1a 0 2026-03-09T16:15:17.813 INFO:tasks.workunit.client.1.vm05.stdout:5/554: symlink d8/d5e/d8e/lc9 0 2026-03-09T16:15:17.814 INFO:tasks.workunit.client.1.vm05.stdout:5/555: write d8/d59/d5b/d8b/da0/dae/daa/fb1 [845890,106195] 0 2026-03-09T16:15:17.817 INFO:tasks.workunit.client.1.vm05.stdout:5/556: rename d8/d18/d1b/d78/cb0 to d8/d18/d1b/d6b/cca 0 2026-03-09T16:15:17.817 INFO:tasks.workunit.client.1.vm05.stdout:5/557: truncate d8/d1d/f21 1992765 0 2026-03-09T16:15:17.821 INFO:tasks.workunit.client.1.vm05.stdout:5/558: dwrite d8/f55 [0,4194304] 0 2026-03-09T16:15:17.827 INFO:tasks.workunit.client.1.vm05.stdout:5/559: mknod d8/d53/d7a/ccb 0 2026-03-09T16:15:17.830 INFO:tasks.workunit.client.1.vm05.stdout:1/628: dread d7/fb [0,4194304] 0 2026-03-09T16:15:17.830 INFO:tasks.workunit.client.1.vm05.stdout:1/629: stat d7/d27/f64 0 2026-03-09T16:15:17.830 INFO:tasks.workunit.client.1.vm05.stdout:8/545: dread d4/d6/db/dc/fa2 [4194304,4194304] 0 2026-03-09T16:15:17.831 INFO:tasks.workunit.client.1.vm05.stdout:1/630: read - d7/dd/d21/d39/d48/d5d/f98 zero size 2026-03-09T16:15:17.832 INFO:tasks.workunit.client.1.vm05.stdout:8/546: read - d4/d6/fae zero size 2026-03-09T16:15:17.836 INFO:tasks.workunit.client.1.vm05.stdout:8/547: symlink d4/d6/db/dc/d2e/lba 0 2026-03-09T16:15:17.847 INFO:tasks.workunit.client.1.vm05.stdout:9/577: dread d4/d10/f80 [0,4194304] 0 2026-03-09T16:15:17.849 INFO:tasks.workunit.client.1.vm05.stdout:2/518: dread db/dd/d15/d3f/f75 [0,4194304] 0 2026-03-09T16:15:17.850 INFO:tasks.workunit.client.1.vm05.stdout:8/548: sync 2026-03-09T16:15:17.853 INFO:tasks.workunit.client.1.vm05.stdout:2/519: creat db/dd/d15/d46/d8d/faa x:0 0 0 2026-03-09T16:15:17.854 INFO:tasks.workunit.client.1.vm05.stdout:5/560: rename d8/d59/d5b/d8b/da0/dae to d8/d18/dbc/dcc 0 2026-03-09T16:15:17.854 INFO:tasks.workunit.client.1.vm05.stdout:8/549: dread d4/d6/d3a/d15/f93 [0,4194304] 0 2026-03-09T16:15:17.855 INFO:tasks.workunit.client.1.vm05.stdout:8/550: write d4/d6/d53/f5a [331921,119520] 0 2026-03-09T16:15:17.857 INFO:tasks.workunit.client.1.vm05.stdout:9/578: creat d4/d10/d35/fc0 x:0 0 0 2026-03-09T16:15:17.857 INFO:tasks.workunit.client.1.vm05.stdout:8/551: write d4/d6/db/dc/f17 [1044589,8155] 0 2026-03-09T16:15:17.858 INFO:tasks.workunit.client.1.vm05.stdout:8/552: write d4/d6/d3a/fb8 [111734,23475] 0 2026-03-09T16:15:17.861 INFO:tasks.workunit.client.1.vm05.stdout:0/540: dread d5/d1b/f47 [0,4194304] 0 2026-03-09T16:15:17.862 INFO:tasks.workunit.client.1.vm05.stdout:8/553: rmdir d4/d6/d3a 39 2026-03-09T16:15:17.866 INFO:tasks.workunit.client.1.vm05.stdout:0/541: rmdir d5/d97 39 2026-03-09T16:15:17.866 INFO:tasks.workunit.client.1.vm05.stdout:9/579: mkdir d4/d10/d35/d2b/dc1 0 2026-03-09T16:15:17.867 INFO:tasks.workunit.client.1.vm05.stdout:2/520: symlink 
db/dd/d98/lab 0 2026-03-09T16:15:17.869 INFO:tasks.workunit.client.1.vm05.stdout:0/542: creat d5/db/d77/fb4 x:0 0 0 2026-03-09T16:15:17.870 INFO:tasks.workunit.client.1.vm05.stdout:8/554: creat d4/d6/db/df/d4f/d9f/fbb x:0 0 0 2026-03-09T16:15:17.873 INFO:tasks.workunit.client.1.vm05.stdout:9/580: dread d4/d10/d35/d36/f67 [0,4194304] 0 2026-03-09T16:15:17.874 INFO:tasks.workunit.client.1.vm05.stdout:9/581: dread - d4/d10/d35/d36/d48/d60/fad zero size 2026-03-09T16:15:17.875 INFO:tasks.workunit.client.1.vm05.stdout:9/582: chown d4/d10/d35/d2b/f2f 2475 1 2026-03-09T16:15:17.881 INFO:tasks.workunit.client.1.vm05.stdout:5/561: dread d8/d18/d1b/d47/d48/fa2 [0,4194304] 0 2026-03-09T16:15:17.881 INFO:tasks.workunit.client.1.vm05.stdout:0/543: mknod d5/d2c/d49/cb5 0 2026-03-09T16:15:17.883 INFO:tasks.workunit.client.1.vm05.stdout:3/448: dread d0/d9/f2b [0,4194304] 0 2026-03-09T16:15:17.887 INFO:tasks.workunit.client.1.vm05.stdout:6/528: dread d17/d1d/f41 [0,4194304] 0 2026-03-09T16:15:17.891 INFO:tasks.workunit.client.1.vm05.stdout:4/596: dwrite d5/de/d15/f74 [0,4194304] 0 2026-03-09T16:15:17.895 INFO:tasks.workunit.client.1.vm05.stdout:7/621: dread d1/d2/d8/dc/d33/f9f [0,4194304] 0 2026-03-09T16:15:17.896 INFO:tasks.workunit.client.1.vm05.stdout:2/521: dread db/dd/d15/d3f/d5b/f9f [0,4194304] 0 2026-03-09T16:15:17.898 INFO:tasks.workunit.client.1.vm05.stdout:2/522: dread db/dd/d15/d3f/d5b/d60/f7c [0,4194304] 0 2026-03-09T16:15:17.903 INFO:tasks.workunit.client.1.vm05.stdout:0/544: fdatasync d5/db/d5f/f7b 0 2026-03-09T16:15:17.905 INFO:tasks.workunit.client.1.vm05.stdout:6/529: creat d17/d22/d27/d34/d4b/d7f/fc8 x:0 0 0 2026-03-09T16:15:17.907 INFO:tasks.workunit.client.1.vm05.stdout:9/583: mkdir d4/d10/d35/d2b/dc1/dc2 0 2026-03-09T16:15:17.910 INFO:tasks.workunit.client.1.vm05.stdout:4/597: creat d5/de/d15/da9/db1/dad/d90/dbb/fd1 x:0 0 0 2026-03-09T16:15:17.911 INFO:tasks.workunit.client.1.vm05.stdout:3/449: dwrite d0/d9/d22/f30 [0,4194304] 0 2026-03-09T16:15:17.911 INFO:tasks.workunit.client.1.vm05.stdout:7/622: mkdir d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/ddb 0 2026-03-09T16:15:17.912 INFO:tasks.workunit.client.1.vm05.stdout:1/631: dwrite d7/d15/d16/f53 [0,4194304] 0 2026-03-09T16:15:17.922 INFO:tasks.workunit.client.1.vm05.stdout:0/545: symlink d5/db/d48/d66/lb6 0 2026-03-09T16:15:17.924 INFO:tasks.workunit.client.1.vm05.stdout:7/623: dwrite d1/d2/d8/dc/d1b/d71/d3c/f9b [0,4194304] 0 2026-03-09T16:15:17.928 INFO:tasks.workunit.client.1.vm05.stdout:2/523: symlink db/dd/lac 0 2026-03-09T16:15:17.931 INFO:tasks.workunit.client.1.vm05.stdout:4/598: mknod d5/de/d15/d21/da0/cd2 0 2026-03-09T16:15:17.931 INFO:tasks.workunit.client.1.vm05.stdout:9/584: unlink d4/lb5 0 2026-03-09T16:15:17.932 INFO:tasks.workunit.client.1.vm05.stdout:9/585: fsync d4/d10/d35/d2b/fa1 0 2026-03-09T16:15:17.935 INFO:tasks.workunit.client.1.vm05.stdout:3/450: dread d0/f86 [0,4194304] 0 2026-03-09T16:15:17.935 INFO:tasks.workunit.client.1.vm05.stdout:7/624: creat d1/d2/d8/dc/d1b/d30/d4b/d65/db1/fdc x:0 0 0 2026-03-09T16:15:17.936 INFO:tasks.workunit.client.1.vm05.stdout:5/562: getdents d8/d5e 0 2026-03-09T16:15:17.937 INFO:tasks.workunit.client.1.vm05.stdout:5/563: chown d8/d18/d1b/d47/d4e/d76/d8f/dab/lb5 460526 1 2026-03-09T16:15:17.937 INFO:tasks.workunit.client.1.vm05.stdout:5/564: write f5 [3410585,17815] 0 2026-03-09T16:15:17.937 INFO:tasks.workunit.client.1.vm05.stdout:3/451: chown d0/d9/d22/d4c/d4e 33027814 1 2026-03-09T16:15:17.938 INFO:tasks.workunit.client.1.vm05.stdout:1/632: rmdir d7/dd/d21/d44/d5c 39 2026-03-09T16:15:17.948 
INFO:tasks.workunit.client.1.vm05.stdout:8/555: write d4/d6/db/dc/d5d/d79/f91 [1431693,28701] 0 2026-03-09T16:15:17.948 INFO:tasks.workunit.client.1.vm05.stdout:0/546: creat d5/db/d5b/d82/fb7 x:0 0 0 2026-03-09T16:15:17.948 INFO:tasks.workunit.client.1.vm05.stdout:3/452: write d0/d33/f77 [2319227,82471] 0 2026-03-09T16:15:17.948 INFO:tasks.workunit.client.1.vm05.stdout:7/625: fdatasync d1/d2/d8/fcb 0 2026-03-09T16:15:17.948 INFO:tasks.workunit.client.1.vm05.stdout:6/530: rename d17/d4f/c7d to d17/d22/d27/d44/cc9 0 2026-03-09T16:15:17.948 INFO:tasks.workunit.client.1.vm05.stdout:0/547: chown d5/db/l33 201790 1 2026-03-09T16:15:17.948 INFO:tasks.workunit.client.1.vm05.stdout:2/524: mknod db/dd/cad 0 2026-03-09T16:15:17.948 INFO:tasks.workunit.client.1.vm05.stdout:5/565: symlink d8/d18/dbc/dcc/lcd 0 2026-03-09T16:15:17.948 INFO:tasks.workunit.client.1.vm05.stdout:4/599: rename d5/de/d15/d21/d39/f44 to d5/de/d15/da9/db1/dad/d37/d60/dbf/fd3 0 2026-03-09T16:15:17.948 INFO:tasks.workunit.client.1.vm05.stdout:0/548: creat d5/d1b/d30/fb8 x:0 0 0 2026-03-09T16:15:17.948 INFO:tasks.workunit.client.1.vm05.stdout:8/556: rename d4/d6/db/dc/d5d/l78 to d4/d6/d3a/d15/lbc 0 2026-03-09T16:15:17.952 INFO:tasks.workunit.client.1.vm05.stdout:7/626: dwrite d1/d2/d8/dc/d1b/d71/d3c/f95 [0,4194304] 0 2026-03-09T16:15:17.952 INFO:tasks.workunit.client.1.vm05.stdout:8/557: chown d4/d6/db/dc/d3b 767655007 1 2026-03-09T16:15:17.954 INFO:tasks.workunit.client.1.vm05.stdout:8/558: stat d4/d6/d3a/d3c 0 2026-03-09T16:15:17.961 INFO:tasks.workunit.client.1.vm05.stdout:5/566: rename d8/d53/ca8 to d8/d18/d1b/d47/d48/cce 0 2026-03-09T16:15:17.964 INFO:tasks.workunit.client.1.vm05.stdout:0/549: truncate d5/db/f12 6068126 0 2026-03-09T16:15:17.967 INFO:tasks.workunit.client.1.vm05.stdout:8/559: creat d4/d6/db/dc/d5d/fbd x:0 0 0 2026-03-09T16:15:17.967 INFO:tasks.workunit.client.1.vm05.stdout:7/627: creat d1/d2/d8/dc/d1b/d71/d3c/fdd x:0 0 0 2026-03-09T16:15:17.967 INFO:tasks.workunit.client.1.vm05.stdout:1/633: dread d7/dd/d21/d39/d48/f59 [0,4194304] 0 2026-03-09T16:15:17.967 INFO:tasks.workunit.client.1.vm05.stdout:4/600: mknod d5/de/d15/da9/db1/dad/cd4 0 2026-03-09T16:15:17.967 INFO:tasks.workunit.client.1.vm05.stdout:4/601: fdatasync d5/de/d15/d21/d27/d3c/f92 0 2026-03-09T16:15:17.969 INFO:tasks.workunit.client.1.vm05.stdout:8/560: fdatasync d4/d6/f9 0 2026-03-09T16:15:17.976 INFO:tasks.workunit.client.1.vm05.stdout:8/561: creat d4/d6/db/d9b/fbe x:0 0 0 2026-03-09T16:15:17.976 INFO:tasks.workunit.client.1.vm05.stdout:5/567: dread d8/d18/d1b/f28 [4194304,4194304] 0 2026-03-09T16:15:17.978 INFO:tasks.workunit.client.1.vm05.stdout:6/531: rename d17/d22/d27/d58/fb1 to d17/d22/d9d/fca 0 2026-03-09T16:15:17.980 INFO:tasks.workunit.client.1.vm05.stdout:5/568: mknod d8/d59/ccf 0 2026-03-09T16:15:17.981 INFO:tasks.workunit.client.1.vm05.stdout:0/550: rename d5/ca to d5/d2c/d49/d83/cb9 0 2026-03-09T16:15:17.981 INFO:tasks.workunit.client.1.vm05.stdout:8/562: truncate d4/d6/d3a/d3c/f3f 1333298 0 2026-03-09T16:15:17.981 INFO:tasks.workunit.client.1.vm05.stdout:5/569: mkdir d8/d5e/dd0 0 2026-03-09T16:15:17.982 INFO:tasks.workunit.client.1.vm05.stdout:8/563: mkdir d4/d6/db/dc/d5d/da0/dbf 0 2026-03-09T16:15:17.986 INFO:tasks.workunit.client.1.vm05.stdout:8/564: fsync d4/d6/d53/fb1 0 2026-03-09T16:15:17.986 INFO:tasks.workunit.client.1.vm05.stdout:0/551: truncate d5/d11/d4f/d68/f94 1598799 0 2026-03-09T16:15:17.986 INFO:tasks.workunit.client.1.vm05.stdout:8/565: fsync d4/d6/db/dc/d5d/f7a 0 2026-03-09T16:15:17.986 
INFO:tasks.workunit.client.1.vm05.stdout:6/532: getdents d17/d22/d27/d34/d42 0 2026-03-09T16:15:17.986 INFO:tasks.workunit.client.1.vm05.stdout:6/533: stat d17/d22/d27/d34/d4b/f6d 0 2026-03-09T16:15:17.988 INFO:tasks.workunit.client.1.vm05.stdout:7/628: sync 2026-03-09T16:15:17.991 INFO:tasks.workunit.client.1.vm05.stdout:5/570: link d8/d18/d1b/d47/d4e/d76/d8f/dab/lb5 d8/d5e/d8e/ld1 0 2026-03-09T16:15:17.992 INFO:tasks.workunit.client.1.vm05.stdout:8/566: dwrite d4/f1c [4194304,4194304] 0 2026-03-09T16:15:17.993 INFO:tasks.workunit.client.1.vm05.stdout:6/534: dread d17/f4e [0,4194304] 0 2026-03-09T16:15:17.993 INFO:tasks.workunit.client.1.vm05.stdout:8/567: readlink d4/d6/d3a/d3c/l74 0 2026-03-09T16:15:17.994 INFO:tasks.workunit.client.1.vm05.stdout:7/629: symlink d1/d2/d8/dc/d1b/d30/d4b/d65/lde 0 2026-03-09T16:15:17.996 INFO:tasks.workunit.client.1.vm05.stdout:7/630: readlink d1/d2/d8/dc/d1b/d30/d4b/lc7 0 2026-03-09T16:15:17.996 INFO:tasks.workunit.client.1.vm05.stdout:6/535: fdatasync d17/d5d/f8e 0 2026-03-09T16:15:17.999 INFO:tasks.workunit.client.1.vm05.stdout:3/453: write d0/d33/f29 [2885911,117681] 0 2026-03-09T16:15:18.002 INFO:tasks.workunit.client.1.vm05.stdout:9/586: write d4/f17 [2100341,1094] 0 2026-03-09T16:15:18.003 INFO:tasks.workunit.client.1.vm05.stdout:2/525: truncate db/dd/d15/d3f/f4a 1464545 0 2026-03-09T16:15:18.011 INFO:tasks.workunit.client.1.vm05.stdout:3/454: write d0/d9/f4d [4230325,28198] 0 2026-03-09T16:15:18.012 INFO:tasks.workunit.client.1.vm05.stdout:8/568: dwrite d4/d6/d9a/db3/fa5 [0,4194304] 0 2026-03-09T16:15:18.018 INFO:tasks.workunit.client.1.vm05.stdout:4/602: write d5/fd [4428221,58755] 0 2026-03-09T16:15:18.025 INFO:tasks.workunit.client.1.vm05.stdout:4/603: write d5/de/d15/d21/d27/d3c/f92 [4926146,82106] 0 2026-03-09T16:15:18.025 INFO:tasks.workunit.client.1.vm05.stdout:1/634: dwrite d7/dd/de/f3e [0,4194304] 0 2026-03-09T16:15:18.028 INFO:tasks.workunit.client.1.vm05.stdout:1/635: write d7/fc [3658828,105013] 0 2026-03-09T16:15:18.042 INFO:tasks.workunit.client.1.vm05.stdout:0/552: write d5/d2c/f28 [1758577,110749] 0 2026-03-09T16:15:18.057 INFO:tasks.workunit.client.1.vm05.stdout:3/455: mknod d0/d33/c96 0 2026-03-09T16:15:18.057 INFO:tasks.workunit.client.1.vm05.stdout:6/536: dwrite d17/d1d/f1e [4194304,4194304] 0 2026-03-09T16:15:18.059 INFO:tasks.workunit.client.1.vm05.stdout:6/537: chown d17/d22/d27/d34/c36 31 1 2026-03-09T16:15:18.065 INFO:tasks.workunit.client.1.vm05.stdout:5/571: getdents d8/d18/d1b/d47 0 2026-03-09T16:15:18.067 INFO:tasks.workunit.client.1.vm05.stdout:8/569: dread - d4/d6/d3a/f49 zero size 2026-03-09T16:15:18.072 INFO:tasks.workunit.client.1.vm05.stdout:4/604: mknod d5/de/d2f/cd5 0 2026-03-09T16:15:18.072 INFO:tasks.workunit.client.1.vm05.stdout:4/605: readlink d5/de/d15/d21/d27/l88 0 2026-03-09T16:15:18.083 INFO:tasks.workunit.client.1.vm05.stdout:6/538: unlink d17/d4f/l99 0 2026-03-09T16:15:18.095 INFO:tasks.workunit.client.1.vm05.stdout:3/456: dread d0/f46 [0,4194304] 0 2026-03-09T16:15:18.099 INFO:tasks.workunit.client.1.vm05.stdout:9/587: creat d4/d10/d35/fc3 x:0 0 0 2026-03-09T16:15:18.099 INFO:tasks.workunit.client.1.vm05.stdout:9/588: chown d4/d10/d35/d36/d48/d4c 25710 1 2026-03-09T16:15:18.100 INFO:tasks.workunit.client.1.vm05.stdout:3/457: dwrite d0/d33/f36 [4194304,4194304] 0 2026-03-09T16:15:18.100 INFO:tasks.workunit.client.1.vm05.stdout:0/553: symlink d5/d2c/d49/d83/d8b/daf/lba 0 2026-03-09T16:15:18.102 INFO:tasks.workunit.client.1.vm05.stdout:7/631: link d1/d2/d8/dc/d1b/d71/f74 d1/d2/d8/dc/d1b/d30/d4b/fdf 0 
2026-03-09T16:15:18.108 INFO:tasks.workunit.client.1.vm05.stdout:7/632: write d1/d2/d8/dc/d1b/d71/f97 [848045,31021] 0 2026-03-09T16:15:18.109 INFO:tasks.workunit.client.1.vm05.stdout:7/633: dread - d1/d2/d8/dc/d1b/d30/d4b/db2/fd9 zero size 2026-03-09T16:15:18.119 INFO:tasks.workunit.client.1.vm05.stdout:6/539: write d17/d22/f79 [250646,101143] 0 2026-03-09T16:15:18.131 INFO:tasks.workunit.client.1.vm05.stdout:4/606: symlink d5/de/d15/ld6 0 2026-03-09T16:15:18.137 INFO:tasks.workunit.client.1.vm05.stdout:1/636: rename d7/dd/d21/d44/f8e to d7/dd/d21/d39/d87/fe2 0 2026-03-09T16:15:18.141 INFO:tasks.workunit.client.1.vm05.stdout:3/458: mkdir d0/d9/d97 0 2026-03-09T16:15:18.143 INFO:tasks.workunit.client.1.vm05.stdout:2/526: getdents db/dd/d15/d46/d67 0 2026-03-09T16:15:18.158 INFO:tasks.workunit.client.1.vm05.stdout:8/570: rmdir d4/d6/db/d75/d84 0 2026-03-09T16:15:18.158 INFO:tasks.workunit.client.1.vm05.stdout:8/571: stat d4/d6/db/dc/d5d/fbd 0 2026-03-09T16:15:18.160 INFO:tasks.workunit.client.1.vm05.stdout:1/637: write d7/f3f [946040,15914] 0 2026-03-09T16:15:18.161 INFO:tasks.workunit.client.1.vm05.stdout:9/589: mknod d4/d10/d35/d36/d48/d60/dae/cc4 0 2026-03-09T16:15:18.162 INFO:tasks.workunit.client.1.vm05.stdout:9/590: chown d4/d10/d35/d36/c9c 1634394319 1 2026-03-09T16:15:18.163 INFO:tasks.workunit.client.1.vm05.stdout:3/459: readlink d0/d9/d22/d4c/d4e/l6f 0 2026-03-09T16:15:18.165 INFO:tasks.workunit.client.1.vm05.stdout:1/638: dread d7/dd/de/f56 [0,4194304] 0 2026-03-09T16:15:18.166 INFO:tasks.workunit.client.1.vm05.stdout:6/540: symlink d17/d22/d27/d8a/d8b/lcb 0 2026-03-09T16:15:18.167 INFO:tasks.workunit.client.1.vm05.stdout:6/541: write d17/d22/d27/d44/f86 [1022090,67693] 0 2026-03-09T16:15:18.167 INFO:tasks.workunit.client.1.vm05.stdout:5/572: getdents d8/d18/d1b/d47/d48 0 2026-03-09T16:15:18.169 INFO:tasks.workunit.client.1.vm05.stdout:4/607: mknod d5/de/d15/d21/d27/d3c/d5c/da2/cd7 0 2026-03-09T16:15:18.169 INFO:tasks.workunit.client.1.vm05.stdout:3/460: dwrite d0/d33/f29 [0,4194304] 0 2026-03-09T16:15:18.172 INFO:tasks.workunit.client.1.vm05.stdout:3/461: dread - d0/f69 zero size 2026-03-09T16:15:18.173 INFO:tasks.workunit.client.1.vm05.stdout:3/462: write d0/d33/f29 [369062,26170] 0 2026-03-09T16:15:18.179 INFO:tasks.workunit.client.1.vm05.stdout:9/591: mkdir d4/d10/d35/d2b/d31/d82/dc5 0 2026-03-09T16:15:18.179 INFO:tasks.workunit.client.1.vm05.stdout:0/554: creat d5/d1b/fbb x:0 0 0 2026-03-09T16:15:18.183 INFO:tasks.workunit.client.1.vm05.stdout:7/634: creat d1/d2/d11/d86/fe0 x:0 0 0 2026-03-09T16:15:18.183 INFO:tasks.workunit.client.1.vm05.stdout:1/639: creat d7/d62/da3/fe3 x:0 0 0 2026-03-09T16:15:18.186 INFO:tasks.workunit.client.1.vm05.stdout:5/573: write d8/d18/d1b/d47/d48/fa2 [1230569,30928] 0 2026-03-09T16:15:18.186 INFO:tasks.workunit.client.1.vm05.stdout:5/574: chown d8/dc8 8 1 2026-03-09T16:15:18.195 INFO:tasks.workunit.client.1.vm05.stdout:8/572: creat d4/d6/fc0 x:0 0 0 2026-03-09T16:15:18.197 INFO:tasks.workunit.client.1.vm05.stdout:4/608: mkdir d5/de/d15/da9/db1/dad/d90/dd8 0 2026-03-09T16:15:18.200 INFO:tasks.workunit.client.1.vm05.stdout:0/555: rename d5/db/c38 to d5/d97/cbc 0 2026-03-09T16:15:18.201 INFO:tasks.workunit.client.1.vm05.stdout:9/592: creat d4/d10/d35/d36/d48/d54/db0/fc6 x:0 0 0 2026-03-09T16:15:18.201 INFO:tasks.workunit.client.1.vm05.stdout:1/640: unlink d7/d62/da5/lce 0 2026-03-09T16:15:18.202 INFO:tasks.workunit.client.1.vm05.stdout:5/575: symlink d8/d53/ld2 0 2026-03-09T16:15:18.203 INFO:tasks.workunit.client.1.vm05.stdout:4/609: dwrite 
d5/de/d15/da9/db1/dad/d37/fa5 [0,4194304] 0 2026-03-09T16:15:18.203 INFO:tasks.workunit.client.1.vm05.stdout:0/556: creat d5/d9e/fbd x:0 0 0 2026-03-09T16:15:18.207 INFO:tasks.workunit.client.1.vm05.stdout:8/573: sync 2026-03-09T16:15:18.211 INFO:tasks.workunit.client.1.vm05.stdout:1/641: symlink d7/dd/le4 0 2026-03-09T16:15:18.211 INFO:tasks.workunit.client.1.vm05.stdout:9/593: dread d4/d10/d35/d36/f77 [0,4194304] 0 2026-03-09T16:15:18.213 INFO:tasks.workunit.client.1.vm05.stdout:1/642: truncate d7/dd/d21/d63/d71/fd0 336267 0 2026-03-09T16:15:18.213 INFO:tasks.workunit.client.1.vm05.stdout:1/643: read d7/fb [4749689,67433] 0 2026-03-09T16:15:18.215 INFO:tasks.workunit.client.1.vm05.stdout:6/542: truncate d17/f2d 2419112 0 2026-03-09T16:15:18.215 INFO:tasks.workunit.client.1.vm05.stdout:7/635: write d1/d2/d8/dc/f1a [4765438,104846] 0 2026-03-09T16:15:18.215 INFO:tasks.workunit.client.1.vm05.stdout:2/527: write db/dd/f1b [82014,50619] 0 2026-03-09T16:15:18.216 INFO:tasks.workunit.client.1.vm05.stdout:9/594: dwrite d4/d10/f8d [0,4194304] 0 2026-03-09T16:15:18.225 INFO:tasks.workunit.client.1.vm05.stdout:3/463: link d0/d9/l72 d0/d9/d22/d5f/l98 0 2026-03-09T16:15:18.226 INFO:tasks.workunit.client.1.vm05.stdout:3/464: chown d0/f56 0 1 2026-03-09T16:15:18.228 INFO:tasks.workunit.client.1.vm05.stdout:3/465: write d0/d9/d22/d4c/f7f [4240639,71988] 0 2026-03-09T16:15:18.230 INFO:tasks.workunit.client.1.vm05.stdout:4/610: rename d5/de/d15/d21/d27/d3c/d5c/d5f/f57 to d5/de/d15/da9/db1/dad/d90/fd9 0 2026-03-09T16:15:18.238 INFO:tasks.workunit.client.1.vm05.stdout:1/644: unlink d7/d15/d16/dc2/ld5 0 2026-03-09T16:15:18.238 INFO:tasks.workunit.client.1.vm05.stdout:4/611: chown c3 208 1 2026-03-09T16:15:18.238 INFO:tasks.workunit.client.1.vm05.stdout:4/612: chown d5/de/d15/d21/d27/d3c/d5c/da2 13 1 2026-03-09T16:15:18.238 INFO:tasks.workunit.client.1.vm05.stdout:6/543: rename d17/f4a to d17/d22/d9d/da5/fcc 0 2026-03-09T16:15:18.238 INFO:tasks.workunit.client.1.vm05.stdout:2/528: symlink db/dd/d15/lae 0 2026-03-09T16:15:18.239 INFO:tasks.workunit.client.1.vm05.stdout:6/544: write d17/d22/d27/d34/d4b/f6d [294961,69621] 0 2026-03-09T16:15:18.239 INFO:tasks.workunit.client.1.vm05.stdout:5/576: creat d8/d18/d1b/fd3 x:0 0 0 2026-03-09T16:15:18.243 INFO:tasks.workunit.client.1.vm05.stdout:5/577: write d8/d1d/f21 [2231265,81981] 0 2026-03-09T16:15:18.252 INFO:tasks.workunit.client.1.vm05.stdout:8/574: truncate d4/d6/f44 2269544 0 2026-03-09T16:15:18.255 INFO:tasks.workunit.client.1.vm05.stdout:3/466: mkdir d0/d9/d22/d5f/d7b/d99 0 2026-03-09T16:15:18.256 INFO:tasks.workunit.client.1.vm05.stdout:0/557: dwrite d5/db/d5f/f7b [0,4194304] 0 2026-03-09T16:15:18.261 INFO:tasks.workunit.client.1.vm05.stdout:2/529: dread db/dd/d15/d1f/d21/f29 [0,4194304] 0 2026-03-09T16:15:18.263 INFO:tasks.workunit.client.1.vm05.stdout:5/578: write d8/d18/dbc/dcc/daa/d43/f41 [1637514,5000] 0 2026-03-09T16:15:18.267 INFO:tasks.workunit.client.1.vm05.stdout:8/575: dread d4/d6/d3a/d15/f66 [0,4194304] 0 2026-03-09T16:15:18.271 INFO:tasks.workunit.client.1.vm05.stdout:4/613: dwrite d5/f35 [0,4194304] 0 2026-03-09T16:15:18.276 INFO:tasks.workunit.client.1.vm05.stdout:8/576: read d4/d6/db/dc/d5d/d79/f91 [1267899,124105] 0 2026-03-09T16:15:18.285 INFO:tasks.workunit.client.1.vm05.stdout:6/545: truncate d17/d4f/f70 227679 0 2026-03-09T16:15:18.285 INFO:tasks.workunit.client.1.vm05.stdout:3/467: creat d0/d9/d22/d5f/d7b/f9a x:0 0 0 2026-03-09T16:15:18.285 INFO:tasks.workunit.client.1.vm05.stdout:9/595: rename d4/d10/d35/d2b/d31/cbe to 
d4/d10/d35/d36/cc7 0 2026-03-09T16:15:18.285 INFO:tasks.workunit.client.1.vm05.stdout:0/558: link d5/d9e/fbd d5/db/d5b/da5/fbe 0 2026-03-09T16:15:18.285 INFO:tasks.workunit.client.1.vm05.stdout:5/579: chown d8/fb 102938 1 2026-03-09T16:15:18.285 INFO:tasks.workunit.client.1.vm05.stdout:2/530: readlink db/dd/l6c 0 2026-03-09T16:15:18.285 INFO:tasks.workunit.client.1.vm05.stdout:9/596: mkdir d4/d10/d35/d2b/d31/dc8 0 2026-03-09T16:15:18.285 INFO:tasks.workunit.client.1.vm05.stdout:3/468: symlink d0/d9/l9b 0 2026-03-09T16:15:18.285 INFO:tasks.workunit.client.1.vm05.stdout:8/577: truncate d4/d6/d3a/d3c/f8d 697224 0 2026-03-09T16:15:18.286 INFO:tasks.workunit.client.1.vm05.stdout:7/636: rename d1/d2/d8/dc/d14/c13 to d1/d2/d11/d86/da2/ce1 0 2026-03-09T16:15:18.289 INFO:tasks.workunit.client.1.vm05.stdout:4/614: dwrite d5/de/d15/da9/db1/f58 [0,4194304] 0 2026-03-09T16:15:18.289 INFO:tasks.workunit.client.1.vm05.stdout:6/546: getdents d17/d1d 0 2026-03-09T16:15:18.289 INFO:tasks.workunit.client.1.vm05.stdout:5/580: readlink d8/d59/d5b/l63 0 2026-03-09T16:15:18.292 INFO:tasks.workunit.client.1.vm05.stdout:6/547: fdatasync d17/d22/d27/d34/d4b/f6d 0 2026-03-09T16:15:18.292 INFO:tasks.workunit.client.1.vm05.stdout:0/559: truncate d5/db/d5b/f35 1449164 0 2026-03-09T16:15:18.296 INFO:tasks.workunit.client.1.vm05.stdout:7/637: chown d1/d2/d8/dc/d1b/d30/d4b/l90 54700 1 2026-03-09T16:15:18.297 INFO:tasks.workunit.client.1.vm05.stdout:3/469: truncate d0/f49 122405 0 2026-03-09T16:15:18.298 INFO:tasks.workunit.client.1.vm05.stdout:4/615: symlink d5/de/d15/da9/db1/dad/d90/dbb/lda 0 2026-03-09T16:15:18.299 INFO:tasks.workunit.client.1.vm05.stdout:1/645: rename d7/dbe/dca/lc4 to d7/d62/da3/le5 0 2026-03-09T16:15:18.299 INFO:tasks.workunit.client.1.vm05.stdout:3/470: write d0/d33/f85 [104242,38411] 0 2026-03-09T16:15:18.301 INFO:tasks.workunit.client.1.vm05.stdout:8/578: dwrite d4/f1c [0,4194304] 0 2026-03-09T16:15:18.302 INFO:tasks.workunit.client.1.vm05.stdout:1/646: read - d7/dd/d21/d39/fdb zero size 2026-03-09T16:15:18.309 INFO:tasks.workunit.client.1.vm05.stdout:1/647: read d7/d15/f22 [5498521,38142] 0 2026-03-09T16:15:18.310 INFO:tasks.workunit.client.1.vm05.stdout:4/616: dwrite d5/de/d15/d21/d27/f8f [0,4194304] 0 2026-03-09T16:15:18.310 INFO:tasks.workunit.client.1.vm05.stdout:4/617: dread - d5/fce zero size 2026-03-09T16:15:18.320 INFO:tasks.workunit.client.1.vm05.stdout:2/531: rename fa to db/dd/d15/d1f/d20/d23/faf 0 2026-03-09T16:15:18.321 INFO:tasks.workunit.client.1.vm05.stdout:8/579: creat d4/d6/db/dc/d3b/fc1 x:0 0 0 2026-03-09T16:15:18.322 INFO:tasks.workunit.client.1.vm05.stdout:8/580: symlink d4/d6/db/da6/lc2 0 2026-03-09T16:15:18.324 INFO:tasks.workunit.client.1.vm05.stdout:9/597: sync 2026-03-09T16:15:18.327 INFO:tasks.workunit.client.1.vm05.stdout:3/471: getdents d0/d9/d22/d5f/d75 0 2026-03-09T16:15:18.328 INFO:tasks.workunit.client.1.vm05.stdout:3/472: fdatasync d0/d9/d8b/f91 0 2026-03-09T16:15:18.328 INFO:tasks.workunit.client.1.vm05.stdout:6/548: rename l6 to d17/lcd 0 2026-03-09T16:15:18.329 INFO:tasks.workunit.client.1.vm05.stdout:8/581: mknod d4/d6/db/df/d4f/d9f/cc3 0 2026-03-09T16:15:18.329 INFO:tasks.workunit.client.1.vm05.stdout:6/549: fsync d17/d1d/f41 0 2026-03-09T16:15:18.330 INFO:tasks.workunit.client.1.vm05.stdout:9/598: chown d4/d10/d35/d2b/d38/c95 84717 1 2026-03-09T16:15:18.331 INFO:tasks.workunit.client.1.vm05.stdout:3/473: fsync d0/f86 0 2026-03-09T16:15:18.344 INFO:tasks.workunit.client.1.vm05.stdout:8/582: write d4/d6/d3a/f88 [178810,83098] 0 2026-03-09T16:15:18.344 
INFO:tasks.workunit.client.1.vm05.stdout:5/581: rename d8/d18/d1b/d47/d48/fa2 to d8/d59/d5b/d8b/fd4 0 2026-03-09T16:15:18.344 INFO:tasks.workunit.client.1.vm05.stdout:6/550: mkdir d17/d22/dce 0 2026-03-09T16:15:18.344 INFO:tasks.workunit.client.1.vm05.stdout:0/560: rename d5/d1b/f47 to d5/d11/d4f/d70/fbf 0 2026-03-09T16:15:18.344 INFO:tasks.workunit.client.1.vm05.stdout:2/532: dread db/dd/d15/d1f/f49 [4194304,4194304] 0 2026-03-09T16:15:18.344 INFO:tasks.workunit.client.1.vm05.stdout:5/582: mkdir d8/dd5 0 2026-03-09T16:15:18.344 INFO:tasks.workunit.client.1.vm05.stdout:6/551: readlink d17/d22/d27/d34/d4b/laa 0 2026-03-09T16:15:18.344 INFO:tasks.workunit.client.1.vm05.stdout:5/583: unlink d8/d18/d1b/fd3 0 2026-03-09T16:15:18.344 INFO:tasks.workunit.client.1.vm05.stdout:5/584: stat d8/d1d 0 2026-03-09T16:15:18.344 INFO:tasks.workunit.client.1.vm05.stdout:3/474: dwrite d0/d9/d22/d4c/f78 [0,4194304] 0 2026-03-09T16:15:18.348 INFO:tasks.workunit.client.1.vm05.stdout:9/599: dwrite d4/d10/d35/d36/d48/fb7 [0,4194304] 0 2026-03-09T16:15:18.351 INFO:tasks.workunit.client.1.vm05.stdout:5/585: stat d8/d53/d7e/cb7 0 2026-03-09T16:15:18.352 INFO:tasks.workunit.client.1.vm05.stdout:0/561: dwrite d5/d2c/d49/d83/d8b/f8f [0,4194304] 0 2026-03-09T16:15:18.357 INFO:tasks.workunit.client.1.vm05.stdout:8/583: sync 2026-03-09T16:15:18.364 INFO:tasks.workunit.client.1.vm05.stdout:0/562: write d5/d11/d4f/f81 [701081,78935] 0 2026-03-09T16:15:18.366 INFO:tasks.workunit.client.1.vm05.stdout:5/586: dwrite d8/d59/d5b/f66 [4194304,4194304] 0 2026-03-09T16:15:18.376 INFO:tasks.workunit.client.1.vm05.stdout:8/584: symlink d4/d6/db/df/d4f/lc4 0 2026-03-09T16:15:18.376 INFO:tasks.workunit.client.1.vm05.stdout:8/585: chown d4/d6/db/dc/d2e 5334587 1 2026-03-09T16:15:18.377 INFO:tasks.workunit.client.1.vm05.stdout:9/600: creat d4/d10/d35/d36/d48/d60/dae/fc9 x:0 0 0 2026-03-09T16:15:18.378 INFO:tasks.workunit.client.1.vm05.stdout:9/601: dread - d4/d10/d35/d36/d48/d60/fad zero size 2026-03-09T16:15:18.381 INFO:tasks.workunit.client.1.vm05.stdout:3/475: creat d0/d9/d22/d5f/d75/d76/d88/f9c x:0 0 0 2026-03-09T16:15:18.384 INFO:tasks.workunit.client.1.vm05.stdout:6/552: creat d17/d22/d27/d58/db8/fcf x:0 0 0 2026-03-09T16:15:18.384 INFO:tasks.workunit.client.1.vm05.stdout:0/563: rmdir d5/d2c/d49/d83/d8b/daf 39 2026-03-09T16:15:18.387 INFO:tasks.workunit.client.1.vm05.stdout:3/476: rename d0/d33/f81 to d0/d9/d22/d5f/d7b/d99/f9d 0 2026-03-09T16:15:18.387 INFO:tasks.workunit.client.1.vm05.stdout:3/477: stat d0/d9/d22/d4c/l87 0 2026-03-09T16:15:18.387 INFO:tasks.workunit.client.1.vm05.stdout:6/553: rmdir d17/d22/d27/d34/d42 39 2026-03-09T16:15:18.388 INFO:tasks.workunit.client.1.vm05.stdout:0/564: dwrite d5/d1b/f6a [0,4194304] 0 2026-03-09T16:15:18.388 INFO:tasks.workunit.client.1.vm05.stdout:3/478: readlink d0/d9/d22/d4c/l87 0 2026-03-09T16:15:18.388 INFO:tasks.workunit.client.1.vm05.stdout:3/479: fsync d0/d9/f2b 0 2026-03-09T16:15:18.390 INFO:tasks.workunit.client.1.vm05.stdout:5/587: getdents d8/dd5 0 2026-03-09T16:15:18.391 INFO:tasks.workunit.client.1.vm05.stdout:5/588: chown d8/d59 1343219033 1 2026-03-09T16:15:18.392 INFO:tasks.workunit.client.1.vm05.stdout:3/480: dwrite d0/d33/f41 [0,4194304] 0 2026-03-09T16:15:18.393 INFO:tasks.workunit.client.1.vm05.stdout:6/554: creat d17/d22/d27/d8a/fd0 x:0 0 0 2026-03-09T16:15:18.393 INFO:tasks.workunit.client.1.vm05.stdout:0/565: creat d5/d97/fc0 x:0 0 0 2026-03-09T16:15:18.394 INFO:tasks.workunit.client.1.vm05.stdout:3/481: dread - d0/d9/d22/d5f/d7b/d99/f9d zero size 
2026-03-09T16:15:18.394 INFO:tasks.workunit.client.1.vm05.stdout:6/555: dread - d17/d5d/f8e zero size 2026-03-09T16:15:18.397 INFO:tasks.workunit.client.1.vm05.stdout:3/482: dwrite d0/d9/d22/f30 [4194304,4194304] 0 2026-03-09T16:15:18.397 INFO:tasks.workunit.client.1.vm05.stdout:3/483: fdatasync d0/f69 0 2026-03-09T16:15:18.398 INFO:tasks.workunit.client.1.vm05.stdout:3/484: fsync d0/d9/f93 0 2026-03-09T16:15:18.407 INFO:tasks.workunit.client.1.vm05.stdout:5/589: write d8/d59/f5c [4787655,31631] 0 2026-03-09T16:15:18.407 INFO:tasks.workunit.client.1.vm05.stdout:9/602: link d4/d10/d35/d36/d48/d60/d94/la2 d4/d10/d35/d2b/lca 0 2026-03-09T16:15:18.413 INFO:tasks.workunit.client.1.vm05.stdout:8/586: getdents d4/d6/d3a/d40 0 2026-03-09T16:15:18.414 INFO:tasks.workunit.client.1.vm05.stdout:5/590: creat d8/d5e/fd6 x:0 0 0 2026-03-09T16:15:18.417 INFO:tasks.workunit.client.1.vm05.stdout:5/591: read - d8/d59/d5b/d8b/da0/fc1 zero size 2026-03-09T16:15:18.417 INFO:tasks.workunit.client.1.vm05.stdout:0/566: truncate d5/d11/f40 1541925 0 2026-03-09T16:15:18.419 INFO:tasks.workunit.client.1.vm05.stdout:9/603: mkdir d4/d10/d35/d36/d48/d60/dcb 0 2026-03-09T16:15:18.421 INFO:tasks.workunit.client.1.vm05.stdout:8/587: dwrite d4/d6/db/df/d80/f9c [0,4194304] 0 2026-03-09T16:15:18.421 INFO:tasks.workunit.client.1.vm05.stdout:3/485: getdents d0/d9/d22/d5f 0 2026-03-09T16:15:18.425 INFO:tasks.workunit.client.1.vm05.stdout:0/567: dwrite d5/db/d77/fb4 [0,4194304] 0 2026-03-09T16:15:18.431 INFO:tasks.workunit.client.1.vm05.stdout:5/592: truncate d8/d59/f83 189049 0 2026-03-09T16:15:18.434 INFO:tasks.workunit.client.1.vm05.stdout:0/568: write d5/db/d48/d66/f91 [837301,46735] 0 2026-03-09T16:15:18.435 INFO:tasks.workunit.client.1.vm05.stdout:0/569: write d5/d2c/d49/f7a [333184,29889] 0 2026-03-09T16:15:18.436 INFO:tasks.workunit.client.1.vm05.stdout:8/588: dwrite d4/f77 [0,4194304] 0 2026-03-09T16:15:18.440 INFO:tasks.workunit.client.1.vm05.stdout:5/593: dwrite d8/d18/dbc/dcc/daa/fa5 [0,4194304] 0 2026-03-09T16:15:18.449 INFO:tasks.workunit.client.1.vm05.stdout:5/594: stat d8/d59/d5b/cba 0 2026-03-09T16:15:18.449 INFO:tasks.workunit.client.1.vm05.stdout:5/595: write f5 [1049529,33964] 0 2026-03-09T16:15:18.449 INFO:tasks.workunit.client.1.vm05.stdout:8/589: chown d4/d6/db/dc/f26 95316 1 2026-03-09T16:15:18.449 INFO:tasks.workunit.client.1.vm05.stdout:3/486: link d0/d9/f2f d0/d9/d22/d5f/d75/d76/d88/d89/f9e 0 2026-03-09T16:15:18.455 INFO:tasks.workunit.client.1.vm05.stdout:3/487: unlink d0/f86 0 2026-03-09T16:15:18.459 INFO:tasks.workunit.client.1.vm05.stdout:4/618: write d5/de/d15/d21/d27/f2c [599143,51719] 0 2026-03-09T16:15:18.459 INFO:tasks.workunit.client.1.vm05.stdout:1/648: truncate d7/d15/f8d 749251 0 2026-03-09T16:15:18.460 INFO:tasks.workunit.client.1.vm05.stdout:7/638: dwrite d1/d2/d8/dc/f3b [0,4194304] 0 2026-03-09T16:15:18.461 INFO:tasks.workunit.client.1.vm05.stdout:1/649: chown d7/dd/d21/d39/d5a/l94 184 1 2026-03-09T16:15:18.461 INFO:tasks.workunit.client.1.vm05.stdout:4/619: chown d5/de/d15/da9/db1/dad/d37/c47 7763334 1 2026-03-09T16:15:18.463 INFO:tasks.workunit.client.1.vm05.stdout:2/533: write db/f2d [327602,15127] 0 2026-03-09T16:15:18.464 INFO:tasks.workunit.client.1.vm05.stdout:7/639: truncate d1/d2/d8/d31/f39 2789488 0 2026-03-09T16:15:18.465 INFO:tasks.workunit.client.1.vm05.stdout:4/620: write d5/de/d15/da9/db1/dad/d90/fd9 [468953,10136] 0 2026-03-09T16:15:18.469 INFO:tasks.workunit.client.1.vm05.stdout:2/534: chown db/dd/d15/d1f/c82 499 1 2026-03-09T16:15:18.470 
INFO:tasks.workunit.client.1.vm05.stdout:2/535: write db/dd/d15/f48 [4625971,107454] 0 2026-03-09T16:15:18.473 INFO:tasks.workunit.client.1.vm05.stdout:4/621: getdents d5/de/d15/da9/db1/dad/d90 0 2026-03-09T16:15:18.480 INFO:tasks.workunit.client.1.vm05.stdout:2/536: dread db/dd/d15/f48 [0,4194304] 0 2026-03-09T16:15:18.496 INFO:tasks.workunit.client.1.vm05.stdout:9/604: dread d4/d10/d35/d2b/d38/f62 [0,4194304] 0 2026-03-09T16:15:18.499 INFO:tasks.workunit.client.1.vm05.stdout:2/537: dread db/dd/d15/d1f/d20/d23/faf [4194304,4194304] 0 2026-03-09T16:15:18.499 INFO:tasks.workunit.client.1.vm05.stdout:8/590: sync 2026-03-09T16:15:18.506 INFO:tasks.workunit.client.1.vm05.stdout:9/605: fsync d4/d10/d35/d36/d48/f87 0 2026-03-09T16:15:18.509 INFO:tasks.workunit.client.1.vm05.stdout:8/591: symlink d4/d6/db/d75/lc5 0 2026-03-09T16:15:18.510 INFO:tasks.workunit.client.1.vm05.stdout:8/592: write d4/fa7 [324432,120052] 0 2026-03-09T16:15:18.512 INFO:tasks.workunit.client.1.vm05.stdout:2/538: rmdir db/dd/d15/d3f/d5b/d60 39 2026-03-09T16:15:18.528 INFO:tasks.workunit.client.1.vm05.stdout:9/606: symlink d4/d10/d35/d36/d48/d60/d94/lcc 0 2026-03-09T16:15:18.528 INFO:tasks.workunit.client.1.vm05.stdout:2/539: symlink db/lb0 0 2026-03-09T16:15:18.528 INFO:tasks.workunit.client.1.vm05.stdout:9/607: symlink d4/d10/d35/d2b/dc1/dc2/lcd 0 2026-03-09T16:15:18.529 INFO:tasks.workunit.client.1.vm05.stdout:2/540: mknod db/dd/d98/cb1 0 2026-03-09T16:15:18.529 INFO:tasks.workunit.client.1.vm05.stdout:9/608: truncate d4/d10/d35/d2b/f45 1558712 0 2026-03-09T16:15:18.529 INFO:tasks.workunit.client.1.vm05.stdout:2/541: mknod db/dd/d15/d1f/d20/d23/d78/cb2 0 2026-03-09T16:15:18.529 INFO:tasks.workunit.client.1.vm05.stdout:9/609: getdents d4/d10/d35/d2b/dba 0 2026-03-09T16:15:18.534 INFO:tasks.workunit.client.1.vm05.stdout:2/542: dwrite db/dd/d15/d1f/d20/d86/f8f [0,4194304] 0 2026-03-09T16:15:18.534 INFO:tasks.workunit.client.1.vm05.stdout:1/650: dread d7/dd/de/f38 [0,4194304] 0 2026-03-09T16:15:18.545 INFO:tasks.workunit.client.1.vm05.stdout:1/651: symlink d7/d62/db6/le6 0 2026-03-09T16:15:18.548 INFO:tasks.workunit.client.1.vm05.stdout:9/610: fsync d4/d10/d35/d36/d48/d54/d59/fb6 0 2026-03-09T16:15:18.548 INFO:tasks.workunit.client.1.vm05.stdout:2/543: mknod db/dd/d15/cb3 0 2026-03-09T16:15:18.553 INFO:tasks.workunit.client.1.vm05.stdout:4/622: dread d5/f2d [0,4194304] 0 2026-03-09T16:15:18.553 INFO:tasks.workunit.client.1.vm05.stdout:1/652: creat d7/dd/de/d52/fe7 x:0 0 0 2026-03-09T16:15:18.554 INFO:tasks.workunit.client.1.vm05.stdout:9/611: dwrite d4/d10/d35/d2b/d31/f55 [0,4194304] 0 2026-03-09T16:15:18.558 INFO:tasks.workunit.client.1.vm05.stdout:1/653: symlink d7/dd/d21/d63/le8 0 2026-03-09T16:15:18.558 INFO:tasks.workunit.client.1.vm05.stdout:1/654: fdatasync d7/dd/f93 0 2026-03-09T16:15:18.559 INFO:tasks.workunit.client.1.vm05.stdout:1/655: write d7/dd/d21/d39/fdb [424626,83913] 0 2026-03-09T16:15:18.561 INFO:tasks.workunit.client.1.vm05.stdout:1/656: write d7/d27/f84 [385561,19276] 0 2026-03-09T16:15:18.562 INFO:tasks.workunit.client.1.vm05.stdout:9/612: getdents d4/d10/d35/d36/d48 0 2026-03-09T16:15:18.567 INFO:tasks.workunit.client.1.vm05.stdout:1/657: rename d7/dd/de/d52/d5b/la2 to d7/dd/d21/d63/le9 0 2026-03-09T16:15:18.579 INFO:tasks.workunit.client.1.vm05.stdout:1/658: link d7/d15/c4e d7/d62/da3/cea 0 2026-03-09T16:15:18.579 INFO:tasks.workunit.client.1.vm05.stdout:1/659: fdatasync d7/dd/d21/d39/f86 0 2026-03-09T16:15:18.613 INFO:tasks.workunit.client.1.vm05.stdout:1/660: fsync d7/d27/f84 0 
2026-03-09T16:15:18.613 INFO:tasks.workunit.client.1.vm05.stdout:1/661: fsync d7/dd/f1f 0 2026-03-09T16:15:18.616 INFO:tasks.workunit.client.1.vm05.stdout:1/662: mknod d7/dd/d21/d39/d87/ceb 0 2026-03-09T16:15:18.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:18 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:15:18.649 INFO:tasks.workunit.client.1.vm05.stdout:5/596: dread d8/d18/dbc/dcc/daa/fb1 [0,4194304] 0 2026-03-09T16:15:18.652 INFO:tasks.workunit.client.1.vm05.stdout:3/488: getdents d0/d33 0 2026-03-09T16:15:18.655 INFO:tasks.workunit.client.1.vm05.stdout:5/597: rename d8/d18/d1b/d47/c9c to d8/d18/d1b/d47/d48/d73/d80/cd7 0 2026-03-09T16:15:18.657 INFO:tasks.workunit.client.1.vm05.stdout:3/489: write d0/f56 [1550374,33629] 0 2026-03-09T16:15:18.659 INFO:tasks.workunit.client.1.vm05.stdout:6/556: dwrite d17/f1b [0,4194304] 0 2026-03-09T16:15:18.661 INFO:tasks.workunit.client.1.vm05.stdout:5/598: unlink d8/d5e/fb6 0 2026-03-09T16:15:18.662 INFO:tasks.workunit.client.1.vm05.stdout:3/490: unlink d0/d9/d22/d5f/c8f 0 2026-03-09T16:15:18.663 INFO:tasks.workunit.client.1.vm05.stdout:5/599: symlink d8/d1d/ld8 0 2026-03-09T16:15:18.665 INFO:tasks.workunit.client.1.vm05.stdout:3/491: symlink d0/d33/l9f 0 2026-03-09T16:15:18.665 INFO:tasks.workunit.client.1.vm05.stdout:6/557: mkdir d17/d22/d27/d34/dd1 0 2026-03-09T16:15:18.675 INFO:tasks.workunit.client.1.vm05.stdout:5/600: creat d8/d18/fd9 x:0 0 0 2026-03-09T16:15:18.676 INFO:tasks.workunit.client.1.vm05.stdout:5/601: fsync d8/d18/d1b/d47/d48/d73/d80/fac 0 2026-03-09T16:15:18.678 INFO:tasks.workunit.client.1.vm05.stdout:6/558: rename d17/d22/d27/d8a/cb5 to d17/d22/d27/cd2 0 2026-03-09T16:15:18.680 INFO:tasks.workunit.client.1.vm05.stdout:6/559: creat d17/d1d/fd3 x:0 0 0 2026-03-09T16:15:18.683 INFO:tasks.workunit.client.1.vm05.stdout:6/560: mkdir d17/d22/d27/d44/dd4 0 2026-03-09T16:15:18.689 INFO:tasks.workunit.client.1.vm05.stdout:6/561: creat d17/d5d/d73/d83/fd5 x:0 0 0 2026-03-09T16:15:18.690 INFO:tasks.workunit.client.1.vm05.stdout:5/602: sync 2026-03-09T16:15:18.694 INFO:tasks.workunit.client.1.vm05.stdout:5/603: truncate d8/d18/dbc/dcc/f3f 837174 0 2026-03-09T16:15:18.695 INFO:tasks.workunit.client.1.vm05.stdout:3/492: dread d0/d33/f5e [0,4194304] 0 2026-03-09T16:15:18.695 INFO:tasks.workunit.client.1.vm05.stdout:5/604: chown d8/f7b 351846 1 2026-03-09T16:15:18.705 INFO:tasks.workunit.client.1.vm05.stdout:3/493: creat d0/d9/d22/d6b/fa0 x:0 0 0 2026-03-09T16:15:18.706 INFO:tasks.workunit.client.1.vm05.stdout:0/570: dread d5/d11/f40 [0,4194304] 0 2026-03-09T16:15:18.716 INFO:tasks.workunit.client.1.vm05.stdout:7/640: write d1/d2/d8/dc/d1b/d30/d5e/f68 [130256,58773] 0 2026-03-09T16:15:18.729 INFO:tasks.workunit.client.1.vm05.stdout:2/544: write db/dd/f32 [2692151,74970] 0 2026-03-09T16:15:18.730 INFO:tasks.workunit.client.1.vm05.stdout:2/545: readlink db/dd/d15/d4c/l85 0 2026-03-09T16:15:18.731 INFO:tasks.workunit.client.1.vm05.stdout:4/623: truncate d5/de/d15/d21/d39/d91/faa 1151559 0 2026-03-09T16:15:18.731 INFO:tasks.workunit.client.1.vm05.stdout:2/546: readlink db/dd/d15/d4c/d56/l9e 0 2026-03-09T16:15:18.737 INFO:tasks.workunit.client.1.vm05.stdout:2/547: dread db/f2d [0,4194304] 0 2026-03-09T16:15:18.738 INFO:tasks.workunit.client.1.vm05.stdout:9/613: dwrite d4/d10/d35/d36/d48/f87 [0,4194304] 0 2026-03-09T16:15:18.746 INFO:tasks.workunit.client.1.vm05.stdout:2/548: dread db/dd/d15/d1f/f25 
[8388608,4194304] 0 2026-03-09T16:15:18.750 INFO:tasks.workunit.client.1.vm05.stdout:9/614: dwrite d4/f66 [0,4194304] 0 2026-03-09T16:15:18.752 INFO:tasks.workunit.client.1.vm05.stdout:2/549: dread db/dd/d15/d46/d67/f9a [0,4194304] 0 2026-03-09T16:15:18.772 INFO:tasks.workunit.client.1.vm05.stdout:1/663: write d7/dd/de/f2e [1901666,574] 0 2026-03-09T16:15:18.772 INFO:tasks.workunit.client.1.vm05.stdout:1/664: stat d7/dd/d21/d39/d5a/f41 0 2026-03-09T16:15:18.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:18 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:15:18.786 INFO:tasks.workunit.client.1.vm05.stdout:1/665: creat d7/dd/d21/d39/d48/d5d/fec x:0 0 0 2026-03-09T16:15:18.788 INFO:tasks.workunit.client.1.vm05.stdout:5/605: getdents d8/d18/d1b/d6b 0 2026-03-09T16:15:18.788 INFO:tasks.workunit.client.1.vm05.stdout:6/562: dwrite d17/d4f/f77 [0,4194304] 0 2026-03-09T16:15:18.794 INFO:tasks.workunit.client.1.vm05.stdout:1/666: dwrite d7/dd/d21/d63/d71/fd0 [0,4194304] 0 2026-03-09T16:15:18.797 INFO:tasks.workunit.client.1.vm05.stdout:1/667: write d7/dd/d21/d39/f86 [231881,126494] 0 2026-03-09T16:15:18.811 INFO:tasks.workunit.client.1.vm05.stdout:4/624: symlink d5/de/d15/d21/d27/d3c/d5c/da2/dc9/ldb 0 2026-03-09T16:15:18.812 INFO:tasks.workunit.client.1.vm05.stdout:4/625: chown d5/de/d15/d21/d27/f8f 1 1 2026-03-09T16:15:18.813 INFO:tasks.workunit.client.1.vm05.stdout:2/550: symlink db/dd/d15/lb4 0 2026-03-09T16:15:18.821 INFO:tasks.workunit.client.1.vm05.stdout:7/641: link d1/d2/d8/dc/d1b/d71/c5c d1/d2/d8/dc/d14/ce2 0 2026-03-09T16:15:18.826 INFO:tasks.workunit.client.1.vm05.stdout:8/593: link d4/d6/d53/f89 d4/d6/d3a/d40/d71/fc6 0 2026-03-09T16:15:18.830 INFO:tasks.workunit.client.1.vm05.stdout:8/594: chown d4/d6/db/dc/d2e/d85 225887 1 2026-03-09T16:15:18.830 INFO:tasks.workunit.client.1.vm05.stdout:1/668: mkdir d7/dbe/ded 0 2026-03-09T16:15:18.830 INFO:tasks.workunit.client.1.vm05.stdout:6/563: symlink d17/d22/d9d/ld6 0 2026-03-09T16:15:18.833 INFO:tasks.workunit.client.1.vm05.stdout:1/669: mkdir d7/d15/d45/dee 0 2026-03-09T16:15:18.834 INFO:tasks.workunit.client.1.vm05.stdout:4/626: symlink d5/de/d15/d21/d27/d3c/d5c/d5f/db6/ldc 0 2026-03-09T16:15:18.835 INFO:tasks.workunit.client.1.vm05.stdout:8/595: mkdir d4/d6/db/dc/d2e/d85/dc7 0 2026-03-09T16:15:18.835 INFO:tasks.workunit.client.1.vm05.stdout:8/596: symlink d4/d6/d3a/d15/lc8 2 2026-03-09T16:15:18.835 INFO:tasks.workunit.client.1.vm05.stdout:1/670: write d7/dd/d21/d39/d48/f59 [544204,72079] 0 2026-03-09T16:15:18.837 INFO:tasks.workunit.client.1.vm05.stdout:8/597: chown d4/d6/db/dc/d2e/d85/dc7 485 1 2026-03-09T16:15:18.839 INFO:tasks.workunit.client.1.vm05.stdout:2/551: sync 2026-03-09T16:15:18.842 INFO:tasks.workunit.client.1.vm05.stdout:6/564: truncate d17/d22/d27/d34/d4b/f98 388827 0 2026-03-09T16:15:18.842 INFO:tasks.workunit.client.1.vm05.stdout:8/598: fdatasync d4/d6/d3a/d15/f66 0 2026-03-09T16:15:18.848 INFO:tasks.workunit.client.1.vm05.stdout:2/552: creat db/dd/d15/d3f/d5b/d60/d6a/fb5 x:0 0 0 2026-03-09T16:15:18.851 INFO:tasks.workunit.client.1.vm05.stdout:8/599: dread d4/f1c [0,4194304] 0 2026-03-09T16:15:18.851 INFO:tasks.workunit.client.1.vm05.stdout:6/565: creat d17/d22/d27/fd7 x:0 0 0 2026-03-09T16:15:18.852 INFO:tasks.workunit.client.1.vm05.stdout:2/553: creat db/dd/d15/d3f/d5b/d60/d6a/fb6 x:0 0 0 2026-03-09T16:15:18.854 INFO:tasks.workunit.client.1.vm05.stdout:2/554: write db/dd/d15/f70 
[3739591,71257] 0 2026-03-09T16:15:18.858 INFO:tasks.workunit.client.1.vm05.stdout:6/566: mkdir d17/d22/d27/dd8 0 2026-03-09T16:15:18.860 INFO:tasks.workunit.client.1.vm05.stdout:2/555: read db/dd/d15/d3f/d5b/d60/d95/f80 [29275,75688] 0 2026-03-09T16:15:18.869 INFO:tasks.workunit.client.1.vm05.stdout:2/556: dwrite db/dd/d15/d1f/f25 [4194304,4194304] 0 2026-03-09T16:15:18.871 INFO:tasks.workunit.client.1.vm05.stdout:8/600: sync 2026-03-09T16:15:18.881 INFO:tasks.workunit.client.1.vm05.stdout:2/557: mknod db/dd/d15/d1f/d20/d23/cb7 0 2026-03-09T16:15:18.882 INFO:tasks.workunit.client.1.vm05.stdout:8/601: symlink d4/d6/db/df/lc9 0 2026-03-09T16:15:18.907 INFO:tasks.workunit.client.1.vm05.stdout:9/615: truncate d4/f2e 1471989 0 2026-03-09T16:15:18.918 INFO:tasks.workunit.client.1.vm05.stdout:9/616: dread d4/d10/d35/d36/d48/d54/d59/f5c [0,4194304] 0 2026-03-09T16:15:18.933 INFO:tasks.workunit.client.1.vm05.stdout:8/602: creat d4/fca x:0 0 0 2026-03-09T16:15:18.933 INFO:tasks.workunit.client.1.vm05.stdout:8/603: write d4/f13 [1764305,19381] 0 2026-03-09T16:15:18.934 INFO:tasks.workunit.client.1.vm05.stdout:8/604: chown d4/d6/d3a/f49 23439686 1 2026-03-09T16:15:18.936 INFO:tasks.workunit.client.1.vm05.stdout:4/627: creat d5/de/d15/d21/d27/d3c/d5c/d5f/fdd x:0 0 0 2026-03-09T16:15:18.937 INFO:tasks.workunit.client.1.vm05.stdout:4/628: mknod d5/d9c/cde 0 2026-03-09T16:15:18.938 INFO:tasks.workunit.client.1.vm05.stdout:8/605: getdents d4/d6/d3a/d40/d71 0 2026-03-09T16:15:18.939 INFO:tasks.workunit.client.1.vm05.stdout:5/606: mkdir d8/d18/d1b/d47/dda 0 2026-03-09T16:15:18.939 INFO:tasks.workunit.client.1.vm05.stdout:4/629: write d5/de/d15/d21/d27/d3c/d5c/d5f/db6/fb8 [787232,31026] 0 2026-03-09T16:15:18.939 INFO:tasks.workunit.client.1.vm05.stdout:8/606: creat d4/d6/db/d59/fcb x:0 0 0 2026-03-09T16:15:18.940 INFO:tasks.workunit.client.1.vm05.stdout:4/630: write d5/de/d15/d21/d39/d91/fae [180998,126984] 0 2026-03-09T16:15:18.940 INFO:tasks.workunit.client.1.vm05.stdout:8/607: chown d4/d6/d3a/d40/f76 2492 1 2026-03-09T16:15:18.943 INFO:tasks.workunit.client.1.vm05.stdout:4/631: chown d5/de/d15/da9/db1/dad/la4 1 1 2026-03-09T16:15:18.943 INFO:tasks.workunit.client.1.vm05.stdout:6/567: write d17/f3b [1626368,55181] 0 2026-03-09T16:15:18.944 INFO:tasks.workunit.client.1.vm05.stdout:6/568: fdatasync d17/f31 0 2026-03-09T16:15:18.945 INFO:tasks.workunit.client.1.vm05.stdout:4/632: write d5/f3e [658929,80004] 0 2026-03-09T16:15:18.949 INFO:tasks.workunit.client.1.vm05.stdout:6/569: chown d17/d22/d27/d34/d42/d65/lbc 44378 1 2026-03-09T16:15:18.949 INFO:tasks.workunit.client.1.vm05.stdout:6/570: chown d17/d22/d27/d34/d42/d53 0 1 2026-03-09T16:15:18.951 INFO:tasks.workunit.client.1.vm05.stdout:6/571: creat d17/d22/d9d/da5/fd9 x:0 0 0 2026-03-09T16:15:18.952 INFO:tasks.workunit.client.1.vm05.stdout:0/571: rename d5/db/d48/d66/l6c to d5/lc1 0 2026-03-09T16:15:18.953 INFO:tasks.workunit.client.1.vm05.stdout:3/494: rename d0/d9/d22/d5f/d75 to d0/d9/d22/d5f/d75/d76/d88/da1 22 2026-03-09T16:15:18.953 INFO:tasks.workunit.client.1.vm05.stdout:4/633: sync 2026-03-09T16:15:18.954 INFO:tasks.workunit.client.1.vm05.stdout:3/495: sync 2026-03-09T16:15:18.956 INFO:tasks.workunit.client.1.vm05.stdout:0/572: mkdir d5/d1b/d3b/dc2 0 2026-03-09T16:15:18.957 INFO:tasks.workunit.client.1.vm05.stdout:7/642: rename d1/d2/d8/dc/d33/fb4 to d1/d2/d8/dc/d1b/d30/d4b/d65/fe3 0 2026-03-09T16:15:18.960 INFO:tasks.workunit.client.1.vm05.stdout:3/496: rmdir d0/d9/d8b 39 2026-03-09T16:15:18.960 INFO:tasks.workunit.client.1.vm05.stdout:1/671: 
dwrite d7/d15/f8d [0,4194304] 0 2026-03-09T16:15:18.961 INFO:tasks.workunit.client.1.vm05.stdout:4/634: mknod d5/de/d15/d21/cdf 0 2026-03-09T16:15:18.961 INFO:tasks.workunit.client.1.vm05.stdout:6/572: dread d17/f1c [0,4194304] 0 2026-03-09T16:15:18.961 INFO:tasks.workunit.client.1.vm05.stdout:4/635: chown d5/de/d15 918 1 2026-03-09T16:15:18.965 INFO:tasks.workunit.client.1.vm05.stdout:7/643: readlink d1/d2/d8/dc/d14/l7e 0 2026-03-09T16:15:18.965 INFO:tasks.workunit.client.1.vm05.stdout:2/558: creat db/dd/fb8 x:0 0 0 2026-03-09T16:15:18.966 INFO:tasks.workunit.client.1.vm05.stdout:3/497: dread d0/d33/f85 [0,4194304] 0 2026-03-09T16:15:18.966 INFO:tasks.workunit.client.1.vm05.stdout:1/672: rmdir d7/d62/da3 39 2026-03-09T16:15:18.966 INFO:tasks.workunit.client.1.vm05.stdout:6/573: dwrite f16 [4194304,4194304] 0 2026-03-09T16:15:18.967 INFO:tasks.workunit.client.1.vm05.stdout:7/644: read - d1/d2/d11/d86/da2/fb0 zero size 2026-03-09T16:15:18.968 INFO:tasks.workunit.client.1.vm05.stdout:1/673: write d7/dd/de/d52/fd7 [478396,44520] 0 2026-03-09T16:15:18.969 INFO:tasks.workunit.client.1.vm05.stdout:3/498: sync 2026-03-09T16:15:18.969 INFO:tasks.workunit.client.1.vm05.stdout:2/559: readlink db/dd/d15/d1f/d21/l40 0 2026-03-09T16:15:18.971 INFO:tasks.workunit.client.1.vm05.stdout:3/499: truncate d0/d33/f92 289308 0 2026-03-09T16:15:18.971 INFO:tasks.workunit.client.1.vm05.stdout:7/645: chown d1/d2/d8/dc/d1b/d30/d4b/d65/c3a 113 1 2026-03-09T16:15:18.978 INFO:tasks.workunit.client.1.vm05.stdout:0/573: mkdir d5/db/d48/dc3 0 2026-03-09T16:15:18.979 INFO:tasks.workunit.client.1.vm05.stdout:6/574: readlink d17/d22/d27/d34/d4b/d7f/lc6 0 2026-03-09T16:15:18.985 INFO:tasks.workunit.client.1.vm05.stdout:9/617: rename d4/d10/d35/d36/d48/d60/f8f to d4/d10/d35/d36/fce 0 2026-03-09T16:15:18.986 INFO:tasks.workunit.client.1.vm05.stdout:4/636: creat d5/de/d15/d21/d27/d3c/d5c/fe0 x:0 0 0 2026-03-09T16:15:18.986 INFO:tasks.workunit.client.1.vm05.stdout:0/574: rmdir d5/db/d5b 39 2026-03-09T16:15:18.987 INFO:tasks.workunit.client.1.vm05.stdout:7/646: chown d1/d2/d8/d31/d8d/cd0 0 1 2026-03-09T16:15:18.987 INFO:tasks.workunit.client.1.vm05.stdout:6/575: creat d17/d22/d27/d34/d42/fda x:0 0 0 2026-03-09T16:15:18.988 INFO:tasks.workunit.client.1.vm05.stdout:1/674: dwrite d7/dd/d21/d39/d87/db9/fe1 [0,4194304] 0 2026-03-09T16:15:18.989 INFO:tasks.workunit.client.1.vm05.stdout:5/607: rename d8/d18/d1b/d47/d48/d73/d80/fac to d8/dc8/fdb 0 2026-03-09T16:15:18.995 INFO:tasks.workunit.client.1.vm05.stdout:8/608: rename d4/d6/db/dc/d5d to d4/d6/db/dc/d5d/d79/dcc 22 2026-03-09T16:15:18.995 INFO:tasks.workunit.client.1.vm05.stdout:3/500: creat d0/d9/d22/d5f/d90/fa2 x:0 0 0 2026-03-09T16:15:18.996 INFO:tasks.workunit.client.1.vm05.stdout:9/618: creat d4/d10/d35/d36/d48/d60/dae/fcf x:0 0 0 2026-03-09T16:15:18.996 INFO:tasks.workunit.client.1.vm05.stdout:7/647: chown d1/d2/d11/d86/lca 1 1 2026-03-09T16:15:18.996 INFO:tasks.workunit.client.1.vm05.stdout:1/675: stat d7/dd/d21/d3b/d55 0 2026-03-09T16:15:19.003 INFO:tasks.workunit.client.1.vm05.stdout:8/609: write d4/d6/db/dc/f17 [2354456,103516] 0 2026-03-09T16:15:19.005 INFO:tasks.workunit.client.1.vm05.stdout:8/610: fdatasync d4/d6/d53/fb1 0 2026-03-09T16:15:19.008 INFO:tasks.workunit.client.1.vm05.stdout:6/576: symlink d17/d22/d27/d8a/d8b/ldb 0 2026-03-09T16:15:19.010 INFO:tasks.workunit.client.1.vm05.stdout:5/608: stat d8/d18/d1b/d47/d4e/d76/d8f/dab/lb5 0 2026-03-09T16:15:19.010 INFO:tasks.workunit.client.1.vm05.stdout:4/637: rename d5/de/d15/da9/db1/dad/l1a to 
d5/de/d15/da9/db1/dad/d37/d60/le1 0 2026-03-09T16:15:19.010 INFO:tasks.workunit.client.1.vm05.stdout:9/619: mknod d4/d10/d35/d2b/d31/d82/cd0 0 2026-03-09T16:15:19.010 INFO:tasks.workunit.client.1.vm05.stdout:7/648: creat d1/d2/d8/dc/d14/fe4 x:0 0 0 2026-03-09T16:15:19.014 INFO:tasks.workunit.client.1.vm05.stdout:5/609: mknod d8/d5e/d8e/cdc 0 2026-03-09T16:15:19.015 INFO:tasks.workunit.client.1.vm05.stdout:4/638: write d5/de/f9d [3682736,41593] 0 2026-03-09T16:15:19.015 INFO:tasks.workunit.client.1.vm05.stdout:5/610: dread - d8/d18/fc5 zero size 2026-03-09T16:15:19.016 INFO:tasks.workunit.client.1.vm05.stdout:2/560: dread db/dd/d15/d1f/f2b [0,4194304] 0 2026-03-09T16:15:19.022 INFO:tasks.workunit.client.1.vm05.stdout:9/620: write d4/d10/d35/f32 [1162086,41095] 0 2026-03-09T16:15:19.024 INFO:tasks.workunit.client.1.vm05.stdout:5/611: creat d8/d18/d1b/d47/d48/d73/fdd x:0 0 0 2026-03-09T16:15:19.025 INFO:tasks.workunit.client.1.vm05.stdout:0/575: link d5/d11/f40 d5/d2c/d49/d83/d8b/daf/fc4 0 2026-03-09T16:15:19.025 INFO:tasks.workunit.client.1.vm05.stdout:6/577: link d17/c4d d17/d22/d27/d8a/cdc 0 2026-03-09T16:15:19.026 INFO:tasks.workunit.client.1.vm05.stdout:5/612: write d8/d18/d1b/f28 [8590858,573] 0 2026-03-09T16:15:19.027 INFO:tasks.workunit.client.1.vm05.stdout:7/649: dread d1/d2/d8/dc/d1b/d71/f74 [0,4194304] 0 2026-03-09T16:15:19.029 INFO:tasks.workunit.client.1.vm05.stdout:5/613: chown d8/d53/d7a/f92 27 1 2026-03-09T16:15:19.030 INFO:tasks.workunit.client.1.vm05.stdout:5/614: write f5 [4845067,71586] 0 2026-03-09T16:15:19.030 INFO:tasks.workunit.client.1.vm05.stdout:7/650: dread - d1/d2/d8/dc/d9c/fda zero size 2026-03-09T16:15:19.032 INFO:tasks.workunit.client.1.vm05.stdout:7/651: chown d1/d2/d8/dc/d14/fe4 93165538 1 2026-03-09T16:15:19.034 INFO:tasks.workunit.client.1.vm05.stdout:7/652: chown d1/d2/d8/dc/d1b/d71/d3c/f9b 2 1 2026-03-09T16:15:19.034 INFO:tasks.workunit.client.1.vm05.stdout:4/639: truncate d5/de/d15/da9/db1/dad/d37/d60/dbf/f70 746945 0 2026-03-09T16:15:19.034 INFO:tasks.workunit.client.1.vm05.stdout:2/561: creat db/dd/fb9 x:0 0 0 2026-03-09T16:15:19.039 INFO:tasks.workunit.client.1.vm05.stdout:5/615: mknod d8/d5e/cde 0 2026-03-09T16:15:19.040 INFO:tasks.workunit.client.1.vm05.stdout:7/653: write d1/d2/d8/dc/d1b/d30/d4b/d65/fe3 [342245,95240] 0 2026-03-09T16:15:19.041 INFO:tasks.workunit.client.1.vm05.stdout:5/616: dread - d8/d18/d1b/d47/d48/f60 zero size 2026-03-09T16:15:19.041 INFO:tasks.workunit.client.1.vm05.stdout:6/578: mknod d17/d5d/d73/cdd 0 2026-03-09T16:15:19.041 INFO:tasks.workunit.client.1.vm05.stdout:2/562: getdents db/dd/d15/d46/d67 0 2026-03-09T16:15:19.042 INFO:tasks.workunit.client.1.vm05.stdout:7/654: chown d1/d2/d8/dc/c36 431215653 1 2026-03-09T16:15:19.043 INFO:tasks.workunit.client.1.vm05.stdout:6/579: write d17/d22/d27/d44/f48 [6559392,100344] 0 2026-03-09T16:15:19.045 INFO:tasks.workunit.client.1.vm05.stdout:7/655: dread - d1/d2/d8/d31/d8d/f6f zero size 2026-03-09T16:15:19.046 INFO:tasks.workunit.client.1.vm05.stdout:6/580: chown d17/d22/d9d/ld6 60 1 2026-03-09T16:15:19.048 INFO:tasks.workunit.client.1.vm05.stdout:4/640: creat d5/de/d15/da9/db1/dad/d90/dd8/fe2 x:0 0 0 2026-03-09T16:15:19.048 INFO:tasks.workunit.client.1.vm05.stdout:5/617: mknod d8/d95/cdf 0 2026-03-09T16:15:19.049 INFO:tasks.workunit.client.1.vm05.stdout:5/618: fsync f5 0 2026-03-09T16:15:19.053 INFO:tasks.workunit.client.1.vm05.stdout:2/563: mknod db/dd/d15/d1f/d20/d23/cba 0 2026-03-09T16:15:19.054 INFO:tasks.workunit.client.1.vm05.stdout:0/576: link d5/d1b/d30/l3e d5/db/lc5 0 
2026-03-09T16:15:19.057 INFO:tasks.workunit.client.1.vm05.stdout:6/581: write d17/f4e [406741,73498] 0 2026-03-09T16:15:19.057 INFO:tasks.workunit.client.1.vm05.stdout:4/641: dwrite d5/de/d15/da9/db1/dad/d37/d60/dbf/fc7 [0,4194304] 0 2026-03-09T16:15:19.058 INFO:tasks.workunit.client.1.vm05.stdout:0/577: dread - d5/f73 zero size 2026-03-09T16:15:19.063 INFO:tasks.workunit.client.1.vm05.stdout:4/642: mkdir d5/de/d15/d21/da0/de3 0 2026-03-09T16:15:19.065 INFO:tasks.workunit.client.1.vm05.stdout:0/578: creat d5/db/d5f/da3/fc6 x:0 0 0 2026-03-09T16:15:19.067 INFO:tasks.workunit.client.1.vm05.stdout:0/579: write d5/f73 [1007893,24073] 0 2026-03-09T16:15:19.073 INFO:tasks.workunit.client.1.vm05.stdout:4/643: dwrite d5/f3e [0,4194304] 0 2026-03-09T16:15:19.075 INFO:tasks.workunit.client.1.vm05.stdout:4/644: stat d5/de/d15/d21/d27/f29 0 2026-03-09T16:15:19.076 INFO:tasks.workunit.client.1.vm05.stdout:6/582: rename d17/d22/d27/c46 to d17/d4f/cde 0 2026-03-09T16:15:19.084 INFO:tasks.workunit.client.1.vm05.stdout:4/645: symlink d5/le4 0 2026-03-09T16:15:19.084 INFO:tasks.workunit.client.1.vm05.stdout:2/564: getdents db/dd/d15/d1f 0 2026-03-09T16:15:19.084 INFO:tasks.workunit.client.1.vm05.stdout:0/580: creat d5/db/d5b/d82/fc7 x:0 0 0 2026-03-09T16:15:19.087 INFO:tasks.workunit.client.1.vm05.stdout:2/565: fdatasync db/dd/d15/d1f/d21/f5d 0 2026-03-09T16:15:19.094 INFO:tasks.workunit.client.1.vm05.stdout:4/646: fdatasync d5/de/f24 0 2026-03-09T16:15:19.094 INFO:tasks.workunit.client.1.vm05.stdout:0/581: rename d5/d9e/la9 to d5/db/d77/lc8 0 2026-03-09T16:15:19.094 INFO:tasks.workunit.client.1.vm05.stdout:2/566: creat db/dd/d15/d1f/d20/d23/fbb x:0 0 0 2026-03-09T16:15:19.094 INFO:tasks.workunit.client.1.vm05.stdout:2/567: symlink db/dd/d15/d46/d8d/lbc 0 2026-03-09T16:15:19.094 INFO:tasks.workunit.client.1.vm05.stdout:4/647: mknod d5/de/d15/d21/d27/d3c/d5c/d5f/ce5 0 2026-03-09T16:15:19.094 INFO:tasks.workunit.client.1.vm05.stdout:6/583: sync 2026-03-09T16:15:19.095 INFO:tasks.workunit.client.1.vm05.stdout:4/648: write d5/f59 [2918444,92555] 0 2026-03-09T16:15:19.099 INFO:tasks.workunit.client.1.vm05.stdout:4/649: chown d5/de/d15/da9/db1/dad/c4c 143987 1 2026-03-09T16:15:19.103 INFO:tasks.workunit.client.1.vm05.stdout:4/650: rename d5/de/d15/da9/db1/dad/d90/dbb/lda to d5/de/d2f/d8a/le6 0 2026-03-09T16:15:19.113 INFO:tasks.workunit.client.1.vm05.stdout:4/651: symlink d5/de/d15/da9/db1/dad/d90/dbb/le7 0 2026-03-09T16:15:19.114 INFO:tasks.workunit.client.1.vm05.stdout:1/676: dwrite d7/f4b [4194304,4194304] 0 2026-03-09T16:15:19.117 INFO:tasks.workunit.client.1.vm05.stdout:1/677: write d7/d62/db6/fc1 [617982,23481] 0 2026-03-09T16:15:19.117 INFO:tasks.workunit.client.1.vm05.stdout:8/611: dwrite d4/d6/f5f [0,4194304] 0 2026-03-09T16:15:19.118 INFO:tasks.workunit.client.1.vm05.stdout:4/652: dread - d5/de/d82/fcd zero size 2026-03-09T16:15:19.118 INFO:tasks.workunit.client.1.vm05.stdout:1/678: write d7/dd/de/f3e [2025721,100078] 0 2026-03-09T16:15:19.126 INFO:tasks.workunit.client.1.vm05.stdout:8/612: rmdir d4/d6/db/dc 39 2026-03-09T16:15:19.126 INFO:tasks.workunit.client.1.vm05.stdout:4/653: rename d5/de/d15/da9/db1/f68 to d5/de/d15/d21/d27/d3c/fe8 0 2026-03-09T16:15:19.127 INFO:tasks.workunit.client.1.vm05.stdout:8/613: truncate d4/d6/d3a/f25 3055397 0 2026-03-09T16:15:19.128 INFO:tasks.workunit.client.1.vm05.stdout:1/679: write d7/dd/de/f38 [2586434,6611] 0 2026-03-09T16:15:19.132 INFO:tasks.workunit.client.1.vm05.stdout:8/614: write d4/d6/d3a/d3c/f45 [582862,98170] 0 2026-03-09T16:15:19.135 
INFO:tasks.workunit.client.1.vm05.stdout:4/654: mkdir d5/de/d15/d21/d39/d91/de9 0 2026-03-09T16:15:19.139 INFO:tasks.workunit.client.1.vm05.stdout:8/615: mknod d4/d6/db/dc/d3b/ccd 0 2026-03-09T16:15:19.139 INFO:tasks.workunit.client.1.vm05.stdout:1/680: dwrite d7/d15/d16/dc2/fc7 [4194304,4194304] 0 2026-03-09T16:15:19.140 INFO:tasks.workunit.client.1.vm05.stdout:1/681: write d7/d27/f4d [228076,128720] 0 2026-03-09T16:15:19.142 INFO:tasks.workunit.client.1.vm05.stdout:1/682: fdatasync d7/dd/d21/d39/fa4 0 2026-03-09T16:15:19.142 INFO:tasks.workunit.client.1.vm05.stdout:8/616: symlink d4/d6/d3a/d40/d6a/d97/lce 0 2026-03-09T16:15:19.143 INFO:tasks.workunit.client.1.vm05.stdout:4/655: sync 2026-03-09T16:15:19.153 INFO:tasks.workunit.client.1.vm05.stdout:8/617: creat d4/d6/db/dc/d5d/da0/dbf/fcf x:0 0 0 2026-03-09T16:15:19.154 INFO:tasks.workunit.client.1.vm05.stdout:1/683: unlink d7/d15/d45/l4a 0 2026-03-09T16:15:19.155 INFO:tasks.workunit.client.1.vm05.stdout:1/684: creat d7/daa/fef x:0 0 0 2026-03-09T16:15:19.155 INFO:tasks.workunit.client.1.vm05.stdout:8/618: symlink d4/d6/d3a/d7c/ld0 0 2026-03-09T16:15:19.156 INFO:tasks.workunit.client.1.vm05.stdout:8/619: stat d4/d6/d53/fb1 0 2026-03-09T16:15:19.157 INFO:tasks.workunit.client.1.vm05.stdout:1/685: mknod d7/dd/d21/d3b/cf0 0 2026-03-09T16:15:19.159 INFO:tasks.workunit.client.1.vm05.stdout:1/686: readlink d7/l9a 0 2026-03-09T16:15:19.160 INFO:tasks.workunit.client.1.vm05.stdout:8/620: truncate d4/d6/db/dc/d5d/fbd 466680 0 2026-03-09T16:15:19.160 INFO:tasks.workunit.client.1.vm05.stdout:1/687: readlink d7/d15/d16/dc2/ld1 0 2026-03-09T16:15:19.163 INFO:tasks.workunit.client.1.vm05.stdout:8/621: unlink d4/d6/d3a/f8e 0 2026-03-09T16:15:19.164 INFO:tasks.workunit.client.1.vm05.stdout:1/688: chown d7/d62/da3/fbb 3155 1 2026-03-09T16:15:19.171 INFO:tasks.workunit.client.1.vm05.stdout:8/622: rename d4/d92 to d4/d6/db/df/dd1 0 2026-03-09T16:15:19.171 INFO:tasks.workunit.client.1.vm05.stdout:1/689: dwrite d7/dd/de/f32 [0,4194304] 0 2026-03-09T16:15:19.174 INFO:tasks.workunit.client.1.vm05.stdout:8/623: write d4/d6/d9a/db3/f9d [2036558,119190] 0 2026-03-09T16:15:19.178 INFO:tasks.workunit.client.1.vm05.stdout:0/582: dread d5/db/d48/d66/f91 [0,4194304] 0 2026-03-09T16:15:19.184 INFO:tasks.workunit.client.1.vm05.stdout:8/624: mknod d4/d6/d3a/cd2 0 2026-03-09T16:15:19.185 INFO:tasks.workunit.client.1.vm05.stdout:1/690: chown d7/d62/da3/cea 41572 1 2026-03-09T16:15:19.185 INFO:tasks.workunit.client.1.vm05.stdout:8/625: chown d4/d6/d3a/cd2 0 1 2026-03-09T16:15:19.186 INFO:tasks.workunit.client.1.vm05.stdout:8/626: write d4/d6/db/dc/f2a [2980568,122803] 0 2026-03-09T16:15:19.187 INFO:tasks.workunit.client.1.vm05.stdout:0/583: unlink d5/f79 0 2026-03-09T16:15:19.187 INFO:tasks.workunit.client.1.vm05.stdout:8/627: chown d4/d6/d3a/d40 1572 1 2026-03-09T16:15:19.188 INFO:tasks.workunit.client.1.vm05.stdout:0/584: truncate d5/d2c/f84 49274 0 2026-03-09T16:15:19.190 INFO:tasks.workunit.client.1.vm05.stdout:1/691: creat d7/d15/d45/dee/ff1 x:0 0 0 2026-03-09T16:15:19.191 INFO:tasks.workunit.client.1.vm05.stdout:1/692: stat d7/d27/f64 0 2026-03-09T16:15:19.192 INFO:tasks.workunit.client.1.vm05.stdout:0/585: truncate d5/db/d5b/f35 1232464 0 2026-03-09T16:15:19.193 INFO:tasks.workunit.client.1.vm05.stdout:8/628: link d4/d6/d3a/d15/c7d d4/d6/d3a/d7c/cd3 0 2026-03-09T16:15:19.199 INFO:tasks.workunit.client.1.vm05.stdout:8/629: dwrite d4/d6/f5f [0,4194304] 0 2026-03-09T16:15:19.202 INFO:tasks.workunit.client.1.vm05.stdout:5/619: write d8/f13 [5078907,109045] 0 
2026-03-09T16:15:19.206 INFO:tasks.workunit.client.1.vm05.stdout:0/586: write d5/d1b/d30/f29 [1560814,24697] 0 2026-03-09T16:15:19.206 INFO:tasks.workunit.client.1.vm05.stdout:7/656: dwrite d1/d2/d8/dc/d1b/d30/d4b/d65/db1/fb3 [4194304,4194304] 0 2026-03-09T16:15:19.209 INFO:tasks.workunit.client.1.vm05.stdout:8/630: mknod d4/d6/db/dc/cd4 0 2026-03-09T16:15:19.209 INFO:tasks.workunit.client.1.vm05.stdout:5/620: chown d8/l79 3070 1 2026-03-09T16:15:19.210 INFO:tasks.workunit.client.1.vm05.stdout:0/587: creat d5/d2c/d49/d83/fc9 x:0 0 0 2026-03-09T16:15:19.211 INFO:tasks.workunit.client.1.vm05.stdout:7/657: chown d1/d2/d8/dc/d1b/d30/d4b/fcc 1659742 1 2026-03-09T16:15:19.213 INFO:tasks.workunit.client.1.vm05.stdout:8/631: symlink d4/d6/db/da6/ld5 0 2026-03-09T16:15:19.214 INFO:tasks.workunit.client.1.vm05.stdout:7/658: creat d1/d2/d11/d86/d8a/fe5 x:0 0 0 2026-03-09T16:15:19.215 INFO:tasks.workunit.client.1.vm05.stdout:0/588: creat d5/db/d5b/da5/fca x:0 0 0 2026-03-09T16:15:19.215 INFO:tasks.workunit.client.1.vm05.stdout:8/632: chown d4/d6/db/df/d80/d86 197871396 1 2026-03-09T16:15:19.216 INFO:tasks.workunit.client.1.vm05.stdout:0/589: creat d5/d11/d4f/d68/fcb x:0 0 0 2026-03-09T16:15:19.217 INFO:tasks.workunit.client.1.vm05.stdout:7/659: mkdir d1/d2/d8/dc/d1b/de6 0 2026-03-09T16:15:19.217 INFO:tasks.workunit.client.1.vm05.stdout:8/633: truncate d4/d6/d53/fb1 921436 0 2026-03-09T16:15:19.219 INFO:tasks.workunit.client.1.vm05.stdout:8/634: write d4/d6/db/df/d80/fb9 [337678,33994] 0 2026-03-09T16:15:19.219 INFO:tasks.workunit.client.1.vm05.stdout:0/590: creat d5/db/d5b/d82/fcc x:0 0 0 2026-03-09T16:15:19.223 INFO:tasks.workunit.client.1.vm05.stdout:0/591: write d5/d2c/d49/d83/d8b/d95/f2e [2425605,20888] 0 2026-03-09T16:15:19.231 INFO:tasks.workunit.client.1.vm05.stdout:0/592: creat d5/db/d5b/fcd x:0 0 0 2026-03-09T16:15:19.235 INFO:tasks.workunit.client.1.vm05.stdout:2/568: truncate f7 284431 0 2026-03-09T16:15:19.240 INFO:tasks.workunit.client.1.vm05.stdout:2/569: dread db/dd/d15/d1f/d20/d86/f8f [0,4194304] 0 2026-03-09T16:15:19.242 INFO:tasks.workunit.client.1.vm05.stdout:9/621: dwrite d4/f2e [0,4194304] 0 2026-03-09T16:15:19.248 INFO:tasks.workunit.client.1.vm05.stdout:7/660: dread d1/d2/d8/dc/d33/f9d [0,4194304] 0 2026-03-09T16:15:19.256 INFO:tasks.workunit.client.1.vm05.stdout:9/622: link d4/d10/d35/d36/d48/cb9 d4/d10/d35/d36/d48/cd1 0 2026-03-09T16:15:19.257 INFO:tasks.workunit.client.1.vm05.stdout:9/623: write d4/d10/d35/d2b/d31/f99 [583648,7690] 0 2026-03-09T16:15:19.259 INFO:tasks.workunit.client.1.vm05.stdout:9/624: mkdir d4/d10/d35/d36/d48/d60/dcb/dd2 0 2026-03-09T16:15:19.260 INFO:tasks.workunit.client.1.vm05.stdout:9/625: chown d4/d10/d35/d2b/d38/d65/lb4 42551 1 2026-03-09T16:15:19.262 INFO:tasks.workunit.client.1.vm05.stdout:9/626: creat d4/d10/d35/d36/d48/d60/fd3 x:0 0 0 2026-03-09T16:15:19.263 INFO:tasks.workunit.client.1.vm05.stdout:9/627: creat d4/d10/d35/d36/d48/d60/d94/fd4 x:0 0 0 2026-03-09T16:15:19.274 INFO:tasks.workunit.client.1.vm05.stdout:9/628: sync 2026-03-09T16:15:19.275 INFO:tasks.workunit.client.1.vm05.stdout:9/629: write d4/d10/d35/d36/d48/d60/d94/fd4 [160320,37188] 0 2026-03-09T16:15:19.281 INFO:tasks.workunit.client.1.vm05.stdout:9/630: link d4/d10/d35/l56 d4/d10/d35/d36/ld5 0 2026-03-09T16:15:19.283 INFO:tasks.workunit.client.1.vm05.stdout:2/570: dread db/dd/d15/d3f/d5b/f69 [0,4194304] 0 2026-03-09T16:15:19.284 INFO:tasks.workunit.client.1.vm05.stdout:2/571: mknod db/dd/d15/d46/cbd 0 2026-03-09T16:15:19.290 INFO:tasks.workunit.client.1.vm05.stdout:2/572: rmdir 
db/dd/d15/d46/d8d/da0 0 2026-03-09T16:15:19.292 INFO:tasks.workunit.client.1.vm05.stdout:2/573: creat db/dd/d15/d1f/d21/d87/fbe x:0 0 0 2026-03-09T16:15:19.298 INFO:tasks.workunit.client.1.vm05.stdout:5/621: dread d8/f55 [0,4194304] 0 2026-03-09T16:15:19.302 INFO:tasks.workunit.client.1.vm05.stdout:5/622: dwrite d8/d18/dbc/dcc/daa/f52 [0,4194304] 0 2026-03-09T16:15:19.305 INFO:tasks.workunit.client.1.vm05.stdout:5/623: mknod d8/dd5/ce0 0 2026-03-09T16:15:19.364 INFO:tasks.workunit.client.1.vm05.stdout:3/501: dread d0/d9/f2b [0,4194304] 0 2026-03-09T16:15:19.372 INFO:tasks.workunit.client.1.vm05.stdout:3/502: dread d0/d9/f5c [0,4194304] 0 2026-03-09T16:15:19.373 INFO:tasks.workunit.client.1.vm05.stdout:3/503: chown d0/d9/cb 408228116 1 2026-03-09T16:15:19.374 INFO:tasks.workunit.client.1.vm05.stdout:3/504: readlink d0/d9/d22/d4c/l53 0 2026-03-09T16:15:19.375 INFO:tasks.workunit.client.1.vm05.stdout:3/505: stat d0/d9/f2c 0 2026-03-09T16:15:19.376 INFO:tasks.workunit.client.1.vm05.stdout:3/506: write d0/f57 [1640554,86912] 0 2026-03-09T16:15:19.376 INFO:tasks.workunit.client.1.vm05.stdout:3/507: dread - d0/d9/f93 zero size 2026-03-09T16:15:19.378 INFO:tasks.workunit.client.1.vm05.stdout:3/508: fsync d0/d9/d22/f2a 0 2026-03-09T16:15:19.379 INFO:tasks.workunit.client.1.vm05.stdout:3/509: fdatasync d0/fd 0 2026-03-09T16:15:19.394 INFO:tasks.workunit.client.1.vm05.stdout:3/510: write d0/d9/d22/d6b/fa0 [2303,91486] 0 2026-03-09T16:15:19.397 INFO:tasks.workunit.client.1.vm05.stdout:3/511: fsync d0/d33/f63 0 2026-03-09T16:15:19.400 INFO:tasks.workunit.client.1.vm05.stdout:3/512: mkdir d0/d9/d22/d5f/d75/d76/d88/da3 0 2026-03-09T16:15:19.403 INFO:tasks.workunit.client.1.vm05.stdout:3/513: dwrite d0/f7c [0,4194304] 0 2026-03-09T16:15:19.405 INFO:tasks.workunit.client.1.vm05.stdout:3/514: symlink d0/d9/d22/d5f/d75/d76/d88/d89/la4 0 2026-03-09T16:15:19.411 INFO:tasks.workunit.client.1.vm05.stdout:3/515: creat d0/d9/d22/d5f/d75/d76/fa5 x:0 0 0 2026-03-09T16:15:19.412 INFO:tasks.workunit.client.1.vm05.stdout:3/516: creat d0/d9/d22/d5f/d90/fa6 x:0 0 0 2026-03-09T16:15:19.414 INFO:tasks.workunit.client.1.vm05.stdout:3/517: creat d0/d9/d22/d5f/fa7 x:0 0 0 2026-03-09T16:15:19.415 INFO:tasks.workunit.client.1.vm05.stdout:3/518: mkdir d0/d9/d22/d5f/d7b/da8 0 2026-03-09T16:15:19.416 INFO:tasks.workunit.client.1.vm05.stdout:3/519: fdatasync d0/d33/f92 0 2026-03-09T16:15:19.418 INFO:tasks.workunit.client.1.vm05.stdout:4/656: dread d5/de/d15/d21/d39/f53 [0,4194304] 0 2026-03-09T16:15:19.418 INFO:tasks.workunit.client.1.vm05.stdout:3/520: getdents d0/d9/d22/d4c/d80 0 2026-03-09T16:15:19.422 INFO:tasks.workunit.client.1.vm05.stdout:3/521: dwrite d0/d9/f5c [0,4194304] 0 2026-03-09T16:15:19.422 INFO:tasks.workunit.client.1.vm05.stdout:4/657: dread d5/de/d15/da9/db1/dad/d37/d60/dbf/f70 [0,4194304] 0 2026-03-09T16:15:19.433 INFO:tasks.workunit.client.1.vm05.stdout:4/658: mknod d5/de/d15/da9/db1/dad/d90/cea 0 2026-03-09T16:15:19.434 INFO:tasks.workunit.client.1.vm05.stdout:3/522: mkdir d0/da9 0 2026-03-09T16:15:19.434 INFO:tasks.workunit.client.1.vm05.stdout:4/659: rmdir d5/de/d15/da9/db1/dad/d90 39 2026-03-09T16:15:19.435 INFO:tasks.workunit.client.1.vm05.stdout:3/523: fdatasync d0/fd 0 2026-03-09T16:15:19.436 INFO:tasks.workunit.client.1.vm05.stdout:6/584: dread d17/f60 [4194304,4194304] 0 2026-03-09T16:15:19.437 INFO:tasks.workunit.client.1.vm05.stdout:6/585: write d17/f31 [2170127,12837] 0 2026-03-09T16:15:19.437 INFO:tasks.workunit.client.1.vm05.stdout:3/524: symlink d0/da9/laa 0 2026-03-09T16:15:19.438 
INFO:tasks.workunit.client.1.vm05.stdout:4/660: rename d5/de/d15/da9/db1/dad/d37/d60/dbf/d8c to d5/de/d15/d21/d27/d3c/d5c/da2/dc9/deb 0 2026-03-09T16:15:19.439 INFO:tasks.workunit.client.1.vm05.stdout:4/661: stat d5/d9c/dbd/cc4 0 2026-03-09T16:15:19.439 INFO:tasks.workunit.client.1.vm05.stdout:6/586: truncate d17/d22/d27/d34/d4b/d7f/fc8 561306 0 2026-03-09T16:15:19.439 INFO:tasks.workunit.client.1.vm05.stdout:4/662: write d5/de/d15/da9/db1/dad/f1f [4892847,65053] 0 2026-03-09T16:15:19.441 INFO:tasks.workunit.client.1.vm05.stdout:6/587: write d17/f18 [2598214,82480] 0 2026-03-09T16:15:19.441 INFO:tasks.workunit.client.1.vm05.stdout:1/693: rmdir d7/d15/d45/dee 39 2026-03-09T16:15:19.443 INFO:tasks.workunit.client.1.vm05.stdout:6/588: read d17/d22/d27/d58/f97 [185084,6775] 0 2026-03-09T16:15:19.445 INFO:tasks.workunit.client.1.vm05.stdout:6/589: chown d17/d22/d27/d34/d4b/d7f/lc6 1043163 1 2026-03-09T16:15:19.450 INFO:tasks.workunit.client.1.vm05.stdout:1/694: mknod d7/dbe/dca/cf2 0 2026-03-09T16:15:19.451 INFO:tasks.workunit.client.1.vm05.stdout:3/525: getdents d0/d9/d22/d6b 0 2026-03-09T16:15:19.451 INFO:tasks.workunit.client.1.vm05.stdout:1/695: fsync d7/dd/d21/d39/fa4 0 2026-03-09T16:15:19.451 INFO:tasks.workunit.client.1.vm05.stdout:3/526: chown d0/f7c 5660615 1 2026-03-09T16:15:19.452 INFO:tasks.workunit.client.1.vm05.stdout:4/663: mkdir d5/de/d82/dc1/dec 0 2026-03-09T16:15:19.454 INFO:tasks.workunit.client.1.vm05.stdout:6/590: creat d17/d22/dce/fdf x:0 0 0 2026-03-09T16:15:19.455 INFO:tasks.workunit.client.1.vm05.stdout:3/527: chown d0/d33/l11 11 1 2026-03-09T16:15:19.456 INFO:tasks.workunit.client.1.vm05.stdout:4/664: rename d5/de/d15/da9/db1/dad/d90 to d5/de/d15/da9/db1/dad/d90/dd8/ded 22 2026-03-09T16:15:19.459 INFO:tasks.workunit.client.1.vm05.stdout:8/635: write d4/d6/d3a/d3c/f3f [1291148,79961] 0 2026-03-09T16:15:19.461 INFO:tasks.workunit.client.1.vm05.stdout:8/636: read d4/d6/f58 [3614361,72446] 0 2026-03-09T16:15:19.463 INFO:tasks.workunit.client.1.vm05.stdout:4/665: link d5/de/d15/da9/db1/dad/f1f d5/de/d15/d21/d27/d3c/d5c/da2/fee 0 2026-03-09T16:15:19.463 INFO:tasks.workunit.client.1.vm05.stdout:6/591: getdents d17/d22/d9d/db4 0 2026-03-09T16:15:19.464 INFO:tasks.workunit.client.1.vm05.stdout:8/637: truncate d4/d6/db/d59/f60 30256 0 2026-03-09T16:15:19.468 INFO:tasks.workunit.client.1.vm05.stdout:4/666: mknod d5/de/d15/d21/d39/d91/de9/cef 0 2026-03-09T16:15:19.469 INFO:tasks.workunit.client.1.vm05.stdout:7/661: truncate d1/d2/d8/d31/f51 281187 0 2026-03-09T16:15:19.470 INFO:tasks.workunit.client.1.vm05.stdout:6/592: unlink d17/c45 0 2026-03-09T16:15:19.471 INFO:tasks.workunit.client.1.vm05.stdout:6/593: write d17/f1b [3325351,74243] 0 2026-03-09T16:15:19.471 INFO:tasks.workunit.client.1.vm05.stdout:8/638: mkdir d4/d6/db/d59/db0/dd6 0 2026-03-09T16:15:19.476 INFO:tasks.workunit.client.1.vm05.stdout:6/594: rmdir d17/d22/d27/d34/d42 39 2026-03-09T16:15:19.477 INFO:tasks.workunit.client.1.vm05.stdout:4/667: rename d5/f45 to d5/de/d2f/d8a/ff0 0 2026-03-09T16:15:19.479 INFO:tasks.workunit.client.1.vm05.stdout:9/631: dwrite d4/d10/d35/d36/d48/f6e [0,4194304] 0 2026-03-09T16:15:19.480 INFO:tasks.workunit.client.1.vm05.stdout:6/595: fdatasync d17/d22/f79 0 2026-03-09T16:15:19.486 INFO:tasks.workunit.client.1.vm05.stdout:2/574: dwrite db/f2d [0,4194304] 0 2026-03-09T16:15:19.488 INFO:tasks.workunit.client.1.vm05.stdout:5/624: truncate d8/f11 813838 0 2026-03-09T16:15:19.489 INFO:tasks.workunit.client.1.vm05.stdout:9/632: mkdir d4/d10/d35/d2b/d38/d65/dd6 0 2026-03-09T16:15:19.489 
INFO:tasks.workunit.client.1.vm05.stdout:2/575: creat db/dd/d15/d1f/d21/d87/fbf x:0 0 0 2026-03-09T16:15:19.491 INFO:tasks.workunit.client.1.vm05.stdout:5/625: creat d8/d18/d1b/d47/d4e/fe1 x:0 0 0 2026-03-09T16:15:19.495 INFO:tasks.workunit.client.1.vm05.stdout:7/662: rename d1/d2/d8/d31/f39 to d1/d2/d8/dc/d1b/d30/d4b/fe7 0 2026-03-09T16:15:19.501 INFO:tasks.workunit.client.1.vm05.stdout:5/626: mkdir d8/d59/d5b/d8b/da0/de2 0 2026-03-09T16:15:19.501 INFO:tasks.workunit.client.1.vm05.stdout:5/627: read - d8/d53/d7a/f84 zero size 2026-03-09T16:15:19.503 INFO:tasks.workunit.client.1.vm05.stdout:7/663: getdents d1/d2/d8/dc/d1b/d71 0 2026-03-09T16:15:19.504 INFO:tasks.workunit.client.1.vm05.stdout:5/628: symlink d8/d18/d1b/d6b/le3 0 2026-03-09T16:15:19.507 INFO:tasks.workunit.client.1.vm05.stdout:7/664: link d1/d2/d8/dc/d1b/d30/d4b/fdf d1/d2/d8/dc/d1b/d30/d4b/d65/db1/fe8 0 2026-03-09T16:15:19.509 INFO:tasks.workunit.client.1.vm05.stdout:7/665: mkdir d1/d2/d8/dc/d1b/d30/d4b/db2/de9 0 2026-03-09T16:15:19.510 INFO:tasks.workunit.client.1.vm05.stdout:7/666: symlink d1/d2/d11/lea 0 2026-03-09T16:15:19.512 INFO:tasks.workunit.client.1.vm05.stdout:7/667: rmdir d1/d2/d8/dc/d1b/d30/d4b 39 2026-03-09T16:15:19.515 INFO:tasks.workunit.client.1.vm05.stdout:7/668: link d1/d2/d8/d67/d76/fc3 d1/d2/d8/dc/d1b/d71/d3c/feb 0 2026-03-09T16:15:19.519 INFO:tasks.workunit.client.1.vm05.stdout:7/669: dwrite d1/d2/d8/dc/d1b/d30/d5e/f81 [0,4194304] 0 2026-03-09T16:15:19.521 INFO:tasks.workunit.client.1.vm05.stdout:7/670: readlink d1/d2/d8/lb 0 2026-03-09T16:15:19.526 INFO:tasks.workunit.client.1.vm05.stdout:0/593: write d5/db/d5b/f35 [1974855,2328] 0 2026-03-09T16:15:19.528 INFO:tasks.workunit.client.1.vm05.stdout:0/594: rmdir d5/d9e 39 2026-03-09T16:15:19.529 INFO:tasks.workunit.client.1.vm05.stdout:0/595: creat d5/d1b/fce x:0 0 0 2026-03-09T16:15:19.530 INFO:tasks.workunit.client.1.vm05.stdout:0/596: symlink d5/db/lcf 0 2026-03-09T16:15:19.546 INFO:tasks.workunit.client.1.vm05.stdout:0/597: dread d5/db/d5f/f7b [0,4194304] 0 2026-03-09T16:15:19.546 INFO:tasks.workunit.client.1.vm05.stdout:0/598: chown d5/db/d5f 29479 1 2026-03-09T16:15:19.546 INFO:tasks.workunit.client.1.vm05.stdout:0/599: chown d5/db/d48/d66/l67 0 1 2026-03-09T16:15:19.553 INFO:tasks.workunit.client.1.vm05.stdout:6/596: truncate d17/d1d/f1e 5433801 0 2026-03-09T16:15:19.553 INFO:tasks.workunit.client.1.vm05.stdout:0/600: rename d5/f7 to d5/d11/d4f/d70/fd0 0 2026-03-09T16:15:19.554 INFO:tasks.workunit.client.1.vm05.stdout:3/528: dwrite d0/d9/f2f [0,4194304] 0 2026-03-09T16:15:19.558 INFO:tasks.workunit.client.1.vm05.stdout:2/576: truncate db/dd/d15/d46/d67/f73 3641131 0 2026-03-09T16:15:19.560 INFO:tasks.workunit.client.1.vm05.stdout:5/629: write d8/f6f [996537,110610] 0 2026-03-09T16:15:19.568 INFO:tasks.workunit.client.1.vm05.stdout:1/696: dwrite d7/d62/f90 [4194304,4194304] 0 2026-03-09T16:15:19.568 INFO:tasks.workunit.client.1.vm05.stdout:5/630: dread - d8/d18/dbc/fc3 zero size 2026-03-09T16:15:19.570 INFO:tasks.workunit.client.1.vm05.stdout:4/668: dwrite d5/de/f23 [0,4194304] 0 2026-03-09T16:15:19.570 INFO:tasks.workunit.client.1.vm05.stdout:1/697: chown d7/dd/d21/d39/d5a/f41 59 1 2026-03-09T16:15:19.570 INFO:tasks.workunit.client.1.vm05.stdout:0/601: dread d5/d1b/d3b/f3c [0,4194304] 0 2026-03-09T16:15:19.571 INFO:tasks.workunit.client.1.vm05.stdout:0/602: dread - d5/d11/f90 zero size 2026-03-09T16:15:19.573 INFO:tasks.workunit.client.1.vm05.stdout:8/639: dwrite d4/d6/d3a/d40/d71/fc6 [0,4194304] 0 2026-03-09T16:15:19.574 
INFO:tasks.workunit.client.1.vm05.stdout:1/698: readlink d7/dd/d21/d39/d48/lac 0 2026-03-09T16:15:19.576 INFO:tasks.workunit.client.1.vm05.stdout:9/633: dwrite d4/d10/d35/d36/f49 [0,4194304] 0 2026-03-09T16:15:19.577 INFO:tasks.workunit.client.1.vm05.stdout:6/597: unlink d17/d22/d27/d34/d42/d65/fa6 0 2026-03-09T16:15:19.577 INFO:tasks.workunit.client.1.vm05.stdout:3/529: creat d0/d9/d22/d6b/fab x:0 0 0 2026-03-09T16:15:19.577 INFO:tasks.workunit.client.1.vm05.stdout:5/631: write d8/d18/dbc/dcc/daa/f9d [3180408,30607] 0 2026-03-09T16:15:19.580 INFO:tasks.workunit.client.1.vm05.stdout:6/598: write d17/f4e [584391,69864] 0 2026-03-09T16:15:19.586 INFO:tasks.workunit.client.1.vm05.stdout:0/603: dwrite d5/d2c/d49/f5d [0,4194304] 0 2026-03-09T16:15:19.588 INFO:tasks.workunit.client.1.vm05.stdout:8/640: rename d4/d6/db/df/d80/d86 to d4/d6/db/dc/d5d/da0/dd7 0 2026-03-09T16:15:19.588 INFO:tasks.workunit.client.1.vm05.stdout:1/699: mkdir d7/dbe/ded/df3 0 2026-03-09T16:15:19.592 INFO:tasks.workunit.client.1.vm05.stdout:8/641: dread d4/d6/d53/f7f [0,4194304] 0 2026-03-09T16:15:19.592 INFO:tasks.workunit.client.1.vm05.stdout:3/530: mkdir d0/d9/d97/dac 0 2026-03-09T16:15:19.594 INFO:tasks.workunit.client.1.vm05.stdout:4/669: creat d5/de/d15/d21/d27/d3c/d5c/ff1 x:0 0 0 2026-03-09T16:15:19.595 INFO:tasks.workunit.client.1.vm05.stdout:8/642: write d4/d6/db/d9b/fbe [599880,10657] 0 2026-03-09T16:15:19.595 INFO:tasks.workunit.client.1.vm05.stdout:2/577: getdents db/dd/d15/d4c/d56 0 2026-03-09T16:15:19.596 INFO:tasks.workunit.client.1.vm05.stdout:3/531: rmdir d0 39 2026-03-09T16:15:19.597 INFO:tasks.workunit.client.1.vm05.stdout:1/700: unlink d7/dd/de/c7c 0 2026-03-09T16:15:19.599 INFO:tasks.workunit.client.1.vm05.stdout:9/634: getdents d4/d10/d35/d36/d48/d60/dae 0 2026-03-09T16:15:19.599 INFO:tasks.workunit.client.1.vm05.stdout:4/670: symlink d5/de/d15/da9/lf2 0 2026-03-09T16:15:19.600 INFO:tasks.workunit.client.1.vm05.stdout:8/643: fdatasync d4/d6/d3a/d15/f63 0 2026-03-09T16:15:19.600 INFO:tasks.workunit.client.1.vm05.stdout:1/701: chown d7/dd/d21/d39 647 1 2026-03-09T16:15:19.600 INFO:tasks.workunit.client.1.vm05.stdout:2/578: symlink db/dd/d15/d1f/d20/d23/lc0 0 2026-03-09T16:15:19.600 INFO:tasks.workunit.client.1.vm05.stdout:3/532: fdatasync d0/f45 0 2026-03-09T16:15:19.602 INFO:tasks.workunit.client.1.vm05.stdout:2/579: truncate db/dd/d15/d1f/f49 11750050 0 2026-03-09T16:15:19.603 INFO:tasks.workunit.client.1.vm05.stdout:8/644: mkdir d4/d6/db/dc/d5d/da0/dd7/dd8 0 2026-03-09T16:15:19.603 INFO:tasks.workunit.client.1.vm05.stdout:3/533: mkdir d0/d9/d97/dad 0 2026-03-09T16:15:19.604 INFO:tasks.workunit.client.1.vm05.stdout:2/580: mkdir db/dd/d15/d1f/dc1 0 2026-03-09T16:15:19.605 INFO:tasks.workunit.client.1.vm05.stdout:1/702: write d7/dd/d21/d44/dcc/fcd [632263,69689] 0 2026-03-09T16:15:19.606 INFO:tasks.workunit.client.1.vm05.stdout:2/581: write db/dd/d15/d3f/d5b/d60/d95/f76 [5175302,46353] 0 2026-03-09T16:15:19.607 INFO:tasks.workunit.client.1.vm05.stdout:3/534: rmdir d0/d33 39 2026-03-09T16:15:19.608 INFO:tasks.workunit.client.1.vm05.stdout:8/645: fdatasync d4/d6/f44 0 2026-03-09T16:15:19.612 INFO:tasks.workunit.client.1.vm05.stdout:8/646: mkdir d4/d6/d3a/d15/dd9 0 2026-03-09T16:15:19.612 INFO:tasks.workunit.client.1.vm05.stdout:3/535: dread d0/d9/d22/d6b/fa0 [0,4194304] 0 2026-03-09T16:15:19.613 INFO:tasks.workunit.client.1.vm05.stdout:8/647: fsync d4/d6/db/d59/fcb 0 2026-03-09T16:15:19.613 INFO:tasks.workunit.client.1.vm05.stdout:1/703: creat d7/ff4 x:0 0 0 2026-03-09T16:15:19.623 
INFO:tasks.workunit.client.1.vm05.stdout:2/582: dwrite db/dd/d15/d3f/d5b/d60/d6a/f8a [0,4194304] 0 2026-03-09T16:15:19.624 INFO:tasks.workunit.client.1.vm05.stdout:3/536: mkdir d0/d9/d22/d5f/d90/dae 0 2026-03-09T16:15:19.626 INFO:tasks.workunit.client.1.vm05.stdout:8/648: dwrite d4/d6/db/dc/f30 [4194304,4194304] 0 2026-03-09T16:15:19.626 INFO:tasks.workunit.client.1.vm05.stdout:8/649: readlink d4/d6/db/df/d4f/lc4 0 2026-03-09T16:15:19.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:19 vm03.local ceph-mon[51019]: pgmap v19: 65 pgs: 65 active+clean; 2.8 GiB data, 9.3 GiB used, 111 GiB / 120 GiB avail; 46 MiB/s rd, 189 MiB/s wr, 375 op/s 2026-03-09T16:15:19.642 INFO:tasks.workunit.client.1.vm05.stdout:8/650: dwrite d4/d6/d3a/f88 [0,4194304] 0 2026-03-09T16:15:19.643 INFO:tasks.workunit.client.1.vm05.stdout:3/537: mknod d0/d9/d97/dac/caf 0 2026-03-09T16:15:19.645 INFO:tasks.workunit.client.1.vm05.stdout:8/651: dread - d4/d6/db/df/d4f/d9f/fbb zero size 2026-03-09T16:15:19.649 INFO:tasks.workunit.client.1.vm05.stdout:7/671: truncate d1/d2/d8/dc/d9c/f6b 1263685 0 2026-03-09T16:15:19.655 INFO:tasks.workunit.client.1.vm05.stdout:8/652: dread d4/d6/d3a/d3c/f8d [0,4194304] 0 2026-03-09T16:15:19.656 INFO:tasks.workunit.client.1.vm05.stdout:7/672: mknod d1/d2/d8/dc/d9c/cec 0 2026-03-09T16:15:19.657 INFO:tasks.workunit.client.1.vm05.stdout:7/673: readlink d1/d2/d8/dc/dd4/l70 0 2026-03-09T16:15:19.659 INFO:tasks.workunit.client.1.vm05.stdout:3/538: dwrite d0/d9/d22/d5f/d75/d76/fa5 [0,4194304] 0 2026-03-09T16:15:19.659 INFO:tasks.workunit.client.1.vm05.stdout:7/674: dread - d1/d2/d8/dc/d1b/d71/d3c/fdd zero size 2026-03-09T16:15:19.661 INFO:tasks.workunit.client.1.vm05.stdout:8/653: mkdir d4/d6/db/dc/d2e/d85/dc7/dda 0 2026-03-09T16:15:19.665 INFO:tasks.workunit.client.1.vm05.stdout:7/675: chown d1/d2/d8/dc/d1b/d71/d3c/ld2 4282 1 2026-03-09T16:15:19.668 INFO:tasks.workunit.client.1.vm05.stdout:3/539: creat d0/d9/d8b/fb0 x:0 0 0 2026-03-09T16:15:19.668 INFO:tasks.workunit.client.1.vm05.stdout:8/654: write d4/d6/db/f5e [21974,37587] 0 2026-03-09T16:15:19.668 INFO:tasks.workunit.client.1.vm05.stdout:5/632: sync 2026-03-09T16:15:19.669 INFO:tasks.workunit.client.1.vm05.stdout:7/676: unlink d1/d2/d8/dc/d1b/d30/d4b/d65/db1/fb3 0 2026-03-09T16:15:19.670 INFO:tasks.workunit.client.1.vm05.stdout:5/633: readlink d8/d18/dbc/dcc/daa/l4a 0 2026-03-09T16:15:19.679 INFO:tasks.workunit.client.1.vm05.stdout:3/540: dread d0/d33/f92 [0,4194304] 0 2026-03-09T16:15:19.679 INFO:tasks.workunit.client.1.vm05.stdout:8/655: creat d4/d6/db/d59/fdb x:0 0 0 2026-03-09T16:15:19.679 INFO:tasks.workunit.client.1.vm05.stdout:8/656: chown d4/d6/d3a/d15/f63 21829326 1 2026-03-09T16:15:19.679 INFO:tasks.workunit.client.1.vm05.stdout:7/677: mkdir d1/d2/d8/d31/d8d/ded 0 2026-03-09T16:15:19.679 INFO:tasks.workunit.client.1.vm05.stdout:8/657: creat d4/d6/db/df/fdc x:0 0 0 2026-03-09T16:15:19.679 INFO:tasks.workunit.client.1.vm05.stdout:3/541: creat d0/d9/d22/d5f/d90/dae/fb1 x:0 0 0 2026-03-09T16:15:19.679 INFO:tasks.workunit.client.1.vm05.stdout:7/678: symlink d1/lee 0 2026-03-09T16:15:19.679 INFO:tasks.workunit.client.1.vm05.stdout:3/542: link d0/d9/l82 d0/d9/d22/lb2 0 2026-03-09T16:15:19.679 INFO:tasks.workunit.client.1.vm05.stdout:8/658: dread d4/d6/f5f [0,4194304] 0 2026-03-09T16:15:19.679 INFO:tasks.workunit.client.1.vm05.stdout:3/543: unlink d0/c43 0 2026-03-09T16:15:19.683 INFO:tasks.workunit.client.1.vm05.stdout:1/704: sync 2026-03-09T16:15:19.683 INFO:tasks.workunit.client.1.vm05.stdout:0/604: sync 2026-03-09T16:15:19.684 
INFO:tasks.workunit.client.1.vm05.stdout:8/659: chown d4/d6/d3a/c2c 110 1 2026-03-09T16:15:19.684 INFO:tasks.workunit.client.1.vm05.stdout:3/544: mkdir d0/d9/d22/d4c/d4e/db3 0 2026-03-09T16:15:19.686 INFO:tasks.workunit.client.1.vm05.stdout:1/705: creat d7/dd/d21/d3b/ff5 x:0 0 0 2026-03-09T16:15:19.689 INFO:tasks.workunit.client.1.vm05.stdout:3/545: mknod d0/d9/cb4 0 2026-03-09T16:15:19.695 INFO:tasks.workunit.client.1.vm05.stdout:3/546: link d0/d9/f2b d0/da9/fb5 0 2026-03-09T16:15:19.695 INFO:tasks.workunit.client.1.vm05.stdout:0/605: dwrite d5/d11/d4f/d70/fbf [0,4194304] 0 2026-03-09T16:15:19.697 INFO:tasks.workunit.client.1.vm05.stdout:0/606: mknod d5/db/d5b/d82/cd1 0 2026-03-09T16:15:19.698 INFO:tasks.workunit.client.1.vm05.stdout:0/607: mkdir d5/d2c/dd2 0 2026-03-09T16:15:19.714 INFO:tasks.workunit.client.1.vm05.stdout:0/608: dwrite d5/d1b/d3b/f92 [0,4194304] 0 2026-03-09T16:15:19.715 INFO:tasks.workunit.client.1.vm05.stdout:3/547: sync 2026-03-09T16:15:19.716 INFO:tasks.workunit.client.1.vm05.stdout:0/609: symlink d5/db/d5b/da5/ld3 0 2026-03-09T16:15:19.716 INFO:tasks.workunit.client.1.vm05.stdout:0/610: chown d5/d1b/l62 323 1 2026-03-09T16:15:19.718 INFO:tasks.workunit.client.1.vm05.stdout:3/548: rename d0/d9/d22/d4c/l53 to d0/d9/d97/dac/lb6 0 2026-03-09T16:15:19.718 INFO:tasks.workunit.client.1.vm05.stdout:0/611: creat d5/fd4 x:0 0 0 2026-03-09T16:15:19.721 INFO:tasks.workunit.client.1.vm05.stdout:3/549: rename d0/d9/d22/f2a to d0/d9/d22/d5f/d7b/fb7 0 2026-03-09T16:15:19.721 INFO:tasks.workunit.client.1.vm05.stdout:0/612: mkdir d5/d2c/d49/d83/d8b/dd5 0 2026-03-09T16:15:19.723 INFO:tasks.workunit.client.1.vm05.stdout:0/613: rename d5/d1b/d30/fb8 to d5/db/d5f/fd6 0 2026-03-09T16:15:19.727 INFO:tasks.workunit.client.1.vm05.stdout:0/614: dwrite d5/d2c/d49/f5d [4194304,4194304] 0 2026-03-09T16:15:19.730 INFO:tasks.workunit.client.1.vm05.stdout:0/615: chown d5/db/d48/d66 3493 1 2026-03-09T16:15:19.743 INFO:tasks.workunit.client.1.vm05.stdout:0/616: dread d5/d2c/f41 [0,4194304] 0 2026-03-09T16:15:19.745 INFO:tasks.workunit.client.1.vm05.stdout:6/599: dwrite d17/d5d/f71 [8388608,4194304] 0 2026-03-09T16:15:19.750 INFO:tasks.workunit.client.1.vm05.stdout:0/617: sync 2026-03-09T16:15:19.753 INFO:tasks.workunit.client.1.vm05.stdout:0/618: rename d5/db/f7c to d5/d11/d4f/d70/fd7 0 2026-03-09T16:15:19.755 INFO:tasks.workunit.client.1.vm05.stdout:0/619: symlink d5/db/d48/d66/ld8 0 2026-03-09T16:15:19.757 INFO:tasks.workunit.client.1.vm05.stdout:0/620: dread - d5/d11/d4f/d68/f6b zero size 2026-03-09T16:15:19.758 INFO:tasks.workunit.client.1.vm05.stdout:0/621: write d5/db/d48/d66/f72 [3142382,98408] 0 2026-03-09T16:15:19.762 INFO:tasks.workunit.client.1.vm05.stdout:0/622: symlink d5/d11/d4f/ld9 0 2026-03-09T16:15:19.764 INFO:tasks.workunit.client.1.vm05.stdout:0/623: mknod d5/d97/cda 0 2026-03-09T16:15:19.764 INFO:tasks.workunit.client.1.vm05.stdout:9/635: dwrite d4/d10/d35/d36/fce [0,4194304] 0 2026-03-09T16:15:19.772 INFO:tasks.workunit.client.1.vm05.stdout:9/636: dread d4/f61 [0,4194304] 0 2026-03-09T16:15:19.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:19 vm05.local ceph-mon[58702]: pgmap v19: 65 pgs: 65 active+clean; 2.8 GiB data, 9.3 GiB used, 111 GiB / 120 GiB avail; 46 MiB/s rd, 189 MiB/s wr, 375 op/s 2026-03-09T16:15:19.777 INFO:tasks.workunit.client.1.vm05.stdout:0/624: dwrite d5/db/d5b/da5/fca [0,4194304] 0 2026-03-09T16:15:19.782 INFO:tasks.workunit.client.1.vm05.stdout:0/625: symlink d5/d2c/dd2/ldb 0 2026-03-09T16:15:19.784 INFO:tasks.workunit.client.1.vm05.stdout:0/626: 
mkdir d5/d11/d4f/ddc 0 2026-03-09T16:15:19.789 INFO:tasks.workunit.client.1.vm05.stdout:0/627: link d5/d1b/d30/lb0 d5/ldd 0 2026-03-09T16:15:19.795 INFO:tasks.workunit.client.1.vm05.stdout:0/628: creat d5/d2c/fde x:0 0 0 2026-03-09T16:15:19.795 INFO:tasks.workunit.client.1.vm05.stdout:6/600: dread d17/d22/d27/d34/d4b/fa4 [0,4194304] 0 2026-03-09T16:15:19.795 INFO:tasks.workunit.client.1.vm05.stdout:6/601: write d17/d22/d27/d34/d4b/d7f/fc8 [1251414,75398] 0 2026-03-09T16:15:19.798 INFO:tasks.workunit.client.1.vm05.stdout:6/602: rmdir d17/d22/d27/d44/dd4 0 2026-03-09T16:15:19.799 INFO:tasks.workunit.client.1.vm05.stdout:6/603: mknod d17/d22/ce0 0 2026-03-09T16:15:19.800 INFO:tasks.workunit.client.1.vm05.stdout:6/604: write d17/f60 [7917510,57750] 0 2026-03-09T16:15:19.802 INFO:tasks.workunit.client.1.vm05.stdout:6/605: creat d17/d1d/fe1 x:0 0 0 2026-03-09T16:15:19.804 INFO:tasks.workunit.client.1.vm05.stdout:6/606: write d17/d5d/d73/d83/f9a [321531,83628] 0 2026-03-09T16:15:19.812 INFO:tasks.workunit.client.1.vm05.stdout:6/607: unlink d17/d5d/f8e 0 2026-03-09T16:15:19.812 INFO:tasks.workunit.client.1.vm05.stdout:6/608: dread - d17/d1d/fd3 zero size 2026-03-09T16:15:19.815 INFO:tasks.workunit.client.1.vm05.stdout:6/609: rename d17/d5d/f78 to d17/d22/d27/d34/dd1/fe2 0 2026-03-09T16:15:19.896 INFO:tasks.workunit.client.1.vm05.stdout:4/671: write d5/de/d15/d21/d27/d3c/fe8 [230412,71098] 0 2026-03-09T16:15:19.897 INFO:tasks.workunit.client.1.vm05.stdout:8/660: rmdir d4/d6/d3a 39 2026-03-09T16:15:19.904 INFO:tasks.workunit.client.1.vm05.stdout:4/672: dwrite d5/de/d15/da9/db1/dad/d90/dd8/fe2 [0,4194304] 0 2026-03-09T16:15:19.905 INFO:tasks.workunit.client.1.vm05.stdout:2/583: dread db/dd/d15/d1f/d20/d23/faf [4194304,4194304] 0 2026-03-09T16:15:19.909 INFO:tasks.workunit.client.1.vm05.stdout:2/584: write db/dd/d15/d3f/d5b/d60/d6a/fb5 [440999,57007] 0 2026-03-09T16:15:19.910 INFO:tasks.workunit.client.1.vm05.stdout:2/585: truncate db/dd/d15/d1f/d20/d23/d78/f92 346622 0 2026-03-09T16:15:19.916 INFO:tasks.workunit.client.1.vm05.stdout:2/586: symlink db/dd/d15/d46/lc2 0 2026-03-09T16:15:19.922 INFO:tasks.workunit.client.1.vm05.stdout:1/706: truncate d7/fb 4719934 0 2026-03-09T16:15:19.923 INFO:tasks.workunit.client.1.vm05.stdout:1/707: chown d7/dd/d21/f2b 1961 1 2026-03-09T16:15:19.926 INFO:tasks.workunit.client.1.vm05.stdout:5/634: dwrite d8/d18/dbc/dcc/f89 [0,4194304] 0 2026-03-09T16:15:19.935 INFO:tasks.workunit.client.1.vm05.stdout:2/587: sync 2026-03-09T16:15:19.937 INFO:tasks.workunit.client.1.vm05.stdout:5/635: mkdir d8/d18/d1b/d47/d48/d73/d80/de4 0 2026-03-09T16:15:19.937 INFO:tasks.workunit.client.1.vm05.stdout:2/588: mknod db/dd/d98/cc3 0 2026-03-09T16:15:19.938 INFO:tasks.workunit.client.1.vm05.stdout:2/589: dread - db/dd/fb9 zero size 2026-03-09T16:15:19.942 INFO:tasks.workunit.client.1.vm05.stdout:5/636: rename d8/d18/fb2 to d8/d18/d1b/d47/d48/d73/d80/fe5 0 2026-03-09T16:15:19.943 INFO:tasks.workunit.client.1.vm05.stdout:3/550: truncate d0/d9/f4d 1471174 0 2026-03-09T16:15:19.945 INFO:tasks.workunit.client.1.vm05.stdout:3/551: dread - d0/d9/d22/d5f/d75/d76/f7e zero size 2026-03-09T16:15:19.945 INFO:tasks.workunit.client.1.vm05.stdout:2/590: dwrite db/dd/f10 [0,4194304] 0 2026-03-09T16:15:19.946 INFO:tasks.workunit.client.1.vm05.stdout:5/637: dread - d8/d5e/fd6 zero size 2026-03-09T16:15:19.947 INFO:tasks.workunit.client.1.vm05.stdout:4/673: dread d5/de/d15/d21/d39/f46 [0,4194304] 0 2026-03-09T16:15:19.949 INFO:tasks.workunit.client.1.vm05.stdout:4/674: dread - d5/fce zero size 
2026-03-09T16:15:19.949 INFO:tasks.workunit.client.1.vm05.stdout:3/552: chown d0/d9/d22/d5f/d75/d76/d88/d89/la4 169 1 2026-03-09T16:15:19.954 INFO:tasks.workunit.client.1.vm05.stdout:4/675: fdatasync d5/de/d15/d21/d27/d3c/f3d 0 2026-03-09T16:15:19.955 INFO:tasks.workunit.client.1.vm05.stdout:3/553: mknod d0/d9/d22/d5f/d75/cb8 0 2026-03-09T16:15:19.958 INFO:tasks.workunit.client.1.vm05.stdout:4/676: truncate d5/de/d2f/d8a/fb0 753540 0 2026-03-09T16:15:19.959 INFO:tasks.workunit.client.1.vm05.stdout:3/554: mknod d0/d9/d97/dad/cb9 0 2026-03-09T16:15:19.962 INFO:tasks.workunit.client.1.vm05.stdout:5/638: dread d8/d18/dbc/dcc/f94 [0,4194304] 0 2026-03-09T16:15:19.965 INFO:tasks.workunit.client.1.vm05.stdout:3/555: rename d0/d9/d22/d4c/c52 to d0/d9/d22/d4c/d4e/db3/cba 0 2026-03-09T16:15:19.965 INFO:tasks.workunit.client.1.vm05.stdout:5/639: symlink d8/d18/d1b/d47/d48/d73/d80/le6 0 2026-03-09T16:15:19.967 INFO:tasks.workunit.client.1.vm05.stdout:5/640: rmdir d8/d18/dbc/dcc/daa 39 2026-03-09T16:15:19.968 INFO:tasks.workunit.client.1.vm05.stdout:5/641: chown d8/l79 87543 1 2026-03-09T16:15:19.968 INFO:tasks.workunit.client.1.vm05.stdout:5/642: fdatasync f5 0 2026-03-09T16:15:19.970 INFO:tasks.workunit.client.1.vm05.stdout:3/556: rename d0/d9/c26 to d0/d9/d22/cbb 0 2026-03-09T16:15:19.972 INFO:tasks.workunit.client.1.vm05.stdout:3/557: unlink d0/d9/l72 0 2026-03-09T16:15:19.974 INFO:tasks.workunit.client.1.vm05.stdout:3/558: fdatasync d0/da9/fb5 0 2026-03-09T16:15:19.975 INFO:tasks.workunit.client.1.vm05.stdout:3/559: fdatasync d0/d33/f29 0 2026-03-09T16:15:19.977 INFO:tasks.workunit.client.1.vm05.stdout:3/560: chown d0/l7 2 1 2026-03-09T16:15:19.983 INFO:tasks.workunit.client.1.vm05.stdout:5/643: dwrite d8/d18/dbc/dcc/daa/f35 [0,4194304] 0 2026-03-09T16:15:19.985 INFO:tasks.workunit.client.1.vm05.stdout:7/679: dread d1/d2/d8/dc/d1b/d30/d4b/d65/f8f [0,4194304] 0 2026-03-09T16:15:19.986 INFO:tasks.workunit.client.1.vm05.stdout:7/680: stat d1/d2/d8/dc/f1e 0 2026-03-09T16:15:19.986 INFO:tasks.workunit.client.1.vm05.stdout:6/610: truncate d17/f18 2809187 0 2026-03-09T16:15:19.986 INFO:tasks.workunit.client.1.vm05.stdout:0/629: write d5/db/f54 [751370,93030] 0 2026-03-09T16:15:19.988 INFO:tasks.workunit.client.1.vm05.stdout:8/661: write d4/f23 [690098,94300] 0 2026-03-09T16:15:19.988 INFO:tasks.workunit.client.1.vm05.stdout:9/637: dwrite d4/d10/d35/d2b/d38/f4b [0,4194304] 0 2026-03-09T16:15:19.990 INFO:tasks.workunit.client.1.vm05.stdout:0/630: chown d5/d97/cda 309229 1 2026-03-09T16:15:19.991 INFO:tasks.workunit.client.1.vm05.stdout:9/638: read - d4/d10/d35/d36/d48/d60/fad zero size 2026-03-09T16:15:19.995 INFO:tasks.workunit.client.1.vm05.stdout:0/631: chown d5/d1b/f61 126 1 2026-03-09T16:15:19.999 INFO:tasks.workunit.client.1.vm05.stdout:5/644: rename d8/d18/dbc/dcc/daa/d43/cb4 to d8/d18/d1b/d47/dda/ce7 0 2026-03-09T16:15:20.000 INFO:tasks.workunit.client.1.vm05.stdout:6/611: dwrite d17/d22/d27/d8a/f88 [0,4194304] 0 2026-03-09T16:15:20.005 INFO:tasks.workunit.client.1.vm05.stdout:0/632: write d5/d1b/d3b/f92 [1200242,23162] 0 2026-03-09T16:15:20.007 INFO:tasks.workunit.client.1.vm05.stdout:5/645: unlink d8/fad 0 2026-03-09T16:15:20.009 INFO:tasks.workunit.client.1.vm05.stdout:5/646: chown d8/d53/d7a/f84 25 1 2026-03-09T16:15:20.012 INFO:tasks.workunit.client.1.vm05.stdout:7/681: creat d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/ddb/fef x:0 0 0 2026-03-09T16:15:20.013 INFO:tasks.workunit.client.1.vm05.stdout:8/662: dwrite d4/d6/d3a/d15/f66 [4194304,4194304] 0 2026-03-09T16:15:20.014 
INFO:tasks.workunit.client.1.vm05.stdout:0/633: dread - d5/d2c/d49/d83/fc9 zero size 2026-03-09T16:15:20.014 INFO:tasks.workunit.client.1.vm05.stdout:6/612: sync 2026-03-09T16:15:20.014 INFO:tasks.workunit.client.1.vm05.stdout:9/639: write d4/d10/d35/d36/d48/d54/d59/f9f [968444,7604] 0 2026-03-09T16:15:20.015 INFO:tasks.workunit.client.1.vm05.stdout:5/647: mknod d8/d18/d1b/d47/ce8 0 2026-03-09T16:15:20.021 INFO:tasks.workunit.client.1.vm05.stdout:7/682: mknod d1/d2/d8/dc/d1b/d71/d3c/dd3/cf0 0 2026-03-09T16:15:20.026 INFO:tasks.workunit.client.1.vm05.stdout:0/634: mknod d5/db/d5b/d82/cdf 0 2026-03-09T16:15:20.029 INFO:tasks.workunit.client.1.vm05.stdout:5/648: symlink d8/d18/dbc/dcc/daa/le9 0 2026-03-09T16:15:20.029 INFO:tasks.workunit.client.1.vm05.stdout:8/663: symlink d4/d6/db/dc/d5d/da0/dd7/dd8/ldd 0 2026-03-09T16:15:20.029 INFO:tasks.workunit.client.1.vm05.stdout:9/640: rmdir d4/d10/d35/d2b/dba 0 2026-03-09T16:15:20.029 INFO:tasks.workunit.client.1.vm05.stdout:9/641: stat d4/d10/d35/d36/d48/d4c/f93 0 2026-03-09T16:15:20.029 INFO:tasks.workunit.client.1.vm05.stdout:9/642: mkdir d4/d10/dd7 0 2026-03-09T16:15:20.030 INFO:tasks.workunit.client.1.vm05.stdout:6/613: creat d17/d5d/d73/fe3 x:0 0 0 2026-03-09T16:15:20.031 INFO:tasks.workunit.client.1.vm05.stdout:6/614: chown d17/d22/d27/d58/db8 6354 1 2026-03-09T16:15:20.032 INFO:tasks.workunit.client.1.vm05.stdout:5/649: write d8/d18/d1b/d47/d48/d73/d80/fe5 [114889,74997] 0 2026-03-09T16:15:20.032 INFO:tasks.workunit.client.1.vm05.stdout:9/643: symlink d4/ld8 0 2026-03-09T16:15:20.038 INFO:tasks.workunit.client.1.vm05.stdout:9/644: creat d4/d10/d35/d36/d48/d54/fd9 x:0 0 0 2026-03-09T16:15:20.039 INFO:tasks.workunit.client.1.vm05.stdout:1/708: truncate d7/f34 848367 0 2026-03-09T16:15:20.039 INFO:tasks.workunit.client.1.vm05.stdout:5/650: mknod d8/d18/d1b/d6b/cea 0 2026-03-09T16:15:20.046 INFO:tasks.workunit.client.1.vm05.stdout:1/709: write d7/dd/d21/d39/d48/d5d/f98 [170048,79097] 0 2026-03-09T16:15:20.047 INFO:tasks.workunit.client.1.vm05.stdout:0/635: dread d5/db/d48/d66/f72 [0,4194304] 0 2026-03-09T16:15:20.049 INFO:tasks.workunit.client.1.vm05.stdout:6/615: creat d17/d22/fe4 x:0 0 0 2026-03-09T16:15:20.050 INFO:tasks.workunit.client.1.vm05.stdout:0/636: mkdir d5/db/d5b/d82/de0 0 2026-03-09T16:15:20.050 INFO:tasks.workunit.client.1.vm05.stdout:0/637: dread - d5/d2c/f63 zero size 2026-03-09T16:15:20.055 INFO:tasks.workunit.client.1.vm05.stdout:9/645: creat d4/d10/d35/d2b/d38/fda x:0 0 0 2026-03-09T16:15:20.057 INFO:tasks.workunit.client.1.vm05.stdout:9/646: write d4/d10/d35/fc3 [316043,38493] 0 2026-03-09T16:15:20.059 INFO:tasks.workunit.client.1.vm05.stdout:1/710: rename d7/dd/d21/d3b to d7/dd/de/d52/df6 0 2026-03-09T16:15:20.061 INFO:tasks.workunit.client.1.vm05.stdout:9/647: rename d4/d10/d35/d2b/d31/f99 to d4/d10/d35/fdb 0 2026-03-09T16:15:20.062 INFO:tasks.workunit.client.1.vm05.stdout:4/677: write d5/de/d15/d21/d27/f29 [517887,107132] 0 2026-03-09T16:15:20.065 INFO:tasks.workunit.client.1.vm05.stdout:2/591: dwrite db/dd/d15/d46/d67/f77 [0,4194304] 0 2026-03-09T16:15:20.071 INFO:tasks.workunit.client.1.vm05.stdout:4/678: chown d5/de/f9d 3121 1 2026-03-09T16:15:20.081 INFO:tasks.workunit.client.1.vm05.stdout:1/711: dwrite d7/dd/d21/d39/d5a/f41 [0,4194304] 0 2026-03-09T16:15:20.085 INFO:tasks.workunit.client.1.vm05.stdout:4/679: creat d5/de/d15/da9/db1/dad/d90/dbb/ff3 x:0 0 0 2026-03-09T16:15:20.088 INFO:tasks.workunit.client.1.vm05.stdout:4/680: chown d5/de/d15/da9/fc6 210006 1 2026-03-09T16:15:20.088 
INFO:tasks.workunit.client.1.vm05.stdout:9/648: getdents d4/d10/d35/d36/d48/d54/d59 0 2026-03-09T16:15:20.088 INFO:tasks.workunit.client.1.vm05.stdout:9/649: stat d4/d10/faa 0 2026-03-09T16:15:20.089 INFO:tasks.workunit.client.1.vm05.stdout:1/712: dread - d7/dd/de/d52/df6/d55/fb1 zero size 2026-03-09T16:15:20.089 INFO:tasks.workunit.client.1.vm05.stdout:9/650: read d4/d10/d35/d2b/d38/f78 [2015101,52748] 0 2026-03-09T16:15:20.090 INFO:tasks.workunit.client.1.vm05.stdout:9/651: dread - d4/d10/d35/d36/d48/d60/dae/fc9 zero size 2026-03-09T16:15:20.096 INFO:tasks.workunit.client.1.vm05.stdout:9/652: mkdir d4/d10/d35/d36/d48/d60/ddc 0 2026-03-09T16:15:20.096 INFO:tasks.workunit.client.1.vm05.stdout:1/713: mknod d7/dd/d21/d39/d48/d8c/dd8/cf7 0 2026-03-09T16:15:20.098 INFO:tasks.workunit.client.1.vm05.stdout:4/681: dwrite d5/de/d15/d21/d27/f8f [0,4194304] 0 2026-03-09T16:15:20.105 INFO:tasks.workunit.client.1.vm05.stdout:4/682: dread - d5/de/d15/da9/db1/dad/d90/dbb/fd1 zero size 2026-03-09T16:15:20.108 INFO:tasks.workunit.client.1.vm05.stdout:3/561: getdents d0/d9/d22 0 2026-03-09T16:15:20.115 INFO:tasks.workunit.client.1.vm05.stdout:3/562: rmdir d0/d9/d22/d4c 39 2026-03-09T16:15:20.115 INFO:tasks.workunit.client.1.vm05.stdout:4/683: creat d5/ff4 x:0 0 0 2026-03-09T16:15:20.119 INFO:tasks.workunit.client.1.vm05.stdout:3/563: dwrite d0/f60 [4194304,4194304] 0 2026-03-09T16:15:20.121 INFO:tasks.workunit.client.1.vm05.stdout:4/684: fdatasync d5/de/d2f/f78 0 2026-03-09T16:15:20.126 INFO:tasks.workunit.client.1.vm05.stdout:4/685: creat d5/de/d15/da9/db1/dad/d90/dbb/ff5 x:0 0 0 2026-03-09T16:15:20.126 INFO:tasks.workunit.client.1.vm05.stdout:3/564: mkdir d0/d9/d97/dbc 0 2026-03-09T16:15:20.128 INFO:tasks.workunit.client.1.vm05.stdout:4/686: rmdir d5/de/d82/dc1 39 2026-03-09T16:15:20.130 INFO:tasks.workunit.client.1.vm05.stdout:4/687: mkdir d5/de/d15/da9/df6 0 2026-03-09T16:15:20.138 INFO:tasks.workunit.client.1.vm05.stdout:4/688: dwrite d5/de/d15/fa3 [0,4194304] 0 2026-03-09T16:15:20.138 INFO:tasks.workunit.client.1.vm05.stdout:3/565: dwrite d0/f60 [4194304,4194304] 0 2026-03-09T16:15:20.140 INFO:tasks.workunit.client.1.vm05.stdout:4/689: chown d5/de/d15/d21/da0/de3 0 1 2026-03-09T16:15:20.148 INFO:tasks.workunit.client.1.vm05.stdout:4/690: creat d5/de/d82/ff7 x:0 0 0 2026-03-09T16:15:20.150 INFO:tasks.workunit.client.1.vm05.stdout:3/566: symlink d0/d9/d22/d4c/d4e/db3/lbd 0 2026-03-09T16:15:20.155 INFO:tasks.workunit.client.1.vm05.stdout:4/691: rename d5/de/d15/da9/db1/dad/c4c to d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d7b/cf8 0 2026-03-09T16:15:20.156 INFO:tasks.workunit.client.1.vm05.stdout:4/692: chown d5/de/d15/da9/db1/dad/d37/c47 3057 1 2026-03-09T16:15:20.156 INFO:tasks.workunit.client.1.vm05.stdout:4/693: chown d5 59 1 2026-03-09T16:15:20.160 INFO:tasks.workunit.client.1.vm05.stdout:4/694: symlink d5/lf9 0 2026-03-09T16:15:20.161 INFO:tasks.workunit.client.1.vm05.stdout:4/695: readlink d5/de/d15/d21/d27/d3c/l40 0 2026-03-09T16:15:20.162 INFO:tasks.workunit.client.1.vm05.stdout:4/696: readlink d5/de/d15/da9/db1/dad/la4 0 2026-03-09T16:15:20.169 INFO:tasks.workunit.client.1.vm05.stdout:4/697: symlink d5/de/d15/d21/d27/lfa 0 2026-03-09T16:15:20.171 INFO:tasks.workunit.client.1.vm05.stdout:4/698: mkdir d5/de/d15/da9/db1/dad/d37/dfb 0 2026-03-09T16:15:20.174 INFO:tasks.workunit.client.1.vm05.stdout:4/699: mkdir d5/de/d15/d21/d27/d3c/d5c/dfc 0 2026-03-09T16:15:20.176 INFO:tasks.workunit.client.1.vm05.stdout:4/700: link d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/f98 d5/ffd 0 2026-03-09T16:15:20.181 
INFO:tasks.workunit.client.1.vm05.stdout:4/701: dwrite d5/de/d15/da9/db1/dad/d90/dbb/fd1 [0,4194304] 0 2026-03-09T16:15:20.191 INFO:tasks.workunit.client.1.vm05.stdout:4/702: dwrite d5/de/d15/da9/db1/dad/d37/fa5 [0,4194304] 0 2026-03-09T16:15:20.193 INFO:tasks.workunit.client.1.vm05.stdout:4/703: write d5/de/f9d [3849741,117282] 0 2026-03-09T16:15:20.199 INFO:tasks.workunit.client.1.vm05.stdout:7/683: rmdir d1/d2/d8/dc/d1b/d71/d3c 39 2026-03-09T16:15:20.204 INFO:tasks.workunit.client.1.vm05.stdout:7/684: rename d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/ddb/fef to d1/d2/d8/d67/d76/ff1 0 2026-03-09T16:15:20.204 INFO:tasks.workunit.client.1.vm05.stdout:7/685: chown d1 0 1 2026-03-09T16:15:20.215 INFO:tasks.workunit.client.1.vm05.stdout:4/704: mkdir d5/de/d15/d21/dfe 0 2026-03-09T16:15:20.219 INFO:tasks.workunit.client.1.vm05.stdout:8/664: truncate d4/d6/d3a/d15/f66 1284303 0 2026-03-09T16:15:20.220 INFO:tasks.workunit.client.1.vm05.stdout:4/705: write d5/de/d15/d21/d39/f53 [382398,78813] 0 2026-03-09T16:15:20.221 INFO:tasks.workunit.client.1.vm05.stdout:7/686: symlink d1/d2/d11/d86/da2/db6/lf2 0 2026-03-09T16:15:20.231 INFO:tasks.workunit.client.1.vm05.stdout:8/665: rename d4/d6/db/da6/ld5 to d4/d6/d3a/d40/d71/lde 0 2026-03-09T16:15:20.236 INFO:tasks.workunit.client.1.vm05.stdout:7/687: mkdir d1/d2/d8/dc/d1b/d30/d4b/db2/df3 0 2026-03-09T16:15:20.237 INFO:tasks.workunit.client.1.vm05.stdout:8/666: mknod d4/d6/db/df/dd1/cdf 0 2026-03-09T16:15:20.238 INFO:tasks.workunit.client.1.vm05.stdout:7/688: creat d1/d2/d8/dc/dd4/ff4 x:0 0 0 2026-03-09T16:15:20.249 INFO:tasks.workunit.client.1.vm05.stdout:5/651: dwrite d8/f11 [0,4194304] 0 2026-03-09T16:15:20.249 INFO:tasks.workunit.client.1.vm05.stdout:8/667: dwrite d4/d6/d53/fb1 [0,4194304] 0 2026-03-09T16:15:20.250 INFO:tasks.workunit.client.1.vm05.stdout:4/706: link d5/de/d15/da9/db1/dad/d90/dbb/le7 d5/de/d15/d21/d39/lff 0 2026-03-09T16:15:20.253 INFO:tasks.workunit.client.1.vm05.stdout:2/592: truncate db/dd/f10 1473764 0 2026-03-09T16:15:20.257 INFO:tasks.workunit.client.1.vm05.stdout:2/593: chown db/dd/d15/d3f/d5b/d60/d6a/l72 610088844 1 2026-03-09T16:15:20.257 INFO:tasks.workunit.client.1.vm05.stdout:2/594: stat db/dd/d15/d3f/d5b/d60 0 2026-03-09T16:15:20.258 INFO:tasks.workunit.client.1.vm05.stdout:6/616: dwrite d17/fb7 [0,4194304] 0 2026-03-09T16:15:20.259 INFO:tasks.workunit.client.1.vm05.stdout:6/617: write d17/d22/d27/d34/d42/d65/f75 [3267066,62399] 0 2026-03-09T16:15:20.266 INFO:tasks.workunit.client.1.vm05.stdout:4/707: unlink d5/de/c77 0 2026-03-09T16:15:20.267 INFO:tasks.workunit.client.1.vm05.stdout:4/708: stat d5/de/d2f/d8a 0 2026-03-09T16:15:20.267 INFO:tasks.workunit.client.1.vm05.stdout:6/618: write d17/d22/fe4 [109916,32461] 0 2026-03-09T16:15:20.269 INFO:tasks.workunit.client.1.vm05.stdout:7/689: getdents d1/d2/d8/dc/d1b/d30/d4b 0 2026-03-09T16:15:20.271 INFO:tasks.workunit.client.1.vm05.stdout:2/595: dwrite db/dd/d15/d3f/d5b/d60/d95/f76 [4194304,4194304] 0 2026-03-09T16:15:20.276 INFO:tasks.workunit.client.1.vm05.stdout:6/619: symlink d17/d5d/le5 0 2026-03-09T16:15:20.281 INFO:tasks.workunit.client.1.vm05.stdout:2/596: dread - db/dd/d15/f6f zero size 2026-03-09T16:15:20.282 INFO:tasks.workunit.client.1.vm05.stdout:4/709: dwrite d5/de/d15/d21/d27/fc8 [0,4194304] 0 2026-03-09T16:15:20.286 INFO:tasks.workunit.client.1.vm05.stdout:2/597: truncate db/dd/d15/d1f/d20/d23/fbb 487950 0 2026-03-09T16:15:20.286 INFO:tasks.workunit.client.1.vm05.stdout:2/598: chown db/dd/d15/d4c/d56/c8e 2 1 2026-03-09T16:15:20.286 
INFO:tasks.workunit.client.1.vm05.stdout:6/620: creat d17/d5d/d73/d83/fe6 x:0 0 0 2026-03-09T16:15:20.289 INFO:tasks.workunit.client.1.vm05.stdout:6/621: fsync d17/d22/d9d/fb2 0 2026-03-09T16:15:20.293 INFO:tasks.workunit.client.1.vm05.stdout:6/622: mknod d17/d22/d9d/da5/ce7 0 2026-03-09T16:15:20.295 INFO:tasks.workunit.client.1.vm05.stdout:2/599: link db/dd/d15/d46/cbd db/dd/d15/d3f/d5b/d7e/cc4 0 2026-03-09T16:15:20.300 INFO:tasks.workunit.client.1.vm05.stdout:2/600: chown db/dd/d15/d3f/f4a 1729647970 1 2026-03-09T16:15:20.305 INFO:tasks.workunit.client.1.vm05.stdout:2/601: creat db/dd/d15/d3f/d5b/d60/da2/fc5 x:0 0 0 2026-03-09T16:15:20.305 INFO:tasks.workunit.client.1.vm05.stdout:2/602: fdatasync db/dd/fb8 0 2026-03-09T16:15:20.307 INFO:tasks.workunit.client.1.vm05.stdout:2/603: chown db/dd/d15/d1f/d20/d23/lc0 214 1 2026-03-09T16:15:20.312 INFO:tasks.workunit.client.1.vm05.stdout:9/653: dwrite d4/d10/d35/d36/d48/f9e [0,4194304] 0 2026-03-09T16:15:20.316 INFO:tasks.workunit.client.1.vm05.stdout:2/604: dwrite db/dd/d15/d1f/d20/d23/f9b [0,4194304] 0 2026-03-09T16:15:20.322 INFO:tasks.workunit.client.1.vm05.stdout:9/654: mkdir d4/d10/d35/d2b/d31/d96/ddd 0 2026-03-09T16:15:20.324 INFO:tasks.workunit.client.1.vm05.stdout:9/655: read - d4/d10/d35/d2b/d38/fda zero size 2026-03-09T16:15:20.327 INFO:tasks.workunit.client.1.vm05.stdout:2/605: rename db/dd/fb9 to db/fc6 0 2026-03-09T16:15:20.329 INFO:tasks.workunit.client.1.vm05.stdout:9/656: dwrite d4/d10/faa [0,4194304] 0 2026-03-09T16:15:20.332 INFO:tasks.workunit.client.1.vm05.stdout:9/657: chown d4/d10/d35/d2b/d38/d65/f6a 854 1 2026-03-09T16:15:20.336 INFO:tasks.workunit.client.1.vm05.stdout:9/658: rename d4/d10/l3e to d4/d10/d35/lde 0 2026-03-09T16:15:20.336 INFO:tasks.workunit.client.1.vm05.stdout:9/659: fdatasync d4/d10/d35/f32 0 2026-03-09T16:15:20.348 INFO:tasks.workunit.client.1.vm05.stdout:2/606: dread db/dd/d15/d4c/f58 [0,4194304] 0 2026-03-09T16:15:20.352 INFO:tasks.workunit.client.1.vm05.stdout:2/607: truncate db/dd/d15/d1f/d21/d87/fbf 458032 0 2026-03-09T16:15:20.352 INFO:tasks.workunit.client.1.vm05.stdout:2/608: fsync db/fa4 0 2026-03-09T16:15:20.356 INFO:tasks.workunit.client.1.vm05.stdout:9/660: dread d4/f5b [4194304,4194304] 0 2026-03-09T16:15:20.358 INFO:tasks.workunit.client.1.vm05.stdout:9/661: symlink d4/d10/d35/d2b/d31/d82/ldf 0 2026-03-09T16:15:20.360 INFO:tasks.workunit.client.1.vm05.stdout:9/662: symlink d4/d10/d35/d36/d48/d60/dcb/dd2/le0 0 2026-03-09T16:15:20.361 INFO:tasks.workunit.client.1.vm05.stdout:9/663: mknod d4/d10/dd7/ce1 0 2026-03-09T16:15:20.364 INFO:tasks.workunit.client.1.vm05.stdout:9/664: rename d4/d10/d35/c19 to d4/d10/d35/d2b/d31/dc8/ce2 0 2026-03-09T16:15:20.371 INFO:tasks.workunit.client.1.vm05.stdout:9/665: rename d4/d10/d35/d36/d48/d4c to d4/d10/d35/d2b/d38/d65/dd6/de3 0 2026-03-09T16:15:20.371 INFO:tasks.workunit.client.1.vm05.stdout:9/666: chown d4/d10/d35/d2b/d38/d65/lb4 6 1 2026-03-09T16:15:20.371 INFO:tasks.workunit.client.1.vm05.stdout:9/667: fsync d4/d10/d35/d2b/d38/f5e 0 2026-03-09T16:15:20.371 INFO:tasks.workunit.client.1.vm05.stdout:9/668: rename d4/d10/l25 to d4/d10/d35/d2b/d31/d82/le4 0 2026-03-09T16:15:20.377 INFO:tasks.workunit.client.1.vm05.stdout:3/567: truncate d0/d9/d22/d5f/d7b/d99/f9d 664575 0 2026-03-09T16:15:20.381 INFO:tasks.workunit.client.1.vm05.stdout:3/568: dwrite d0/d33/f5e [0,4194304] 0 2026-03-09T16:15:20.382 INFO:tasks.workunit.client.1.vm05.stdout:3/569: chown d0/da9 152894 1 2026-03-09T16:15:20.419 INFO:tasks.workunit.client.1.vm05.stdout:4/710: getdents 
d5/de/d15/da9/db1/dad 0 2026-03-09T16:15:20.421 INFO:tasks.workunit.client.1.vm05.stdout:4/711: mkdir d5/de/d15/d21/da0/de3/d100 0 2026-03-09T16:15:20.422 INFO:tasks.workunit.client.1.vm05.stdout:4/712: read d5/de/d15/d21/f50 [45299,28781] 0 2026-03-09T16:15:20.423 INFO:tasks.workunit.client.1.vm05.stdout:4/713: creat d5/de/d15/da9/f101 x:0 0 0 2026-03-09T16:15:20.426 INFO:tasks.workunit.client.1.vm05.stdout:4/714: mknod d5/de/d82/c102 0 2026-03-09T16:15:20.426 INFO:tasks.workunit.client.1.vm05.stdout:4/715: chown d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/ccf 10 1 2026-03-09T16:15:20.451 INFO:tasks.workunit.client.1.vm05.stdout:0/638: dread d5/db/d5f/f85 [0,4194304] 0 2026-03-09T16:15:20.454 INFO:tasks.workunit.client.1.vm05.stdout:0/639: unlink d5/d97/c98 0 2026-03-09T16:15:20.459 INFO:tasks.workunit.client.1.vm05.stdout:4/716: rename d5/de/d15/da9/db1/dad/d90/dbb to d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d103 0 2026-03-09T16:15:20.460 INFO:tasks.workunit.client.1.vm05.stdout:7/690: rmdir d1/d2/d8 39 2026-03-09T16:15:20.460 INFO:tasks.workunit.client.1.vm05.stdout:0/640: unlink d5/db/d5f/da3/fb1 0 2026-03-09T16:15:20.461 INFO:tasks.workunit.client.1.vm05.stdout:0/641: stat d5/d2c/d49/d83/d8b 0 2026-03-09T16:15:20.465 INFO:tasks.workunit.client.1.vm05.stdout:5/652: dwrite d8/f7b [0,4194304] 0 2026-03-09T16:15:20.472 INFO:tasks.workunit.client.1.vm05.stdout:6/623: dwrite d17/d22/f79 [0,4194304] 0 2026-03-09T16:15:20.475 INFO:tasks.workunit.client.1.vm05.stdout:4/717: dwrite d5/fd [4194304,4194304] 0 2026-03-09T16:15:20.477 INFO:tasks.workunit.client.1.vm05.stdout:2/609: dwrite db/dd/d15/d1f/d20/f53 [0,4194304] 0 2026-03-09T16:15:20.479 INFO:tasks.workunit.client.1.vm05.stdout:7/691: mknod d1/cf5 0 2026-03-09T16:15:20.479 INFO:tasks.workunit.client.1.vm05.stdout:6/624: fsync d17/d22/d27/fd7 0 2026-03-09T16:15:20.485 INFO:tasks.workunit.client.1.vm05.stdout:5/653: creat d8/d59/d5b/d8b/da0/feb x:0 0 0 2026-03-09T16:15:20.485 INFO:tasks.workunit.client.1.vm05.stdout:4/718: readlink d5/l1d 0 2026-03-09T16:15:20.492 INFO:tasks.workunit.client.1.vm05.stdout:4/719: symlink d5/de/d15/d21/d27/l104 0 2026-03-09T16:15:20.498 INFO:tasks.workunit.client.1.vm05.stdout:7/692: rename d1/cf5 to d1/d2/d8/dc/d1b/d30/d4b/db2/cf6 0 2026-03-09T16:15:20.498 INFO:tasks.workunit.client.1.vm05.stdout:5/654: dwrite d8/d18/dbc/dcc/daa/d43/f41 [0,4194304] 0 2026-03-09T16:15:20.498 INFO:tasks.workunit.client.1.vm05.stdout:6/625: dwrite d17/d22/d27/d44/f48 [4194304,4194304] 0 2026-03-09T16:15:20.501 INFO:tasks.workunit.client.1.vm05.stdout:6/626: chown d17/d22/d27/d8a/f88 320574293 1 2026-03-09T16:15:20.505 INFO:tasks.workunit.client.1.vm05.stdout:5/655: dwrite d8/d18/dbc/dcc/daa/f52 [4194304,4194304] 0 2026-03-09T16:15:20.508 INFO:tasks.workunit.client.1.vm05.stdout:6/627: dread - d17/d22/d27/d8a/fd0 zero size 2026-03-09T16:15:20.511 INFO:tasks.workunit.client.1.vm05.stdout:6/628: fdatasync d17/d22/d27/d8a/fa7 0 2026-03-09T16:15:20.511 INFO:tasks.workunit.client.1.vm05.stdout:7/693: creat d1/d2/d8/d31/ff7 x:0 0 0 2026-03-09T16:15:20.515 INFO:tasks.workunit.client.1.vm05.stdout:6/629: rmdir d17/d22/d27/d44 39 2026-03-09T16:15:20.516 INFO:tasks.workunit.client.1.vm05.stdout:6/630: write d17/d22/fe4 [346142,11798] 0 2026-03-09T16:15:20.517 INFO:tasks.workunit.client.1.vm05.stdout:5/656: dwrite d8/d18/d1b/d78/d90/fc4 [0,4194304] 0 2026-03-09T16:15:20.517 INFO:tasks.workunit.client.1.vm05.stdout:7/694: read - d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/f49 zero size 2026-03-09T16:15:20.525 INFO:tasks.workunit.client.1.vm05.stdout:5/657: chown d8/c12 
1162686 1 2026-03-09T16:15:20.527 INFO:tasks.workunit.client.1.vm05.stdout:5/658: chown d8/d18/dbc/dcc/daa 171 1 2026-03-09T16:15:20.528 INFO:tasks.workunit.client.1.vm05.stdout:7/695: dwrite d1/d2/d8/d31/fc5 [0,4194304] 0 2026-03-09T16:15:20.529 INFO:tasks.workunit.client.1.vm05.stdout:6/631: rename d17/f1b to d17/d22/d9d/fe8 0 2026-03-09T16:15:20.530 INFO:tasks.workunit.client.1.vm05.stdout:6/632: write d17/f5b [1165116,76698] 0 2026-03-09T16:15:20.532 INFO:tasks.workunit.client.1.vm05.stdout:6/633: chown d17/l49 50401346 1 2026-03-09T16:15:20.532 INFO:tasks.workunit.client.1.vm05.stdout:3/570: truncate d0/d9/d22/f30 4538334 0 2026-03-09T16:15:20.536 INFO:tasks.workunit.client.1.vm05.stdout:9/669: dwrite d4/f4a [4194304,4194304] 0 2026-03-09T16:15:20.539 INFO:tasks.workunit.client.1.vm05.stdout:7/696: dread d1/d2/d8/dc/f1a [4194304,4194304] 0 2026-03-09T16:15:20.539 INFO:tasks.workunit.client.1.vm05.stdout:1/714: dread d7/f34 [0,4194304] 0 2026-03-09T16:15:20.539 INFO:tasks.workunit.client.1.vm05.stdout:5/659: link d8/d18/d1b/l6a d8/d59/d5b/d8b/da0/lec 0 2026-03-09T16:15:20.539 INFO:tasks.workunit.client.1.vm05.stdout:9/670: getdents d4/d10/d35/d36/d48/d60/dae 0 2026-03-09T16:15:20.545 INFO:tasks.workunit.client.1.vm05.stdout:3/571: rename d0/d9/d22/c95 to d0/d9/d97/dbc/cbe 0 2026-03-09T16:15:20.547 INFO:tasks.workunit.client.1.vm05.stdout:6/634: getdents d17/d22/d27/d58/db8 0 2026-03-09T16:15:20.548 INFO:tasks.workunit.client.1.vm05.stdout:1/715: rename d7/d62/da3 to d7/dd/d21/d63/d71/ddc/df8 0 2026-03-09T16:15:20.549 INFO:tasks.workunit.client.1.vm05.stdout:3/572: creat d0/d9/d22/d4c/d80/fbf x:0 0 0 2026-03-09T16:15:20.549 INFO:tasks.workunit.client.1.vm05.stdout:9/671: dwrite d4/d10/d35/d36/f85 [4194304,4194304] 0 2026-03-09T16:15:20.555 INFO:tasks.workunit.client.1.vm05.stdout:6/635: truncate d17/d5d/d73/d83/fd5 566805 0 2026-03-09T16:15:20.555 INFO:tasks.workunit.client.1.vm05.stdout:9/672: chown d4/d10/d35/d36/f49 24231 1 2026-03-09T16:15:20.556 INFO:tasks.workunit.client.1.vm05.stdout:5/660: truncate d8/d18/d1b/f30 5059143 0 2026-03-09T16:15:20.556 INFO:tasks.workunit.client.1.vm05.stdout:3/573: mknod d0/d9/d22/d4c/d4e/cc0 0 2026-03-09T16:15:20.559 INFO:tasks.workunit.client.1.vm05.stdout:7/697: link d1/d2/d8/dc/d1b/d30/d4b/fe7 d1/d2/d8/dc/d1b/de6/ff8 0 2026-03-09T16:15:20.568 INFO:tasks.workunit.client.1.vm05.stdout:1/716: rename d7/d15/d6e/dbc/dd6/dda to d7/dd/de/d52/df6/d55/df9 0 2026-03-09T16:15:20.568 INFO:tasks.workunit.client.1.vm05.stdout:3/574: write d0/d9/d8b/fb0 [409118,86597] 0 2026-03-09T16:15:20.568 INFO:tasks.workunit.client.1.vm05.stdout:9/673: mknod d4/ce5 0 2026-03-09T16:15:20.568 INFO:tasks.workunit.client.1.vm05.stdout:5/661: chown d8/d18/d1b/f32 482 1 2026-03-09T16:15:20.568 INFO:tasks.workunit.client.1.vm05.stdout:1/717: chown d7/dd/d21/d63/d71/f7b 3620136 1 2026-03-09T16:15:20.568 INFO:tasks.workunit.client.1.vm05.stdout:6/636: rename d17/d22/d27/d58/lc0 to d17/d22/d27/d8a/le9 0 2026-03-09T16:15:20.568 INFO:tasks.workunit.client.1.vm05.stdout:9/674: mknod d4/d10/ce6 0 2026-03-09T16:15:20.568 INFO:tasks.workunit.client.1.vm05.stdout:9/675: chown d4/d10/faa 197462824 1 2026-03-09T16:15:20.568 INFO:tasks.workunit.client.1.vm05.stdout:1/718: rename d7/dd/l24 to d7/dbe/ded/df3/lfa 0 2026-03-09T16:15:20.573 INFO:tasks.workunit.client.1.vm05.stdout:6/637: creat d17/d22/d27/d58/fea x:0 0 0 2026-03-09T16:15:20.574 INFO:tasks.workunit.client.1.vm05.stdout:1/719: chown d7/ca 47961918 1 2026-03-09T16:15:20.574 INFO:tasks.workunit.client.1.vm05.stdout:5/662: mknod 
d8/d18/ced 0 2026-03-09T16:15:20.574 INFO:tasks.workunit.client.1.vm05.stdout:9/676: dwrite d4/d10/d35/d2b/f2c [0,4194304] 0 2026-03-09T16:15:20.576 INFO:tasks.workunit.client.1.vm05.stdout:3/575: rename d0/d9/d22/d5f/d75/d76/d88/d89/la4 to d0/d9/d22/d5f/d75/d76/d88/da3/lc1 0 2026-03-09T16:15:20.587 INFO:tasks.workunit.client.1.vm05.stdout:3/576: dread d0/d9/d22/d4c/d4e/f5d [0,4194304] 0 2026-03-09T16:15:20.590 INFO:tasks.workunit.client.1.vm05.stdout:1/720: link d7/dd/de/c12 d7/d15/d6e/cfb 0 2026-03-09T16:15:20.591 INFO:tasks.workunit.client.1.vm05.stdout:1/721: rmdir d7/dbe/ded 39 2026-03-09T16:15:20.592 INFO:tasks.workunit.client.1.vm05.stdout:9/677: link d4/d10/d35/d36/d48/d60/cb1 d4/d10/d35/d36/ce7 0 2026-03-09T16:15:20.599 INFO:tasks.workunit.client.1.vm05.stdout:9/678: rename f2 to d4/d10/d35/d2b/d31/d96/ddd/fe8 0 2026-03-09T16:15:20.617 INFO:tasks.workunit.client.1.vm05.stdout:9/679: write d4/d10/d35/d36/d48/fb8 [929668,16238] 0 2026-03-09T16:15:20.617 INFO:tasks.workunit.client.1.vm05.stdout:5/663: dwrite d8/d59/f5c [4194304,4194304] 0 2026-03-09T16:15:20.617 INFO:tasks.workunit.client.1.vm05.stdout:9/680: fdatasync d4/d10/d35/d36/d48/f8e 0 2026-03-09T16:15:20.617 INFO:tasks.workunit.client.1.vm05.stdout:3/577: dwrite d0/d9/d22/d5f/d7b/f9a [0,4194304] 0 2026-03-09T16:15:20.617 INFO:tasks.workunit.client.1.vm05.stdout:3/578: write d0/d9/d22/d5f/d75/d76/d88/d89/f9e [426631,75419] 0 2026-03-09T16:15:20.617 INFO:tasks.workunit.client.1.vm05.stdout:3/579: readlink d0/d9/d22/d4c/d4e/l59 0 2026-03-09T16:15:20.617 INFO:tasks.workunit.client.1.vm05.stdout:5/664: dread d8/d18/dbc/dcc/daa/f52 [4194304,4194304] 0 2026-03-09T16:15:20.618 INFO:tasks.workunit.client.1.vm05.stdout:9/681: creat d4/d10/d35/d2b/d38/fe9 x:0 0 0 2026-03-09T16:15:20.620 INFO:tasks.workunit.client.1.vm05.stdout:9/682: unlink d4/d10/l33 0 2026-03-09T16:15:20.622 INFO:tasks.workunit.client.1.vm05.stdout:9/683: mkdir d4/d10/d35/d2b/d38/d65/dea 0 2026-03-09T16:15:20.622 INFO:tasks.workunit.client.1.vm05.stdout:9/684: stat d4/d10/d35/d2b/c74 0 2026-03-09T16:15:20.624 INFO:tasks.workunit.client.1.vm05.stdout:9/685: creat d4/d10/dd7/feb x:0 0 0 2026-03-09T16:15:20.624 INFO:tasks.workunit.client.1.vm05.stdout:5/665: dwrite d8/d59/d5b/d8b/da0/feb [0,4194304] 0 2026-03-09T16:15:20.626 INFO:tasks.workunit.client.1.vm05.stdout:5/666: read d8/f7b [2111010,49630] 0 2026-03-09T16:15:20.627 INFO:tasks.workunit.client.1.vm05.stdout:5/667: dread - d8/d5e/fd6 zero size 2026-03-09T16:15:20.633 INFO:tasks.workunit.client.1.vm05.stdout:9/686: mkdir d4/d10/d35/d2b/d31/d82/dec 0 2026-03-09T16:15:20.633 INFO:tasks.workunit.client.1.vm05.stdout:5/668: rename d8/d18/d1b/d6b/cc6 to d8/d5e/d8e/cee 0 2026-03-09T16:15:20.635 INFO:tasks.workunit.client.1.vm05.stdout:5/669: fsync d8/d53/d7a/fc7 0 2026-03-09T16:15:20.635 INFO:tasks.workunit.client.1.vm05.stdout:5/670: readlink d8/d18/d1b/d47/l87 0 2026-03-09T16:15:20.649 INFO:tasks.workunit.client.1.vm05.stdout:5/671: rename d8/d59/l7f to d8/d53/d7e/lef 0 2026-03-09T16:15:20.649 INFO:tasks.workunit.client.1.vm05.stdout:5/672: dwrite d8/d18/dbc/dcc/daa/d43/f41 [4194304,4194304] 0 2026-03-09T16:15:20.651 INFO:tasks.workunit.client.1.vm05.stdout:5/673: unlink d8/d59/d5b/d8b/da0/cbe 0 2026-03-09T16:15:20.653 INFO:tasks.workunit.client.1.vm05.stdout:5/674: creat d8/d18/d1b/d47/d48/d73/d80/ff0 x:0 0 0 2026-03-09T16:15:20.675 INFO:tasks.workunit.client.1.vm05.stdout:5/675: rmdir d8/d18/d1b/d47/d4e/d76/d8f/dab 39 2026-03-09T16:15:20.675 INFO:tasks.workunit.client.1.vm05.stdout:5/676: creat 
d8/d18/d1b/d47/d4e/ff1 x:0 0 0 2026-03-09T16:15:20.675 INFO:tasks.workunit.client.1.vm05.stdout:5/677: chown d8/d18/d1b/l42 56750 1 2026-03-09T16:15:20.675 INFO:tasks.workunit.client.1.vm05.stdout:5/678: creat d8/d18/d1b/d47/d4e/d76/d8f/dab/ff2 x:0 0 0 2026-03-09T16:15:20.675 INFO:tasks.workunit.client.1.vm05.stdout:5/679: rmdir d8/dd5 39 2026-03-09T16:15:20.675 INFO:tasks.workunit.client.1.vm05.stdout:5/680: dwrite d8/d1d/f85 [0,4194304] 0 2026-03-09T16:15:20.675 INFO:tasks.workunit.client.1.vm05.stdout:5/681: creat d8/dc8/ff3 x:0 0 0 2026-03-09T16:15:20.675 INFO:tasks.workunit.client.1.vm05.stdout:5/682: write d8/d18/dbc/dcc/daa/fb1 [1080060,20340] 0 2026-03-09T16:15:20.744 INFO:tasks.workunit.client.1.vm05.stdout:7/698: sync 2026-03-09T16:15:20.744 INFO:tasks.workunit.client.1.vm05.stdout:3/580: sync 2026-03-09T16:15:20.744 INFO:tasks.workunit.client.1.vm05.stdout:1/722: sync 2026-03-09T16:15:20.744 INFO:tasks.workunit.client.1.vm05.stdout:1/723: stat d7/d15/f8d 0 2026-03-09T16:15:20.746 INFO:tasks.workunit.client.1.vm05.stdout:3/581: mkdir d0/d9/d97/dc2 0 2026-03-09T16:15:20.747 INFO:tasks.workunit.client.1.vm05.stdout:7/699: mknod d1/d2/d8/dc/d1b/cf9 0 2026-03-09T16:15:20.749 INFO:tasks.workunit.client.1.vm05.stdout:1/724: dwrite d7/ff4 [0,4194304] 0 2026-03-09T16:15:20.750 INFO:tasks.workunit.client.1.vm05.stdout:7/700: chown d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/c75 59 1 2026-03-09T16:15:20.756 INFO:tasks.workunit.client.1.vm05.stdout:7/701: dwrite d1/d2/d8/dc/d1b/d30/d4b/d65/f63 [0,4194304] 0 2026-03-09T16:15:20.765 INFO:tasks.workunit.client.1.vm05.stdout:7/702: dread d1/d2/d8/d31/d8d/f52 [0,4194304] 0 2026-03-09T16:15:20.768 INFO:tasks.workunit.client.1.vm05.stdout:7/703: mkdir d1/d2/d11/d86/da2/dfa 0 2026-03-09T16:15:20.769 INFO:tasks.workunit.client.1.vm05.stdout:7/704: mknod d1/d2/d8/dc/d1b/d30/d4b/db2/cfb 0 2026-03-09T16:15:20.770 INFO:tasks.workunit.client.1.vm05.stdout:7/705: mknod d1/d2/d8/d31/cfc 0 2026-03-09T16:15:20.872 INFO:tasks.workunit.client.1.vm05.stdout:0/642: write d5/db/d5b/da5/fbe [248781,1557] 0 2026-03-09T16:15:20.872 INFO:tasks.workunit.client.1.vm05.stdout:2/610: write db/dd/d15/d4c/d56/f62 [2142665,124716] 0 2026-03-09T16:15:20.873 INFO:tasks.workunit.client.1.vm05.stdout:4/720: write d5/de/f24 [761814,19304] 0 2026-03-09T16:15:20.877 INFO:tasks.workunit.client.1.vm05.stdout:2/611: symlink db/dd/d15/d1f/lc7 0 2026-03-09T16:15:20.879 INFO:tasks.workunit.client.1.vm05.stdout:0/643: link d5/d2c/d49/d83/f9c d5/d1b/d3b/dc2/fe1 0 2026-03-09T16:15:20.879 INFO:tasks.workunit.client.1.vm05.stdout:2/612: creat db/dd/d15/d3f/d5b/d60/d6a/fc8 x:0 0 0 2026-03-09T16:15:20.880 INFO:tasks.workunit.client.1.vm05.stdout:4/721: link d5/de/d2f/fc5 d5/de/d15/da9/db1/dad/d90/f105 0 2026-03-09T16:15:20.881 INFO:tasks.workunit.client.1.vm05.stdout:0/644: write d5/d2c/d49/d83/d8b/d95/f6d [857301,59439] 0 2026-03-09T16:15:20.885 INFO:tasks.workunit.client.1.vm05.stdout:0/645: creat d5/d97/fe2 x:0 0 0 2026-03-09T16:15:20.885 INFO:tasks.workunit.client.1.vm05.stdout:4/722: symlink d5/de/d15/da9/db1/dad/d90/l106 0 2026-03-09T16:15:20.887 INFO:tasks.workunit.client.1.vm05.stdout:4/723: fdatasync d5/de/d15/da9/db1/f58 0 2026-03-09T16:15:20.887 INFO:tasks.workunit.client.1.vm05.stdout:0/646: truncate d5/d1b/d3b/dc2/fe1 461544 0 2026-03-09T16:15:20.892 INFO:tasks.workunit.client.1.vm05.stdout:4/724: creat d5/de/d15/d21/d27/d3c/d5c/f107 x:0 0 0 2026-03-09T16:15:20.895 INFO:tasks.workunit.client.1.vm05.stdout:4/725: mkdir d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d108 0 2026-03-09T16:15:20.896 
INFO:tasks.workunit.client.1.vm05.stdout:2/613: dread db/dd/f1b [0,4194304] 0 2026-03-09T16:15:20.902 INFO:tasks.workunit.client.1.vm05.stdout:4/726: dwrite d5/de/d15/d21/d27/d3c/d5c/ff1 [0,4194304] 0 2026-03-09T16:15:20.905 INFO:tasks.workunit.client.1.vm05.stdout:4/727: write d5/de/d15/d21/d27/f36 [1205763,20296] 0 2026-03-09T16:15:20.906 INFO:tasks.workunit.client.1.vm05.stdout:4/728: creat d5/d9c/f109 x:0 0 0 2026-03-09T16:15:20.908 INFO:tasks.workunit.client.1.vm05.stdout:4/729: rmdir d5/de/d82/dc1 39 2026-03-09T16:15:20.933 INFO:tasks.workunit.client.1.vm05.stdout:6/638: truncate d17/f31 4012553 0 2026-03-09T16:15:20.936 INFO:tasks.workunit.client.1.vm05.stdout:6/639: mknod d17/d22/d27/d34/d4b/d7f/ceb 0 2026-03-09T16:15:20.939 INFO:tasks.workunit.client.1.vm05.stdout:6/640: mknod d17/cec 0 2026-03-09T16:15:20.942 INFO:tasks.workunit.client.1.vm05.stdout:8/668: dread d4/d6/d3a/d40/f7b [0,4194304] 0 2026-03-09T16:15:20.943 INFO:tasks.workunit.client.1.vm05.stdout:6/641: symlink d17/d22/d27/d34/led 0 2026-03-09T16:15:20.944 INFO:tasks.workunit.client.1.vm05.stdout:8/669: mknod d4/d6/db/dc/ce0 0 2026-03-09T16:15:20.948 INFO:tasks.workunit.client.1.vm05.stdout:8/670: mknod d4/d6/db/dc/d2e/d85/dc7/dda/ce1 0 2026-03-09T16:15:20.950 INFO:tasks.workunit.client.1.vm05.stdout:6/642: dwrite d17/f95 [0,4194304] 0 2026-03-09T16:15:20.953 INFO:tasks.workunit.client.1.vm05.stdout:8/671: creat d4/d6/d3a/d40/d71/fe2 x:0 0 0 2026-03-09T16:15:20.953 INFO:tasks.workunit.client.1.vm05.stdout:8/672: fsync d4/d6/db/dc/d5d/da0/dbf/fcf 0 2026-03-09T16:15:20.953 INFO:tasks.workunit.client.1.vm05.stdout:5/683: truncate d8/d18/f20 1786924 0 2026-03-09T16:15:20.954 INFO:tasks.workunit.client.1.vm05.stdout:6/643: mknod d17/d22/d9d/da5/cee 0 2026-03-09T16:15:20.955 INFO:tasks.workunit.client.1.vm05.stdout:5/684: creat d8/d5e/ff4 x:0 0 0 2026-03-09T16:15:20.957 INFO:tasks.workunit.client.1.vm05.stdout:5/685: truncate d8/d59/d5b/d8b/da0/feb 4284293 0 2026-03-09T16:15:20.961 INFO:tasks.workunit.client.1.vm05.stdout:6/644: dwrite d17/d4f/f77 [0,4194304] 0 2026-03-09T16:15:20.967 INFO:tasks.workunit.client.1.vm05.stdout:8/673: dread d4/f1c [0,4194304] 0 2026-03-09T16:15:20.969 INFO:tasks.workunit.client.1.vm05.stdout:8/674: chown d4/d6/db/df/d4f/lc4 263414 1 2026-03-09T16:15:20.976 INFO:tasks.workunit.client.1.vm05.stdout:1/725: dwrite d7/d27/f3c [0,4194304] 0 2026-03-09T16:15:20.977 INFO:tasks.workunit.client.1.vm05.stdout:7/706: write d1/d2/f5 [1373567,12750] 0 2026-03-09T16:15:20.978 INFO:tasks.workunit.client.1.vm05.stdout:8/675: dread - d4/d6/d3a/d15/f65 zero size 2026-03-09T16:15:20.979 INFO:tasks.workunit.client.1.vm05.stdout:3/582: dread d0/f5a [0,4194304] 0 2026-03-09T16:15:20.986 INFO:tasks.workunit.client.1.vm05.stdout:1/726: rmdir d7/dd/de/d52/df6 39 2026-03-09T16:15:20.988 INFO:tasks.workunit.client.1.vm05.stdout:6/645: creat d17/d5d/fef x:0 0 0 2026-03-09T16:15:20.990 INFO:tasks.workunit.client.1.vm05.stdout:8/676: write d4/d6/db/dc/d2e/f47 [7302062,86240] 0 2026-03-09T16:15:20.990 INFO:tasks.workunit.client.1.vm05.stdout:3/583: dwrite d0/d9/d22/d5f/d75/d76/fa5 [0,4194304] 0 2026-03-09T16:15:20.991 INFO:tasks.workunit.client.1.vm05.stdout:6/646: read d17/f5b [2620039,66793] 0 2026-03-09T16:15:21.000 INFO:tasks.workunit.client.1.vm05.stdout:7/707: dread d1/d2/d8/dc/d33/f9d [0,4194304] 0 2026-03-09T16:15:21.000 INFO:tasks.workunit.client.1.vm05.stdout:3/584: rmdir d0/d9/d8b 39 2026-03-09T16:15:21.004 INFO:tasks.workunit.client.1.vm05.stdout:8/677: rename d4/d6/f1b to d4/d6/db/d9b/fe3 0 
2026-03-09T16:15:21.005 INFO:tasks.workunit.client.1.vm05.stdout:4/730: rmdir d5/de/d2f 39 2026-03-09T16:15:21.011 INFO:tasks.workunit.client.1.vm05.stdout:9/687: dread d4/d10/f15 [0,4194304] 0 2026-03-09T16:15:21.014 INFO:tasks.workunit.client.1.vm05.stdout:0/647: truncate d5/f76 2829440 0 2026-03-09T16:15:21.016 INFO:tasks.workunit.client.1.vm05.stdout:2/614: truncate db/f2d 1027964 0 2026-03-09T16:15:21.016 INFO:tasks.workunit.client.1.vm05.stdout:2/615: dread - db/dd/fb8 zero size 2026-03-09T16:15:21.017 INFO:tasks.workunit.client.1.vm05.stdout:3/585: mknod d0/d9/d22/d4c/d80/cc3 0 2026-03-09T16:15:21.018 INFO:tasks.workunit.client.1.vm05.stdout:8/678: fdatasync f0 0 2026-03-09T16:15:21.019 INFO:tasks.workunit.client.1.vm05.stdout:6/647: creat d17/d22/d9d/da9/ff0 x:0 0 0 2026-03-09T16:15:21.020 INFO:tasks.workunit.client.1.vm05.stdout:4/731: mknod d5/de/d15/da9/c10a 0 2026-03-09T16:15:21.021 INFO:tasks.workunit.client.1.vm05.stdout:0/648: creat d5/d2c/d49/d83/d8b/daf/fe3 x:0 0 0 2026-03-09T16:15:21.023 INFO:tasks.workunit.client.1.vm05.stdout:9/688: getdents d4/d10/d35/d36 0 2026-03-09T16:15:21.029 INFO:tasks.workunit.client.1.vm05.stdout:9/689: chown d4/d10/d35/d2b/d38/f62 137535 1 2026-03-09T16:15:21.029 INFO:tasks.workunit.client.1.vm05.stdout:4/732: link d5/de/d15/da9/db1/l97 d5/de/d15/da9/db1/dad/d37/d60/dbf/d7d/l10b 0 2026-03-09T16:15:21.032 INFO:tasks.workunit.client.1.vm05.stdout:6/648: read d17/d1d/f67 [935373,124764] 0 2026-03-09T16:15:21.033 INFO:tasks.workunit.client.1.vm05.stdout:8/679: dread d4/d6/d53/f5a [0,4194304] 0 2026-03-09T16:15:21.035 INFO:tasks.workunit.client.1.vm05.stdout:4/733: link d5/de/d15/d21/d27/d3c/d5c/d5f/db6/fb8 d5/de/d15/d21/d27/d3c/d5c/d5f/f10c 0 2026-03-09T16:15:21.038 INFO:tasks.workunit.client.1.vm05.stdout:4/734: dread - d5/de/d15/d21/f6d zero size 2026-03-09T16:15:21.038 INFO:tasks.workunit.client.1.vm05.stdout:9/690: dread d4/d10/d35/f32 [0,4194304] 0 2026-03-09T16:15:21.039 INFO:tasks.workunit.client.1.vm05.stdout:8/680: unlink d4/d6/db/dc/d2e/d85/dc7/dda/ce1 0 2026-03-09T16:15:21.039 INFO:tasks.workunit.client.1.vm05.stdout:4/735: creat d5/de/d2f/f10d x:0 0 0 2026-03-09T16:15:21.040 INFO:tasks.workunit.client.1.vm05.stdout:9/691: creat d4/d10/d35/d36/d48/d54/db0/fed x:0 0 0 2026-03-09T16:15:21.043 INFO:tasks.workunit.client.1.vm05.stdout:8/681: creat d4/d6/db/d75/fe4 x:0 0 0 2026-03-09T16:15:21.043 INFO:tasks.workunit.client.1.vm05.stdout:4/736: symlink d5/de/d2f/l10e 0 2026-03-09T16:15:21.043 INFO:tasks.workunit.client.1.vm05.stdout:9/692: symlink d4/d10/d35/d36/lee 0 2026-03-09T16:15:21.052 INFO:tasks.workunit.client.1.vm05.stdout:3/586: sync 2026-03-09T16:15:21.055 INFO:tasks.workunit.client.1.vm05.stdout:4/737: creat d5/de/d15/d21/dfe/f10f x:0 0 0 2026-03-09T16:15:21.057 INFO:tasks.workunit.client.1.vm05.stdout:3/587: fdatasync d0/d9/d22/d5f/f66 0 2026-03-09T16:15:21.071 INFO:tasks.workunit.client.1.vm05.stdout:3/588: unlink d0/da9/laa 0 2026-03-09T16:15:21.071 INFO:tasks.workunit.client.1.vm05.stdout:4/738: fdatasync d5/de/d15/d21/f6d 0 2026-03-09T16:15:21.072 INFO:tasks.workunit.client.1.vm05.stdout:4/739: stat d5/de/d15/d21/da0 0 2026-03-09T16:15:21.086 INFO:tasks.workunit.client.1.vm05.stdout:8/682: dread d4/f77 [0,4194304] 0 2026-03-09T16:15:21.089 INFO:tasks.workunit.client.1.vm05.stdout:8/683: write d4/d6/d3a/d3c/f3f [98771,79605] 0 2026-03-09T16:15:21.089 INFO:tasks.workunit.client.1.vm05.stdout:8/684: readlink d4/d6/db/dc/d3b/l3d 0 2026-03-09T16:15:21.090 INFO:tasks.workunit.client.1.vm05.stdout:5/686: write d8/d53/d7e/f8a 
[718492,83385] 0 2026-03-09T16:15:21.093 INFO:tasks.workunit.client.1.vm05.stdout:8/685: write d4/d6/d3a/d40/d71/fe2 [919536,70742] 0 2026-03-09T16:15:21.094 INFO:tasks.workunit.client.1.vm05.stdout:3/589: getdents d0/d9/d22/d4c/d4e 0 2026-03-09T16:15:21.097 INFO:tasks.workunit.client.1.vm05.stdout:5/687: write d8/d18/dbc/dcc/f94 [1080192,118656] 0 2026-03-09T16:15:21.099 INFO:tasks.workunit.client.1.vm05.stdout:3/590: rename d0/d9/d22/d4c/l87 to d0/d33/lc4 0 2026-03-09T16:15:21.101 INFO:tasks.workunit.client.1.vm05.stdout:5/688: rmdir d8/d18/d1b/d47/d4e 39 2026-03-09T16:15:21.101 INFO:tasks.workunit.client.1.vm05.stdout:3/591: chown d0/d9/d22/d5f/d75/d76/d88/da3 1441 1 2026-03-09T16:15:21.104 INFO:tasks.workunit.client.1.vm05.stdout:8/686: truncate d4/d6/d53/f7f 154901 0 2026-03-09T16:15:21.104 INFO:tasks.workunit.client.1.vm05.stdout:5/689: read - d8/d18/d1b/d47/d4e/d76/d8f/dab/ff2 zero size 2026-03-09T16:15:21.104 INFO:tasks.workunit.client.1.vm05.stdout:3/592: rmdir d0/d9/d22/d4c/d4e/db3 39 2026-03-09T16:15:21.106 INFO:tasks.workunit.client.1.vm05.stdout:5/690: symlink d8/lf5 0 2026-03-09T16:15:21.109 INFO:tasks.workunit.client.1.vm05.stdout:8/687: creat d4/d6/d3a/d3c/fe5 x:0 0 0 2026-03-09T16:15:21.109 INFO:tasks.workunit.client.1.vm05.stdout:3/593: truncate d0/d9/f2b 4452439 0 2026-03-09T16:15:21.113 INFO:tasks.workunit.client.1.vm05.stdout:8/688: dread - d4/d6/db/df/fdc zero size 2026-03-09T16:15:21.116 INFO:tasks.workunit.client.1.vm05.stdout:5/691: mknod d8/d18/dbc/dcc/daa/cf6 0 2026-03-09T16:15:21.121 INFO:tasks.workunit.client.1.vm05.stdout:8/689: rename d4/d6/db/df/dd1/cdf to d4/d6/db/d9b/ce6 0 2026-03-09T16:15:21.122 INFO:tasks.workunit.client.1.vm05.stdout:0/649: dread d5/d11/d4f/d70/fd7 [0,4194304] 0 2026-03-09T16:15:21.123 INFO:tasks.workunit.client.1.vm05.stdout:5/692: symlink d8/d18/d1b/d47/d48/d73/lf7 0 2026-03-09T16:15:21.123 INFO:tasks.workunit.client.1.vm05.stdout:8/690: chown d4/d6/db/dc/d5d/da0/dbf/fcf 3 1 2026-03-09T16:15:21.124 INFO:tasks.workunit.client.1.vm05.stdout:8/691: chown d4/d6/db/dc/d5d/da0/dbf 187 1 2026-03-09T16:15:21.125 INFO:tasks.workunit.client.1.vm05.stdout:1/727: truncate d7/d62/d72/f79 1442350 0 2026-03-09T16:15:21.128 INFO:tasks.workunit.client.1.vm05.stdout:1/728: chown d7/dd/d21/d63/d71/f7b 3 1 2026-03-09T16:15:21.129 INFO:tasks.workunit.client.1.vm05.stdout:1/729: stat d7/d15/d16/f5f 0 2026-03-09T16:15:21.132 INFO:tasks.workunit.client.1.vm05.stdout:5/693: mknod d8/d18/d1b/d47/dda/cf8 0 2026-03-09T16:15:21.134 INFO:tasks.workunit.client.1.vm05.stdout:7/708: dwrite d1/d2/d8/dc/d1b/d30/d4b/d65/f7f [0,4194304] 0 2026-03-09T16:15:21.138 INFO:tasks.workunit.client.1.vm05.stdout:6/649: truncate d17/d22/d27/d34/d4b/fa4 17874 0 2026-03-09T16:15:21.138 INFO:tasks.workunit.client.1.vm05.stdout:2/616: dwrite db/dd/d15/d3f/d5b/f97 [0,4194304] 0 2026-03-09T16:15:21.139 INFO:tasks.workunit.client.1.vm05.stdout:2/617: chown db/dd/d15/f28 1145 1 2026-03-09T16:15:21.141 INFO:tasks.workunit.client.1.vm05.stdout:8/692: mknod d4/d6/db/ce7 0 2026-03-09T16:15:21.147 INFO:tasks.workunit.client.1.vm05.stdout:0/650: read d5/d1b/d3b/f6f [1733924,117441] 0 2026-03-09T16:15:21.147 INFO:tasks.workunit.client.1.vm05.stdout:5/694: dread d8/d18/dbc/dcc/daa/f52 [4194304,4194304] 0 2026-03-09T16:15:21.153 INFO:tasks.workunit.client.1.vm05.stdout:0/651: dread d5/d1b/d3b/dc2/fe1 [0,4194304] 0 2026-03-09T16:15:21.153 INFO:tasks.workunit.client.1.vm05.stdout:5/695: read d8/d53/d7a/fc7 [509875,110644] 0 2026-03-09T16:15:21.154 INFO:tasks.workunit.client.1.vm05.stdout:0/652: chown 
d5/d1b/f56 3 1 2026-03-09T16:15:21.155 INFO:tasks.workunit.client.1.vm05.stdout:6/650: creat d17/d22/d27/d8a/d8b/ff1 x:0 0 0 2026-03-09T16:15:21.155 INFO:tasks.workunit.client.1.vm05.stdout:7/709: mkdir d1/d2/d8/dfd 0 2026-03-09T16:15:21.158 INFO:tasks.workunit.client.1.vm05.stdout:6/651: fdatasync d17/d4f/fbd 0 2026-03-09T16:15:21.159 INFO:tasks.workunit.client.1.vm05.stdout:2/618: dwrite db/dd/d15/d3f/d5b/d60/f7c [0,4194304] 0 2026-03-09T16:15:21.161 INFO:tasks.workunit.client.1.vm05.stdout:8/693: creat d4/d6/db/dc/d5d/d79/fe8 x:0 0 0 2026-03-09T16:15:21.161 INFO:tasks.workunit.client.1.vm05.stdout:3/594: link d0/d9/d22/d6b/c94 d0/d9/d22/d5f/d90/cc5 0 2026-03-09T16:15:21.165 INFO:tasks.workunit.client.1.vm05.stdout:3/595: write d0/d9/d22/d5f/d90/fa6 [758566,47706] 0 2026-03-09T16:15:21.171 INFO:tasks.workunit.client.1.vm05.stdout:5/696: chown d8/d18/d1b/d47/dda/ce7 5 1 2026-03-09T16:15:21.174 INFO:tasks.workunit.client.1.vm05.stdout:0/653: mknod d5/db/d5f/da3/da4/ce4 0 2026-03-09T16:15:21.179 INFO:tasks.workunit.client.1.vm05.stdout:8/694: mkdir d4/de9 0 2026-03-09T16:15:21.179 INFO:tasks.workunit.client.1.vm05.stdout:0/654: chown d5/db/d5b/da5/fbe 3888 1 2026-03-09T16:15:21.179 INFO:tasks.workunit.client.1.vm05.stdout:0/655: fsync d5/d97/fc0 0 2026-03-09T16:15:21.184 INFO:tasks.workunit.client.1.vm05.stdout:9/693: dwrite d4/d10/f15 [0,4194304] 0 2026-03-09T16:15:21.186 INFO:tasks.workunit.client.1.vm05.stdout:0/656: rmdir d5/db/d5b/da5 39 2026-03-09T16:15:21.188 INFO:tasks.workunit.client.1.vm05.stdout:3/596: chown d0/d9/d22/d4c/d4e/db3/cba 19 1 2026-03-09T16:15:21.191 INFO:tasks.workunit.client.1.vm05.stdout:5/697: mkdir d8/d18/d1b/d47/d4e/d76/d8f/df9 0 2026-03-09T16:15:21.191 INFO:tasks.workunit.client.1.vm05.stdout:3/597: fsync d0/d33/f5e 0 2026-03-09T16:15:21.192 INFO:tasks.workunit.client.1.vm05.stdout:3/598: dread - d0/d33/f7d zero size 2026-03-09T16:15:21.200 INFO:tasks.workunit.client.1.vm05.stdout:5/698: dwrite d8/d59/d5b/d8b/da0/feb [0,4194304] 0 2026-03-09T16:15:21.214 INFO:tasks.workunit.client.1.vm05.stdout:5/699: fdatasync d8/d18/d1b/f32 0 2026-03-09T16:15:21.216 INFO:tasks.workunit.client.1.vm05.stdout:4/740: write d5/de/d2f/f99 [539374,2311] 0 2026-03-09T16:15:21.217 INFO:tasks.workunit.client.1.vm05.stdout:3/599: read d0/d9/d22/d4c/f7f [3102375,55682] 0 2026-03-09T16:15:21.217 INFO:tasks.workunit.client.1.vm05.stdout:4/741: chown d5/de 12425 1 2026-03-09T16:15:21.221 INFO:tasks.workunit.client.1.vm05.stdout:4/742: fsync d5/de/d15/da9/db1/dad/d90/dd8/fe2 0 2026-03-09T16:15:21.222 INFO:tasks.workunit.client.1.vm05.stdout:3/600: symlink d0/d9/d22/d5f/d7b/d99/lc6 0 2026-03-09T16:15:21.223 INFO:tasks.workunit.client.1.vm05.stdout:5/700: fdatasync d8/d59/f83 0 2026-03-09T16:15:21.225 INFO:tasks.workunit.client.1.vm05.stdout:5/701: dread - d8/d18/d1b/d47/d48/d73/d80/ff0 zero size 2026-03-09T16:15:21.226 INFO:tasks.workunit.client.1.vm05.stdout:4/743: mknod d5/de/d15/d21/c110 0 2026-03-09T16:15:21.226 INFO:tasks.workunit.client.1.vm05.stdout:3/601: creat d0/d9/d8b/fc7 x:0 0 0 2026-03-09T16:15:21.227 INFO:tasks.workunit.client.1.vm05.stdout:5/702: stat d8/d18/dbc/dcc/daa/f3c 0 2026-03-09T16:15:21.230 INFO:tasks.workunit.client.1.vm05.stdout:3/602: mknod d0/d33/cc8 0 2026-03-09T16:15:21.231 INFO:tasks.workunit.client.1.vm05.stdout:4/744: mknod d5/de/d15/da9/db1/dad/c111 0 2026-03-09T16:15:21.233 INFO:tasks.workunit.client.1.vm05.stdout:4/745: chown c3 36515735 1 2026-03-09T16:15:21.235 INFO:tasks.workunit.client.1.vm05.stdout:3/603: rename d0/d9/d22/d5f/l98 to 
d0/d9/d22/d5f/d75/d76/lc9 0 2026-03-09T16:15:21.236 INFO:tasks.workunit.client.1.vm05.stdout:3/604: creat d0/d9/d97/fca x:0 0 0 2026-03-09T16:15:21.239 INFO:tasks.workunit.client.1.vm05.stdout:4/746: creat d5/de/d15/d21/d27/d3c/d5c/d5f/f112 x:0 0 0 2026-03-09T16:15:21.241 INFO:tasks.workunit.client.1.vm05.stdout:4/747: creat d5/de/d15/da9/db1/dad/d37/d60/f113 x:0 0 0 2026-03-09T16:15:21.242 INFO:tasks.workunit.client.1.vm05.stdout:4/748: mknod d5/de/d15/da9/db1/dad/d37/d60/c114 0 2026-03-09T16:15:21.243 INFO:tasks.workunit.client.1.vm05.stdout:4/749: write d5/de/d15/d21/d39/f42 [754766,23213] 0 2026-03-09T16:15:21.245 INFO:tasks.workunit.client.1.vm05.stdout:4/750: unlink d5/de/d15/d21/d27/l104 0 2026-03-09T16:15:21.246 INFO:tasks.workunit.client.1.vm05.stdout:4/751: creat d5/de/d15/da9/db1/dad/d37/f115 x:0 0 0 2026-03-09T16:15:21.252 INFO:tasks.workunit.client.1.vm05.stdout:4/752: rename d5/de/d15/d21/d27/d3c/d5c/da2/dc9/deb to d5/d116 0 2026-03-09T16:15:21.264 INFO:tasks.workunit.client.1.vm05.stdout:8/695: rmdir d4/d6/db/d9b 39 2026-03-09T16:15:21.266 INFO:tasks.workunit.client.1.vm05.stdout:4/753: dread d5/de/f16 [0,4194304] 0 2026-03-09T16:15:21.268 INFO:tasks.workunit.client.1.vm05.stdout:8/696: symlink d4/d6/db/df/d4f/d9f/lea 0 2026-03-09T16:15:21.272 INFO:tasks.workunit.client.1.vm05.stdout:4/754: truncate f1 131991 0 2026-03-09T16:15:21.276 INFO:tasks.workunit.client.1.vm05.stdout:4/755: write d5/d9c/f109 [737874,91386] 0 2026-03-09T16:15:21.282 INFO:tasks.workunit.client.1.vm05.stdout:6/652: dread d17/d22/d27/f6b [0,4194304] 0 2026-03-09T16:15:21.285 INFO:tasks.workunit.client.1.vm05.stdout:1/730: dwrite d7/d27/f33 [0,4194304] 0 2026-03-09T16:15:21.286 INFO:tasks.workunit.client.1.vm05.stdout:1/731: dread - d7/daa/fef zero size 2026-03-09T16:15:21.287 INFO:tasks.workunit.client.1.vm05.stdout:8/697: creat d4/d6/db/dc/d5d/da0/dbf/feb x:0 0 0 2026-03-09T16:15:21.300 INFO:tasks.workunit.client.1.vm05.stdout:4/756: unlink d5/de/d15/fa3 0 2026-03-09T16:15:21.302 INFO:tasks.workunit.client.1.vm05.stdout:8/698: creat d4/d6/db/dc/fec x:0 0 0 2026-03-09T16:15:21.309 INFO:tasks.workunit.client.1.vm05.stdout:7/710: write d1/d2/d8/dc/d1b/d30/d4b/fe7 [3176361,2757] 0 2026-03-09T16:15:21.316 INFO:tasks.workunit.client.1.vm05.stdout:4/757: creat d5/de/d15/da9/db1/dad/d90/dd8/f117 x:0 0 0 2026-03-09T16:15:21.316 INFO:tasks.workunit.client.1.vm05.stdout:2/619: write db/dd/d15/d1f/d21/d87/f99 [934813,5522] 0 2026-03-09T16:15:21.316 INFO:tasks.workunit.client.1.vm05.stdout:0/657: write d5/d11/d4f/d68/f94 [1160158,103685] 0 2026-03-09T16:15:21.317 INFO:tasks.workunit.client.1.vm05.stdout:1/732: sync 2026-03-09T16:15:21.325 INFO:tasks.workunit.client.1.vm05.stdout:4/758: read d5/de/f9d [1939208,52117] 0 2026-03-09T16:15:21.326 INFO:tasks.workunit.client.1.vm05.stdout:7/711: rmdir d1 39 2026-03-09T16:15:21.326 INFO:tasks.workunit.client.1.vm05.stdout:8/699: dwrite d4/d6/db/df/f18 [0,4194304] 0 2026-03-09T16:15:21.327 INFO:tasks.workunit.client.1.vm05.stdout:0/658: read - d5/db/d5b/d82/fc7 zero size 2026-03-09T16:15:21.334 INFO:tasks.workunit.client.1.vm05.stdout:0/659: write d5/d2c/d49/d83/d8b/d95/f2e [3133638,48365] 0 2026-03-09T16:15:21.341 INFO:tasks.workunit.client.1.vm05.stdout:4/759: chown d5/de/d15/d21/d27/d3c/d5c/d5f/fdd 2 1 2026-03-09T16:15:21.343 INFO:tasks.workunit.client.1.vm05.stdout:2/620: creat db/dd/d15/d3f/d5b/d60/d95/fc9 x:0 0 0 2026-03-09T16:15:21.343 INFO:tasks.workunit.client.1.vm05.stdout:9/694: dwrite d4/d10/d35/d36/d48/d54/d59/fb6 [4194304,4194304] 0 2026-03-09T16:15:21.348 
INFO:tasks.workunit.client.1.vm05.stdout:5/703: fsync d8/dc8/fdb 0 2026-03-09T16:15:21.354 INFO:tasks.workunit.client.1.vm05.stdout:1/733: truncate d7/dd/de/d52/df6/d55/fb1 108411 0 2026-03-09T16:15:21.355 INFO:tasks.workunit.client.1.vm05.stdout:5/704: chown d8/d59/d5b/f66 1960287 1 2026-03-09T16:15:21.357 INFO:tasks.workunit.client.1.vm05.stdout:8/700: dwrite d4/d6/db/d59/fdb [0,4194304] 0 2026-03-09T16:15:21.358 INFO:tasks.workunit.client.1.vm05.stdout:3/605: dwrite d0/d9/d22/f18 [0,4194304] 0 2026-03-09T16:15:21.363 INFO:tasks.workunit.client.1.vm05.stdout:0/660: read d5/d2c/f41 [136455,27569] 0 2026-03-09T16:15:21.364 INFO:tasks.workunit.client.1.vm05.stdout:8/701: chown d4/d6/db/df/fdc 62646 1 2026-03-09T16:15:21.372 INFO:tasks.workunit.client.1.vm05.stdout:6/653: dwrite d17/d4f/f70 [0,4194304] 0 2026-03-09T16:15:21.378 INFO:tasks.workunit.client.1.vm05.stdout:7/712: creat d1/d2/d11/d86/d8a/d91/ffe x:0 0 0 2026-03-09T16:15:21.381 INFO:tasks.workunit.client.1.vm05.stdout:9/695: dread d4/d10/d35/d36/fb3 [0,4194304] 0 2026-03-09T16:15:21.381 INFO:tasks.workunit.client.1.vm05.stdout:2/621: dread db/dd/d15/d1f/d20/d23/d78/f92 [0,4194304] 0 2026-03-09T16:15:21.381 INFO:tasks.workunit.client.1.vm05.stdout:2/622: write db/dd/d15/d1f/d20/d23/d78/f92 [229632,44855] 0 2026-03-09T16:15:21.383 INFO:tasks.workunit.client.1.vm05.stdout:3/606: sync 2026-03-09T16:15:21.387 INFO:tasks.workunit.client.1.vm05.stdout:9/696: sync 2026-03-09T16:15:21.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:21 vm03.local ceph-mon[51019]: pgmap v20: 65 pgs: 65 active+clean; 2.8 GiB data, 9.3 GiB used, 111 GiB / 120 GiB avail; 42 MiB/s rd, 128 MiB/s wr, 260 op/s 2026-03-09T16:15:21.392 INFO:tasks.workunit.client.1.vm05.stdout:6/654: symlink d17/d22/dce/lf2 0 2026-03-09T16:15:21.395 INFO:tasks.workunit.client.1.vm05.stdout:6/655: dwrite d17/d22/d27/fd7 [0,4194304] 0 2026-03-09T16:15:21.417 INFO:tasks.workunit.client.1.vm05.stdout:6/656: dread d17/d22/d27/d34/d4b/d7f/fc8 [0,4194304] 0 2026-03-09T16:15:21.418 INFO:tasks.workunit.client.1.vm05.stdout:6/657: truncate d17/d1d/fe1 750676 0 2026-03-09T16:15:21.420 INFO:tasks.workunit.client.1.vm05.stdout:1/734: truncate d7/dd/f1f 3146315 0 2026-03-09T16:15:21.424 INFO:tasks.workunit.client.1.vm05.stdout:1/735: write d7/d27/f84 [2643423,60398] 0 2026-03-09T16:15:21.424 INFO:tasks.workunit.client.1.vm05.stdout:0/661: dread d5/db/f54 [0,4194304] 0 2026-03-09T16:15:21.427 INFO:tasks.workunit.client.1.vm05.stdout:4/760: dread d5/de/f9d [0,4194304] 0 2026-03-09T16:15:21.431 INFO:tasks.workunit.client.1.vm05.stdout:9/697: creat d4/d10/d35/d2b/fef x:0 0 0 2026-03-09T16:15:21.438 INFO:tasks.workunit.client.1.vm05.stdout:5/705: creat d8/ffa x:0 0 0 2026-03-09T16:15:21.438 INFO:tasks.workunit.client.1.vm05.stdout:6/658: creat d17/d22/d27/d58/db8/ff3 x:0 0 0 2026-03-09T16:15:21.440 INFO:tasks.workunit.client.1.vm05.stdout:8/702: creat d4/d6/db/fed x:0 0 0 2026-03-09T16:15:21.441 INFO:tasks.workunit.client.1.vm05.stdout:8/703: write d4/d6/db/d59/fdb [862948,32260] 0 2026-03-09T16:15:21.442 INFO:tasks.workunit.client.1.vm05.stdout:0/662: chown d5/d11/d4f/d68/caa 98102273 1 2026-03-09T16:15:21.445 INFO:tasks.workunit.client.1.vm05.stdout:4/761: mknod d5/c118 0 2026-03-09T16:15:21.445 INFO:tasks.workunit.client.1.vm05.stdout:0/663: stat d5/d1b/d3b/dc2/fe1 0 2026-03-09T16:15:21.445 INFO:tasks.workunit.client.1.vm05.stdout:8/704: read d4/d6/db/dc/fa2 [5082485,73076] 0 2026-03-09T16:15:21.448 INFO:tasks.workunit.client.1.vm05.stdout:0/664: chown d5/d2c/d49/d83/d8b/daf/lba 498 1 
2026-03-09T16:15:21.453 INFO:tasks.workunit.client.1.vm05.stdout:4/762: dwrite d5/de/d15/d21/d27/d3c/f3d [0,4194304] 0 2026-03-09T16:15:21.454 INFO:tasks.workunit.client.1.vm05.stdout:1/736: dwrite d7/f34 [0,4194304] 0 2026-03-09T16:15:21.463 INFO:tasks.workunit.client.1.vm05.stdout:1/737: sync 2026-03-09T16:15:21.464 INFO:tasks.workunit.client.1.vm05.stdout:2/623: creat db/dd/fca x:0 0 0 2026-03-09T16:15:21.464 INFO:tasks.workunit.client.1.vm05.stdout:3/607: creat d0/d9/d22/d5f/d90/fcb x:0 0 0 2026-03-09T16:15:21.465 INFO:tasks.workunit.client.1.vm05.stdout:8/705: fdatasync d4/d6/d53/f5a 0 2026-03-09T16:15:21.465 INFO:tasks.workunit.client.1.vm05.stdout:3/608: chown d0/d9/c71 2910080 1 2026-03-09T16:15:21.470 INFO:tasks.workunit.client.1.vm05.stdout:3/609: readlink d0/d9/l25 0 2026-03-09T16:15:21.472 INFO:tasks.workunit.client.1.vm05.stdout:1/738: mkdir d7/dd/de/d52/df6/dfc 0 2026-03-09T16:15:21.472 INFO:tasks.workunit.client.1.vm05.stdout:3/610: write d0/d33/f7d [379319,8721] 0 2026-03-09T16:15:21.476 INFO:tasks.workunit.client.1.vm05.stdout:2/624: fdatasync db/dd/d15/d1f/d20/d23/d78/f92 0 2026-03-09T16:15:21.477 INFO:tasks.workunit.client.1.vm05.stdout:2/625: truncate db/fa4 86522 0 2026-03-09T16:15:21.481 INFO:tasks.workunit.client.1.vm05.stdout:2/626: write db/dd/d15/d1f/d20/d23/fbb [1186177,24030] 0 2026-03-09T16:15:21.486 INFO:tasks.workunit.client.1.vm05.stdout:8/706: dread d4/d6/db/dc/f2a [0,4194304] 0 2026-03-09T16:15:21.498 INFO:tasks.workunit.client.1.vm05.stdout:4/763: mkdir d5/de/d15/d21/da0/de3/d100/d119 0 2026-03-09T16:15:21.499 INFO:tasks.workunit.client.1.vm05.stdout:4/764: fsync d5/f2d 0 2026-03-09T16:15:21.504 INFO:tasks.workunit.client.1.vm05.stdout:9/698: rename d4/d10/d35/c4d to d4/d10/d35/d2b/d38/cf0 0 2026-03-09T16:15:21.512 INFO:tasks.workunit.client.1.vm05.stdout:1/739: symlink d7/dd/de/lfd 0 2026-03-09T16:15:21.525 INFO:tasks.workunit.client.1.vm05.stdout:3/611: mknod d0/d9/d8b/ccc 0 2026-03-09T16:15:21.525 INFO:tasks.workunit.client.1.vm05.stdout:3/612: stat d0/f57 0 2026-03-09T16:15:21.525 INFO:tasks.workunit.client.1.vm05.stdout:4/765: creat d5/d9c/dbd/f11a x:0 0 0 2026-03-09T16:15:21.525 INFO:tasks.workunit.client.1.vm05.stdout:4/766: chown d5/de/d15/da9/db1/dad/d90/f105 39248 1 2026-03-09T16:15:21.525 INFO:tasks.workunit.client.1.vm05.stdout:4/767: read - d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/f86 zero size 2026-03-09T16:15:21.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:21 vm05.local ceph-mon[58702]: pgmap v20: 65 pgs: 65 active+clean; 2.8 GiB data, 9.3 GiB used, 111 GiB / 120 GiB avail; 42 MiB/s rd, 128 MiB/s wr, 260 op/s 2026-03-09T16:15:21.528 INFO:tasks.workunit.client.1.vm05.stdout:8/707: rename d4/d6/d3a/d15/f22 to d4/d6/db/dc/d2e/d85/dc7/dda/fee 0 2026-03-09T16:15:21.529 INFO:tasks.workunit.client.1.vm05.stdout:4/768: stat d5/de/d15/d21/d27/d3c/d5c/fe0 0 2026-03-09T16:15:21.529 INFO:tasks.workunit.client.1.vm05.stdout:4/769: chown f0 639867 1 2026-03-09T16:15:21.530 INFO:tasks.workunit.client.1.vm05.stdout:9/699: fsync d4/d10/d35/d2b/d38/d65/f6a 0 2026-03-09T16:15:21.538 INFO:tasks.workunit.client.1.vm05.stdout:3/613: rename d0/c58 to d0/d9/d22/d5f/d90/ccd 0 2026-03-09T16:15:21.539 INFO:tasks.workunit.client.1.vm05.stdout:8/708: rename d4/d6/db/dc to d4/d6/db/dc/def 22 2026-03-09T16:15:21.539 INFO:tasks.workunit.client.1.vm05.stdout:4/770: rename d5/de to d5/de/d15/da9/df6/d11b 22 2026-03-09T16:15:21.541 INFO:tasks.workunit.client.1.vm05.stdout:4/771: readlink d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/l75 0 2026-03-09T16:15:21.545 
INFO:tasks.workunit.client.1.vm05.stdout:6/659: rmdir d17/d22/dce 39 2026-03-09T16:15:21.546 INFO:tasks.workunit.client.1.vm05.stdout:7/713: write d1/d2/d8/d67/d76/fc3 [680118,48315] 0 2026-03-09T16:15:21.547 INFO:tasks.workunit.client.1.vm05.stdout:5/706: write d8/d53/d7a/fc7 [1761449,124009] 0 2026-03-09T16:15:21.548 INFO:tasks.workunit.client.1.vm05.stdout:6/660: dread - d17/d5d/fef zero size 2026-03-09T16:15:21.549 INFO:tasks.workunit.client.1.vm05.stdout:5/707: fdatasync d8/d18/d1b/d47/d4e/fe1 0 2026-03-09T16:15:21.552 INFO:tasks.workunit.client.1.vm05.stdout:9/700: dread d4/d10/d35/d2b/d38/f5e [0,4194304] 0 2026-03-09T16:15:21.556 INFO:tasks.workunit.client.1.vm05.stdout:0/665: dwrite d5/d1b/f25 [0,4194304] 0 2026-03-09T16:15:21.560 INFO:tasks.workunit.client.1.vm05.stdout:5/708: dwrite d8/d59/d5b/f66 [8388608,4194304] 0 2026-03-09T16:15:21.581 INFO:tasks.workunit.client.1.vm05.stdout:2/627: creat db/dd/d15/d4c/fcb x:0 0 0 2026-03-09T16:15:21.581 INFO:tasks.workunit.client.1.vm05.stdout:2/628: readlink db/dd/d15/d46/lc2 0 2026-03-09T16:15:21.583 INFO:tasks.workunit.client.1.vm05.stdout:1/740: mkdir d7/d15/d45/dee/dfe 0 2026-03-09T16:15:21.584 INFO:tasks.workunit.client.1.vm05.stdout:8/709: rmdir d4/d6/db/dc/d5d 39 2026-03-09T16:15:21.584 INFO:tasks.workunit.client.1.vm05.stdout:3/614: chown d0/d9/d22/d5f/d7b/da8 171365781 1 2026-03-09T16:15:21.585 INFO:tasks.workunit.client.1.vm05.stdout:4/772: creat d5/de/d15/d21/d27/d3c/d5c/da2/dc9/f11c x:0 0 0 2026-03-09T16:15:21.591 INFO:tasks.workunit.client.1.vm05.stdout:0/666: creat d5/db/d5b/d82/fe5 x:0 0 0 2026-03-09T16:15:21.595 INFO:tasks.workunit.client.1.vm05.stdout:4/773: dread d5/d9c/f109 [0,4194304] 0 2026-03-09T16:15:21.595 INFO:tasks.workunit.client.1.vm05.stdout:1/741: dwrite d7/d15/d45/f67 [0,4194304] 0 2026-03-09T16:15:21.603 INFO:tasks.workunit.client.1.vm05.stdout:8/710: sync 2026-03-09T16:15:21.604 INFO:tasks.workunit.client.1.vm05.stdout:8/711: chown d4/d6/d3a/f49 2142675288 1 2026-03-09T16:15:21.605 INFO:tasks.workunit.client.1.vm05.stdout:5/709: mkdir d8/d18/d1b/d47/d48/d73/dfb 0 2026-03-09T16:15:21.607 INFO:tasks.workunit.client.1.vm05.stdout:2/629: creat db/dd/d98/fcc x:0 0 0 2026-03-09T16:15:21.609 INFO:tasks.workunit.client.1.vm05.stdout:2/630: write db/dd/d15/d1f/d20/d23/f9b [3443301,40306] 0 2026-03-09T16:15:21.610 INFO:tasks.workunit.client.1.vm05.stdout:5/710: dwrite d8/d1d/f21 [0,4194304] 0 2026-03-09T16:15:21.611 INFO:tasks.workunit.client.1.vm05.stdout:2/631: stat db/dd/d15/d3f/d5b/f97 0 2026-03-09T16:15:21.612 INFO:tasks.workunit.client.1.vm05.stdout:2/632: chown db/dd/d15/d3f/d5b/d60 148 1 2026-03-09T16:15:21.631 INFO:tasks.workunit.client.1.vm05.stdout:0/667: rmdir d5/d1b/d30 39 2026-03-09T16:15:21.632 INFO:tasks.workunit.client.1.vm05.stdout:0/668: truncate d5/d1b/fce 370137 0 2026-03-09T16:15:21.635 INFO:tasks.workunit.client.1.vm05.stdout:4/774: mknod d5/de/d15/d21/d27/d3c/d5c/da2/c11d 0 2026-03-09T16:15:21.636 INFO:tasks.workunit.client.1.vm05.stdout:4/775: write d5/de/d15/d21/d39/f42 [130548,106242] 0 2026-03-09T16:15:21.637 INFO:tasks.workunit.client.1.vm05.stdout:4/776: write d5/de/d15/da9/db1/dad/d37/f115 [691086,126110] 0 2026-03-09T16:15:21.638 INFO:tasks.workunit.client.1.vm05.stdout:4/777: chown d5/de/d15/d21/dfe/f10f 126124 1 2026-03-09T16:15:21.641 INFO:tasks.workunit.client.1.vm05.stdout:4/778: chown d5/de/d15/d21/d27/d3c/d5c/da2/c11d 0 1 2026-03-09T16:15:21.644 INFO:tasks.workunit.client.1.vm05.stdout:8/712: unlink d4/d6/d3a/d40/f4e 0 2026-03-09T16:15:21.646 
INFO:tasks.workunit.client.1.vm05.stdout:7/714: truncate d1/d2/d8/d31/fc5 839131 0 2026-03-09T16:15:21.654 INFO:tasks.workunit.client.1.vm05.stdout:3/615: mkdir d0/dce 0 2026-03-09T16:15:21.661 INFO:tasks.workunit.client.1.vm05.stdout:2/633: mkdir db/dd/d15/d46/d8d/dcd 0 2026-03-09T16:15:21.665 INFO:tasks.workunit.client.1.vm05.stdout:9/701: creat d4/d10/d35/ff1 x:0 0 0 2026-03-09T16:15:21.676 INFO:tasks.workunit.client.1.vm05.stdout:1/742: getdents d7/dd/de/d52/df6/db4 0 2026-03-09T16:15:21.681 INFO:tasks.workunit.client.1.vm05.stdout:8/713: rename d4/d6/db/dc/f17 to d4/d6/d3a/d67/ff0 0 2026-03-09T16:15:21.681 INFO:tasks.workunit.client.1.vm05.stdout:8/714: dread - d4/d6/d3a/d15/f65 zero size 2026-03-09T16:15:21.692 INFO:tasks.workunit.client.1.vm05.stdout:7/715: dwrite d1/d2/d8/dc/d1b/d30/d7d/fa5 [0,4194304] 0 2026-03-09T16:15:21.693 INFO:tasks.workunit.client.1.vm05.stdout:7/716: chown d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/ddb 1 1 2026-03-09T16:15:21.694 INFO:tasks.workunit.client.1.vm05.stdout:7/717: stat d1/d2/d8/dc/d1b/d30/d4b/l90 0 2026-03-09T16:15:21.695 INFO:tasks.workunit.client.1.vm05.stdout:3/616: dwrite d0/d9/d22/d5f/d75/d76/d88/f9c [0,4194304] 0 2026-03-09T16:15:21.697 INFO:tasks.workunit.client.1.vm05.stdout:7/718: stat d1/d2/d11/c12 0 2026-03-09T16:15:21.697 INFO:tasks.workunit.client.1.vm05.stdout:7/719: chown d1/d2/d8/dc/d1b/d71/ccf 405 1 2026-03-09T16:15:21.698 INFO:tasks.workunit.client.1.vm05.stdout:6/661: getdents d17/d22/d27/d34/d42 0 2026-03-09T16:15:21.701 INFO:tasks.workunit.client.1.vm05.stdout:9/702: dwrite d4/d10/d35/d2b/d38/fa0 [0,4194304] 0 2026-03-09T16:15:21.715 INFO:tasks.workunit.client.1.vm05.stdout:0/669: unlink d5/db/f12 0 2026-03-09T16:15:21.715 INFO:tasks.workunit.client.1.vm05.stdout:1/743: creat d7/d62/db6/fff x:0 0 0 2026-03-09T16:15:21.731 INFO:tasks.workunit.client.1.vm05.stdout:2/634: rename db/dd/d15/d3f/d5b/d60/d6a/l72 to db/dd/d15/d3f/d5b/d60/d95/lce 0 2026-03-09T16:15:21.743 INFO:tasks.workunit.client.1.vm05.stdout:3/617: creat d0/d9/d22/d5f/d90/dae/fcf x:0 0 0 2026-03-09T16:15:21.743 INFO:tasks.workunit.client.1.vm05.stdout:6/662: creat d17/d22/d27/d34/d4b/d7f/ff4 x:0 0 0 2026-03-09T16:15:21.744 INFO:tasks.workunit.client.1.vm05.stdout:6/663: truncate d17/d22/d27/d34/d4b/f6d 1233007 0 2026-03-09T16:15:21.749 INFO:tasks.workunit.client.1.vm05.stdout:7/720: chown d1/d2/d8/dc/d1b/d30/d4b/db2/cf6 3352450 1 2026-03-09T16:15:21.754 INFO:tasks.workunit.client.1.vm05.stdout:4/779: creat d5/de/d15/d21/f11e x:0 0 0 2026-03-09T16:15:21.757 INFO:tasks.workunit.client.1.vm05.stdout:1/744: write d7/dd/d21/d63/d71/ddc/df8/fc5 [660842,55429] 0 2026-03-09T16:15:21.758 INFO:tasks.workunit.client.1.vm05.stdout:4/780: stat d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d103/ff5 0 2026-03-09T16:15:21.760 INFO:tasks.workunit.client.1.vm05.stdout:5/711: getdents d8/d59 0 2026-03-09T16:15:21.761 INFO:tasks.workunit.client.1.vm05.stdout:6/664: creat d17/d22/d9d/da5/ff5 x:0 0 0 2026-03-09T16:15:21.765 INFO:tasks.workunit.client.1.vm05.stdout:7/721: sync 2026-03-09T16:15:21.766 INFO:tasks.workunit.client.1.vm05.stdout:5/712: write d8/d59/d5b/d8b/da0/feb [3559391,74322] 0 2026-03-09T16:15:21.766 INFO:tasks.workunit.client.1.vm05.stdout:0/670: fsync d5/d1b/d30/f55 0 2026-03-09T16:15:21.767 INFO:tasks.workunit.client.1.vm05.stdout:1/745: creat d7/dd/de/d52/df6/f100 x:0 0 0 2026-03-09T16:15:21.767 INFO:tasks.workunit.client.1.vm05.stdout:2/635: link db/dd/d15/d1f/d21/d87/fbe db/dd/d15/d46/d8d/fcf 0 2026-03-09T16:15:21.768 INFO:tasks.workunit.client.1.vm05.stdout:2/636: readlink db/dd/l6c 
0 2026-03-09T16:15:21.768 INFO:tasks.workunit.client.1.vm05.stdout:1/746: chown d7/dd/d21/d39/d87/db9/cd3 117 1 2026-03-09T16:15:21.769 INFO:tasks.workunit.client.1.vm05.stdout:4/781: creat d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d7b/f11f x:0 0 0 2026-03-09T16:15:21.770 INFO:tasks.workunit.client.1.vm05.stdout:6/665: fsync d17/d22/d27/d34/d42/d53/d9f/fa0 0 2026-03-09T16:15:21.771 INFO:tasks.workunit.client.1.vm05.stdout:1/747: sync 2026-03-09T16:15:21.772 INFO:tasks.workunit.client.1.vm05.stdout:5/713: write d8/f7b [4784812,79272] 0 2026-03-09T16:15:21.773 INFO:tasks.workunit.client.1.vm05.stdout:9/703: rmdir d4/d10/d35/d36/d48/d60/ddc 0 2026-03-09T16:15:21.774 INFO:tasks.workunit.client.1.vm05.stdout:1/748: fdatasync d7/dd/d21/f2b 0 2026-03-09T16:15:21.774 INFO:tasks.workunit.client.1.vm05.stdout:1/749: stat d7/dd/d21/d39/fa4 0 2026-03-09T16:15:21.784 INFO:tasks.workunit.client.1.vm05.stdout:7/722: dread d1/d2/f5 [0,4194304] 0 2026-03-09T16:15:21.784 INFO:tasks.workunit.client.1.vm05.stdout:2/637: dwrite db/dd/d15/d1f/d21/f29 [0,4194304] 0 2026-03-09T16:15:21.787 INFO:tasks.workunit.client.1.vm05.stdout:3/618: rmdir d0/d9/d22/d5f/d90 39 2026-03-09T16:15:21.787 INFO:tasks.workunit.client.1.vm05.stdout:8/715: getdents d4/d6/db/dc/d2e/d85/dc7/dda 0 2026-03-09T16:15:21.795 INFO:tasks.workunit.client.1.vm05.stdout:2/638: chown db/f17 629 1 2026-03-09T16:15:21.799 INFO:tasks.workunit.client.1.vm05.stdout:9/704: dread - d4/d10/d35/d2b/d31/d96/fa7 zero size 2026-03-09T16:15:21.800 INFO:tasks.workunit.client.1.vm05.stdout:1/750: mkdir d7/d27/d101 0 2026-03-09T16:15:21.802 INFO:tasks.workunit.client.1.vm05.stdout:1/751: fdatasync d7/daa/fef 0 2026-03-09T16:15:21.803 INFO:tasks.workunit.client.1.vm05.stdout:7/723: write d1/d2/d8/dc/f1a [2396986,28353] 0 2026-03-09T16:15:21.805 INFO:tasks.workunit.client.1.vm05.stdout:3/619: dread d0/f56 [0,4194304] 0 2026-03-09T16:15:21.806 INFO:tasks.workunit.client.1.vm05.stdout:2/639: symlink db/dd/d15/d46/d67/ld0 0 2026-03-09T16:15:21.807 INFO:tasks.workunit.client.1.vm05.stdout:4/782: mkdir d5/de/d15/d120 0 2026-03-09T16:15:21.808 INFO:tasks.workunit.client.1.vm05.stdout:4/783: chown d5/de/d15/d21/d27/lfa 50 1 2026-03-09T16:15:21.809 INFO:tasks.workunit.client.1.vm05.stdout:3/620: sync 2026-03-09T16:15:21.812 INFO:tasks.workunit.client.1.vm05.stdout:1/752: symlink d7/d15/d6e/dbc/dd6/l102 0 2026-03-09T16:15:21.814 INFO:tasks.workunit.client.1.vm05.stdout:7/724: rename d1/d2/d8/dc/d14/cce to d1/d2/d8/dc/d1b/d30/d4b/db2/cff 0 2026-03-09T16:15:21.816 INFO:tasks.workunit.client.1.vm05.stdout:0/671: link d5/db/cb3 d5/db/d5f/da3/ce6 0 2026-03-09T16:15:21.816 INFO:tasks.workunit.client.1.vm05.stdout:4/784: write d5/de/d15/d21/d27/d3c/d5c/f107 [547167,58858] 0 2026-03-09T16:15:21.822 INFO:tasks.workunit.client.1.vm05.stdout:5/714: creat d8/d53/ffc x:0 0 0 2026-03-09T16:15:21.822 INFO:tasks.workunit.client.1.vm05.stdout:5/715: sync 2026-03-09T16:15:21.825 INFO:tasks.workunit.client.1.vm05.stdout:4/785: truncate d5/de/d82/fbe 847481 0 2026-03-09T16:15:21.826 INFO:tasks.workunit.client.1.vm05.stdout:5/716: chown d8/d53/l69 3620 1 2026-03-09T16:15:21.827 INFO:tasks.workunit.client.1.vm05.stdout:9/705: dread d4/d10/d35/d36/f49 [0,4194304] 0 2026-03-09T16:15:21.832 INFO:tasks.workunit.client.1.vm05.stdout:1/753: mkdir d7/dd/d21/d39/d48/d8c/dd8/d103 0 2026-03-09T16:15:21.833 INFO:tasks.workunit.client.1.vm05.stdout:7/725: fsync d1/d2/d11/d86/fc9 0 2026-03-09T16:15:21.833 INFO:tasks.workunit.client.1.vm05.stdout:1/754: chown d7/d15/d16/dc2 5 1 2026-03-09T16:15:21.836 
INFO:tasks.workunit.client.1.vm05.stdout:1/755: write d7/d27/f84 [2527346,3657] 0 2026-03-09T16:15:21.839 INFO:tasks.workunit.client.1.vm05.stdout:6/666: getdents d17/d22/d27/d58/db8 0 2026-03-09T16:15:21.839 INFO:tasks.workunit.client.1.vm05.stdout:6/667: readlink d17/d22/d27/d34/led 0 2026-03-09T16:15:21.840 INFO:tasks.workunit.client.1.vm05.stdout:5/717: symlink d8/d5e/lfd 0 2026-03-09T16:15:21.844 INFO:tasks.workunit.client.1.vm05.stdout:4/786: creat d5/de/d15/d21/d39/d91/f121 x:0 0 0 2026-03-09T16:15:21.846 INFO:tasks.workunit.client.1.vm05.stdout:5/718: dread d8/d18/dbc/dcc/f94 [0,4194304] 0 2026-03-09T16:15:21.847 INFO:tasks.workunit.client.1.vm05.stdout:4/787: stat d5/de/d15/da9/db1/dad/d37/d60/f62 0 2026-03-09T16:15:21.848 INFO:tasks.workunit.client.1.vm05.stdout:1/756: dwrite d7/dd/d21/d63/d71/ddc/df8/fc5 [0,4194304] 0 2026-03-09T16:15:21.849 INFO:tasks.workunit.client.1.vm05.stdout:3/621: mknod d0/d9/d22/d5f/d7b/da8/cd0 0 2026-03-09T16:15:21.849 INFO:tasks.workunit.client.1.vm05.stdout:4/788: stat d5/de/d15/d21/dfe 0 2026-03-09T16:15:21.854 INFO:tasks.workunit.client.1.vm05.stdout:6/668: dwrite d17/d22/d9d/da5/ff5 [0,4194304] 0 2026-03-09T16:15:21.854 INFO:tasks.workunit.client.1.vm05.stdout:4/789: fsync d5/de/d82/fcd 0 2026-03-09T16:15:21.858 INFO:tasks.workunit.client.1.vm05.stdout:5/719: dread d8/d59/d5b/d8b/da0/feb [0,4194304] 0 2026-03-09T16:15:21.865 INFO:tasks.workunit.client.1.vm05.stdout:8/716: getdents d4/d6/d3a/d3c 0 2026-03-09T16:15:21.870 INFO:tasks.workunit.client.1.vm05.stdout:9/706: dwrite d4/f3c [0,4194304] 0 2026-03-09T16:15:21.870 INFO:tasks.workunit.client.1.vm05.stdout:8/717: chown d4/d6/d3a/d3c/l74 10070726 1 2026-03-09T16:15:21.870 INFO:tasks.workunit.client.1.vm05.stdout:9/707: dread - d4/d10/d35/d2b/d31/d96/fa7 zero size 2026-03-09T16:15:21.878 INFO:tasks.workunit.client.1.vm05.stdout:2/640: rename db/dd/d15/d1f/l4b to db/dd/d15/d1f/d20/d23/d78/ld1 0 2026-03-09T16:15:21.878 INFO:tasks.workunit.client.1.vm05.stdout:7/726: mkdir d1/d2/d8/dc/d33/d100 0 2026-03-09T16:15:21.882 INFO:tasks.workunit.client.1.vm05.stdout:8/718: unlink d4/d6/db/df/c57 0 2026-03-09T16:15:21.882 INFO:tasks.workunit.client.1.vm05.stdout:5/720: creat d8/d59/d5b/d8b/ffe x:0 0 0 2026-03-09T16:15:21.890 INFO:tasks.workunit.client.1.vm05.stdout:4/790: symlink d5/de/d15/da9/df6/l122 0 2026-03-09T16:15:21.890 INFO:tasks.workunit.client.1.vm05.stdout:2/641: creat db/dd/d15/d46/fd2 x:0 0 0 2026-03-09T16:15:21.891 INFO:tasks.workunit.client.1.vm05.stdout:3/622: creat d0/d9/d22/d5f/fd1 x:0 0 0 2026-03-09T16:15:21.891 INFO:tasks.workunit.client.1.vm05.stdout:6/669: dread d17/f31 [0,4194304] 0 2026-03-09T16:15:21.891 INFO:tasks.workunit.client.1.vm05.stdout:9/708: creat d4/d10/d35/d2b/d38/ff2 x:0 0 0 2026-03-09T16:15:21.892 INFO:tasks.workunit.client.1.vm05.stdout:8/719: dread d4/d6/db/d59/f60 [0,4194304] 0 2026-03-09T16:15:21.895 INFO:tasks.workunit.client.1.vm05.stdout:7/727: sync 2026-03-09T16:15:21.900 INFO:tasks.workunit.client.1.vm05.stdout:7/728: creat d1/d2/d8/d31/f101 x:0 0 0 2026-03-09T16:15:21.900 INFO:tasks.workunit.client.1.vm05.stdout:8/720: truncate d4/d6/f9 8851173 0 2026-03-09T16:15:21.903 INFO:tasks.workunit.client.1.vm05.stdout:5/721: getdents d8/d18/d1b/d47/d4e/d76/d8f 0 2026-03-09T16:15:21.905 INFO:tasks.workunit.client.1.vm05.stdout:6/670: fsync d17/f18 0 2026-03-09T16:15:21.905 INFO:tasks.workunit.client.1.vm05.stdout:4/791: creat d5/de/d15/d21/d27/d3c/d5c/f123 x:0 0 0 2026-03-09T16:15:21.905 INFO:tasks.workunit.client.1.vm05.stdout:8/721: dread d4/d6/db/d59/f60 
[0,4194304] 0 2026-03-09T16:15:21.909 INFO:tasks.workunit.client.1.vm05.stdout:3/623: truncate d0/d9/d22/d5f/d90/fcb 920761 0 2026-03-09T16:15:21.910 INFO:tasks.workunit.client.1.vm05.stdout:5/722: dread - d8/d18/d1b/d6b/f93 zero size 2026-03-09T16:15:21.915 INFO:tasks.workunit.client.1.vm05.stdout:5/723: chown d8/lf5 93535 1 2026-03-09T16:15:21.917 INFO:tasks.workunit.client.1.vm05.stdout:9/709: dwrite d4/f6 [0,4194304] 0 2026-03-09T16:15:21.917 INFO:tasks.workunit.client.1.vm05.stdout:7/729: fsync d1/d2/d8/dc/d1b/d30/f93 0 2026-03-09T16:15:21.918 INFO:tasks.workunit.client.1.vm05.stdout:0/672: write d5/d9e/fa2 [834804,96483] 0 2026-03-09T16:15:21.918 INFO:tasks.workunit.client.1.vm05.stdout:1/757: write d7/dd/de/f56 [1719173,94502] 0 2026-03-09T16:15:21.933 INFO:tasks.workunit.client.1.vm05.stdout:2/642: link db/dd/d15/c18 db/dd/cd3 0 2026-03-09T16:15:21.933 INFO:tasks.workunit.client.1.vm05.stdout:8/722: unlink d4/l5 0 2026-03-09T16:15:21.934 INFO:tasks.workunit.client.1.vm05.stdout:5/724: dwrite d8/d59/d5b/d8b/da0/fc1 [0,4194304] 0 2026-03-09T16:15:21.942 INFO:tasks.workunit.client.1.vm05.stdout:9/710: sync 2026-03-09T16:15:21.943 INFO:tasks.workunit.client.1.vm05.stdout:9/711: write d4/d10/f18 [6317832,102301] 0 2026-03-09T16:15:21.947 INFO:tasks.workunit.client.1.vm05.stdout:3/624: dread d0/d9/d22/d5f/d75/d76/fa5 [0,4194304] 0 2026-03-09T16:15:21.948 INFO:tasks.workunit.client.1.vm05.stdout:3/625: write d0/d9/d8b/fc7 [1006104,59650] 0 2026-03-09T16:15:21.950 INFO:tasks.workunit.client.1.vm05.stdout:6/671: rename d17/d22/d27/d34/d42/d65/dbe to d17/d22/d27/d34/d42/d53/d87/df6 0 2026-03-09T16:15:21.951 INFO:tasks.workunit.client.1.vm05.stdout:0/673: chown d5/d2c/d49/d83/d8b/d95/f52 0 1 2026-03-09T16:15:21.960 INFO:tasks.workunit.client.1.vm05.stdout:7/730: read d1/d2/d8/dc/d1b/d71/d3c/feb [275710,118060] 0 2026-03-09T16:15:21.965 INFO:tasks.workunit.client.1.vm05.stdout:8/723: mknod d4/d6/db/df/d80/cf1 0 2026-03-09T16:15:21.968 INFO:tasks.workunit.client.1.vm05.stdout:4/792: write d5/de/d15/d21/f6d [579536,68605] 0 2026-03-09T16:15:21.975 INFO:tasks.workunit.client.1.vm05.stdout:9/712: symlink d4/d10/d35/d36/d48/d54/d59/lf3 0 2026-03-09T16:15:21.975 INFO:tasks.workunit.client.1.vm05.stdout:3/626: mkdir d0/d9/d22/d5f/d90/dae/dd2 0 2026-03-09T16:15:21.975 INFO:tasks.workunit.client.1.vm05.stdout:0/674: mknod d5/d2c/d49/ce7 0 2026-03-09T16:15:21.980 INFO:tasks.workunit.client.1.vm05.stdout:8/724: creat d4/d6/db/df/d4f/d9f/ff2 x:0 0 0 2026-03-09T16:15:21.980 INFO:tasks.workunit.client.1.vm05.stdout:5/725: dread d8/d53/d7a/f92 [0,4194304] 0 2026-03-09T16:15:21.981 INFO:tasks.workunit.client.1.vm05.stdout:4/793: dwrite d5/fce [0,4194304] 0 2026-03-09T16:15:21.983 INFO:tasks.workunit.client.1.vm05.stdout:8/725: write d4/d6/db/dc/f30 [3188800,79280] 0 2026-03-09T16:15:21.989 INFO:tasks.workunit.client.1.vm05.stdout:7/731: creat d1/d2/d8/dc/d1b/d30/d4b/db2/de9/f102 x:0 0 0 2026-03-09T16:15:21.990 INFO:tasks.workunit.client.1.vm05.stdout:0/675: mkdir d5/d2c/d49/d83/d8b/daf/de8 0 2026-03-09T16:15:21.994 INFO:tasks.workunit.client.1.vm05.stdout:3/627: mknod d0/d9/d22/d5f/d90/cd3 0 2026-03-09T16:15:22.000 INFO:tasks.workunit.client.1.vm05.stdout:2/643: creat db/dd/d15/fd4 x:0 0 0 2026-03-09T16:15:22.003 INFO:tasks.workunit.client.1.vm05.stdout:1/758: link d7/dd/d21/d39/d5a/l73 d7/d15/d6e/dbc/dd6/l104 0 2026-03-09T16:15:22.004 INFO:tasks.workunit.client.1.vm05.stdout:6/672: creat d17/d22/d27/ff7 x:0 0 0 2026-03-09T16:15:22.006 INFO:tasks.workunit.client.1.vm05.stdout:6/673: fdatasync f16 0 
2026-03-09T16:15:22.006 INFO:tasks.workunit.client.1.vm05.stdout:1/759: write d7/d27/f84 [3199117,77964] 0 2026-03-09T16:15:22.006 INFO:tasks.workunit.client.1.vm05.stdout:7/732: sync 2026-03-09T16:15:22.015 INFO:tasks.workunit.client.1.vm05.stdout:9/713: write d4/d10/d35/d2b/d38/f78 [2239144,33763] 0 2026-03-09T16:15:22.024 INFO:tasks.workunit.client.1.vm05.stdout:4/794: dread d5/fd [0,4194304] 0 2026-03-09T16:15:22.027 INFO:tasks.workunit.client.1.vm05.stdout:8/726: fsync f0 0 2026-03-09T16:15:22.027 INFO:tasks.workunit.client.1.vm05.stdout:3/628: creat d0/d9/d97/fd4 x:0 0 0 2026-03-09T16:15:22.033 INFO:tasks.workunit.client.1.vm05.stdout:8/727: read - d4/d6/d3a/d3c/fe5 zero size 2026-03-09T16:15:22.033 INFO:tasks.workunit.client.1.vm05.stdout:3/629: chown d0/d33/l9f 173438 1 2026-03-09T16:15:22.034 INFO:tasks.workunit.client.1.vm05.stdout:4/795: chown d5/de/d15/d21/d27/d3c/d5c/da2/cb2 398477 1 2026-03-09T16:15:22.035 INFO:tasks.workunit.client.1.vm05.stdout:1/760: dwrite d7/d62/fd2 [0,4194304] 0 2026-03-09T16:15:22.035 INFO:tasks.workunit.client.1.vm05.stdout:7/733: fsync d1/d2/f22 0 2026-03-09T16:15:22.038 INFO:tasks.workunit.client.1.vm05.stdout:1/761: fdatasync d7/d15/f8d 0 2026-03-09T16:15:22.045 INFO:tasks.workunit.client.1.vm05.stdout:9/714: dwrite d4/d10/d35/d36/d48/d54/db0/fc6 [0,4194304] 0 2026-03-09T16:15:22.046 INFO:tasks.workunit.client.1.vm05.stdout:0/676: mkdir d5/db/d5b/de9 0 2026-03-09T16:15:22.046 INFO:tasks.workunit.client.1.vm05.stdout:2/644: fdatasync db/dd/d15/d1f/f9c 0 2026-03-09T16:15:22.047 INFO:tasks.workunit.client.1.vm05.stdout:5/726: creat d8/d18/fff x:0 0 0 2026-03-09T16:15:22.047 INFO:tasks.workunit.client.1.vm05.stdout:1/762: chown d7/dd/d21/d63/d71/la8 6 1 2026-03-09T16:15:22.050 INFO:tasks.workunit.client.1.vm05.stdout:5/727: chown d8/d18/dbc/dcc/daa/cf6 417 1 2026-03-09T16:15:22.050 INFO:tasks.workunit.client.1.vm05.stdout:2/645: chown db/dd/f1b 487 1 2026-03-09T16:15:22.051 INFO:tasks.workunit.client.1.vm05.stdout:4/796: dwrite d5/de/d15/d21/d27/f2c [0,4194304] 0 2026-03-09T16:15:22.053 INFO:tasks.workunit.client.1.vm05.stdout:5/728: chown d8/d59/d75 102 1 2026-03-09T16:15:22.063 INFO:tasks.workunit.client.1.vm05.stdout:6/674: mkdir d17/d22/d27/df8 0 2026-03-09T16:15:22.072 INFO:tasks.workunit.client.1.vm05.stdout:3/630: creat d0/d9/d22/d5f/d75/d76/d88/d89/fd5 x:0 0 0 2026-03-09T16:15:22.076 INFO:tasks.workunit.client.1.vm05.stdout:7/734: creat d1/d2/d8/dc/d1b/d30/d7d/f103 x:0 0 0 2026-03-09T16:15:22.076 INFO:tasks.workunit.client.1.vm05.stdout:0/677: fdatasync d5/f8 0 2026-03-09T16:15:22.077 INFO:tasks.workunit.client.1.vm05.stdout:9/715: mknod d4/d10/d35/d2b/d31/d96/cf4 0 2026-03-09T16:15:22.077 INFO:tasks.workunit.client.1.vm05.stdout:5/729: dwrite d8/ffa [0,4194304] 0 2026-03-09T16:15:22.093 INFO:tasks.workunit.client.1.vm05.stdout:4/797: rmdir d5/d9c/dbd 39 2026-03-09T16:15:22.094 INFO:tasks.workunit.client.1.vm05.stdout:7/735: unlink d1/d2/d8/dc/d1b/d30/d4b/fcc 0 2026-03-09T16:15:22.094 INFO:tasks.workunit.client.1.vm05.stdout:0/678: fdatasync d5/d11/d4f/d70/fd7 0 2026-03-09T16:15:22.094 INFO:tasks.workunit.client.1.vm05.stdout:5/730: dread d8/ffa [0,4194304] 0 2026-03-09T16:15:22.095 INFO:tasks.workunit.client.1.vm05.stdout:1/763: link d7/dd/d21/d63/d71/fd0 d7/dd/d21/d39/d48/da7/db5/f105 0 2026-03-09T16:15:22.096 INFO:tasks.workunit.client.1.vm05.stdout:9/716: creat d4/d10/d35/d2b/d31/d96/ff5 x:0 0 0 2026-03-09T16:15:22.096 INFO:tasks.workunit.client.1.vm05.stdout:7/736: stat d1/d2/d8/dc/d1b/d71/f97 0 2026-03-09T16:15:22.097 
INFO:tasks.workunit.client.1.vm05.stdout:2/646: mknod db/dd/d15/d4c/cd5 0 2026-03-09T16:15:22.099 INFO:tasks.workunit.client.1.vm05.stdout:3/631: symlink d0/d9/d22/d5f/d90/dae/dd2/ld6 0 2026-03-09T16:15:22.108 INFO:tasks.workunit.client.1.vm05.stdout:2/647: write db/dd/fb8 [18608,56963] 0 2026-03-09T16:15:22.112 INFO:tasks.workunit.client.1.vm05.stdout:5/731: dread d8/d18/d1b/d47/d48/d73/d80/fe5 [0,4194304] 0 2026-03-09T16:15:22.112 INFO:tasks.workunit.client.1.vm05.stdout:6/675: rename d17/d22/d27/d34/d42/fda to d17/ff9 0 2026-03-09T16:15:22.114 INFO:tasks.workunit.client.1.vm05.stdout:9/717: creat d4/d10/d35/d36/d48/d60/dae/ff6 x:0 0 0 2026-03-09T16:15:22.118 INFO:tasks.workunit.client.1.vm05.stdout:7/737: chown d1/d2/d8/dc/d1b/d30/d4b/fdf 12 1 2026-03-09T16:15:22.118 INFO:tasks.workunit.client.1.vm05.stdout:5/732: readlink d8/d18/dbc/dcc/daa/l4a 0 2026-03-09T16:15:22.119 INFO:tasks.workunit.client.1.vm05.stdout:7/738: dread - d1/d2/d8/d67/d76/ff1 zero size 2026-03-09T16:15:22.121 INFO:tasks.workunit.client.1.vm05.stdout:3/632: truncate d0/d9/f73 599705 0 2026-03-09T16:15:22.123 INFO:tasks.workunit.client.1.vm05.stdout:0/679: creat d5/d1b/d3b/dc2/fea x:0 0 0 2026-03-09T16:15:22.124 INFO:tasks.workunit.client.1.vm05.stdout:4/798: mkdir d5/d9c/d124 0 2026-03-09T16:15:22.133 INFO:tasks.workunit.client.1.vm05.stdout:5/733: creat d8/d5e/d8e/f100 x:0 0 0 2026-03-09T16:15:22.137 INFO:tasks.workunit.client.1.vm05.stdout:8/728: dwrite d4/d6/d3a/f49 [0,4194304] 0 2026-03-09T16:15:22.137 INFO:tasks.workunit.client.1.vm05.stdout:9/718: rmdir d4/d10/d35/d2b 39 2026-03-09T16:15:22.139 INFO:tasks.workunit.client.1.vm05.stdout:8/729: stat d4/d6/d53/f6c 0 2026-03-09T16:15:22.142 INFO:tasks.workunit.client.1.vm05.stdout:1/764: write d7/dd/de/d52/f58 [2439427,39225] 0 2026-03-09T16:15:22.142 INFO:tasks.workunit.client.1.vm05.stdout:3/633: fdatasync d0/d9/d22/d5f/d7b/d99/f9d 0 2026-03-09T16:15:22.153 INFO:tasks.workunit.client.1.vm05.stdout:0/680: mkdir d5/db/d5f/deb 0 2026-03-09T16:15:22.160 INFO:tasks.workunit.client.1.vm05.stdout:2/648: rename db/dd/d15/d1f/d21/c5e to db/dd/d15/d3f/d5b/d7e/cd6 0 2026-03-09T16:15:22.160 INFO:tasks.workunit.client.1.vm05.stdout:2/649: stat db/dd/d15/d3f/d5b/d60/d95 0 2026-03-09T16:15:22.164 INFO:tasks.workunit.client.1.vm05.stdout:6/676: mknod d17/d22/cfa 0 2026-03-09T16:15:22.164 INFO:tasks.workunit.client.1.vm05.stdout:2/650: dread db/dd/d15/d1f/d21/d87/f99 [0,4194304] 0 2026-03-09T16:15:22.180 INFO:tasks.workunit.client.1.vm05.stdout:5/734: mknod d8/d5e/d8e/c101 0 2026-03-09T16:15:22.190 INFO:tasks.workunit.client.1.vm05.stdout:7/739: symlink d1/d2/d11/d86/da2/dfa/l104 0 2026-03-09T16:15:22.190 INFO:tasks.workunit.client.1.vm05.stdout:9/719: chown d4/d10/d35/d2b/d38/d65 414 1 2026-03-09T16:15:22.191 INFO:tasks.workunit.client.1.vm05.stdout:3/634: fsync d0/d9/d22/d5f/f66 0 2026-03-09T16:15:22.197 INFO:tasks.workunit.client.1.vm05.stdout:1/765: symlink d7/dd/de/l106 0 2026-03-09T16:15:22.197 INFO:tasks.workunit.client.1.vm05.stdout:0/681: creat d5/d2c/d49/d83/fec x:0 0 0 2026-03-09T16:15:22.198 INFO:tasks.workunit.client.1.vm05.stdout:4/799: dwrite d5/de/d15/d21/d27/d3c/d5c/da2/fee [0,4194304] 0 2026-03-09T16:15:22.225 INFO:tasks.workunit.client.1.vm05.stdout:5/735: stat d8/d18/c27 0 2026-03-09T16:15:22.225 INFO:tasks.workunit.client.1.vm05.stdout:5/736: fsync d8/d18/dbc/dcc/daa/f35 0 2026-03-09T16:15:22.228 INFO:tasks.workunit.client.1.vm05.stdout:7/740: creat d1/d2/d8/d31/f105 x:0 0 0 2026-03-09T16:15:22.228 INFO:tasks.workunit.client.1.vm05.stdout:5/737: write 
d8/d18/dbc/dcc/daa/d43/f41 [3159149,49817] 0 2026-03-09T16:15:22.230 INFO:tasks.workunit.client.1.vm05.stdout:3/635: write d0/f56 [2942310,83996] 0 2026-03-09T16:15:22.235 INFO:tasks.workunit.client.1.vm05.stdout:3/636: write d0/d9/d22/d5f/d7b/f9a [623052,119782] 0 2026-03-09T16:15:22.238 INFO:tasks.workunit.client.1.vm05.stdout:5/738: write d8/d18/dbc/dcc/daa/f3c [1336188,103523] 0 2026-03-09T16:15:22.241 INFO:tasks.workunit.client.1.vm05.stdout:6/677: write d17/d22/d27/d34/d42/d53/f90 [1876535,74591] 0 2026-03-09T16:15:22.244 INFO:tasks.workunit.client.1.vm05.stdout:0/682: mkdir d5/d9e/ded 0 2026-03-09T16:15:22.245 INFO:tasks.workunit.client.1.vm05.stdout:0/683: write d5/d11/d4f/d70/fbf [2176437,114934] 0 2026-03-09T16:15:22.250 INFO:tasks.workunit.client.1.vm05.stdout:7/741: unlink d1/d2/d8/dc/d1b/d71/d3c/dd3/cf0 0 2026-03-09T16:15:22.251 INFO:tasks.workunit.client.1.vm05.stdout:3/637: dwrite d0/d9/f2f [0,4194304] 0 2026-03-09T16:15:22.258 INFO:tasks.workunit.client.1.vm05.stdout:1/766: mkdir d7/dd/d21/d39/d48/d8c/dd8/d103/d107 0 2026-03-09T16:15:22.263 INFO:tasks.workunit.client.1.vm05.stdout:8/730: rename d4/c35 to d4/cf3 0 2026-03-09T16:15:22.263 INFO:tasks.workunit.client.1.vm05.stdout:8/731: chown d4/d6/d3a/d40/d6a/d97 550044 1 2026-03-09T16:15:22.264 INFO:tasks.workunit.client.1.vm05.stdout:3/638: mknod d0/d9/d22/d4c/d80/cd7 0 2026-03-09T16:15:22.264 INFO:tasks.workunit.client.1.vm05.stdout:8/732: read - d4/d6/fc0 zero size 2026-03-09T16:15:22.268 INFO:tasks.workunit.client.1.vm05.stdout:2/651: getdents db/dd/d15/d1f 0 2026-03-09T16:15:22.272 INFO:tasks.workunit.client.1.vm05.stdout:4/800: getdents d5/de/d15/d21 0 2026-03-09T16:15:22.274 INFO:tasks.workunit.client.1.vm05.stdout:3/639: mkdir d0/d9/d22/d5f/d7b/da8/dd8 0 2026-03-09T16:15:22.274 INFO:tasks.workunit.client.1.vm05.stdout:7/742: creat d1/d2/d8/dc/d1b/de6/f106 x:0 0 0 2026-03-09T16:15:22.274 INFO:tasks.workunit.client.1.vm05.stdout:5/739: getdents d8/d18/d1b/d78/d90 0 2026-03-09T16:15:22.276 INFO:tasks.workunit.client.1.vm05.stdout:3/640: dread - d0/d9/d22/d4c/d80/fbf zero size 2026-03-09T16:15:22.276 INFO:tasks.workunit.client.1.vm05.stdout:5/740: chown d8/f6f 11212 1 2026-03-09T16:15:22.277 INFO:tasks.workunit.client.1.vm05.stdout:5/741: chown d8/d18/d1b/d47/d68/l6d 59 1 2026-03-09T16:15:22.278 INFO:tasks.workunit.client.1.vm05.stdout:0/684: getdents d5/d11 0 2026-03-09T16:15:22.279 INFO:tasks.workunit.client.1.vm05.stdout:7/743: read - d1/d2/d8/d67/f99 zero size 2026-03-09T16:15:22.279 INFO:tasks.workunit.client.1.vm05.stdout:3/641: truncate d0/d9/d22/d5f/d90/dae/fb1 609477 0 2026-03-09T16:15:22.284 INFO:tasks.workunit.client.1.vm05.stdout:0/685: chown d5/d1b/d30/f55 2675077 1 2026-03-09T16:15:22.289 INFO:tasks.workunit.client.1.vm05.stdout:6/678: link d17/d22/dce/lf2 d17/d22/d27/d44/lfb 0 2026-03-09T16:15:22.289 INFO:tasks.workunit.client.1.vm05.stdout:0/686: truncate d5/d1b/d3b/f6f 5227831 0 2026-03-09T16:15:22.290 INFO:tasks.workunit.client.1.vm05.stdout:3/642: mknod d0/d9/d22/d5f/d7b/da8/cd9 0 2026-03-09T16:15:22.290 INFO:tasks.workunit.client.1.vm05.stdout:0/687: fsync d5/fd4 0 2026-03-09T16:15:22.290 INFO:tasks.workunit.client.1.vm05.stdout:7/744: dwrite d1/d2/d8/d31/d8d/f56 [0,4194304] 0 2026-03-09T16:15:22.292 INFO:tasks.workunit.client.1.vm05.stdout:2/652: getdents db/dd/d15/d4c 0 2026-03-09T16:15:22.294 INFO:tasks.workunit.client.1.vm05.stdout:4/801: rename d5/de/d15/d21/d27/d3c/d5c/da2/fee to d5/d9c/f125 0 2026-03-09T16:15:22.295 INFO:tasks.workunit.client.1.vm05.stdout:1/767: dread d7/dd/d21/d63/d71/f7b 
[0,4194304] 0 2026-03-09T16:15:22.295 INFO:tasks.workunit.client.1.vm05.stdout:5/742: creat d8/d53/f102 x:0 0 0 2026-03-09T16:15:22.296 INFO:tasks.workunit.client.1.vm05.stdout:9/720: dwrite d4/d10/d35/d2b/d31/d96/ddd/fe8 [0,4194304] 0 2026-03-09T16:15:22.305 INFO:tasks.workunit.client.1.vm05.stdout:9/721: read - d4/d10/d35/d36/d48/d60/dae/fc9 zero size 2026-03-09T16:15:22.314 INFO:tasks.workunit.client.1.vm05.stdout:8/733: write f0 [1246624,104869] 0 2026-03-09T16:15:22.327 INFO:tasks.workunit.client.1.vm05.stdout:0/688: fdatasync d5/db/d5b/d82/f89 0 2026-03-09T16:15:22.327 INFO:tasks.workunit.client.1.vm05.stdout:3/643: read d0/d9/d22/f5b [7305634,48827] 0 2026-03-09T16:15:22.327 INFO:tasks.workunit.client.1.vm05.stdout:3/644: readlink d0/d9/d22/d4c/d4e/l59 0 2026-03-09T16:15:22.327 INFO:tasks.workunit.client.1.vm05.stdout:3/645: fsync d0/f45 0 2026-03-09T16:15:22.327 INFO:tasks.workunit.client.1.vm05.stdout:3/646: readlink d0/d9/l3e 0 2026-03-09T16:15:22.327 INFO:tasks.workunit.client.1.vm05.stdout:2/653: dwrite db/dd/d15/d3f/d5b/f69 [0,4194304] 0 2026-03-09T16:15:22.339 INFO:tasks.workunit.client.1.vm05.stdout:7/745: mkdir d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/ddb/d107 0 2026-03-09T16:15:22.339 INFO:tasks.workunit.client.1.vm05.stdout:1/768: rename d7/dd/d21/f2b to d7/dd/d21/d2d/f108 0 2026-03-09T16:15:22.350 INFO:tasks.workunit.client.1.vm05.stdout:7/746: dwrite d1/d2/d8/dc/d1b/f62 [0,4194304] 0 2026-03-09T16:15:22.354 INFO:tasks.workunit.client.1.vm05.stdout:4/802: symlink d5/de/d15/da9/db1/dad/d90/dd8/l126 0 2026-03-09T16:15:22.357 INFO:tasks.workunit.client.1.vm05.stdout:9/722: truncate d4/d10/d35/d36/fb3 1324462 0 2026-03-09T16:15:22.363 INFO:tasks.workunit.client.1.vm05.stdout:6/679: dwrite d17/d22/d27/d34/d4b/d7f/fc8 [0,4194304] 0 2026-03-09T16:15:22.366 INFO:tasks.workunit.client.1.vm05.stdout:9/723: dread d4/d10/d35/fc3 [0,4194304] 0 2026-03-09T16:15:22.371 INFO:tasks.workunit.client.1.vm05.stdout:8/734: rmdir d4/d6 39 2026-03-09T16:15:22.371 INFO:tasks.workunit.client.1.vm05.stdout:9/724: chown d4/d10/c46 12742247 1 2026-03-09T16:15:22.372 INFO:tasks.workunit.client.1.vm05.stdout:0/689: mknod d5/db/d5f/cee 0 2026-03-09T16:15:22.373 INFO:tasks.workunit.client.1.vm05.stdout:3/647: mknod d0/d9/d22/d5f/d75/d76/d88/da3/cda 0 2026-03-09T16:15:22.374 INFO:tasks.workunit.client.1.vm05.stdout:1/769: dread d7/dd/de/d52/f58 [0,4194304] 0 2026-03-09T16:15:22.376 INFO:tasks.workunit.client.1.vm05.stdout:2/654: mkdir db/dd/d15/d3f/d5b/d60/d95/dd7 0 2026-03-09T16:15:22.381 INFO:tasks.workunit.client.1.vm05.stdout:1/770: write d7/dd/d21/d39/d48/f59 [1804068,27666] 0 2026-03-09T16:15:22.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:22 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:22.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:22 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:22.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:22 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:22.392 INFO:tasks.workunit.client.1.vm05.stdout:5/743: creat d8/d18/d1b/d47/d48/d73/d80/de4/f103 x:0 0 0 2026-03-09T16:15:22.395 INFO:tasks.workunit.client.1.vm05.stdout:4/803: write d5/de/d15/f74 [643787,92426] 0 2026-03-09T16:15:22.400 INFO:tasks.workunit.client.1.vm05.stdout:1/771: dwrite d7/f3f [0,4194304] 0 
2026-03-09T16:15:22.411 INFO:tasks.workunit.client.1.vm05.stdout:4/804: dwrite d5/de/d2f/f99 [0,4194304] 0 2026-03-09T16:15:22.413 INFO:tasks.workunit.client.1.vm05.stdout:5/744: sync 2026-03-09T16:15:22.414 INFO:tasks.workunit.client.1.vm05.stdout:6/680: creat d17/d22/d27/d34/d42/d53/d9f/ffc x:0 0 0 2026-03-09T16:15:22.415 INFO:tasks.workunit.client.1.vm05.stdout:8/735: fdatasync d4/d6/db/df/d80/fb9 0 2026-03-09T16:15:22.415 INFO:tasks.workunit.client.1.vm05.stdout:9/725: rmdir d4/d10/d35/d2b/d31 39 2026-03-09T16:15:22.417 INFO:tasks.workunit.client.1.vm05.stdout:9/726: write d4/f4a [6877673,6565] 0 2026-03-09T16:15:22.423 INFO:tasks.workunit.client.1.vm05.stdout:4/805: mknod d5/de/d15/d21/d27/d3c/d5c/da2/dc9/c127 0 2026-03-09T16:15:22.424 INFO:tasks.workunit.client.1.vm05.stdout:4/806: dread - d5/de/d2f/f78 zero size 2026-03-09T16:15:22.426 INFO:tasks.workunit.client.1.vm05.stdout:5/745: mknod d8/d18/d1b/d47/d48/c104 0 2026-03-09T16:15:22.427 INFO:tasks.workunit.client.1.vm05.stdout:7/747: dwrite d1/f26 [0,4194304] 0 2026-03-09T16:15:22.427 INFO:tasks.workunit.client.1.vm05.stdout:5/746: write d8/dc8/ff3 [993846,84274] 0 2026-03-09T16:15:22.430 INFO:tasks.workunit.client.1.vm05.stdout:7/748: sync 2026-03-09T16:15:22.431 INFO:tasks.workunit.client.1.vm05.stdout:7/749: write d1/d2/d11/d86/d8a/d91/ffe [739464,44296] 0 2026-03-09T16:15:22.434 INFO:tasks.workunit.client.1.vm05.stdout:7/750: write d1/d2/d11/d86/da2/fb0 [740375,56842] 0 2026-03-09T16:15:22.452 INFO:tasks.workunit.client.1.vm05.stdout:6/681: creat d17/d22/d27/d34/d42/d53/ffd x:0 0 0 2026-03-09T16:15:22.457 INFO:tasks.workunit.client.1.vm05.stdout:2/655: unlink db/c1e 0 2026-03-09T16:15:22.466 INFO:tasks.workunit.client.1.vm05.stdout:0/690: mkdir d5/db/def 0 2026-03-09T16:15:22.467 INFO:tasks.workunit.client.1.vm05.stdout:9/727: mknod d4/d10/d35/d36/d48/d54/db0/cf7 0 2026-03-09T16:15:22.467 INFO:tasks.workunit.client.1.vm05.stdout:1/772: rename d7/d62/d72/f79 to d7/dd/de/f109 0 2026-03-09T16:15:22.468 INFO:tasks.workunit.client.1.vm05.stdout:4/807: mkdir d5/de/d15/d21/dfe/d128 0 2026-03-09T16:15:22.470 INFO:tasks.workunit.client.1.vm05.stdout:5/747: symlink d8/d59/d5b/d8b/l105 0 2026-03-09T16:15:22.471 INFO:tasks.workunit.client.1.vm05.stdout:7/751: dread - d1/d2/d8/dc/d33/fb5 zero size 2026-03-09T16:15:22.474 INFO:tasks.workunit.client.1.vm05.stdout:6/682: mknod d17/d22/d27/d34/d42/cfe 0 2026-03-09T16:15:22.474 INFO:tasks.workunit.client.1.vm05.stdout:2/656: symlink db/dd/d15/d4c/ld8 0 2026-03-09T16:15:22.479 INFO:tasks.workunit.client.1.vm05.stdout:0/691: dread d5/d11/f1e [0,4194304] 0 2026-03-09T16:15:22.480 INFO:tasks.workunit.client.1.vm05.stdout:4/808: symlink d5/de/d15/d21/d39/d91/de9/l129 0 2026-03-09T16:15:22.484 INFO:tasks.workunit.client.1.vm05.stdout:8/736: write d4/d6/d3a/d40/f76 [518786,99011] 0 2026-03-09T16:15:22.486 INFO:tasks.workunit.client.1.vm05.stdout:3/648: getdents d0 0 2026-03-09T16:15:22.488 INFO:tasks.workunit.client.1.vm05.stdout:3/649: chown d0/d9/d22/d4c/d4e/db3/lbd 511141 1 2026-03-09T16:15:22.489 INFO:tasks.workunit.client.1.vm05.stdout:6/683: rename d17/d22/d9d/ld6 to d17/d22/d9d/db4/lff 0 2026-03-09T16:15:22.489 INFO:tasks.workunit.client.1.vm05.stdout:9/728: mkdir d4/d10/d35/d2b/d31/d82/df8 0 2026-03-09T16:15:22.491 INFO:tasks.workunit.client.1.vm05.stdout:6/684: sync 2026-03-09T16:15:22.493 INFO:tasks.workunit.client.1.vm05.stdout:7/752: write d1/d2/d8/dc/d1b/d30/d4b/fdf [798154,87588] 0 2026-03-09T16:15:22.495 INFO:tasks.workunit.client.1.vm05.stdout:4/809: creat d5/de/d15/da9/df6/f12a x:0 0 0 
2026-03-09T16:15:22.498 INFO:tasks.workunit.client.1.vm05.stdout:5/748: getdents d8/d18/d1b/d47/d4e/d76/d8f/df9 0 2026-03-09T16:15:22.501 INFO:tasks.workunit.client.1.vm05.stdout:3/650: mknod d0/d9/d97/dad/cdb 0 2026-03-09T16:15:22.503 INFO:tasks.workunit.client.1.vm05.stdout:3/651: dread d0/d9/d22/d5f/d90/dae/fb1 [0,4194304] 0 2026-03-09T16:15:22.507 INFO:tasks.workunit.client.1.vm05.stdout:7/753: truncate d1/d2/d11/d86/f96 1142469 0 2026-03-09T16:15:22.510 INFO:tasks.workunit.client.1.vm05.stdout:5/749: symlink d8/d18/dbc/dcc/daa/d43/l106 0 2026-03-09T16:15:22.511 INFO:tasks.workunit.client.1.vm05.stdout:5/750: dread - d8/d53/d7a/f84 zero size 2026-03-09T16:15:22.511 INFO:tasks.workunit.client.1.vm05.stdout:5/751: chown d8/d18/d1b/d47/d48/f61 75 1 2026-03-09T16:15:22.516 INFO:tasks.workunit.client.1.vm05.stdout:6/685: dwrite d17/f30 [4194304,4194304] 0 2026-03-09T16:15:22.520 INFO:tasks.workunit.client.1.vm05.stdout:8/737: truncate d4/d6/f9 3938041 0 2026-03-09T16:15:22.522 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:22 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:22.522 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:22 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:22.522 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:22 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:22.522 INFO:tasks.workunit.client.1.vm05.stdout:8/738: stat d4/d6/db/d59/d87 0 2026-03-09T16:15:22.526 INFO:tasks.workunit.client.1.vm05.stdout:2/657: rename db/f2d to db/dd/d15/d1f/d21/d87/fd9 0 2026-03-09T16:15:22.530 INFO:tasks.workunit.client.1.vm05.stdout:3/652: creat d0/d9/d22/d5f/d75/d76/d88/d89/fdc x:0 0 0 2026-03-09T16:15:22.552 INFO:tasks.workunit.client.1.vm05.stdout:7/754: dread - d1/d2/d8/dc/d1b/d30/d7d/fc1 zero size 2026-03-09T16:15:22.556 INFO:tasks.workunit.client.1.vm05.stdout:9/729: dwrite d4/d10/d35/d36/fb3 [0,4194304] 0 2026-03-09T16:15:22.570 INFO:tasks.workunit.client.1.vm05.stdout:6/686: unlink cd 0 2026-03-09T16:15:22.570 INFO:tasks.workunit.client.1.vm05.stdout:8/739: creat d4/d6/d3a/d15/ff4 x:0 0 0 2026-03-09T16:15:22.571 INFO:tasks.workunit.client.1.vm05.stdout:1/773: getdents d7/dbe/ded 0 2026-03-09T16:15:22.573 INFO:tasks.workunit.client.1.vm05.stdout:4/810: link d5/de/d15/da9/db1/dad/d90/cea d5/de/d15/da9/df6/c12b 0 2026-03-09T16:15:22.574 INFO:tasks.workunit.client.1.vm05.stdout:5/752: symlink d8/d18/d1b/d47/d48/d73/dfb/l107 0 2026-03-09T16:15:22.575 INFO:tasks.workunit.client.1.vm05.stdout:7/755: mkdir d1/d2/d8/dc/d1b/d30/d4b/d65/db1/d108 0 2026-03-09T16:15:22.576 INFO:tasks.workunit.client.1.vm05.stdout:9/730: mknod d4/d10/d35/d36/d48/d54/cf9 0 2026-03-09T16:15:22.577 INFO:tasks.workunit.client.1.vm05.stdout:0/692: link d5/d2c/d49/d83/d8b/d95/c2d d5/d2c/cf0 0 2026-03-09T16:15:22.577 INFO:tasks.workunit.client.1.vm05.stdout:2/658: chown db/dd/d15/d4c/d56/f62 2015 1 2026-03-09T16:15:22.577 INFO:tasks.workunit.client.1.vm05.stdout:9/731: readlink d4/d10/d35/d2b/d38/d65/dd6/de3/l5f 0 2026-03-09T16:15:22.577 INFO:tasks.workunit.client.1.vm05.stdout:3/653: symlink d0/d9/d22/d4c/d4e/ldd 0 2026-03-09T16:15:22.578 INFO:tasks.workunit.client.1.vm05.stdout:0/693: fsync d5/db/d5b/d82/fb7 0 2026-03-09T16:15:22.578 INFO:tasks.workunit.client.1.vm05.stdout:4/811: mknod d5/de/d15/da9/df6/c12c 0 
2026-03-09T16:15:22.579 INFO:tasks.workunit.client.1.vm05.stdout:2/659: write db/fa4 [148544,89384] 0 2026-03-09T16:15:22.579 INFO:tasks.workunit.client.1.vm05.stdout:0/694: fsync d5/f8 0 2026-03-09T16:15:22.583 INFO:tasks.workunit.client.1.vm05.stdout:1/774: creat d7/dd/d21/d39/d48/d8c/dd8/d103/f10a x:0 0 0 2026-03-09T16:15:22.588 INFO:tasks.workunit.client.1.vm05.stdout:0/695: dwrite d5/d11/d4f/d70/fbf [0,4194304] 0 2026-03-09T16:15:22.588 INFO:tasks.workunit.client.1.vm05.stdout:5/753: sync 2026-03-09T16:15:22.588 INFO:tasks.workunit.client.1.vm05.stdout:0/696: stat d5/d1b/d30/c9a 0 2026-03-09T16:15:22.588 INFO:tasks.workunit.client.1.vm05.stdout:3/654: creat d0/d9/d22/d5f/d90/fde x:0 0 0 2026-03-09T16:15:22.594 INFO:tasks.workunit.client.1.vm05.stdout:4/812: write d5/de/d15/d21/f6d [893926,68982] 0 2026-03-09T16:15:22.601 INFO:tasks.workunit.client.1.vm05.stdout:1/775: dread - d7/dd/d21/d39/d48/d8c/dd8/d103/f10a zero size 2026-03-09T16:15:22.602 INFO:tasks.workunit.client.1.vm05.stdout:7/756: creat d1/d2/d8/f109 x:0 0 0 2026-03-09T16:15:22.602 INFO:tasks.workunit.client.1.vm05.stdout:2/660: readlink db/dd/d15/d1f/d20/d23/d78/ld1 0 2026-03-09T16:15:22.603 INFO:tasks.workunit.client.1.vm05.stdout:6/687: rename d17/d22/d27/d8a/cdc to d17/d5d/c100 0 2026-03-09T16:15:22.604 INFO:tasks.workunit.client.1.vm05.stdout:0/697: creat d5/d11/d4f/d68/ff1 x:0 0 0 2026-03-09T16:15:22.607 INFO:tasks.workunit.client.1.vm05.stdout:1/776: readlink d7/d15/d6e/dbc/dd6/l104 0 2026-03-09T16:15:22.613 INFO:tasks.workunit.client.1.vm05.stdout:8/740: rename d4/d6/db/l61 to d4/d6/db/dc/d3b/lf5 0 2026-03-09T16:15:22.614 INFO:tasks.workunit.client.1.vm05.stdout:5/754: creat d8/d59/d5b/d8b/da0/de2/f108 x:0 0 0 2026-03-09T16:15:22.620 INFO:tasks.workunit.client.1.vm05.stdout:5/755: chown d8/d18/d1b/d47/d68/l6d 1163 1 2026-03-09T16:15:22.620 INFO:tasks.workunit.client.1.vm05.stdout:3/655: dwrite d0/d9/f2c [0,4194304] 0 2026-03-09T16:15:22.620 INFO:tasks.workunit.client.1.vm05.stdout:0/698: mkdir d5/db/def/df2 0 2026-03-09T16:15:22.620 INFO:tasks.workunit.client.1.vm05.stdout:8/741: fdatasync d4/d6/d3a/d40/d71/fe2 0 2026-03-09T16:15:22.622 INFO:tasks.workunit.client.1.vm05.stdout:0/699: chown d5/d2c/fde 49049 1 2026-03-09T16:15:22.624 INFO:tasks.workunit.client.1.vm05.stdout:3/656: stat d0/d9/f2b 0 2026-03-09T16:15:22.625 INFO:tasks.workunit.client.1.vm05.stdout:3/657: dread - d0/d33/f63 zero size 2026-03-09T16:15:22.630 INFO:tasks.workunit.client.1.vm05.stdout:5/756: creat d8/d18/d1b/d47/d48/d73/dfb/f109 x:0 0 0 2026-03-09T16:15:22.630 INFO:tasks.workunit.client.1.vm05.stdout:1/777: truncate d7/fb 3520339 0 2026-03-09T16:15:22.642 INFO:tasks.workunit.client.1.vm05.stdout:9/732: write d4/d10/d35/d2b/d38/f5e [1927109,74665] 0 2026-03-09T16:15:22.648 INFO:tasks.workunit.client.1.vm05.stdout:6/688: dread d17/f2d [0,4194304] 0 2026-03-09T16:15:22.652 INFO:tasks.workunit.client.1.vm05.stdout:6/689: write d17/d22/d9d/fe8 [1493639,111850] 0 2026-03-09T16:15:22.652 INFO:tasks.workunit.client.1.vm05.stdout:3/658: symlink d0/d9/d22/d5f/d7b/d99/ldf 0 2026-03-09T16:15:22.652 INFO:tasks.workunit.client.1.vm05.stdout:8/742: rename d4/d6/db/d59/fcb to d4/d6/db/d59/db0/ff6 0 2026-03-09T16:15:22.652 INFO:tasks.workunit.client.1.vm05.stdout:4/813: dread d5/de/d82/fbe [0,4194304] 0 2026-03-09T16:15:22.652 INFO:tasks.workunit.client.1.vm05.stdout:9/733: write d4/d10/d35/d36/d48/f9e [463939,94815] 0 2026-03-09T16:15:22.654 INFO:tasks.workunit.client.1.vm05.stdout:1/778: creat d7/dd/d21/d63/d71/ddc/df8/f10b x:0 0 0 2026-03-09T16:15:22.654 
INFO:tasks.workunit.client.1.vm05.stdout:2/661: link db/dd/d15/cb3 db/dd/d15/d3f/d5b/d60/cda 0 2026-03-09T16:15:22.656 INFO:tasks.workunit.client.1.vm05.stdout:6/690: read d17/d22/d27/d34/f47 [2791180,114462] 0 2026-03-09T16:15:22.661 INFO:tasks.workunit.client.1.vm05.stdout:2/662: write db/dd/d15/d3f/d5b/d60/f7c [3616366,40149] 0 2026-03-09T16:15:22.661 INFO:tasks.workunit.client.1.vm05.stdout:1/779: chown d7/dd/d21/d39/d48/d8c/dd8 390 1 2026-03-09T16:15:22.668 INFO:tasks.workunit.client.1.vm05.stdout:6/691: chown d17/d22/d9d/c76 881 1 2026-03-09T16:15:22.669 INFO:tasks.workunit.client.1.vm05.stdout:7/757: dwrite d1/d2/d11/d86/f96 [0,4194304] 0 2026-03-09T16:15:22.678 INFO:tasks.workunit.client.1.vm05.stdout:0/700: rmdir d5/db/d5b/d82/de0 0 2026-03-09T16:15:22.680 INFO:tasks.workunit.client.1.vm05.stdout:9/734: dread d4/d10/d35/d2b/d31/d96/ddd/fe8 [0,4194304] 0 2026-03-09T16:15:22.681 INFO:tasks.workunit.client.1.vm05.stdout:1/780: symlink d7/dd/d21/d39/d5a/l10c 0 2026-03-09T16:15:22.682 INFO:tasks.workunit.client.1.vm05.stdout:6/692: dread d17/f2d [0,4194304] 0 2026-03-09T16:15:22.689 INFO:tasks.workunit.client.1.vm05.stdout:2/663: dwrite db/dd/d15/d3f/d5b/d60/d95/fc9 [0,4194304] 0 2026-03-09T16:15:22.691 INFO:tasks.workunit.client.1.vm05.stdout:5/757: write d8/d18/d1b/d47/f4c [2127677,7449] 0 2026-03-09T16:15:22.691 INFO:tasks.workunit.client.1.vm05.stdout:6/693: fdatasync d17/d22/d27/d34/d42/d53/ffd 0 2026-03-09T16:15:22.692 INFO:tasks.workunit.client.1.vm05.stdout:4/814: dwrite d5/d9c/dbd/f11a [0,4194304] 0 2026-03-09T16:15:22.695 INFO:tasks.workunit.client.1.vm05.stdout:8/743: mknod d4/d6/db/d9b/cf7 0 2026-03-09T16:15:22.702 INFO:tasks.workunit.client.1.vm05.stdout:6/694: fsync d17/d22/d27/d44/f48 0 2026-03-09T16:15:22.702 INFO:tasks.workunit.client.1.vm05.stdout:6/695: fdatasync d17/f60 0 2026-03-09T16:15:22.707 INFO:tasks.workunit.client.1.vm05.stdout:9/735: mknod d4/d10/dd7/cfa 0 2026-03-09T16:15:22.708 INFO:tasks.workunit.client.1.vm05.stdout:1/781: creat d7/dd/f10d x:0 0 0 2026-03-09T16:15:22.720 INFO:tasks.workunit.client.1.vm05.stdout:8/744: rmdir d4/d6/d3a/d40/d6a/d97 39 2026-03-09T16:15:22.725 INFO:tasks.workunit.client.1.vm05.stdout:6/696: chown d17/d5d/c100 40 1 2026-03-09T16:15:22.728 INFO:tasks.workunit.client.1.vm05.stdout:8/745: write d4/d6/db/df/fdc [480146,19911] 0 2026-03-09T16:15:22.729 INFO:tasks.workunit.client.1.vm05.stdout:7/758: creat d1/d2/d8/dc/d1b/d30/d4b/d65/db1/d108/f10a x:0 0 0 2026-03-09T16:15:22.730 INFO:tasks.workunit.client.1.vm05.stdout:9/736: dwrite d4/f4a [0,4194304] 0 2026-03-09T16:15:22.733 INFO:tasks.workunit.client.1.vm05.stdout:3/659: dwrite d0/d9/d22/d5f/d7b/fb7 [0,4194304] 0 2026-03-09T16:15:22.739 INFO:tasks.workunit.client.1.vm05.stdout:0/701: dwrite d5/d11/f9f [0,4194304] 0 2026-03-09T16:15:22.741 INFO:tasks.workunit.client.1.vm05.stdout:7/759: read d1/d2/d11/d86/f96 [3538551,122430] 0 2026-03-09T16:15:22.750 INFO:tasks.workunit.client.1.vm05.stdout:7/760: fdatasync d1/d2/d8/dc/d1b/d30/d4b/d65/db1/fdc 0 2026-03-09T16:15:22.759 INFO:tasks.workunit.client.1.vm05.stdout:1/782: dwrite d7/dd/d21/d39/d87/fe2 [0,4194304] 0 2026-03-09T16:15:22.776 INFO:tasks.workunit.client.1.vm05.stdout:2/664: unlink db/dd/cd3 0 2026-03-09T16:15:22.783 INFO:tasks.workunit.client.1.vm05.stdout:9/737: readlink d4/d10/d35/d36/d48/d60/d94/la2 0 2026-03-09T16:15:22.792 INFO:tasks.workunit.client.1.vm05.stdout:3/660: fdatasync d0/f46 0 2026-03-09T16:15:22.793 INFO:tasks.workunit.client.1.vm05.stdout:5/758: dread d8/d18/d1b/f32 [0,4194304] 0 2026-03-09T16:15:22.793 
INFO:tasks.workunit.client.1.vm05.stdout:0/702: fdatasync d5/d11/d4f/d68/f94 0 2026-03-09T16:15:22.793 INFO:tasks.workunit.client.1.vm05.stdout:7/761: rmdir d1/d2/d8/dc/d9c 39 2026-03-09T16:15:22.793 INFO:tasks.workunit.client.1.vm05.stdout:3/661: stat d0/d9/d22/d5f/d75/d76/d88/d89/fdc 0 2026-03-09T16:15:22.796 INFO:tasks.workunit.client.1.vm05.stdout:0/703: dwrite d5/db/d5f/da3/fc6 [0,4194304] 0 2026-03-09T16:15:22.804 INFO:tasks.workunit.client.1.vm05.stdout:9/738: creat d4/d10/d35/d2b/d31/dc8/ffb x:0 0 0 2026-03-09T16:15:22.808 INFO:tasks.workunit.client.1.vm05.stdout:5/759: creat d8/d18/d1b/d47/dda/f10a x:0 0 0 2026-03-09T16:15:22.811 INFO:tasks.workunit.client.1.vm05.stdout:7/762: symlink d1/d2/d8/dc/d1b/d30/d4b/l10b 0 2026-03-09T16:15:22.818 INFO:tasks.workunit.client.1.vm05.stdout:1/783: symlink d7/dd/de/d52/df6/db4/l10e 0 2026-03-09T16:15:22.822 INFO:tasks.workunit.client.1.vm05.stdout:9/739: dread d4/d10/d35/d36/d48/f6e [0,4194304] 0 2026-03-09T16:15:22.830 INFO:tasks.workunit.client.1.vm05.stdout:3/662: mknod d0/d9/d22/d5f/d90/dae/ce0 0 2026-03-09T16:15:22.830 INFO:tasks.workunit.client.1.vm05.stdout:3/663: write d0/d33/f3a [199665,105066] 0 2026-03-09T16:15:22.830 INFO:tasks.workunit.client.1.vm05.stdout:7/763: dread d1/d2/d8/dc/d1b/d30/d4b/d65/db1/fe8 [0,4194304] 0 2026-03-09T16:15:22.830 INFO:tasks.workunit.client.1.vm05.stdout:4/815: getdents d5/de/d15/d21/d27 0 2026-03-09T16:15:22.832 INFO:tasks.workunit.client.1.vm05.stdout:3/664: dwrite d0/d9/d22/d5f/d90/fa2 [0,4194304] 0 2026-03-09T16:15:22.841 INFO:tasks.workunit.client.1.vm05.stdout:0/704: mkdir d5/db/d77/df3 0 2026-03-09T16:15:22.841 INFO:tasks.workunit.client.1.vm05.stdout:8/746: creat d4/ff8 x:0 0 0 2026-03-09T16:15:22.841 INFO:tasks.workunit.client.1.vm05.stdout:5/760: mkdir d8/d18/dbc/d10b 0 2026-03-09T16:15:22.846 INFO:tasks.workunit.client.1.vm05.stdout:9/740: truncate d4/d10/d35/d2b/d38/d65/dd6/de3/f93 883932 0 2026-03-09T16:15:22.848 INFO:tasks.workunit.client.1.vm05.stdout:7/764: creat d1/d2/d11/d86/da2/f10c x:0 0 0 2026-03-09T16:15:22.861 INFO:tasks.workunit.client.1.vm05.stdout:3/665: truncate d0/d9/d22/d5f/fa7 395351 0 2026-03-09T16:15:22.861 INFO:tasks.workunit.client.1.vm05.stdout:2/665: getdents db/dd/d15/d46/d67 0 2026-03-09T16:15:22.862 INFO:tasks.workunit.client.1.vm05.stdout:9/741: chown d4/d10/d35/cb 212 1 2026-03-09T16:15:22.862 INFO:tasks.workunit.client.1.vm05.stdout:9/742: chown d4/d10/dd7/cfa 429790394 1 2026-03-09T16:15:22.862 INFO:tasks.workunit.client.1.vm05.stdout:0/705: dread d5/d11/d4f/d70/fd7 [0,4194304] 0 2026-03-09T16:15:22.862 INFO:tasks.workunit.client.1.vm05.stdout:0/706: chown d5/d1b/d30/ca0 2087146 1 2026-03-09T16:15:22.862 INFO:tasks.workunit.client.1.vm05.stdout:2/666: chown db/dd/d15/d46/l5a 5743 1 2026-03-09T16:15:22.862 INFO:tasks.workunit.client.1.vm05.stdout:0/707: write d5/d1b/fbb [939884,32060] 0 2026-03-09T16:15:22.862 INFO:tasks.workunit.client.1.vm05.stdout:8/747: fdatasync d4/d6/f9 0 2026-03-09T16:15:22.862 INFO:tasks.workunit.client.1.vm05.stdout:2/667: write db/dd/d15/d4c/d56/f62 [1807628,83979] 0 2026-03-09T16:15:22.862 INFO:tasks.workunit.client.1.vm05.stdout:3/666: mkdir d0/d9/d97/dad/de1 0 2026-03-09T16:15:22.862 INFO:tasks.workunit.client.1.vm05.stdout:3/667: chown d0/d33/l3c 826363 1 2026-03-09T16:15:22.862 INFO:tasks.workunit.client.1.vm05.stdout:9/743: write d4/d10/d35/d2b/d31/d96/f97 [872413,53695] 0 2026-03-09T16:15:22.864 INFO:tasks.workunit.client.1.vm05.stdout:8/748: rmdir d4/d6/d3a/d40/d6a 39 2026-03-09T16:15:22.864 
INFO:tasks.workunit.client.1.vm05.stdout:4/816: getdents d5/de/d15/d21/d27/d3c/d5c/d5f/d4e 0 2026-03-09T16:15:22.865 INFO:tasks.workunit.client.1.vm05.stdout:2/668: symlink db/dd/d15/d3f/d5b/d60/d95/ldb 0 2026-03-09T16:15:22.866 INFO:tasks.workunit.client.1.vm05.stdout:7/765: rename d1/d2/d8/dc/d1b/d71/c5c to d1/d2/d11/c10d 0 2026-03-09T16:15:22.869 INFO:tasks.workunit.client.1.vm05.stdout:3/668: rename d0/d9/d22/d5f/d90/dae/dd2/ld6 to d0/d9/d22/d5f/d75/le2 0 2026-03-09T16:15:22.870 INFO:tasks.workunit.client.1.vm05.stdout:2/669: creat db/dd/d15/d1f/d20/d86/fdc x:0 0 0 2026-03-09T16:15:22.871 INFO:tasks.workunit.client.1.vm05.stdout:3/669: creat d0/da9/fe3 x:0 0 0 2026-03-09T16:15:22.873 INFO:tasks.workunit.client.1.vm05.stdout:9/744: rename d4/d10/ce6 to d4/d10/d35/d2b/cfc 0 2026-03-09T16:15:22.875 INFO:tasks.workunit.client.1.vm05.stdout:3/670: creat d0/d9/d22/d5f/d7b/da8/dd8/fe4 x:0 0 0 2026-03-09T16:15:22.876 INFO:tasks.workunit.client.1.vm05.stdout:9/745: dread - d4/d10/f75 zero size 2026-03-09T16:15:22.877 INFO:tasks.workunit.client.1.vm05.stdout:7/766: link d1/d2/d11/c78 d1/d2/d8/dc/d1b/d30/c10e 0 2026-03-09T16:15:22.878 INFO:tasks.workunit.client.1.vm05.stdout:7/767: write d1/f26 [4314684,11698] 0 2026-03-09T16:15:22.878 INFO:tasks.workunit.client.1.vm05.stdout:9/746: mknod d4/d10/d35/d2b/d31/d82/df8/cfd 0 2026-03-09T16:15:22.879 INFO:tasks.workunit.client.1.vm05.stdout:3/671: dread d0/d9/f73 [0,4194304] 0 2026-03-09T16:15:22.881 INFO:tasks.workunit.client.1.vm05.stdout:7/768: symlink d1/d2/d11/d86/d8a/l10f 0 2026-03-09T16:15:22.881 INFO:tasks.workunit.client.1.vm05.stdout:3/672: dread d0/d33/f85 [0,4194304] 0 2026-03-09T16:15:22.883 INFO:tasks.workunit.client.1.vm05.stdout:7/769: fdatasync d1/d2/d11/d86/d8a/fa3 0 2026-03-09T16:15:22.920 INFO:tasks.workunit.client.1.vm05.stdout:3/673: creat d0/d9/d22/d5f/d75/d76/d88/d89/fe5 x:0 0 0 2026-03-09T16:15:22.982 INFO:tasks.workunit.client.1.vm05.stdout:7/770: sync 2026-03-09T16:15:22.984 INFO:tasks.workunit.client.1.vm05.stdout:7/771: creat d1/d2/d8/dc/d1b/d30/d4b/db2/df3/f110 x:0 0 0 2026-03-09T16:15:22.990 INFO:tasks.workunit.client.1.vm05.stdout:7/772: symlink d1/d2/d8/dc/d1b/d30/d4b/db2/de9/l111 0 2026-03-09T16:15:22.991 INFO:tasks.workunit.client.1.vm05.stdout:6/697: write d17/d1d/fa8 [493696,123007] 0 2026-03-09T16:15:22.996 INFO:tasks.workunit.client.1.vm05.stdout:7/773: creat d1/d2/d11/d86/d8a/d91/f112 x:0 0 0 2026-03-09T16:15:22.998 INFO:tasks.workunit.client.1.vm05.stdout:7/774: truncate d1/d2/d8/dc/d1b/d30/d4b/d65/f7f 5069030 0 2026-03-09T16:15:22.999 INFO:tasks.workunit.client.1.vm05.stdout:7/775: chown d1/d2/d8/dc/d1b/d30/d4b/d65/f7f 2006 1 2026-03-09T16:15:23.000 INFO:tasks.workunit.client.1.vm05.stdout:7/776: readlink d1/d2/d8/dc/d1b/d71/ld8 0 2026-03-09T16:15:23.003 INFO:tasks.workunit.client.1.vm05.stdout:7/777: sync 2026-03-09T16:15:23.003 INFO:tasks.workunit.client.1.vm05.stdout:7/778: chown d1/d2/d8/dc/d14/l7e 6647 1 2026-03-09T16:15:23.012 INFO:tasks.workunit.client.1.vm05.stdout:1/784: write d7/dd/d21/fba [950678,40145] 0 2026-03-09T16:15:23.015 INFO:tasks.workunit.client.1.vm05.stdout:2/670: rename db/dd/d15/d1f/d20/d86 to db/dd/d15/d3f/d5b/d60/d95/ddd 0 2026-03-09T16:15:23.016 INFO:tasks.workunit.client.1.vm05.stdout:0/708: write d5/d11/f8a [612281,100739] 0 2026-03-09T16:15:23.016 INFO:tasks.workunit.client.1.vm05.stdout:8/749: write d4/d6/d53/f7f [694292,71646] 0 2026-03-09T16:15:23.016 INFO:tasks.workunit.client.1.vm05.stdout:3/674: rename d0/d9/d22 to d0/d9/d22/d4c/de6 22 2026-03-09T16:15:23.020 
INFO:tasks.workunit.client.1.vm05.stdout:5/761: write d8/d18/d1b/d6b/f93 [1010967,91971] 0 2026-03-09T16:15:23.029 INFO:tasks.workunit.client.1.vm05.stdout:4/817: dwrite d5/de/d15/da9/db1/f64 [0,4194304] 0 2026-03-09T16:15:23.037 INFO:tasks.workunit.client.1.vm05.stdout:5/762: write d8/d18/d1b/d47/d48/d73/fdd [512295,42405] 0 2026-03-09T16:15:23.038 INFO:tasks.workunit.client.1.vm05.stdout:1/785: readlink d7/dd/d21/d39/d5a/d50/lcf 0 2026-03-09T16:15:23.039 INFO:tasks.workunit.client.1.vm05.stdout:4/818: chown d5/lc 244 1 2026-03-09T16:15:23.040 INFO:tasks.workunit.client.1.vm05.stdout:0/709: truncate d5/db/d5b/f69 2454864 0 2026-03-09T16:15:23.047 INFO:tasks.workunit.client.1.vm05.stdout:7/779: rename d1/d2/d8/f9a to d1/d2/d8/dc/d1b/d30/d4b/d65/f113 0 2026-03-09T16:15:23.052 INFO:tasks.workunit.client.1.vm05.stdout:6/698: dwrite d17/f18 [0,4194304] 0 2026-03-09T16:15:23.053 INFO:tasks.workunit.client.1.vm05.stdout:9/747: dwrite d4/f43 [0,4194304] 0 2026-03-09T16:15:23.059 INFO:tasks.workunit.client.1.vm05.stdout:0/710: dwrite d5/db/d5b/fcd [0,4194304] 0 2026-03-09T16:15:23.069 INFO:tasks.workunit.client.1.vm05.stdout:8/750: truncate d4/d6/db/dc/d5d/da0/dbf/fcf 130797 0 2026-03-09T16:15:23.074 INFO:tasks.workunit.client.1.vm05.stdout:6/699: mknod d17/d22/d9d/da9/c101 0 2026-03-09T16:15:23.075 INFO:tasks.workunit.client.1.vm05.stdout:0/711: write d5/d2c/d49/d83/d8b/d95/f2e [2646347,120311] 0 2026-03-09T16:15:23.088 INFO:tasks.workunit.client.1.vm05.stdout:1/786: dwrite d7/dd/d21/d44/f46 [4194304,4194304] 0 2026-03-09T16:15:23.089 INFO:tasks.workunit.client.1.vm05.stdout:9/748: mkdir d4/d10/d35/d2b/d38/d65/dfe 0 2026-03-09T16:15:23.090 INFO:tasks.workunit.client.1.vm05.stdout:8/751: rmdir d4/d6/db/df/d4f 39 2026-03-09T16:15:23.092 INFO:tasks.workunit.client.1.vm05.stdout:7/780: dwrite d1/d2/d8/dc/d1b/de6/ff8 [4194304,4194304] 0 2026-03-09T16:15:23.109 INFO:tasks.workunit.client.1.vm05.stdout:5/763: creat d8/d18/dbc/dcc/daa/f10c x:0 0 0 2026-03-09T16:15:23.110 INFO:tasks.workunit.client.1.vm05.stdout:3/675: rename d0/d9/d8b/fb0 to d0/d9/d22/d4c/d4e/fe7 0 2026-03-09T16:15:23.111 INFO:tasks.workunit.client.1.vm05.stdout:5/764: write d8/d18/d1b/d6b/f93 [66984,117718] 0 2026-03-09T16:15:23.123 INFO:tasks.workunit.client.1.vm05.stdout:1/787: unlink d7/dbe/ded/df3/lfa 0 2026-03-09T16:15:23.123 INFO:tasks.workunit.client.1.vm05.stdout:8/752: unlink d4/d6/d3a/d15/c7d 0 2026-03-09T16:15:23.123 INFO:tasks.workunit.client.1.vm05.stdout:1/788: read - d7/dd/f10d zero size 2026-03-09T16:15:23.125 INFO:tasks.workunit.client.1.vm05.stdout:8/753: write d4/d6/db/d75/fe4 [923652,50929] 0 2026-03-09T16:15:23.134 INFO:tasks.workunit.client.1.vm05.stdout:5/765: symlink d8/d53/d7e/l10d 0 2026-03-09T16:15:23.134 INFO:tasks.workunit.client.1.vm05.stdout:8/754: creat d4/d6/db/dc/d2e/d85/ff9 x:0 0 0 2026-03-09T16:15:23.135 INFO:tasks.workunit.client.1.vm05.stdout:8/755: stat d4/d6/c1a 0 2026-03-09T16:15:23.141 INFO:tasks.workunit.client.1.vm05.stdout:1/789: creat d7/dd/d21/d39/d48/da7/db5/f10f x:0 0 0 2026-03-09T16:15:23.142 INFO:tasks.workunit.client.1.vm05.stdout:4/819: dread d5/de/d15/da9/db1/dad/f48 [0,4194304] 0 2026-03-09T16:15:23.144 INFO:tasks.workunit.client.1.vm05.stdout:5/766: dread d8/d18/dbc/dcc/daa/fb1 [0,4194304] 0 2026-03-09T16:15:23.149 INFO:tasks.workunit.client.1.vm05.stdout:8/756: stat d4/d6/db/dc/d2e/cad 0 2026-03-09T16:15:23.153 INFO:tasks.workunit.client.1.vm05.stdout:6/700: rename d17/d22/d27/f2a to d17/d5d/f102 0 2026-03-09T16:15:23.153 INFO:tasks.workunit.client.1.vm05.stdout:8/757: fdatasync 
d4/d6/d3a/f49 0 2026-03-09T16:15:23.163 INFO:tasks.workunit.client.1.vm05.stdout:2/671: dwrite db/dd/d15/d46/d67/f9a [0,4194304] 0 2026-03-09T16:15:23.171 INFO:tasks.workunit.client.1.vm05.stdout:5/767: dread d8/d59/d5b/f66 [8388608,4194304] 0 2026-03-09T16:15:23.175 INFO:tasks.workunit.client.1.vm05.stdout:6/701: mknod d17/d22/d27/d8a/d8b/c103 0 2026-03-09T16:15:23.175 INFO:tasks.workunit.client.1.vm05.stdout:2/672: dwrite db/dd/d15/d46/fd2 [0,4194304] 0 2026-03-09T16:15:23.181 INFO:tasks.workunit.client.1.vm05.stdout:5/768: read - d8/d18/fc5 zero size 2026-03-09T16:15:23.181 INFO:tasks.workunit.client.1.vm05.stdout:1/790: link d7/d27/f4d d7/dd/d21/d63/d71/ddc/df8/f110 0 2026-03-09T16:15:23.183 INFO:tasks.workunit.client.1.vm05.stdout:6/702: sync 2026-03-09T16:15:23.184 INFO:tasks.workunit.client.1.vm05.stdout:6/703: truncate d17/d22/fe4 1279430 0 2026-03-09T16:15:23.194 INFO:tasks.workunit.client.1.vm05.stdout:3/676: write d0/d9/d22/f30 [1487500,123847] 0 2026-03-09T16:15:23.197 INFO:tasks.workunit.client.1.vm05.stdout:0/712: dwrite d5/d11/f1e [0,4194304] 0 2026-03-09T16:15:23.199 INFO:tasks.workunit.client.1.vm05.stdout:8/758: mknod d4/d6/d3a/d15/cfa 0 2026-03-09T16:15:23.212 INFO:tasks.workunit.client.1.vm05.stdout:5/769: symlink d8/d59/d5b/d8b/da0/de2/l10e 0 2026-03-09T16:15:23.212 INFO:tasks.workunit.client.1.vm05.stdout:2/673: creat db/dd/d15/d3f/d5b/d60/da2/fde x:0 0 0 2026-03-09T16:15:23.216 INFO:tasks.workunit.client.1.vm05.stdout:6/704: mkdir d17/d22/d27/d34/d42/d53/d87/d104 0 2026-03-09T16:15:23.224 INFO:tasks.workunit.client.1.vm05.stdout:1/791: dwrite d7/dd/d21/d2d/f108 [0,4194304] 0 2026-03-09T16:15:23.224 INFO:tasks.workunit.client.1.vm05.stdout:0/713: mknod d5/d1b/d30/cf4 0 2026-03-09T16:15:23.225 INFO:tasks.workunit.client.1.vm05.stdout:9/749: rename d4/d10/d35/d36/d48/d54/la9 to d4/lff 0 2026-03-09T16:15:23.228 INFO:tasks.workunit.client.1.vm05.stdout:5/770: mkdir d8/d59/d10f 0 2026-03-09T16:15:23.230 INFO:tasks.workunit.client.1.vm05.stdout:0/714: fdatasync d5/db/d5f/f7b 0 2026-03-09T16:15:23.232 INFO:tasks.workunit.client.1.vm05.stdout:7/781: rename d1/d2/d8/dc/d1b/d30/d5e to d1/d2/d8/dc/d1b/d30/d7d/d114 0 2026-03-09T16:15:23.242 INFO:tasks.workunit.client.1.vm05.stdout:8/759: rename d4/d6/db/dc/d5d/da0/dbf/fcf to d4/d6/db/d59/db0/dd6/ffb 0 2026-03-09T16:15:23.244 INFO:tasks.workunit.client.1.vm05.stdout:2/674: dwrite db/dd/f10 [0,4194304] 0 2026-03-09T16:15:23.244 INFO:tasks.workunit.client.1.vm05.stdout:9/750: fdatasync d4/d10/d35/fc3 0 2026-03-09T16:15:23.245 INFO:tasks.workunit.client.1.vm05.stdout:1/792: mknod d7/dd/de/d52/df6/dfc/c111 0 2026-03-09T16:15:23.249 INFO:tasks.workunit.client.1.vm05.stdout:0/715: mkdir d5/d1b/d3b/df5 0 2026-03-09T16:15:23.250 INFO:tasks.workunit.client.1.vm05.stdout:0/716: stat d5/d2c/d49/f5d 0 2026-03-09T16:15:23.251 INFO:tasks.workunit.client.1.vm05.stdout:3/677: getdents d0/d9/d22/d4c 0 2026-03-09T16:15:23.255 INFO:tasks.workunit.client.1.vm05.stdout:0/717: dread - d5/d2c/fde zero size 2026-03-09T16:15:23.258 INFO:tasks.workunit.client.1.vm05.stdout:7/782: dwrite d1/d2/d8/d31/d8d/f6f [0,4194304] 0 2026-03-09T16:15:23.261 INFO:tasks.workunit.client.1.vm05.stdout:3/678: truncate d0/d33/f7d 657362 0 2026-03-09T16:15:23.262 INFO:tasks.workunit.client.1.vm05.stdout:5/771: creat d8/d18/dbc/dcc/daa/f110 x:0 0 0 2026-03-09T16:15:23.271 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:23 vm05.local ceph-mon[58702]: pgmap v21: 65 pgs: 65 active+clean; 2.8 GiB data, 9.3 GiB used, 111 GiB / 120 GiB avail; 42 MiB/s rd, 128 MiB/s wr, 260 
op/s 2026-03-09T16:15:23.271 INFO:tasks.workunit.client.1.vm05.stdout:9/751: creat d4/d10/d35/d36/d48/d60/d94/f100 x:0 0 0 2026-03-09T16:15:23.280 INFO:tasks.workunit.client.1.vm05.stdout:1/793: mkdir d7/d15/d6e/dbc/dd6/d112 0 2026-03-09T16:15:23.283 INFO:tasks.workunit.client.1.vm05.stdout:5/772: dwrite d8/d18/dbc/dcc/daa/f9d [0,4194304] 0 2026-03-09T16:15:23.287 INFO:tasks.workunit.client.1.vm05.stdout:4/820: write d5/de/d15/da9/db1/dad/d37/d60/f76 [123511,5989] 0 2026-03-09T16:15:23.293 INFO:tasks.workunit.client.1.vm05.stdout:8/760: mknod d4/d6/db/df/d4f/d9f/cfc 0 2026-03-09T16:15:23.294 INFO:tasks.workunit.client.1.vm05.stdout:3/679: symlink d0/d9/d22/d5f/d75/d76/le8 0 2026-03-09T16:15:23.295 INFO:tasks.workunit.client.1.vm05.stdout:1/794: rmdir d7/d15/d6e/dbc/dd6 39 2026-03-09T16:15:23.298 INFO:tasks.workunit.client.1.vm05.stdout:9/752: readlink d4/d10/d35/d2b/d31/d82/le4 0 2026-03-09T16:15:23.298 INFO:tasks.workunit.client.1.vm05.stdout:6/705: rename d17/d22/d27/d58/cc3 to d17/d5d/c105 0 2026-03-09T16:15:23.300 INFO:tasks.workunit.client.1.vm05.stdout:8/761: write d4/d6/db/d9b/fbe [187917,41491] 0 2026-03-09T16:15:23.301 INFO:tasks.workunit.client.1.vm05.stdout:1/795: dread - d7/dd/d21/fde zero size 2026-03-09T16:15:23.305 INFO:tasks.workunit.client.1.vm05.stdout:5/773: dwrite d8/d5e/ff4 [0,4194304] 0 2026-03-09T16:15:23.315 INFO:tasks.workunit.client.1.vm05.stdout:7/783: read d1/d2/d8/dc/d9c/f6b [512588,3142] 0 2026-03-09T16:15:23.315 INFO:tasks.workunit.client.1.vm05.stdout:5/774: dwrite d8/d18/d1b/d47/d48/d73/d80/ff0 [0,4194304] 0 2026-03-09T16:15:23.323 INFO:tasks.workunit.client.1.vm05.stdout:1/796: rmdir d7/dd/d21/d39/d48/d8c 39 2026-03-09T16:15:23.326 INFO:tasks.workunit.client.1.vm05.stdout:6/706: rmdir d17/d22/d27/d8a/d8b 39 2026-03-09T16:15:23.334 INFO:tasks.workunit.client.1.vm05.stdout:9/753: creat d4/d10/d35/d36/d48/f101 x:0 0 0 2026-03-09T16:15:23.335 INFO:tasks.workunit.client.1.vm05.stdout:0/718: getdents d5/db/d5f/da3 0 2026-03-09T16:15:23.335 INFO:tasks.workunit.client.1.vm05.stdout:9/754: readlink d4/d10/l3d 0 2026-03-09T16:15:23.335 INFO:tasks.workunit.client.1.vm05.stdout:7/784: dwrite d1/d2/d8/d31/f105 [0,4194304] 0 2026-03-09T16:15:23.335 INFO:tasks.workunit.client.1.vm05.stdout:1/797: mkdir d7/dd/d21/d39/d5a/d113 0 2026-03-09T16:15:23.337 INFO:tasks.workunit.client.1.vm05.stdout:9/755: dread d4/f43 [0,4194304] 0 2026-03-09T16:15:23.337 INFO:tasks.workunit.client.1.vm05.stdout:6/707: dwrite d17/ff9 [0,4194304] 0 2026-03-09T16:15:23.338 INFO:tasks.workunit.client.1.vm05.stdout:5/775: unlink d8/d18/d1b/f2a 0 2026-03-09T16:15:23.339 INFO:tasks.workunit.client.1.vm05.stdout:5/776: readlink d8/d18/d1b/d47/l87 0 2026-03-09T16:15:23.340 INFO:tasks.workunit.client.1.vm05.stdout:3/680: rmdir d0/d9/d97/dad/de1 0 2026-03-09T16:15:23.347 INFO:tasks.workunit.client.1.vm05.stdout:0/719: rename d5/db/d48/d66/ld8 to d5/d11/d4f/ddc/lf6 0 2026-03-09T16:15:23.347 INFO:tasks.workunit.client.1.vm05.stdout:6/708: chown d17/d22/d9d/cc7 115852 1 2026-03-09T16:15:23.355 INFO:tasks.workunit.client.1.vm05.stdout:9/756: mknod d4/d10/d35/d36/d48/d60/dae/c102 0 2026-03-09T16:15:23.365 INFO:tasks.workunit.client.1.vm05.stdout:6/709: creat d17/d22/d27/d34/dd1/f106 x:0 0 0 2026-03-09T16:15:23.365 INFO:tasks.workunit.client.1.vm05.stdout:9/757: truncate d4/d10/d35/d36/d48/d60/d94/f100 549997 0 2026-03-09T16:15:23.365 INFO:tasks.workunit.client.1.vm05.stdout:6/710: write d17/d5d/d73/fe3 [829871,21160] 0 2026-03-09T16:15:23.365 INFO:tasks.workunit.client.1.vm05.stdout:0/720: unlink 
d5/d1b/d30/l3e 0 2026-03-09T16:15:23.365 INFO:tasks.workunit.client.1.vm05.stdout:7/785: getdents d1/d2/d8/dc/d1b/d30/d4b/d65/d3e 0 2026-03-09T16:15:23.366 INFO:tasks.workunit.client.1.vm05.stdout:3/681: creat d0/d9/fe9 x:0 0 0 2026-03-09T16:15:23.368 INFO:tasks.workunit.client.1.vm05.stdout:5/777: sync 2026-03-09T16:15:23.372 INFO:tasks.workunit.client.1.vm05.stdout:1/798: rename d7/d15/d6e/dbc/dd6/l102 to d7/dd/d21/d39/d48/l114 0 2026-03-09T16:15:23.373 INFO:tasks.workunit.client.1.vm05.stdout:9/758: dwrite d4/d10/d35/d36/d48/d60/fd3 [0,4194304] 0 2026-03-09T16:15:23.378 INFO:tasks.workunit.client.1.vm05.stdout:3/682: unlink d0/d9/f51 0 2026-03-09T16:15:23.379 INFO:tasks.workunit.client.1.vm05.stdout:5/778: rmdir d8/d18/dbc 39 2026-03-09T16:15:23.379 INFO:tasks.workunit.client.1.vm05.stdout:3/683: dread - d0/d9/fe9 zero size 2026-03-09T16:15:23.397 INFO:tasks.workunit.client.1.vm05.stdout:0/721: dwrite d5/db/d5f/f7b [0,4194304] 0 2026-03-09T16:15:23.399 INFO:tasks.workunit.client.1.vm05.stdout:7/786: rename d1/d2/d8/dc/d1b/d30/f85 to d1/d2/d11/d86/da2/f115 0 2026-03-09T16:15:23.403 INFO:tasks.workunit.client.1.vm05.stdout:6/711: dread d17/d22/d27/d34/d42/d53/f90 [0,4194304] 0 2026-03-09T16:15:23.409 INFO:tasks.workunit.client.1.vm05.stdout:3/684: unlink d0/f49 0 2026-03-09T16:15:23.409 INFO:tasks.workunit.client.1.vm05.stdout:2/675: dread db/dd/d15/d3f/d5b/d60/d95/f80 [0,4194304] 0 2026-03-09T16:15:23.409 INFO:tasks.workunit.client.1.vm05.stdout:9/759: rmdir d4/d10/d35/d2b/d38 39 2026-03-09T16:15:23.411 INFO:tasks.workunit.client.1.vm05.stdout:6/712: read f16 [5670362,82858] 0 2026-03-09T16:15:23.426 INFO:tasks.workunit.client.1.vm05.stdout:9/760: readlink d4/d10/d35/d36/d48/l4e 0 2026-03-09T16:15:23.426 INFO:tasks.workunit.client.1.vm05.stdout:4/821: dwrite d5/de/d2f/f99 [4194304,4194304] 0 2026-03-09T16:15:23.426 INFO:tasks.workunit.client.1.vm05.stdout:0/722: creat d5/db/ff7 x:0 0 0 2026-03-09T16:15:23.426 INFO:tasks.workunit.client.1.vm05.stdout:4/822: creat d5/de/d15/d21/f12d x:0 0 0 2026-03-09T16:15:23.427 INFO:tasks.workunit.client.1.vm05.stdout:6/713: getdents d17/d22/d27/d8a/d8b 0 2026-03-09T16:15:23.435 INFO:tasks.workunit.client.1.vm05.stdout:2/676: sync 2026-03-09T16:15:23.435 INFO:tasks.workunit.client.1.vm05.stdout:6/714: link d17/d5d/d73/cdd d17/d22/d27/d58/db8/c107 0 2026-03-09T16:15:23.441 INFO:tasks.workunit.client.1.vm05.stdout:4/823: dwrite d5/de/d15/d21/d27/f36 [0,4194304] 0 2026-03-09T16:15:23.449 INFO:tasks.workunit.client.1.vm05.stdout:2/677: dread - db/dd/d15/d4c/fcb zero size 2026-03-09T16:15:23.449 INFO:tasks.workunit.client.1.vm05.stdout:9/761: dread d4/d10/f8a [0,4194304] 0 2026-03-09T16:15:23.454 INFO:tasks.workunit.client.1.vm05.stdout:0/723: dread d5/d11/f8a [0,4194304] 0 2026-03-09T16:15:23.454 INFO:tasks.workunit.client.1.vm05.stdout:4/824: creat d5/de/d15/da9/db1/dad/d37/dfb/f12e x:0 0 0 2026-03-09T16:15:23.455 INFO:tasks.workunit.client.1.vm05.stdout:2/678: rename db/dd/d15/d1f/c3a to db/dd/d15/d3f/d5b/d60/d95/dd7/cdf 0 2026-03-09T16:15:23.456 INFO:tasks.workunit.client.1.vm05.stdout:2/679: dread - db/dd/d15/d46/d8d/fcf zero size 2026-03-09T16:15:23.456 INFO:tasks.workunit.client.1.vm05.stdout:8/762: dwrite d4/d6/f29 [0,4194304] 0 2026-03-09T16:15:23.466 INFO:tasks.workunit.client.1.vm05.stdout:6/715: sync 2026-03-09T16:15:23.466 INFO:tasks.workunit.client.1.vm05.stdout:0/724: creat d5/db/def/ff8 x:0 0 0 2026-03-09T16:15:23.466 INFO:tasks.workunit.client.1.vm05.stdout:4/825: dread - d5/de/d15/f52 zero size 2026-03-09T16:15:23.473 
INFO:tasks.workunit.client.1.vm05.stdout:8/763: dwrite d4/d6/d3a/f88 [0,4194304] 0 2026-03-09T16:15:23.474 INFO:tasks.workunit.client.1.vm05.stdout:0/725: fsync d5/d1b/d3b/f3c 0 2026-03-09T16:15:23.477 INFO:tasks.workunit.client.1.vm05.stdout:9/762: sync 2026-03-09T16:15:23.478 INFO:tasks.workunit.client.1.vm05.stdout:8/764: fsync d4/d6/db/df/d80/fb9 0 2026-03-09T16:15:23.478 INFO:tasks.workunit.client.1.vm05.stdout:6/716: symlink d17/d5d/d73/d83/l108 0 2026-03-09T16:15:23.491 INFO:tasks.workunit.client.1.vm05.stdout:0/726: chown d5/db/d77/lc8 272 1 2026-03-09T16:15:23.495 INFO:tasks.workunit.client.1.vm05.stdout:9/763: truncate d4/d10/d35/d2b/d38/f5e 2317242 0 2026-03-09T16:15:23.498 INFO:tasks.workunit.client.1.vm05.stdout:6/717: mknod d17/d22/d27/d34/dd1/c109 0 2026-03-09T16:15:23.501 INFO:tasks.workunit.client.1.vm05.stdout:8/765: rename d4/d6/db/da6/lc2 to d4/d6/db/d59/db0/dd6/lfd 0 2026-03-09T16:15:23.507 INFO:tasks.workunit.client.1.vm05.stdout:0/727: creat d5/d2c/d49/d83/d8b/daf/ff9 x:0 0 0 2026-03-09T16:15:23.511 INFO:tasks.workunit.client.1.vm05.stdout:9/764: unlink d4/d10/d35/fdb 0 2026-03-09T16:15:23.513 INFO:tasks.workunit.client.1.vm05.stdout:1/799: truncate d7/dd/d21/d44/f46 7138959 0 2026-03-09T16:15:23.513 INFO:tasks.workunit.client.1.vm05.stdout:5/779: write d8/d53/d7a/f92 [169891,121034] 0 2026-03-09T16:15:23.520 INFO:tasks.workunit.client.1.vm05.stdout:6/718: mkdir d17/d22/d27/d34/dd1/d10a 0 2026-03-09T16:15:23.520 INFO:tasks.workunit.client.1.vm05.stdout:4/826: getdents d5/de/d2f 0 2026-03-09T16:15:23.521 INFO:tasks.workunit.client.1.vm05.stdout:8/766: creat d4/d6/db/dc/d5d/da0/dd7/dd8/ffe x:0 0 0 2026-03-09T16:15:23.522 INFO:tasks.workunit.client.1.vm05.stdout:4/827: fsync d5/de/d15/da9/db1/dad/d90/fd9 0 2026-03-09T16:15:23.522 INFO:tasks.workunit.client.1.vm05.stdout:9/765: creat d4/d10/d35/d36/d48/d54/f103 x:0 0 0 2026-03-09T16:15:23.523 INFO:tasks.workunit.client.1.vm05.stdout:1/800: mknod d7/dd/d21/d63/d71/ddc/c115 0 2026-03-09T16:15:23.527 INFO:tasks.workunit.client.1.vm05.stdout:9/766: truncate d4/d10/d35/d2b/d31/dc8/ffb 865343 0 2026-03-09T16:15:23.527 INFO:tasks.workunit.client.1.vm05.stdout:7/787: dwrite d1/d2/d8/dc/d1b/d71/f74 [0,4194304] 0 2026-03-09T16:15:23.529 INFO:tasks.workunit.client.1.vm05.stdout:7/788: chown d1/d2/d8/dc/dd4 22 1 2026-03-09T16:15:23.530 INFO:tasks.workunit.client.1.vm05.stdout:8/767: truncate d4/d6/db/d59/db0/dd6/ffb 645295 0 2026-03-09T16:15:23.532 INFO:tasks.workunit.client.1.vm05.stdout:8/768: chown d4/d6/d3a/d3c/f45 0 1 2026-03-09T16:15:23.535 INFO:tasks.workunit.client.1.vm05.stdout:1/801: dwrite d7/dd/de/f32 [0,4194304] 0 2026-03-09T16:15:23.542 INFO:tasks.workunit.client.1.vm05.stdout:8/769: dwrite d4/d6/db/dc/fec [0,4194304] 0 2026-03-09T16:15:23.568 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:23 vm03.local ceph-mon[51019]: pgmap v21: 65 pgs: 65 active+clean; 2.8 GiB data, 9.3 GiB used, 111 GiB / 120 GiB avail; 42 MiB/s rd, 128 MiB/s wr, 260 op/s 2026-03-09T16:15:23.585 INFO:tasks.workunit.client.1.vm05.stdout:4/828: mkdir d5/de/d15/d21/d27/d3c/d12f 0 2026-03-09T16:15:23.590 INFO:tasks.workunit.client.1.vm05.stdout:4/829: dwrite d5/de/d15/da9/db1/dad/d37/f115 [0,4194304] 0 2026-03-09T16:15:23.597 INFO:tasks.workunit.client.1.vm05.stdout:4/830: dread d5/de/d15/d21/f2a [0,4194304] 0 2026-03-09T16:15:23.610 INFO:tasks.workunit.client.1.vm05.stdout:9/767: fdatasync d4/d10/d35/d36/f49 0 2026-03-09T16:15:23.613 INFO:tasks.workunit.client.1.vm05.stdout:2/680: dwrite db/dd/d15/d4c/f58 [4194304,4194304] 0 
2026-03-09T16:15:23.613 INFO:tasks.workunit.client.1.vm05.stdout:7/789: mkdir d1/d2/d8/dc/d1b/d30/d4b/d65/db1/d116 0 2026-03-09T16:15:23.626 INFO:tasks.workunit.client.1.vm05.stdout:6/719: creat d17/d22/d27/dd8/f10b x:0 0 0 2026-03-09T16:15:23.635 INFO:tasks.workunit.client.1.vm05.stdout:1/802: symlink d7/dd/de/l116 0 2026-03-09T16:15:23.658 INFO:tasks.workunit.client.1.vm05.stdout:5/780: rmdir d8/d5e/dd0 0 2026-03-09T16:15:23.662 INFO:tasks.workunit.client.1.vm05.stdout:3/685: mknod d0/dce/cea 0 2026-03-09T16:15:23.662 INFO:tasks.workunit.client.1.vm05.stdout:5/781: fsync d8/d18/d1b/f28 0 2026-03-09T16:15:23.663 INFO:tasks.workunit.client.1.vm05.stdout:2/681: mkdir db/dd/d15/d3f/d5b/d60/d95/de0 0 2026-03-09T16:15:23.669 INFO:tasks.workunit.client.1.vm05.stdout:2/682: dwrite db/dd/d15/f70 [4194304,4194304] 0 2026-03-09T16:15:23.685 INFO:tasks.workunit.client.1.vm05.stdout:1/803: stat d7/dd/d21/d39/d5a/d50/l8a 0 2026-03-09T16:15:23.685 INFO:tasks.workunit.client.1.vm05.stdout:1/804: chown d7/dd/d21/d39/d48/d5d 15052 1 2026-03-09T16:15:23.690 INFO:tasks.workunit.client.1.vm05.stdout:8/770: rename d4/cf3 to d4/d6/d53/cff 0 2026-03-09T16:15:23.704 INFO:tasks.workunit.client.1.vm05.stdout:7/790: symlink d1/d2/l117 0 2026-03-09T16:15:23.708 INFO:tasks.workunit.client.1.vm05.stdout:6/720: truncate d17/d1d/f1e 683830 0 2026-03-09T16:15:23.715 INFO:tasks.workunit.client.1.vm05.stdout:1/805: dread - d7/dd/d21/d39/d48/d8c/dd8/d103/f10a zero size 2026-03-09T16:15:23.716 INFO:tasks.workunit.client.1.vm05.stdout:1/806: fdatasync d7/d15/d16/dc2/fc7 0 2026-03-09T16:15:23.717 INFO:tasks.workunit.client.1.vm05.stdout:3/686: creat d0/d33/feb x:0 0 0 2026-03-09T16:15:23.721 INFO:tasks.workunit.client.1.vm05.stdout:9/768: rename d4/d10/d35/d2b/d38/f78 to d4/d10/d35/d2b/d38/f104 0 2026-03-09T16:15:23.721 INFO:tasks.workunit.client.1.vm05.stdout:5/782: creat d8/d18/dbc/f111 x:0 0 0 2026-03-09T16:15:23.728 INFO:tasks.workunit.client.1.vm05.stdout:8/771: symlink d4/d6/db/df/l100 0 2026-03-09T16:15:23.730 INFO:tasks.workunit.client.1.vm05.stdout:3/687: readlink d0/l1c 0 2026-03-09T16:15:23.731 INFO:tasks.workunit.client.1.vm05.stdout:4/831: getdents d5/de/d15/da9/db1/dad/d37/dfb 0 2026-03-09T16:15:23.738 INFO:tasks.workunit.client.1.vm05.stdout:8/772: sync 2026-03-09T16:15:23.738 INFO:tasks.workunit.client.1.vm05.stdout:7/791: write d1/d2/d8/dc/d14/f41 [720351,71044] 0 2026-03-09T16:15:23.780 INFO:tasks.workunit.client.1.vm05.stdout:1/807: rename d7/dd/de/f56 to d7/dd/d21/d2d/f117 0 2026-03-09T16:15:23.785 INFO:tasks.workunit.client.1.vm05.stdout:1/808: dread d7/d27/f84 [0,4194304] 0 2026-03-09T16:15:23.825 INFO:tasks.workunit.client.1.vm05.stdout:0/728: link d5/d2c/d49/cb5 d5/db/def/df2/cfa 0 2026-03-09T16:15:23.831 INFO:tasks.workunit.client.1.vm05.stdout:4/832: truncate d5/f3e 2535830 0 2026-03-09T16:15:23.840 INFO:tasks.workunit.client.1.vm05.stdout:2/683: rename db/dd/d15/d1f/d21/f29 to db/dd/d98/fe1 0 2026-03-09T16:15:23.841 INFO:tasks.workunit.client.1.vm05.stdout:2/684: dread - db/dd/d15/d3f/d5b/d60/da2/fc5 zero size 2026-03-09T16:15:23.841 INFO:tasks.workunit.client.1.vm05.stdout:7/792: dread d1/d2/d11/f25 [0,4194304] 0 2026-03-09T16:15:23.849 INFO:tasks.workunit.client.1.vm05.stdout:3/688: mknod d0/d9/d22/d5f/d90/dae/cec 0 2026-03-09T16:15:23.855 INFO:tasks.workunit.client.1.vm05.stdout:8/773: creat d4/de9/f101 x:0 0 0 2026-03-09T16:15:23.855 INFO:tasks.workunit.client.1.vm05.stdout:6/721: link d17/d22/d27/d34/d42/d53/d87/lac d17/d22/d27/dd8/l10c 0 2026-03-09T16:15:23.856 
INFO:tasks.workunit.client.1.vm05.stdout:4/833: truncate d5/de/d15/f34 1273243 0 2026-03-09T16:15:23.858 INFO:tasks.workunit.client.1.vm05.stdout:5/783: read d8/d59/d5b/d8b/da0/feb [1311793,106769] 0 2026-03-09T16:15:23.859 INFO:tasks.workunit.client.1.vm05.stdout:5/784: read - d8/d18/d1b/d47/dda/f10a zero size 2026-03-09T16:15:23.860 INFO:tasks.workunit.client.1.vm05.stdout:8/774: truncate d4/d6/db/dc/fa2 6165750 0 2026-03-09T16:15:23.863 INFO:tasks.workunit.client.1.vm05.stdout:0/729: symlink d5/db/d5b/da5/lfb 0 2026-03-09T16:15:23.863 INFO:tasks.workunit.client.1.vm05.stdout:6/722: stat d17/d22/dce/lf2 0 2026-03-09T16:15:23.866 INFO:tasks.workunit.client.1.vm05.stdout:0/730: mkdir d5/d1b/d3b/dfc 0 2026-03-09T16:15:23.868 INFO:tasks.workunit.client.1.vm05.stdout:3/689: getdents d0/d9/d22/d5f/d75/d76/d88/da3 0 2026-03-09T16:15:23.874 INFO:tasks.workunit.client.1.vm05.stdout:6/723: getdents d17/d22/d27/d58 0 2026-03-09T16:15:23.883 INFO:tasks.workunit.client.1.vm05.stdout:0/731: getdents d5/d1b/d3b/dc2 0 2026-03-09T16:15:23.884 INFO:tasks.workunit.client.1.vm05.stdout:3/690: getdents d0/d9/d22/d5f/d7b/da8/dd8 0 2026-03-09T16:15:23.896 INFO:tasks.workunit.client.1.vm05.stdout:3/691: dread d0/d9/d22/d5f/d90/fa6 [0,4194304] 0 2026-03-09T16:15:23.940 INFO:tasks.workunit.client.1.vm05.stdout:6/724: mkdir d17/d22/d27/d34/d42/d65/d10d 0 2026-03-09T16:15:23.945 INFO:tasks.workunit.client.1.vm05.stdout:6/725: truncate d17/d22/d27/d44/f86 202328 0 2026-03-09T16:15:23.947 INFO:tasks.workunit.client.1.vm05.stdout:0/732: truncate d5/db/d77/fb4 4378655 0 2026-03-09T16:15:23.948 INFO:tasks.workunit.client.1.vm05.stdout:0/733: chown d5/d1b/d3b/dc2/fe1 0 1 2026-03-09T16:15:23.956 INFO:tasks.workunit.client.1.vm05.stdout:1/809: write d7/dd/de/f35 [3373557,2584] 0 2026-03-09T16:15:23.956 INFO:tasks.workunit.client.1.vm05.stdout:0/734: rename d5/db/d77/lc8 to d5/d2c/d49/d83/d8b/daf/de8/lfd 0 2026-03-09T16:15:24.008 INFO:tasks.workunit.client.1.vm05.stdout:9/769: truncate d4/d10/d35/d2b/d38/f5e 1061648 0 2026-03-09T16:15:24.010 INFO:tasks.workunit.client.1.vm05.stdout:9/770: symlink d4/d10/d35/d36/d48/d54/l105 0 2026-03-09T16:15:24.011 INFO:tasks.workunit.client.1.vm05.stdout:9/771: write d4/d10/d35/f32 [949144,59171] 0 2026-03-09T16:15:24.019 INFO:tasks.workunit.client.1.vm05.stdout:9/772: rename d4/d10/d35/d36/d48/l4e to d4/d10/d35/d36/d48/d60/d94/l106 0 2026-03-09T16:15:24.024 INFO:tasks.workunit.client.1.vm05.stdout:2/685: truncate db/dd/d15/d46/d67/f77 1356782 0 2026-03-09T16:15:24.026 INFO:tasks.workunit.client.1.vm05.stdout:9/773: dwrite d4/d10/d35/d2b/d38/fe9 [0,4194304] 0 2026-03-09T16:15:24.028 INFO:tasks.workunit.client.1.vm05.stdout:4/834: truncate d5/de/d15/d21/d27/d3c/fe8 206113 0 2026-03-09T16:15:24.032 INFO:tasks.workunit.client.1.vm05.stdout:5/785: truncate d8/d18/dbc/dcc/daa/d43/f41 5811105 0 2026-03-09T16:15:24.034 INFO:tasks.workunit.client.1.vm05.stdout:5/786: readlink d8/d53/d7a/lbb 0 2026-03-09T16:15:24.036 INFO:tasks.workunit.client.1.vm05.stdout:7/793: dwrite d1/d2/d8/dc/d1b/d71/f46 [0,4194304] 0 2026-03-09T16:15:24.036 INFO:tasks.workunit.client.1.vm05.stdout:4/835: creat d5/de/d15/d21/d27/f130 x:0 0 0 2026-03-09T16:15:24.036 INFO:tasks.workunit.client.1.vm05.stdout:0/735: link d5/db/d5f/f85 d5/d1b/ffe 0 2026-03-09T16:15:24.036 INFO:tasks.workunit.client.1.vm05.stdout:0/736: fsync d5/d2c/d49/d83/fc9 0 2026-03-09T16:15:24.048 INFO:tasks.workunit.client.1.vm05.stdout:5/787: mknod d8/d18/d1b/d6b/c112 0 2026-03-09T16:15:24.053 INFO:tasks.workunit.client.1.vm05.stdout:3/692: creat 
d0/d9/d22/d5f/d75/d76/fed x:0 0 0 2026-03-09T16:15:24.062 INFO:tasks.workunit.client.1.vm05.stdout:5/788: write d8/d59/d5b/d8b/ffe [457794,62567] 0 2026-03-09T16:15:24.062 INFO:tasks.workunit.client.1.vm05.stdout:6/726: mknod d17/d22/d27/d58/c10e 0 2026-03-09T16:15:24.062 INFO:tasks.workunit.client.1.vm05.stdout:1/810: creat d7/d15/d16/f118 x:0 0 0 2026-03-09T16:15:24.062 INFO:tasks.workunit.client.1.vm05.stdout:6/727: readlink d17/d22/d27/d34/d4b/laa 0 2026-03-09T16:15:24.062 INFO:tasks.workunit.client.1.vm05.stdout:6/728: chown d17/d22/d9d/da5/ff5 9224 1 2026-03-09T16:15:24.062 INFO:tasks.workunit.client.1.vm05.stdout:0/737: rename d5/d2c/d49/d83/d8b/d95 to d5/d2c/dff 0 2026-03-09T16:15:24.062 INFO:tasks.workunit.client.1.vm05.stdout:8/775: creat d4/d6/db/f102 x:0 0 0 2026-03-09T16:15:24.062 INFO:tasks.workunit.client.1.vm05.stdout:0/738: truncate d5/d11/f90 339087 0 2026-03-09T16:15:24.065 INFO:tasks.workunit.client.1.vm05.stdout:9/774: sync 2026-03-09T16:15:24.065 INFO:tasks.workunit.client.1.vm05.stdout:4/836: sync 2026-03-09T16:15:24.068 INFO:tasks.workunit.client.1.vm05.stdout:1/811: rename d7/d27/d101 to d7/d27/d119 0 2026-03-09T16:15:24.068 INFO:tasks.workunit.client.1.vm05.stdout:0/739: chown d5/db/d5b/f35 36875 1 2026-03-09T16:15:24.071 INFO:tasks.workunit.client.1.vm05.stdout:6/729: rename d17/l49 to d17/d22/d27/d34/d42/d53/l10f 0 2026-03-09T16:15:24.072 INFO:tasks.workunit.client.1.vm05.stdout:0/740: readlink d5/d2c/l75 0 2026-03-09T16:15:24.072 INFO:tasks.workunit.client.1.vm05.stdout:8/776: dwrite f0 [0,4194304] 0 2026-03-09T16:15:24.083 INFO:tasks.workunit.client.1.vm05.stdout:4/837: symlink d5/de/d15/da9/db1/dad/d37/d60/dbf/l131 0 2026-03-09T16:15:24.085 INFO:tasks.workunit.client.1.vm05.stdout:0/741: dwrite d5/d1b/f25 [0,4194304] 0 2026-03-09T16:15:24.085 INFO:tasks.workunit.client.1.vm05.stdout:3/693: link d0/d9/d22/d5f/d7b/d99/lc6 d0/d9/d97/dad/lee 0 2026-03-09T16:15:24.086 INFO:tasks.workunit.client.1.vm05.stdout:8/777: unlink d4/d6/db/df/d4f/d9f/cc3 0 2026-03-09T16:15:24.087 INFO:tasks.workunit.client.1.vm05.stdout:9/775: getdents d4/d10/d35/d2b/d38/d65/dea 0 2026-03-09T16:15:24.091 INFO:tasks.workunit.client.1.vm05.stdout:0/742: symlink d5/db/d77/l100 0 2026-03-09T16:15:24.093 INFO:tasks.workunit.client.1.vm05.stdout:8/778: mknod d4/d6/db/d59/db0/c103 0 2026-03-09T16:15:24.095 INFO:tasks.workunit.client.1.vm05.stdout:0/743: sync 2026-03-09T16:15:24.100 INFO:tasks.workunit.client.1.vm05.stdout:6/730: mknod d17/d22/d27/d34/dd1/d10a/c110 0 2026-03-09T16:15:24.108 INFO:tasks.workunit.client.1.vm05.stdout:4/838: creat d5/de/d15/d21/d39/d91/f132 x:0 0 0 2026-03-09T16:15:24.115 INFO:tasks.workunit.client.1.vm05.stdout:9/776: stat d4/d10/d35/d36/d48/d60/d94/cbb 0 2026-03-09T16:15:24.115 INFO:tasks.workunit.client.1.vm05.stdout:1/812: creat d7/dd/d21/d44/f11a x:0 0 0 2026-03-09T16:15:24.115 INFO:tasks.workunit.client.1.vm05.stdout:1/813: write d7/d15/d45/dee/ff1 [850240,61514] 0 2026-03-09T16:15:24.119 INFO:tasks.workunit.client.1.vm05.stdout:9/777: dwrite d4/d10/d35/d36/d48/d60/dae/fcf [0,4194304] 0 2026-03-09T16:15:24.125 INFO:tasks.workunit.client.1.vm05.stdout:8/779: creat d4/d6/db/d9b/f104 x:0 0 0 2026-03-09T16:15:24.126 INFO:tasks.workunit.client.1.vm05.stdout:8/780: chown d4/d6/db/d59/db0/dd6 0 1 2026-03-09T16:15:24.139 INFO:tasks.workunit.client.1.vm05.stdout:7/794: write d1/d2/d8/dc/d1b/d30/d7d/fc1 [294198,8756] 0 2026-03-09T16:15:24.143 INFO:tasks.workunit.client.1.vm05.stdout:7/795: dread - d1/d2/d8/dc/d9c/fda zero size 2026-03-09T16:15:24.144 
INFO:tasks.workunit.client.1.vm05.stdout:2/686: dwrite db/f12 [0,4194304] 0 2026-03-09T16:15:24.159 INFO:tasks.workunit.client.1.vm05.stdout:1/814: read - d7/dd/de/d52/fe7 zero size 2026-03-09T16:15:24.159 INFO:tasks.workunit.client.1.vm05.stdout:9/778: stat d4/d10/d35/d2b/d38/d65/c71 0 2026-03-09T16:15:24.163 INFO:tasks.workunit.client.1.vm05.stdout:1/815: chown d7/dd/de/d52/d5b 7159 1 2026-03-09T16:15:24.169 INFO:tasks.workunit.client.1.vm05.stdout:5/789: dwrite d8/d59/f83 [0,4194304] 0 2026-03-09T16:15:24.175 INFO:tasks.workunit.client.1.vm05.stdout:1/816: dread d7/dd/d21/d63/d71/f7b [0,4194304] 0 2026-03-09T16:15:24.189 INFO:tasks.workunit.client.1.vm05.stdout:5/790: dwrite d8/d53/d7e/f8a [0,4194304] 0 2026-03-09T16:15:24.204 INFO:tasks.workunit.client.1.vm05.stdout:3/694: dwrite d0/d9/d8b/f91 [0,4194304] 0 2026-03-09T16:15:24.215 INFO:tasks.workunit.client.1.vm05.stdout:8/781: symlink d4/d6/d3a/d15/dd9/l105 0 2026-03-09T16:15:24.222 INFO:tasks.workunit.client.1.vm05.stdout:1/817: dread d7/dd/d21/d63/d71/fd0 [0,4194304] 0 2026-03-09T16:15:24.225 INFO:tasks.workunit.client.1.vm05.stdout:1/818: dwrite d7/d15/d16/f53 [4194304,4194304] 0 2026-03-09T16:15:24.240 INFO:tasks.workunit.client.1.vm05.stdout:7/796: mknod d1/d2/d8/dc/d1b/d30/d4b/d65/db1/d116/c118 0 2026-03-09T16:15:24.242 INFO:tasks.workunit.client.1.vm05.stdout:7/797: write d1/d2/d8/dc/d1b/d30/d7d/f103 [167418,78517] 0 2026-03-09T16:15:24.246 INFO:tasks.workunit.client.1.vm05.stdout:7/798: dwrite d1/d2/d8/dc/d1b/d30/d4b/d65/fe3 [0,4194304] 0 2026-03-09T16:15:24.247 INFO:tasks.workunit.client.1.vm05.stdout:7/799: read - d1/d2/d8/f109 zero size 2026-03-09T16:15:24.258 INFO:tasks.workunit.client.1.vm05.stdout:5/791: rename d8/f13 to d8/d18/d1b/d6b/f113 0 2026-03-09T16:15:24.260 INFO:tasks.workunit.client.1.vm05.stdout:5/792: chown d8/c12 23 1 2026-03-09T16:15:24.269 INFO:tasks.workunit.client.1.vm05.stdout:0/744: truncate d5/d1b/f25 78856 0 2026-03-09T16:15:24.269 INFO:tasks.workunit.client.1.vm05.stdout:6/731: truncate d17/d22/d27/d34/d42/d53/f90 3313522 0 2026-03-09T16:15:24.271 INFO:tasks.workunit.client.1.vm05.stdout:4/839: link d5/de/d15/d21/da0/cd2 d5/de/d15/da9/db1/dad/d90/dd8/c133 0 2026-03-09T16:15:24.292 INFO:tasks.workunit.client.1.vm05.stdout:9/779: write d4/d10/f8a [5304668,71916] 0 2026-03-09T16:15:24.306 INFO:tasks.workunit.client.1.vm05.stdout:3/695: mknod d0/d9/d22/d5f/d7b/da8/cef 0 2026-03-09T16:15:24.313 INFO:tasks.workunit.client.1.vm05.stdout:2/687: creat db/dd/d15/d4c/fe2 x:0 0 0 2026-03-09T16:15:24.323 INFO:tasks.workunit.client.1.vm05.stdout:2/688: sync 2026-03-09T16:15:24.337 INFO:tasks.workunit.client.1.vm05.stdout:1/819: creat d7/dd/de/d52/df6/dfc/f11b x:0 0 0 2026-03-09T16:15:24.369 INFO:tasks.workunit.client.1.vm05.stdout:0/745: rmdir d5/d9e 39 2026-03-09T16:15:24.374 INFO:tasks.workunit.client.1.vm05.stdout:5/793: dwrite d8/d18/d1b/d47/d48/d73/d80/fe5 [0,4194304] 0 2026-03-09T16:15:24.376 INFO:tasks.workunit.client.1.vm05.stdout:8/782: mkdir d4/d6/db/dc/d106 0 2026-03-09T16:15:24.377 INFO:tasks.workunit.client.1.vm05.stdout:5/794: write d8/d53/ffc [99229,33502] 0 2026-03-09T16:15:24.385 INFO:tasks.workunit.client.1.vm05.stdout:2/689: rmdir db/dd/d15/d1f/d21 39 2026-03-09T16:15:24.385 INFO:tasks.workunit.client.1.vm05.stdout:1/820: mkdir d7/d15/d6e/dbc/dd6/d11c 0 2026-03-09T16:15:24.385 INFO:tasks.workunit.client.1.vm05.stdout:6/732: mknod d17/d22/d27/df8/c111 0 2026-03-09T16:15:24.386 INFO:tasks.workunit.client.1.vm05.stdout:4/840: link d5/de/d15/d21/d27/fc8 d5/de/d15/da9/db1/dad/d37/f134 0 
2026-03-09T16:15:24.389 INFO:tasks.workunit.client.1.vm05.stdout:3/696: mkdir d0/d9/d22/df0 0 2026-03-09T16:15:24.390 INFO:tasks.workunit.client.1.vm05.stdout:5/795: truncate d8/d18/d1b/d47/d48/f61 892233 0 2026-03-09T16:15:24.391 INFO:tasks.workunit.client.1.vm05.stdout:6/733: mkdir d17/d22/d27/df8/d112 0 2026-03-09T16:15:24.394 INFO:tasks.workunit.client.1.vm05.stdout:7/800: getdents d1/d2/d8/dc/d1b/d30/d7d/d114 0 2026-03-09T16:15:24.395 INFO:tasks.workunit.client.1.vm05.stdout:1/821: rename d7/dd/de/d52/df6/f100 to d7/dd/de/d52/df6/db4/f11d 0 2026-03-09T16:15:24.398 INFO:tasks.workunit.client.1.vm05.stdout:5/796: dwrite d8/d18/dbc/dcc/daa/f3c [0,4194304] 0 2026-03-09T16:15:24.401 INFO:tasks.workunit.client.1.vm05.stdout:7/801: dwrite d1/d2/d8/dc/d14/f41 [0,4194304] 0 2026-03-09T16:15:24.402 INFO:tasks.workunit.client.1.vm05.stdout:7/802: dread - d1/d2/d8/d31/ff7 zero size 2026-03-09T16:15:24.415 INFO:tasks.workunit.client.1.vm05.stdout:9/780: write d4/d10/d35/fc3 [1069936,24730] 0 2026-03-09T16:15:24.430 INFO:tasks.workunit.client.1.vm05.stdout:2/690: rmdir db/dd/d15/d1f/d20/d23 39 2026-03-09T16:15:24.431 INFO:tasks.workunit.client.1.vm05.stdout:2/691: chown db/dd/d15/d3f/d5b/d60/d95/dd7 1554532 1 2026-03-09T16:15:24.433 INFO:tasks.workunit.client.1.vm05.stdout:8/783: rename d4/d6/d3a/d40/d6a/l8f to d4/d6/d3a/d40/d71/l107 0 2026-03-09T16:15:24.434 INFO:tasks.workunit.client.1.vm05.stdout:4/841: dwrite d5/de/d15/f52 [0,4194304] 0 2026-03-09T16:15:24.435 INFO:tasks.workunit.client.1.vm05.stdout:7/803: creat d1/d2/d11/d86/d8a/f119 x:0 0 0 2026-03-09T16:15:24.445 INFO:tasks.workunit.client.1.vm05.stdout:1/822: creat d7/dbe/dca/f11e x:0 0 0 2026-03-09T16:15:24.446 INFO:tasks.workunit.client.1.vm05.stdout:9/781: unlink d4/d10/d35/d2b/d31/fa8 0 2026-03-09T16:15:24.446 INFO:tasks.workunit.client.1.vm05.stdout:0/746: getdents d5/db/d5b 0 2026-03-09T16:15:24.448 INFO:tasks.workunit.client.1.vm05.stdout:9/782: fsync d4/d10/d35/d36/d48/d60/dae/fc9 0 2026-03-09T16:15:24.448 INFO:tasks.workunit.client.1.vm05.stdout:5/797: rename d8/d53/d7e/cb7 to d8/d18/d1b/d47/d4e/d76/d8f/df9/c114 0 2026-03-09T16:15:24.448 INFO:tasks.workunit.client.1.vm05.stdout:7/804: mknod d1/d2/d8/d31/c11a 0 2026-03-09T16:15:24.449 INFO:tasks.workunit.client.1.vm05.stdout:6/734: creat d17/d22/d27/df8/d112/f113 x:0 0 0 2026-03-09T16:15:24.449 INFO:tasks.workunit.client.1.vm05.stdout:1/823: read d7/dd/d21/d63/d71/ddc/df8/fc5 [2388759,83181] 0 2026-03-09T16:15:24.450 INFO:tasks.workunit.client.1.vm05.stdout:8/784: mknod d4/d6/db/dc/d5d/da0/dd7/dd8/c108 0 2026-03-09T16:15:24.456 INFO:tasks.workunit.client.1.vm05.stdout:0/747: chown d5/db/d77/fb4 56 1 2026-03-09T16:15:24.456 INFO:tasks.workunit.client.1.vm05.stdout:2/692: mknod db/dd/d15/d1f/d21/ce3 0 2026-03-09T16:15:24.456 INFO:tasks.workunit.client.1.vm05.stdout:8/785: mknod d4/d6/db/dc/d3b/c109 0 2026-03-09T16:15:24.456 INFO:tasks.workunit.client.1.vm05.stdout:8/786: chown d4/d6/d3a/d15/f65 10603736 1 2026-03-09T16:15:24.456 INFO:tasks.workunit.client.1.vm05.stdout:3/697: getdents d0/d9/d22/d5f/d7b 0 2026-03-09T16:15:24.459 INFO:tasks.workunit.client.1.vm05.stdout:1/824: getdents d7/d15/d6e/dbc/dd6/d11c 0 2026-03-09T16:15:24.460 INFO:tasks.workunit.client.1.vm05.stdout:1/825: write d7/dd/d21/d39/f86 [810291,59063] 0 2026-03-09T16:15:24.462 INFO:tasks.workunit.client.1.vm05.stdout:0/748: dread d5/d1b/f78 [0,4194304] 0 2026-03-09T16:15:24.463 INFO:tasks.workunit.client.1.vm05.stdout:4/842: sync 2026-03-09T16:15:24.464 INFO:tasks.workunit.client.1.vm05.stdout:9/783: sync 
2026-03-09T16:15:24.465 INFO:tasks.workunit.client.1.vm05.stdout:1/826: sync 2026-03-09T16:15:24.465 INFO:tasks.workunit.client.1.vm05.stdout:5/798: getdents d8/d18/dbc 0 2026-03-09T16:15:24.467 INFO:tasks.workunit.client.1.vm05.stdout:0/749: dread d5/d1b/d3b/dc2/fe1 [0,4194304] 0 2026-03-09T16:15:24.467 INFO:tasks.workunit.client.1.vm05.stdout:1/827: chown d7/dd/d21/d39/d87/fe2 28 1 2026-03-09T16:15:24.468 INFO:tasks.workunit.client.1.vm05.stdout:7/805: getdents d1/d2/d8 0 2026-03-09T16:15:24.470 INFO:tasks.workunit.client.1.vm05.stdout:2/693: creat db/dd/d15/d4c/fe4 x:0 0 0 2026-03-09T16:15:24.471 INFO:tasks.workunit.client.1.vm05.stdout:9/784: creat d4/d10/d35/d2b/d31/d96/f107 x:0 0 0 2026-03-09T16:15:24.472 INFO:tasks.workunit.client.1.vm05.stdout:5/799: mkdir d8/d5e/d8e/d115 0 2026-03-09T16:15:24.473 INFO:tasks.workunit.client.1.vm05.stdout:4/843: dwrite d5/de/d15/da9/db1/dad/d37/d60/dbf/fc7 [4194304,4194304] 0 2026-03-09T16:15:24.482 INFO:tasks.workunit.client.1.vm05.stdout:6/735: dwrite d17/f31 [0,4194304] 0 2026-03-09T16:15:24.482 INFO:tasks.workunit.client.1.vm05.stdout:8/787: truncate d4/d6/d53/f7f 329449 0 2026-03-09T16:15:24.495 INFO:tasks.workunit.client.1.vm05.stdout:3/698: link d0/d9/d8b/f91 d0/d9/d22/d5f/d75/d76/d88/da3/ff1 0 2026-03-09T16:15:24.497 INFO:tasks.workunit.client.1.vm05.stdout:6/736: chown d17/d4f/fbd 125473 1 2026-03-09T16:15:24.501 INFO:tasks.workunit.client.1.vm05.stdout:5/800: mknod d8/d18/d1b/d47/d48/d73/dfb/c116 0 2026-03-09T16:15:24.511 INFO:tasks.workunit.client.1.vm05.stdout:8/788: truncate d4/d6/db/dc/f2a 2210556 0 2026-03-09T16:15:24.512 INFO:tasks.workunit.client.1.vm05.stdout:0/750: dwrite d5/d11/f8a [0,4194304] 0 2026-03-09T16:15:24.512 INFO:tasks.workunit.client.1.vm05.stdout:1/828: creat d7/d15/d6e/dbc/dd6/d11c/f11f x:0 0 0 2026-03-09T16:15:24.512 INFO:tasks.workunit.client.1.vm05.stdout:6/737: symlink d17/d22/d27/d34/dd1/l114 0 2026-03-09T16:15:24.513 INFO:tasks.workunit.client.1.vm05.stdout:3/699: rename d0/d9/d22/d5f/d75/d76/d88/d89/f9e to d0/d33/ff2 0 2026-03-09T16:15:24.513 INFO:tasks.workunit.client.1.vm05.stdout:9/785: unlink d4/d10/d35/d36/c9c 0 2026-03-09T16:15:24.515 INFO:tasks.workunit.client.1.vm05.stdout:1/829: dread d7/dd/d21/d39/d48/d5d/f98 [0,4194304] 0 2026-03-09T16:15:24.516 INFO:tasks.workunit.client.1.vm05.stdout:8/789: dread d4/d6/d3a/d15/f93 [0,4194304] 0 2026-03-09T16:15:24.519 INFO:tasks.workunit.client.1.vm05.stdout:8/790: read d4/d6/f5f [4100630,56170] 0 2026-03-09T16:15:24.520 INFO:tasks.workunit.client.1.vm05.stdout:6/738: mknod d17/d22/d27/dd8/c115 0 2026-03-09T16:15:24.522 INFO:tasks.workunit.client.1.vm05.stdout:6/739: dwrite d17/d22/d9d/fe8 [0,4194304] 0 2026-03-09T16:15:24.524 INFO:tasks.workunit.client.1.vm05.stdout:5/801: truncate d8/d18/dbc/dcc/f94 845034 0 2026-03-09T16:15:24.524 INFO:tasks.workunit.client.1.vm05.stdout:5/802: stat d8 0 2026-03-09T16:15:24.531 INFO:tasks.workunit.client.1.vm05.stdout:9/786: mknod d4/d10/c108 0 2026-03-09T16:15:24.531 INFO:tasks.workunit.client.1.vm05.stdout:3/700: creat d0/d9/d97/dbc/ff3 x:0 0 0 2026-03-09T16:15:24.531 INFO:tasks.workunit.client.1.vm05.stdout:4/844: creat d5/de/d82/dc1/dec/f135 x:0 0 0 2026-03-09T16:15:24.532 INFO:tasks.workunit.client.1.vm05.stdout:9/787: write d4/f6 [3201484,25191] 0 2026-03-09T16:15:24.538 INFO:tasks.workunit.client.1.vm05.stdout:4/845: dwrite d5/de/f23 [0,4194304] 0 2026-03-09T16:15:24.540 INFO:tasks.workunit.client.1.vm05.stdout:1/830: symlink d7/d62/db6/l120 0 2026-03-09T16:15:24.540 INFO:tasks.workunit.client.1.vm05.stdout:6/740: creat 
d17/d22/dce/f116 x:0 0 0 2026-03-09T16:15:24.548 INFO:tasks.workunit.client.1.vm05.stdout:3/701: symlink d0/d9/d22/d5f/d75/d76/d88/d89/lf4 0 2026-03-09T16:15:24.552 INFO:tasks.workunit.client.1.vm05.stdout:7/806: write d1/d2/d8/dc/d9c/f6b [2063566,19963] 0 2026-03-09T16:15:24.556 INFO:tasks.workunit.client.1.vm05.stdout:9/788: mknod d4/d10/d35/d2b/d38/d65/c109 0 2026-03-09T16:15:24.556 INFO:tasks.workunit.client.1.vm05.stdout:9/789: chown d4/f20 4 1 2026-03-09T16:15:24.557 INFO:tasks.workunit.client.1.vm05.stdout:9/790: chown d4/f4a 0 1 2026-03-09T16:15:24.564 INFO:tasks.workunit.client.1.vm05.stdout:6/741: mkdir d17/d22/d27/d34/d42/d65/d117 0 2026-03-09T16:15:24.564 INFO:tasks.workunit.client.1.vm05.stdout:1/831: dread d7/dd/de/d52/fd7 [0,4194304] 0 2026-03-09T16:15:24.564 INFO:tasks.workunit.client.1.vm05.stdout:6/742: stat d17/f60 0 2026-03-09T16:15:24.565 INFO:tasks.workunit.client.1.vm05.stdout:0/751: rename d5/db/d5f/da3/ce6 to d5/db/d48/dc3/c101 0 2026-03-09T16:15:24.567 INFO:tasks.workunit.client.1.vm05.stdout:2/694: dwrite db/dd/d15/d1f/d20/f3d [0,4194304] 0 2026-03-09T16:15:24.572 INFO:tasks.workunit.client.1.vm05.stdout:4/846: link d5/de/d2f/d8a/fb0 d5/de/d15/da9/f136 0 2026-03-09T16:15:24.573 INFO:tasks.workunit.client.1.vm05.stdout:9/791: truncate d4/d10/d35/d36/d48/f68 728196 0 2026-03-09T16:15:24.574 INFO:tasks.workunit.client.1.vm05.stdout:6/743: symlink d17/d22/d9d/db4/l118 0 2026-03-09T16:15:24.575 INFO:tasks.workunit.client.1.vm05.stdout:9/792: dread - d4/d10/f75 zero size 2026-03-09T16:15:24.577 INFO:tasks.workunit.client.1.vm05.stdout:9/793: readlink d4/d10/d35/d2b/d38/d65/dd6/de3/l5f 0 2026-03-09T16:15:24.580 INFO:tasks.workunit.client.1.vm05.stdout:3/702: symlink d0/d9/d22/df0/lf5 0 2026-03-09T16:15:24.580 INFO:tasks.workunit.client.1.vm05.stdout:4/847: fdatasync d5/d9c/f109 0 2026-03-09T16:15:24.594 INFO:tasks.workunit.client.1.vm05.stdout:5/803: dread d8/d53/d7a/f92 [0,4194304] 0 2026-03-09T16:15:24.594 INFO:tasks.workunit.client.1.vm05.stdout:2/695: creat db/dd/d15/d46/d8d/fe5 x:0 0 0 2026-03-09T16:15:24.594 INFO:tasks.workunit.client.1.vm05.stdout:6/744: rmdir d17/d22/d27 39 2026-03-09T16:15:24.597 INFO:tasks.workunit.client.1.vm05.stdout:9/794: creat d4/d10/d35/d2b/d38/d65/dd6/f10a x:0 0 0 2026-03-09T16:15:24.598 INFO:tasks.workunit.client.1.vm05.stdout:0/752: dread d5/d11/f40 [0,4194304] 0 2026-03-09T16:15:24.600 INFO:tasks.workunit.client.1.vm05.stdout:0/753: write d5/d97/fe2 [302387,15319] 0 2026-03-09T16:15:24.610 INFO:tasks.workunit.client.1.vm05.stdout:4/848: creat d5/de/d15/d21/d27/d3c/d5c/da2/f137 x:0 0 0 2026-03-09T16:15:24.612 INFO:tasks.workunit.client.1.vm05.stdout:1/832: link d7/dd/d21/d2d/f108 d7/dbe/ded/f121 0 2026-03-09T16:15:24.613 INFO:tasks.workunit.client.1.vm05.stdout:5/804: sync 2026-03-09T16:15:24.613 INFO:tasks.workunit.client.1.vm05.stdout:7/807: dread d1/d2/d8/dc/d33/f9f [0,4194304] 0 2026-03-09T16:15:24.614 INFO:tasks.workunit.client.1.vm05.stdout:5/805: write d8/d18/dbc/dcc/daa/f3c [2989299,81677] 0 2026-03-09T16:15:24.614 INFO:tasks.workunit.client.1.vm05.stdout:0/754: fsync d5/d2c/dff/f52 0 2026-03-09T16:15:24.616 INFO:tasks.workunit.client.1.vm05.stdout:8/791: dwrite d4/d6/db/d59/f60 [0,4194304] 0 2026-03-09T16:15:24.620 INFO:tasks.workunit.client.1.vm05.stdout:2/696: fdatasync db/dd/d15/d1f/f49 0 2026-03-09T16:15:24.622 INFO:tasks.workunit.client.1.vm05.stdout:5/806: chown d8/c23 195 1 2026-03-09T16:15:24.623 INFO:tasks.workunit.client.1.vm05.stdout:6/745: dread d17/d22/d27/d8a/fa1 [0,4194304] 0 2026-03-09T16:15:24.628 
INFO:tasks.workunit.client.1.vm05.stdout:3/703: dread d0/d33/ff2 [0,4194304] 0 2026-03-09T16:15:24.635 INFO:tasks.workunit.client.1.vm05.stdout:4/849: rmdir d5/de/d15/da9/db1/dad/d37/dfb 39 2026-03-09T16:15:24.635 INFO:tasks.workunit.client.1.vm05.stdout:7/808: rmdir d1/d2/d8/dc/dd4/da8 39 2026-03-09T16:15:24.642 INFO:tasks.workunit.client.1.vm05.stdout:4/850: chown d5/de/d2f/f10d 0 1 2026-03-09T16:15:24.642 INFO:tasks.workunit.client.1.vm05.stdout:2/697: dwrite db/dd/d15/d3f/d5b/d60/d6a/fc8 [0,4194304] 0 2026-03-09T16:15:24.642 INFO:tasks.workunit.client.1.vm05.stdout:9/795: write d4/d10/d35/d2b/d38/f5e [1609112,44243] 0 2026-03-09T16:15:24.652 INFO:tasks.workunit.client.1.vm05.stdout:5/807: fsync d8/d18/d1b/d6b/f113 0 2026-03-09T16:15:24.653 INFO:tasks.workunit.client.1.vm05.stdout:9/796: dwrite d4/f66 [4194304,4194304] 0 2026-03-09T16:15:24.656 INFO:tasks.workunit.client.1.vm05.stdout:9/797: dread - d4/d10/d35/d36/d48/d60/dae/fc9 zero size 2026-03-09T16:15:24.663 INFO:tasks.workunit.client.1.vm05.stdout:9/798: write d4/d10/d35/d2b/f2c [2820074,7061] 0 2026-03-09T16:15:24.664 INFO:tasks.workunit.client.1.vm05.stdout:1/833: rename d7/dd/d21/d39/d48/da7/db5 to d7/dd/de/d52/df6/d55/df9/d122 0 2026-03-09T16:15:24.665 INFO:tasks.workunit.client.1.vm05.stdout:1/834: write d7/dd/de/f3e [2191438,84028] 0 2026-03-09T16:15:24.673 INFO:tasks.workunit.client.1.vm05.stdout:6/746: dread d17/d22/d27/d34/d42/d53/f90 [0,4194304] 0 2026-03-09T16:15:24.675 INFO:tasks.workunit.client.1.vm05.stdout:8/792: dread d4/d6/db/dc/f41 [0,4194304] 0 2026-03-09T16:15:24.676 INFO:tasks.workunit.client.1.vm05.stdout:8/793: write d4/fca [56455,11865] 0 2026-03-09T16:15:24.678 INFO:tasks.workunit.client.1.vm05.stdout:3/704: chown d0/d9/f4d 12427853 1 2026-03-09T16:15:24.678 INFO:tasks.workunit.client.1.vm05.stdout:2/698: mknod db/dd/d15/d3f/d5b/d60/d95/ddd/ce6 0 2026-03-09T16:15:24.680 INFO:tasks.workunit.client.1.vm05.stdout:2/699: dread - db/dd/d15/d3f/d5b/d60/da2/fde zero size 2026-03-09T16:15:24.681 INFO:tasks.workunit.client.1.vm05.stdout:4/851: mknod d5/de/d15/d21/d39/c138 0 2026-03-09T16:15:24.682 INFO:tasks.workunit.client.1.vm05.stdout:9/799: creat d4/d10/d35/d2b/d31/dc8/f10b x:0 0 0 2026-03-09T16:15:24.684 INFO:tasks.workunit.client.1.vm05.stdout:9/800: truncate d4/d10/d35/d2b/d38/ff2 812888 0 2026-03-09T16:15:24.695 INFO:tasks.workunit.client.1.vm05.stdout:6/747: mkdir d17/d5d/d73/d83/d119 0 2026-03-09T16:15:24.695 INFO:tasks.workunit.client.1.vm05.stdout:0/755: link d5/d2c/dd2/ldb d5/db/d5f/da3/l102 0 2026-03-09T16:15:24.695 INFO:tasks.workunit.client.1.vm05.stdout:3/705: rmdir d0/dce 39 2026-03-09T16:15:24.696 INFO:tasks.workunit.client.1.vm05.stdout:8/794: fdatasync d4/d6/d3a/d3c/f8d 0 2026-03-09T16:15:24.696 INFO:tasks.workunit.client.1.vm05.stdout:2/700: mkdir db/dd/d15/d3f/d5b/d60/d95/de7 0 2026-03-09T16:15:24.697 INFO:tasks.workunit.client.1.vm05.stdout:8/795: write d4/d6/db/dc/d5d/d79/fe8 [982731,22873] 0 2026-03-09T16:15:24.698 INFO:tasks.workunit.client.1.vm05.stdout:2/701: dread - db/dd/d15/d3f/d5b/d60/d95/ddd/fdc zero size 2026-03-09T16:15:24.701 INFO:tasks.workunit.client.1.vm05.stdout:0/756: read d5/db/d5b/fcd [87379,13361] 0 2026-03-09T16:15:24.703 INFO:tasks.workunit.client.1.vm05.stdout:4/852: dread d5/de/d15/da9/db1/dad/d37/f134 [0,4194304] 0 2026-03-09T16:15:24.705 INFO:tasks.workunit.client.1.vm05.stdout:2/702: sync 2026-03-09T16:15:24.712 INFO:tasks.workunit.client.1.vm05.stdout:5/808: creat d8/d18/f117 x:0 0 0 2026-03-09T16:15:24.714 INFO:tasks.workunit.client.1.vm05.stdout:5/809: sync 
2026-03-09T16:15:24.714 INFO:tasks.workunit.client.1.vm05.stdout:5/810: fdatasync d8/d18/d1b/d47/d48/d73/d80/fe5 0 2026-03-09T16:15:24.718 INFO:tasks.workunit.client.1.vm05.stdout:4/853: chown d5/de/d2f/d8a/le6 827551 1 2026-03-09T16:15:24.718 INFO:tasks.workunit.client.1.vm05.stdout:7/809: write d1/d2/d8/dc/d1b/d30/d4b/d65/f113 [665980,28891] 0 2026-03-09T16:15:24.718 INFO:tasks.workunit.client.1.vm05.stdout:9/801: write d4/d10/d35/d2b/fa1 [4882270,2981] 0 2026-03-09T16:15:24.718 INFO:tasks.workunit.client.1.vm05.stdout:1/835: write d7/d15/d16/f74 [2236436,114738] 0 2026-03-09T16:15:24.718 INFO:tasks.workunit.client.1.vm05.stdout:4/854: stat d5/d9c 0 2026-03-09T16:15:24.719 INFO:tasks.workunit.client.1.vm05.stdout:3/706: write d0/d9/d8b/fc7 [826603,83573] 0 2026-03-09T16:15:24.729 INFO:tasks.workunit.client.1.vm05.stdout:9/802: dread - d4/d10/d35/d2b/d31/dc8/f10b zero size 2026-03-09T16:15:24.729 INFO:tasks.workunit.client.1.vm05.stdout:2/703: fsync db/dd/d15/f6f 0 2026-03-09T16:15:24.731 INFO:tasks.workunit.client.1.vm05.stdout:4/855: dread d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d7b/f9f [0,4194304] 0 2026-03-09T16:15:24.734 INFO:tasks.workunit.client.1.vm05.stdout:2/704: chown db/dd/d15/d3f/d5b/d60/da2/fa9 109748313 1 2026-03-09T16:15:24.734 INFO:tasks.workunit.client.1.vm05.stdout:3/707: truncate d0/d9/d97/fd4 528064 0 2026-03-09T16:15:24.734 INFO:tasks.workunit.client.1.vm05.stdout:0/757: mkdir d5/db/def/df2/d103 0 2026-03-09T16:15:24.734 INFO:tasks.workunit.client.1.vm05.stdout:5/811: creat d8/d18/d1b/d47/d48/f118 x:0 0 0 2026-03-09T16:15:24.735 INFO:tasks.workunit.client.1.vm05.stdout:8/796: write d4/d6/db/dc/f41 [3694935,127721] 0 2026-03-09T16:15:24.736 INFO:tasks.workunit.client.1.vm05.stdout:8/797: dread - d4/d6/db/d9b/f104 zero size 2026-03-09T16:15:24.736 INFO:tasks.workunit.client.1.vm05.stdout:6/748: fsync d17/d22/d27/d44/f86 0 2026-03-09T16:15:24.737 INFO:tasks.workunit.client.1.vm05.stdout:2/705: dread - db/dd/d15/d3f/d5b/d60/d95/ddd/fdc zero size 2026-03-09T16:15:24.747 INFO:tasks.workunit.client.1.vm05.stdout:8/798: dwrite d4/d6/db/df/fdc [0,4194304] 0 2026-03-09T16:15:24.802 INFO:tasks.workunit.client.1.vm05.stdout:7/810: creat d1/d2/d8/dc/d1b/d30/d4b/f11b x:0 0 0 2026-03-09T16:15:24.812 INFO:tasks.workunit.client.1.vm05.stdout:6/749: creat d17/d22/d27/d34/d42/f11a x:0 0 0 2026-03-09T16:15:24.812 INFO:tasks.workunit.client.1.vm05.stdout:2/706: rmdir db/dd/d15/d4c 39 2026-03-09T16:15:24.814 INFO:tasks.workunit.client.1.vm05.stdout:7/811: dread d1/d2/d8/dc/d1b/d30/d7d/f103 [0,4194304] 0 2026-03-09T16:15:24.815 INFO:tasks.workunit.client.1.vm05.stdout:8/799: dread - d4/d6/db/dc/d5d/da0/fa1 zero size 2026-03-09T16:15:24.818 INFO:tasks.workunit.client.1.vm05.stdout:2/707: write db/dd/f1b [4598449,98520] 0 2026-03-09T16:15:24.822 INFO:tasks.workunit.client.1.vm05.stdout:2/708: dread - db/dd/d98/fcc zero size 2026-03-09T16:15:24.822 INFO:tasks.workunit.client.1.vm05.stdout:5/812: dread d8/f6f [0,4194304] 0 2026-03-09T16:15:24.825 INFO:tasks.workunit.client.1.vm05.stdout:3/708: getdents d0/d9/d22/d5f/d7b 0 2026-03-09T16:15:24.825 INFO:tasks.workunit.client.1.vm05.stdout:7/812: creat d1/d2/d8/dc/dd4/da8/f11c x:0 0 0 2026-03-09T16:15:24.825 INFO:tasks.workunit.client.1.vm05.stdout:6/750: creat d17/d22/d27/d34/f11b x:0 0 0 2026-03-09T16:15:24.827 INFO:tasks.workunit.client.1.vm05.stdout:2/709: chown db/dd/d15/f90 3410 1 2026-03-09T16:15:24.827 INFO:tasks.workunit.client.1.vm05.stdout:5/813: fdatasync d8/d59/f5c 0 2026-03-09T16:15:24.828 INFO:tasks.workunit.client.1.vm05.stdout:1/836: 
write d7/dd/de/d52/df6/f65 [661625,56527] 0 2026-03-09T16:15:24.828 INFO:tasks.workunit.client.1.vm05.stdout:9/803: write d4/d10/d35/d2b/d38/f62 [4597205,126360] 0 2026-03-09T16:15:24.829 INFO:tasks.workunit.client.1.vm05.stdout:0/758: write d5/d1b/f6a [3932466,128181] 0 2026-03-09T16:15:24.829 INFO:tasks.workunit.client.1.vm05.stdout:3/709: write d0/d9/d8b/fc7 [1799361,77467] 0 2026-03-09T16:15:24.835 INFO:tasks.workunit.client.1.vm05.stdout:6/751: creat d17/d22/d27/d34/d42/d53/f11c x:0 0 0 2026-03-09T16:15:24.837 INFO:tasks.workunit.client.1.vm05.stdout:3/710: creat d0/d9/d8b/ff6 x:0 0 0 2026-03-09T16:15:24.838 INFO:tasks.workunit.client.1.vm05.stdout:2/710: rename db/dd/d15/d3f/d5b/d60/d95/c57 to db/dd/d15/d46/d8d/ce8 0 2026-03-09T16:15:24.838 INFO:tasks.workunit.client.1.vm05.stdout:8/800: dwrite d4/d6/db/df/f18 [0,4194304] 0 2026-03-09T16:15:24.843 INFO:tasks.workunit.client.1.vm05.stdout:1/837: readlink d7/d15/d45/l89 0 2026-03-09T16:15:24.845 INFO:tasks.workunit.client.1.vm05.stdout:4/856: dwrite d5/de/f16 [0,4194304] 0 2026-03-09T16:15:24.852 INFO:tasks.workunit.client.1.vm05.stdout:2/711: chown db/dd/d15/d3f/d5b/d60/f7c 174949 1 2026-03-09T16:15:24.852 INFO:tasks.workunit.client.1.vm05.stdout:7/813: mkdir d1/d2/d8/dc/d1b/d11d 0 2026-03-09T16:15:24.852 INFO:tasks.workunit.client.1.vm05.stdout:6/752: mkdir d17/d22/dce/d11d 0 2026-03-09T16:15:24.855 INFO:tasks.workunit.client.1.vm05.stdout:6/753: chown d17/d22/d27/d34/f66 301144 1 2026-03-09T16:15:24.857 INFO:tasks.workunit.client.1.vm05.stdout:4/857: dread - d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/f86 zero size 2026-03-09T16:15:24.859 INFO:tasks.workunit.client.1.vm05.stdout:1/838: mkdir d7/d15/d16/d123 0 2026-03-09T16:15:24.860 INFO:tasks.workunit.client.1.vm05.stdout:0/759: creat d5/db/d5f/deb/f104 x:0 0 0 2026-03-09T16:15:24.860 INFO:tasks.workunit.client.1.vm05.stdout:5/814: write d8/d18/dbc/dcc/daa/fb1 [1596416,8727] 0 2026-03-09T16:15:24.860 INFO:tasks.workunit.client.1.vm05.stdout:3/711: rename d0/d9/d22/d4c to d0/d9/d22/d5f/d75/d76/d88/da3/df7 0 2026-03-09T16:15:24.860 INFO:tasks.workunit.client.1.vm05.stdout:9/804: dwrite d4/d10/d35/d36/f49 [0,4194304] 0 2026-03-09T16:15:24.864 INFO:tasks.workunit.client.1.vm05.stdout:5/815: read d8/f11 [2035111,50187] 0 2026-03-09T16:15:24.866 INFO:tasks.workunit.client.1.vm05.stdout:2/712: stat db/dd/d15/d1f/la3 0 2026-03-09T16:15:24.867 INFO:tasks.workunit.client.1.vm05.stdout:8/801: rmdir d4/d6/d9a 39 2026-03-09T16:15:24.867 INFO:tasks.workunit.client.1.vm05.stdout:8/802: chown d4/d6/d3a/d15/f66 231220537 1 2026-03-09T16:15:24.875 INFO:tasks.workunit.client.1.vm05.stdout:7/814: symlink d1/d2/d8/dc/d1b/d30/d4b/d65/l11e 0 2026-03-09T16:15:24.880 INFO:tasks.workunit.client.1.vm05.stdout:6/754: dread d17/d22/d9d/f7c [0,4194304] 0 2026-03-09T16:15:24.883 INFO:tasks.workunit.client.1.vm05.stdout:0/760: creat d5/d11/d4f/d68/f105 x:0 0 0 2026-03-09T16:15:24.883 INFO:tasks.workunit.client.1.vm05.stdout:5/816: mkdir d8/d18/dbc/dcc/daa/d43/d119 0 2026-03-09T16:15:24.888 INFO:tasks.workunit.client.1.vm05.stdout:9/805: dread d4/d10/d35/d36/fb3 [0,4194304] 0 2026-03-09T16:15:24.889 INFO:tasks.workunit.client.1.vm05.stdout:3/712: dread d0/d9/d22/d5f/d75/d76/fa5 [0,4194304] 0 2026-03-09T16:15:24.892 INFO:tasks.workunit.client.1.vm05.stdout:8/803: mkdir d4/d6/db/dc/d5d/da0/dd7/d10a 0 2026-03-09T16:15:24.892 INFO:tasks.workunit.client.1.vm05.stdout:7/815: rename d1/d2/d8/d31/d8d/ded to d1/d2/d8/dc/d1b/d71/d3c/d11f 0 2026-03-09T16:15:24.892 INFO:tasks.workunit.client.1.vm05.stdout:8/804: stat d4/d6/d53/c6b 0 
2026-03-09T16:15:24.894 INFO:tasks.workunit.client.1.vm05.stdout:0/761: creat d5/d11/f106 x:0 0 0 2026-03-09T16:15:24.894 INFO:tasks.workunit.client.1.vm05.stdout:8/805: chown d4/d6/d3a/d3c/l68 248468144 1 2026-03-09T16:15:24.896 INFO:tasks.workunit.client.1.vm05.stdout:5/817: unlink d8/d18/dbc/f111 0 2026-03-09T16:15:24.896 INFO:tasks.workunit.client.1.vm05.stdout:5/818: fsync d8/d18/d1b/f28 0 2026-03-09T16:15:24.897 INFO:tasks.workunit.client.1.vm05.stdout:4/858: mknod d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d108/c139 0 2026-03-09T16:15:24.897 INFO:tasks.workunit.client.1.vm05.stdout:3/713: symlink d0/d9/d97/dac/lf8 0 2026-03-09T16:15:24.898 INFO:tasks.workunit.client.1.vm05.stdout:0/762: write d5/d1b/fce [230863,108096] 0 2026-03-09T16:15:24.899 INFO:tasks.workunit.client.1.vm05.stdout:4/859: chown d5/de/d15/d21/d27/lfa 12938 1 2026-03-09T16:15:24.899 INFO:tasks.workunit.client.1.vm05.stdout:8/806: chown d4/d6/db/d75/lc5 76637060 1 2026-03-09T16:15:24.899 INFO:tasks.workunit.client.1.vm05.stdout:4/860: stat d5/de/d15/d21/d39/f42 0 2026-03-09T16:15:24.902 INFO:tasks.workunit.client.1.vm05.stdout:6/755: symlink d17/d22/d27/d8a/l11e 0 2026-03-09T16:15:24.906 INFO:tasks.workunit.client.1.vm05.stdout:6/756: stat d17/d22/d9d/da9/c101 0 2026-03-09T16:15:24.907 INFO:tasks.workunit.client.1.vm05.stdout:6/757: fsync d17/d22/d9d/fe8 0 2026-03-09T16:15:24.907 INFO:tasks.workunit.client.1.vm05.stdout:7/816: rename d1/d2/d8/dc/d33/fcd to d1/d2/d8/dc/d14/f120 0 2026-03-09T16:15:24.907 INFO:tasks.workunit.client.1.vm05.stdout:5/819: truncate d8/d53/d7a/f84 122468 0 2026-03-09T16:15:24.907 INFO:tasks.workunit.client.1.vm05.stdout:9/806: fsync d4/d10/d35/d2b/d31/d96/f9b 0 2026-03-09T16:15:24.908 INFO:tasks.workunit.client.1.vm05.stdout:3/714: creat d0/d9/d22/d5f/d90/ff9 x:0 0 0 2026-03-09T16:15:24.909 INFO:tasks.workunit.client.1.vm05.stdout:0/763: chown d5/db/cb3 85844501 1 2026-03-09T16:15:24.911 INFO:tasks.workunit.client.1.vm05.stdout:4/861: read d5/de/d15/d21/d27/d3c/f92 [3009173,123005] 0 2026-03-09T16:15:24.913 INFO:tasks.workunit.client.1.vm05.stdout:6/758: mknod d17/d22/d27/d34/dd1/d10a/c11f 0 2026-03-09T16:15:24.913 INFO:tasks.workunit.client.1.vm05.stdout:4/862: write d5/de/d15/d21/d27/d3c/d5c/d5f/f112 [412143,76140] 0 2026-03-09T16:15:24.914 INFO:tasks.workunit.client.1.vm05.stdout:6/759: chown d17/d22/d27/d34/dd1/d10a/c110 242064 1 2026-03-09T16:15:24.918 INFO:tasks.workunit.client.1.vm05.stdout:6/760: read - d17/d22/d27/d8a/d8b/f9c zero size 2026-03-09T16:15:24.918 INFO:tasks.workunit.client.1.vm05.stdout:2/713: dread db/dd/d15/d3f/d5b/d60/d6a/f8a [0,4194304] 0 2026-03-09T16:15:24.918 INFO:tasks.workunit.client.1.vm05.stdout:5/820: fdatasync d8/f11 0 2026-03-09T16:15:24.933 INFO:tasks.workunit.client.1.vm05.stdout:0/764: mkdir d5/d1b/d30/d107 0 2026-03-09T16:15:24.934 INFO:tasks.workunit.client.1.vm05.stdout:2/714: dread db/dd/d15/d1f/d21/f47 [0,4194304] 0 2026-03-09T16:15:24.934 INFO:tasks.workunit.client.1.vm05.stdout:3/715: write d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/fe7 [1037973,77693] 0 2026-03-09T16:15:24.937 INFO:tasks.workunit.client.1.vm05.stdout:5/821: dwrite d8/d5e/ff4 [4194304,4194304] 0 2026-03-09T16:15:24.938 INFO:tasks.workunit.client.1.vm05.stdout:5/822: stat d8/d5e/cde 0 2026-03-09T16:15:24.944 INFO:tasks.workunit.client.1.vm05.stdout:2/715: sync 2026-03-09T16:15:24.951 INFO:tasks.workunit.client.1.vm05.stdout:5/823: dread d8/d5e/ff4 [0,4194304] 0 2026-03-09T16:15:24.952 INFO:tasks.workunit.client.1.vm05.stdout:5/824: read - d8/d18/d1b/d47/d4e/fe1 zero size 
2026-03-09T16:15:24.973 INFO:tasks.workunit.client.1.vm05.stdout:1/839: dwrite d7/dd/d21/d44/f46 [4194304,4194304] 0 2026-03-09T16:15:24.974 INFO:tasks.workunit.client.1.vm05.stdout:3/716: dwrite d0/d33/f7d [0,4194304] 0 2026-03-09T16:15:24.985 INFO:tasks.workunit.client.1.vm05.stdout:0/765: creat d5/d2c/d49/d83/d8b/daf/f108 x:0 0 0 2026-03-09T16:15:24.985 INFO:tasks.workunit.client.1.vm05.stdout:8/807: write d4/d6/db/dc/f2a [3022142,52945] 0 2026-03-09T16:15:24.992 INFO:tasks.workunit.client.1.vm05.stdout:7/817: link d1/d2/d8/dc/d1b/d30/d7d/fc1 d1/d2/d8/dc/d1b/d30/d4b/f121 0 2026-03-09T16:15:24.999 INFO:tasks.workunit.client.1.vm05.stdout:1/840: read - d7/dd/d21/d63/d71/ddc/df8/fbb zero size 2026-03-09T16:15:25.001 INFO:tasks.workunit.client.1.vm05.stdout:1/841: chown d7/d62/f69 291037 1 2026-03-09T16:15:25.005 INFO:tasks.workunit.client.1.vm05.stdout:9/807: rename d4/d10/d35/d2b/d31/d96 to d4/d10c 0 2026-03-09T16:15:25.009 INFO:tasks.workunit.client.1.vm05.stdout:8/808: truncate d4/d6/fae 459152 0 2026-03-09T16:15:25.013 INFO:tasks.workunit.client.1.vm05.stdout:7/818: unlink d1/d2/d8/dc/d14/l7e 0 2026-03-09T16:15:25.021 INFO:tasks.workunit.client.1.vm05.stdout:2/716: write db/dd/d15/f90 [174511,6017] 0 2026-03-09T16:15:25.025 INFO:tasks.workunit.client.1.vm05.stdout:2/717: truncate db/fc6 372792 0 2026-03-09T16:15:25.025 INFO:tasks.workunit.client.1.vm05.stdout:0/766: mkdir d5/d109 0 2026-03-09T16:15:25.032 INFO:tasks.workunit.client.1.vm05.stdout:7/819: unlink d1/c4e 0 2026-03-09T16:15:25.037 INFO:tasks.workunit.client.1.vm05.stdout:7/820: dread d1/d2/d8/dc/d1b/d30/d7d/d114/f68 [0,4194304] 0 2026-03-09T16:15:25.039 INFO:tasks.workunit.client.1.vm05.stdout:7/821: fdatasync d1/d2/d8/dc/d1b/d30/d4b/d65/f113 0 2026-03-09T16:15:25.041 INFO:tasks.workunit.client.1.vm05.stdout:9/808: mkdir d4/d10/d35/d36/d10d 0 2026-03-09T16:15:25.041 INFO:tasks.workunit.client.1.vm05.stdout:8/809: dwrite d4/d6/d3a/f28 [0,4194304] 0 2026-03-09T16:15:25.043 INFO:tasks.workunit.client.1.vm05.stdout:9/809: read d4/d10/d35/d36/f49 [3420161,11472] 0 2026-03-09T16:15:25.044 INFO:tasks.workunit.client.1.vm05.stdout:3/717: link d0/d9/c71 d0/d33/cfa 0 2026-03-09T16:15:25.046 INFO:tasks.workunit.client.1.vm05.stdout:9/810: chown d4/d10/d35/d36/d48/f6e 7 1 2026-03-09T16:15:25.052 INFO:tasks.workunit.client.1.vm05.stdout:2/718: dread db/dd/d15/d1f/f2b [0,4194304] 0 2026-03-09T16:15:25.052 INFO:tasks.workunit.client.1.vm05.stdout:3/718: dwrite d0/d9/fe9 [0,4194304] 0 2026-03-09T16:15:25.075 INFO:tasks.workunit.client.1.vm05.stdout:0/767: mkdir d5/d11/d4f/ddc/d10a 0 2026-03-09T16:15:25.105 INFO:tasks.workunit.client.1.vm05.stdout:2/719: symlink db/dd/d15/d3f/d5b/d60/d95/dd7/le9 0 2026-03-09T16:15:25.120 INFO:tasks.workunit.client.1.vm05.stdout:4/863: rename d5/c20 to d5/de/d15/c13a 0 2026-03-09T16:15:25.129 INFO:tasks.workunit.client.1.vm05.stdout:4/864: sync 2026-03-09T16:15:25.139 INFO:tasks.workunit.client.1.vm05.stdout:0/768: rmdir d5/d2c/d49/d83/d8b/daf 39 2026-03-09T16:15:25.141 INFO:tasks.workunit.client.1.vm05.stdout:0/769: write d5/d11/f9f [3010198,78385] 0 2026-03-09T16:15:25.153 INFO:tasks.workunit.client.1.vm05.stdout:3/719: symlink d0/dce/lfb 0 2026-03-09T16:15:25.154 INFO:tasks.workunit.client.1.vm05.stdout:2/720: mkdir db/dd/d15/d3f/d5b/d60/d6a/dea 0 2026-03-09T16:15:25.165 INFO:tasks.workunit.client.1.vm05.stdout:6/761: rename d17/l61 to d17/d22/d27/d8a/d8b/l120 0 2026-03-09T16:15:25.165 INFO:tasks.workunit.client.1.vm05.stdout:6/762: stat d17/d22/d9d/ca3 0 2026-03-09T16:15:25.168 
INFO:tasks.workunit.client.1.vm05.stdout:9/811: dwrite d4/d10/d35/d2b/f9d [0,4194304] 0 2026-03-09T16:15:25.169 INFO:tasks.workunit.client.1.vm05.stdout:9/812: fdatasync d4/d10/d35/d2b/d38/f4b 0 2026-03-09T16:15:25.172 INFO:tasks.workunit.client.1.vm05.stdout:9/813: truncate d4/d10/d35/d36/d48/d60/fad 526530 0 2026-03-09T16:15:25.179 INFO:tasks.workunit.client.1.vm05.stdout:4/865: fsync d5/fd 0 2026-03-09T16:15:25.195 INFO:tasks.workunit.client.1.vm05.stdout:2/721: symlink db/dd/d15/d46/leb 0 2026-03-09T16:15:25.207 INFO:tasks.workunit.client.1.vm05.stdout:7/822: getdents d1/d2/d8/dc/d1b/d30/d7d 0 2026-03-09T16:15:25.208 INFO:tasks.workunit.client.1.vm05.stdout:2/722: rmdir db/dd/d15/d3f 39 2026-03-09T16:15:25.210 INFO:tasks.workunit.client.1.vm05.stdout:0/770: dwrite d5/db/f54 [0,4194304] 0 2026-03-09T16:15:25.245 INFO:tasks.workunit.client.1.vm05.stdout:5/825: rename d8/d5e/d8e/lc9 to d8/d18/d1b/l11a 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:7/823: truncate d1/d2/d8/dc/d1b/d71/f59 2562529 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:2/723: symlink db/dd/d15/lec 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:1/842: rename d7/dd/d21/d63/d71/l82 to d7/dd/d21/d39/d48/d8c/l124 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:7/824: dwrite d1/d2/d8/dc/dd4/ff4 [0,4194304] 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:0/771: symlink d5/d9e/l10b 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:2/724: stat db/dd/d15/d3f/d5b/d60/d95/f80 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:8/810: rename d4/d6/d3a/d40/d6a/d97/lce to d4/d6/db/dc/d5d/da0/dd7/l10b 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:8/811: chown d4/d6/d3a/d15/cfa 136 1 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:1/843: mkdir d7/dd/de/d52/df6/d55/d125 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:1/844: read - d7/dd/de/d52/fe7 zero size 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:3/720: rename d0/f57 to d0/d9/d97/dc2/ffc 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:0/772: dwrite d5/db/d5f/da3/fc6 [0,4194304] 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:7/825: getdents d1 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:0/773: fdatasync d5/d9e/fbd 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:8/812: truncate d4/d6/d3a/f25 2262718 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:2/725: rename db/dd/d15/d3f/d5b/d60/d95/fc9 to db/dd/d15/d46/d8d/dcd/fed 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:8/813: chown d4/f77 249097761 1 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:1/845: unlink d7/d15/d16/dc2/ld1 0 2026-03-09T16:15:25.246 INFO:tasks.workunit.client.1.vm05.stdout:7/826: mknod d1/d2/d8/dc/d1b/d30/d4b/d65/c122 0 2026-03-09T16:15:25.247 INFO:tasks.workunit.client.1.vm05.stdout:3/721: mkdir d0/d9/d22/d5f/dfd 0 2026-03-09T16:15:25.247 INFO:tasks.workunit.client.1.vm05.stdout:1/846: chown d7/dd/d21/d39/d5a/d113 55526 1 2026-03-09T16:15:25.248 INFO:tasks.workunit.client.1.vm05.stdout:8/814: readlink d4/d6/db/dc/d5d/da0/dd7/dd8/ldd 0 2026-03-09T16:15:25.249 INFO:tasks.workunit.client.1.vm05.stdout:8/815: stat d4/d6/fc0 0 2026-03-09T16:15:25.249 INFO:tasks.workunit.client.1.vm05.stdout:2/726: readlink db/dd/d15/d1f/d20/d23/l37 0 2026-03-09T16:15:25.249 
INFO:tasks.workunit.client.1.vm05.stdout:1/847: fdatasync d7/dd/f93 0 2026-03-09T16:15:25.250 INFO:tasks.workunit.client.1.vm05.stdout:1/848: stat d7/dd/de/d52/f7d 0 2026-03-09T16:15:25.251 INFO:tasks.workunit.client.1.vm05.stdout:3/722: rename d0/d9/d22/d5f/d90/ccd to d0/d9/d22/d5f/d7b/d99/cfe 0 2026-03-09T16:15:25.251 INFO:tasks.workunit.client.1.vm05.stdout:2/727: write db/dd/d15/d4c/d56/f62 [1411427,63833] 0 2026-03-09T16:15:25.254 INFO:tasks.workunit.client.1.vm05.stdout:2/728: rmdir db/dd/d15/d4c/d56 39 2026-03-09T16:15:25.485 INFO:tasks.workunit.client.1.vm05.stdout:6/763: sync 2026-03-09T16:15:25.486 INFO:tasks.workunit.client.1.vm05.stdout:6/764: dread - d17/d22/d9d/da5/fd9 zero size 2026-03-09T16:15:25.489 INFO:tasks.workunit.client.1.vm05.stdout:7/827: sync 2026-03-09T16:15:25.494 INFO:tasks.workunit.client.1.vm05.stdout:7/828: mknod d1/d2/d8/dfd/c123 0 2026-03-09T16:15:25.498 INFO:tasks.workunit.client.1.vm05.stdout:9/814: write d4/d10/d35/d36/d48/d54/d59/f5c [3290282,30176] 0 2026-03-09T16:15:25.499 INFO:tasks.workunit.client.1.vm05.stdout:9/815: chown d4/d10/d35/d2b/d38/fa6 884077 1 2026-03-09T16:15:25.501 INFO:tasks.workunit.client.1.vm05.stdout:4/866: dwrite f0 [0,4194304] 0 2026-03-09T16:15:25.512 INFO:tasks.workunit.client.1.vm05.stdout:7/829: dread d1/d2/d8/d31/d8d/f6f [0,4194304] 0 2026-03-09T16:15:25.513 INFO:tasks.workunit.client.1.vm05.stdout:9/816: truncate d4/d10/d35/d36/d48/d60/d94/fd4 88692 0 2026-03-09T16:15:25.515 INFO:tasks.workunit.client.1.vm05.stdout:9/817: write d4/d10/d35/d2b/fa1 [5390966,129454] 0 2026-03-09T16:15:25.521 INFO:tasks.workunit.client.1.vm05.stdout:4/867: mknod d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d7b/c13b 0 2026-03-09T16:15:25.522 INFO:tasks.workunit.client.1.vm05.stdout:9/818: truncate d4/d10c/f9b 429527 0 2026-03-09T16:15:25.524 INFO:tasks.workunit.client.1.vm05.stdout:5/826: dwrite d8/d18/d1b/f2d [0,4194304] 0 2026-03-09T16:15:25.525 INFO:tasks.workunit.client.1.vm05.stdout:5/827: stat d8/d18/dbc/dcc/lcd 0 2026-03-09T16:15:25.530 INFO:tasks.workunit.client.1.vm05.stdout:9/819: rmdir d4/d10 39 2026-03-09T16:15:25.535 INFO:tasks.workunit.client.1.vm05.stdout:0/774: write d5/db/d48/d66/f91 [544994,93066] 0 2026-03-09T16:15:25.536 INFO:tasks.workunit.client.1.vm05.stdout:8/816: write d4/d6/db/dc/d2e/f46 [2459021,112847] 0 2026-03-09T16:15:25.537 INFO:tasks.workunit.client.1.vm05.stdout:1/849: write d7/fc [2282846,74205] 0 2026-03-09T16:15:25.538 INFO:tasks.workunit.client.1.vm05.stdout:1/850: fsync d7/f34 0 2026-03-09T16:15:25.538 INFO:tasks.workunit.client.1.vm05.stdout:1/851: dread - d7/d62/db6/fff zero size 2026-03-09T16:15:25.540 INFO:tasks.workunit.client.1.vm05.stdout:2/729: truncate db/dd/d15/d46/d67/f9a 2372132 0 2026-03-09T16:15:25.541 INFO:tasks.workunit.client.1.vm05.stdout:5/828: mkdir d8/d5e/d11b 0 2026-03-09T16:15:25.544 INFO:tasks.workunit.client.1.vm05.stdout:3/723: truncate d0/d9/d97/dc2/ffc 265303 0 2026-03-09T16:15:25.545 INFO:tasks.workunit.client.1.vm05.stdout:0/775: mkdir d5/d1b/d3b/d10c 0 2026-03-09T16:15:25.553 INFO:tasks.workunit.client.1.vm05.stdout:2/730: rename db/dd/d15/d1f/d21/d87/fbf to db/dd/d15/d1f/d20/fee 0 2026-03-09T16:15:25.557 INFO:tasks.workunit.client.1.vm05.stdout:2/731: unlink db/dd/d15/lae 0 2026-03-09T16:15:25.557 INFO:tasks.workunit.client.1.vm05.stdout:9/820: dwrite d4/d10/d35/d36/fce [0,4194304] 0 2026-03-09T16:15:25.557 INFO:tasks.workunit.client.1.vm05.stdout:9/821: rmdir d4/d10/dd7 39 2026-03-09T16:15:25.560 INFO:tasks.workunit.client.1.vm05.stdout:9/822: mknod d4/d10c/ddd/c10e 0 
2026-03-09T16:15:25.561 INFO:tasks.workunit.client.1.vm05.stdout:9/823: chown d4/d10/d35/d2b/f45 39 1 2026-03-09T16:15:25.561 INFO:tasks.workunit.client.1.vm05.stdout:2/732: link db/dd/d15/d1f/c83 db/dd/d15/d3f/d5b/d60/d95/dd7/cef 0 2026-03-09T16:15:25.567 INFO:tasks.workunit.client.1.vm05.stdout:9/824: mkdir d4/d10/d35/d2b/dc1/dc2/d10f 0 2026-03-09T16:15:25.568 INFO:tasks.workunit.client.1.vm05.stdout:2/733: unlink db/dd/d15/d1f/d20/d23/cb7 0 2026-03-09T16:15:25.573 INFO:tasks.workunit.client.1.vm05.stdout:5/829: dread d8/d1d/f44 [0,4194304] 0 2026-03-09T16:15:25.576 INFO:tasks.workunit.client.1.vm05.stdout:9/825: creat d4/d10/d35/d36/d48/d54/db0/f110 x:0 0 0 2026-03-09T16:15:25.576 INFO:tasks.workunit.client.1.vm05.stdout:2/734: unlink db/dd/d15/d4c/d56/l9e 0 2026-03-09T16:15:25.576 INFO:tasks.workunit.client.1.vm05.stdout:5/830: chown d8/d18/c19 1168 1 2026-03-09T16:15:25.577 INFO:tasks.workunit.client.1.vm05.stdout:9/826: symlink d4/d10/d35/d36/d48/d60/dcb/l111 0 2026-03-09T16:15:25.578 INFO:tasks.workunit.client.1.vm05.stdout:5/831: truncate d8/d5e/f72 933685 0 2026-03-09T16:15:25.578 INFO:tasks.workunit.client.1.vm05.stdout:9/827: fdatasync d4/d10/d35/d2b/d31/dc8/f10b 0 2026-03-09T16:15:25.579 INFO:tasks.workunit.client.1.vm05.stdout:2/735: unlink db/dd/d15/d1f/d21/d87/fd9 0 2026-03-09T16:15:25.592 INFO:tasks.workunit.client.1.vm05.stdout:6/765: dwrite d17/d22/d27/d34/d42/d53/fba [0,4194304] 0 2026-03-09T16:15:25.607 INFO:tasks.workunit.client.1.vm05.stdout:3/724: dread d0/d9/d22/d5f/d7b/f9a [0,4194304] 0 2026-03-09T16:15:25.612 INFO:tasks.workunit.client.1.vm05.stdout:1/852: dread d7/d62/f90 [0,4194304] 0 2026-03-09T16:15:25.612 INFO:tasks.workunit.client.1.vm05.stdout:1/853: chown d7/daa/cb2 10372288 1 2026-03-09T16:15:25.613 INFO:tasks.workunit.client.1.vm05.stdout:1/854: chown d7/dd/d21/d39/d48/lac 895 1 2026-03-09T16:15:25.620 INFO:tasks.workunit.client.1.vm05.stdout:1/855: dread d7/d15/d16/dc2/fc7 [0,4194304] 0 2026-03-09T16:15:25.666 INFO:tasks.workunit.client.1.vm05.stdout:9/828: sync 2026-03-09T16:15:25.667 INFO:tasks.workunit.client.1.vm05.stdout:1/856: sync 2026-03-09T16:15:25.668 INFO:tasks.workunit.client.1.vm05.stdout:7/830: write d1/d2/d8/dc/d33/f9f [3095621,123616] 0 2026-03-09T16:15:25.670 INFO:tasks.workunit.client.1.vm05.stdout:1/857: sync 2026-03-09T16:15:25.672 INFO:tasks.workunit.client.1.vm05.stdout:4/868: dwrite d5/de/d15/da9/db1/dad/d37/d60/f62 [0,4194304] 0 2026-03-09T16:15:25.674 INFO:tasks.workunit.client.1.vm05.stdout:7/831: symlink d1/d2/d8/dfd/l124 0 2026-03-09T16:15:25.682 INFO:tasks.workunit.client.1.vm05.stdout:1/858: truncate d7/dd/d21/d39/fdb 1273203 0 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:8/817: dwrite d4/d6/db/dc/d5d/d79/f91 [0,4194304] 0 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:4/869: rename d5/de/d15/da9/db1/dad/d37/d60/dbf/d7d/c7f to d5/de/d82/dc1/c13c 0 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:8/818: dread d4/d6/db/dc/d5d/d79/f91 [0,4194304] 0 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:1/859: rmdir d7/d15/d45 39 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:8/819: unlink d4/d6/d3a/d15/cfa 0 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:0/776: dwrite d5/d11/d4f/d68/f94 [0,4194304] 0 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:4/870: creat d5/de/d15/d21/d27/d3c/d5c/dfc/f13d x:0 0 0 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:8/820: mkdir d4/de9/d10c 0 
2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:1/860: getdents d7/dd/d21/d39/d48/d5d 0 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:0/777: link d5/db/d5b/d82/fe5 d5/db/d5b/d82/f10d 0 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:4/871: creat d5/d9c/f13e x:0 0 0 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:1/861: creat d7/dd/de/d52/df6/d55/df9/d122/f126 x:0 0 0 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:5/832: dwrite d8/d18/d1b/f31 [4194304,4194304] 0 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:1/862: write d7/dbe/dca/f11e [154529,43103] 0 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:0/778: dwrite d5/d11/f106 [0,4194304] 0 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:5/833: chown d8/d18/c67 102 1 2026-03-09T16:15:25.713 INFO:tasks.workunit.client.1.vm05.stdout:8/821: dwrite d4/d6/db/df/fdc [0,4194304] 0 2026-03-09T16:15:25.720 INFO:tasks.workunit.client.1.vm05.stdout:0/779: chown d5/db/d5b/da5/fbe 81146699 1 2026-03-09T16:15:25.733 INFO:tasks.workunit.client.1.vm05.stdout:5/834: creat d8/d95/f11c x:0 0 0 2026-03-09T16:15:25.733 INFO:tasks.workunit.client.1.vm05.stdout:1/863: mknod d7/dd/d21/d44/c127 0 2026-03-09T16:15:25.734 INFO:tasks.workunit.client.1.vm05.stdout:0/780: creat d5/db/d5b/f10e x:0 0 0 2026-03-09T16:15:25.740 INFO:tasks.workunit.client.1.vm05.stdout:8/822: dwrite d4/d6/d3a/d15/f66 [0,4194304] 0 2026-03-09T16:15:25.755 INFO:tasks.workunit.client.1.vm05.stdout:4/872: dread d5/de/d2f/f99 [0,4194304] 0 2026-03-09T16:15:25.779 INFO:tasks.workunit.client.1.vm05.stdout:5/835: stat d8/d59/d5b/d8b/da0/lec 0 2026-03-09T16:15:25.779 INFO:tasks.workunit.client.1.vm05.stdout:8/823: rmdir d4/d6/d3a/d40/d6a 39 2026-03-09T16:15:25.780 INFO:tasks.workunit.client.1.vm05.stdout:4/873: mknod d5/d116/c13f 0 2026-03-09T16:15:25.780 INFO:tasks.workunit.client.1.vm05.stdout:4/874: dread - d5/de/d15/d21/d27/fcc zero size 2026-03-09T16:15:25.781 INFO:tasks.workunit.client.1.vm05.stdout:4/875: write d5/de/d15/da9/db1/dad/d37/d60/f113 [850549,41595] 0 2026-03-09T16:15:25.782 INFO:tasks.workunit.client.1.vm05.stdout:5/836: rename d8/d5e/d8e to d8/d18/d1b/d47/d4e/d76/d11d 0 2026-03-09T16:15:25.782 INFO:tasks.workunit.client.1.vm05.stdout:8/824: creat d4/d6/d3a/d40/d71/f10d x:0 0 0 2026-03-09T16:15:25.783 INFO:tasks.workunit.client.1.vm05.stdout:5/837: write d8/d18/d1b/d47/d48/d73/fdd [625622,101404] 0 2026-03-09T16:15:25.783 INFO:tasks.workunit.client.1.vm05.stdout:5/838: chown d8/d18/dbc/dcc/ca3 9103108 1 2026-03-09T16:15:25.784 INFO:tasks.workunit.client.1.vm05.stdout:1/864: link d7/dd/d21/d39/d87/db9/cd3 d7/dd/d21/d39/d48/d8c/dd8/c128 0 2026-03-09T16:15:25.784 INFO:tasks.workunit.client.1.vm05.stdout:5/839: dread - d8/d95/f11c zero size 2026-03-09T16:15:25.784 INFO:tasks.workunit.client.1.vm05.stdout:4/876: getdents d5/de/d15/d21/da0/de3/d100/d119 0 2026-03-09T16:15:25.785 INFO:tasks.workunit.client.1.vm05.stdout:4/877: fdatasync d5/de/d15/f52 0 2026-03-09T16:15:25.787 INFO:tasks.workunit.client.1.vm05.stdout:1/865: creat d7/dd/de/d52/df6/db4/f129 x:0 0 0 2026-03-09T16:15:25.789 INFO:tasks.workunit.client.1.vm05.stdout:5/840: unlink d8/d18/d1b/c33 0 2026-03-09T16:15:25.790 INFO:tasks.workunit.client.1.vm05.stdout:8/825: truncate d4/d6/fae 801439 0 2026-03-09T16:15:25.790 INFO:tasks.workunit.client.1.vm05.stdout:5/841: write d8/d18/d1b/d78/d90/fc4 [1003999,126438] 0 2026-03-09T16:15:25.805 INFO:tasks.workunit.client.1.vm05.stdout:4/878: dread 
d5/f3e [0,4194304] 0 2026-03-09T16:15:25.805 INFO:tasks.workunit.client.1.vm05.stdout:5/842: mkdir d8/d5e/d11b/d11e 0 2026-03-09T16:15:25.806 INFO:tasks.workunit.client.1.vm05.stdout:1/866: link d7/dd/d21/d44/d5c/c81 d7/dd/d21/d39/c12a 0 2026-03-09T16:15:25.809 INFO:tasks.workunit.client.1.vm05.stdout:5/843: creat d8/d18/dbc/f11f x:0 0 0 2026-03-09T16:15:25.810 INFO:tasks.workunit.client.1.vm05.stdout:1/867: chown d7/d15/d45/l89 1603 1 2026-03-09T16:15:25.811 INFO:tasks.workunit.client.1.vm05.stdout:1/868: creat d7/dd/d21/d2d/f12b x:0 0 0 2026-03-09T16:15:25.819 INFO:tasks.workunit.client.1.vm05.stdout:1/869: mkdir d7/dd/de/d52/df6/d55/d125/d12c 0 2026-03-09T16:15:25.832 INFO:tasks.workunit.client.1.vm05.stdout:5/844: read f1 [5462461,113670] 0 2026-03-09T16:15:25.833 INFO:tasks.workunit.client.1.vm05.stdout:2/736: write db/dd/d15/d3f/d5b/d60/d6a/f8a [5197537,117955] 0 2026-03-09T16:15:25.840 INFO:tasks.workunit.client.1.vm05.stdout:3/725: dwrite d0/d9/f93 [0,4194304] 0 2026-03-09T16:15:25.841 INFO:tasks.workunit.client.1.vm05.stdout:6/766: write d17/d22/dce/fdf [153531,34102] 0 2026-03-09T16:15:25.844 INFO:tasks.workunit.client.1.vm05.stdout:7/832: write d1/d2/d8/dc/d1b/d30/d7d/fc1 [1175191,107218] 0 2026-03-09T16:15:25.845 INFO:tasks.workunit.client.1.vm05.stdout:8/826: getdents d4/de9 0 2026-03-09T16:15:25.847 INFO:tasks.workunit.client.1.vm05.stdout:5/845: read - d8/d95/f98 zero size 2026-03-09T16:15:25.849 INFO:tasks.workunit.client.1.vm05.stdout:6/767: write d17/d22/d9d/da9/ff0 [562451,24774] 0 2026-03-09T16:15:25.857 INFO:tasks.workunit.client.1.vm05.stdout:9/829: dwrite d4/d10/d35/d2b/d38/d65/dd6/de3/f93 [0,4194304] 0 2026-03-09T16:15:25.877 INFO:tasks.workunit.client.1.vm05.stdout:9/830: sync 2026-03-09T16:15:25.877 INFO:tasks.workunit.client.1.vm05.stdout:2/737: creat db/dd/d15/d46/d67/ff0 x:0 0 0 2026-03-09T16:15:25.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:25 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:25.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:25 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:25.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:25 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:25.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:25 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:25.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:25 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:25.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:25 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:25.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:25 vm03.local ceph-mon[51019]: pgmap v22: 65 pgs: 65 active+clean; 3.0 GiB data, 9.9 GiB used, 110 GiB / 120 GiB avail; 61 MiB/s rd, 185 MiB/s wr, 386 op/s 2026-03-09T16:15:25.935 INFO:tasks.workunit.client.1.vm05.stdout:3/726: fsync d0/d33/f92 0 2026-03-09T16:15:25.935 INFO:tasks.workunit.client.1.vm05.stdout:7/833: dread - d1/d2/d8/dc/d1b/d30/d4b/db2/fd9 zero size 2026-03-09T16:15:25.936 INFO:tasks.workunit.client.1.vm05.stdout:5/846: creat d8/d59/d5b/d8b/da0/de2/f120 x:0 0 0 2026-03-09T16:15:25.936 
INFO:tasks.workunit.client.1.vm05.stdout:6/768: dread - d17/d22/d27/d44/f7a zero size 2026-03-09T16:15:25.936 INFO:tasks.workunit.client.1.vm05.stdout:5/847: chown d8/d18/d1b/f2d 21 1 2026-03-09T16:15:25.937 INFO:tasks.workunit.client.1.vm05.stdout:6/769: chown d17/d1d/fd3 40 1 2026-03-09T16:15:25.938 INFO:tasks.workunit.client.1.vm05.stdout:2/738: rename db/dd/d15/d1f/f49 to db/dd/d15/d46/d67/ff1 0 2026-03-09T16:15:25.940 INFO:tasks.workunit.client.1.vm05.stdout:4/879: write d5/de/d15/d21/d27/fc8 [3609147,78006] 0 2026-03-09T16:15:25.945 INFO:tasks.workunit.client.1.vm05.stdout:1/870: dwrite d7/dd/de/d52/df6/ff5 [0,4194304] 0 2026-03-09T16:15:25.947 INFO:tasks.workunit.client.1.vm05.stdout:8/827: fsync d4/d6/d53/f7f 0 2026-03-09T16:15:25.954 INFO:tasks.workunit.client.1.vm05.stdout:1/871: dwrite d7/d15/d16/f74 [0,4194304] 0 2026-03-09T16:15:25.958 INFO:tasks.workunit.client.1.vm05.stdout:1/872: dwrite d7/d15/d16/f74 [0,4194304] 0 2026-03-09T16:15:25.961 INFO:tasks.workunit.client.1.vm05.stdout:1/873: dread - d7/d15/d16/f118 zero size 2026-03-09T16:15:25.961 INFO:tasks.workunit.client.1.vm05.stdout:1/874: chown d7/dd/d21/d44/dcc/fcd 2758987 1 2026-03-09T16:15:25.963 INFO:tasks.workunit.client.1.vm05.stdout:7/834: write d1/d2/d8/dc/d33/f9d [1305896,18971] 0 2026-03-09T16:15:25.967 INFO:tasks.workunit.client.1.vm05.stdout:1/875: dwrite d7/dd/de/d52/df6/db4/f129 [0,4194304] 0 2026-03-09T16:15:25.991 INFO:tasks.workunit.client.1.vm05.stdout:6/770: creat d17/d22/d27/d34/d42/d53/d87/f121 x:0 0 0 2026-03-09T16:15:25.992 INFO:tasks.workunit.client.1.vm05.stdout:9/831: dread d4/d10/d35/d36/d48/fb7 [0,4194304] 0 2026-03-09T16:15:26.012 INFO:tasks.workunit.client.1.vm05.stdout:3/727: creat d0/da9/fff x:0 0 0 2026-03-09T16:15:26.013 INFO:tasks.workunit.client.1.vm05.stdout:7/835: creat d1/d2/d8/dc/d1b/d30/d4b/db2/df3/f125 x:0 0 0 2026-03-09T16:15:26.016 INFO:tasks.workunit.client.1.vm05.stdout:1/876: unlink d7/dd/d21/d2d/c68 0 2026-03-09T16:15:26.018 INFO:tasks.workunit.client.1.vm05.stdout:6/771: fdatasync d17/d1d/fc1 0 2026-03-09T16:15:26.019 INFO:tasks.workunit.client.1.vm05.stdout:8/828: read d4/f23 [2568160,15567] 0 2026-03-09T16:15:26.019 INFO:tasks.workunit.client.1.vm05.stdout:3/728: sync 2026-03-09T16:15:26.025 INFO:tasks.workunit.client.1.vm05.stdout:2/739: rename db/dd/fb8 to db/dd/d7b/ff2 0 2026-03-09T16:15:26.030 INFO:tasks.workunit.client.1.vm05.stdout:7/836: symlink d1/d2/d8/dc/d1b/d30/d7d/l126 0 2026-03-09T16:15:26.030 INFO:tasks.workunit.client.1.vm05.stdout:6/772: fdatasync fa 0 2026-03-09T16:15:26.036 INFO:tasks.workunit.client.1.vm05.stdout:4/880: rmdir d5/de/d15/d120 0 2026-03-09T16:15:26.036 INFO:tasks.workunit.client.1.vm05.stdout:1/877: rename d7/f9 to d7/dd/d21/d63/d71/ddc/df8/f12d 0 2026-03-09T16:15:26.038 INFO:tasks.workunit.client.1.vm05.stdout:2/740: mkdir db/dd/d15/d46/df3 0 2026-03-09T16:15:26.043 INFO:tasks.workunit.client.1.vm05.stdout:4/881: truncate d5/fb 1591956 0 2026-03-09T16:15:26.044 INFO:tasks.workunit.client.1.vm05.stdout:8/829: dwrite d4/f77 [4194304,4194304] 0 2026-03-09T16:15:26.056 INFO:tasks.workunit.client.1.vm05.stdout:1/878: chown d7/dd/de/d52/df6/cf0 1811171928 1 2026-03-09T16:15:26.057 INFO:tasks.workunit.client.1.vm05.stdout:2/741: truncate db/dd/d15/d4c/fe4 947618 0 2026-03-09T16:15:26.058 INFO:tasks.workunit.client.1.vm05.stdout:6/773: dread d17/d22/d27/d34/d42/d53/fba [0,4194304] 0 2026-03-09T16:15:26.061 INFO:tasks.workunit.client.1.vm05.stdout:3/729: rename d0/d33/ff2 to d0/d9/d22/d5f/d75/f100 0 2026-03-09T16:15:26.066 
INFO:tasks.workunit.client.1.vm05.stdout:3/730: dwrite d0/d9/d97/fca [0,4194304] 0 2026-03-09T16:15:26.071 INFO:tasks.workunit.client.1.vm05.stdout:4/882: fsync d5/de/d82/fbe 0 2026-03-09T16:15:26.074 INFO:tasks.workunit.client.1.vm05.stdout:4/883: chown d5/de/d15/d21/d27/d3c/d5c/d5f/ce5 135 1 2026-03-09T16:15:26.074 INFO:tasks.workunit.client.1.vm05.stdout:1/879: mknod d7/dd/d21/d39/d48/d8c/dd8/d103/c12e 0 2026-03-09T16:15:26.075 INFO:tasks.workunit.client.1.vm05.stdout:4/884: stat d5/de/d15/d21/f6d 0 2026-03-09T16:15:26.076 INFO:tasks.workunit.client.1.vm05.stdout:6/774: mkdir d17/d22/d9d/da5/d122 0 2026-03-09T16:15:26.080 INFO:tasks.workunit.client.1.vm05.stdout:3/731: dwrite d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/f5d [0,4194304] 0 2026-03-09T16:15:26.082 INFO:tasks.workunit.client.1.vm05.stdout:8/830: creat d4/d6/db/dc/d5d/da0/dd7/d10a/f10e x:0 0 0 2026-03-09T16:15:26.086 INFO:tasks.workunit.client.1.vm05.stdout:1/880: symlink d7/d15/d6e/dbc/dd6/l12f 0 2026-03-09T16:15:26.091 INFO:tasks.workunit.client.1.vm05.stdout:6/775: rename d17/d22/d27/d34/d42/d53/fba to d17/d22/d27/d34/d42/d53/d87/df6/f123 0 2026-03-09T16:15:26.091 INFO:tasks.workunit.client.1.vm05.stdout:3/732: rmdir d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/db3 39 2026-03-09T16:15:26.092 INFO:tasks.workunit.client.1.vm05.stdout:8/831: write d4/d6/db/d9b/fbe [616893,2129] 0 2026-03-09T16:15:26.097 INFO:tasks.workunit.client.1.vm05.stdout:6/776: dread d17/d4f/f8c [0,4194304] 0 2026-03-09T16:15:26.098 INFO:tasks.workunit.client.1.vm05.stdout:3/733: mkdir d0/d9/d22/d5f/d90/dae/dd2/d101 0 2026-03-09T16:15:26.098 INFO:tasks.workunit.client.1.vm05.stdout:3/734: write d0/d9/d22/d5f/d75/d76/d88/f9c [2831323,73371] 0 2026-03-09T16:15:26.114 INFO:tasks.workunit.client.1.vm05.stdout:3/735: chown d0/f45 129552142 1 2026-03-09T16:15:26.123 INFO:tasks.workunit.client.1.vm05.stdout:3/736: fdatasync d0/d9/d97/dc2/ffc 0 2026-03-09T16:15:26.125 INFO:tasks.workunit.client.1.vm05.stdout:3/737: fdatasync d0/d33/f77 0 2026-03-09T16:15:26.125 INFO:tasks.workunit.client.1.vm05.stdout:3/738: dread - d0/d9/d97/dbc/ff3 zero size 2026-03-09T16:15:26.126 INFO:tasks.workunit.client.1.vm05.stdout:6/777: getdents d17/d22/d27/d34/dd1 0 2026-03-09T16:15:26.131 INFO:tasks.workunit.client.1.vm05.stdout:3/739: sync 2026-03-09T16:15:26.161 INFO:tasks.workunit.client.1.vm05.stdout:0/781: dwrite d5/db/d5b/f69 [0,4194304] 0 2026-03-09T16:15:26.164 INFO:tasks.workunit.client.1.vm05.stdout:6/778: dread d17/d22/d9d/da9/ff0 [0,4194304] 0 2026-03-09T16:15:26.166 INFO:tasks.workunit.client.1.vm05.stdout:5/848: dwrite d8/dc8/fdb [0,4194304] 0 2026-03-09T16:15:26.175 INFO:tasks.workunit.client.1.vm05.stdout:5/849: write d8/d18/d1b/d47/f4c [3506168,38070] 0 2026-03-09T16:15:26.184 INFO:tasks.workunit.client.1.vm05.stdout:5/850: dread d8/dc8/fdb [0,4194304] 0 2026-03-09T16:15:26.185 INFO:tasks.workunit.client.1.vm05.stdout:9/832: dwrite d4/d10/d35/d2b/d38/d65/f6a [0,4194304] 0 2026-03-09T16:15:26.225 INFO:tasks.workunit.client.1.vm05.stdout:0/782: mknod d5/d1b/d30/c10f 0 2026-03-09T16:15:26.229 INFO:tasks.workunit.client.1.vm05.stdout:0/783: chown d5/d2c/dff 35587 1 2026-03-09T16:15:26.230 INFO:tasks.workunit.client.1.vm05.stdout:0/784: chown d5/d1b/d30/ca0 124 1 2026-03-09T16:15:26.237 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:25 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:26.239 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:25 vm05.local ceph-mon[58702]: from='mgr.24403 
192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:26.239 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:25 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:26.239 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:25 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:26.239 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:25 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:26.239 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:25 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:26.239 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:25 vm05.local ceph-mon[58702]: pgmap v22: 65 pgs: 65 active+clean; 3.0 GiB data, 9.9 GiB used, 110 GiB / 120 GiB avail; 61 MiB/s rd, 185 MiB/s wr, 386 op/s 2026-03-09T16:15:26.239 INFO:tasks.workunit.client.1.vm05.stdout:6/779: dread - d17/d1d/fd3 zero size 2026-03-09T16:15:26.252 INFO:tasks.workunit.client.1.vm05.stdout:3/740: fdatasync d0/f5a 0 2026-03-09T16:15:26.252 INFO:tasks.workunit.client.1.vm05.stdout:0/785: read d5/d11/d4f/d70/fbf [686538,95959] 0 2026-03-09T16:15:26.252 INFO:tasks.workunit.client.1.vm05.stdout:5/851: dread d8/f11 [0,4194304] 0 2026-03-09T16:15:26.253 INFO:tasks.workunit.client.1.vm05.stdout:7/837: truncate d1/d2/d8/d31/f105 3549914 0 2026-03-09T16:15:26.254 INFO:tasks.workunit.client.1.vm05.stdout:5/852: chown d8/d95/cdf 616682 1 2026-03-09T16:15:26.255 INFO:tasks.workunit.client.1.vm05.stdout:0/786: write d5/d2c/d49/d83/fc9 [813419,79138] 0 2026-03-09T16:15:26.257 INFO:tasks.workunit.client.1.vm05.stdout:7/838: chown d1/d2/d8/dc/dd4/da8 6 1 2026-03-09T16:15:26.258 INFO:tasks.workunit.client.1.vm05.stdout:7/839: chown d1/d2/d11/d86/d8a/fa3 0 1 2026-03-09T16:15:26.259 INFO:tasks.workunit.client.1.vm05.stdout:7/840: fdatasync d1/d2/d8/f109 0 2026-03-09T16:15:26.279 INFO:tasks.workunit.client.1.vm05.stdout:2/742: dwrite db/dd/d15/d3f/f4a [0,4194304] 0 2026-03-09T16:15:26.289 INFO:tasks.workunit.client.1.vm05.stdout:4/885: write d5/de/d2f/f78 [421393,99925] 0 2026-03-09T16:15:26.304 INFO:tasks.workunit.client.1.vm05.stdout:5/853: symlink d8/d5e/d11b/l121 0 2026-03-09T16:15:26.304 INFO:tasks.workunit.client.1.vm05.stdout:8/832: dwrite d4/d6/f44 [0,4194304] 0 2026-03-09T16:15:26.304 INFO:tasks.workunit.client.1.vm05.stdout:8/833: chown d4/d6/d53/c6b 81245 1 2026-03-09T16:15:26.304 INFO:tasks.workunit.client.1.vm05.stdout:8/834: chown d4/d6/d53/l6e 32283 1 2026-03-09T16:15:26.305 INFO:tasks.workunit.client.1.vm05.stdout:1/881: dwrite d7/d62/db6/fc1 [0,4194304] 0 2026-03-09T16:15:26.342 INFO:tasks.workunit.client.1.vm05.stdout:0/787: creat d5/db/def/f110 x:0 0 0 2026-03-09T16:15:26.342 INFO:tasks.workunit.client.1.vm05.stdout:6/780: creat d17/d22/d27/d34/d42/d65/f124 x:0 0 0 2026-03-09T16:15:26.346 INFO:tasks.workunit.client.1.vm05.stdout:2/743: rmdir db/dd/d7b 39 2026-03-09T16:15:26.348 INFO:tasks.workunit.client.1.vm05.stdout:2/744: stat db/dd/d15/d46/d8d 0 2026-03-09T16:15:26.350 INFO:tasks.workunit.client.1.vm05.stdout:9/833: rename d4/d10/d35/d36/d48/d60/fad to d4/d10/d35/d2b/d38/f112 0 2026-03-09T16:15:26.352 INFO:tasks.workunit.client.1.vm05.stdout:5/854: truncate d8/d1d/f44 1513402 0 2026-03-09T16:15:26.352 INFO:tasks.workunit.client.1.vm05.stdout:8/835: rmdir d4/d6/db/dc/d5d/da0/dd7 39 2026-03-09T16:15:26.358 
INFO:tasks.workunit.client.1.vm05.stdout:0/788: truncate d5/d1b/d3b/f3c 491679 0 2026-03-09T16:15:26.365 INFO:tasks.workunit.client.1.vm05.stdout:6/781: fdatasync d17/d22/d27/d34/f47 0 2026-03-09T16:15:26.368 INFO:tasks.workunit.client.1.vm05.stdout:4/886: write d5/de/d2f/d8a/fb0 [306488,36484] 0 2026-03-09T16:15:26.396 INFO:tasks.workunit.client.1.vm05.stdout:2/745: mknod db/dd/d15/d3f/d5b/d60/cf4 0 2026-03-09T16:15:26.405 INFO:tasks.workunit.client.1.vm05.stdout:5/855: chown d8/d18/d1b/d47/d4e/c7d 1337 1 2026-03-09T16:15:26.415 INFO:tasks.workunit.client.1.vm05.stdout:8/836: write d4/d6/db/dc/d5d/da0/dd7/dd8/ffe [10229,17726] 0 2026-03-09T16:15:26.428 INFO:tasks.workunit.client.1.vm05.stdout:3/741: link d0/d33/f64 d0/d9/d22/d5f/d7b/d99/f102 0 2026-03-09T16:15:26.438 INFO:tasks.workunit.client.1.vm05.stdout:0/789: symlink d5/d97/l111 0 2026-03-09T16:15:26.447 INFO:tasks.workunit.client.1.vm05.stdout:6/782: rmdir d17/d22/d27/d58 39 2026-03-09T16:15:26.457 INFO:tasks.workunit.client.1.vm05.stdout:4/887: creat d5/de/d15/d21/dfe/f140 x:0 0 0 2026-03-09T16:15:26.460 INFO:tasks.workunit.client.1.vm05.stdout:4/888: dwrite d5/f95 [0,4194304] 0 2026-03-09T16:15:26.468 INFO:tasks.workunit.client.1.vm05.stdout:7/841: link d1/d2/d8/d31/d8d/f80 d1/d2/d11/d86/da2/f127 0 2026-03-09T16:15:26.468 INFO:tasks.workunit.client.1.vm05.stdout:7/842: stat d1/d2/d11/d86/d8a/d91/ffe 0 2026-03-09T16:15:26.474 INFO:tasks.workunit.client.1.vm05.stdout:2/746: write db/fc6 [1008208,33324] 0 2026-03-09T16:15:26.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.475+0000 7f3310767640 1 -- 192.168.123.103:0/3014473383 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33080719c0 msgr2=0x7f3308071dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:26.478 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.475+0000 7f3310767640 1 --2- 192.168.123.103:0/3014473383 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33080719c0 0x7f3308071dc0 secure :-1 s=READY pgs=300 cs=0 l=1 rev1=1 crypto rx=0x7f33040099b0 tx=0x7f330402f220 comp rx=0 tx=0).stop 2026-03-09T16:15:26.478 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.477+0000 7f3310767640 1 -- 192.168.123.103:0/3014473383 shutdown_connections 2026-03-09T16:15:26.478 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.477+0000 7f3310767640 1 --2- 192.168.123.103:0/3014473383 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3308072390 0x7f330810c590 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:26.478 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.477+0000 7f3310767640 1 --2- 192.168.123.103:0/3014473383 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33080719c0 0x7f3308071dc0 unknown :-1 s=CLOSED pgs=300 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:26.478 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.477+0000 7f3310767640 1 -- 192.168.123.103:0/3014473383 >> 192.168.123.103:0/3014473383 conn(0x7f330806d4f0 msgr2=0x7f330806f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:26.479 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.478+0000 7f3310767640 1 -- 192.168.123.103:0/3014473383 shutdown_connections 2026-03-09T16:15:26.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.478+0000 7f3310767640 1 -- 192.168.123.103:0/3014473383 wait complete. 
2026-03-09T16:15:26.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.481+0000 7f3310767640 1 Processor -- start 2026-03-09T16:15:26.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.481+0000 7f3310767640 1 -- start start 2026-03-09T16:15:26.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.481+0000 7f3310767640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33080719c0 0x7f33081afdf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:26.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.481+0000 7f3310767640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3308072390 0x7f33081b03d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:26.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.481+0000 7f3310767640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f33081b0a10 con 0x7f3308072390 2026-03-09T16:15:26.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.481+0000 7f3310767640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f33081b0b80 con 0x7f33080719c0 2026-03-09T16:15:26.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.482+0000 7f330dcdb640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3308072390 0x7f33081b03d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:26.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.482+0000 7f330dcdb640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3308072390 0x7f33081b03d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33506/0 (socket says 192.168.123.103:33506) 2026-03-09T16:15:26.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.482+0000 7f330dcdb640 1 -- 192.168.123.103:0/3314105931 learned_addr learned my addr 192.168.123.103:0/3314105931 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:15:26.485 INFO:tasks.workunit.client.1.vm05.stdout:5/856: symlink d8/d18/dbc/dcc/daa/d43/l122 0 2026-03-09T16:15:26.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.482+0000 7f330dcdb640 1 -- 192.168.123.103:0/3314105931 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33080719c0 msgr2=0x7f33081afdf0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T16:15:26.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.482+0000 7f330dcdb640 1 --2- 192.168.123.103:0/3314105931 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33080719c0 0x7f33081afdf0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:26.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.482+0000 7f330dcdb640 1 -- 192.168.123.103:0/3314105931 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3304009660 con 0x7f3308072390 2026-03-09T16:15:26.485 INFO:tasks.workunit.client.1.vm05.stdout:5/857: write d8/d18/dbc/f11f [776953,56574] 0 2026-03-09T16:15:26.494 INFO:tasks.workunit.client.1.vm05.stdout:1/882: creat d7/d15/d45/f130 x:0 0 0 2026-03-09T16:15:26.495 
INFO:tasks.workunit.client.1.vm05.stdout:0/790: dread - d5/db/ff7 zero size 2026-03-09T16:15:26.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.494+0000 7f330dcdb640 1 --2- 192.168.123.103:0/3314105931 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3308072390 0x7f33081b03d0 secure :-1 s=READY pgs=301 cs=0 l=1 rev1=1 crypto rx=0x7f330000ba50 tx=0x7f330000bf20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:26.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.495+0000 7f32f77fe640 1 -- 192.168.123.103:0/3314105931 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3300002c70 con 0x7f3308072390 2026-03-09T16:15:26.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.495+0000 7f3310767640 1 -- 192.168.123.103:0/3314105931 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f33081b52c0 con 0x7f3308072390 2026-03-09T16:15:26.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.495+0000 7f3310767640 1 -- 192.168.123.103:0/3314105931 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f33081b5890 con 0x7f3308072390 2026-03-09T16:15:26.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.495+0000 7f32f77fe640 1 -- 192.168.123.103:0/3314105931 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3300002dd0 con 0x7f3308072390 2026-03-09T16:15:26.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.495+0000 7f32f77fe640 1 -- 192.168.123.103:0/3314105931 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3300004900 con 0x7f3308072390 2026-03-09T16:15:26.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.496+0000 7f3310767640 1 -- 192.168.123.103:0/3314105931 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f32d0005350 con 0x7f3308072390 2026-03-09T16:15:26.499 INFO:tasks.workunit.client.1.vm05.stdout:4/889: unlink d5/fce 0 2026-03-09T16:15:26.502 INFO:tasks.workunit.client.1.vm05.stdout:6/783: mkdir d17/d22/d27/d44/d125 0 2026-03-09T16:15:26.502 INFO:tasks.workunit.client.1.vm05.stdout:2/747: symlink db/dd/d15/d3f/d5b/d60/lf5 0 2026-03-09T16:15:26.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.501+0000 7f32f77fe640 1 -- 192.168.123.103:0/3314105931 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7f330000c780 con 0x7f3308072390 2026-03-09T16:15:26.503 INFO:tasks.workunit.client.1.vm05.stdout:4/890: dread d5/de/d15/da9/db1/dad/d37/d60/f62 [0,4194304] 0 2026-03-09T16:15:26.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.502+0000 7f32f77fe640 1 --2- 192.168.123.103:0/3314105931 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f32e4077680 0x7f32e4079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:26.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.503+0000 7f32f77fe640 1 -- 192.168.123.103:0/3314105931 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f3300013370 con 0x7f3308072390 2026-03-09T16:15:26.508 INFO:tasks.workunit.client.1.vm05.stdout:8/837: mkdir d4/d6/db/df/d10f 0 2026-03-09T16:15:26.510 
INFO:tasks.workunit.client.1.vm05.stdout:3/742: truncate d0/f60 4458750 0 2026-03-09T16:15:26.510 INFO:tasks.workunit.client.1.vm05.stdout:1/883: creat d7/dd/d21/d63/d71/ddc/f131 x:0 0 0 2026-03-09T16:15:26.511 INFO:tasks.workunit.client.1.vm05.stdout:3/743: chown d0/d9/d97/dac/lf8 8982 1 2026-03-09T16:15:26.514 INFO:tasks.workunit.client.1.vm05.stdout:0/791: creat d5/db/def/df2/f112 x:0 0 0 2026-03-09T16:15:26.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.513+0000 7f330e4dc640 1 --2- 192.168.123.103:0/3314105931 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f32e4077680 0x7f32e4079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:26.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.515+0000 7f330e4dc640 1 --2- 192.168.123.103:0/3314105931 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f32e4077680 0x7f32e4079b40 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f3304002af0 tx=0x7f33040023d0 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:26.518 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.518+0000 7f32f77fe640 1 -- 192.168.123.103:0/3314105931 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f3300061f80 con 0x7f3308072390 2026-03-09T16:15:26.534 INFO:tasks.workunit.client.1.vm05.stdout:4/891: creat d5/de/d15/d21/d27/d3c/d5c/dfc/f141 x:0 0 0 2026-03-09T16:15:26.536 INFO:tasks.workunit.client.1.vm05.stdout:5/858: fdatasync d8/d18/d1b/d47/d48/f61 0 2026-03-09T16:15:26.540 INFO:tasks.workunit.client.1.vm05.stdout:1/884: symlink d7/dd/l132 0 2026-03-09T16:15:26.541 INFO:tasks.workunit.client.1.vm05.stdout:1/885: truncate d7/dd/de/d52/df6/d55/df9/d122/f10f 9869 0 2026-03-09T16:15:26.544 INFO:tasks.workunit.client.1.vm05.stdout:5/859: write d8/d59/d5b/d8b/da0/de2/f120 [637052,9669] 0 2026-03-09T16:15:26.545 INFO:tasks.workunit.client.1.vm05.stdout:3/744: fdatasync d0/da9/fb5 0 2026-03-09T16:15:26.546 INFO:tasks.workunit.client.1.vm05.stdout:3/745: write d0/d9/d8b/ff6 [496898,128837] 0 2026-03-09T16:15:26.553 INFO:tasks.workunit.client.1.vm05.stdout:3/746: dwrite d0/d33/f7d [0,4194304] 0 2026-03-09T16:15:26.558 INFO:tasks.workunit.client.1.vm05.stdout:0/792: mknod d5/db/def/df2/c113 0 2026-03-09T16:15:26.560 INFO:tasks.workunit.client.1.vm05.stdout:6/784: rename d17/d22/d9d/f7c to d17/d22/d27/d34/dd1/d10a/f126 0 2026-03-09T16:15:26.561 INFO:tasks.workunit.client.1.vm05.stdout:3/747: chown d0/d9/d22/d5f/d90/fcb 239062 1 2026-03-09T16:15:26.574 INFO:tasks.workunit.client.1.vm05.stdout:1/886: dread d7/dd/de/d52/fd7 [0,4194304] 0 2026-03-09T16:15:26.626 INFO:tasks.workunit.client.1.vm05.stdout:9/834: truncate d4/d10/f18 136793 0 2026-03-09T16:15:26.633 INFO:tasks.workunit.client.1.vm05.stdout:7/843: truncate d1/d2/d8/dc/d1b/d30/d4b/d65/fe3 3751495 0 2026-03-09T16:15:26.642 INFO:tasks.workunit.client.1.vm05.stdout:8/838: mkdir d4/d6/d9a/db3/d110 0 2026-03-09T16:15:26.647 INFO:tasks.workunit.client.1.vm05.stdout:8/839: read d4/d6/d3a/f88 [1914519,45444] 0 2026-03-09T16:15:26.661 INFO:tasks.workunit.client.1.vm05.stdout:6/785: creat d17/d22/d27/d34/d42/d65/f127 x:0 0 0 2026-03-09T16:15:26.667 INFO:tasks.workunit.client.1.vm05.stdout:4/892: rename d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d7b to d5/de/d15/d21/d27/d3c/d142 0 
2026-03-09T16:15:26.670 INFO:tasks.workunit.client.1.vm05.stdout:9/835: mknod d4/d10/d35/d36/d48/d60/d94/c113 0 2026-03-09T16:15:26.680 INFO:tasks.workunit.client.1.vm05.stdout:7/844: symlink d1/d2/d8/dc/d1b/d30/d4b/db2/df3/l128 0 2026-03-09T16:15:26.691 INFO:tasks.workunit.client.1.vm05.stdout:8/840: symlink d4/d6/d9a/l111 0 2026-03-09T16:15:26.691 INFO:tasks.workunit.client.1.vm05.stdout:5/860: getdents d8/d18/d1b/d47/d4e/d76/d11d/d115 0 2026-03-09T16:15:26.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.698+0000 7f3310767640 1 -- 192.168.123.103:0/3314105931 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f32d0002bf0 con 0x7f32e4077680 2026-03-09T16:15:26.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.703+0000 7f32f77fe640 1 -- 192.168.123.103:0/3314105931 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+370 (secure 0 0 0) 0x7f32d0002bf0 con 0x7f32e4077680 2026-03-09T16:15:26.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.707+0000 7f32f57fa640 1 -- 192.168.123.103:0/3314105931 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f32e4077680 msgr2=0x7f32e4079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:26.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.707+0000 7f32f57fa640 1 --2- 192.168.123.103:0/3314105931 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f32e4077680 0x7f32e4079b40 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f3304002af0 tx=0x7f33040023d0 comp rx=0 tx=0).stop 2026-03-09T16:15:26.709 INFO:tasks.workunit.client.1.vm05.stdout:1/887: rename d7/d15/d45/dee to d7/dbe/dca/d133 0 2026-03-09T16:15:26.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.707+0000 7f32f57fa640 1 -- 192.168.123.103:0/3314105931 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3308072390 msgr2=0x7f33081b03d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:26.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.707+0000 7f32f57fa640 1 --2- 192.168.123.103:0/3314105931 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3308072390 0x7f33081b03d0 secure :-1 s=READY pgs=301 cs=0 l=1 rev1=1 crypto rx=0x7f330000ba50 tx=0x7f330000bf20 comp rx=0 tx=0).stop 2026-03-09T16:15:26.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.708+0000 7f32f57fa640 1 -- 192.168.123.103:0/3314105931 shutdown_connections 2026-03-09T16:15:26.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.708+0000 7f32f57fa640 1 --2- 192.168.123.103:0/3314105931 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f32e4077680 0x7f32e4079b40 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:26.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.708+0000 7f32f57fa640 1 --2- 192.168.123.103:0/3314105931 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3308072390 0x7f33081b03d0 unknown :-1 s=CLOSED pgs=301 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:26.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.708+0000 7f32f57fa640 1 --2- 192.168.123.103:0/3314105931 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33080719c0 0x7f33081afdf0 
unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:26.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.708+0000 7f32f57fa640 1 -- 192.168.123.103:0/3314105931 >> 192.168.123.103:0/3314105931 conn(0x7f330806d4f0 msgr2=0x7f330810a7b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:26.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.708+0000 7f32f57fa640 1 -- 192.168.123.103:0/3314105931 shutdown_connections 2026-03-09T16:15:26.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.708+0000 7f32f57fa640 1 -- 192.168.123.103:0/3314105931 wait complete. 2026-03-09T16:15:26.724 INFO:tasks.workunit.client.1.vm05.stdout:1/888: dread d7/dd/d21/d63/d71/ddc/df8/f110 [0,4194304] 0 2026-03-09T16:15:26.744 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:15:26.753 INFO:tasks.workunit.client.1.vm05.stdout:0/793: write d5/d2c/d49/f5d [1630640,115785] 0 2026-03-09T16:15:26.767 INFO:tasks.workunit.client.1.vm05.stdout:2/748: getdents db/dd/d15/d1f 0 2026-03-09T16:15:26.771 INFO:tasks.workunit.client.1.vm05.stdout:2/749: chown db/dd/d15/d46/d67/ld0 744 1 2026-03-09T16:15:26.772 INFO:tasks.workunit.client.1.vm05.stdout:2/750: chown db/dd/d15/d3f/d5b/d60/lf5 537320 1 2026-03-09T16:15:26.792 INFO:tasks.workunit.client.1.vm05.stdout:6/786: write d17/d22/d9d/da5/fd9 [590388,51743] 0 2026-03-09T16:15:26.792 INFO:tasks.workunit.client.1.vm05.stdout:3/748: truncate d0/d9/f93 2789807 0 2026-03-09T16:15:26.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.841+0000 7fbeb49c5640 1 -- 192.168.123.103:0/1848695001 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbeb00719a0 msgr2=0x7fbeb0071da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:26.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.841+0000 7fbeb49c5640 1 --2- 192.168.123.103:0/1848695001 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbeb00719a0 0x7fbeb0071da0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7fbea00099b0 tx=0x7fbea002f240 comp rx=0 tx=0).stop 2026-03-09T16:15:26.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.845+0000 7fbeb49c5640 1 -- 192.168.123.103:0/1848695001 shutdown_connections 2026-03-09T16:15:26.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.845+0000 7fbeb49c5640 1 --2- 192.168.123.103:0/1848695001 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbeb0072370 0x7fbeb010c590 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:26.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.845+0000 7fbeb49c5640 1 --2- 192.168.123.103:0/1848695001 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbeb00719a0 0x7fbeb0071da0 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:26.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.845+0000 7fbeb49c5640 1 -- 192.168.123.103:0/1848695001 >> 192.168.123.103:0/1848695001 conn(0x7fbeb006d4f0 msgr2=0x7fbeb006f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:26.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.846+0000 7fbeb49c5640 1 -- 192.168.123.103:0/1848695001 shutdown_connections 2026-03-09T16:15:26.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.847+0000 7fbeb49c5640 1 -- 192.168.123.103:0/1848695001 wait complete. 
2026-03-09T16:15:26.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.847+0000 7fbeb49c5640 1 Processor -- start 2026-03-09T16:15:26.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.847+0000 7fbeb49c5640 1 -- start start 2026-03-09T16:15:26.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.847+0000 7fbeb49c5640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbeb00719a0 0x7fbeb0115b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:26.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.847+0000 7fbeb49c5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbeb0072370 0x7fbeb0116080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:26.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.847+0000 7fbeb49c5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbeb0117580 con 0x7fbeb0072370 2026-03-09T16:15:26.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.847+0000 7fbeb49c5640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbeb01165c0 con 0x7fbeb00719a0 2026-03-09T16:15:26.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.849+0000 7fbea7fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbeb0072370 0x7fbeb0116080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:26.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.849+0000 7fbea7fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbeb0072370 0x7fbeb0116080 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33532/0 (socket says 192.168.123.103:33532) 2026-03-09T16:15:26.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.849+0000 7fbea7fff640 1 -- 192.168.123.103:0/103257650 learned_addr learned my addr 192.168.123.103:0/103257650 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:15:26.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.849+0000 7fbeaf7fe640 1 --2- 192.168.123.103:0/103257650 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbeb00719a0 0x7fbeb0115b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:26.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.849+0000 7fbea7fff640 1 -- 192.168.123.103:0/103257650 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbeb00719a0 msgr2=0x7fbeb0115b40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:26.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.849+0000 7fbea7fff640 1 --2- 192.168.123.103:0/103257650 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbeb00719a0 0x7fbeb0115b40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:26.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.849+0000 7fbea7fff640 1 -- 192.168.123.103:0/103257650 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbea0009660 con 
0x7fbeb0072370 2026-03-09T16:15:26.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.850+0000 7fbea7fff640 1 --2- 192.168.123.103:0/103257650 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbeb0072370 0x7fbeb0116080 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7fbe9c00b700 tx=0x7fbe9c00bbd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:26.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.851+0000 7fbead7fa640 1 -- 192.168.123.103:0/103257650 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbe9c0042e0 con 0x7fbeb0072370 2026-03-09T16:15:26.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.851+0000 7fbead7fa640 1 -- 192.168.123.103:0/103257650 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbe9c009450 con 0x7fbeb0072370 2026-03-09T16:15:26.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.851+0000 7fbead7fa640 1 -- 192.168.123.103:0/103257650 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbe9c00cae0 con 0x7fbeb0072370 2026-03-09T16:15:26.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.851+0000 7fbeb49c5640 1 -- 192.168.123.103:0/103257650 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbeb0116870 con 0x7fbeb0072370 2026-03-09T16:15:26.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.851+0000 7fbeb49c5640 1 -- 192.168.123.103:0/103257650 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbeb01b59b0 con 0x7fbeb0072370 2026-03-09T16:15:26.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.852+0000 7fbeb49c5640 1 -- 192.168.123.103:0/103257650 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbeb0071da0 con 0x7fbeb0072370 2026-03-09T16:15:26.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.857+0000 7fbead7fa640 1 -- 192.168.123.103:0/103257650 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7fbe9c00cc40 con 0x7fbeb0072370 2026-03-09T16:15:26.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.858+0000 7fbead7fa640 1 --2- 192.168.123.103:0/103257650 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fbe80077470 0x7fbe80079930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:26.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.858+0000 7fbead7fa640 1 -- 192.168.123.103:0/103257650 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fbe9c01a030 con 0x7fbeb0072370 2026-03-09T16:15:26.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.858+0000 7fbeaf7fe640 1 --2- 192.168.123.103:0/103257650 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fbe80077470 0x7fbe80079930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:26.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.858+0000 7fbead7fa640 1 -- 192.168.123.103:0/103257650 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7fbe9c098c00 con 0x7fbeb0072370 2026-03-09T16:15:26.863 INFO:tasks.workunit.client.1.vm05.stdout:4/893: creat d5/de/d15/da9/db1/dad/d37/dfb/f143 x:0 0 0 2026-03-09T16:15:26.889 INFO:tasks.workunit.client.1.vm05.stdout:4/894: dread d5/de/d15/d21/d39/d91/faa [0,4194304] 0 2026-03-09T16:15:26.893 INFO:tasks.workunit.client.1.vm05.stdout:4/895: dwrite d5/de/d15/d21/d27/f130 [0,4194304] 0 2026-03-09T16:15:26.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:26.900+0000 7fbeaf7fe640 1 --2- 192.168.123.103:0/103257650 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fbe80077470 0x7fbe80079930 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fbea0002410 tx=0x7fbea003a040 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:26.904 INFO:tasks.workunit.client.1.vm05.stdout:9/836: write d4/d10/d35/d36/f91 [703763,36279] 0 2026-03-09T16:15:26.919 INFO:tasks.workunit.client.1.vm05.stdout:7/845: mknod d1/d2/d8/dc/c129 0 2026-03-09T16:15:26.926 INFO:tasks.workunit.client.1.vm05.stdout:0/794: rename d5/d1b/d30/cf4 to d5/db/d48/d66/c114 0 2026-03-09T16:15:26.929 INFO:tasks.workunit.client.1.vm05.stdout:8/841: mknod d4/d6/d3a/d40/d6a/d97/c112 0 2026-03-09T16:15:26.938 INFO:tasks.workunit.client.1.vm05.stdout:0/795: dread d5/d2c/dff/f2e [0,4194304] 0 2026-03-09T16:15:26.942 INFO:tasks.workunit.client.1.vm05.stdout:5/861: truncate d8/d53/d7a/f84 673832 0 2026-03-09T16:15:26.947 INFO:tasks.workunit.client.1.vm05.stdout:6/787: mkdir d17/d22/d9d/da9/d128 0 2026-03-09T16:15:26.967 INFO:tasks.workunit.client.1.vm05.stdout:1/889: link d7/dd/de/d52/df6/db4/f129 d7/dd/d21/d39/d48/d5d/f134 0 2026-03-09T16:15:26.969 INFO:tasks.workunit.client.1.vm05.stdout:1/890: read d7/dd/de/d52/df6/f65 [143338,54910] 0 2026-03-09T16:15:26.984 INFO:tasks.workunit.client.1.vm05.stdout:8/842: write d4/d6/db/dc/fa9 [14130,119208] 0 2026-03-09T16:15:26.994 INFO:tasks.workunit.client.1.vm05.stdout:0/796: chown d5/d1b/l2b 24 1 2026-03-09T16:15:27.002 INFO:tasks.workunit.client.1.vm05.stdout:5/862: chown d8/d59/d5b/d8b/da0/ca6 1 1 2026-03-09T16:15:27.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:26 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:27.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:26 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:27.031 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:26 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:27.031 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:26 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:27.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.030+0000 7fbeb49c5640 1 -- 192.168.123.103:0/103257650 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fbeb010b960 con 0x7fbe80077470 2026-03-09T16:15:27.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.037+0000 7fbead7fa640 1 -- 192.168.123.103:0/103257650 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+370 (secure 0 0 0) 0x7fbeb010b960 con 0x7fbe80077470 
2026-03-09T16:15:27.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.043+0000 7fbea67fc640 1 -- 192.168.123.103:0/103257650 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fbe80077470 msgr2=0x7fbe80079930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:27.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.043+0000 7fbea67fc640 1 --2- 192.168.123.103:0/103257650 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fbe80077470 0x7fbe80079930 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fbea0002410 tx=0x7fbea003a040 comp rx=0 tx=0).stop 2026-03-09T16:15:27.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.043+0000 7fbea67fc640 1 -- 192.168.123.103:0/103257650 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbeb0072370 msgr2=0x7fbeb0116080 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:27.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.043+0000 7fbea67fc640 1 --2- 192.168.123.103:0/103257650 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbeb0072370 0x7fbeb0116080 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7fbe9c00b700 tx=0x7fbe9c00bbd0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.043+0000 7fbea67fc640 1 -- 192.168.123.103:0/103257650 shutdown_connections 2026-03-09T16:15:27.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.043+0000 7fbea67fc640 1 --2- 192.168.123.103:0/103257650 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fbe80077470 0x7fbe80079930 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.043+0000 7fbea67fc640 1 --2- 192.168.123.103:0/103257650 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbeb0072370 0x7fbeb0116080 unknown :-1 s=CLOSED pgs=302 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.043+0000 7fbea67fc640 1 --2- 192.168.123.103:0/103257650 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbeb00719a0 0x7fbeb0115b40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.044+0000 7fbea67fc640 1 -- 192.168.123.103:0/103257650 >> 192.168.123.103:0/103257650 conn(0x7fbeb006d4f0 msgr2=0x7fbeb0070750 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:27.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.044+0000 7fbea67fc640 1 -- 192.168.123.103:0/103257650 shutdown_connections 2026-03-09T16:15:27.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.044+0000 7fbea67fc640 1 -- 192.168.123.103:0/103257650 wait complete. 
2026-03-09T16:15:27.051 INFO:tasks.workunit.client.1.vm05.stdout:8/843: mknod d4/d6/d3a/d15/dd9/c113 0 2026-03-09T16:15:27.065 INFO:tasks.workunit.client.1.vm05.stdout:5/863: fdatasync d8/d18/d1b/f32 0 2026-03-09T16:15:27.075 INFO:tasks.workunit.client.1.vm05.stdout:6/788: mknod d17/d22/d27/d34/d42/d65/d117/c129 0 2026-03-09T16:15:27.076 INFO:tasks.workunit.client.1.vm05.stdout:6/789: stat d17/d22/d27/d34/d42/d53/l6a 0 2026-03-09T16:15:27.076 INFO:tasks.workunit.client.1.vm05.stdout:6/790: readlink d17/d1d/lbb 0 2026-03-09T16:15:27.078 INFO:tasks.workunit.client.1.vm05.stdout:7/846: link d1/d2/d8/dc/d1b/d71/c32 d1/d2/d8/dc/d1b/d30/d4b/d65/db1/d116/c12a 0 2026-03-09T16:15:27.084 INFO:tasks.workunit.client.1.vm05.stdout:1/891: symlink d7/dd/d21/d39/d5a/d113/l135 0 2026-03-09T16:15:27.097 INFO:tasks.workunit.client.1.vm05.stdout:2/751: rename db/dd/d15/d3f/d5b/d60/d6a/fb6 to db/dd/d15/d4c/ff6 0 2026-03-09T16:15:27.109 INFO:tasks.workunit.client.1.vm05.stdout:8/844: creat d4/d6/db/d59/db0/f114 x:0 0 0 2026-03-09T16:15:27.110 INFO:tasks.workunit.client.1.vm05.stdout:8/845: truncate d4/d6/db/dc/fec 4292273 0 2026-03-09T16:15:27.110 INFO:tasks.workunit.client.1.vm05.stdout:8/846: chown d4/d6/d53/f89 6399654 1 2026-03-09T16:15:27.111 INFO:tasks.workunit.client.1.vm05.stdout:8/847: read - d4/d6/db/dc/d2e/d85/ff9 zero size 2026-03-09T16:15:27.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.134+0000 7fcd78f29640 1 -- 192.168.123.103:0/761265359 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd74072420 msgr2=0x7fcd74077190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:27.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.134+0000 7fcd78f29640 1 --2- 192.168.123.103:0/761265359 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd74072420 0x7fcd74077190 secure :-1 s=READY pgs=303 cs=0 l=1 rev1=1 crypto rx=0x7fcd6c009040 tx=0x7fcd6c02fc10 comp rx=0 tx=0).stop 2026-03-09T16:15:27.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.134+0000 7fcd78f29640 1 -- 192.168.123.103:0/761265359 shutdown_connections 2026-03-09T16:15:27.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.134+0000 7fcd78f29640 1 --2- 192.168.123.103:0/761265359 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd74072420 0x7fcd74077190 unknown :-1 s=CLOSED pgs=303 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.134+0000 7fcd78f29640 1 --2- 192.168.123.103:0/761265359 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd74071a50 0x7fcd74071e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.134+0000 7fcd78f29640 1 -- 192.168.123.103:0/761265359 >> 192.168.123.103:0/761265359 conn(0x7fcd7406d4f0 msgr2=0x7fcd7406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:27.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.134+0000 7fcd78f29640 1 -- 192.168.123.103:0/761265359 shutdown_connections 2026-03-09T16:15:27.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.134+0000 7fcd78f29640 1 -- 192.168.123.103:0/761265359 wait complete. 
2026-03-09T16:15:27.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.134+0000 7fcd78f29640 1 Processor -- start 2026-03-09T16:15:27.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.134+0000 7fcd78f29640 1 -- start start 2026-03-09T16:15:27.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.135+0000 7fcd78f29640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd74071a50 0x7fcd740840a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:27.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.135+0000 7fcd78f29640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd740826f0 0x7fcd74082b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:27.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.135+0000 7fcd78f29640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd740845e0 con 0x7fcd740826f0 2026-03-09T16:15:27.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.135+0000 7fcd78f29640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd740830b0 con 0x7fcd74071a50 2026-03-09T16:15:27.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.135+0000 7fcd737fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd74071a50 0x7fcd740840a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:27.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.135+0000 7fcd737fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd74071a50 0x7fcd740840a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47846/0 (socket says 192.168.123.103:47846) 2026-03-09T16:15:27.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.135+0000 7fcd737fe640 1 -- 192.168.123.103:0/182720852 learned_addr learned my addr 192.168.123.103:0/182720852 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:15:27.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.136+0000 7fcd72ffd640 1 --2- 192.168.123.103:0/182720852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd740826f0 0x7fcd74082b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:27.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.136+0000 7fcd737fe640 1 -- 192.168.123.103:0/182720852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd740826f0 msgr2=0x7fcd74082b70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:27.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.136+0000 7fcd737fe640 1 --2- 192.168.123.103:0/182720852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd740826f0 0x7fcd74082b70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.136+0000 7fcd737fe640 1 -- 192.168.123.103:0/182720852 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcd6c008cf0 con 
0x7fcd74071a50 2026-03-09T16:15:27.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.136+0000 7fcd72ffd640 1 --2- 192.168.123.103:0/182720852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd740826f0 0x7fcd74082b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:15:27.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.136+0000 7fcd737fe640 1 --2- 192.168.123.103:0/182720852 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd74071a50 0x7fcd740840a0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fcd64009870 tx=0x7fcd64009d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:27.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.141+0000 7fcd70ff9640 1 -- 192.168.123.103:0/182720852 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd64010040 con 0x7fcd74071a50 2026-03-09T16:15:27.143 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.141+0000 7fcd70ff9640 1 -- 192.168.123.103:0/182720852 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fcd6400ecf0 con 0x7fcd74071a50 2026-03-09T16:15:27.143 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.141+0000 7fcd70ff9640 1 -- 192.168.123.103:0/182720852 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd64002cf0 con 0x7fcd74071a50 2026-03-09T16:15:27.143 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.141+0000 7fcd78f29640 1 -- 192.168.123.103:0/182720852 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcd74083330 con 0x7fcd74071a50 2026-03-09T16:15:27.143 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.141+0000 7fcd78f29640 1 -- 192.168.123.103:0/182720852 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcd7412ef70 con 0x7fcd74071a50 2026-03-09T16:15:27.143 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.143+0000 7fcd78f29640 1 -- 192.168.123.103:0/182720852 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcd74079e40 con 0x7fcd74071a50 2026-03-09T16:15:27.149 INFO:tasks.workunit.client.1.vm05.stdout:5/864: mknod d8/d18/d1b/d47/d48/d73/c123 0 2026-03-09T16:15:27.153 INFO:tasks.workunit.client.1.vm05.stdout:3/749: getdents d0/d9/d22/d5f/d75/d76/d88/da3/df7 0 2026-03-09T16:15:27.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.151+0000 7fcd70ff9640 1 -- 192.168.123.103:0/182720852 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7fcd6400e830 con 0x7fcd74071a50 2026-03-09T16:15:27.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.152+0000 7fcd70ff9640 1 --2- 192.168.123.103:0/182720852 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fcd54077750 0x7fcd54079c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:27.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.152+0000 7fcd70ff9640 1 -- 192.168.123.103:0/182720852 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fcd640990f0 con 0x7fcd74071a50 2026-03-09T16:15:27.154 
INFO:tasks.workunit.client.1.vm05.stdout:3/750: write d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/fe7 [975508,74976] 0 2026-03-09T16:15:27.154 INFO:tasks.workunit.client.1.vm05.stdout:6/791: mknod d17/d22/dce/c12a 0 2026-03-09T16:15:27.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.154+0000 7fcd72ffd640 1 --2- 192.168.123.103:0/182720852 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fcd54077750 0x7fcd54079c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:27.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.155+0000 7fcd72ffd640 1 --2- 192.168.123.103:0/182720852 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fcd54077750 0x7fcd54079c10 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fcd6c002790 tx=0x7fcd6c034040 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:27.161 INFO:tasks.workunit.client.1.vm05.stdout:1/892: mkdir d7/dd/d21/d39/d48/d8c/dd8/d103/d136 0 2026-03-09T16:15:27.162 INFO:tasks.workunit.client.1.vm05.stdout:2/752: fsync db/dd/d15/d3f/d5b/d60/d95/f76 0 2026-03-09T16:15:27.163 INFO:tasks.workunit.client.1.vm05.stdout:1/893: dread d7/dd/d21/d39/d48/d5d/f98 [0,4194304] 0 2026-03-09T16:15:27.164 INFO:tasks.workunit.client.1.vm05.stdout:8/848: symlink d4/d6/db/dc/d5d/l115 0 2026-03-09T16:15:27.166 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.164+0000 7fcd70ff9640 1 -- 192.168.123.103:0/182720852 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7fcd64061b20 con 0x7fcd74071a50 2026-03-09T16:15:27.173 INFO:tasks.workunit.client.1.vm05.stdout:1/894: dwrite d7/f3f [0,4194304] 0 2026-03-09T16:15:27.175 INFO:tasks.workunit.client.1.vm05.stdout:1/895: write d7/d62/db6/fff [66040,26603] 0 2026-03-09T16:15:27.220 INFO:tasks.workunit.client.1.vm05.stdout:0/797: write d5/db/d5b/d82/fc7 [187550,63785] 0 2026-03-09T16:15:27.221 INFO:tasks.workunit.client.1.vm05.stdout:5/865: dread d8/d18/d1b/f30 [0,4194304] 0 2026-03-09T16:15:27.231 INFO:tasks.workunit.client.1.vm05.stdout:5/866: dread d8/d18/dbc/dcc/daa/fb1 [0,4194304] 0 2026-03-09T16:15:27.235 INFO:tasks.workunit.client.1.vm05.stdout:6/792: mknod d17/d22/d27/d8a/d8b/c12b 0 2026-03-09T16:15:27.247 INFO:tasks.workunit.client.1.vm05.stdout:7/847: dwrite d1/d2/d8/dc/d1b/d30/d4b/d65/f83 [0,4194304] 0 2026-03-09T16:15:27.260 INFO:tasks.workunit.client.1.vm05.stdout:3/751: dwrite d0/d9/d22/d5f/d90/dae/fb1 [0,4194304] 0 2026-03-09T16:15:27.269 INFO:tasks.workunit.client.1.vm05.stdout:4/896: rename d5/de/d82/dc1/c13c to d5/de/d15/d21/da0/de3/d100/c144 0 2026-03-09T16:15:27.303 INFO:tasks.workunit.client.1.vm05.stdout:8/849: creat d4/d6/db/dc/d5d/da0/dbf/f116 x:0 0 0 2026-03-09T16:15:27.303 INFO:tasks.workunit.client.1.vm05.stdout:1/896: creat d7/dd/d21/d39/d5a/d113/f137 x:0 0 0 2026-03-09T16:15:27.325 INFO:tasks.workunit.client.1.vm05.stdout:7/848: truncate d1/d2/d11/f25 1206064 0 2026-03-09T16:15:27.325 INFO:tasks.workunit.client.1.vm05.stdout:3/752: mknod d0/d9/d22/d5f/d75/d76/d88/c103 0 2026-03-09T16:15:27.329 INFO:tasks.workunit.client.1.vm05.stdout:7/849: dwrite d1/d2/d8/dc/d1b/d30/d4b/fdf [0,4194304] 0 2026-03-09T16:15:27.334 INFO:tasks.workunit.client.1.vm05.stdout:7/850: dwrite d1/d2/d8/dc/d14/f41 [0,4194304] 0 2026-03-09T16:15:27.359 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.358+0000 7fcd78f29640 1 -- 192.168.123.103:0/182720852 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fcd740008d0 con 0x7fcd54077750 2026-03-09T16:15:27.364 INFO:tasks.workunit.client.1.vm05.stdout:9/837: rename d4/d10/d35/d36/d48/d54/d59/f9f to d4/d10/d35/d2b/d31/d82/dec/f114 0 2026-03-09T16:15:27.373 INFO:tasks.workunit.client.1.vm05.stdout:2/753: dwrite db/dd/d15/d3f/d5b/f9f [0,4194304] 0 2026-03-09T16:15:27.381 INFO:tasks.workunit.client.1.vm05.stdout:9/838: dread d4/d10/d35/d36/d48/d54/db0/fc6 [0,4194304] 0 2026-03-09T16:15:27.388 INFO:tasks.workunit.client.1.vm05.stdout:0/798: creat d5/d2c/d49/d83/d8b/daf/f115 x:0 0 0 2026-03-09T16:15:27.389 INFO:tasks.workunit.client.1.vm05.stdout:0/799: chown d5/d2c/d49/d83/fc9 441 1 2026-03-09T16:15:27.390 INFO:tasks.workunit.client.1.vm05.stdout:1/897: dwrite d7/d62/f90 [4194304,4194304] 0 2026-03-09T16:15:27.391 INFO:tasks.workunit.client.1.vm05.stdout:7/851: mkdir d1/d2/d8/dc/d1b/d30/d4b/d65/db1/d12b 0 2026-03-09T16:15:27.399 INFO:tasks.workunit.client.1.vm05.stdout:3/753: rename d0/d9/d22/d5f/d75/d76/d88/da3/lc1 to d0/d9/d22/d5f/d90/dae/dd2/d101/l104 0 2026-03-09T16:15:27.400 INFO:tasks.workunit.client.1.vm05.stdout:2/754: fdatasync db/dd/d15/d3f/d5b/f97 0 2026-03-09T16:15:27.400 INFO:tasks.workunit.client.1.vm05.stdout:0/800: fsync d5/d1b/f50 0 2026-03-09T16:15:27.400 INFO:tasks.workunit.client.1.vm05.stdout:5/867: link d8/d18/dbc/fc3 d8/d18/d1b/d47/d48/d73/dfb/f124 0 2026-03-09T16:15:27.401 INFO:tasks.workunit.client.1.vm05.stdout:6/793: link d17/d1d/c2b d17/d22/d27/d34/dd1/d10a/c12c 0 2026-03-09T16:15:27.402 INFO:tasks.workunit.client.1.vm05.stdout:7/852: dwrite d1/d2/d8/dc/dd4/ff4 [0,4194304] 0 2026-03-09T16:15:27.402 INFO:tasks.workunit.client.1.vm05.stdout:5/868: truncate d8/d95/f11c 477849 0 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (4m) 3s ago 5m 25.0M - 0.25.0 c8568f914cd2 062551060e4c 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (5m) 3s ago 5m 8958k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (4m) 3s ago 4m 8812k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (5m) 3s ago 5m 7625k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 05e9be717970 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (4m) 3s ago 4m 7608k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 32f80ccecaa9 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (4m) 3s ago 5m 89.6M - 9.4.7 954c08fa6188 9b9ef5226e00 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (3m) 3s ago 3m 17.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8e7e3eb06891 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (3m) 3s ago 3m 263M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 f23b1415c23e 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (3m) 3s ago 3m 15.3M - 
18.2.7-1055-gab47f43c b6fe7eb6a9d0 fbf69f4859f1 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (3m) 3s ago 3m 17.2M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 e7155e6e0a47 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:8443,9283,8765 running (51s) 3s ago 5m 601M - 19.2.3-678-ge911bdeb 654f31e6858e f10e9f43c355 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (26s) 3s ago 4m 488M - 19.2.3-678-ge911bdeb 654f31e6858e 5276dc4902e9 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (5m) 3s ago 5m 57.5M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 b86752d320b6 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (4m) 3s ago 4m 48.2M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 90242efb0978 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (13s) 3s ago 5m 8158k - 1.7.0 72c9c2088986 73da4350a8ed 2026-03-09T16:15:27.407 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (6s) 3s ago 4m 5356k - 1.7.0 72c9c2088986 0be807a191b0 2026-03-09T16:15:27.408 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (4m) 3s ago 4m 323M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2ea78f0d62f8 2026-03-09T16:15:27.408 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (4m) 3s ago 4m 327M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6169f9824413 2026-03-09T16:15:27.408 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (4m) 3s ago 4m 251M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 31188175e77b 2026-03-09T16:15:27.408 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (3m) 3s ago 3m 354M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d95aab347c9f 2026-03-09T16:15:27.408 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (3m) 3s ago 3m 304M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 5076005b452d 2026-03-09T16:15:27.408 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (3m) 3s ago 3m 266M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 56fb3849b087 2026-03-09T16:15:27.408 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (29s) 3s ago 4m 41.6M - 2.43.0 a07b618ecd1d e929b201f901 2026-03-09T16:15:27.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.404+0000 7fcd70ff9640 1 -- 192.168.123.103:0/182720852 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fcd740008d0 con 0x7fcd54077750 2026-03-09T16:15:27.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.411+0000 7fcd527fc640 1 -- 192.168.123.103:0/182720852 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fcd54077750 msgr2=0x7fcd54079c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:27.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.411+0000 7fcd527fc640 1 --2- 192.168.123.103:0/182720852 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fcd54077750 0x7fcd54079c10 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fcd6c002790 tx=0x7fcd6c034040 comp rx=0 tx=0).stop 2026-03-09T16:15:27.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.411+0000 7fcd527fc640 1 -- 192.168.123.103:0/182720852 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd74071a50 msgr2=0x7fcd740840a0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:27.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.411+0000 7fcd527fc640 1 --2- 192.168.123.103:0/182720852 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd74071a50 0x7fcd740840a0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fcd64009870 tx=0x7fcd64009d40 comp rx=0 tx=0).stop 2026-03-09T16:15:27.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.412+0000 7fcd527fc640 1 -- 192.168.123.103:0/182720852 shutdown_connections 2026-03-09T16:15:27.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.412+0000 7fcd527fc640 1 --2- 192.168.123.103:0/182720852 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fcd54077750 0x7fcd54079c10 secure :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fcd6c002790 tx=0x7fcd6c034040 comp rx=0 tx=0).stop 2026-03-09T16:15:27.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.412+0000 7fcd527fc640 1 --2- 192.168.123.103:0/182720852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd740826f0 0x7fcd74082b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.412+0000 7fcd527fc640 1 --2- 192.168.123.103:0/182720852 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd74071a50 0x7fcd740840a0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.412+0000 7fcd527fc640 1 -- 192.168.123.103:0/182720852 >> 192.168.123.103:0/182720852 conn(0x7fcd7406d4f0 msgr2=0x7fcd74073130 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:27.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.412+0000 7fcd527fc640 1 -- 192.168.123.103:0/182720852 shutdown_connections 2026-03-09T16:15:27.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.412+0000 7fcd527fc640 1 -- 192.168.123.103:0/182720852 wait complete. 
2026-03-09T16:15:27.413 INFO:tasks.workunit.client.1.vm05.stdout:7/853: chown d1/d2/d8/dc/dd4/da8/cc2 27443 1 2026-03-09T16:15:27.413 INFO:tasks.workunit.client.1.vm05.stdout:7/854: dread - d1/d2/d8/dc/d1b/f66 zero size 2026-03-09T16:15:27.437 INFO:tasks.workunit.client.1.vm05.stdout:1/898: rename d7/dd/de/d52/df6 to d7/dd/d21/d39/d87/db9/d138 0 2026-03-09T16:15:27.442 INFO:tasks.workunit.client.1.vm05.stdout:3/754: creat d0/d9/d8b/f105 x:0 0 0 2026-03-09T16:15:27.452 INFO:tasks.workunit.client.1.vm05.stdout:4/897: dwrite d5/de/d15/d21/f2a [0,4194304] 0 2026-03-09T16:15:27.453 INFO:tasks.workunit.client.1.vm05.stdout:2/755: unlink db/dd/d15/d3f/d5b/d60/d95/ldb 0 2026-03-09T16:15:27.457 INFO:tasks.workunit.client.1.vm05.stdout:5/869: creat d8/d1d/f125 x:0 0 0 2026-03-09T16:15:27.462 INFO:tasks.workunit.client.1.vm05.stdout:3/755: fsync d0/d33/f63 0 2026-03-09T16:15:27.462 INFO:tasks.workunit.client.1.vm05.stdout:8/850: dwrite d4/d6/f9 [0,4194304] 0 2026-03-09T16:15:27.476 INFO:tasks.workunit.client.1.vm05.stdout:6/794: mknod d17/d22/d27/d34/d42/d65/d10d/c12d 0 2026-03-09T16:15:27.480 INFO:tasks.workunit.client.1.vm05.stdout:5/870: rename d8/d18/d1b/d6b/cca to d8/d18/d1b/d6b/c126 0 2026-03-09T16:15:27.497 INFO:tasks.workunit.client.1.vm05.stdout:5/871: stat d8/d18/d1b/d47/d48/d73/d80/le6 0 2026-03-09T16:15:27.497 INFO:tasks.workunit.client.1.vm05.stdout:5/872: write d8/d59/d5b/d8b/ffe [1562561,117514] 0 2026-03-09T16:15:27.497 INFO:tasks.workunit.client.1.vm05.stdout:7/855: mkdir d1/d2/d11/d12c 0 2026-03-09T16:15:27.497 INFO:tasks.workunit.client.1.vm05.stdout:0/801: rename d5/ldd to d5/db/d5f/l116 0 2026-03-09T16:15:27.497 INFO:tasks.workunit.client.1.vm05.stdout:0/802: write d5/d2c/d49/d83/d8b/daf/f108 [329073,108704] 0 2026-03-09T16:15:27.497 INFO:tasks.workunit.client.1.vm05.stdout:4/898: creat d5/de/d15/d21/d27/d3c/f145 x:0 0 0 2026-03-09T16:15:27.497 INFO:tasks.workunit.client.1.vm05.stdout:3/756: unlink d0/d33/cc8 0 2026-03-09T16:15:27.498 INFO:tasks.workunit.client.1.vm05.stdout:8/851: mkdir d4/d6/d3a/d15/dd9/d117 0 2026-03-09T16:15:27.500 INFO:tasks.workunit.client.1.vm05.stdout:9/839: rename d4/d10/d35/c1e to d4/d10/d35/c115 0 2026-03-09T16:15:27.511 INFO:tasks.workunit.client.1.vm05.stdout:7/856: truncate d1/d2/d8/dc/d1b/d30/d4b/d65/f8f 4860095 0 2026-03-09T16:15:27.516 INFO:tasks.workunit.client.1.vm05.stdout:1/899: rename d7/d15/d6e/dbc/dd6/d11c/f11f to d7/d62/d72/f139 0 2026-03-09T16:15:27.522 INFO:tasks.workunit.client.1.vm05.stdout:8/852: fsync d4/f3e 0 2026-03-09T16:15:27.530 INFO:tasks.workunit.client.1.vm05.stdout:8/853: dread - d4/d6/db/f102 zero size 2026-03-09T16:15:27.530 INFO:tasks.workunit.client.1.vm05.stdout:8/854: stat d4/d6/db/fed 0 2026-03-09T16:15:27.530 INFO:tasks.workunit.client.1.vm05.stdout:7/857: creat d1/d2/d8/dc/d33/f12d x:0 0 0 2026-03-09T16:15:27.530 INFO:tasks.workunit.client.1.vm05.stdout:5/873: rename d8/d18/d1b/d47/d48/f61 to d8/dc8/f127 0 2026-03-09T16:15:27.535 INFO:tasks.workunit.client.1.vm05.stdout:4/899: creat d5/d9c/d124/f146 x:0 0 0 2026-03-09T16:15:27.536 INFO:tasks.workunit.client.1.vm05.stdout:4/900: write d5/f95 [1494737,25452] 0 2026-03-09T16:15:27.537 INFO:tasks.workunit.client.1.vm05.stdout:3/757: mknod d0/c106 0 2026-03-09T16:15:27.553 INFO:tasks.workunit.client.1.vm05.stdout:2/756: write db/dd/d15/f6f [93187,15688] 0 2026-03-09T16:15:27.563 INFO:tasks.workunit.client.1.vm05.stdout:7/858: creat d1/d2/d11/d86/d8a/f12e x:0 0 0 2026-03-09T16:15:27.564 INFO:tasks.workunit.client.1.vm05.stdout:1/900: mkdir d7/d15/d6e/dbc/dd6/d112/d13a 0 
2026-03-09T16:15:27.564 INFO:tasks.workunit.client.1.vm05.stdout:0/803: link d5/d97/cda d5/d2c/d49/d83/d8b/daf/c117 0 2026-03-09T16:15:27.565 INFO:tasks.workunit.client.1.vm05.stdout:9/840: dwrite d4/d10/d35/d2b/d38/d65/fa5 [0,4194304] 0 2026-03-09T16:15:27.570 INFO:tasks.workunit.client.1.vm05.stdout:9/841: chown d4/d10/d35/d2b/d31/d82 11 1 2026-03-09T16:15:27.573 INFO:tasks.workunit.client.1.vm05.stdout:9/842: write d4/f3c [4510824,12765] 0 2026-03-09T16:15:27.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.573+0000 7f904180f640 1 -- 192.168.123.103:0/3664879393 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f903c072370 msgr2=0x7f903c10c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:27.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.573+0000 7f904180f640 1 --2- 192.168.123.103:0/3664879393 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f903c072370 0x7f903c10c590 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f90240099b0 tx=0x7f902402f240 comp rx=0 tx=0).stop 2026-03-09T16:15:27.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.573+0000 7f904180f640 1 -- 192.168.123.103:0/3664879393 shutdown_connections 2026-03-09T16:15:27.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.573+0000 7f904180f640 1 --2- 192.168.123.103:0/3664879393 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f903c072370 0x7f903c10c590 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.573+0000 7f904180f640 1 --2- 192.168.123.103:0/3664879393 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f903c0719a0 0x7f903c071da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.573+0000 7f904180f640 1 -- 192.168.123.103:0/3664879393 >> 192.168.123.103:0/3664879393 conn(0x7f903c06d4f0 msgr2=0x7f903c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:27.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.573+0000 7f904180f640 1 -- 192.168.123.103:0/3664879393 shutdown_connections 2026-03-09T16:15:27.574 INFO:tasks.workunit.client.1.vm05.stdout:9/843: write d4/f4a [9243634,111290] 0 2026-03-09T16:15:27.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.573+0000 7f904180f640 1 -- 192.168.123.103:0/3664879393 wait complete. 
2026-03-09T16:15:27.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.574+0000 7f904180f640 1 Processor -- start 2026-03-09T16:15:27.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.575+0000 7f904180f640 1 -- start start 2026-03-09T16:15:27.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.575+0000 7f904180f640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f903c0719a0 0x7f903c115970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:27.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.575+0000 7f904180f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f903c072370 0x7f903c115eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:27.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.575+0000 7f904180f640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f903c1173b0 con 0x7f903c072370 2026-03-09T16:15:27.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.575+0000 7f904180f640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f903c117520 con 0x7f903c0719a0 2026-03-09T16:15:27.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.575+0000 7f904080d640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f903c0719a0 0x7f903c115970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:27.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.575+0000 7f904080d640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f903c0719a0 0x7f903c115970 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47854/0 (socket says 192.168.123.103:47854) 2026-03-09T16:15:27.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.575+0000 7f904080d640 1 -- 192.168.123.103:0/2714829032 learned_addr learned my addr 192.168.123.103:0/2714829032 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:15:27.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.575+0000 7f904080d640 1 -- 192.168.123.103:0/2714829032 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f903c072370 msgr2=0x7f903c115eb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:27.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.575+0000 7f904080d640 1 --2- 192.168.123.103:0/2714829032 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f903c072370 0x7f903c115eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.575+0000 7f904080d640 1 -- 192.168.123.103:0/2714829032 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9030009590 con 0x7f903c0719a0 2026-03-09T16:15:27.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.576+0000 7f904080d640 1 --2- 192.168.123.103:0/2714829032 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f903c0719a0 0x7f903c115970 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f9030002760 tx=0x7f9030002c30 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:27.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.576+0000 7f9039ffb640 1 -- 192.168.123.103:0/2714829032 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f903000ecf0 con 0x7f903c0719a0 2026-03-09T16:15:27.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.576+0000 7f904180f640 1 -- 192.168.123.103:0/2714829032 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9024009660 con 0x7f903c0719a0 2026-03-09T16:15:27.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.576+0000 7f904180f640 1 -- 192.168.123.103:0/2714829032 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f903c1b5700 con 0x7f903c0719a0 2026-03-09T16:15:27.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.576+0000 7f9039ffb640 1 -- 192.168.123.103:0/2714829032 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9030002e90 con 0x7f903c0719a0 2026-03-09T16:15:27.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.576+0000 7f9039ffb640 1 -- 192.168.123.103:0/2714829032 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f90300105e0 con 0x7f903c0719a0 2026-03-09T16:15:27.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.577+0000 7f9039ffb640 1 -- 192.168.123.103:0/2714829032 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7f9030016020 con 0x7f903c0719a0 2026-03-09T16:15:27.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.578+0000 7f9039ffb640 1 --2- 192.168.123.103:0/2714829032 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9014077680 0x7f9014079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:27.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.578+0000 7f9039ffb640 1 -- 192.168.123.103:0/2714829032 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9030099fd0 con 0x7f903c0719a0 2026-03-09T16:15:27.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.578+0000 7f903bfff640 1 --2- 192.168.123.103:0/2714829032 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9014077680 0x7f9014079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:27.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.580+0000 7f904180f640 1 -- 192.168.123.103:0/2714829032 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f903c1183e0 con 0x7f903c0719a0 2026-03-09T16:15:27.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.581+0000 7f903bfff640 1 --2- 192.168.123.103:0/2714829032 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9014077680 0x7f9014079b40 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f902402f750 tx=0x7f9024005b20 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:27.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.584+0000 7f9039ffb640 1 -- 192.168.123.103:0/2714829032 <== mon.1 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f90300628e0 con 0x7f903c0719a0 2026-03-09T16:15:27.592 INFO:tasks.workunit.client.1.vm05.stdout:3/758: mknod d0/d9/d22/df0/c107 0 2026-03-09T16:15:27.608 INFO:tasks.workunit.client.1.vm05.stdout:8/855: rename d4/d6/d3a/d40/d71/lde to d4/d6/db/df/dd1/l118 0 2026-03-09T16:15:27.611 INFO:tasks.workunit.client.1.vm05.stdout:1/901: rmdir d7/dd 39 2026-03-09T16:15:27.612 INFO:tasks.workunit.client.1.vm05.stdout:0/804: fdatasync d5/f73 0 2026-03-09T16:15:27.615 INFO:tasks.workunit.client.1.vm05.stdout:4/901: dwrite d5/de/d15/d21/d39/f42 [0,4194304] 0 2026-03-09T16:15:27.619 INFO:tasks.workunit.client.1.vm05.stdout:3/759: symlink d0/d9/d22/d5f/d7b/l108 0 2026-03-09T16:15:27.619 INFO:tasks.workunit.client.1.vm05.stdout:9/844: fsync d4/d10/d35/d2b/d38/f112 0 2026-03-09T16:15:27.620 INFO:tasks.workunit.client.1.vm05.stdout:6/795: link d17/d22/d27/d34/d4b/f98 d17/d22/d27/f12e 0 2026-03-09T16:15:27.621 INFO:tasks.workunit.client.1.vm05.stdout:6/796: chown d17/d1d/fc1 11 1 2026-03-09T16:15:27.621 INFO:tasks.workunit.client.1.vm05.stdout:3/760: truncate d0/d9/d22/d5f/d75/d76/d88/d89/fd5 434134 0 2026-03-09T16:15:27.622 INFO:tasks.workunit.client.1.vm05.stdout:8/856: mknod d4/de9/d10c/c119 0 2026-03-09T16:15:27.622 INFO:tasks.workunit.client.1.vm05.stdout:6/797: read d17/d1d/fa8 [391912,4640] 0 2026-03-09T16:15:27.625 INFO:tasks.workunit.client.1.vm05.stdout:1/902: link d7/f34 d7/d15/d16/f13b 0 2026-03-09T16:15:27.626 INFO:tasks.workunit.client.1.vm05.stdout:3/761: write d0/da9/fff [980639,74445] 0 2026-03-09T16:15:27.627 INFO:tasks.workunit.client.1.vm05.stdout:9/845: rmdir d4/d10/d35/d2b 39 2026-03-09T16:15:27.627 INFO:tasks.workunit.client.1.vm05.stdout:5/874: link d8/d18/d1b/d47/d4e/la7 d8/d18/d1b/d47/d4e/d76/d8f/dab/l128 0 2026-03-09T16:15:27.638 INFO:tasks.workunit.client.1.vm05.stdout:3/762: fdatasync d0/d33/f85 0 2026-03-09T16:15:27.638 INFO:tasks.workunit.client.1.vm05.stdout:1/903: truncate d7/d27/f4d 680063 0 2026-03-09T16:15:27.638 INFO:tasks.workunit.client.1.vm05.stdout:8/857: dwrite d4/d6/db/dc/d2e/d85/ff9 [0,4194304] 0 2026-03-09T16:15:27.656 INFO:tasks.workunit.client.1.vm05.stdout:2/757: link db/dd/d15/d3f/c96 db/dd/d7b/cf7 0 2026-03-09T16:15:27.656 INFO:tasks.workunit.client.1.vm05.stdout:2/758: fdatasync db/dd/d15/d46/d67/ff0 0 2026-03-09T16:15:27.657 INFO:tasks.workunit.client.1.vm05.stdout:5/875: dread d8/f6f [0,4194304] 0 2026-03-09T16:15:27.658 INFO:tasks.workunit.client.1.vm05.stdout:9/846: rmdir d4/d10/d35/d36/d10d 0 2026-03-09T16:15:27.658 INFO:tasks.workunit.client.1.vm05.stdout:3/763: creat d0/d9/d22/d5f/d7b/da8/f109 x:0 0 0 2026-03-09T16:15:27.659 INFO:tasks.workunit.client.1.vm05.stdout:8/858: symlink d4/d6/d3a/d40/d6a/l11a 0 2026-03-09T16:15:27.659 INFO:tasks.workunit.client.1.vm05.stdout:4/902: sync 2026-03-09T16:15:27.662 INFO:tasks.workunit.client.1.vm05.stdout:1/904: truncate d7/dd/d21/d39/d87/db9/d138/d55/df9/d122/f10f 731895 0 2026-03-09T16:15:27.663 INFO:tasks.workunit.client.1.vm05.stdout:8/859: dread d4/d6/f9 [0,4194304] 0 2026-03-09T16:15:27.665 INFO:tasks.workunit.client.1.vm05.stdout:4/903: truncate d5/de/d15/d21/d27/f29 11679 0 2026-03-09T16:15:27.669 INFO:tasks.workunit.client.1.vm05.stdout:1/905: unlink d7/daa/cb2 0 2026-03-09T16:15:27.669 INFO:tasks.workunit.client.1.vm05.stdout:5/876: truncate d8/d18/d1b/d47/d4e/f64 362628 0 2026-03-09T16:15:27.670 INFO:tasks.workunit.client.1.vm05.stdout:8/860: rmdir d4/d6/db/dc/d3b 39 
2026-03-09T16:15:27.670 INFO:tasks.workunit.client.1.vm05.stdout:1/906: read d7/dd/d21/d39/f86 [240634,104861] 0 2026-03-09T16:15:27.671 INFO:tasks.workunit.client.1.vm05.stdout:5/877: write d8/d18/dbc/dcc/daa/f10c [268767,35551] 0 2026-03-09T16:15:27.672 INFO:tasks.workunit.client.1.vm05.stdout:4/904: unlink d5/de/d15/da9/db1/dad/f48 0 2026-03-09T16:15:27.676 INFO:tasks.workunit.client.1.vm05.stdout:1/907: sync 2026-03-09T16:15:27.676 INFO:tasks.workunit.client.1.vm05.stdout:5/878: sync 2026-03-09T16:15:27.677 INFO:tasks.workunit.client.1.vm05.stdout:4/905: truncate d5/de/d15/da9/db1/dad/d90/dd8/f117 589088 0 2026-03-09T16:15:27.678 INFO:tasks.workunit.client.1.vm05.stdout:8/861: fsync d4/d6/fae 0 2026-03-09T16:15:27.680 INFO:tasks.workunit.client.1.vm05.stdout:1/908: dread - d7/dd/d21/d63/d71/ddc/df8/fbb zero size 2026-03-09T16:15:27.680 INFO:tasks.workunit.client.1.vm05.stdout:4/906: mknod d5/de/d15/da9/c147 0 2026-03-09T16:15:27.683 INFO:tasks.workunit.client.1.vm05.stdout:8/862: creat d4/d6/d9a/f11b x:0 0 0 2026-03-09T16:15:27.686 INFO:tasks.workunit.client.1.vm05.stdout:1/909: readlink d7/dd/d21/d63/d71/ddc/df8/le5 0 2026-03-09T16:15:27.688 INFO:tasks.workunit.client.1.vm05.stdout:4/907: creat d5/de/d15/da9/db1/dad/d90/f148 x:0 0 0 2026-03-09T16:15:27.693 INFO:tasks.workunit.client.1.vm05.stdout:5/879: rename d8/d18/d1b/d47/d48/d73/dfb/f124 to d8/d18/d1b/d78/f129 0 2026-03-09T16:15:27.696 INFO:tasks.workunit.client.1.vm05.stdout:8/863: creat d4/d6/d3a/d15/f11c x:0 0 0 2026-03-09T16:15:27.699 INFO:tasks.workunit.client.1.vm05.stdout:4/908: write d5/de/d15/d21/d39/d91/fc3 [293361,102194] 0 2026-03-09T16:15:27.699 INFO:tasks.workunit.client.1.vm05.stdout:7/859: dwrite d1/d2/d8/dc/d1b/d30/d4b/d65/f8f [0,4194304] 0 2026-03-09T16:15:27.712 INFO:tasks.workunit.client.1.vm05.stdout:5/880: rename d8/d18/d1b/d47/dda to d8/d18/d1b/d47/d4e/d76/d8f/d12a 0 2026-03-09T16:15:27.714 INFO:tasks.workunit.client.1.vm05.stdout:6/798: dwrite d17/d22/d27/d34/f85 [0,4194304] 0 2026-03-09T16:15:27.727 INFO:tasks.workunit.client.1.vm05.stdout:2/759: write db/dd/d15/d1f/d21/f47 [3760957,48673] 0 2026-03-09T16:15:27.727 INFO:tasks.workunit.client.1.vm05.stdout:3/764: write d0/d9/d22/d6b/fab [562031,49735] 0 2026-03-09T16:15:27.728 INFO:tasks.workunit.client.1.vm05.stdout:0/805: dwrite d5/d2c/d49/d83/d8b/daf/fc4 [0,4194304] 0 2026-03-09T16:15:27.733 INFO:tasks.workunit.client.1.vm05.stdout:4/909: creat d5/de/d15/d21/d39/f149 x:0 0 0 2026-03-09T16:15:27.737 INFO:tasks.workunit.client.1.vm05.stdout:7/860: creat d1/d2/d8/dc/d1b/d30/d4b/d65/db1/d116/f12f x:0 0 0 2026-03-09T16:15:27.741 INFO:tasks.workunit.client.1.vm05.stdout:9/847: dwrite d4/d10/f80 [0,4194304] 0 2026-03-09T16:15:27.770 INFO:tasks.workunit.client.1.vm05.stdout:5/881: symlink d8/d53/d7e/l12b 0 2026-03-09T16:15:27.779 INFO:tasks.workunit.client.1.vm05.stdout:3/765: chown d0/c3f 102650 1 2026-03-09T16:15:27.784 INFO:tasks.workunit.client.1.vm05.stdout:3/766: chown d0/d9/d97/dbc 28167736 1 2026-03-09T16:15:27.784 INFO:tasks.workunit.client.1.vm05.stdout:0/806: unlink d5/d2c/f28 0 2026-03-09T16:15:27.784 INFO:tasks.workunit.client.1.vm05.stdout:0/807: chown d5/d11/d4f/d68/ff1 1255755 1 2026-03-09T16:15:27.822 INFO:tasks.workunit.client.1.vm05.stdout:1/910: truncate d7/dd/d21/d2d/f117 542984 0 2026-03-09T16:15:27.833 INFO:tasks.workunit.client.1.vm05.stdout:8/864: dread d4/d6/d9a/db3/fa5 [0,4194304] 0 2026-03-09T16:15:27.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.878+0000 7f904180f640 1 -- 192.168.123.103:0/2714829032 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f903c1b5a50 con 0x7f903c0719a0 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 12, 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:15:27.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.881+0000 7f9039ffb640 1 -- 192.168.123.103:0/2714829032 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+739 (secure 0 0 0) 0x7f903000b2d0 con 0x7f903c0719a0 2026-03-09T16:15:27.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.884+0000 7f900b7fe640 1 -- 192.168.123.103:0/2714829032 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9014077680 msgr2=0x7f9014079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:27.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.884+0000 7f900b7fe640 1 --2- 192.168.123.103:0/2714829032 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9014077680 0x7f9014079b40 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f902402f750 tx=0x7f9024005b20 comp rx=0 tx=0).stop 2026-03-09T16:15:27.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.884+0000 7f900b7fe640 1 -- 192.168.123.103:0/2714829032 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f903c0719a0 msgr2=0x7f903c115970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:27.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.884+0000 7f900b7fe640 1 --2- 192.168.123.103:0/2714829032 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f903c0719a0 0x7f903c115970 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto 
rx=0x7f9030002760 tx=0x7f9030002c30 comp rx=0 tx=0).stop 2026-03-09T16:15:27.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.887+0000 7f900b7fe640 1 -- 192.168.123.103:0/2714829032 shutdown_connections 2026-03-09T16:15:27.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.887+0000 7f900b7fe640 1 --2- 192.168.123.103:0/2714829032 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9014077680 0x7f9014079b40 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.887+0000 7f900b7fe640 1 --2- 192.168.123.103:0/2714829032 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f903c072370 0x7f903c115eb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.887+0000 7f900b7fe640 1 --2- 192.168.123.103:0/2714829032 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f903c0719a0 0x7f903c115970 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.887+0000 7f900b7fe640 1 -- 192.168.123.103:0/2714829032 >> 192.168.123.103:0/2714829032 conn(0x7f903c06d4f0 msgr2=0x7f903c10a810 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:27.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.887+0000 7f900b7fe640 1 -- 192.168.123.103:0/2714829032 shutdown_connections 2026-03-09T16:15:27.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.887+0000 7f900b7fe640 1 -- 192.168.123.103:0/2714829032 wait complete. 2026-03-09T16:15:27.946 INFO:tasks.workunit.client.1.vm05.stdout:6/799: creat d17/d22/d27/d34/d42/d53/d87/d104/f12f x:0 0 0 2026-03-09T16:15:27.956 INFO:tasks.workunit.client.1.vm05.stdout:4/910: write d5/de/d15/d21/d27/d3c/d5c/d5f/fdd [929241,47343] 0 2026-03-09T16:15:27.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.963+0000 7ff2dfb97640 1 -- 192.168.123.103:0/1852687424 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2d8072390 msgr2=0x7ff2d810c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:27.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.963+0000 7ff2dfb97640 1 --2- 192.168.123.103:0/1852687424 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2d8072390 0x7ff2d810c590 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7ff2d400b0a0 tx=0x7ff2d402f4c0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.963+0000 7ff2dfb97640 1 -- 192.168.123.103:0/1852687424 shutdown_connections 2026-03-09T16:15:27.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.963+0000 7ff2dfb97640 1 --2- 192.168.123.103:0/1852687424 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2d8072390 0x7ff2d810c590 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.963+0000 7ff2dfb97640 1 --2- 192.168.123.103:0/1852687424 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2d80719c0 0x7ff2d8071dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.963+0000 
7ff2dfb97640 1 -- 192.168.123.103:0/1852687424 >> 192.168.123.103:0/1852687424 conn(0x7ff2d806d4f0 msgr2=0x7ff2d806f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:27.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.963+0000 7ff2dfb97640 1 -- 192.168.123.103:0/1852687424 shutdown_connections 2026-03-09T16:15:27.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:27 vm03.local ceph-mon[51019]: from='client.14670 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:15:27.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:27 vm03.local ceph-mon[51019]: from='client.14672 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:15:27.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:27 vm03.local ceph-mon[51019]: pgmap v23: 65 pgs: 65 active+clean; 3.0 GiB data, 9.9 GiB used, 110 GiB / 120 GiB avail; 46 MiB/s rd, 129 MiB/s wr, 275 op/s 2026-03-09T16:15:27.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:27 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:27.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:27 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:27.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:27 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:27.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:27 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:27.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:27 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:15:27.965 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:27 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:15:27.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.964+0000 7ff2dfb97640 1 -- 192.168.123.103:0/1852687424 wait complete. 
2026-03-09T16:15:27.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.964+0000 7ff2dfb97640 1 Processor -- start 2026-03-09T16:15:27.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.964+0000 7ff2dfb97640 1 -- start start 2026-03-09T16:15:27.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.964+0000 7ff2dfb97640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2d80719c0 0x7ff2d8115b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:27.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.964+0000 7ff2dfb97640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2d8072390 0x7ff2d81160c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:27.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.964+0000 7ff2dfb97640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff2d81175c0 con 0x7ff2d80719c0 2026-03-09T16:15:27.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.964+0000 7ff2dfb97640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff2d8116630 con 0x7ff2d8072390 2026-03-09T16:15:27.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.966+0000 7ff2dd10b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2d8072390 0x7ff2d81160c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:27.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.967+0000 7ff2dd10b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2d8072390 0x7ff2d81160c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47870/0 (socket says 192.168.123.103:47870) 2026-03-09T16:15:27.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.967+0000 7ff2dd10b640 1 -- 192.168.123.103:0/1345500716 learned_addr learned my addr 192.168.123.103:0/1345500716 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:15:27.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.967+0000 7ff2dd10b640 1 -- 192.168.123.103:0/1345500716 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2d80719c0 msgr2=0x7ff2d8115b80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:27.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.967+0000 7ff2dd10b640 1 --2- 192.168.123.103:0/1345500716 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2d80719c0 0x7ff2d8115b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:27.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.967+0000 7ff2dd10b640 1 -- 192.168.123.103:0/1345500716 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff2d4009d00 con 0x7ff2d8072390 2026-03-09T16:15:27.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.968+0000 7ff2dd10b640 1 --2- 192.168.123.103:0/1345500716 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2d8072390 0x7ff2d81160c0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7ff2d4009cd0 tx=0x7ff2d4009300 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:27.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.969+0000 7ff2c6ffd640 1 -- 192.168.123.103:0/1345500716 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff2d4004960 con 0x7ff2d8072390 2026-03-09T16:15:27.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.969+0000 7ff2c6ffd640 1 -- 192.168.123.103:0/1345500716 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff2d4007d40 con 0x7ff2d8072390 2026-03-09T16:15:27.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.969+0000 7ff2c6ffd640 1 -- 192.168.123.103:0/1345500716 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff2d4040b60 con 0x7ff2d8072390 2026-03-09T16:15:27.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.969+0000 7ff2dfb97640 1 -- 192.168.123.103:0/1345500716 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff2d81168b0 con 0x7ff2d8072390 2026-03-09T16:15:27.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.969+0000 7ff2dfb97640 1 -- 192.168.123.103:0/1345500716 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff2d81b5980 con 0x7ff2d8072390 2026-03-09T16:15:27.975 INFO:tasks.workunit.client.1.vm05.stdout:9/848: mknod d4/d10/d35/d36/d48/d54/c116 0 2026-03-09T16:15:27.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.973+0000 7ff2c4ff9640 1 -- 192.168.123.103:0/1345500716 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff2d81183e0 con 0x7ff2d8072390 2026-03-09T16:15:27.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.976+0000 7ff2c6ffd640 1 -- 192.168.123.103:0/1345500716 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7ff2d4007880 con 0x7ff2d8072390 2026-03-09T16:15:27.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.977+0000 7ff2c6ffd640 1 --2- 192.168.123.103:0/1345500716 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7ff2b4077630 0x7ff2b4079af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:27.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.977+0000 7ff2c6ffd640 1 -- 192.168.123.103:0/1345500716 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7ff2d40be8d0 con 0x7ff2d8072390 2026-03-09T16:15:27.981 INFO:tasks.workunit.client.1.vm05.stdout:3/767: dread - d0/d9/d22/d5f/d75/d76/f7e zero size 2026-03-09T16:15:27.981 INFO:tasks.workunit.client.1.vm05.stdout:1/911: creat d7/dd/d21/d2d/f13c x:0 0 0 2026-03-09T16:15:27.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.981+0000 7ff2dd90c640 1 --2- 192.168.123.103:0/1345500716 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7ff2b4077630 0x7ff2b4079af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:27.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.981+0000 7ff2c6ffd640 1 -- 192.168.123.103:0/1345500716 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+193909 (secure 0 0 0) 0x7ff2d40872e0 con 0x7ff2d8072390 2026-03-09T16:15:27.985 INFO:tasks.workunit.client.1.vm05.stdout:7/861: dwrite d1/d2/d8/dc/f1a [0,4194304] 0 2026-03-09T16:15:27.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:27.982+0000 7ff2dd90c640 1 --2- 192.168.123.103:0/1345500716 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7ff2b4077630 0x7ff2b4079af0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7ff2c800a9f0 tx=0x7ff2c8005cf0 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:27.987 INFO:tasks.workunit.client.1.vm05.stdout:7/862: read d1/d2/d8/dc/d33/f9f [2339685,51341] 0 2026-03-09T16:15:27.989 INFO:tasks.workunit.client.1.vm05.stdout:7/863: truncate d1/d2/d11/d86/d8a/f12e 862231 0 2026-03-09T16:15:27.999 INFO:tasks.workunit.client.1.vm05.stdout:6/800: creat d17/d22/d27/df8/d112/f130 x:0 0 0 2026-03-09T16:15:28.004 INFO:tasks.workunit.client.1.vm05.stdout:7/864: dread d1/f26 [0,4194304] 0 2026-03-09T16:15:28.010 INFO:tasks.workunit.client.1.vm05.stdout:8/865: write d4/f3e [2955870,13381] 0 2026-03-09T16:15:28.011 INFO:tasks.workunit.client.1.vm05.stdout:4/911: dread d5/de/d15/d21/d27/f8f [0,4194304] 0 2026-03-09T16:15:28.013 INFO:tasks.workunit.client.1.vm05.stdout:8/866: dread d4/d6/f44 [0,4194304] 0 2026-03-09T16:15:28.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:27 vm05.local ceph-mon[58702]: from='client.14670 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:15:28.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:27 vm05.local ceph-mon[58702]: from='client.14672 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:15:28.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:27 vm05.local ceph-mon[58702]: pgmap v23: 65 pgs: 65 active+clean; 3.0 GiB data, 9.9 GiB used, 110 GiB / 120 GiB avail; 46 MiB/s rd, 129 MiB/s wr, 275 op/s 2026-03-09T16:15:28.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:27 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:28.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:27 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:28.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:27 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:28.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:27 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:28.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:27 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:15:28.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:27 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:15:28.027 INFO:tasks.workunit.client.1.vm05.stdout:2/760: creat db/ff8 x:0 0 0 2026-03-09T16:15:28.027 INFO:tasks.workunit.client.1.vm05.stdout:9/849: write d4/f43 [96899,24321] 0 2026-03-09T16:15:28.028 INFO:tasks.workunit.client.1.vm05.stdout:9/850: readlink 
d4/d10/d35/d36/d48/d54/l7b 0 2026-03-09T16:15:28.036 INFO:tasks.workunit.client.1.vm05.stdout:3/768: mkdir d0/d33/d10a 0 2026-03-09T16:15:28.036 INFO:tasks.workunit.client.1.vm05.stdout:1/912: truncate d7/ff4 4600558 0 2026-03-09T16:15:28.039 INFO:tasks.workunit.client.1.vm05.stdout:1/913: dread d7/dbe/dca/f11e [0,4194304] 0 2026-03-09T16:15:28.040 INFO:tasks.workunit.client.1.vm05.stdout:1/914: fdatasync d7/d62/db6/fc1 0 2026-03-09T16:15:28.041 INFO:tasks.workunit.client.1.vm05.stdout:3/769: dwrite d0/d9/d97/fd4 [0,4194304] 0 2026-03-09T16:15:28.053 INFO:tasks.workunit.client.1.vm05.stdout:6/801: fsync d17/d22/d27/d8a/fa1 0 2026-03-09T16:15:28.074 INFO:tasks.workunit.client.1.vm05.stdout:4/912: mkdir d5/d9c/d124/d14a 0 2026-03-09T16:15:28.076 INFO:tasks.workunit.client.1.vm05.stdout:5/882: creat d8/d18/d1b/f12c x:0 0 0 2026-03-09T16:15:28.081 INFO:tasks.workunit.client.1.vm05.stdout:2/761: symlink db/dd/d15/d1f/d20/d23/d78/lf9 0 2026-03-09T16:15:28.097 INFO:tasks.workunit.client.1.vm05.stdout:7/865: write d1/d2/d11/d86/da2/f127 [2591391,62780] 0 2026-03-09T16:15:28.099 INFO:tasks.workunit.client.1.vm05.stdout:9/851: dwrite d4/d10/f8d [0,4194304] 0 2026-03-09T16:15:28.099 INFO:tasks.workunit.client.1.vm05.stdout:8/867: dwrite d4/d6/d9a/db3/f9d [0,4194304] 0 2026-03-09T16:15:28.100 INFO:tasks.workunit.client.1.vm05.stdout:8/868: dread - d4/d6/d9a/f11b zero size 2026-03-09T16:15:28.101 INFO:tasks.workunit.client.1.vm05.stdout:8/869: dread - d4/d6/db/d59/db0/ff6 zero size 2026-03-09T16:15:28.144 INFO:tasks.workunit.client.1.vm05.stdout:3/770: symlink d0/d9/d22/d5f/d75/l10b 0 2026-03-09T16:15:28.163 INFO:tasks.workunit.client.1.vm05.stdout:6/802: stat d17/d5d/c8d 0 2026-03-09T16:15:28.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:28.159+0000 7ff2c4ff9640 1 -- 192.168.123.103:0/1345500716 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff2d81185f0 con 0x7ff2b4077630 2026-03-09T16:15:28.171 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "2/2 daemons upgraded", 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading prometheus daemons", 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:28.167+0000 7ff2c6ffd640 1 -- 192.168.123.103:0/1345500716 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+367 (secure 0 0 0) 0x7ff2d81185f0 con 0x7ff2b4077630 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:28.170+0000 7ff2c4ff9640 1 -- 192.168.123.103:0/1345500716 >> 
[v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7ff2b4077630 msgr2=0x7ff2b4079af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:28.170+0000 7ff2c4ff9640 1 --2- 192.168.123.103:0/1345500716 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7ff2b4077630 0x7ff2b4079af0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7ff2c800a9f0 tx=0x7ff2c8005cf0 comp rx=0 tx=0).stop 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:28.170+0000 7ff2c4ff9640 1 -- 192.168.123.103:0/1345500716 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2d8072390 msgr2=0x7ff2d81160c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:28.170+0000 7ff2c4ff9640 1 --2- 192.168.123.103:0/1345500716 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2d8072390 0x7ff2d81160c0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7ff2d4009cd0 tx=0x7ff2d4009300 comp rx=0 tx=0).stop 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:28.170+0000 7ff2c4ff9640 1 -- 192.168.123.103:0/1345500716 shutdown_connections 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:28.170+0000 7ff2c4ff9640 1 --2- 192.168.123.103:0/1345500716 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7ff2b4077630 0x7ff2b4079af0 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:28.170+0000 7ff2c4ff9640 1 --2- 192.168.123.103:0/1345500716 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2d8072390 0x7ff2d81160c0 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:28.170+0000 7ff2c4ff9640 1 --2- 192.168.123.103:0/1345500716 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2d80719c0 0x7ff2d8115b80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:28.171+0000 7ff2c4ff9640 1 -- 192.168.123.103:0/1345500716 >> 192.168.123.103:0/1345500716 conn(0x7ff2d806d4f0 msgr2=0x7ff2d80707e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:28.171+0000 7ff2c4ff9640 1 -- 192.168.123.103:0/1345500716 shutdown_connections 2026-03-09T16:15:28.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:28.171+0000 7ff2c4ff9640 1 -- 192.168.123.103:0/1345500716 wait complete. 
2026-03-09T16:15:28.182 INFO:tasks.workunit.client.1.vm05.stdout:5/883: dread - d8/d18/d1b/d47/d4e/ff1 zero size 2026-03-09T16:15:28.184 INFO:tasks.workunit.client.1.vm05.stdout:2/762: dread - db/fa5 zero size 2026-03-09T16:15:28.186 INFO:tasks.workunit.client.1.vm05.stdout:5/884: dwrite d8/d18/d1b/f28 [4194304,4194304] 0 2026-03-09T16:15:28.193 INFO:tasks.workunit.client.1.vm05.stdout:0/808: getdents d5/db/d77 0 2026-03-09T16:15:28.208 INFO:tasks.workunit.client.1.vm05.stdout:9/852: rename d4/c8c to d4/d10/d35/d36/d48/d60/c117 0 2026-03-09T16:15:28.218 INFO:tasks.workunit.client.1.vm05.stdout:5/885: dread d8/d18/d1b/f28 [0,4194304] 0 2026-03-09T16:15:28.223 INFO:tasks.workunit.client.1.vm05.stdout:1/915: dwrite d7/dd/d21/d2d/f108 [0,4194304] 0 2026-03-09T16:15:28.229 INFO:tasks.workunit.client.1.vm05.stdout:6/803: fdatasync d17/d22/d27/d8a/fd0 0 2026-03-09T16:15:28.229 INFO:tasks.workunit.client.1.vm05.stdout:1/916: write d7/fc [1676045,27292] 0 2026-03-09T16:15:28.230 INFO:tasks.workunit.client.1.vm05.stdout:1/917: chown d7/dd/de/f3e 127041480 1 2026-03-09T16:15:28.239 INFO:tasks.workunit.client.1.vm05.stdout:3/771: dread d0/d9/d22/f18 [0,4194304] 0 2026-03-09T16:15:28.249 INFO:tasks.workunit.client.1.vm05.stdout:0/809: dread - d5/fd4 zero size 2026-03-09T16:15:28.250 INFO:tasks.workunit.client.1.vm05.stdout:2/763: dread db/dd/d15/d1f/d20/d23/fbb [0,4194304] 0 2026-03-09T16:15:28.250 INFO:tasks.workunit.client.1.vm05.stdout:1/918: sync 2026-03-09T16:15:28.261 INFO:tasks.workunit.client.1.vm05.stdout:9/853: dwrite d4/d10/faa [0,4194304] 0 2026-03-09T16:15:28.265 INFO:tasks.workunit.client.1.vm05.stdout:5/886: write d8/fb [1430193,44787] 0 2026-03-09T16:15:28.265 INFO:tasks.workunit.client.1.vm05.stdout:5/887: chown d8/d59 9 1 2026-03-09T16:15:28.267 INFO:tasks.workunit.client.1.vm05.stdout:5/888: sync 2026-03-09T16:15:28.278 INFO:tasks.workunit.client.1.vm05.stdout:4/913: truncate d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d103/ff3 900005 0 2026-03-09T16:15:28.278 INFO:tasks.workunit.client.1.vm05.stdout:3/772: mkdir d0/d9/d22/d5f/d75/d76/d88/d10c 0 2026-03-09T16:15:28.278 INFO:tasks.workunit.client.1.vm05.stdout:0/810: truncate d5/d2c/f63 65816 0 2026-03-09T16:15:28.287 INFO:tasks.workunit.client.1.vm05.stdout:4/914: creat d5/de/d15/d21/d39/d91/de9/f14b x:0 0 0 2026-03-09T16:15:28.291 INFO:tasks.workunit.client.1.vm05.stdout:3/773: mknod d0/d9/d97/dad/c10d 0 2026-03-09T16:15:28.296 INFO:tasks.workunit.client.1.vm05.stdout:6/804: dwrite d17/d4f/f8c [0,4194304] 0 2026-03-09T16:15:28.302 INFO:tasks.workunit.client.1.vm05.stdout:2/764: truncate db/dd/d15/d3f/f75 4508616 0 2026-03-09T16:15:28.302 INFO:tasks.workunit.client.1.vm05.stdout:8/870: creat d4/d6/d3a/d7c/f11d x:0 0 0 2026-03-09T16:15:28.302 INFO:tasks.workunit.client.1.vm05.stdout:5/889: creat d8/dd5/f12d x:0 0 0 2026-03-09T16:15:28.303 INFO:tasks.workunit.client.1.vm05.stdout:9/854: creat d4/d10/d35/d2b/d31/d82/f118 x:0 0 0 2026-03-09T16:15:28.307 INFO:tasks.workunit.client.1.vm05.stdout:3/774: dread d0/d9/f2b [0,4194304] 0 2026-03-09T16:15:28.312 INFO:tasks.workunit.client.1.vm05.stdout:7/866: link d1/d2/d8/dc/d1b/d71/d3c/cb9 d1/d2/d8/dc/dd4/c130 0 2026-03-09T16:15:28.319 INFO:tasks.workunit.client.1.vm05.stdout:4/915: unlink d5/de/d15/da9/c10a 0 2026-03-09T16:15:28.327 INFO:tasks.workunit.client.1.vm05.stdout:6/805: mknod d17/d5d/d73/d83/c131 0 2026-03-09T16:15:28.328 INFO:tasks.workunit.client.1.vm05.stdout:6/806: fsync d17/d22/d9d/da5/fd9 0 2026-03-09T16:15:28.329 INFO:tasks.workunit.client.1.vm05.stdout:5/890: unlink d8/l14 0 
2026-03-09T16:15:28.334 INFO:tasks.workunit.client.1.vm05.stdout:8/871: unlink d4/d6/db/d75/fe4 0 2026-03-09T16:15:28.336 INFO:tasks.workunit.client.1.vm05.stdout:0/811: rmdir d5/d1b/d3b/d10c 0 2026-03-09T16:15:28.338 INFO:tasks.workunit.client.1.vm05.stdout:7/867: truncate d1/d2/d8/dc/f1e 2423998 0 2026-03-09T16:15:28.339 INFO:tasks.workunit.client.1.vm05.stdout:4/916: truncate d5/de/d15/d21/f26 591469 0 2026-03-09T16:15:28.340 INFO:tasks.workunit.client.1.vm05.stdout:1/919: link d7/dd/f1f d7/dd/de/d96/f13d 0 2026-03-09T16:15:28.344 INFO:tasks.workunit.client.1.vm05.stdout:5/891: symlink d8/d5e/l12e 0 2026-03-09T16:15:28.374 INFO:tasks.workunit.client.1.vm05.stdout:2/765: dwrite db/dd/d15/d46/d67/f77 [0,4194304] 0 2026-03-09T16:15:28.375 INFO:tasks.workunit.client.1.vm05.stdout:2/766: chown db/dd/d15/d3f/d5b 20839792 1 2026-03-09T16:15:28.375 INFO:tasks.workunit.client.1.vm05.stdout:3/775: symlink d0/d9/d22/d5f/dfd/l10e 0 2026-03-09T16:15:28.376 INFO:tasks.workunit.client.1.vm05.stdout:7/868: rmdir d1/d2/d8/dc/d1b/d30/d7d 39 2026-03-09T16:15:28.376 INFO:tasks.workunit.client.1.vm05.stdout:3/776: readlink d0/d9/d22/d5f/d7b/d99/ldf 0 2026-03-09T16:15:28.376 INFO:tasks.workunit.client.1.vm05.stdout:7/869: fdatasync d1/d2/d8/dc/d1b/d30/d4b/d65/f7f 0 2026-03-09T16:15:28.382 INFO:tasks.workunit.client.1.vm05.stdout:7/870: dwrite d1/d2/d8/d31/ff7 [0,4194304] 0 2026-03-09T16:15:28.390 INFO:tasks.workunit.client.1.vm05.stdout:8/872: mknod d4/d6/db/dc/d5d/da0/c11e 0 2026-03-09T16:15:28.390 INFO:tasks.workunit.client.1.vm05.stdout:9/855: rename d4/d10/d35/d36/d48/d60/d94 to d4/d119 0 2026-03-09T16:15:28.391 INFO:tasks.workunit.client.1.vm05.stdout:0/812: creat d5/d109/f118 x:0 0 0 2026-03-09T16:15:28.394 INFO:tasks.workunit.client.1.vm05.stdout:8/873: dread d4/d6/d53/f7f [0,4194304] 0 2026-03-09T16:15:28.394 INFO:tasks.workunit.client.1.vm05.stdout:3/777: write d0/d9/d22/d5f/d75/f100 [4372022,87543] 0 2026-03-09T16:15:28.397 INFO:tasks.workunit.client.1.vm05.stdout:3/778: chown d0/da9/fb5 423 1 2026-03-09T16:15:28.397 INFO:tasks.workunit.client.1.vm05.stdout:7/871: stat d1/d2/d8/dc/d1b/d71/l21 0 2026-03-09T16:15:28.397 INFO:tasks.workunit.client.1.vm05.stdout:3/779: chown d0/d9/d22/d5f/d7b/da8 920 1 2026-03-09T16:15:28.400 INFO:tasks.workunit.client.1.vm05.stdout:9/856: chown d4/d10/d35/d36/c3b 0 1 2026-03-09T16:15:28.400 INFO:tasks.workunit.client.1.vm05.stdout:0/813: fdatasync d5/d11/d4f/d70/fbf 0 2026-03-09T16:15:28.403 INFO:tasks.workunit.client.1.vm05.stdout:3/780: dwrite d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/fe7 [0,4194304] 0 2026-03-09T16:15:28.411 INFO:tasks.workunit.client.1.vm05.stdout:0/814: creat d5/db/d48/d66/f119 x:0 0 0 2026-03-09T16:15:28.415 INFO:tasks.workunit.client.1.vm05.stdout:3/781: rmdir d0/d9/d97/dbc 39 2026-03-09T16:15:28.418 INFO:tasks.workunit.client.1.vm05.stdout:5/892: rename d8/d59/f83 to d8/d18/d1b/f12f 0 2026-03-09T16:15:28.418 INFO:tasks.workunit.client.1.vm05.stdout:0/815: dread d5/d11/f106 [0,4194304] 0 2026-03-09T16:15:28.419 INFO:tasks.workunit.client.1.vm05.stdout:3/782: dread d0/d9/f2b [0,4194304] 0 2026-03-09T16:15:28.420 INFO:tasks.workunit.client.1.vm05.stdout:0/816: truncate d5/d2c/d49/d83/d8b/daf/ff9 426419 0 2026-03-09T16:15:28.420 INFO:tasks.workunit.client.1.vm05.stdout:0/817: truncate d5/d109/f118 274773 0 2026-03-09T16:15:28.423 INFO:tasks.workunit.client.1.vm05.stdout:8/874: creat d4/d6/d3a/d40/d6a/f11f x:0 0 0 2026-03-09T16:15:28.428 INFO:tasks.workunit.client.1.vm05.stdout:1/920: read d7/d27/f57 [3133389,113839] 0 2026-03-09T16:15:28.448 
INFO:tasks.workunit.client.1.vm05.stdout:7/872: link d1/d2/d8/lb d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/ddb/l131 0 2026-03-09T16:15:28.454 INFO:tasks.workunit.client.1.vm05.stdout:4/917: write d5/f10 [4698754,47162] 0 2026-03-09T16:15:28.455 INFO:tasks.workunit.client.1.vm05.stdout:6/807: write d17/d5d/f71 [1517104,130468] 0 2026-03-09T16:15:28.461 INFO:tasks.workunit.client.1.vm05.stdout:4/918: sync 2026-03-09T16:15:28.461 INFO:tasks.workunit.client.1.vm05.stdout:2/767: dwrite db/dd/d15/d1f/f2b [0,4194304] 0 2026-03-09T16:15:28.477 INFO:tasks.workunit.client.1.vm05.stdout:9/857: dwrite d4/d10/d35/d36/d48/f68 [0,4194304] 0 2026-03-09T16:15:28.479 INFO:tasks.workunit.client.1.vm05.stdout:5/893: symlink d8/d5e/d11b/l130 0 2026-03-09T16:15:28.480 INFO:tasks.workunit.client.1.vm05.stdout:8/875: creat d4/d6/d3a/d15/f120 x:0 0 0 2026-03-09T16:15:28.481 INFO:tasks.workunit.client.1.vm05.stdout:8/876: sync 2026-03-09T16:15:28.483 INFO:tasks.workunit.client.1.vm05.stdout:8/877: write d4/d6/db/d59/db0/f114 [1042848,120439] 0 2026-03-09T16:15:28.485 INFO:tasks.workunit.client.1.vm05.stdout:1/921: rename d7/d15/d6e/dbc/cc8 to d7/d62/db6/c13e 0 2026-03-09T16:15:28.502 INFO:tasks.workunit.client.1.vm05.stdout:9/858: fdatasync d4/f17 0 2026-03-09T16:15:28.505 INFO:tasks.workunit.client.1.vm05.stdout:8/878: creat d4/d6/d3a/d40/d71/f121 x:0 0 0 2026-03-09T16:15:28.508 INFO:tasks.workunit.client.1.vm05.stdout:1/922: rename d7/dd/d21/d39/d87/db9/d138/db4/f129 to d7/dd/d21/d44/dcc/f13f 0 2026-03-09T16:15:28.510 INFO:tasks.workunit.client.1.vm05.stdout:7/873: rmdir d1/d2/d8/d31/d8d/d5d 39 2026-03-09T16:15:28.515 INFO:tasks.workunit.client.1.vm05.stdout:9/859: readlink d4/d10/d35/lb2 0 2026-03-09T16:15:28.519 INFO:tasks.workunit.client.1.vm05.stdout:1/923: dwrite d7/dd/d21/d39/d48/d8c/dd8/d103/f10a [0,4194304] 0 2026-03-09T16:15:28.524 INFO:tasks.workunit.client.1.vm05.stdout:5/894: dread d8/d18/dbc/dcc/daa/f52 [0,4194304] 0 2026-03-09T16:15:28.524 INFO:tasks.workunit.client.1.vm05.stdout:9/860: write d4/d10/d35/d2b/d38/d65/dd6/de3/f93 [4516401,18792] 0 2026-03-09T16:15:28.528 INFO:tasks.workunit.client.1.vm05.stdout:8/879: mkdir d4/d6/d9a/d122 0 2026-03-09T16:15:28.528 INFO:tasks.workunit.client.1.vm05.stdout:0/818: write d5/db/d5b/d82/f89 [985876,121720] 0 2026-03-09T16:15:28.532 INFO:tasks.workunit.client.1.vm05.stdout:2/768: creat db/ffa x:0 0 0 2026-03-09T16:15:28.533 INFO:tasks.workunit.client.1.vm05.stdout:3/783: dwrite d0/fd [0,4194304] 0 2026-03-09T16:15:28.533 INFO:tasks.workunit.client.1.vm05.stdout:2/769: write db/dd/d15/d4c/fe4 [567065,86422] 0 2026-03-09T16:15:28.551 INFO:tasks.workunit.client.1.vm05.stdout:1/924: stat d7/dd/d21/d39/d87/c80 0 2026-03-09T16:15:28.552 INFO:tasks.workunit.client.1.vm05.stdout:9/861: unlink d4/d10/d35/d2b/d31/d82/le4 0 2026-03-09T16:15:28.552 INFO:tasks.workunit.client.1.vm05.stdout:1/925: chown d7/dd/d21/d39/d87/ceb 367510759 1 2026-03-09T16:15:28.553 INFO:tasks.workunit.client.1.vm05.stdout:6/808: getdents d17/d22/d27/d34/d42/d53/d87 0 2026-03-09T16:15:28.554 INFO:tasks.workunit.client.1.vm05.stdout:8/880: mknod d4/d6/d3a/c123 0 2026-03-09T16:15:28.557 INFO:tasks.workunit.client.1.vm05.stdout:0/819: rename d5/d11/c4c to d5/d9e/c11a 0 2026-03-09T16:15:28.557 INFO:tasks.workunit.client.1.vm05.stdout:4/919: getdents d5/de/d15/da9/db1/dad/d90 0 2026-03-09T16:15:28.558 INFO:tasks.workunit.client.1.vm05.stdout:3/784: creat d0/d9/d97/dc2/f10f x:0 0 0 2026-03-09T16:15:28.559 INFO:tasks.workunit.client.1.vm05.stdout:2/770: truncate db/dd/d15/d3f/d5b/d60/d95/ddd/f8f 5174874 0 
2026-03-09T16:15:28.560 INFO:tasks.workunit.client.1.vm05.stdout:5/895: mknod d8/d59/d10f/c131 0 2026-03-09T16:15:28.563 INFO:tasks.workunit.client.1.vm05.stdout:0/820: unlink d5/d1b/d30/l46 0 2026-03-09T16:15:28.565 INFO:tasks.workunit.client.1.vm05.stdout:3/785: creat d0/d9/d22/d5f/dfd/f110 x:0 0 0 2026-03-09T16:15:28.566 INFO:tasks.workunit.client.1.vm05.stdout:5/896: unlink d8/d59/d5b/d8b/l105 0 2026-03-09T16:15:28.566 INFO:tasks.workunit.client.1.vm05.stdout:1/926: dread d7/dd/d21/d39/d48/f59 [0,4194304] 0 2026-03-09T16:15:28.567 INFO:tasks.workunit.client.1.vm05.stdout:7/874: getdents d1/d2/d8/d31 0 2026-03-09T16:15:28.569 INFO:tasks.workunit.client.1.vm05.stdout:3/786: creat d0/d9/d22/d5f/d7b/f111 x:0 0 0 2026-03-09T16:15:28.569 INFO:tasks.workunit.client.1.vm05.stdout:2/771: link c8 db/dd/d15/d1f/cfb 0 2026-03-09T16:15:28.570 INFO:tasks.workunit.client.1.vm05.stdout:0/821: rename d5/db/d5b/da5 to d5/d11/d4f/d11b 0 2026-03-09T16:15:28.571 INFO:tasks.workunit.client.1.vm05.stdout:3/787: truncate d0/da9/fe3 548783 0 2026-03-09T16:15:28.571 INFO:tasks.workunit.client.1.vm05.stdout:0/822: stat d5/d11/d4f/d11b/lfb 0 2026-03-09T16:15:28.571 INFO:tasks.workunit.client.1.vm05.stdout:3/788: chown d0/d9/fe9 177218375 1 2026-03-09T16:15:28.584 INFO:tasks.workunit.client.1.vm05.stdout:8/881: write d4/d6/db/df/d4f/d9f/ff2 [735142,8924] 0 2026-03-09T16:15:28.587 INFO:tasks.workunit.client.1.vm05.stdout:4/920: write d5/f35 [3151346,7515] 0 2026-03-09T16:15:28.589 INFO:tasks.workunit.client.1.vm05.stdout:6/809: dwrite d17/d22/d27/d44/f86 [0,4194304] 0 2026-03-09T16:15:28.591 INFO:tasks.workunit.client.1.vm05.stdout:9/862: dwrite d4/f2e [0,4194304] 0 2026-03-09T16:15:28.597 INFO:tasks.workunit.client.1.vm05.stdout:1/927: write d7/dd/d21/d39/d48/f59 [152700,9507] 0 2026-03-09T16:15:28.597 INFO:tasks.workunit.client.1.vm05.stdout:5/897: write d8/d18/dbc/dcc/daa/fb1 [1906047,42794] 0 2026-03-09T16:15:28.598 INFO:tasks.workunit.client.1.vm05.stdout:2/772: unlink db/dd/d15/f70 0 2026-03-09T16:15:28.607 INFO:tasks.workunit.client.1.vm05.stdout:3/789: creat d0/d9/d22/d5f/d75/d76/d88/d89/f112 x:0 0 0 2026-03-09T16:15:28.607 INFO:tasks.workunit.client.1.vm05.stdout:3/790: stat d0/da9/fe3 0 2026-03-09T16:15:28.614 INFO:tasks.workunit.client.1.vm05.stdout:7/875: dread d1/d2/d11/d86/d8a/d91/ffe [0,4194304] 0 2026-03-09T16:15:28.618 INFO:tasks.workunit.client.1.vm05.stdout:9/863: dread d4/d10/d35/d2b/f2c [0,4194304] 0 2026-03-09T16:15:28.621 INFO:tasks.workunit.client.1.vm05.stdout:4/921: read d5/de/d15/da9/db1/dad/d37/fa5 [950345,70980] 0 2026-03-09T16:15:28.628 INFO:tasks.workunit.client.1.vm05.stdout:6/810: dread d17/d22/d9d/da5/ff5 [0,4194304] 0 2026-03-09T16:15:28.629 INFO:tasks.workunit.client.1.vm05.stdout:1/928: dread d7/f4b [4194304,4194304] 0 2026-03-09T16:15:28.632 INFO:tasks.workunit.client.1.vm05.stdout:0/823: mkdir d5/db/d77/df3/d11c 0 2026-03-09T16:15:28.639 INFO:tasks.workunit.client.1.vm05.stdout:8/882: write d4/d6/d3a/d15/f93 [706555,105816] 0 2026-03-09T16:15:28.641 INFO:tasks.workunit.client.1.vm05.stdout:8/883: chown d4/d6/db/d59/db0/c103 14 1 2026-03-09T16:15:28.642 INFO:tasks.workunit.client.1.vm05.stdout:8/884: dread - d4/d6/d9a/f11b zero size 2026-03-09T16:15:28.642 INFO:tasks.workunit.client.1.vm05.stdout:8/885: fdatasync d4/fca 0 2026-03-09T16:15:28.645 INFO:tasks.workunit.client.1.vm05.stdout:3/791: write d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/f50 [2489971,120182] 0 2026-03-09T16:15:28.651 INFO:tasks.workunit.client.1.vm05.stdout:8/886: dread d4/f23 [0,4194304] 0 
2026-03-09T16:15:28.651 INFO:tasks.workunit.client.1.vm05.stdout:8/887: fsync d4/d6/db/df/f18 0 2026-03-09T16:15:28.652 INFO:tasks.workunit.client.1.vm05.stdout:7/876: unlink d1/d2/d8/c16 0 2026-03-09T16:15:28.661 INFO:tasks.workunit.client.1.vm05.stdout:4/922: creat d5/de/d15/d21/d39/d91/f14c x:0 0 0 2026-03-09T16:15:28.661 INFO:tasks.workunit.client.1.vm05.stdout:4/923: dread - d5/de/d15/d21/d27/d3c/f145 zero size 2026-03-09T16:15:28.662 INFO:tasks.workunit.client.1.vm05.stdout:4/924: stat d5/d116/c96 0 2026-03-09T16:15:28.665 INFO:tasks.workunit.client.1.vm05.stdout:6/811: mkdir d17/d22/d27/d34/d42/d132 0 2026-03-09T16:15:28.669 INFO:tasks.workunit.client.1.vm05.stdout:1/929: symlink d7/dd/d21/d39/d48/d8c/l140 0 2026-03-09T16:15:28.675 INFO:tasks.workunit.client.1.vm05.stdout:0/824: creat d5/db/d5f/f11d x:0 0 0 2026-03-09T16:15:28.676 INFO:tasks.workunit.client.1.vm05.stdout:0/825: dread d5/d2c/d49/d83/f9c [0,4194304] 0 2026-03-09T16:15:28.687 INFO:tasks.workunit.client.1.vm05.stdout:3/792: dread - d0/d9/d22/d5f/d75/d76/d88/da3/df7/d80/fbf zero size 2026-03-09T16:15:28.687 INFO:tasks.workunit.client.1.vm05.stdout:3/793: dread - d0/d33/f63 zero size 2026-03-09T16:15:28.688 INFO:tasks.workunit.client.1.vm05.stdout:3/794: write d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/fe7 [3524740,22846] 0 2026-03-09T16:15:28.691 INFO:tasks.workunit.client.1.vm05.stdout:8/888: dread d4/d6/db/dc/d5d/fbd [0,4194304] 0 2026-03-09T16:15:28.694 INFO:tasks.workunit.client.1.vm05.stdout:7/877: creat d1/d2/d8/dc/d14/f132 x:0 0 0 2026-03-09T16:15:28.709 INFO:tasks.workunit.client.1.vm05.stdout:9/864: fsync d4/d10/dd7/feb 0 2026-03-09T16:15:28.737 INFO:tasks.workunit.client.1.vm05.stdout:5/898: creat d8/f132 x:0 0 0 2026-03-09T16:15:28.743 INFO:tasks.workunit.client.1.vm05.stdout:5/899: dread d8/d18/dbc/dcc/daa/f52 [4194304,4194304] 0 2026-03-09T16:15:28.863 INFO:tasks.workunit.client.1.vm05.stdout:4/925: creat d5/f14d x:0 0 0 2026-03-09T16:15:28.864 INFO:tasks.workunit.client.1.vm05.stdout:6/812: symlink d17/d22/d9d/db4/l133 0 2026-03-09T16:15:28.864 INFO:tasks.workunit.client.1.vm05.stdout:4/926: chown d5/de/d15/da9/db1/dad/d37/d60/f113 1360912 1 2026-03-09T16:15:28.865 INFO:tasks.workunit.client.1.vm05.stdout:1/930: fdatasync d7/dd/d21/d63/d71/ddc/df8/f12d 0 2026-03-09T16:15:28.866 INFO:tasks.workunit.client.1.vm05.stdout:0/826: rmdir d5/db/d5f/da3 39 2026-03-09T16:15:28.874 INFO:tasks.workunit.client.1.vm05.stdout:3/795: mkdir d0/d9/d97/dac/d113 0 2026-03-09T16:15:28.884 INFO:tasks.workunit.client.1.vm05.stdout:9/865: mknod d4/d119/c11a 0 2026-03-09T16:15:28.884 INFO:tasks.workunit.client.1.vm05.stdout:9/866: chown d4/d10/dd7 82971 1 2026-03-09T16:15:28.885 INFO:tasks.workunit.client.1.vm05.stdout:3/796: sync 2026-03-09T16:15:28.887 INFO:tasks.workunit.client.1.vm05.stdout:3/797: readlink d0/d9/d22/d5f/d75/d76/l8e 0 2026-03-09T16:15:28.887 INFO:tasks.workunit.client.1.vm05.stdout:8/889: dwrite d4/d6/db/dc/fa2 [4194304,4194304] 0 2026-03-09T16:15:28.892 INFO:tasks.workunit.client.1.vm05.stdout:9/867: dwrite d4/d10/d35/d36/d48/d54/d59/f5c [4194304,4194304] 0 2026-03-09T16:15:28.994 INFO:tasks.workunit.client.1.vm05.stdout:4/927: creat d5/d9c/d124/f14e x:0 0 0 2026-03-09T16:15:28.995 INFO:tasks.workunit.client.1.vm05.stdout:5/900: dwrite d8/d18/dbc/dcc/daa/f110 [0,4194304] 0 2026-03-09T16:15:28.995 INFO:tasks.workunit.client.1.vm05.stdout:4/928: dread - d5/d9c/d124/f14e zero size 2026-03-09T16:15:28.998 INFO:tasks.workunit.client.1.vm05.stdout:0/827: read d5/f76 [1567316,59074] 0 2026-03-09T16:15:29.000 
INFO:tasks.workunit.client.1.vm05.stdout:4/929: dread d5/de/d15/da9/f136 [0,4194304] 0 2026-03-09T16:15:29.013 INFO:tasks.workunit.client.1.vm05.stdout:6/813: symlink d17/d22/d9d/da9/d128/l134 0 2026-03-09T16:15:29.015 INFO:tasks.workunit.client.1.vm05.stdout:1/931: creat d7/dd/d21/d39/d48/d8c/dd8/d103/d107/f141 x:0 0 0 2026-03-09T16:15:29.015 INFO:tasks.workunit.client.1.vm05.stdout:0/828: stat d5/db/d5b 0 2026-03-09T16:15:29.019 INFO:tasks.workunit.client.1.vm05.stdout:6/814: dwrite d17/d22/dce/fdf [0,4194304] 0 2026-03-09T16:15:29.030 INFO:tasks.workunit.client.1.vm05.stdout:5/901: dread d8/d59/d5b/d8b/da0/fc1 [0,4194304] 0 2026-03-09T16:15:29.037 INFO:tasks.workunit.client.1.vm05.stdout:2/773: dwrite db/dd/d15/d1f/d20/d23/d78/f92 [0,4194304] 0 2026-03-09T16:15:29.038 INFO:tasks.workunit.client.1.vm05.stdout:4/930: unlink d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d103/ff3 0 2026-03-09T16:15:29.055 INFO:tasks.workunit.client.1.vm05.stdout:0/829: creat d5/db/d77/f11e x:0 0 0 2026-03-09T16:15:29.067 INFO:tasks.workunit.client.1.vm05.stdout:6/815: creat d17/d22/d27/d34/dd1/d10a/f135 x:0 0 0 2026-03-09T16:15:29.074 INFO:tasks.workunit.client.1.vm05.stdout:5/902: read d8/d18/d1b/f36 [1154504,113901] 0 2026-03-09T16:15:29.074 INFO:tasks.workunit.client.1.vm05.stdout:5/903: dread - d8/d95/f98 zero size 2026-03-09T16:15:29.082 INFO:tasks.workunit.client.1.vm05.stdout:5/904: dread - d8/d53/f102 zero size 2026-03-09T16:15:29.083 INFO:tasks.workunit.client.1.vm05.stdout:1/932: mknod d7/dd/d21/d63/d71/c142 0 2026-03-09T16:15:29.085 INFO:tasks.workunit.client.1.vm05.stdout:1/933: unlink d7/dd/de/d52/cad 0 2026-03-09T16:15:29.086 INFO:tasks.workunit.client.1.vm05.stdout:5/905: getdents d8/d53 0 2026-03-09T16:15:29.087 INFO:tasks.workunit.client.1.vm05.stdout:6/816: dread d17/d1d/f1e [0,4194304] 0 2026-03-09T16:15:29.088 INFO:tasks.workunit.client.1.vm05.stdout:5/906: write d8/d59/f5f [4404746,4572] 0 2026-03-09T16:15:29.090 INFO:tasks.workunit.client.1.vm05.stdout:6/817: fdatasync d17/d1d/fd3 0 2026-03-09T16:15:29.092 INFO:tasks.workunit.client.1.vm05.stdout:6/818: creat d17/d22/d27/d34/d42/d53/f136 x:0 0 0 2026-03-09T16:15:29.097 INFO:tasks.workunit.client.1.vm05.stdout:5/907: sync 2026-03-09T16:15:29.097 INFO:tasks.workunit.client.1.vm05.stdout:6/819: sync 2026-03-09T16:15:29.099 INFO:tasks.workunit.client.1.vm05.stdout:5/908: mkdir d8/d18/d1b/d47/d48/d73/d80/de4/d133 0 2026-03-09T16:15:29.100 INFO:tasks.workunit.client.1.vm05.stdout:3/798: rename d0/d9/d8b/ccc to d0/d9/d22/d5f/d7b/da8/c114 0 2026-03-09T16:15:29.100 INFO:tasks.workunit.client.1.vm05.stdout:6/820: truncate d17/d22/d27/f6b 5371668 0 2026-03-09T16:15:29.101 INFO:tasks.workunit.client.1.vm05.stdout:8/890: write d4/d6/d3a/d3c/f45 [1472345,66443] 0 2026-03-09T16:15:29.101 INFO:tasks.workunit.client.1.vm05.stdout:3/799: stat d0/d9/d22/d5f/d7b/d99/ldf 0 2026-03-09T16:15:29.106 INFO:tasks.workunit.client.1.vm05.stdout:9/868: rename d4/d10/d35/d2b/d31/d82/dec/f114 to d4/d10/d35/d2b/d31/dc8/f11b 0 2026-03-09T16:15:29.106 INFO:tasks.workunit.client.1.vm05.stdout:3/800: write d0/d9/d22/d5f/d90/ff9 [446946,117086] 0 2026-03-09T16:15:29.106 INFO:tasks.workunit.client.1.vm05.stdout:6/821: mkdir d17/d22/d27/d34/d42/d53/d87/d137 0 2026-03-09T16:15:29.108 INFO:tasks.workunit.client.1.vm05.stdout:5/909: getdents d8/d18 0 2026-03-09T16:15:29.110 INFO:tasks.workunit.client.1.vm05.stdout:8/891: fdatasync d4/d6/d3a/d15/f63 0 2026-03-09T16:15:29.112 INFO:tasks.workunit.client.1.vm05.stdout:7/878: rename d1/d2/d8/dc/d1b/d71/c37 to d1/d2/d11/d86/da2/c133 0 
2026-03-09T16:15:29.122 INFO:tasks.workunit.client.1.vm05.stdout:0/830: write d5/d1b/d30/f55 [4980518,107549] 0 2026-03-09T16:15:29.122 INFO:tasks.workunit.client.1.vm05.stdout:9/869: creat d4/d10/d35/d2b/dc1/dc2/d10f/f11c x:0 0 0 2026-03-09T16:15:29.122 INFO:tasks.workunit.client.1.vm05.stdout:9/870: readlink d4/la4 0 2026-03-09T16:15:29.135 INFO:tasks.workunit.client.1.vm05.stdout:4/931: dwrite d5/de/d82/fbe [0,4194304] 0 2026-03-09T16:15:29.138 INFO:tasks.workunit.client.1.vm05.stdout:1/934: dwrite d7/dd/d21/fde [0,4194304] 0 2026-03-09T16:15:29.139 INFO:tasks.workunit.client.1.vm05.stdout:9/871: read d4/d10/d35/d36/fb3 [833490,90665] 0 2026-03-09T16:15:29.139 INFO:tasks.workunit.client.1.vm05.stdout:9/872: dread - d4/d10/d35/d36/d48/f8e zero size 2026-03-09T16:15:29.140 INFO:tasks.workunit.client.1.vm05.stdout:2/774: dwrite db/dd/d15/fd4 [0,4194304] 0 2026-03-09T16:15:29.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:28 vm03.local ceph-mon[51019]: from='client.24459 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:15:29.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:28 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:29.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:28 vm03.local ceph-mon[51019]: from='client.? 192.168.123.103:0/2714829032' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:29.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:28 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:29.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:28 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:29.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:28 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:29.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:28 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:29.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:28 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:29.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:28 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:29.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:28 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:29.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:28 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:29.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:28 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:29.150 
INFO:tasks.workunit.client.1.vm05.stdout:4/932: sync 2026-03-09T16:15:29.154 INFO:tasks.workunit.client.1.vm05.stdout:8/892: link d4/d6/db/df/d4f/d9f/fbb d4/d6/db/df/d10f/f124 0 2026-03-09T16:15:29.155 INFO:tasks.workunit.client.1.vm05.stdout:6/822: write d17/d4f/fbd [4530519,89791] 0 2026-03-09T16:15:29.159 INFO:tasks.workunit.client.1.vm05.stdout:7/879: write d1/d2/d8/dc/d1b/d30/d4b/d65/f63 [1075387,82289] 0 2026-03-09T16:15:29.163 INFO:tasks.workunit.client.1.vm05.stdout:5/910: getdents d8/d18/d1b 0 2026-03-09T16:15:29.163 INFO:tasks.workunit.client.1.vm05.stdout:1/935: fdatasync d7/d27/f64 0 2026-03-09T16:15:29.164 INFO:tasks.workunit.client.1.vm05.stdout:0/831: creat d5/db/d5f/da3/f11f x:0 0 0 2026-03-09T16:15:29.165 INFO:tasks.workunit.client.1.vm05.stdout:6/823: unlink d17/d22/d27/ff7 0 2026-03-09T16:15:29.167 INFO:tasks.workunit.client.1.vm05.stdout:7/880: write d1/d2/d8/dc/d14/f41 [2036615,51492] 0 2026-03-09T16:15:29.167 INFO:tasks.workunit.client.1.vm05.stdout:9/873: creat d4/d10/d35/d2b/d38/d65/dea/f11d x:0 0 0 2026-03-09T16:15:29.167 INFO:tasks.workunit.client.1.vm05.stdout:5/911: write d8/ffa [2244836,98778] 0 2026-03-09T16:15:29.168 INFO:tasks.workunit.client.1.vm05.stdout:3/801: getdents d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/db3 0 2026-03-09T16:15:29.171 INFO:tasks.workunit.client.1.vm05.stdout:2/775: mkdir db/dd/d15/d3f/d5b/d60/d6a/dea/dfc 0 2026-03-09T16:15:29.171 INFO:tasks.workunit.client.1.vm05.stdout:6/824: creat d17/d22/d27/d44/f138 x:0 0 0 2026-03-09T16:15:29.172 INFO:tasks.workunit.client.1.vm05.stdout:9/874: creat d4/d10c/f11e x:0 0 0 2026-03-09T16:15:29.172 INFO:tasks.workunit.client.1.vm05.stdout:5/912: write d8/d18/dbc/dcc/daa/f10c [469044,60785] 0 2026-03-09T16:15:29.175 INFO:tasks.workunit.client.1.vm05.stdout:7/881: dread - d1/d2/d8/dc/d1b/d30/d4b/d65/db1/d108/f10a zero size 2026-03-09T16:15:29.181 INFO:tasks.workunit.client.1.vm05.stdout:3/802: mknod d0/d9/d97/c115 0 2026-03-09T16:15:29.181 INFO:tasks.workunit.client.1.vm05.stdout:8/893: getdents d4/d6/db/d75 0 2026-03-09T16:15:29.181 INFO:tasks.workunit.client.1.vm05.stdout:2/776: dwrite db/dd/d15/d1f/d20/d23/d78/f92 [4194304,4194304] 0 2026-03-09T16:15:29.183 INFO:tasks.workunit.client.1.vm05.stdout:4/933: sync 2026-03-09T16:15:29.183 INFO:tasks.workunit.client.1.vm05.stdout:0/832: sync 2026-03-09T16:15:29.183 INFO:tasks.workunit.client.1.vm05.stdout:1/936: sync 2026-03-09T16:15:29.189 INFO:tasks.workunit.client.1.vm05.stdout:0/833: fdatasync d5/db/d77/f11e 0 2026-03-09T16:15:29.195 INFO:tasks.workunit.client.1.vm05.stdout:8/894: symlink d4/d6/db/d9b/l125 0 2026-03-09T16:15:29.195 INFO:tasks.workunit.client.1.vm05.stdout:3/803: rename d0/d33/f63 to d0/d9/d22/d6b/f116 0 2026-03-09T16:15:29.198 INFO:tasks.workunit.client.1.vm05.stdout:6/825: dread d17/d22/d27/d34/dd1/fe2 [0,4194304] 0 2026-03-09T16:15:29.202 INFO:tasks.workunit.client.1.vm05.stdout:3/804: chown d0/d33/l3c 15 1 2026-03-09T16:15:29.203 INFO:tasks.workunit.client.1.vm05.stdout:2/777: dwrite db/dd/d15/d3f/d5b/d60/d95/ddd/fdc [0,4194304] 0 2026-03-09T16:15:29.208 INFO:tasks.workunit.client.1.vm05.stdout:9/875: dread d4/d10/d35/d36/d48/f9e [0,4194304] 0 2026-03-09T16:15:29.210 INFO:tasks.workunit.client.1.vm05.stdout:0/834: fdatasync d5/db/d48/d66/f99 0 2026-03-09T16:15:29.214 INFO:tasks.workunit.client.1.vm05.stdout:4/934: creat d5/d9c/dbd/f14f x:0 0 0 2026-03-09T16:15:29.214 INFO:tasks.workunit.client.1.vm05.stdout:6/826: mkdir d17/d22/d27/d34/dd1/d10a/d139 0 2026-03-09T16:15:29.214 INFO:tasks.workunit.client.1.vm05.stdout:8/895: dwrite 
d4/d6/db/dc/d5d/d79/fe8 [0,4194304] 0
2026-03-09T16:15:29.218 INFO:tasks.workunit.client.1.vm05.stdout:9/876: stat d4/d10/d35/d2b/cfc 0
2026-03-09T16:15:29.218 INFO:tasks.workunit.client.1.vm05.stdout:3/805: creat d0/d9/d22/d5f/d75/f117 x:0 0 0
2026-03-09T16:15:29.222 INFO:tasks.workunit.client.1.vm05.stdout:7/882: rename d1/d2/d8/dc/d1b/d30/c10e to d1/d2/d8/dc/d1b/d30/d4b/d65/db1/d108/c134 0
2026-03-09T16:15:29.222 INFO:tasks.workunit.client.1.vm05.stdout:7/883: chown d1/d2/d8/dc/d1b/d30/d4b/d65/f63 66 1
2026-03-09T16:15:29.222 INFO:tasks.workunit.client.1.vm05.stdout:0/835: symlink d5/d109/l120 0
2026-03-09T16:15:29.222 INFO:tasks.workunit.client.1.vm05.stdout:4/935: creat d5/d116/f150 x:0 0 0
2026-03-09T16:15:29.233 INFO:tasks.workunit.client.1.vm05.stdout:8/896: creat d4/d6/db/dc/d5d/da0/dd7/d10a/f126 x:0 0 0
2026-03-09T16:15:29.237 INFO:tasks.workunit.client.1.vm05.stdout:9/877: unlink d4/d10/d35/d36/d48/f8e 0
2026-03-09T16:15:29.239 INFO:tasks.workunit.client.1.vm05.stdout:2/778: mknod db/dd/d15/d4c/cfd 0
2026-03-09T16:15:29.240 INFO:tasks.workunit.client.1.vm05.stdout:7/884: stat d1/d2/d8/dc/l61 0
2026-03-09T16:15:29.242 INFO:tasks.workunit.client.1.vm05.stdout:6/827: mknod d17/d22/d27/d58/db8/c13a 0
2026-03-09T16:15:29.243 INFO:tasks.workunit.client.1.vm05.stdout:4/936: write d5/de/d15/d21/d39/d91/faa [3459909,12645] 0
2026-03-09T16:15:29.248 INFO:tasks.workunit.client.1.vm05.stdout:4/937: stat d5/de/d15/d21/f11e 0
2026-03-09T16:15:29.252 INFO:tasks.workunit.client.1.vm05.stdout:3/806: unlink d0/d9/d22/l40 0
2026-03-09T16:15:29.258 INFO:tasks.workunit.client.1.vm05.stdout:2/779: write db/dd/d15/d1f/d20/fee [762981,35394] 0
2026-03-09T16:15:29.261 INFO:tasks.workunit.client.1.vm05.stdout:7/885: mknod d1/d2/d11/d86/da2/c135 0
2026-03-09T16:15:29.261 INFO:tasks.workunit.client.1.vm05.stdout:7/886: readlink d1/lee 0
2026-03-09T16:15:29.261 INFO:tasks.workunit.client.1.vm05.stdout:1/937: getdents d7/dd/d21/d39/d87/db9/d138/db4 0
2026-03-09T16:15:29.269 INFO:tasks.workunit.client.1.vm05.stdout:2/780: creat db/dd/d15/d3f/ffe x:0 0 0
2026-03-09T16:15:29.270 INFO:tasks.workunit.client.1.vm05.stdout:3/807: dread d0/d9/fe9 [0,4194304] 0
2026-03-09T16:15:29.271 INFO:tasks.workunit.client.1.vm05.stdout:6/828: mknod d17/d22/d27/d44/d125/c13b 0
2026-03-09T16:15:29.272 INFO:tasks.workunit.client.1.vm05.stdout:6/829: write d17/d5d/f71 [3209532,99932] 0
2026-03-09T16:15:29.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:28 vm05.local ceph-mon[58702]: from='client.24459 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T16:15:29.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:28 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu'
2026-03-09T16:15:29.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:28 vm05.local ceph-mon[58702]: from='client.? 192.168.123.103:0/2714829032' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T16:15:29.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:28 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T16:15:29.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:28 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T16:15:29.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:28 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T16:15:29.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:28 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T16:15:29.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:28 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T16:15:29.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:28 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T16:15:29.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:28 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T16:15:29.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:28 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T16:15:29.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:28 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T16:15:29.277 INFO:tasks.workunit.client.1.vm05.stdout:5/913: truncate d8/d53/ffc 106617 0
2026-03-09T16:15:29.279 INFO:tasks.workunit.client.1.vm05.stdout:3/808: fsync d0/d9/fe9 0
2026-03-09T16:15:29.280 INFO:tasks.workunit.client.1.vm05.stdout:5/914: write d8/d18/d1b/f12f [4493028,49808] 0
2026-03-09T16:15:29.280 INFO:tasks.workunit.client.1.vm05.stdout:9/878: getdents d4/d10/d35/d2b/d38/d65/dd6/de3 0
2026-03-09T16:15:29.282 INFO:tasks.workunit.client.1.vm05.stdout:9/879: write d4/d10/d35/d2b/d31/d82/f118 [479936,49409] 0
2026-03-09T16:15:29.283 INFO:tasks.workunit.client.1.vm05.stdout:2/781: truncate db/dd/d15/d1f/f25 3084268 0
2026-03-09T16:15:29.286 INFO:tasks.workunit.client.1.vm05.stdout:3/809: fsync d0/d9/f4d 0
2026-03-09T16:15:29.288 INFO:tasks.workunit.client.1.vm05.stdout:6/830: truncate d17/d22/d9d/fe8 745398 0
2026-03-09T16:15:29.288 INFO:tasks.workunit.client.1.vm05.stdout:1/938: getdents d7/dd/de/d52/d5b 0
2026-03-09T16:15:29.290 INFO:tasks.workunit.client.1.vm05.stdout:9/880: creat d4/d10/d35/d2b/dc1/dc2/d10f/f11f x:0 0 0
2026-03-09T16:15:29.291 INFO:tasks.workunit.client.1.vm05.stdout:3/810: readlink d0/d9/d22/d5f/d75/d76/d88/da3/df7/d80/l84 0
2026-03-09T16:15:29.294 INFO:tasks.workunit.client.1.vm05.stdout:6/831: rmdir d17/d22/dce 39
2026-03-09T16:15:29.296 INFO:tasks.workunit.client.1.vm05.stdout:7/887: write d1/d2/d8/dc/d1b/d71/d3c/f9b [3383924,61227] 0
2026-03-09T16:15:29.298
INFO:tasks.workunit.client.1.vm05.stdout:5/915: dwrite d8/d18/d1b/d47/d48/f118 [0,4194304] 0 2026-03-09T16:15:29.299 INFO:tasks.workunit.client.1.vm05.stdout:1/939: symlink d7/d15/d6e/dbc/l143 0 2026-03-09T16:15:29.299 INFO:tasks.workunit.client.1.vm05.stdout:9/881: dread - d4/d10/d35/d36/d48/d54/db0/f110 zero size 2026-03-09T16:15:29.300 INFO:tasks.workunit.client.1.vm05.stdout:4/938: write d5/de/d15/f1b [1455619,11041] 0 2026-03-09T16:15:29.309 INFO:tasks.workunit.client.1.vm05.stdout:8/897: dwrite d4/d6/db/dc/d2e/f47 [0,4194304] 0 2026-03-09T16:15:29.309 INFO:tasks.workunit.client.1.vm05.stdout:8/898: chown d4/d6/d9a 7 1 2026-03-09T16:15:29.309 INFO:tasks.workunit.client.1.vm05.stdout:0/836: dwrite d5/d1b/f61 [0,4194304] 0 2026-03-09T16:15:29.309 INFO:tasks.workunit.client.1.vm05.stdout:3/811: unlink d0/d9/d22/d5f/d7b/d99/lc6 0 2026-03-09T16:15:29.311 INFO:tasks.workunit.client.1.vm05.stdout:1/940: read d7/f4b [8146629,92417] 0 2026-03-09T16:15:29.314 INFO:tasks.workunit.client.1.vm05.stdout:4/939: write d5/f14d [903993,84356] 0 2026-03-09T16:15:29.319 INFO:tasks.workunit.client.1.vm05.stdout:0/837: chown d5/d2c/f41 11555 1 2026-03-09T16:15:29.323 INFO:tasks.workunit.client.1.vm05.stdout:1/941: write d7/d27/f84 [3730416,100110] 0 2026-03-09T16:15:29.328 INFO:tasks.workunit.client.1.vm05.stdout:9/882: stat d4/lff 0 2026-03-09T16:15:29.330 INFO:tasks.workunit.client.1.vm05.stdout:7/888: write d1/d2/d8/d31/d8d/f6f [2651212,73957] 0 2026-03-09T16:15:29.330 INFO:tasks.workunit.client.1.vm05.stdout:2/782: dread db/dd/d15/d4c/f58 [0,4194304] 0 2026-03-09T16:15:29.332 INFO:tasks.workunit.client.1.vm05.stdout:5/916: mkdir d8/d18/d1b/d78/d90/d134 0 2026-03-09T16:15:29.337 INFO:tasks.workunit.client.1.vm05.stdout:3/812: creat d0/dce/f118 x:0 0 0 2026-03-09T16:15:29.340 INFO:tasks.workunit.client.1.vm05.stdout:9/883: unlink d4/d10/d35/d36/d48/d54/c116 0 2026-03-09T16:15:29.341 INFO:tasks.workunit.client.1.vm05.stdout:7/889: stat d1/d2/d11/d86/da2/c133 0 2026-03-09T16:15:29.342 INFO:tasks.workunit.client.1.vm05.stdout:7/890: readlink d1/d2/d11/d86/da2/db6/lf2 0 2026-03-09T16:15:29.343 INFO:tasks.workunit.client.1.vm05.stdout:5/917: truncate d8/d18/dbc/dcc/daa/f35 5856502 0 2026-03-09T16:15:29.344 INFO:tasks.workunit.client.1.vm05.stdout:5/918: chown d8/d53/l9f 342 1 2026-03-09T16:15:29.346 INFO:tasks.workunit.client.1.vm05.stdout:4/940: sync 2026-03-09T16:15:29.348 INFO:tasks.workunit.client.1.vm05.stdout:7/891: dwrite d1/d2/d8/dc/d1b/d30/d4b/d65/f8f [4194304,4194304] 0 2026-03-09T16:15:29.349 INFO:tasks.workunit.client.1.vm05.stdout:0/838: mkdir d5/d2c/d49/d83/d8b/dd5/d121 0 2026-03-09T16:15:29.351 INFO:tasks.workunit.client.1.vm05.stdout:7/892: read d1/d2/f5 [3480548,80002] 0 2026-03-09T16:15:29.353 INFO:tasks.workunit.client.1.vm05.stdout:9/884: rmdir d4/d10/d35/d2b/d38/d65/dd6/de3 39 2026-03-09T16:15:29.356 INFO:tasks.workunit.client.1.vm05.stdout:2/783: mkdir db/dd/d15/dff 0 2026-03-09T16:15:29.356 INFO:tasks.workunit.client.1.vm05.stdout:5/919: truncate d8/d59/d5b/d8b/da0/fc1 1488715 0 2026-03-09T16:15:29.357 INFO:tasks.workunit.client.1.vm05.stdout:9/885: dread d4/d10/d35/d2b/d38/f112 [0,4194304] 0 2026-03-09T16:15:29.362 INFO:tasks.workunit.client.1.vm05.stdout:4/941: mknod d5/de/d15/d21/d27/d3c/d142/c151 0 2026-03-09T16:15:29.364 INFO:tasks.workunit.client.1.vm05.stdout:4/942: truncate d5/de/d15/da9/db1/dad/d37/d60/f113 1787813 0 2026-03-09T16:15:29.366 INFO:tasks.workunit.client.1.vm05.stdout:0/839: dwrite d5/db/d5f/fd6 [0,4194304] 0 2026-03-09T16:15:29.366 
INFO:tasks.workunit.client.1.vm05.stdout:8/899: truncate d4/d6/f9 2220869 0 2026-03-09T16:15:29.366 INFO:tasks.workunit.client.1.vm05.stdout:1/942: creat d7/d15/d6e/dbc/f144 x:0 0 0 2026-03-09T16:15:29.367 INFO:tasks.workunit.client.1.vm05.stdout:8/900: chown d4/d6/f58 463532397 1 2026-03-09T16:15:29.368 INFO:tasks.workunit.client.1.vm05.stdout:8/901: write d4/d6/d53/f89 [3144450,13725] 0 2026-03-09T16:15:29.370 INFO:tasks.workunit.client.1.vm05.stdout:4/943: rmdir d5/de/d15/d21/d27/d3c/d142 39 2026-03-09T16:15:29.371 INFO:tasks.workunit.client.1.vm05.stdout:0/840: chown d5/db/d5b/de9 8533914 1 2026-03-09T16:15:29.378 INFO:tasks.workunit.client.1.vm05.stdout:7/893: symlink d1/d2/d8/dc/d1b/d30/d7d/l136 0 2026-03-09T16:15:29.378 INFO:tasks.workunit.client.1.vm05.stdout:1/943: rmdir d7/dd/de/d52/d5b 39 2026-03-09T16:15:29.384 INFO:tasks.workunit.client.1.vm05.stdout:5/920: write d8/d18/d1b/d47/d4e/ff1 [256181,104121] 0 2026-03-09T16:15:29.384 INFO:tasks.workunit.client.1.vm05.stdout:6/832: dwrite d17/d22/d27/d34/d4b/fa4 [0,4194304] 0 2026-03-09T16:15:29.390 INFO:tasks.workunit.client.1.vm05.stdout:8/902: truncate d4/d6/d3a/d3c/f3f 2124285 0 2026-03-09T16:15:29.391 INFO:tasks.workunit.client.1.vm05.stdout:5/921: dwrite d8/d18/d1b/d47/d48/d73/d80/fe5 [4194304,4194304] 0 2026-03-09T16:15:29.393 INFO:tasks.workunit.client.1.vm05.stdout:4/944: mkdir d5/de/d15/d21/da0/de3/d100/d152 0 2026-03-09T16:15:29.395 INFO:tasks.workunit.client.1.vm05.stdout:3/813: getdents d0/d9/d22/d5f/d75/d76/d88/da3 0 2026-03-09T16:15:29.396 INFO:tasks.workunit.client.1.vm05.stdout:7/894: chown d1/d2/d8/dc/f1e 0 1 2026-03-09T16:15:29.397 INFO:tasks.workunit.client.1.vm05.stdout:1/944: mknod d7/dd/d21/d63/c145 0 2026-03-09T16:15:29.397 INFO:tasks.workunit.client.1.vm05.stdout:4/945: readlink d5/de/d15/d21/d39/d91/de9/l129 0 2026-03-09T16:15:29.399 INFO:tasks.workunit.client.1.vm05.stdout:4/946: chown c3 856807 1 2026-03-09T16:15:29.404 INFO:tasks.workunit.client.1.vm05.stdout:1/945: dwrite d7/dd/d21/d44/f11a [0,4194304] 0 2026-03-09T16:15:29.421 INFO:tasks.workunit.client.1.vm05.stdout:4/947: sync 2026-03-09T16:15:29.422 INFO:tasks.workunit.client.1.vm05.stdout:2/784: getdents db/dd/d15/d1f/d20/d23 0 2026-03-09T16:15:29.422 INFO:tasks.workunit.client.1.vm05.stdout:9/886: truncate d4/d10/d35/d2b/d38/f5e 1247989 0 2026-03-09T16:15:29.425 INFO:tasks.workunit.client.1.vm05.stdout:7/895: dread - d1/d2/d8/d67/d76/ff1 zero size 2026-03-09T16:15:29.428 INFO:tasks.workunit.client.1.vm05.stdout:2/785: creat db/dd/d15/d3f/f100 x:0 0 0 2026-03-09T16:15:29.437 INFO:tasks.workunit.client.1.vm05.stdout:5/922: truncate d8/d18/dbc/dcc/daa/f35 246580 0 2026-03-09T16:15:29.437 INFO:tasks.workunit.client.1.vm05.stdout:7/896: creat d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/ddb/f137 x:0 0 0 2026-03-09T16:15:29.437 INFO:tasks.workunit.client.1.vm05.stdout:3/814: link d0/d9/d22/d5f/d75/d76/fed d0/d9/d97/dac/d113/f119 0 2026-03-09T16:15:29.437 INFO:tasks.workunit.client.1.vm05.stdout:4/948: fsync d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/f86 0 2026-03-09T16:15:29.437 INFO:tasks.workunit.client.1.vm05.stdout:2/786: chown db/dd/lac 11309651 1 2026-03-09T16:15:29.437 INFO:tasks.workunit.client.1.vm05.stdout:0/841: getdents d5/db/d77 0 2026-03-09T16:15:29.437 INFO:tasks.workunit.client.1.vm05.stdout:3/815: creat d0/d9/d22/d6b/f11a x:0 0 0 2026-03-09T16:15:29.437 INFO:tasks.workunit.client.1.vm05.stdout:9/887: creat d4/d10/d35/d2b/d38/d65/dd6/f120 x:0 0 0 2026-03-09T16:15:29.437 INFO:tasks.workunit.client.1.vm05.stdout:7/897: mknod d1/d2/c138 0 2026-03-09T16:15:29.437 
INFO:tasks.workunit.client.1.vm05.stdout:9/888: chown d4/d10/d35/d2b/d38/fa0 0 1 2026-03-09T16:15:29.437 INFO:tasks.workunit.client.1.vm05.stdout:3/816: mknod d0/d9/d22/d5f/d75/d76/d88/d89/c11b 0 2026-03-09T16:15:29.437 INFO:tasks.workunit.client.1.vm05.stdout:9/889: chown d4/d10/laf 19147 1 2026-03-09T16:15:29.437 INFO:tasks.workunit.client.1.vm05.stdout:2/787: dwrite db/fc6 [0,4194304] 0 2026-03-09T16:15:29.441 INFO:tasks.workunit.client.1.vm05.stdout:7/898: chown d1/d2/d8/dc/f45 10632831 1 2026-03-09T16:15:29.454 INFO:tasks.workunit.client.1.vm05.stdout:2/788: dread db/dd/d15/fd4 [0,4194304] 0 2026-03-09T16:15:29.455 INFO:tasks.workunit.client.1.vm05.stdout:9/890: dwrite d4/d10/d35/d36/fce [0,4194304] 0 2026-03-09T16:15:29.455 INFO:tasks.workunit.client.1.vm05.stdout:4/949: dread d5/de/d15/d21/f50 [0,4194304] 0 2026-03-09T16:15:29.455 INFO:tasks.workunit.client.1.vm05.stdout:5/923: getdents d8/d18/d1b/d78 0 2026-03-09T16:15:29.456 INFO:tasks.workunit.client.1.vm05.stdout:4/950: chown d5/d9c/d124/f14e 35623 1 2026-03-09T16:15:29.465 INFO:tasks.workunit.client.1.vm05.stdout:7/899: creat d1/d2/d8/dc/d1b/d30/d4b/d65/db1/d12b/f139 x:0 0 0 2026-03-09T16:15:29.466 INFO:tasks.workunit.client.1.vm05.stdout:5/924: mkdir d8/d18/d1b/d47/d4e/d76/d11d/d135 0 2026-03-09T16:15:29.466 INFO:tasks.workunit.client.1.vm05.stdout:5/925: write d8/d18/d1b/f12c [490182,91137] 0 2026-03-09T16:15:29.468 INFO:tasks.workunit.client.1.vm05.stdout:5/926: fdatasync d8/d59/f5f 0 2026-03-09T16:15:29.469 INFO:tasks.workunit.client.1.vm05.stdout:9/891: rename d4/d119/c11a to d4/d10/d35/d2b/d31/d82/dec/c121 0 2026-03-09T16:15:29.470 INFO:tasks.workunit.client.1.vm05.stdout:4/951: symlink d5/de/d15/d21/d39/d91/l153 0 2026-03-09T16:15:29.473 INFO:tasks.workunit.client.1.vm05.stdout:3/817: creat d0/d9/d22/d5f/f11c x:0 0 0 2026-03-09T16:15:29.473 INFO:tasks.workunit.client.1.vm05.stdout:4/952: read - d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/f86 zero size 2026-03-09T16:15:29.473 INFO:tasks.workunit.client.1.vm05.stdout:7/900: mknod d1/d2/d8/dc/d1b/d30/c13a 0 2026-03-09T16:15:29.474 INFO:tasks.workunit.client.1.vm05.stdout:5/927: creat d8/d5e/d11b/d11e/f136 x:0 0 0 2026-03-09T16:15:29.474 INFO:tasks.workunit.client.1.vm05.stdout:9/892: dread d4/d10/d35/d36/f91 [0,4194304] 0 2026-03-09T16:15:29.475 INFO:tasks.workunit.client.1.vm05.stdout:7/901: dread - d1/d2/d8/dc/d14/f132 zero size 2026-03-09T16:15:29.475 INFO:tasks.workunit.client.1.vm05.stdout:3/818: mknod d0/d9/d22/d5f/d75/d76/d88/d89/c11d 0 2026-03-09T16:15:29.476 INFO:tasks.workunit.client.1.vm05.stdout:4/953: dread - d5/d9c/fa8 zero size 2026-03-09T16:15:29.477 INFO:tasks.workunit.client.1.vm05.stdout:5/928: stat d8/d18/d1b/d47/d48/d73/d80/cd7 0 2026-03-09T16:15:29.477 INFO:tasks.workunit.client.1.vm05.stdout:3/819: rmdir d0/d33 39 2026-03-09T16:15:29.478 INFO:tasks.workunit.client.1.vm05.stdout:7/902: fdatasync d1/d2/d8/dc/d33/f57 0 2026-03-09T16:15:29.483 INFO:tasks.workunit.client.1.vm05.stdout:4/954: creat d5/f154 x:0 0 0 2026-03-09T16:15:29.483 INFO:tasks.workunit.client.1.vm05.stdout:9/893: getdents d4/d10/d35/d2b/d31/d82/dc5 0 2026-03-09T16:15:29.483 INFO:tasks.workunit.client.1.vm05.stdout:7/903: dread - d1/d2/d8/dc/d1b/d30/d7d/d114/f92 zero size 2026-03-09T16:15:29.483 INFO:tasks.workunit.client.1.vm05.stdout:4/955: write d5/de/d15/da9/db1/dad/d37/d60/dbf/fc7 [2583191,71644] 0 2026-03-09T16:15:29.483 INFO:tasks.workunit.client.1.vm05.stdout:3/820: symlink d0/da9/l11e 0 2026-03-09T16:15:29.484 INFO:tasks.workunit.client.1.vm05.stdout:5/929: dread d8/d53/ffc [0,4194304] 0 
2026-03-09T16:15:29.488 INFO:tasks.workunit.client.1.vm05.stdout:4/956: truncate d5/de/d15/d21/d39/f42 4279837 0 2026-03-09T16:15:29.496 INFO:tasks.workunit.client.1.vm05.stdout:2/789: sync 2026-03-09T16:15:29.496 INFO:tasks.workunit.client.1.vm05.stdout:3/821: sync 2026-03-09T16:15:29.498 INFO:tasks.workunit.client.1.vm05.stdout:7/904: dwrite d1/d2/d8/dc/f1a [0,4194304] 0 2026-03-09T16:15:29.500 INFO:tasks.workunit.client.1.vm05.stdout:0/842: write d5/d2c/dff/f2e [449206,90513] 0 2026-03-09T16:15:29.509 INFO:tasks.workunit.client.1.vm05.stdout:6/833: dwrite d17/d22/d27/d44/f7a [0,4194304] 0 2026-03-09T16:15:29.509 INFO:tasks.workunit.client.1.vm05.stdout:8/903: dwrite d4/d6/d53/fb1 [4194304,4194304] 0 2026-03-09T16:15:29.512 INFO:tasks.workunit.client.1.vm05.stdout:7/905: chown d1/d2/d8/dc/d1b/d30/d4b/fe7 78 1 2026-03-09T16:15:29.514 INFO:tasks.workunit.client.1.vm05.stdout:0/843: write d5/d11/f40 [302666,122882] 0 2026-03-09T16:15:29.515 INFO:tasks.workunit.client.1.vm05.stdout:7/906: dread - d1/d2/d8/dc/d33/fb5 zero size 2026-03-09T16:15:29.520 INFO:tasks.workunit.client.1.vm05.stdout:5/930: creat d8/d95/f137 x:0 0 0 2026-03-09T16:15:29.524 INFO:tasks.workunit.client.1.vm05.stdout:1/946: dwrite d7/dd/d21/d63/d71/fd0 [0,4194304] 0 2026-03-09T16:15:29.524 INFO:tasks.workunit.client.1.vm05.stdout:4/957: symlink d5/de/d15/d21/d27/d3c/d5c/da2/dc9/l155 0 2026-03-09T16:15:29.531 INFO:tasks.workunit.client.1.vm05.stdout:9/894: write d4/d10/d35/f44 [112008,122560] 0 2026-03-09T16:15:29.538 INFO:tasks.workunit.client.1.vm05.stdout:6/834: fdatasync d17/d22/d27/d8a/d8b/f9c 0 2026-03-09T16:15:29.539 INFO:tasks.workunit.client.1.vm05.stdout:6/835: chown d17/f31 10 1 2026-03-09T16:15:29.539 INFO:tasks.workunit.client.1.vm05.stdout:0/844: rename d5/d2c/fde to d5/d97/f122 0 2026-03-09T16:15:29.540 INFO:tasks.workunit.client.1.vm05.stdout:0/845: fsync d5/db/d5f/da3/f11f 0 2026-03-09T16:15:29.549 INFO:tasks.workunit.client.1.vm05.stdout:1/947: creat d7/dd/d21/d39/d87/db9/d138/d55/d125/f146 x:0 0 0 2026-03-09T16:15:29.554 INFO:tasks.workunit.client.1.vm05.stdout:4/958: dread d5/de/d15/da9/db1/dad/d37/d60/f6c [0,4194304] 0 2026-03-09T16:15:29.557 INFO:tasks.workunit.client.1.vm05.stdout:8/904: sync 2026-03-09T16:15:29.558 INFO:tasks.workunit.client.1.vm05.stdout:9/895: sync 2026-03-09T16:15:29.558 INFO:tasks.workunit.client.1.vm05.stdout:9/896: stat d4/d10/d35/d2b/f9d 0 2026-03-09T16:15:29.561 INFO:tasks.workunit.client.1.vm05.stdout:6/836: rename d17/d22/d27/d34/d42/f11a to d17/d22/d27/d34/dd1/d10a/f13c 0 2026-03-09T16:15:29.561 INFO:tasks.workunit.client.1.vm05.stdout:0/846: readlink d5/d11/d4f/ddc/lf6 0 2026-03-09T16:15:29.562 INFO:tasks.workunit.client.1.vm05.stdout:5/931: symlink d8/d18/dbc/dcc/l138 0 2026-03-09T16:15:29.567 INFO:tasks.workunit.client.1.vm05.stdout:5/932: chown d8/d18/d1b/d47/d68/f70 632 1 2026-03-09T16:15:29.567 INFO:tasks.workunit.client.1.vm05.stdout:2/790: rmdir db/dd/d15/d3f/d5b/d60/d95/de0 0 2026-03-09T16:15:29.567 INFO:tasks.workunit.client.1.vm05.stdout:4/959: chown d5/de/d15/da9/db1/dad/f32 262315663 1 2026-03-09T16:15:29.567 INFO:tasks.workunit.client.1.vm05.stdout:4/960: chown d5/de/d15/da9/df6/f12a 20 1 2026-03-09T16:15:29.570 INFO:tasks.workunit.client.1.vm05.stdout:8/905: fsync d4/d6/f44 0 2026-03-09T16:15:29.577 INFO:tasks.workunit.client.1.vm05.stdout:2/791: dwrite db/fc6 [0,4194304] 0 2026-03-09T16:15:29.587 INFO:tasks.workunit.client.1.vm05.stdout:0/847: fsync d5/f8 0 2026-03-09T16:15:29.587 INFO:tasks.workunit.client.1.vm05.stdout:5/933: creat d8/d59/d5b/f139 x:0 0 0 
2026-03-09T16:15:29.587 INFO:tasks.workunit.client.1.vm05.stdout:1/948: dread d7/dd/f1f [0,4194304] 0 2026-03-09T16:15:29.589 INFO:tasks.workunit.client.1.vm05.stdout:9/897: write d4/d10/d35/d2b/d38/fa0 [839883,49913] 0 2026-03-09T16:15:29.589 INFO:tasks.workunit.client.1.vm05.stdout:8/906: mknod d4/d6/db/dc/d2e/d85/dc7/dda/c127 0 2026-03-09T16:15:29.592 INFO:tasks.workunit.client.1.vm05.stdout:0/848: mknod d5/db/d5f/da3/da4/c123 0 2026-03-09T16:15:29.595 INFO:tasks.workunit.client.1.vm05.stdout:3/822: dwrite d0/f45 [0,4194304] 0 2026-03-09T16:15:29.599 INFO:tasks.workunit.client.1.vm05.stdout:2/792: unlink db/dd/d15/d1f/d20/d23/d78/ld1 0 2026-03-09T16:15:29.607 INFO:tasks.workunit.client.1.vm05.stdout:4/961: symlink d5/de/d15/da9/db1/dad/d37/l156 0 2026-03-09T16:15:29.607 INFO:tasks.workunit.client.1.vm05.stdout:7/907: dwrite d1/d2/d8/dc/dd4/f5b [0,4194304] 0 2026-03-09T16:15:29.608 INFO:tasks.workunit.client.1.vm05.stdout:6/837: dwrite d17/f4e [0,4194304] 0 2026-03-09T16:15:29.616 INFO:tasks.workunit.client.1.vm05.stdout:7/908: fdatasync d1/d2/d8/dc/d14/f132 0 2026-03-09T16:15:29.618 INFO:tasks.workunit.client.1.vm05.stdout:2/793: read - db/dd/d15/d1f/d21/d87/fbe zero size 2026-03-09T16:15:29.629 INFO:tasks.workunit.client.1.vm05.stdout:5/934: dread f1 [0,4194304] 0 2026-03-09T16:15:29.629 INFO:tasks.workunit.client.1.vm05.stdout:5/935: fdatasync d8/d5e/d11b/d11e/f136 0 2026-03-09T16:15:29.629 INFO:tasks.workunit.client.1.vm05.stdout:5/936: readlink d8/d53/d7e/l12b 0 2026-03-09T16:15:29.630 INFO:tasks.workunit.client.1.vm05.stdout:7/909: dread - d1/d2/d8/dc/d33/fc4 zero size 2026-03-09T16:15:29.631 INFO:tasks.workunit.client.1.vm05.stdout:6/838: creat d17/d5d/d73/d83/d119/f13d x:0 0 0 2026-03-09T16:15:29.634 INFO:tasks.workunit.client.1.vm05.stdout:5/937: chown d8/d18/dbc/dcc/daa/le9 4945716 1 2026-03-09T16:15:29.635 INFO:tasks.workunit.client.1.vm05.stdout:1/949: rename d7/d15/l1a to d7/dd/d21/d44/d5c/l147 0 2026-03-09T16:15:29.637 INFO:tasks.workunit.client.1.vm05.stdout:9/898: getdents d4/d10/d35/d2b/d31/d82/dec 0 2026-03-09T16:15:29.637 INFO:tasks.workunit.client.1.vm05.stdout:8/907: dread d4/d6/db/df/fdc [0,4194304] 0 2026-03-09T16:15:29.637 INFO:tasks.workunit.client.1.vm05.stdout:0/849: getdents d5/d2c/d49/d83/d8b/daf/de8 0 2026-03-09T16:15:29.639 INFO:tasks.workunit.client.1.vm05.stdout:0/850: chown d5/d11/d4f/d68/f6b 538 1 2026-03-09T16:15:29.641 INFO:tasks.workunit.client.1.vm05.stdout:9/899: dwrite d4/f66 [0,4194304] 0 2026-03-09T16:15:29.643 INFO:tasks.workunit.client.1.vm05.stdout:2/794: rename db/dd/d15/d3f/l66 to db/dd/d15/d4c/d56/l101 0 2026-03-09T16:15:29.643 INFO:tasks.workunit.client.1.vm05.stdout:3/823: write d0/d9/f2b [1070761,53201] 0 2026-03-09T16:15:29.646 INFO:tasks.workunit.client.1.vm05.stdout:7/910: truncate d1/d2/d8/dc/d1b/d30/fb8 293649 0 2026-03-09T16:15:29.648 INFO:tasks.workunit.client.1.vm05.stdout:6/839: symlink d17/d22/d27/d34/d42/d53/d87/l13e 0 2026-03-09T16:15:29.648 INFO:tasks.workunit.client.1.vm05.stdout:3/824: fsync d0/d9/d22/d5f/d75/f117 0 2026-03-09T16:15:29.649 INFO:tasks.workunit.client.1.vm05.stdout:0/851: write d5/db/def/df2/f112 [695586,76399] 0 2026-03-09T16:15:29.656 INFO:tasks.workunit.client.1.vm05.stdout:5/938: dread d8/fb [0,4194304] 0 2026-03-09T16:15:29.656 INFO:tasks.workunit.client.1.vm05.stdout:6/840: dread - d17/d22/d27/d34/dd1/d10a/f135 zero size 2026-03-09T16:15:29.657 INFO:tasks.workunit.client.1.vm05.stdout:1/950: dwrite d7/dd/d21/d63/d71/fd0 [0,4194304] 0 2026-03-09T16:15:29.664 
INFO:tasks.workunit.client.1.vm05.stdout:4/962: dread d5/de/d15/da9/db1/dad/f1f [0,4194304] 0 2026-03-09T16:15:29.665 INFO:tasks.workunit.client.1.vm05.stdout:4/963: chown d5/d116/c96 71 1 2026-03-09T16:15:29.675 INFO:tasks.workunit.client.1.vm05.stdout:8/908: truncate d4/d6/f58 4900668 0 2026-03-09T16:15:29.677 INFO:tasks.workunit.client.1.vm05.stdout:5/939: rename d8/dd5/f12d to d8/d18/d1b/f13a 0 2026-03-09T16:15:29.677 INFO:tasks.workunit.client.1.vm05.stdout:4/964: mknod d5/de/d15/d21/d39/c157 0 2026-03-09T16:15:29.678 INFO:tasks.workunit.client.1.vm05.stdout:5/940: fdatasync d8/d18/d1b/d47/f4c 0 2026-03-09T16:15:29.678 INFO:tasks.workunit.client.1.vm05.stdout:2/795: link db/dd/d15/d3f/d5b/d60/da2/fde db/dd/d7b/f102 0 2026-03-09T16:15:29.680 INFO:tasks.workunit.client.1.vm05.stdout:9/900: creat d4/d10/d35/d2b/d38/d65/dd6/f122 x:0 0 0 2026-03-09T16:15:29.682 INFO:tasks.workunit.client.1.vm05.stdout:9/901: write d4/d10c/f107 [725785,57037] 0 2026-03-09T16:15:29.682 INFO:tasks.workunit.client.1.vm05.stdout:3/825: mkdir d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/d11f 0 2026-03-09T16:15:29.684 INFO:tasks.workunit.client.1.vm05.stdout:6/841: dread d17/d22/f79 [0,4194304] 0 2026-03-09T16:15:29.685 INFO:tasks.workunit.client.1.vm05.stdout:4/965: truncate d5/de/d15/d21/d27/d3c/d5c/da2/dc9/f11c 564593 0 2026-03-09T16:15:29.685 INFO:tasks.workunit.client.1.vm05.stdout:9/902: read - d4/d10/d35/d36/d48/d60/dae/fc9 zero size 2026-03-09T16:15:29.686 INFO:tasks.workunit.client.1.vm05.stdout:3/826: creat d0/d9/d22/d5f/d7b/da8/f120 x:0 0 0 2026-03-09T16:15:29.686 INFO:tasks.workunit.client.1.vm05.stdout:8/909: truncate d4/d6/d3a/d7c/f11d 595674 0 2026-03-09T16:15:29.687 INFO:tasks.workunit.client.1.vm05.stdout:9/903: write d4/f2e [3945942,99746] 0 2026-03-09T16:15:29.688 INFO:tasks.workunit.client.1.vm05.stdout:2/796: rename db/dd/d15/d46/d67/ff0 to db/dd/d15/d4c/f103 0 2026-03-09T16:15:29.693 INFO:tasks.workunit.client.1.vm05.stdout:3/827: chown d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/f5d 7860 1 2026-03-09T16:15:29.694 INFO:tasks.workunit.client.1.vm05.stdout:8/910: fdatasync d4/d6/db/dc/d5d/d79/f91 0 2026-03-09T16:15:29.696 INFO:tasks.workunit.client.1.vm05.stdout:8/911: dread - d4/d6/d3a/d40/d71/f121 zero size 2026-03-09T16:15:29.696 INFO:tasks.workunit.client.1.vm05.stdout:4/966: dwrite d5/de/f24 [0,4194304] 0 2026-03-09T16:15:29.698 INFO:tasks.workunit.client.1.vm05.stdout:9/904: rename d4/d10/d35/d36/c69 to d4/d119/c123 0 2026-03-09T16:15:29.699 INFO:tasks.workunit.client.1.vm05.stdout:6/842: dwrite d17/d5d/f71 [4194304,4194304] 0 2026-03-09T16:15:29.708 INFO:tasks.workunit.client.1.vm05.stdout:2/797: sync 2026-03-09T16:15:29.709 INFO:tasks.workunit.client.1.vm05.stdout:0/852: write d5/d11/d4f/d70/fd0 [6300020,56561] 0 2026-03-09T16:15:29.711 INFO:tasks.workunit.client.1.vm05.stdout:1/951: write d7/dd/d21/d39/d48/d5d/f134 [4874838,11466] 0 2026-03-09T16:15:29.712 INFO:tasks.workunit.client.1.vm05.stdout:2/798: stat db/dd/d15/d3f/d5b/f69 0 2026-03-09T16:15:29.714 INFO:tasks.workunit.client.1.vm05.stdout:3/828: creat d0/d33/d10a/f121 x:0 0 0 2026-03-09T16:15:29.715 INFO:tasks.workunit.client.1.vm05.stdout:7/911: dwrite d1/d2/d8/dc/d1b/f5a [0,4194304] 0 2026-03-09T16:15:29.717 INFO:tasks.workunit.client.1.vm05.stdout:9/905: creat d4/d10c/f124 x:0 0 0 2026-03-09T16:15:29.719 INFO:tasks.workunit.client.1.vm05.stdout:8/912: mkdir d4/d6/db/dc/d2e/d128 0 2026-03-09T16:15:29.723 INFO:tasks.workunit.client.1.vm05.stdout:6/843: read - d17/d22/d27/d34/dd1/f106 zero size 2026-03-09T16:15:29.730 
INFO:tasks.workunit.client.1.vm05.stdout:8/913: chown d4/d6/db/d75 4 1 2026-03-09T16:15:29.731 INFO:tasks.workunit.client.1.vm05.stdout:5/941: dwrite d8/d18/d1b/d47/d4e/d76/d11d/f100 [0,4194304] 0 2026-03-09T16:15:29.731 INFO:tasks.workunit.client.1.vm05.stdout:1/952: creat d7/d27/f148 x:0 0 0 2026-03-09T16:15:29.731 INFO:tasks.workunit.client.1.vm05.stdout:2/799: mknod db/dd/d7b/c104 0 2026-03-09T16:15:29.731 INFO:tasks.workunit.client.1.vm05.stdout:0/853: truncate d5/d11/d4f/d11b/fca 2537571 0 2026-03-09T16:15:29.731 INFO:tasks.workunit.client.1.vm05.stdout:3/829: dread d0/d9/d22/d5f/d7b/d99/f9d [0,4194304] 0 2026-03-09T16:15:29.735 INFO:tasks.workunit.client.1.vm05.stdout:9/906: read d4/d10/d35/f7c [238864,87317] 0 2026-03-09T16:15:29.738 INFO:tasks.workunit.client.1.vm05.stdout:7/912: mkdir d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/d13b 0 2026-03-09T16:15:29.738 INFO:tasks.workunit.client.1.vm05.stdout:4/967: creat d5/de/d82/f158 x:0 0 0 2026-03-09T16:15:29.739 INFO:tasks.workunit.client.1.vm05.stdout:3/830: sync 2026-03-09T16:15:29.746 INFO:tasks.workunit.client.1.vm05.stdout:0/854: readlink d5/d1b/d30/lb0 0 2026-03-09T16:15:29.749 INFO:tasks.workunit.client.1.vm05.stdout:3/831: chown d0/d9/d22/d5f/d90/dae/cec 18806802 1 2026-03-09T16:15:29.750 INFO:tasks.workunit.client.1.vm05.stdout:6/844: symlink d17/d22/d9d/da9/d128/l13f 0 2026-03-09T16:15:29.750 INFO:tasks.workunit.client.1.vm05.stdout:5/942: truncate d8/d5e/f72 1937176 0 2026-03-09T16:15:29.752 INFO:tasks.workunit.client.1.vm05.stdout:3/832: write d0/d9/d22/d5f/d75/d76/d88/f9c [3306797,94243] 0 2026-03-09T16:15:29.772 INFO:tasks.workunit.client.1.vm05.stdout:3/833: dread d0/d9/d8b/f91 [0,4194304] 0 2026-03-09T16:15:29.797 INFO:tasks.workunit.client.1.vm05.stdout:8/914: write d4/d6/d3a/d3c/f3f [2534990,107484] 0 2026-03-09T16:15:29.798 INFO:tasks.workunit.client.1.vm05.stdout:8/915: chown d4/d6/db/dc/d5d/da0/dbf 2 1 2026-03-09T16:15:29.799 INFO:tasks.workunit.client.1.vm05.stdout:8/916: fdatasync d4/d6/d3a/f28 0 2026-03-09T16:15:29.801 INFO:tasks.workunit.client.1.vm05.stdout:1/953: truncate d7/dd/de/f109 1959338 0 2026-03-09T16:15:29.805 INFO:tasks.workunit.client.1.vm05.stdout:3/834: mkdir d0/d9/d22/d5f/d75/d76/d122 0 2026-03-09T16:15:29.806 INFO:tasks.workunit.client.1.vm05.stdout:8/917: symlink d4/d6/db/d59/db0/dd6/l129 0 2026-03-09T16:15:29.809 INFO:tasks.workunit.client.1.vm05.stdout:2/800: rename db/dd/d15/fd4 to db/dd/d15/f105 0 2026-03-09T16:15:29.810 INFO:tasks.workunit.client.1.vm05.stdout:3/835: unlink d0/d33/f92 0 2026-03-09T16:15:29.811 INFO:tasks.workunit.client.1.vm05.stdout:4/968: rename d5/de/d15/f1b to d5/de/d82/f159 0 2026-03-09T16:15:29.811 INFO:tasks.workunit.client.1.vm05.stdout:8/918: dread d4/d6/db/df/fdc [0,4194304] 0 2026-03-09T16:15:29.812 INFO:tasks.workunit.client.1.vm05.stdout:3/836: chown d0/d9/d22/d5f/d7b/da8/cd0 41824 1 2026-03-09T16:15:29.813 INFO:tasks.workunit.client.1.vm05.stdout:5/943: creat d8/d18/d1b/d47/f13b x:0 0 0 2026-03-09T16:15:29.814 INFO:tasks.workunit.client.1.vm05.stdout:5/944: readlink d8/d59/d5b/l63 0 2026-03-09T16:15:29.814 INFO:tasks.workunit.client.1.vm05.stdout:3/837: write d0/d9/f2f [1014913,52136] 0 2026-03-09T16:15:29.814 INFO:tasks.workunit.client.1.vm05.stdout:8/919: stat d4/d6/db/dc/ce0 0 2026-03-09T16:15:29.816 INFO:tasks.workunit.client.1.vm05.stdout:3/838: chown d0/d9/d22/d6b 0 1 2026-03-09T16:15:29.818 INFO:tasks.workunit.client.1.vm05.stdout:4/969: truncate d5/de/d15/da9/db1/dad/d90/dd8/fe2 1069603 0 2026-03-09T16:15:29.819 INFO:tasks.workunit.client.1.vm05.stdout:5/945: 
sync 2026-03-09T16:15:29.820 INFO:tasks.workunit.client.1.vm05.stdout:4/970: fsync d5/de/d15/da9/db1/dad/d37/f51 0 2026-03-09T16:15:29.822 INFO:tasks.workunit.client.1.vm05.stdout:2/801: link db/dd/d15/d3f/d5b/d60/da2/fa9 db/dd/d15/d1f/f106 0 2026-03-09T16:15:29.822 INFO:tasks.workunit.client.1.vm05.stdout:5/946: creat d8/dd5/f13c x:0 0 0 2026-03-09T16:15:29.823 INFO:tasks.workunit.client.1.vm05.stdout:2/802: stat db/dd/d15/d3f/d5b/d60/d95/de7 0 2026-03-09T16:15:29.824 INFO:tasks.workunit.client.1.vm05.stdout:6/845: getdents d17/d22/d9d/da9/d128 0 2026-03-09T16:15:29.826 INFO:tasks.workunit.client.1.vm05.stdout:2/803: chown db/dd/d15/d3f/d5b/d60/c8b 177 1 2026-03-09T16:15:29.831 INFO:tasks.workunit.client.1.vm05.stdout:3/839: dread d0/d9/f93 [0,4194304] 0 2026-03-09T16:15:29.832 INFO:tasks.workunit.client.1.vm05.stdout:4/971: mknod d5/de/d15/d21/d27/d3c/d5c/c15a 0 2026-03-09T16:15:29.835 INFO:tasks.workunit.client.1.vm05.stdout:5/947: dwrite d8/d18/d1b/d47/d4e/ff1 [0,4194304] 0 2026-03-09T16:15:29.838 INFO:tasks.workunit.client.1.vm05.stdout:5/948: fdatasync d8/d18/d1b/d47/f13b 0 2026-03-09T16:15:29.840 INFO:tasks.workunit.client.1.vm05.stdout:1/954: dread d7/d15/d16/f74 [0,4194304] 0 2026-03-09T16:15:29.852 INFO:tasks.workunit.client.1.vm05.stdout:9/907: write d4/d10/d35/d36/f67 [444321,86305] 0 2026-03-09T16:15:29.853 INFO:tasks.workunit.client.1.vm05.stdout:0/855: write d5/db/d5b/d82/fb7 [14106,119095] 0 2026-03-09T16:15:29.853 INFO:tasks.workunit.client.1.vm05.stdout:7/913: write d1/d2/d8/dc/d1b/d30/d7d/f103 [632104,76618] 0 2026-03-09T16:15:29.856 INFO:tasks.workunit.client.1.vm05.stdout:1/955: truncate d7/dd/d21/d39/d48/d8c/dd8/d103/d107/f141 1036717 0 2026-03-09T16:15:29.858 INFO:tasks.workunit.client.1.vm05.stdout:5/949: dwrite d8/d18/d1b/d47/f13b [0,4194304] 0 2026-03-09T16:15:29.868 INFO:tasks.workunit.client.1.vm05.stdout:6/846: creat d17/d22/d9d/f140 x:0 0 0 2026-03-09T16:15:29.868 INFO:tasks.workunit.client.1.vm05.stdout:8/920: dread d4/d6/db/dc/f30 [0,4194304] 0 2026-03-09T16:15:29.868 INFO:tasks.workunit.client.1.vm05.stdout:4/972: dwrite d5/de/d15/da9/fc6 [4194304,4194304] 0 2026-03-09T16:15:29.868 INFO:tasks.workunit.client.1.vm05.stdout:2/804: creat db/dd/f107 x:0 0 0 2026-03-09T16:15:29.870 INFO:tasks.workunit.client.1.vm05.stdout:5/950: stat d8/d59/d5b/d8b/da0/lec 0 2026-03-09T16:15:29.876 INFO:tasks.workunit.client.1.vm05.stdout:7/914: symlink d1/d2/d11/d86/d8a/l13c 0 2026-03-09T16:15:29.876 INFO:tasks.workunit.client.1.vm05.stdout:4/973: dwrite d5/fb [0,4194304] 0 2026-03-09T16:15:29.880 INFO:tasks.workunit.client.1.vm05.stdout:8/921: truncate d4/d6/db/dc/d3b/fc1 291443 0 2026-03-09T16:15:29.889 INFO:tasks.workunit.client.1.vm05.stdout:9/908: link d4/d10/d35/d36/c3b d4/d10/d35/d2b/d38/d65/c125 0 2026-03-09T16:15:29.889 INFO:tasks.workunit.client.1.vm05.stdout:4/974: fdatasync d5/de/d15/da9/db1/dad/d90/f105 0 2026-03-09T16:15:29.890 INFO:tasks.workunit.client.1.vm05.stdout:2/805: symlink db/dd/d15/d1f/l108 0 2026-03-09T16:15:29.890 INFO:tasks.workunit.client.1.vm05.stdout:8/922: creat d4/d6/db/dc/d5d/da0/dd7/f12a x:0 0 0 2026-03-09T16:15:29.890 INFO:tasks.workunit.client.1.vm05.stdout:1/956: dread d7/dd/de/d52/f7d [0,4194304] 0 2026-03-09T16:15:29.891 INFO:tasks.workunit.client.1.vm05.stdout:8/923: chown d4/d6/d3a/d40/f76 38953130 1 2026-03-09T16:15:29.891 INFO:tasks.workunit.client.1.vm05.stdout:1/957: stat d7/d27/f57 0 2026-03-09T16:15:29.902 INFO:tasks.workunit.client.1.vm05.stdout:1/958: stat d7/dd/d21/d39/d87/db9/d138/dfc 0 2026-03-09T16:15:29.903 
INFO:tasks.workunit.client.1.vm05.stdout:5/951: dread d8/d59/d5b/d8b/fd4 [0,4194304] 0 2026-03-09T16:15:29.905 INFO:tasks.workunit.client.1.vm05.stdout:9/909: read - d4/d10c/ff5 zero size 2026-03-09T16:15:29.906 INFO:tasks.workunit.client.1.vm05.stdout:3/840: write d0/d9/d22/d5f/d7b/d99/f102 [504165,83794] 0 2026-03-09T16:15:29.907 INFO:tasks.workunit.client.1.vm05.stdout:5/952: chown d8/d18/d1b/d47/d48/d73/dfb/c116 2084239 1 2026-03-09T16:15:29.907 INFO:tasks.workunit.client.1.vm05.stdout:8/924: unlink d4/d6/d3a/cd2 0 2026-03-09T16:15:29.908 INFO:tasks.workunit.client.1.vm05.stdout:9/910: dread - d4/d10c/f124 zero size 2026-03-09T16:15:29.908 INFO:tasks.workunit.client.1.vm05.stdout:1/959: mkdir d7/dd/d21/d63/d71/ddc/df8/d149 0 2026-03-09T16:15:29.908 INFO:tasks.workunit.client.1.vm05.stdout:3/841: chown d0/d9/d97/dac/lb6 0 1 2026-03-09T16:15:29.909 INFO:tasks.workunit.client.1.vm05.stdout:3/842: chown d0/d33/d10a/f121 561 1 2026-03-09T16:15:29.911 INFO:tasks.workunit.client.1.vm05.stdout:9/911: creat d4/d10/d35/d36/d48/f126 x:0 0 0 2026-03-09T16:15:29.912 INFO:tasks.workunit.client.1.vm05.stdout:1/960: dread - d7/dd/d21/d63/d71/ddc/df8/f10b zero size 2026-03-09T16:15:29.913 INFO:tasks.workunit.client.1.vm05.stdout:8/925: rename d4/d6/db/dc/d5d/da0/dd7/dd8 to d4/d6/db/df/d80/d12b 0 2026-03-09T16:15:29.914 INFO:tasks.workunit.client.1.vm05.stdout:1/961: dread d7/dd/de/d52/f7d [0,4194304] 0 2026-03-09T16:15:29.917 INFO:tasks.workunit.client.1.vm05.stdout:3/843: mkdir d0/d9/d22/d5f/d75/d76/d88/da3/df7/d123 0 2026-03-09T16:15:29.917 INFO:tasks.workunit.client.1.vm05.stdout:9/912: rename d4/d10/l3d to d4/d10c/ddd/l127 0 2026-03-09T16:15:29.920 INFO:tasks.workunit.client.1.vm05.stdout:8/926: mkdir d4/d6/d3a/d3c/d12c 0 2026-03-09T16:15:29.920 INFO:tasks.workunit.client.1.vm05.stdout:9/913: unlink d4/d10/d35/d36/d48/d54/d59/lf3 0 2026-03-09T16:15:29.920 INFO:tasks.workunit.client.1.vm05.stdout:3/844: stat d0/d9/cb 0 2026-03-09T16:15:29.921 INFO:tasks.workunit.client.1.vm05.stdout:8/927: write d4/d6/db/dc/d2e/f46 [1361287,123161] 0 2026-03-09T16:15:29.923 INFO:tasks.workunit.client.1.vm05.stdout:8/928: chown d4/d6/db/dc/d2e/f47 3403 1 2026-03-09T16:15:29.924 INFO:tasks.workunit.client.1.vm05.stdout:9/914: rmdir d4/d10/d35/d36/d48/d60 39 2026-03-09T16:15:29.924 INFO:tasks.workunit.client.1.vm05.stdout:8/929: write d4/d6/d3a/d40/f7b [3355917,77467] 0 2026-03-09T16:15:29.926 INFO:tasks.workunit.client.1.vm05.stdout:3/845: creat d0/d9/d22/d5f/d75/d76/d88/da3/f124 x:0 0 0 2026-03-09T16:15:29.927 INFO:tasks.workunit.client.1.vm05.stdout:8/930: dread - d4/d6/d3a/d15/f63 zero size 2026-03-09T16:15:29.927 INFO:tasks.workunit.client.1.vm05.stdout:0/856: write d5/fd4 [524373,31529] 0 2026-03-09T16:15:29.931 INFO:tasks.workunit.client.1.vm05.stdout:9/915: creat d4/d10/d35/d2b/dc1/f128 x:0 0 0 2026-03-09T16:15:29.935 INFO:tasks.workunit.client.1.vm05.stdout:8/931: creat d4/d6/d3a/f12d x:0 0 0 2026-03-09T16:15:29.935 INFO:tasks.workunit.client.1.vm05.stdout:7/915: truncate d1/d2/d8/dc/dd4/ff4 2873384 0 2026-03-09T16:15:29.937 INFO:tasks.workunit.client.1.vm05.stdout:2/806: write db/dd/d15/d1f/d20/d23/fbb [1964349,48802] 0 2026-03-09T16:15:29.939 INFO:tasks.workunit.client.1.vm05.stdout:6/847: dwrite d17/d22/d27/df8/d112/f113 [0,4194304] 0 2026-03-09T16:15:29.940 INFO:tasks.workunit.client.1.vm05.stdout:4/975: truncate d5/de/d15/da9/db1/f64 1707924 0 2026-03-09T16:15:29.942 INFO:tasks.workunit.client.1.vm05.stdout:5/953: write d8/d18/fc5 [367804,110991] 0 2026-03-09T16:15:29.946 
INFO:tasks.workunit.client.1.vm05.stdout:8/932: sync 2026-03-09T16:15:29.947 INFO:tasks.workunit.client.1.vm05.stdout:1/962: dread d7/dd/de/d52/f58 [0,4194304] 0 2026-03-09T16:15:29.957 INFO:tasks.workunit.client.1.vm05.stdout:3/846: unlink d0/d9/d22/d5f/d7b/f111 0 2026-03-09T16:15:29.957 INFO:tasks.workunit.client.1.vm05.stdout:0/857: mknod d5/db/d48/c124 0 2026-03-09T16:15:29.957 INFO:tasks.workunit.client.1.vm05.stdout:5/954: dwrite d8/d18/dbc/dcc/daa/fb1 [0,4194304] 0 2026-03-09T16:15:29.958 INFO:tasks.workunit.client.1.vm05.stdout:6/848: truncate d17/d5d/f102 6015616 0 2026-03-09T16:15:29.959 INFO:tasks.workunit.client.1.vm05.stdout:7/916: mknod d1/d2/d8/dc/d1b/d30/d4b/db2/de9/c13d 0 2026-03-09T16:15:29.959 INFO:tasks.workunit.client.1.vm05.stdout:9/916: readlink d4/d10/d35/d2b/d38/d65/dd6/de3/l5f 0 2026-03-09T16:15:29.966 INFO:tasks.workunit.client.1.vm05.stdout:8/933: creat d4/d6/db/d9b/f12e x:0 0 0 2026-03-09T16:15:29.971 INFO:tasks.workunit.client.1.vm05.stdout:8/934: stat f0 0 2026-03-09T16:15:29.974 INFO:tasks.workunit.client.1.vm05.stdout:5/955: chown d8/d18/d1b/d47/d4e/d76/d8f/df9/c114 5653655 1 2026-03-09T16:15:29.975 INFO:tasks.workunit.client.1.vm05.stdout:4/976: unlink d5/de/d15/da9/db1/dad/d37/d60/dbf/d7d/l10b 0 2026-03-09T16:15:29.976 INFO:tasks.workunit.client.1.vm05.stdout:3/847: creat d0/d9/d97/dac/d113/f125 x:0 0 0 2026-03-09T16:15:29.977 INFO:tasks.workunit.client.1.vm05.stdout:9/917: fsync d4/d119/fd4 0 2026-03-09T16:15:29.978 INFO:tasks.workunit.client.1.vm05.stdout:9/918: fsync d4/f2e 0 2026-03-09T16:15:29.978 INFO:tasks.workunit.client.1.vm05.stdout:7/917: creat d1/d2/d8/dc/d1b/d11d/f13e x:0 0 0 2026-03-09T16:15:29.979 INFO:tasks.workunit.client.1.vm05.stdout:3/848: write d0/da9/fff [569798,76270] 0 2026-03-09T16:15:29.982 INFO:tasks.workunit.client.1.vm05.stdout:5/956: mknod d8/d18/dbc/dcc/daa/d43/d119/c13d 0 2026-03-09T16:15:29.983 INFO:tasks.workunit.client.1.vm05.stdout:1/963: rename d7/dd/d21/d63/d71/ddc/df8/f110 to d7/dd/d21/d39/d87/f14a 0 2026-03-09T16:15:29.983 INFO:tasks.workunit.client.1.vm05.stdout:6/849: getdents d17/d22/d27/df8/d112 0 2026-03-09T16:15:29.984 INFO:tasks.workunit.client.1.vm05.stdout:7/918: truncate d1/d2/d8/dc/d1b/d30/d7d/d114/f92 289049 0 2026-03-09T16:15:29.987 INFO:tasks.workunit.client.1.vm05.stdout:5/957: symlink d8/d18/dbc/l13e 0 2026-03-09T16:15:29.988 INFO:tasks.workunit.client.1.vm05.stdout:6/850: mkdir d17/d22/d27/d58/db8/d141 0 2026-03-09T16:15:29.989 INFO:tasks.workunit.client.1.vm05.stdout:4/977: truncate d5/de/d15/f25 800495 0 2026-03-09T16:15:29.989 INFO:tasks.workunit.client.1.vm05.stdout:5/958: symlink d8/dc8/l13f 0 2026-03-09T16:15:29.990 INFO:tasks.workunit.client.1.vm05.stdout:5/959: stat d8/d18/dbc/dcc/daa/f9d 0 2026-03-09T16:15:29.991 INFO:tasks.workunit.client.1.vm05.stdout:7/919: rename d1/d2/d8/dc/d1b/d30/d4b/d65/db1/d116/f12f to d1/d2/d8/dc/d1b/de6/f13f 0 2026-03-09T16:15:29.996 INFO:tasks.workunit.client.1.vm05.stdout:8/935: dread d4/d6/d3a/d3c/f8d [0,4194304] 0 2026-03-09T16:15:29.998 INFO:tasks.workunit.client.1.vm05.stdout:4/978: dwrite d5/de/f16 [0,4194304] 0 2026-03-09T16:15:30.002 INFO:tasks.workunit.client.1.vm05.stdout:5/960: dwrite d8/f11 [4194304,4194304] 0 2026-03-09T16:15:30.010 INFO:tasks.workunit.client.1.vm05.stdout:9/919: dread d4/f6 [0,4194304] 0 2026-03-09T16:15:30.013 INFO:tasks.workunit.client.1.vm05.stdout:9/920: write d4/f4a [820963,71772] 0 2026-03-09T16:15:30.014 INFO:tasks.workunit.client.1.vm05.stdout:4/979: dwrite d5/de/d15/d21/d39/f42 [0,4194304] 0 2026-03-09T16:15:30.019 
INFO:tasks.workunit.client.1.vm05.stdout:2/807: write db/dd/d15/d46/fa6 [567924,23911] 0 2026-03-09T16:15:30.021 INFO:tasks.workunit.client.1.vm05.stdout:6/851: creat d17/d22/d9d/da5/d122/f142 x:0 0 0 2026-03-09T16:15:30.027 INFO:tasks.workunit.client.1.vm05.stdout:0/858: write d5/d11/f90 [281432,92087] 0 2026-03-09T16:15:30.030 INFO:tasks.workunit.client.1.vm05.stdout:7/920: symlink d1/d2/d11/d86/d8a/d91/l140 0 2026-03-09T16:15:30.030 INFO:tasks.workunit.client.1.vm05.stdout:8/936: creat d4/de9/d10c/f12f x:0 0 0 2026-03-09T16:15:30.031 INFO:tasks.workunit.client.1.vm05.stdout:5/961: unlink d8/d18/d1b/d47/d4e/d76/d8f/d12a/cf8 0 2026-03-09T16:15:30.032 INFO:tasks.workunit.client.1.vm05.stdout:5/962: write d8/d18/d1b/d47/f13b [4529233,98463] 0 2026-03-09T16:15:30.033 INFO:tasks.workunit.client.1.vm05.stdout:2/808: dread db/dd/d15/d1f/d20/fee [0,4194304] 0 2026-03-09T16:15:30.034 INFO:tasks.workunit.client.1.vm05.stdout:9/921: fsync d4/d10/d35/d36/d48/d54/fd9 0 2026-03-09T16:15:30.034 INFO:tasks.workunit.client.1.vm05.stdout:7/921: chown d1/d2/d8/dc/d1b/de6/f106 58006436 1 2026-03-09T16:15:30.037 INFO:tasks.workunit.client.1.vm05.stdout:4/980: chown d5/de/d15/d21/d27/f29 120 1 2026-03-09T16:15:30.039 INFO:tasks.workunit.client.1.vm05.stdout:4/981: read d5/de/d15/d21/d27/d3c/d5c/d5f/f112 [196684,122406] 0 2026-03-09T16:15:30.041 INFO:tasks.workunit.client.1.vm05.stdout:6/852: dread - d17/d22/d27/d8a/fd0 zero size 2026-03-09T16:15:30.044 INFO:tasks.workunit.client.1.vm05.stdout:1/964: dwrite d7/dd/d21/d39/d87/db9/fe1 [4194304,4194304] 0 2026-03-09T16:15:30.047 INFO:tasks.workunit.client.1.vm05.stdout:3/849: dwrite d0/f60 [0,4194304] 0 2026-03-09T16:15:30.048 INFO:tasks.workunit.client.1.vm05.stdout:5/963: dwrite d8/d18/d1b/f2d [0,4194304] 0 2026-03-09T16:15:30.051 INFO:tasks.workunit.client.1.vm05.stdout:5/964: write d8/d18/dbc/dcc/daa/f10c [405558,78921] 0 2026-03-09T16:15:30.055 INFO:tasks.workunit.client.1.vm05.stdout:2/809: rename db/dd/d15/d46/d8d to db/dd/d15/d3f/d5b/d60/d95/d109 0 2026-03-09T16:15:30.058 INFO:tasks.workunit.client.1.vm05.stdout:2/810: stat db/dd/d15/d1f/d20/d23/faf 0 2026-03-09T16:15:30.066 INFO:tasks.workunit.client.1.vm05.stdout:1/965: mknod d7/d62/d72/c14b 0 2026-03-09T16:15:30.069 INFO:tasks.workunit.client.1.vm05.stdout:5/965: truncate d8/d59/d5b/f66 3743553 0 2026-03-09T16:15:30.069 INFO:tasks.workunit.client.1.vm05.stdout:0/859: creat d5/d2c/d49/d83/d8b/dd5/d121/f125 x:0 0 0 2026-03-09T16:15:30.069 INFO:tasks.workunit.client.1.vm05.stdout:7/922: mknod d1/d2/d8/dc/d33/d100/c141 0 2026-03-09T16:15:30.071 INFO:tasks.workunit.client.1.vm05.stdout:7/923: fdatasync d1/d2/d8/dc/dd4/da8/f11c 0 2026-03-09T16:15:30.075 INFO:tasks.workunit.client.1.vm05.stdout:6/853: dread d17/d5d/f84 [0,4194304] 0 2026-03-09T16:15:30.075 INFO:tasks.workunit.client.1.vm05.stdout:7/924: read - d1/d2/d8/dc/d14/f132 zero size 2026-03-09T16:15:30.076 INFO:tasks.workunit.client.1.vm05.stdout:5/966: dwrite d8/d18/d1b/d78/d90/fc4 [4194304,4194304] 0 2026-03-09T16:15:30.082 INFO:tasks.workunit.client.1.vm05.stdout:1/966: symlink d7/d15/d6e/l14c 0 2026-03-09T16:15:30.093 INFO:tasks.workunit.client.1.vm05.stdout:7/925: dread - d1/d2/d11/d86/d8a/d91/f112 zero size 2026-03-09T16:15:30.095 INFO:tasks.workunit.client.1.vm05.stdout:8/937: getdents d4/d6/d3a 0 2026-03-09T16:15:30.096 INFO:tasks.workunit.client.1.vm05.stdout:6/854: dread - d17/d22/d27/d58/db8/ff3 zero size 2026-03-09T16:15:30.096 INFO:tasks.workunit.client.1.vm05.stdout:5/967: creat d8/d18/d1b/d47/d4e/d76/d11d/f140 x:0 0 0 
2026-03-09T16:15:30.098 INFO:tasks.workunit.client.1.vm05.stdout:6/855: mkdir d17/d22/d27/df8/d112/d143 0 2026-03-09T16:15:30.101 INFO:tasks.workunit.client.1.vm05.stdout:1/967: link d7/dd/d21/d44/c127 d7/dbe/dca/d133/dfe/c14d 0 2026-03-09T16:15:30.101 INFO:tasks.workunit.client.1.vm05.stdout:5/968: truncate d8/d18/dbc/dcc/f94 967209 0 2026-03-09T16:15:30.102 INFO:tasks.workunit.client.1.vm05.stdout:5/969: chown d8/d59/d5b/cba 32204031 1 2026-03-09T16:15:30.102 INFO:tasks.workunit.client.1.vm05.stdout:1/968: symlink d7/dd/d21/d63/d71/l14e 0 2026-03-09T16:15:30.104 INFO:tasks.workunit.client.1.vm05.stdout:5/970: rmdir d8/dd5 39 2026-03-09T16:15:30.104 INFO:tasks.workunit.client.1.vm05.stdout:8/938: link d4/d6/d3a/d40/f76 d4/d6/d3a/d7c/f130 0 2026-03-09T16:15:30.116 INFO:tasks.workunit.client.1.vm05.stdout:7/926: sync 2026-03-09T16:15:30.117 INFO:tasks.workunit.client.1.vm05.stdout:9/922: dread d4/d10/d35/d36/f85 [4194304,4194304] 0 2026-03-09T16:15:30.119 INFO:tasks.workunit.client.1.vm05.stdout:9/923: read - d4/d10/d35/d2b/d38/fda zero size 2026-03-09T16:15:30.122 INFO:tasks.workunit.client.1.vm05.stdout:9/924: chown d4/d10/d35/d36/d48/d54/d59/l79 251972 1 2026-03-09T16:15:30.133 INFO:tasks.workunit.client.1.vm05.stdout:7/927: dread d1/d2/d8/dc/d1b/d30/d4b/d65/f113 [0,4194304] 0 2026-03-09T16:15:30.133 INFO:tasks.workunit.client.1.vm05.stdout:8/939: link d4/d6/d3a/f49 d4/d6/f131 0 2026-03-09T16:15:30.133 INFO:tasks.workunit.client.1.vm05.stdout:9/925: creat d4/d10/d35/d36/d48/d60/dcb/f129 x:0 0 0 2026-03-09T16:15:30.133 INFO:tasks.workunit.client.1.vm05.stdout:8/940: truncate d4/d6/db/df/d4f/d9f/fbb 1010785 0 2026-03-09T16:15:30.133 INFO:tasks.workunit.client.1.vm05.stdout:8/941: chown d4/d6/d3a/d40 102 1 2026-03-09T16:15:30.160 INFO:tasks.workunit.client.1.vm05.stdout:3/850: write d0/d9/d22/d5f/d90/fa2 [5110308,18616] 0 2026-03-09T16:15:30.161 INFO:tasks.workunit.client.1.vm05.stdout:3/851: rename d0/d9/d22/d5f/dfd to d0/d9/d22/d5f/dfd/d126 22 2026-03-09T16:15:30.171 INFO:tasks.workunit.client.1.vm05.stdout:3/852: creat d0/d9/d22/d6b/f127 x:0 0 0 2026-03-09T16:15:30.174 INFO:tasks.workunit.client.1.vm05.stdout:2/811: dwrite db/dd/d15/d46/f4e [4194304,4194304] 0 2026-03-09T16:15:30.175 INFO:tasks.workunit.client.1.vm05.stdout:2/812: chown db/dd/d15/d3f/d5b/d60/da2 115760345 1 2026-03-09T16:15:30.177 INFO:tasks.workunit.client.1.vm05.stdout:4/982: dwrite d5/de/d15/d21/d27/d3c/d5c/da2/dc9/f11c [0,4194304] 0 2026-03-09T16:15:30.190 INFO:tasks.workunit.client.1.vm05.stdout:3/853: dwrite d0/da9/fe3 [0,4194304] 0 2026-03-09T16:15:30.192 INFO:tasks.workunit.client.1.vm05.stdout:2/813: creat db/dd/d15/d46/d67/f10a x:0 0 0 2026-03-09T16:15:30.192 INFO:tasks.workunit.client.1.vm05.stdout:0/860: write d5/d1b/d3b/f6f [3061984,12435] 0 2026-03-09T16:15:30.193 INFO:tasks.workunit.client.1.vm05.stdout:2/814: stat db/dd/d15/d46/d67 0 2026-03-09T16:15:30.196 INFO:tasks.workunit.client.1.vm05.stdout:2/815: dread - db/dd/d15/d3f/d5b/d60/d95/d109/fcf zero size 2026-03-09T16:15:30.196 INFO:tasks.workunit.client.1.vm05.stdout:2/816: write db/dd/d15/d3f/d5b/d60/d6a/f8a [1678698,69572] 0 2026-03-09T16:15:30.196 INFO:tasks.workunit.client.1.vm05.stdout:2/817: fsync db/dd/d15/d3f/f100 0 2026-03-09T16:15:30.196 INFO:tasks.workunit.client.1.vm05.stdout:2/818: readlink l4 0 2026-03-09T16:15:30.206 INFO:tasks.workunit.client.1.vm05.stdout:0/861: rmdir d5/db/d5f 39 2026-03-09T16:15:30.207 INFO:tasks.workunit.client.1.vm05.stdout:1/969: write d7/d15/d16/f13b [819601,83588] 0 2026-03-09T16:15:30.208 
INFO:tasks.workunit.client.1.vm05.stdout:5/971: write d8/d18/d1b/d78/f129 [533296,47282] 0 2026-03-09T16:15:30.208 INFO:tasks.workunit.client.1.vm05.stdout:6/856: dwrite d17/d22/d27/d58/db8/fcf [0,4194304] 0 2026-03-09T16:15:30.209 INFO:tasks.workunit.client.1.vm05.stdout:7/928: write d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/f88 [462520,34580] 0 2026-03-09T16:15:30.210 INFO:tasks.workunit.client.1.vm05.stdout:7/929: stat d1/d2/d11/d86/lca 0 2026-03-09T16:15:30.211 INFO:tasks.workunit.client.1.vm05.stdout:7/930: dread - d1/d2/d8/dc/d1b/d11d/f13e zero size 2026-03-09T16:15:30.214 INFO:tasks.workunit.client.1.vm05.stdout:1/970: fdatasync d7/d15/d16/f74 0 2026-03-09T16:15:30.214 INFO:tasks.workunit.client.1.vm05.stdout:6/857: mknod d17/d22/d27/d34/d42/d53/d9f/c144 0 2026-03-09T16:15:30.216 INFO:tasks.workunit.client.1.vm05.stdout:8/942: dwrite d4/d6/db/dc/d5d/f7a [0,4194304] 0 2026-03-09T16:15:30.218 INFO:tasks.workunit.client.1.vm05.stdout:2/819: sync 2026-03-09T16:15:30.223 INFO:tasks.workunit.client.1.vm05.stdout:9/926: dwrite d4/d10/d35/d2b/f45 [0,4194304] 0 2026-03-09T16:15:30.233 INFO:tasks.workunit.client.1.vm05.stdout:9/927: mkdir d4/d10/d35/d36/d48/d54/db0/d12a 0 2026-03-09T16:15:30.235 INFO:tasks.workunit.client.1.vm05.stdout:9/928: fdatasync d4/d10/d35/d36/d48/d54/db0/f110 0 2026-03-09T16:15:30.237 INFO:tasks.workunit.client.1.vm05.stdout:3/854: dread d0/d33/f64 [0,4194304] 0 2026-03-09T16:15:30.243 INFO:tasks.workunit.client.1.vm05.stdout:7/931: mknod d1/d2/d8/dc/d1b/d30/c142 0 2026-03-09T16:15:30.243 INFO:tasks.workunit.client.1.vm05.stdout:8/943: rename d4/d6/d3a/d15/f120 to d4/d6/f132 0 2026-03-09T16:15:30.243 INFO:tasks.workunit.client.1.vm05.stdout:9/929: dwrite d4/d10/f8d [0,4194304] 0 2026-03-09T16:15:30.246 INFO:tasks.workunit.client.1.vm05.stdout:5/972: getdents d8/d5e/d11b 0 2026-03-09T16:15:30.248 INFO:tasks.workunit.client.1.vm05.stdout:6/858: getdents d17/d22/d27/d8a/d8b 0 2026-03-09T16:15:30.265 INFO:tasks.workunit.client.1.vm05.stdout:5/973: creat d8/d18/dbc/dcc/daa/d43/f141 x:0 0 0 2026-03-09T16:15:30.266 INFO:tasks.workunit.client.1.vm05.stdout:3/855: dwrite d0/d9/d22/d5f/dfd/f110 [0,4194304] 0 2026-03-09T16:15:30.266 INFO:tasks.workunit.client.1.vm05.stdout:9/930: unlink d4/d10/d35/c1b 0 2026-03-09T16:15:30.266 INFO:tasks.workunit.client.1.vm05.stdout:6/859: mkdir d17/d5d/d73/d83/d145 0 2026-03-09T16:15:30.266 INFO:tasks.workunit.client.1.vm05.stdout:7/932: creat d1/d2/d11/d86/f143 x:0 0 0 2026-03-09T16:15:30.266 INFO:tasks.workunit.client.1.vm05.stdout:6/860: dread - d17/d22/d27/d34/d42/d53/d87/f121 zero size 2026-03-09T16:15:30.266 INFO:tasks.workunit.client.1.vm05.stdout:2/820: rename db/dd/d15/d1f/d20/d23/l1d to db/dd/d15/d3f/d5b/l10b 0 2026-03-09T16:15:30.266 INFO:tasks.workunit.client.1.vm05.stdout:5/974: truncate d8/d18/f3a 842984 0 2026-03-09T16:15:30.266 INFO:tasks.workunit.client.1.vm05.stdout:9/931: rmdir d4/d10/d35/d36/d48/d60/dcb/dd2 39 2026-03-09T16:15:30.267 INFO:tasks.workunit.client.1.vm05.stdout:8/944: rename d4/d6/d3a/d67/ff0 to d4/d6/db/dc/d2e/d85/dc7/dda/f133 0 2026-03-09T16:15:30.269 INFO:tasks.workunit.client.1.vm05.stdout:2/821: creat db/dd/d15/d4c/d56/f10c x:0 0 0 2026-03-09T16:15:30.269 INFO:tasks.workunit.client.1.vm05.stdout:6/861: truncate d17/d22/d27/dd8/f10b 208433 0 2026-03-09T16:15:30.270 INFO:tasks.workunit.client.1.vm05.stdout:3/856: creat d0/d9/d22/d5f/f128 x:0 0 0 2026-03-09T16:15:30.270 INFO:tasks.workunit.client.1.vm05.stdout:9/932: truncate d4/d10/d35/d36/d48/fb7 1160251 0 2026-03-09T16:15:30.272 
INFO:tasks.workunit.client.1.vm05.stdout:8/945: mkdir d4/d6/d53/d134 0
2026-03-09T16:15:30.275 INFO:tasks.workunit.client.1.vm05.stdout:3/857: read - d0/d9/d22/d5f/fd1 zero size
2026-03-09T16:15:30.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:29 vm05.local ceph-mon[58702]: from='client.24467 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T16:15:30.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:29 vm05.local ceph-mon[58702]: Upgrade: Updating prometheus.vm03
2026-03-09T16:15:30.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:29 vm05.local ceph-mon[58702]: Deploying daemon prometheus.vm03 on vm03
2026-03-09T16:15:30.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:29 vm05.local ceph-mon[58702]: pgmap v24: 65 pgs: 65 active+clean; 3.1 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 58 MiB/s rd, 166 MiB/s wr, 368 op/s
2026-03-09T16:15:30.281 INFO:tasks.workunit.client.1.vm05.stdout:8/946: mkdir d4/d6/db/da6/d135 0
2026-03-09T16:15:30.281 INFO:tasks.workunit.client.1.vm05.stdout:8/947: chown d4/d6/db/dc/d5d/l94 2 1
2026-03-09T16:15:30.281 INFO:tasks.workunit.client.1.vm05.stdout:6/862: symlink d17/d22/d27/d34/d42/d53/d87/d137/l146 0
2026-03-09T16:15:30.281 INFO:tasks.workunit.client.1.vm05.stdout:3/858: creat d0/d9/d97/dac/f129 x:0 0 0
2026-03-09T16:15:30.281 INFO:tasks.workunit.client.1.vm05.stdout:6/863: chown d17/ff9 25 1
2026-03-09T16:15:30.281 INFO:tasks.workunit.client.1.vm05.stdout:8/948: mknod d4/d6/db/d59/db0/dd6/c136 0
2026-03-09T16:15:30.282 INFO:tasks.workunit.client.1.vm05.stdout:7/933: read d1/d2/d8/d31/fc5 [716701,74286] 0
2026-03-09T16:15:30.285 INFO:tasks.workunit.client.1.vm05.stdout:8/949: fsync d4/d6/db/df/d10f/f124 0
2026-03-09T16:15:30.286 INFO:tasks.workunit.client.1.vm05.stdout:6/864: truncate d17/d22/d27/d34/d42/d53/ffd 255986 0
2026-03-09T16:15:30.288 INFO:tasks.workunit.client.1.vm05.stdout:6/865: link d17/d5d/d73/f9e d17/d22/d9d/da5/f147 0
2026-03-09T16:15:30.292 INFO:tasks.workunit.client.1.vm05.stdout:6/866: truncate d17/d22/d9d/da5/fd9 735697 0
2026-03-09T16:15:30.297 INFO:tasks.workunit.client.1.vm05.stdout:3/859: dread d0/d33/f7d [0,4194304] 0
2026-03-09T16:15:30.301 INFO:tasks.workunit.client.1.vm05.stdout:4/983: write d5/de/d15/d21/f50 [1437119,91981] 0
2026-03-09T16:15:30.307 INFO:tasks.workunit.client.1.vm05.stdout:4/984: mkdir d5/de/d15/d21/d27/d3c/d142/d15b 0
2026-03-09T16:15:30.319 INFO:tasks.workunit.client.1.vm05.stdout:4/985: creat d5/de/d15/da9/df6/f15c x:0 0 0
2026-03-09T16:15:30.319 INFO:tasks.workunit.client.1.vm05.stdout:0/862: dwrite d5/d1b/f50 [0,4194304] 0
2026-03-09T16:15:30.319 INFO:tasks.workunit.client.1.vm05.stdout:4/986: creat d5/de/d15/da9/db1/dad/d37/d60/dbf/f15d x:0 0 0
2026-03-09T16:15:30.319 INFO:tasks.workunit.client.1.vm05.stdout:4/987: write d5/de/d15/da9/fc6 [382484,53397] 0
2026-03-09T16:15:30.326 INFO:tasks.workunit.client.1.vm05.stdout:4/988: truncate d5/de/d15/d21/d27/d3c/d5c/ff1 5194607 0
2026-03-09T16:15:30.327 INFO:tasks.workunit.client.1.vm05.stdout:6/867: dread d17/d22/d9d/fb2 [0,4194304] 0
2026-03-09T16:15:30.327 INFO:tasks.workunit.client.1.vm05.stdout:4/989: truncate d5/d116/f150 242744 0
2026-03-09T16:15:30.328 INFO:tasks.workunit.client.1.vm05.stdout:0/863: symlink d5/d1b/d30/d107/l126 0
2026-03-09T16:15:30.329 INFO:tasks.workunit.client.1.vm05.stdout:0/864: chown d5/db 23994 1
2026-03-09T16:15:30.329 INFO:tasks.workunit.client.1.vm05.stdout:6/868: symlink d17/d22/dce/l148 0
2026-03-09T16:15:30.332 INFO:tasks.workunit.client.1.vm05.stdout:4/990: dread d5/de/f16 [0,4194304] 0
2026-03-09T16:15:30.333 INFO:tasks.workunit.client.1.vm05.stdout:6/869: fdatasync d17/d5d/d73/f9e 0
2026-03-09T16:15:30.344 INFO:tasks.workunit.client.1.vm05.stdout:1/971: write d7/d15/d16/f66 [2562200,124533] 0
2026-03-09T16:15:30.345 INFO:tasks.workunit.client.1.vm05.stdout:5/975: write d8/d53/d7a/f92 [1872083,55770] 0
2026-03-09T16:15:30.358 INFO:tasks.workunit.client.1.vm05.stdout:8/950: write d4/d6/f29 [189019,65311] 0
2026-03-09T16:15:30.359 INFO:tasks.workunit.client.1.vm05.stdout:3/860: write d0/d9/d97/dc2/ffc [940100,63330] 0
2026-03-09T16:15:30.362 INFO:tasks.workunit.client.1.vm05.stdout:6/870: sync
2026-03-09T16:15:30.367 INFO:tasks.workunit.client.1.vm05.stdout:7/934: dwrite d1/d2/d8/dc/d1b/f62 [8388608,4194304] 0
2026-03-09T16:15:30.369 INFO:tasks.workunit.client.1.vm05.stdout:2/822: dwrite db/dd/d15/d1f/d20/d23/faf [4194304,4194304] 0
2026-03-09T16:15:30.373 INFO:tasks.workunit.client.1.vm05.stdout:7/935: fsync d1/d2/d8/dc/d1b/d71/f46 0
2026-03-09T16:15:30.374 INFO:tasks.workunit.client.1.vm05.stdout:9/933: dwrite d4/d10c/f9b [0,4194304] 0
2026-03-09T16:15:30.374 INFO:tasks.workunit.client.1.vm05.stdout:2/823: sync
2026-03-09T16:15:30.378 INFO:tasks.workunit.client.1.vm05.stdout:0/865: dwrite d5/d2c/f84 [0,4194304] 0
2026-03-09T16:15:30.380 INFO:tasks.workunit.client.1.vm05.stdout:7/936: sync
2026-03-09T16:15:30.382 INFO:tasks.workunit.client.1.vm05.stdout:6/871: mknod d17/d22/d27/d34/d42/d53/c149 0
2026-03-09T16:15:30.386 INFO:tasks.workunit.client.1.vm05.stdout:4/991: dwrite d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d103/fd1 [4194304,4194304] 0
2026-03-09T16:15:30.390 INFO:tasks.workunit.client.1.vm05.stdout:1/972: getdents d7/dd/d21/d39/d48/d5d 0
2026-03-09T16:15:30.393 INFO:tasks.workunit.client.1.vm05.stdout:0/866: creat d5/d97/f127 x:0 0 0
2026-03-09T16:15:30.393 INFO:tasks.workunit.client.1.vm05.stdout:7/937: mkdir d1/d2/d8/dc/dd4/da8/d144 0
2026-03-09T16:15:30.393 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:29 vm03.local ceph-mon[51019]: from='client.24467 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T16:15:30.394 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:29 vm03.local ceph-mon[51019]: Upgrade: Updating prometheus.vm03
2026-03-09T16:15:30.394 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:29 vm03.local ceph-mon[51019]: Deploying daemon prometheus.vm03 on vm03
2026-03-09T16:15:30.394 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:29 vm03.local ceph-mon[51019]: pgmap v24: 65 pgs: 65 active+clean; 3.1 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 58 MiB/s rd, 166 MiB/s wr, 368 op/s
2026-03-09T16:15:30.394 INFO:tasks.workunit.client.1.vm05.stdout:2/824: mknod db/dd/d15/d3f/d5b/d60/d95/d109/dcd/c10d 0
2026-03-09T16:15:30.399 INFO:tasks.workunit.client.1.vm05.stdout:2/825: chown db/dd/d15/d46/df3 2533563 1
2026-03-09T16:15:30.400 INFO:tasks.workunit.client.1.vm05.stdout:7/938: dread - d1/d2/d8/dc/d1b/d30/d4b/d65/db1/d108/f10a zero size
2026-03-09T16:15:30.401 INFO:tasks.workunit.client.1.vm05.stdout:7/939: dread - d1/d2/d11/d86/d8a/d91/f112 zero size
2026-03-09T16:15:30.402 INFO:tasks.workunit.client.1.vm05.stdout:4/992: rename d5/d9c/d124/f14e to d5/de/d15/d21/d27/d3c/d5c/d5f/db6/f15e 0
2026-03-09T16:15:30.402 INFO:tasks.workunit.client.1.vm05.stdout:1/973: rmdir d7/dbe/ded 39
2026-03-09T16:15:30.404 INFO:tasks.workunit.client.1.vm05.stdout:2/826: fsync db/dd/d15/f48 0
2026-03-09T16:15:30.408
INFO:tasks.workunit.client.1.vm05.stdout:3/861: link d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/db3/lbd d0/d33/d10a/l12a 0 2026-03-09T16:15:30.420 INFO:tasks.workunit.client.1.vm05.stdout:6/872: creat d17/d22/f14a x:0 0 0 2026-03-09T16:15:30.421 INFO:tasks.workunit.client.1.vm05.stdout:0/867: truncate d5/d11/d4f/d70/fbf 290189 0 2026-03-09T16:15:30.421 INFO:tasks.workunit.client.1.vm05.stdout:5/976: dwrite d8/d18/d1b/d6b/f113 [4194304,4194304] 0 2026-03-09T16:15:30.421 INFO:tasks.workunit.client.1.vm05.stdout:4/993: read - d5/de/d15/d21/d27/d3c/d142/f11f zero size 2026-03-09T16:15:30.421 INFO:tasks.workunit.client.1.vm05.stdout:7/940: fsync d1/d2/d8/dc/d9c/f6b 0 2026-03-09T16:15:30.429 INFO:tasks.workunit.client.1.vm05.stdout:7/941: read d1/d2/d8/dc/d9c/f6b [1982898,19631] 0 2026-03-09T16:15:30.431 INFO:tasks.workunit.client.1.vm05.stdout:2/827: dread db/dd/d15/d3f/d5b/d60/d95/d109/dcd/fed [0,4194304] 0 2026-03-09T16:15:30.435 INFO:tasks.workunit.client.1.vm05.stdout:0/868: read d5/d2c/f7f [1331080,93794] 0 2026-03-09T16:15:30.436 INFO:tasks.workunit.client.1.vm05.stdout:1/974: rename d7/d62/db6/c13e to d7/dd/d21/d63/d71/ddc/df8/c14f 0 2026-03-09T16:15:30.437 INFO:tasks.workunit.client.1.vm05.stdout:9/934: creat d4/d10/d35/d2b/d31/d82/df8/f12b x:0 0 0 2026-03-09T16:15:30.440 INFO:tasks.workunit.client.1.vm05.stdout:0/869: rmdir d5/d11/d4f 39 2026-03-09T16:15:30.441 INFO:tasks.workunit.client.1.vm05.stdout:9/935: creat d4/d119/f12c x:0 0 0 2026-03-09T16:15:30.445 INFO:tasks.workunit.client.1.vm05.stdout:7/942: symlink d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/ddb/d107/l145 0 2026-03-09T16:15:30.445 INFO:tasks.workunit.client.1.vm05.stdout:2/828: mknod db/dd/d15/c10e 0 2026-03-09T16:15:30.445 INFO:tasks.workunit.client.1.vm05.stdout:1/975: symlink d7/dd/de/d52/l150 0 2026-03-09T16:15:30.447 INFO:tasks.workunit.client.1.vm05.stdout:4/994: rename d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d103/le7 to d5/de/d15/d21/d27/d3c/d12f/l15f 0 2026-03-09T16:15:30.447 INFO:tasks.workunit.client.1.vm05.stdout:5/977: read d8/d18/dbc/dcc/daa/fa5 [3545201,14972] 0 2026-03-09T16:15:30.448 INFO:tasks.workunit.client.1.vm05.stdout:1/976: mkdir d7/d62/db6/d151 0 2026-03-09T16:15:30.449 INFO:tasks.workunit.client.1.vm05.stdout:9/936: mkdir d4/d10/d35/d2b/d38/d65/dd6/d12d 0 2026-03-09T16:15:30.451 INFO:tasks.workunit.client.1.vm05.stdout:0/870: getdents d5/d2c/d49/d83 0 2026-03-09T16:15:30.453 INFO:tasks.workunit.client.1.vm05.stdout:4/995: dread d5/de/d15/d21/d27/d3c/d5c/d5f/d4e/d103/fd1 [4194304,4194304] 0 2026-03-09T16:15:30.456 INFO:tasks.workunit.client.1.vm05.stdout:2/829: mkdir db/dd/d15/d3f/d5b/d60/d95/de7/d10f 0 2026-03-09T16:15:30.460 INFO:tasks.workunit.client.1.vm05.stdout:8/951: write d4/d6/d3a/f49 [1675270,93600] 0 2026-03-09T16:15:30.460 INFO:tasks.workunit.client.1.vm05.stdout:5/978: dwrite d8/d18/d1b/d6b/f113 [0,4194304] 0 2026-03-09T16:15:30.464 INFO:tasks.workunit.client.1.vm05.stdout:7/943: sync 2026-03-09T16:15:30.466 INFO:tasks.workunit.client.1.vm05.stdout:4/996: creat d5/de/d15/da9/db1/dad/d37/dfb/f160 x:0 0 0 2026-03-09T16:15:30.466 INFO:tasks.workunit.client.1.vm05.stdout:5/979: dread d8/d18/d1b/d47/d4e/ff1 [0,4194304] 0 2026-03-09T16:15:30.469 INFO:tasks.workunit.client.1.vm05.stdout:1/977: dread d7/dd/d21/d39/d87/db9/d138/ff5 [0,4194304] 0 2026-03-09T16:15:30.474 INFO:tasks.workunit.client.1.vm05.stdout:4/997: fdatasync d5/de/d15/d21/d27/d3c/d5c/f107 0 2026-03-09T16:15:30.475 INFO:tasks.workunit.client.1.vm05.stdout:4/998: write d5/de/d15/d21/d27/d3c/d5c/dfc/f141 [894083,18957] 0 2026-03-09T16:15:30.477 
INFO:tasks.workunit.client.1.vm05.stdout:4/999: write d5/de/d15/da9/db1/dad/d37/dfb/f160 [768005,123237] 0 2026-03-09T16:15:30.478 INFO:tasks.workunit.client.1.vm05.stdout:1/978: read d7/dd/d21/d39/d87/f14a [203661,7081] 0 2026-03-09T16:15:30.479 INFO:tasks.workunit.client.1.vm05.stdout:2/830: creat db/dd/d15/d1f/d20/f110 x:0 0 0 2026-03-09T16:15:30.479 INFO:tasks.workunit.client.1.vm05.stdout:5/980: mknod d8/d18/d1b/d47/d48/d73/d80/de4/d133/c142 0 2026-03-09T16:15:30.481 INFO:tasks.workunit.client.1.vm05.stdout:2/831: stat db/dd/d15/d1f/c83 0 2026-03-09T16:15:30.482 INFO:tasks.workunit.client.1.vm05.stdout:5/981: dread d8/d18/dbc/dcc/daa/f35 [0,4194304] 0 2026-03-09T16:15:30.487 INFO:tasks.workunit.client.1.vm05.stdout:1/979: link d7/dd/d21/d63/le9 d7/d15/d6e/dbc/l152 0 2026-03-09T16:15:30.487 INFO:tasks.workunit.client.1.vm05.stdout:3/862: write d0/d9/d22/d5f/fd1 [415452,20748] 0 2026-03-09T16:15:30.492 INFO:tasks.workunit.client.1.vm05.stdout:6/873: dwrite d17/f95 [0,4194304] 0 2026-03-09T16:15:30.497 INFO:tasks.workunit.client.1.vm05.stdout:1/980: symlink d7/dd/d21/d63/d71/ddc/df8/d149/l153 0 2026-03-09T16:15:30.497 INFO:tasks.workunit.client.1.vm05.stdout:9/937: write d4/d10/d35/ff1 [901221,7762] 0 2026-03-09T16:15:30.497 INFO:tasks.workunit.client.1.vm05.stdout:0/871: write d5/d11/d4f/d68/fcb [48823,129504] 0 2026-03-09T16:15:30.497 INFO:tasks.workunit.client.1.vm05.stdout:7/944: write d1/d2/d8/dc/d14/fe4 [951399,92789] 0 2026-03-09T16:15:30.502 INFO:tasks.workunit.client.1.vm05.stdout:8/952: write d4/d6/db/df/d80/f9c [1141702,15177] 0 2026-03-09T16:15:30.502 INFO:tasks.workunit.client.1.vm05.stdout:3/863: unlink d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/db3/lbd 0 2026-03-09T16:15:30.509 INFO:tasks.workunit.client.1.vm05.stdout:6/874: creat d17/d22/d27/d34/d42/d53/d87/df6/f14b x:0 0 0 2026-03-09T16:15:30.509 INFO:tasks.workunit.client.1.vm05.stdout:1/981: mknod d7/dd/d21/d63/d71/c154 0 2026-03-09T16:15:30.510 INFO:tasks.workunit.client.1.vm05.stdout:7/945: creat d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/f146 x:0 0 0 2026-03-09T16:15:30.512 INFO:tasks.workunit.client.1.vm05.stdout:1/982: symlink d7/dd/d21/d44/dcc/l155 0 2026-03-09T16:15:30.516 INFO:tasks.workunit.client.1.vm05.stdout:7/946: mkdir d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/ddb/d147 0 2026-03-09T16:15:30.521 INFO:tasks.workunit.client.1.vm05.stdout:3/864: rename d0/d33/d10a to d0/d9/d22/d12b 0 2026-03-09T16:15:30.522 INFO:tasks.workunit.client.1.vm05.stdout:0/872: rmdir d5/db/d5b/de9 0 2026-03-09T16:15:30.522 INFO:tasks.workunit.client.1.vm05.stdout:1/983: mkdir d7/dbe/dca/d133/dfe/d156 0 2026-03-09T16:15:30.524 INFO:tasks.workunit.client.1.vm05.stdout:8/953: rename d4/d6/db/dc/d2e/d85/dc7 to d4/d6/db/df/d80/d137 0 2026-03-09T16:15:30.526 INFO:tasks.workunit.client.1.vm05.stdout:2/832: dread db/dd/d15/d46/f4e [0,4194304] 0 2026-03-09T16:15:30.526 INFO:tasks.workunit.client.1.vm05.stdout:5/982: truncate d8/d18/d1b/d47/f13b 2501870 0 2026-03-09T16:15:30.526 INFO:tasks.workunit.client.1.vm05.stdout:2/833: dread - db/dd/d15/d3f/f100 zero size 2026-03-09T16:15:30.527 INFO:tasks.workunit.client.1.vm05.stdout:2/834: dread - db/fa5 zero size 2026-03-09T16:15:30.528 INFO:tasks.workunit.client.1.vm05.stdout:2/835: read - db/dd/d15/d1f/d20/d23/f7a zero size 2026-03-09T16:15:30.528 INFO:tasks.workunit.client.1.vm05.stdout:2/836: fdatasync db/ffa 0 2026-03-09T16:15:30.532 INFO:tasks.workunit.client.1.vm05.stdout:6/875: dwrite d17/d5d/d73/d83/f9a [0,4194304] 0 2026-03-09T16:15:30.535 INFO:tasks.workunit.client.1.vm05.stdout:1/984: read d7/dbe/dca/f99 
[4065016,110457] 0 2026-03-09T16:15:30.537 INFO:tasks.workunit.client.1.vm05.stdout:8/954: symlink d4/d6/d3a/d40/d71/l138 0 2026-03-09T16:15:30.538 INFO:tasks.workunit.client.1.vm05.stdout:8/955: chown d4/d6/db/d59/db0/f114 13496 1 2026-03-09T16:15:30.541 INFO:tasks.workunit.client.1.vm05.stdout:5/983: mknod d8/d18/d1b/d47/d4e/d76/d8f/c143 0 2026-03-09T16:15:30.543 INFO:tasks.workunit.client.1.vm05.stdout:3/865: mkdir d0/d9/d22/d12c 0 2026-03-09T16:15:30.546 INFO:tasks.workunit.client.1.vm05.stdout:0/873: mknod d5/d11/d4f/ddc/d10a/c128 0 2026-03-09T16:15:30.546 INFO:tasks.workunit.client.1.vm05.stdout:5/984: dwrite d8/f11 [0,4194304] 0 2026-03-09T16:15:30.554 INFO:tasks.workunit.client.1.vm05.stdout:6/876: symlink d17/d22/d27/d44/d125/l14c 0 2026-03-09T16:15:30.554 INFO:tasks.workunit.client.1.vm05.stdout:5/985: write d8/d18/d1b/d47/f4c [5229638,11243] 0 2026-03-09T16:15:30.554 INFO:tasks.workunit.client.1.vm05.stdout:5/986: truncate d8/d1d/f125 1004298 0 2026-03-09T16:15:30.558 INFO:tasks.workunit.client.1.vm05.stdout:2/837: symlink db/dd/d15/d1f/d20/l111 0 2026-03-09T16:15:30.558 INFO:tasks.workunit.client.1.vm05.stdout:3/866: creat d0/d9/d22/d5f/d75/d76/d88/d89/f12d x:0 0 0 2026-03-09T16:15:30.559 INFO:tasks.workunit.client.1.vm05.stdout:9/938: rename d4/d10/d35/d36/d48/f68 to d4/d10/d35/f12e 0 2026-03-09T16:15:30.559 INFO:tasks.workunit.client.1.vm05.stdout:0/874: dread - d5/db/d5f/deb/f104 zero size 2026-03-09T16:15:30.560 INFO:tasks.workunit.client.1.vm05.stdout:6/877: rmdir d17/d22/d27/d8a/d8b 39 2026-03-09T16:15:30.561 INFO:tasks.workunit.client.1.vm05.stdout:5/987: mknod d8/d95/c144 0 2026-03-09T16:15:30.561 INFO:tasks.workunit.client.1.vm05.stdout:5/988: dread - d8/d18/d1b/f13a zero size 2026-03-09T16:15:30.563 INFO:tasks.workunit.client.1.vm05.stdout:2/838: unlink db/dd/d15/d4c/fe2 0 2026-03-09T16:15:30.565 INFO:tasks.workunit.client.1.vm05.stdout:6/878: fdatasync d17/d5d/d73/f9e 0 2026-03-09T16:15:30.566 INFO:tasks.workunit.client.1.vm05.stdout:5/989: symlink d8/d18/dbc/l145 0 2026-03-09T16:15:30.568 INFO:tasks.workunit.client.1.vm05.stdout:5/990: chown d8/d18/d1b/f32 3074882 1 2026-03-09T16:15:30.570 INFO:tasks.workunit.client.1.vm05.stdout:2/839: creat db/dd/d15/d3f/d5b/d60/d95/d109/f112 x:0 0 0 2026-03-09T16:15:30.571 INFO:tasks.workunit.client.1.vm05.stdout:5/991: truncate d8/d18/d1b/d47/d48/d73/d80/fe5 8934052 0 2026-03-09T16:15:30.580 INFO:tasks.workunit.client.1.vm05.stdout:8/956: getdents d4/d6/db/dc/d5d/d79 0 2026-03-09T16:15:30.583 INFO:tasks.workunit.client.1.vm05.stdout:1/985: dwrite d7/dd/d21/d39/d87/db9/d138/f65 [0,4194304] 0 2026-03-09T16:15:30.585 INFO:tasks.workunit.client.1.vm05.stdout:6/879: write d17/d22/d9d/da9/ff0 [711977,127336] 0 2026-03-09T16:15:30.593 INFO:tasks.workunit.client.1.vm05.stdout:7/947: rename d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/c7b to d1/d2/d8/d31/d8d/c148 0 2026-03-09T16:15:30.598 INFO:tasks.workunit.client.1.vm05.stdout:7/948: dwrite d1/d2/d8/dc/d33/f9d [0,4194304] 0 2026-03-09T16:15:30.599 INFO:tasks.workunit.client.1.vm05.stdout:7/949: write d1/d2/d8/d31/d8d/f6f [1275676,124489] 0 2026-03-09T16:15:30.602 INFO:tasks.workunit.client.1.vm05.stdout:8/957: symlink d4/d6/d3a/d15/l139 0 2026-03-09T16:15:30.603 INFO:tasks.workunit.client.1.vm05.stdout:8/958: dread - d4/d6/d3a/d15/f65 zero size 2026-03-09T16:15:30.607 INFO:tasks.workunit.client.1.vm05.stdout:9/939: creat d4/d10/d35/d36/f12f x:0 0 0 2026-03-09T16:15:30.608 INFO:tasks.workunit.client.1.vm05.stdout:8/959: fdatasync d4/d6/d3a/d15/f93 0 2026-03-09T16:15:30.610 
INFO:tasks.workunit.client.1.vm05.stdout:0/875: rename d5/d1b/ffe to d5/db/d5b/d82/f129 0 2026-03-09T16:15:30.612 INFO:tasks.workunit.client.1.vm05.stdout:1/986: creat d7/dd/de/d96/f157 x:0 0 0 2026-03-09T16:15:30.613 INFO:tasks.workunit.client.1.vm05.stdout:9/940: creat d4/d119/f130 x:0 0 0 2026-03-09T16:15:30.614 INFO:tasks.workunit.client.1.vm05.stdout:8/960: mkdir d4/d6/db/dc/d5d/da0/dd7/d13a 0 2026-03-09T16:15:30.617 INFO:tasks.workunit.client.1.vm05.stdout:3/867: rename d0/d9/d22/d5f/d75/d76/d122 to d0/d9/d22/d5f/d75/d76/d88/d10c/d12e 0 2026-03-09T16:15:30.618 INFO:tasks.workunit.client.1.vm05.stdout:0/876: dwrite d5/d2c/dff/f59 [0,4194304] 0 2026-03-09T16:15:30.619 INFO:tasks.workunit.client.1.vm05.stdout:1/987: dread d7/dd/de/d52/f58 [0,4194304] 0 2026-03-09T16:15:30.622 INFO:tasks.workunit.client.1.vm05.stdout:3/868: write d0/d33/f64 [342963,2543] 0 2026-03-09T16:15:30.631 INFO:tasks.workunit.client.1.vm05.stdout:2/840: rename db/dd/d15/c2c to db/dd/d98/c113 0 2026-03-09T16:15:30.637 INFO:tasks.workunit.client.1.vm05.stdout:5/992: dwrite d8/d18/dbc/dcc/daa/f52 [4194304,4194304] 0 2026-03-09T16:15:30.647 INFO:tasks.workunit.client.1.vm05.stdout:0/877: unlink d5/c15 0 2026-03-09T16:15:30.647 INFO:tasks.workunit.client.1.vm05.stdout:6/880: dread d17/d22/d27/d44/f48 [0,4194304] 0 2026-03-09T16:15:30.648 INFO:tasks.workunit.client.1.vm05.stdout:9/941: mknod d4/d10/d35/d2b/d31/d82/dc5/c131 0 2026-03-09T16:15:30.648 INFO:tasks.workunit.client.1.vm05.stdout:1/988: creat d7/dbe/dca/d133/dfe/d156/f158 x:0 0 0 2026-03-09T16:15:30.648 INFO:tasks.workunit.client.1.vm05.stdout:7/950: rename d1/d2/d11/d86/d8a/l10f to d1/d2/d11/d86/da2/db6/l149 0 2026-03-09T16:15:30.648 INFO:tasks.workunit.client.1.vm05.stdout:3/869: mknod d0/d9/d97/c12f 0 2026-03-09T16:15:30.648 INFO:tasks.workunit.client.1.vm05.stdout:1/989: write d7/dd/d21/d63/d71/ddc/df8/f12d [1094555,77717] 0 2026-03-09T16:15:30.648 INFO:tasks.workunit.client.1.vm05.stdout:3/870: stat d0/d9/d22/d5f/f11c 0 2026-03-09T16:15:30.651 INFO:tasks.workunit.client.1.vm05.stdout:8/961: rename d4/d6/db/df/dd1 to d4/d6/d3a/d40/d6a/d97/d13b 0 2026-03-09T16:15:30.653 INFO:tasks.workunit.client.1.vm05.stdout:9/942: fdatasync d4/d10/d35/d2b/d38/d65/dd6/de3/f93 0 2026-03-09T16:15:30.657 INFO:tasks.workunit.client.1.vm05.stdout:0/878: mkdir d5/db/d77/d12a 0 2026-03-09T16:15:30.658 INFO:tasks.workunit.client.1.vm05.stdout:7/951: creat d1/d2/d8/dc/d1b/d30/d4b/d65/db1/d108/f14a x:0 0 0 2026-03-09T16:15:30.659 INFO:tasks.workunit.client.1.vm05.stdout:6/881: creat d17/d22/d27/d34/d4b/f14d x:0 0 0 2026-03-09T16:15:30.661 INFO:tasks.workunit.client.1.vm05.stdout:6/882: mknod d17/d22/d27/df8/c14e 0 2026-03-09T16:15:30.663 INFO:tasks.workunit.client.1.vm05.stdout:1/990: dwrite d7/d27/f148 [0,4194304] 0 2026-03-09T16:15:30.672 INFO:tasks.workunit.client.1.vm05.stdout:3/871: truncate d0/d9/d22/d12b/f121 214169 0 2026-03-09T16:15:30.672 INFO:tasks.workunit.client.1.vm05.stdout:9/943: read d4/d10/d35/d36/d48/fb8 [243574,42833] 0 2026-03-09T16:15:30.672 INFO:tasks.workunit.client.1.vm05.stdout:6/883: mknod d17/d22/d27/d34/d42/d53/d87/c14f 0 2026-03-09T16:15:30.672 INFO:tasks.workunit.client.1.vm05.stdout:3/872: chown d0/d9/d22/d5f/d75/d76/d88/da3/df7/l7a 4585852 1 2026-03-09T16:15:30.672 INFO:tasks.workunit.client.1.vm05.stdout:1/991: creat d7/d15/d16/dc2/f159 x:0 0 0 2026-03-09T16:15:30.672 INFO:tasks.workunit.client.1.vm05.stdout:6/884: stat d17/d22/d27/d34/d42/d65/lbc 0 2026-03-09T16:15:30.672 INFO:tasks.workunit.client.1.vm05.stdout:3/873: stat d0/d9 0 
2026-03-09T16:15:30.672 INFO:tasks.workunit.client.1.vm05.stdout:7/952: mkdir d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/d14b 0 2026-03-09T16:15:30.673 INFO:tasks.workunit.client.1.vm05.stdout:5/993: sync 2026-03-09T16:15:30.673 INFO:tasks.workunit.client.1.vm05.stdout:6/885: chown d17/d22/d9d/da9/d128/l13f 15 1 2026-03-09T16:15:30.674 INFO:tasks.workunit.client.1.vm05.stdout:9/944: unlink d4/d10/d35/d2b/d38/fa6 0 2026-03-09T16:15:30.674 INFO:tasks.workunit.client.1.vm05.stdout:5/994: dread f5 [0,4194304] 0 2026-03-09T16:15:30.675 INFO:tasks.workunit.client.1.vm05.stdout:3/874: rmdir d0/dce 39 2026-03-09T16:15:30.676 INFO:tasks.workunit.client.1.vm05.stdout:7/953: symlink d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/ddb/d107/l14c 0 2026-03-09T16:15:30.677 INFO:tasks.workunit.client.1.vm05.stdout:6/886: mkdir d17/d5d/d73/d150 0 2026-03-09T16:15:30.679 INFO:tasks.workunit.client.1.vm05.stdout:5/995: write d8/d18/d1b/d6b/f113 [4171372,101494] 0 2026-03-09T16:15:30.679 INFO:tasks.workunit.client.1.vm05.stdout:3/875: unlink d0/d9/d22/d5f/d75/d76/d88/da3/df7/l4f 0 2026-03-09T16:15:30.683 INFO:tasks.workunit.client.1.vm05.stdout:6/887: truncate d17/d22/d27/d34/dd1/d10a/f135 681529 0 2026-03-09T16:15:30.683 INFO:tasks.workunit.client.1.vm05.stdout:3/876: truncate d0/d33/f77 4345543 0 2026-03-09T16:15:30.685 INFO:tasks.workunit.client.1.vm05.stdout:1/992: dread d7/dd/d21/d63/d71/f7b [0,4194304] 0 2026-03-09T16:15:30.685 INFO:tasks.workunit.client.1.vm05.stdout:6/888: write d17/d22/d27/d34/d42/d53/d87/df6/f14b [891035,21030] 0 2026-03-09T16:15:30.688 INFO:tasks.workunit.client.1.vm05.stdout:2/841: write db/dd/d15/d1f/d21/d87/f99 [1973079,66880] 0 2026-03-09T16:15:30.688 INFO:tasks.workunit.client.1.vm05.stdout:7/954: mknod d1/d2/d11/d86/c14d 0 2026-03-09T16:15:30.689 INFO:tasks.workunit.client.1.vm05.stdout:6/889: chown d17/d22/d27/df8/d112/d143 106709900 1 2026-03-09T16:15:30.689 INFO:tasks.workunit.client.1.vm05.stdout:5/996: symlink d8/d18/d1b/d78/l146 0 2026-03-09T16:15:30.689 INFO:tasks.workunit.client.1.vm05.stdout:9/945: sync 2026-03-09T16:15:30.691 INFO:tasks.workunit.client.1.vm05.stdout:6/890: write d17/f95 [1481138,104866] 0 2026-03-09T16:15:30.694 INFO:tasks.workunit.client.1.vm05.stdout:3/877: sync 2026-03-09T16:15:30.694 INFO:tasks.workunit.client.1.vm05.stdout:6/891: truncate d17/f4e 5171254 0 2026-03-09T16:15:30.703 INFO:tasks.workunit.client.1.vm05.stdout:3/878: write d0/d9/d8b/fc7 [1089864,5403] 0 2026-03-09T16:15:30.705 INFO:tasks.workunit.client.1.vm05.stdout:9/946: rmdir d4/d119 39 2026-03-09T16:15:30.706 INFO:tasks.workunit.client.1.vm05.stdout:6/892: stat d17/d22/d27/d44/d125 0 2026-03-09T16:15:30.709 INFO:tasks.workunit.client.1.vm05.stdout:0/879: dwrite d5/d2c/dff/f60 [4194304,4194304] 0 2026-03-09T16:15:30.710 INFO:tasks.workunit.client.1.vm05.stdout:8/962: dwrite f0 [0,4194304] 0 2026-03-09T16:15:30.714 INFO:tasks.workunit.client.1.vm05.stdout:2/842: dwrite db/dd/d15/d1f/d20/f3d [0,4194304] 0 2026-03-09T16:15:30.716 INFO:tasks.workunit.client.1.vm05.stdout:7/955: mknod d1/d2/d8/dc/d1b/d71/d3c/d11f/c14e 0 2026-03-09T16:15:30.717 INFO:tasks.workunit.client.1.vm05.stdout:5/997: dread - d8/d18/d1b/d47/d48/d73/dfb/f109 zero size 2026-03-09T16:15:30.718 INFO:tasks.workunit.client.1.vm05.stdout:1/993: creat d7/d62/d72/dbf/f15a x:0 0 0 2026-03-09T16:15:30.723 INFO:tasks.workunit.client.1.vm05.stdout:6/893: dread d17/d5d/f84 [0,4194304] 0 2026-03-09T16:15:30.730 INFO:tasks.workunit.client.1.vm05.stdout:3/879: truncate d0/d9/d22/d5f/d7b/fb7 810309 0 2026-03-09T16:15:30.730 
INFO:tasks.workunit.client.1.vm05.stdout:0/880: symlink d5/d109/l12b 0 2026-03-09T16:15:30.733 INFO:tasks.workunit.client.1.vm05.stdout:3/880: chown d0/d9/d22/d5f/d75/d76/l8e 1502713 1 2026-03-09T16:15:30.739 INFO:tasks.workunit.client.1.vm05.stdout:7/956: dread - d1/d2/d8/dc/d14/f120 zero size 2026-03-09T16:15:30.740 INFO:tasks.workunit.client.1.vm05.stdout:3/881: dread - d0/d9/d97/dac/f129 zero size 2026-03-09T16:15:30.740 INFO:tasks.workunit.client.1.vm05.stdout:9/947: truncate d4/d10/d35/d36/d48/fb7 1007396 0 2026-03-09T16:15:30.741 INFO:tasks.workunit.client.1.vm05.stdout:9/948: stat d4/d10/d35/d36/fce 0 2026-03-09T16:15:30.742 INFO:tasks.workunit.client.1.vm05.stdout:0/881: fsync d5/d9e/fa2 0 2026-03-09T16:15:30.742 INFO:tasks.workunit.client.1.vm05.stdout:1/994: creat d7/d62/db6/d151/f15b x:0 0 0 2026-03-09T16:15:30.743 INFO:tasks.workunit.client.1.vm05.stdout:5/998: rename d8/d18/d1b/d47/d4e/d76/d8f/dab/l128 to d8/d18/dbc/l147 0 2026-03-09T16:15:30.743 INFO:tasks.workunit.client.1.vm05.stdout:1/995: fsync d7/dd/d21/d39/d87/db9/d138/f65 0 2026-03-09T16:15:30.744 INFO:tasks.workunit.client.1.vm05.stdout:1/996: chown d7/d27/c9d 7978849 1 2026-03-09T16:15:30.758 INFO:tasks.workunit.client.1.vm05.stdout:3/882: unlink d0/d9/d22/d5f/d7b/da8/cd0 0 2026-03-09T16:15:30.760 INFO:tasks.workunit.client.1.vm05.stdout:7/957: creat d1/d2/d11/d86/d8a/f14f x:0 0 0 2026-03-09T16:15:30.764 INFO:tasks.workunit.client.1.vm05.stdout:6/894: dread d17/f31 [0,4194304] 0 2026-03-09T16:15:30.775 INFO:tasks.workunit.client.1.vm05.stdout:1/997: dread d7/d15/d16/f13b [0,4194304] 0 2026-03-09T16:15:30.775 INFO:tasks.workunit.client.1.vm05.stdout:5/999: dread d8/d18/dbc/dcc/daa/f110 [0,4194304] 0 2026-03-09T16:15:30.775 INFO:tasks.workunit.client.1.vm05.stdout:0/882: mknod d5/db/def/c12c 0 2026-03-09T16:15:30.776 INFO:tasks.workunit.client.1.vm05.stdout:1/998: chown d7/dd/d21/d63/d71/ddc/df8/cea 8586467 1 2026-03-09T16:15:30.776 INFO:tasks.workunit.client.1.vm05.stdout:3/883: truncate d0/d9/d22/d5f/f66 1834714 0 2026-03-09T16:15:30.778 INFO:tasks.workunit.client.1.vm05.stdout:3/884: fdatasync d0/d9/d22/d5f/d7b/da8/f120 0 2026-03-09T16:15:30.779 INFO:tasks.workunit.client.1.vm05.stdout:1/999: fdatasync d7/dd/d21/d39/d48/d8c/dd8/d103/f10a 0 2026-03-09T16:15:30.780 INFO:tasks.workunit.client.1.vm05.stdout:6/895: dread - d17/d22/d27/d8a/d8b/ff1 zero size 2026-03-09T16:15:30.789 INFO:tasks.workunit.client.1.vm05.stdout:8/963: truncate d4/d6/db/dc/d5d/f7a 3607791 0 2026-03-09T16:15:30.789 INFO:tasks.workunit.client.1.vm05.stdout:2/843: dwrite db/dd/d15/d3f/d5b/f7d [0,4194304] 0 2026-03-09T16:15:30.792 INFO:tasks.workunit.client.1.vm05.stdout:2/844: readlink db/dd/d15/d1f/d21/l40 0 2026-03-09T16:15:30.792 INFO:tasks.workunit.client.1.vm05.stdout:8/964: dread d4/d6/db/dc/d5d/fbd [0,4194304] 0 2026-03-09T16:15:30.793 INFO:tasks.workunit.client.1.vm05.stdout:8/965: write d4/f3e [3897112,112441] 0 2026-03-09T16:15:30.799 INFO:tasks.workunit.client.1.vm05.stdout:9/949: write d4/d10/d35/d36/f49 [283526,10476] 0 2026-03-09T16:15:30.802 INFO:tasks.workunit.client.1.vm05.stdout:7/958: symlink d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/d14b/l150 0 2026-03-09T16:15:30.805 INFO:tasks.workunit.client.1.vm05.stdout:6/896: symlink d17/d22/d27/d34/d42/d65/d117/l151 0 2026-03-09T16:15:30.806 INFO:tasks.workunit.client.1.vm05.stdout:6/897: chown d17/d22/d27/d34/dd1/d10a/f135 1383643209 1 2026-03-09T16:15:30.816 INFO:tasks.workunit.client.1.vm05.stdout:2/845: truncate db/dd/d15/f90 613643 0 2026-03-09T16:15:30.816 
INFO:tasks.workunit.client.1.vm05.stdout:8/966: symlink d4/d6/db/df/d80/d12b/l13c 0 2026-03-09T16:15:30.816 INFO:tasks.workunit.client.1.vm05.stdout:2/846: read db/f17 [216637,75112] 0 2026-03-09T16:15:30.816 INFO:tasks.workunit.client.1.vm05.stdout:8/967: stat d4/d6/db/df/d10f 0 2026-03-09T16:15:30.816 INFO:tasks.workunit.client.1.vm05.stdout:2/847: creat db/dd/d15/d4c/d56/f114 x:0 0 0 2026-03-09T16:15:30.816 INFO:tasks.workunit.client.1.vm05.stdout:0/883: link d5/db/def/ff8 d5/d11/d4f/ddc/d10a/f12d 0 2026-03-09T16:15:30.816 INFO:tasks.workunit.client.1.vm05.stdout:7/959: truncate d1/d2/d8/dc/d33/f57 1853722 0 2026-03-09T16:15:30.819 INFO:tasks.workunit.client.1.vm05.stdout:6/898: mknod d17/c152 0 2026-03-09T16:15:30.819 INFO:tasks.workunit.client.1.vm05.stdout:7/960: dread - d1/d2/d8/dc/d1b/d30/d4b/d65/db1/d12b/f139 zero size 2026-03-09T16:15:30.819 INFO:tasks.workunit.client.1.vm05.stdout:6/899: readlink d17/d22/d27/d34/d4b/laa 0 2026-03-09T16:15:30.820 INFO:tasks.workunit.client.1.vm05.stdout:7/961: chown d1/lee 0 1 2026-03-09T16:15:30.821 INFO:tasks.workunit.client.1.vm05.stdout:7/962: fdatasync d1/d2/d8/dc/d1b/d30/d4b/d65/f83 0 2026-03-09T16:15:30.827 INFO:tasks.workunit.client.1.vm05.stdout:8/968: symlink d4/d6/d9a/d122/l13d 0 2026-03-09T16:15:30.827 INFO:tasks.workunit.client.1.vm05.stdout:8/969: chown d4/d6/db/f5e 2983 1 2026-03-09T16:15:30.829 INFO:tasks.workunit.client.1.vm05.stdout:8/970: dread f0 [0,4194304] 0 2026-03-09T16:15:30.843 INFO:tasks.workunit.client.1.vm05.stdout:3/885: write d0/d9/f93 [2789356,38956] 0 2026-03-09T16:15:30.850 INFO:tasks.workunit.client.1.vm05.stdout:9/950: dwrite d4/d10/f18 [0,4194304] 0 2026-03-09T16:15:30.852 INFO:tasks.workunit.client.1.vm05.stdout:0/884: symlink d5/d11/d4f/ddc/l12e 0 2026-03-09T16:15:30.859 INFO:tasks.workunit.client.1.vm05.stdout:6/900: truncate d17/d1d/f1e 69992 0 2026-03-09T16:15:30.859 INFO:tasks.workunit.client.1.vm05.stdout:7/963: creat d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/f151 x:0 0 0 2026-03-09T16:15:30.861 INFO:tasks.workunit.client.1.vm05.stdout:2/848: dwrite db/dd/d15/d1f/d20/fee [0,4194304] 0 2026-03-09T16:15:30.869 INFO:tasks.workunit.client.1.vm05.stdout:8/971: rmdir d4/d6/db/dc 39 2026-03-09T16:15:30.876 INFO:tasks.workunit.client.1.vm05.stdout:3/886: dread - d0/d9/d22/d5f/d7b/da8/dd8/fe4 zero size 2026-03-09T16:15:30.877 INFO:tasks.workunit.client.1.vm05.stdout:6/901: mkdir d17/d22/d9d/da9/d128/d153 0 2026-03-09T16:15:30.877 INFO:tasks.workunit.client.1.vm05.stdout:7/964: creat d1/d2/d8/dc/d14/f152 x:0 0 0 2026-03-09T16:15:30.877 INFO:tasks.workunit.client.1.vm05.stdout:2/849: mkdir db/dd/d15/d3f/d5b/d60/d6a/dea/d115 0 2026-03-09T16:15:30.877 INFO:tasks.workunit.client.1.vm05.stdout:7/965: stat d1/d2/d8/dc/d1b/d71/d3c/fdd 0 2026-03-09T16:15:30.880 INFO:tasks.workunit.client.1.vm05.stdout:0/885: dread d5/d11/f106 [0,4194304] 0 2026-03-09T16:15:30.885 INFO:tasks.workunit.client.1.vm05.stdout:0/886: dwrite d5/db/d5f/da3/f11f [0,4194304] 0 2026-03-09T16:15:30.904 INFO:tasks.workunit.client.1.vm05.stdout:3/887: dread d0/d33/f41 [0,4194304] 0 2026-03-09T16:15:30.906 INFO:tasks.workunit.client.1.vm05.stdout:0/887: dread d5/db/d5f/fd6 [0,4194304] 0 2026-03-09T16:15:30.910 INFO:tasks.workunit.client.1.vm05.stdout:2/850: sync 2026-03-09T16:15:30.911 INFO:tasks.workunit.client.1.vm05.stdout:2/851: dread - db/dd/d15/d4c/d56/f10c zero size 2026-03-09T16:15:30.916 INFO:tasks.workunit.client.1.vm05.stdout:7/966: truncate d1/d2/d8/dc/d1b/d30/d7d/d114/f92 551356 0 2026-03-09T16:15:30.925 
INFO:tasks.workunit.client.1.vm05.stdout:3/888: dread d0/d9/d22/d6b/fa0 [0,4194304] 0 2026-03-09T16:15:30.925 INFO:tasks.workunit.client.1.vm05.stdout:3/889: chown d0/d9/d22/d5f/d75/f117 0 1 2026-03-09T16:15:30.926 INFO:tasks.workunit.client.1.vm05.stdout:9/951: truncate d4/d10/d35/d36/fb3 876498 0 2026-03-09T16:15:30.930 INFO:tasks.workunit.client.1.vm05.stdout:8/972: creat d4/d6/db/dc/d5d/da0/dd7/d13a/f13e x:0 0 0 2026-03-09T16:15:30.930 INFO:tasks.workunit.client.1.vm05.stdout:6/902: dwrite d17/d22/d9d/fb2 [0,4194304] 0 2026-03-09T16:15:30.944 INFO:tasks.workunit.client.1.vm05.stdout:0/888: mkdir d5/d2c/d49/d83/d8b/daf/d12f 0 2026-03-09T16:15:30.945 INFO:tasks.workunit.client.1.vm05.stdout:2/852: creat db/dd/d15/d3f/d5b/d60/da2/f116 x:0 0 0 2026-03-09T16:15:30.950 INFO:tasks.workunit.client.1.vm05.stdout:8/973: write d4/f1c [2184428,16337] 0 2026-03-09T16:15:30.956 INFO:tasks.workunit.client.1.vm05.stdout:0/889: symlink d5/d11/d4f/ddc/l130 0 2026-03-09T16:15:30.959 INFO:tasks.workunit.client.1.vm05.stdout:0/890: chown d5/d2c/d49/d83/d8b/dd5/d121 189403 1 2026-03-09T16:15:30.971 INFO:tasks.workunit.client.1.vm05.stdout:6/903: write d17/d22/d27/d58/f97 [79570,47613] 0 2026-03-09T16:15:30.976 INFO:tasks.workunit.client.1.vm05.stdout:3/890: rename d0/l42 to d0/d9/d22/d5f/d75/d76/d88/d89/l130 0 2026-03-09T16:15:30.984 INFO:tasks.workunit.client.1.vm05.stdout:8/974: dread d4/d6/f5f [0,4194304] 0 2026-03-09T16:15:30.989 INFO:tasks.workunit.client.1.vm05.stdout:2/853: rename db/dd/d15/d3f/d5b/d60/d95/dd7/cdf to db/dd/d15/d1f/d20/d23/c117 0 2026-03-09T16:15:30.992 INFO:tasks.workunit.client.1.vm05.stdout:2/854: dread db/dd/d15/d46/fa6 [0,4194304] 0 2026-03-09T16:15:31.001 INFO:tasks.workunit.client.1.vm05.stdout:7/967: getdents d1/d2/d8 0 2026-03-09T16:15:31.009 INFO:tasks.workunit.client.1.vm05.stdout:6/904: rename d17/d22/fe4 to d17/d5d/d73/d83/d145/f154 0 2026-03-09T16:15:31.016 INFO:tasks.workunit.client.1.vm05.stdout:2/855: write db/dd/d15/d3f/d5b/d60/d95/d109/fe5 [905003,20087] 0 2026-03-09T16:15:31.019 INFO:tasks.workunit.client.1.vm05.stdout:9/952: getdents d4/d10/d35/d36/d48/d60/dcb/dd2 0 2026-03-09T16:15:31.021 INFO:tasks.workunit.client.1.vm05.stdout:2/856: sync 2026-03-09T16:15:31.025 INFO:tasks.workunit.client.1.vm05.stdout:2/857: dread db/dd/d15/d1f/d20/f3d [0,4194304] 0 2026-03-09T16:15:31.026 INFO:tasks.workunit.client.1.vm05.stdout:8/975: truncate d4/d6/f44 2222522 0 2026-03-09T16:15:31.029 INFO:tasks.workunit.client.1.vm05.stdout:7/968: creat d1/d2/d8/dc/d9c/f153 x:0 0 0 2026-03-09T16:15:31.030 INFO:tasks.workunit.client.1.vm05.stdout:7/969: chown d1/d2/d8/dc/d1b/de6/f13f 2 1 2026-03-09T16:15:31.031 INFO:tasks.workunit.client.1.vm05.stdout:6/905: rmdir d17 39 2026-03-09T16:15:31.043 INFO:tasks.workunit.client.1.vm05.stdout:0/891: link d5/d11/d4f/l51 d5/d11/l131 0 2026-03-09T16:15:31.045 INFO:tasks.workunit.client.1.vm05.stdout:9/953: dread d4/d10/d35/d36/d48/fb7 [0,4194304] 0 2026-03-09T16:15:31.049 INFO:tasks.workunit.client.1.vm05.stdout:2/858: chown db/dd/d15/d3f/d5b/d60/d95/d109/ce8 6014738 1 2026-03-09T16:15:31.052 INFO:tasks.workunit.client.1.vm05.stdout:8/976: truncate d4/d6/d3a/fb8 1038455 0 2026-03-09T16:15:31.054 INFO:tasks.workunit.client.1.vm05.stdout:7/970: rename d1/d2/d8/d67/d76 to d1/d2/d8/d154 0 2026-03-09T16:15:31.062 INFO:tasks.workunit.client.1.vm05.stdout:3/891: getdents d0/d9/d97/dac 0 2026-03-09T16:15:31.074 INFO:tasks.workunit.client.1.vm05.stdout:8/977: mkdir d4/d6/db/dc/d5d/da0/dbf/d13f 0 2026-03-09T16:15:31.075 
INFO:tasks.workunit.client.1.vm05.stdout:7/971: creat d1/d2/d8/dc/d33/f155 x:0 0 0 2026-03-09T16:15:31.080 INFO:tasks.workunit.client.1.vm05.stdout:6/906: creat d17/d22/d9d/db4/f155 x:0 0 0 2026-03-09T16:15:31.080 INFO:tasks.workunit.client.1.vm05.stdout:6/907: readlink d17/d22/d27/d44/lfb 0 2026-03-09T16:15:31.081 INFO:tasks.workunit.client.1.vm05.stdout:6/908: fdatasync d17/d4f/fbd 0 2026-03-09T16:15:31.082 INFO:tasks.workunit.client.1.vm05.stdout:6/909: chown d17/d22/d27/d34/d4b/f5a 132699030 1 2026-03-09T16:15:31.086 INFO:tasks.workunit.client.1.vm05.stdout:3/892: dread d0/d9/d22/d5f/d7b/fb7 [0,4194304] 0 2026-03-09T16:15:31.095 INFO:tasks.workunit.client.1.vm05.stdout:2/859: creat db/dd/d15/d46/df3/f118 x:0 0 0 2026-03-09T16:15:31.095 INFO:tasks.workunit.client.1.vm05.stdout:0/892: dwrite d5/db/d48/fad [0,4194304] 0 2026-03-09T16:15:31.107 INFO:tasks.workunit.client.1.vm05.stdout:8/978: mknod d4/de9/d10c/c140 0 2026-03-09T16:15:31.108 INFO:tasks.workunit.client.1.vm05.stdout:7/972: dread - d1/d2/d8/dc/d1b/d30/d4b/d65/f8e zero size 2026-03-09T16:15:31.133 INFO:tasks.workunit.client.1.vm05.stdout:6/910: write d17/d22/d27/d34/d42/d53/d9f/ffc [1037449,31286] 0 2026-03-09T16:15:31.137 INFO:tasks.workunit.client.1.vm05.stdout:3/893: dwrite d0/d9/d22/d5f/d75/d76/d88/da3/ff1 [0,4194304] 0 2026-03-09T16:15:31.138 INFO:tasks.workunit.client.1.vm05.stdout:3/894: dread - d0/d9/d8b/f105 zero size 2026-03-09T16:15:31.139 INFO:tasks.workunit.client.1.vm05.stdout:3/895: write d0/d9/d22/d5f/d75/d76/d88/da3/ff1 [2681546,97745] 0 2026-03-09T16:15:31.163 INFO:tasks.workunit.client.1.vm05.stdout:2/860: creat db/dd/d15/d3f/d5b/d60/f119 x:0 0 0 2026-03-09T16:15:31.163 INFO:tasks.workunit.client.1.vm05.stdout:2/861: write db/dd/d15/d4c/d56/f114 [373867,105495] 0 2026-03-09T16:15:31.164 INFO:tasks.workunit.client.1.vm05.stdout:2/862: write db/dd/d15/d4c/d56/f114 [1287058,19505] 0 2026-03-09T16:15:31.167 INFO:tasks.workunit.client.1.vm05.stdout:0/893: unlink d5/d2c/l75 0 2026-03-09T16:15:31.172 INFO:tasks.workunit.client.1.vm05.stdout:8/979: mkdir d4/d6/db/dc/d2e/d141 0 2026-03-09T16:15:31.172 INFO:tasks.workunit.client.1.vm05.stdout:9/954: link d4/d10/d35/d2b/d38/f62 d4/d10/d35/d36/d48/d60/dcb/f132 0 2026-03-09T16:15:31.178 INFO:tasks.workunit.client.1.vm05.stdout:9/955: creat d4/d10/d35/d2b/d31/d82/f133 x:0 0 0 2026-03-09T16:15:31.179 INFO:tasks.workunit.client.1.vm05.stdout:2/863: mkdir db/dd/d15/d1f/dc1/d11a 0 2026-03-09T16:15:31.179 INFO:tasks.workunit.client.1.vm05.stdout:7/973: link d1/d2/d8/dc/d1b/d30/d4b/db2/cf6 d1/d2/d11/d86/d8a/d91/c156 0 2026-03-09T16:15:31.180 INFO:tasks.workunit.client.1.vm05.stdout:2/864: stat db/dd/d15/d46/d67/ff1 0 2026-03-09T16:15:31.184 INFO:tasks.workunit.client.1.vm05.stdout:9/956: unlink d4/f5b 0 2026-03-09T16:15:31.185 INFO:tasks.workunit.client.1.vm05.stdout:7/974: sync 2026-03-09T16:15:31.187 INFO:tasks.workunit.client.1.vm05.stdout:8/980: dread d4/d6/d3a/d40/d71/fc6 [0,4194304] 0 2026-03-09T16:15:31.204 INFO:tasks.workunit.client.1.vm05.stdout:6/911: getdents d17/d22/dce 0 2026-03-09T16:15:31.205 INFO:tasks.workunit.client.1.vm05.stdout:6/912: chown d17/d22/d27/d34/d42/d53/d87/f121 16330 1 2026-03-09T16:15:31.206 INFO:tasks.workunit.client.1.vm05.stdout:3/896: write d0/d33/f36 [7623974,123596] 0 2026-03-09T16:15:31.211 INFO:tasks.workunit.client.1.vm05.stdout:0/894: dwrite d5/d11/d4f/d68/f6b [0,4194304] 0 2026-03-09T16:15:31.211 INFO:tasks.workunit.client.1.vm05.stdout:0/895: chown d5/db 89686 1 2026-03-09T16:15:31.212 INFO:tasks.workunit.client.1.vm05.stdout:2/865: 
write db/dd/d15/d4c/f103 [445261,91835] 0 2026-03-09T16:15:31.236 INFO:tasks.workunit.client.1.vm05.stdout:9/957: rmdir d4/d10/d35/d36/d48/d60/dae 39 2026-03-09T16:15:31.237 INFO:tasks.workunit.client.1.vm05.stdout:9/958: fsync d4/d10/d35/d36/f12f 0 2026-03-09T16:15:31.238 INFO:tasks.workunit.client.1.vm05.stdout:9/959: chown d4/d10/d35/d36/d48/d54/l7b 0 1 2026-03-09T16:15:31.238 INFO:tasks.workunit.client.1.vm05.stdout:9/960: readlink d4/d10/l23 0 2026-03-09T16:15:31.272 INFO:tasks.workunit.client.1.vm05.stdout:2/866: fdatasync db/dd/d15/d3f/d5b/d60/f7c 0 2026-03-09T16:15:31.282 INFO:tasks.workunit.client.1.vm05.stdout:7/975: symlink d1/d2/d11/l157 0 2026-03-09T16:15:31.282 INFO:tasks.workunit.client.1.vm05.stdout:8/981: mkdir d4/d6/db/dc/d5d/da0/dbf/d13f/d142 0 2026-03-09T16:15:31.282 INFO:tasks.workunit.client.1.vm05.stdout:8/982: chown d4/d6/db/da6/d135 1900 1 2026-03-09T16:15:31.283 INFO:tasks.workunit.client.1.vm05.stdout:7/976: write d1/d2/d8/dc/d1b/d71/d3c/f9b [2564726,84242] 0 2026-03-09T16:15:31.301 INFO:tasks.workunit.client.1.vm05.stdout:9/961: dread - d4/d10/d35/d36/d48/d60/dae/ff6 zero size 2026-03-09T16:15:31.310 INFO:tasks.workunit.client.1.vm05.stdout:7/977: rename d1/d2/d8/dc/d1b/d30/d4b/d65/f7f to d1/d2/d8/dc/d1b/d30/d4b/d65/db1/f158 0 2026-03-09T16:15:31.311 INFO:tasks.workunit.client.1.vm05.stdout:6/913: rmdir d17/d5d/d73/d150 0 2026-03-09T16:15:31.313 INFO:tasks.workunit.client.1.vm05.stdout:7/978: sync 2026-03-09T16:15:31.314 INFO:tasks.workunit.client.1.vm05.stdout:2/867: mknod db/dd/d15/dff/c11b 0 2026-03-09T16:15:31.315 INFO:tasks.workunit.client.1.vm05.stdout:2/868: fdatasync db/dd/d15/d4c/fe4 0 2026-03-09T16:15:31.316 INFO:tasks.workunit.client.1.vm05.stdout:8/983: mknod d4/d6/db/dc/d5d/da0/c143 0 2026-03-09T16:15:31.316 INFO:tasks.workunit.client.1.vm05.stdout:8/984: stat d4/d6/db/df/d4f/c5c 0 2026-03-09T16:15:31.317 INFO:tasks.workunit.client.1.vm05.stdout:3/897: getdents d0/d9/d22/d5f/d75/d76/d88 0 2026-03-09T16:15:31.318 INFO:tasks.workunit.client.1.vm05.stdout:6/914: mknod d17/d22/d27/d34/d42/d65/d10d/c156 0 2026-03-09T16:15:31.318 INFO:tasks.workunit.client.1.vm05.stdout:6/915: chown d17/d22/f14a 6841526 1 2026-03-09T16:15:31.319 INFO:tasks.workunit.client.1.vm05.stdout:7/979: mkdir d1/d2/d8/dc/dd4/da8/d159 0 2026-03-09T16:15:31.321 INFO:tasks.workunit.client.1.vm05.stdout:7/980: sync 2026-03-09T16:15:31.321 INFO:tasks.workunit.client.1.vm05.stdout:9/962: getdents d4/d10/d35/d2b/d31/d82/dec 0 2026-03-09T16:15:31.322 INFO:tasks.workunit.client.1.vm05.stdout:8/985: fdatasync d4/d6/d53/f89 0 2026-03-09T16:15:31.323 INFO:tasks.workunit.client.1.vm05.stdout:3/898: chown d0/l70 3876181 1 2026-03-09T16:15:31.334 INFO:tasks.workunit.client.1.vm05.stdout:7/981: mknod d1/d2/d8/dc/d1b/d30/d4b/db2/c15a 0 2026-03-09T16:15:31.339 INFO:tasks.workunit.client.1.vm05.stdout:3/899: mknod d0/d9/d22/d5f/d7b/da8/dd8/c131 0 2026-03-09T16:15:31.341 INFO:tasks.workunit.client.1.vm05.stdout:8/986: dread f0 [0,4194304] 0 2026-03-09T16:15:31.343 INFO:tasks.workunit.client.1.vm05.stdout:7/982: link d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/c75 d1/d2/d8/dc/dd4/da8/d159/c15b 0 2026-03-09T16:15:31.343 INFO:tasks.workunit.client.1.vm05.stdout:8/987: read d4/d6/d3a/d3c/f3f [273212,31480] 0 2026-03-09T16:15:31.345 INFO:tasks.workunit.client.1.vm05.stdout:3/900: dwrite d0/d9/d22/d5f/d75/d76/d88/d89/f12d [0,4194304] 0 2026-03-09T16:15:31.348 INFO:tasks.workunit.client.1.vm05.stdout:8/988: fsync d4/d6/d3a/d40/d71/f121 0 2026-03-09T16:15:31.356 INFO:tasks.workunit.client.1.vm05.stdout:0/896: dwrite 
d5/d1b/d3b/f3c [0,4194304] 0 2026-03-09T16:15:31.375 INFO:tasks.workunit.client.1.vm05.stdout:8/989: fsync d4/d6/db/d59/db0/dd6/ffb 0 2026-03-09T16:15:31.377 INFO:tasks.workunit.client.1.vm05.stdout:8/990: dread d4/d6/d3a/d7c/f11d [0,4194304] 0 2026-03-09T16:15:31.382 INFO:tasks.workunit.client.1.vm05.stdout:0/897: unlink d5/db/d5b/d82/cdf 0 2026-03-09T16:15:31.384 INFO:tasks.workunit.client.1.vm05.stdout:0/898: readlink d5/db/d5f/la1 0 2026-03-09T16:15:31.389 INFO:tasks.workunit.client.1.vm05.stdout:6/916: write d17/d22/d27/d58/db8/ff3 [774300,104555] 0 2026-03-09T16:15:31.393 INFO:tasks.workunit.client.1.vm05.stdout:2/869: dwrite f7 [0,4194304] 0 2026-03-09T16:15:31.395 INFO:tasks.workunit.client.1.vm05.stdout:9/963: dwrite d4/d10/d35/d36/f85 [4194304,4194304] 0 2026-03-09T16:15:31.397 INFO:tasks.workunit.client.1.vm05.stdout:2/870: chown db/dd/d15 12 1 2026-03-09T16:15:31.406 INFO:tasks.workunit.client.1.vm05.stdout:7/983: dwrite d1/d2/d8/dc/d1b/d30/d4b/d65/db1/fdc [0,4194304] 0 2026-03-09T16:15:31.413 INFO:tasks.workunit.client.1.vm05.stdout:3/901: dwrite d0/d9/d22/d5f/d7b/f9a [4194304,4194304] 0 2026-03-09T16:15:31.440 INFO:tasks.workunit.client.1.vm05.stdout:6/917: unlink d17/d22/d27/d34/dd1/fe2 0 2026-03-09T16:15:31.445 INFO:tasks.workunit.client.1.vm05.stdout:2/871: mknod db/dd/d15/d3f/d5b/d60/d95/dd7/c11c 0 2026-03-09T16:15:31.465 INFO:tasks.workunit.client.1.vm05.stdout:6/918: rename d17/d22/d9d/da9/d128 to d17/d22/d9d/da9/d157 0 2026-03-09T16:15:31.470 INFO:tasks.workunit.client.1.vm05.stdout:2/872: symlink db/dd/d15/d1f/dc1/l11d 0 2026-03-09T16:15:31.479 INFO:tasks.workunit.client.1.vm05.stdout:2/873: mkdir db/dd/d15/d3f/d5b/d60/d95/d109/d11e 0 2026-03-09T16:15:31.480 INFO:tasks.workunit.client.1.vm05.stdout:0/899: write d5/db/d5f/f7b [4405375,63259] 0 2026-03-09T16:15:31.485 INFO:tasks.workunit.client.1.vm05.stdout:2/874: dwrite db/dd/d15/d3f/d5b/d60/d95/d109/f112 [0,4194304] 0 2026-03-09T16:15:31.501 INFO:tasks.workunit.client.1.vm05.stdout:8/991: link d4/d6/d9a/db3/l64 d4/d6/db/dc/l144 0 2026-03-09T16:15:31.502 INFO:tasks.workunit.client.1.vm05.stdout:9/964: getdents d4/d10/d35/d2b/d31/d82 0 2026-03-09T16:15:31.503 INFO:tasks.workunit.client.1.vm05.stdout:7/984: getdents d1/d2/d8/dc/d1b/d30/d4b/db2 0 2026-03-09T16:15:31.504 INFO:tasks.workunit.client.1.vm05.stdout:2/875: readlink db/dd/d15/d4c/l85 0 2026-03-09T16:15:31.504 INFO:tasks.workunit.client.1.vm05.stdout:9/965: chown d4/d10/laf 8433 1 2026-03-09T16:15:31.504 INFO:tasks.workunit.client.1.vm05.stdout:7/985: chown d1/d2/d8/d31/ff7 2668543 1 2026-03-09T16:15:31.509 INFO:tasks.workunit.client.1.vm05.stdout:2/876: dwrite db/dd/d15/d3f/f100 [0,4194304] 0 2026-03-09T16:15:31.518 INFO:tasks.workunit.client.1.vm05.stdout:7/986: read - d1/d2/d11/d86/d8a/fa3 zero size 2026-03-09T16:15:31.519 INFO:tasks.workunit.client.1.vm05.stdout:3/902: write d0/d9/d22/f54 [1418909,49683] 0 2026-03-09T16:15:31.520 INFO:tasks.workunit.client.1.vm05.stdout:2/877: sync 2026-03-09T16:15:31.524 INFO:tasks.workunit.client.1.vm05.stdout:7/987: dread d1/d2/d8/dc/d1b/d30/f93 [0,4194304] 0 2026-03-09T16:15:31.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:31 vm05.local ceph-mon[58702]: pgmap v25: 65 pgs: 65 active+clean; 3.1 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 31 MiB/s rd, 94 MiB/s wr, 219 op/s 2026-03-09T16:15:31.526 INFO:tasks.workunit.client.1.vm05.stdout:2/878: dwrite db/dd/d15/d1f/d20/d23/d78/f92 [0,4194304] 0 2026-03-09T16:15:31.547 INFO:tasks.workunit.client.1.vm05.stdout:6/919: write d17/d22/d9d/da5/f147 [240847,14483] 0 
2026-03-09T16:15:31.547 INFO:tasks.workunit.client.1.vm05.stdout:6/920: readlink d17/d22/d27/d34/d42/d53/d87/d137/l146 0 2026-03-09T16:15:31.549 INFO:tasks.workunit.client.1.vm05.stdout:3/903: symlink d0/d9/d97/dac/d113/l132 0 2026-03-09T16:15:31.549 INFO:tasks.workunit.client.1.vm05.stdout:3/904: fdatasync d0/d9/d8b/fc7 0 2026-03-09T16:15:31.549 INFO:tasks.workunit.client.1.vm05.stdout:3/905: stat d0/d9/d97/dac/caf 0 2026-03-09T16:15:31.553 INFO:tasks.workunit.client.1.vm05.stdout:8/992: write d4/d6/db/dc/f26 [222373,17910] 0 2026-03-09T16:15:31.554 INFO:tasks.workunit.client.1.vm05.stdout:0/900: link d5/db/cb3 d5/db/d5b/d82/c132 0 2026-03-09T16:15:31.554 INFO:tasks.workunit.client.1.vm05.stdout:7/988: fdatasync d1/d2/d8/dc/d1b/d30/d7d/fa5 0 2026-03-09T16:15:31.557 INFO:tasks.workunit.client.1.vm05.stdout:9/966: creat d4/d10/d35/d2b/d38/d65/dd6/f134 x:0 0 0 2026-03-09T16:15:31.559 INFO:tasks.workunit.client.1.vm05.stdout:6/921: mkdir d17/d22/d27/d34/d42/d65/d117/d158 0 2026-03-09T16:15:31.563 INFO:tasks.workunit.client.1.vm05.stdout:8/993: dwrite d4/d6/db/dc/d5d/da0/dd7/f12a [0,4194304] 0 2026-03-09T16:15:31.564 INFO:tasks.workunit.client.1.vm05.stdout:0/901: rmdir d5/db/d77/df3 39 2026-03-09T16:15:31.565 INFO:tasks.workunit.client.1.vm05.stdout:7/989: mknod d1/d2/d8/dc/d1b/d30/d4b/d65/db1/c15c 0 2026-03-09T16:15:31.571 INFO:tasks.workunit.client.1.vm05.stdout:8/994: chown d4/d6/db/dc/d5d/da0/dbf 172202 1 2026-03-09T16:15:31.571 INFO:tasks.workunit.client.1.vm05.stdout:6/922: mknod d17/d4f/c159 0 2026-03-09T16:15:31.578 INFO:tasks.workunit.client.1.vm05.stdout:9/967: dwrite d4/d10/d35/d2b/d38/d65/dd6/f120 [0,4194304] 0 2026-03-09T16:15:31.588 INFO:tasks.workunit.client.1.vm05.stdout:7/990: dwrite d1/d2/d8/dc/d1b/d71/d3c/f9b [0,4194304] 0 2026-03-09T16:15:31.589 INFO:tasks.workunit.client.1.vm05.stdout:2/879: dread db/dd/d15/d1f/f36 [0,4194304] 0 2026-03-09T16:15:31.589 INFO:tasks.workunit.client.1.vm05.stdout:8/995: mkdir d4/d6/db/df/d80/d137/dda/d145 0 2026-03-09T16:15:31.597 INFO:tasks.workunit.client.1.vm05.stdout:9/968: mkdir d4/d10/d35/d2b/d38/d65/d135 0 2026-03-09T16:15:31.605 INFO:tasks.workunit.client.1.vm05.stdout:3/906: link d0/l39 d0/d9/d22/d5f/d90/dae/dd2/d101/l133 0 2026-03-09T16:15:31.606 INFO:tasks.workunit.client.1.vm05.stdout:9/969: rmdir d4/d10/d35/d36/d48/d60/dae 39 2026-03-09T16:15:31.607 INFO:tasks.workunit.client.1.vm05.stdout:2/880: creat db/dd/d15/d3f/d5b/d7e/f11f x:0 0 0 2026-03-09T16:15:31.607 INFO:tasks.workunit.client.1.vm05.stdout:2/881: write db/dd/d15/d3f/f100 [3096107,26286] 0 2026-03-09T16:15:31.609 INFO:tasks.workunit.client.1.vm05.stdout:2/882: stat db/c30 0 2026-03-09T16:15:31.614 INFO:tasks.workunit.client.1.vm05.stdout:6/923: rename d17/d22/d27/d34/dd1 to d17/d22/d27/d15a 0 2026-03-09T16:15:31.615 INFO:tasks.workunit.client.1.vm05.stdout:9/970: mkdir d4/d10/d35/d2b/d31/d82/d136 0 2026-03-09T16:15:31.616 INFO:tasks.workunit.client.1.vm05.stdout:3/907: mkdir d0/d9/d22/d5f/d90/dae/dd2/d101/d134 0 2026-03-09T16:15:31.617 INFO:tasks.workunit.client.1.vm05.stdout:2/883: fsync db/dd/d15/d3f/d5b/d60/d6a/fc8 0 2026-03-09T16:15:31.618 INFO:tasks.workunit.client.1.vm05.stdout:3/908: write d0/d9/d22/d5f/d75/f117 [858082,76473] 0 2026-03-09T16:15:31.619 INFO:tasks.workunit.client.1.vm05.stdout:6/924: rmdir d17/d22/d27/d34/d42/d53/d87/df6 39 2026-03-09T16:15:31.619 INFO:tasks.workunit.client.1.vm05.stdout:7/991: creat d1/d2/d8/dc/dd4/f15d x:0 0 0 2026-03-09T16:15:31.620 INFO:tasks.workunit.client.1.vm05.stdout:2/884: chown db/dd/d15/d46/l5a 5 1 
2026-03-09T16:15:31.622 INFO:tasks.workunit.client.1.vm05.stdout:8/996: getdents d4/d6/db/df/d80/d12b 0 2026-03-09T16:15:31.628 INFO:tasks.workunit.client.1.vm05.stdout:3/909: write d0/d33/f7d [2828592,11675] 0 2026-03-09T16:15:31.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:31 vm03.local ceph-mon[51019]: pgmap v25: 65 pgs: 65 active+clean; 3.1 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 31 MiB/s rd, 94 MiB/s wr, 219 op/s 2026-03-09T16:15:31.652 INFO:tasks.workunit.client.1.vm05.stdout:7/992: mkdir d1/d2/d8/dc/d1b/d30/d4b/d65/d3e/d14b/d15e 0 2026-03-09T16:15:31.652 INFO:tasks.workunit.client.1.vm05.stdout:7/993: readlink d1/d2/d8/dc/d1b/d71/l50 0 2026-03-09T16:15:31.664 INFO:tasks.workunit.client.1.vm05.stdout:3/910: truncate d0/d9/d22/d5f/d75/d76/fa5 636218 0 2026-03-09T16:15:31.665 INFO:tasks.workunit.client.1.vm05.stdout:3/911: write d0/d9/d22/d6b/fab [1577619,66106] 0 2026-03-09T16:15:31.673 INFO:tasks.workunit.client.1.vm05.stdout:3/912: dread d0/d9/d22/d5f/d90/ff9 [0,4194304] 0 2026-03-09T16:15:31.673 INFO:tasks.workunit.client.1.vm05.stdout:9/971: creat d4/d10/d35/d2b/d38/d65/dd6/f137 x:0 0 0 2026-03-09T16:15:31.674 INFO:tasks.workunit.client.1.vm05.stdout:3/913: read d0/d9/d22/d12b/f121 [62405,104728] 0 2026-03-09T16:15:31.678 INFO:tasks.workunit.client.1.vm05.stdout:0/902: write d5/d11/d4f/d70/fbf [544785,123358] 0 2026-03-09T16:15:31.684 INFO:tasks.workunit.client.1.vm05.stdout:9/972: symlink d4/d10/d35/d2b/dc1/dc2/d10f/l138 0 2026-03-09T16:15:31.686 INFO:tasks.workunit.client.1.vm05.stdout:3/914: read - d0/d9/d22/d5f/d75/d76/d88/d89/fe5 zero size 2026-03-09T16:15:31.692 INFO:tasks.workunit.client.1.vm05.stdout:3/915: dwrite d0/d33/f5e [0,4194304] 0 2026-03-09T16:15:31.697 INFO:tasks.workunit.client.1.vm05.stdout:6/925: dwrite d17/d1d/f67 [0,4194304] 0 2026-03-09T16:15:31.700 INFO:tasks.workunit.client.1.vm05.stdout:3/916: sync 2026-03-09T16:15:31.700 INFO:tasks.workunit.client.1.vm05.stdout:2/885: dwrite db/dd/d15/d3f/d5b/f69 [0,4194304] 0 2026-03-09T16:15:31.700 INFO:tasks.workunit.client.1.vm05.stdout:3/917: chown d0/f45 991 1 2026-03-09T16:15:31.702 INFO:tasks.workunit.client.1.vm05.stdout:2/886: chown db/dd/d15/d3f/ca8 31986 1 2026-03-09T16:15:31.733 INFO:tasks.workunit.client.1.vm05.stdout:7/994: write d1/d2/d8/dc/d9c/fda [893529,125053] 0 2026-03-09T16:15:31.733 INFO:tasks.workunit.client.1.vm05.stdout:8/997: truncate d4/d6/d9a/db3/f9d 2793729 0 2026-03-09T16:15:31.735 INFO:tasks.workunit.client.1.vm05.stdout:8/998: truncate d4/d6/db/d9b/fbe 874041 0 2026-03-09T16:15:31.735 INFO:tasks.workunit.client.1.vm05.stdout:8/999: chown d4/d6/f5f 1 1 2026-03-09T16:15:31.744 INFO:tasks.workunit.client.1.vm05.stdout:3/918: rmdir d0/d9/d22/d5f/d75/d76/d88/da3/df7/d80 39 2026-03-09T16:15:31.747 INFO:tasks.workunit.client.1.vm05.stdout:2/887: dread - db/dd/f6b zero size 2026-03-09T16:15:31.749 INFO:tasks.workunit.client.1.vm05.stdout:0/903: dwrite d5/db/d5b/d82/fe5 [0,4194304] 0 2026-03-09T16:15:31.749 INFO:tasks.workunit.client.1.vm05.stdout:7/995: symlink d1/d2/d8/d154/l15f 0 2026-03-09T16:15:31.754 INFO:tasks.workunit.client.1.vm05.stdout:7/996: fdatasync d1/d2/d11/d86/d8a/f14f 0 2026-03-09T16:15:31.763 INFO:tasks.workunit.client.1.vm05.stdout:6/926: dread d17/d22/d27/fd7 [0,4194304] 0 2026-03-09T16:15:31.763 INFO:tasks.workunit.client.1.vm05.stdout:0/904: dread d5/d11/d4f/d68/f6b [0,4194304] 0 2026-03-09T16:15:31.772 INFO:tasks.workunit.client.1.vm05.stdout:3/919: dread d0/f60 [0,4194304] 0 2026-03-09T16:15:31.782 INFO:tasks.workunit.client.1.vm05.stdout:9/973: 
write d4/d10/d35/d36/d48/d60/dae/fcf [4768719,45523] 0 2026-03-09T16:15:31.795 INFO:tasks.workunit.client.1.vm05.stdout:9/974: sync 2026-03-09T16:15:31.810 INFO:tasks.workunit.client.1.vm05.stdout:6/927: symlink d17/d5d/d73/d83/d119/l15b 0 2026-03-09T16:15:31.810 INFO:tasks.workunit.client.1.vm05.stdout:6/928: chown d17/d22/d27/d58/db8/fcf 1712 1 2026-03-09T16:15:31.811 INFO:tasks.workunit.client.1.vm05.stdout:6/929: chown d17/d22/d27/d8a/d8b/l120 0 1 2026-03-09T16:15:31.812 INFO:tasks.workunit.client.1.vm05.stdout:6/930: read d17/d22/d9d/da5/fd9 [640957,54659] 0 2026-03-09T16:15:31.812 INFO:tasks.workunit.client.1.vm05.stdout:0/905: truncate d5/db/d5f/da3/fc6 928502 0 2026-03-09T16:15:31.813 INFO:tasks.workunit.client.1.vm05.stdout:6/931: chown d17/d22/d27/d8a/fd0 6 1 2026-03-09T16:15:31.816 INFO:tasks.workunit.client.1.vm05.stdout:9/975: dread - d4/d10/d35/d36/d48/d54/f103 zero size 2026-03-09T16:15:31.816 INFO:tasks.workunit.client.1.vm05.stdout:2/888: getdents db/dd/d15/d3f/d5b/d60/d6a/dea/dfc 0 2026-03-09T16:15:31.819 INFO:tasks.workunit.client.1.vm05.stdout:2/889: dwrite db/dd/d15/f6f [0,4194304] 0 2026-03-09T16:15:31.827 INFO:tasks.workunit.client.1.vm05.stdout:2/890: chown db/dd/d15/d1f/d21/l33 12 1 2026-03-09T16:15:31.843 INFO:tasks.workunit.client.1.vm05.stdout:2/891: unlink db/dd/fca 0 2026-03-09T16:15:31.848 INFO:tasks.workunit.client.1.vm05.stdout:7/997: getdents d1/d2/d8/dc/dd4/da8 0 2026-03-09T16:15:31.848 INFO:tasks.workunit.client.1.vm05.stdout:2/892: creat db/dd/d15/d4c/d56/f120 x:0 0 0 2026-03-09T16:15:31.849 INFO:tasks.workunit.client.1.vm05.stdout:9/976: creat d4/f139 x:0 0 0 2026-03-09T16:15:31.851 INFO:tasks.workunit.client.1.vm05.stdout:2/893: sync 2026-03-09T16:15:31.853 INFO:tasks.workunit.client.1.vm05.stdout:9/977: mkdir d4/d10/d35/d2b/d38/d65/d135/d13a 0 2026-03-09T16:15:31.856 INFO:tasks.workunit.client.1.vm05.stdout:2/894: symlink db/dd/d15/d3f/d5b/d60/d95/dd7/l121 0 2026-03-09T16:15:31.860 INFO:tasks.workunit.client.1.vm05.stdout:2/895: creat db/dd/d15/d3f/d5b/d60/d95/f122 x:0 0 0 2026-03-09T16:15:31.861 INFO:tasks.workunit.client.1.vm05.stdout:0/906: write d5/d2c/f41 [3048657,77238] 0 2026-03-09T16:15:31.861 INFO:tasks.workunit.client.1.vm05.stdout:2/896: fsync db/dd/d15/d3f/d5b/d60/d95/f76 0 2026-03-09T16:15:31.862 INFO:tasks.workunit.client.1.vm05.stdout:9/978: creat d4/d10/d35/d2b/d38/d65/dd6/f13b x:0 0 0 2026-03-09T16:15:31.863 INFO:tasks.workunit.client.1.vm05.stdout:9/979: fdatasync d4/d10/d35/d36/d48/f126 0 2026-03-09T16:15:31.867 INFO:tasks.workunit.client.1.vm05.stdout:3/920: dwrite d0/d9/d97/dbc/ff3 [0,4194304] 0 2026-03-09T16:15:31.870 INFO:tasks.workunit.client.1.vm05.stdout:3/921: chown d0/d33/f41 1367078 1 2026-03-09T16:15:31.874 INFO:tasks.workunit.client.1.vm05.stdout:6/932: truncate d17/d5d/f71 1029783 0 2026-03-09T16:15:31.876 INFO:tasks.workunit.client.1.vm05.stdout:3/922: dread d0/f60 [0,4194304] 0 2026-03-09T16:15:31.877 INFO:tasks.workunit.client.1.vm05.stdout:9/980: mkdir d4/d10/d35/d2b/d31/d82/df8/d13c 0 2026-03-09T16:15:31.878 INFO:tasks.workunit.client.1.vm05.stdout:2/897: truncate db/dd/d15/f90 302139 0 2026-03-09T16:15:31.879 INFO:tasks.workunit.client.1.vm05.stdout:9/981: chown d4/d10/d35/d2b/d38/fa0 848753 1 2026-03-09T16:15:31.880 INFO:tasks.workunit.client.1.vm05.stdout:2/898: readlink db/dd/d15/d1f/d20/d23/l14 0 2026-03-09T16:15:31.884 INFO:tasks.workunit.client.1.vm05.stdout:6/933: truncate d17/d22/d27/d8a/f88 4525696 0 2026-03-09T16:15:31.885 INFO:tasks.workunit.client.1.vm05.stdout:7/998: dwrite d1/d2/d8/dc/dd4/fd6 
[0,4194304] 0 2026-03-09T16:15:31.894 INFO:tasks.workunit.client.1.vm05.stdout:3/923: dread d0/d9/f4d [0,4194304] 0 2026-03-09T16:15:31.894 INFO:tasks.workunit.client.1.vm05.stdout:2/899: dread db/dd/d15/d3f/d5b/d60/d95/d109/dcd/fed [0,4194304] 0 2026-03-09T16:15:31.894 INFO:tasks.workunit.client.1.vm05.stdout:2/900: dread - db/dd/d15/d1f/d21/d87/fbe zero size 2026-03-09T16:15:31.895 INFO:tasks.workunit.client.1.vm05.stdout:2/901: dwrite db/dd/d15/d46/df3/f118 [0,4194304] 0 2026-03-09T16:15:31.900 INFO:tasks.workunit.client.1.vm05.stdout:2/902: dwrite db/dd/d15/d4c/d56/f10c [0,4194304] 0 2026-03-09T16:15:31.914 INFO:tasks.workunit.client.1.vm05.stdout:6/934: symlink d17/d4f/l15c 0 2026-03-09T16:15:31.926 INFO:tasks.workunit.client.1.vm05.stdout:0/907: dwrite d5/d1b/f6a [0,4194304] 0 2026-03-09T16:15:31.928 INFO:tasks.workunit.client.1.vm05.stdout:3/924: dread d0/d9/d8b/ff6 [0,4194304] 0 2026-03-09T16:15:31.934 INFO:tasks.workunit.client.1.vm05.stdout:2/903: creat db/dd/d15/d3f/d5b/d60/d95/de7/f123 x:0 0 0 2026-03-09T16:15:31.937 INFO:tasks.workunit.client.1.vm05.stdout:9/982: link d4/d10/d35/d36/d48/d54/l7b d4/d10/d35/d2b/d31/dc8/l13d 0 2026-03-09T16:15:31.940 INFO:tasks.workunit.client.1.vm05.stdout:6/935: dread d17/d22/d27/d34/d4b/f98 [0,4194304] 0 2026-03-09T16:15:31.940 INFO:tasks.workunit.client.1.vm05.stdout:0/908: creat d5/d2c/d49/d83/d8b/daf/de8/f133 x:0 0 0 2026-03-09T16:15:31.941 INFO:tasks.workunit.client.1.vm05.stdout:9/983: dwrite d4/d10/d35/d2b/d38/fa0 [0,4194304] 0 2026-03-09T16:15:31.942 INFO:tasks.workunit.client.1.vm05.stdout:0/909: write d5/d1b/d3b/f3c [2983373,48629] 0 2026-03-09T16:15:31.944 INFO:tasks.workunit.client.1.vm05.stdout:3/925: rmdir d0/d33 39 2026-03-09T16:15:31.962 INFO:tasks.workunit.client.1.vm05.stdout:6/936: read d17/f18 [589675,74609] 0 2026-03-09T16:15:31.972 INFO:tasks.workunit.client.1.vm05.stdout:2/904: rename db/dd/d15/d46/d67/f10a to db/dd/d15/d3f/d5b/f124 0 2026-03-09T16:15:31.974 INFO:tasks.workunit.client.1.vm05.stdout:6/937: mkdir d17/d5d/d73/d15d 0 2026-03-09T16:15:31.977 INFO:tasks.workunit.client.1.vm05.stdout:0/910: creat d5/db/d5b/f134 x:0 0 0 2026-03-09T16:15:31.979 INFO:tasks.workunit.client.1.vm05.stdout:9/984: rmdir d4/d10/d35/d36/d48/d54/db0/d12a 0 2026-03-09T16:15:31.983 INFO:tasks.workunit.client.1.vm05.stdout:3/926: rename d0/d9/d22/d5f/d75/d76/d88/da3/df7/d80/cc3 to d0/d9/d22/d5f/d90/dae/dd2/d101/d134/c135 0 2026-03-09T16:15:31.984 INFO:tasks.workunit.client.1.vm05.stdout:0/911: dwrite d5/d2c/f84 [0,4194304] 0 2026-03-09T16:15:31.984 INFO:tasks.workunit.client.1.vm05.stdout:6/938: dread d17/d22/d9d/fe8 [0,4194304] 0 2026-03-09T16:15:31.988 INFO:tasks.workunit.client.1.vm05.stdout:6/939: chown d17/d22/d9d/da5/d122/f142 200720713 1 2026-03-09T16:15:31.994 INFO:tasks.workunit.client.1.vm05.stdout:2/905: mkdir db/dd/d15/d1f/dc1/d11a/d125 0 2026-03-09T16:15:31.994 INFO:tasks.workunit.client.1.vm05.stdout:3/927: fdatasync d0/d9/d22/d5f/d7b/d99/f9d 0 2026-03-09T16:15:31.998 INFO:tasks.workunit.client.1.vm05.stdout:0/912: dwrite d5/d1b/f61 [0,4194304] 0 2026-03-09T16:15:31.998 INFO:tasks.workunit.client.1.vm05.stdout:3/928: dread d0/d9/d22/d5f/d7b/f9a [4194304,4194304] 0 2026-03-09T16:15:32.007 INFO:tasks.workunit.client.1.vm05.stdout:2/906: rename db/dd/d15/d3f/d5b/d60/d95/ddd/ce6 to db/dd/d15/d1f/dc1/d11a/c126 0 2026-03-09T16:15:32.013 INFO:tasks.workunit.client.1.vm05.stdout:6/940: creat d17/d5d/d73/d15d/f15e x:0 0 0 2026-03-09T16:15:32.015 INFO:tasks.workunit.client.1.vm05.stdout:0/913: unlink d5/d11/d4f/f81 0 
2026-03-09T16:15:32.016 INFO:tasks.workunit.client.1.vm05.stdout:0/914: stat d5/d11/d4f/d70/c88 0 2026-03-09T16:15:32.025 INFO:tasks.workunit.client.1.vm05.stdout:2/907: rename db/dd/d15/d3f/d5b/d60/d95/d109/faa to db/dd/d15/d46/df3/f127 0 2026-03-09T16:15:32.031 INFO:tasks.workunit.client.1.vm05.stdout:6/941: symlink d17/d22/d27/d58/db8/l15f 0 2026-03-09T16:15:32.034 INFO:tasks.workunit.client.1.vm05.stdout:3/929: creat d0/d9/d22/d5f/d75/d76/d88/da3/df7/d123/f136 x:0 0 0 2026-03-09T16:15:32.036 INFO:tasks.workunit.client.1.vm05.stdout:3/930: write d0/d9/d22/d5f/d75/d76/d88/d89/f112 [999842,38756] 0 2026-03-09T16:15:32.036 INFO:tasks.workunit.client.1.vm05.stdout:0/915: creat d5/db/d48/d66/f135 x:0 0 0 2026-03-09T16:15:32.038 INFO:tasks.workunit.client.1.vm05.stdout:9/985: getdents d4/d10/dd7 0 2026-03-09T16:15:32.042 INFO:tasks.workunit.client.1.vm05.stdout:6/942: unlink d17/d22/d27/d8a/fa7 0 2026-03-09T16:15:32.045 INFO:tasks.workunit.client.1.vm05.stdout:9/986: dwrite d4/d10/d35/d36/d48/d54/db0/f110 [0,4194304] 0 2026-03-09T16:15:32.045 INFO:tasks.workunit.client.1.vm05.stdout:3/931: creat d0/d9/d97/dc2/f137 x:0 0 0 2026-03-09T16:15:32.052 INFO:tasks.workunit.client.1.vm05.stdout:0/916: dread d5/db/def/df2/f112 [0,4194304] 0 2026-03-09T16:15:32.055 INFO:tasks.workunit.client.1.vm05.stdout:3/932: write d0/d33/f77 [4649159,120735] 0 2026-03-09T16:15:32.055 INFO:tasks.workunit.client.1.vm05.stdout:3/933: stat d0/d9/d8b 0 2026-03-09T16:15:32.063 INFO:tasks.workunit.client.1.vm05.stdout:2/908: dread db/dd/d15/d46/d67/f73 [0,4194304] 0 2026-03-09T16:15:32.070 INFO:tasks.workunit.client.1.vm05.stdout:3/934: symlink d0/d33/l138 0 2026-03-09T16:15:32.070 INFO:tasks.workunit.client.1.vm05.stdout:3/935: dread - d0/d9/d22/d5f/f11c zero size 2026-03-09T16:15:32.084 INFO:tasks.workunit.client.1.vm05.stdout:2/909: symlink db/dd/d15/d1f/dc1/d11a/d125/l128 0 2026-03-09T16:15:32.087 INFO:tasks.workunit.client.1.vm05.stdout:2/910: creat db/dd/d15/d3f/d5b/d60/d6a/f129 x:0 0 0 2026-03-09T16:15:32.090 INFO:tasks.workunit.client.1.vm05.stdout:3/936: sync 2026-03-09T16:15:32.094 INFO:tasks.workunit.client.1.vm05.stdout:2/911: stat db/dd/d15/d3f/d5b/d60/cda 0 2026-03-09T16:15:32.099 INFO:tasks.workunit.client.1.vm05.stdout:3/937: write d0/f60 [984555,107142] 0 2026-03-09T16:15:32.100 INFO:tasks.workunit.client.1.vm05.stdout:3/938: stat d0/da9/fe3 0 2026-03-09T16:15:32.101 INFO:tasks.workunit.client.1.vm05.stdout:3/939: read d0/d9/d22/d5f/d75/d76/d88/d89/f12d [2383657,130469] 0 2026-03-09T16:15:32.109 INFO:tasks.workunit.client.1.vm05.stdout:7/999: write d1/d2/d11/d86/da2/fb0 [126661,121495] 0 2026-03-09T16:15:32.114 INFO:tasks.workunit.client.1.vm05.stdout:3/940: symlink d0/d9/d22/d12c/l139 0 2026-03-09T16:15:32.116 INFO:tasks.workunit.client.1.vm05.stdout:2/912: dread db/fc6 [0,4194304] 0 2026-03-09T16:15:32.118 INFO:tasks.workunit.client.1.vm05.stdout:3/941: write d0/d9/d22/d5f/d75/d76/d88/d89/f12d [1135780,52545] 0 2026-03-09T16:15:32.121 INFO:tasks.workunit.client.1.vm05.stdout:2/913: write db/dd/d15/d46/f4e [8647155,5499] 0 2026-03-09T16:15:32.124 INFO:tasks.workunit.client.1.vm05.stdout:3/942: symlink d0/d9/d8b/l13a 0 2026-03-09T16:15:32.127 INFO:tasks.workunit.client.1.vm05.stdout:2/914: dread - db/dd/d15/d3f/d5b/d60/da2/fa9 zero size 2026-03-09T16:15:32.129 INFO:tasks.workunit.client.1.vm05.stdout:2/915: creat db/dd/d15/d1f/d21/f12a x:0 0 0 2026-03-09T16:15:32.129 INFO:tasks.workunit.client.1.vm05.stdout:2/916: readlink db/dd/d15/d1f/d20/d23/l37 0 2026-03-09T16:15:32.131 
INFO:tasks.workunit.client.1.vm05.stdout:6/943: write d17/d22/d27/d34/d42/d65/f75 [2538610,5610] 0 2026-03-09T16:15:32.131 INFO:tasks.workunit.client.1.vm05.stdout:9/987: write d4/d10/d35/d36/d48/d60/dcb/f132 [4099062,69002] 0 2026-03-09T16:15:32.138 INFO:tasks.workunit.client.1.vm05.stdout:3/943: getdents d0/d9/d22/d5f/d7b 0 2026-03-09T16:15:32.139 INFO:tasks.workunit.client.1.vm05.stdout:3/944: dread - d0/d9/d22/d6b/f127 zero size 2026-03-09T16:15:32.139 INFO:tasks.workunit.client.1.vm05.stdout:0/917: write d5/f76 [3741270,37470] 0 2026-03-09T16:15:32.143 INFO:tasks.workunit.client.1.vm05.stdout:9/988: mknod d4/d10/d35/d2b/dc1/dc2/d10f/c13e 0 2026-03-09T16:15:32.145 INFO:tasks.workunit.client.1.vm05.stdout:6/944: truncate d17/d5d/d73/d83/fe6 45877 0 2026-03-09T16:15:32.149 INFO:tasks.workunit.client.1.vm05.stdout:9/989: creat d4/d10/d35/d2b/dc1/f13f x:0 0 0 2026-03-09T16:15:32.150 INFO:tasks.workunit.client.1.vm05.stdout:2/917: dread db/dd/d15/d1f/f2b [0,4194304] 0 2026-03-09T16:15:32.151 INFO:tasks.workunit.client.1.vm05.stdout:6/945: symlink d17/d22/d27/d15a/l160 0 2026-03-09T16:15:32.152 INFO:tasks.workunit.client.1.vm05.stdout:0/918: getdents d5/db/d77/d12a 0 2026-03-09T16:15:32.154 INFO:tasks.workunit.client.1.vm05.stdout:9/990: dwrite d4/d10/d35/d2b/dc1/dc2/d10f/f11f [0,4194304] 0 2026-03-09T16:15:32.156 INFO:tasks.workunit.client.1.vm05.stdout:9/991: write d4/d10/d35/d36/d48/d60/dae/fcf [4006474,12175] 0 2026-03-09T16:15:32.156 INFO:tasks.workunit.client.1.vm05.stdout:9/992: fdatasync d4/f2e 0 2026-03-09T16:15:32.157 INFO:tasks.workunit.client.1.vm05.stdout:9/993: write d4/d10/faa [4949177,53965] 0 2026-03-09T16:15:32.170 INFO:tasks.workunit.client.1.vm05.stdout:6/946: dread d17/d22/d9d/da5/fcc [0,4194304] 0 2026-03-09T16:15:32.175 INFO:tasks.workunit.client.1.vm05.stdout:3/945: link d0/c62 d0/d9/d22/d5f/d7b/da8/dd8/c13b 0 2026-03-09T16:15:32.175 INFO:tasks.workunit.client.1.vm05.stdout:9/994: rmdir d4/d10/d35/d2b/d31/d82/df8 39 2026-03-09T16:15:32.176 INFO:tasks.workunit.client.1.vm05.stdout:3/946: stat d0/d9/d22/d5f/d90/dae/dd2/d101 0 2026-03-09T16:15:32.177 INFO:tasks.workunit.client.1.vm05.stdout:6/947: creat d17/d22/d9d/da5/f161 x:0 0 0 2026-03-09T16:15:32.178 INFO:tasks.workunit.client.1.vm05.stdout:3/947: dread d0/f60 [0,4194304] 0 2026-03-09T16:15:32.184 INFO:tasks.workunit.client.1.vm05.stdout:6/948: dread d17/f2d [0,4194304] 0 2026-03-09T16:15:32.185 INFO:tasks.workunit.client.1.vm05.stdout:3/948: sync 2026-03-09T16:15:32.186 INFO:tasks.workunit.client.1.vm05.stdout:2/918: getdents db/dd/d15 0 2026-03-09T16:15:32.191 INFO:tasks.workunit.client.1.vm05.stdout:0/919: link d5/d109/f118 d5/db/d48/f136 0 2026-03-09T16:15:32.194 INFO:tasks.workunit.client.1.vm05.stdout:6/949: creat d17/d22/d27/d15a/f162 x:0 0 0 2026-03-09T16:15:32.204 INFO:tasks.workunit.client.1.vm05.stdout:2/919: write db/dd/d98/fe1 [4146842,103219] 0 2026-03-09T16:15:32.206 INFO:tasks.workunit.client.1.vm05.stdout:2/920: chown db/ffa 2961581 1 2026-03-09T16:15:32.222 INFO:tasks.workunit.client.1.vm05.stdout:9/995: creat d4/d10/d35/d36/f140 x:0 0 0 2026-03-09T16:15:32.222 INFO:tasks.workunit.client.1.vm05.stdout:0/920: mkdir d5/d1b/d137 0 2026-03-09T16:15:32.225 INFO:tasks.workunit.client.1.vm05.stdout:6/950: creat d17/d22/d27/d34/d42/d53/d87/df6/f163 x:0 0 0 2026-03-09T16:15:32.225 INFO:tasks.workunit.client.1.vm05.stdout:3/949: dwrite d0/d9/f2c [0,4194304] 0 2026-03-09T16:15:32.239 INFO:tasks.workunit.client.1.vm05.stdout:2/921: mkdir db/dd/d15/d4c/d12b 0 2026-03-09T16:15:32.241 
INFO:tasks.workunit.client.1.vm05.stdout:0/921: mknod d5/db/d48/d66/c138 0 2026-03-09T16:15:32.242 INFO:tasks.workunit.client.1.vm05.stdout:9/996: creat d4/d10/d35/d2b/d38/d65/f141 x:0 0 0 2026-03-09T16:15:32.250 INFO:tasks.workunit.client.1.vm05.stdout:3/950: mkdir d0/d9/d22/d5f/d90/dae/dd2/d101/d13c 0 2026-03-09T16:15:32.265 INFO:tasks.workunit.client.1.vm05.stdout:9/997: dread d4/d10/d35/d36/fce [0,4194304] 0 2026-03-09T16:15:32.266 INFO:tasks.workunit.client.1.vm05.stdout:3/951: mkdir d0/d9/d22/d5f/d90/dae/dd2/d101/d13d 0 2026-03-09T16:15:32.266 INFO:tasks.workunit.client.1.vm05.stdout:9/998: write d4/d10c/f124 [722958,41360] 0 2026-03-09T16:15:32.267 INFO:tasks.workunit.client.1.vm05.stdout:3/952: write d0/d9/d22/f54 [3906534,27513] 0 2026-03-09T16:15:32.270 INFO:tasks.workunit.client.1.vm05.stdout:0/922: mkdir d5/db/def/df2/d103/d139 0 2026-03-09T16:15:32.270 INFO:tasks.workunit.client.1.vm05.stdout:0/923: chown d5/db/d48/dc3 5350 1 2026-03-09T16:15:32.276 INFO:tasks.workunit.client.1.vm05.stdout:6/951: creat d17/d5d/d73/f164 x:0 0 0 2026-03-09T16:15:32.277 INFO:tasks.workunit.client.1.vm05.stdout:9/999: unlink d4/d10/d35/d36/d48/d60/f6c 0 2026-03-09T16:15:32.298 INFO:tasks.workunit.client.1.vm05.stdout:0/924: creat d5/db/def/f13a x:0 0 0 2026-03-09T16:15:32.299 INFO:tasks.workunit.client.1.vm05.stdout:6/952: creat d17/d22/d27/d34/d42/d65/f165 x:0 0 0 2026-03-09T16:15:32.306 INFO:tasks.workunit.client.1.vm05.stdout:6/953: chown d17/d22/d27/d58/f72 7880511 1 2026-03-09T16:15:32.306 INFO:tasks.workunit.client.1.vm05.stdout:2/922: getdents db/dd/d15/d3f/d5b/d60/d95 0 2026-03-09T16:15:32.311 INFO:tasks.workunit.client.1.vm05.stdout:3/953: dread d0/d9/d97/fd4 [0,4194304] 0 2026-03-09T16:15:32.339 INFO:tasks.workunit.client.1.vm05.stdout:3/954: rename d0/d9/d22/f5b to d0/da9/f13e 0 2026-03-09T16:15:32.339 INFO:tasks.workunit.client.1.vm05.stdout:0/925: mknod d5/db/d77/df3/d11c/c13b 0 2026-03-09T16:15:32.341 INFO:tasks.workunit.client.1.vm05.stdout:3/955: creat d0/d9/d22/d5f/d75/d76/d88/da3/df7/d123/f13f x:0 0 0 2026-03-09T16:15:32.342 INFO:tasks.workunit.client.1.vm05.stdout:0/926: mknod d5/db/d5f/deb/c13c 0 2026-03-09T16:15:32.343 INFO:tasks.workunit.client.1.vm05.stdout:0/927: chown d5/f17 1 1 2026-03-09T16:15:32.349 INFO:tasks.workunit.client.1.vm05.stdout:3/956: getdents d0/d9/d22/d5f/d90 0 2026-03-09T16:15:32.377 INFO:tasks.workunit.client.1.vm05.stdout:2/923: write db/dd/d7b/f102 [332790,66816] 0 2026-03-09T16:15:32.385 INFO:tasks.workunit.client.1.vm05.stdout:2/924: symlink db/dd/d15/d4c/d56/l12c 0 2026-03-09T16:15:32.389 INFO:tasks.workunit.client.1.vm05.stdout:6/954: dwrite d17/d5d/d73/d83/fe6 [0,4194304] 0 2026-03-09T16:15:32.397 INFO:tasks.workunit.client.1.vm05.stdout:0/928: write d5/db/d5b/d82/fcc [860589,25021] 0 2026-03-09T16:15:32.397 INFO:tasks.workunit.client.1.vm05.stdout:3/957: write d0/d9/d22/d5f/fa7 [72101,48453] 0 2026-03-09T16:15:32.398 INFO:tasks.workunit.client.1.vm05.stdout:6/955: creat d17/d22/dce/d11d/f166 x:0 0 0 2026-03-09T16:15:32.399 INFO:tasks.workunit.client.1.vm05.stdout:6/956: readlink d17/d5d/d73/d83/l108 0 2026-03-09T16:15:32.399 INFO:tasks.workunit.client.1.vm05.stdout:3/958: chown d0/d9/d22/d5f/d75/d76/d88/da3/df7/d80/fbf 1 1 2026-03-09T16:15:32.400 INFO:tasks.workunit.client.1.vm05.stdout:0/929: creat d5/db/d5f/da3/f13d x:0 0 0 2026-03-09T16:15:32.401 INFO:tasks.workunit.client.1.vm05.stdout:0/930: readlink d5/d1b/l2b 0 2026-03-09T16:15:32.401 INFO:tasks.workunit.client.1.vm05.stdout:0/931: stat d5/db/d5f/deb/c13c 0 2026-03-09T16:15:32.401 
INFO:tasks.workunit.client.1.vm05.stdout:6/957: mknod d17/d22/d9d/da5/d122/c167 0 2026-03-09T16:15:32.405 INFO:tasks.workunit.client.1.vm05.stdout:3/959: creat d0/d9/d22/d6b/f140 x:0 0 0 2026-03-09T16:15:32.407 INFO:tasks.workunit.client.1.vm05.stdout:0/932: mkdir d5/db/d5f/d13e 0 2026-03-09T16:15:32.411 INFO:tasks.workunit.client.1.vm05.stdout:6/958: link d17/d22/d9d/da9/d157/l134 d17/d22/d27/d34/d42/d53/l168 0 2026-03-09T16:15:32.411 INFO:tasks.workunit.client.1.vm05.stdout:6/959: stat d17/ff9 0 2026-03-09T16:15:32.417 INFO:tasks.workunit.client.1.vm05.stdout:6/960: unlink d17/d22/d27/d34/f66 0 2026-03-09T16:15:32.419 INFO:tasks.workunit.client.1.vm05.stdout:2/925: dwrite db/dd/d15/d3f/f5c [0,4194304] 0 2026-03-09T16:15:32.431 INFO:tasks.workunit.client.1.vm05.stdout:2/926: dwrite db/dd/d15/d3f/d5b/d60/da2/fde [0,4194304] 0 2026-03-09T16:15:32.434 INFO:tasks.workunit.client.1.vm05.stdout:6/961: readlink d17/l39 0 2026-03-09T16:15:32.435 INFO:tasks.workunit.client.1.vm05.stdout:3/960: getdents d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/db3 0 2026-03-09T16:15:32.436 INFO:tasks.workunit.client.1.vm05.stdout:3/961: stat d0/d9/d22/d5f/d90 0 2026-03-09T16:15:32.436 INFO:tasks.workunit.client.1.vm05.stdout:0/933: getdents d5 0 2026-03-09T16:15:32.449 INFO:tasks.workunit.client.1.vm05.stdout:2/927: sync 2026-03-09T16:15:32.454 INFO:tasks.workunit.client.1.vm05.stdout:6/962: rename d17/d4f/fbd to d17/d22/d27/df8/d112/d143/f169 0 2026-03-09T16:15:32.455 INFO:tasks.workunit.client.1.vm05.stdout:0/934: symlink d5/db/d77/df3/l13f 0 2026-03-09T16:15:32.456 INFO:tasks.workunit.client.1.vm05.stdout:0/935: stat d5/db/d48 0 2026-03-09T16:15:32.468 INFO:tasks.workunit.client.1.vm05.stdout:2/928: rmdir db/dd/d15/d1f/d21 39 2026-03-09T16:15:32.469 INFO:tasks.workunit.client.1.vm05.stdout:6/963: mkdir d17/d22/d27/d34/d42/d53/d87/df6/d16a 0 2026-03-09T16:15:32.470 INFO:tasks.workunit.client.1.vm05.stdout:0/936: symlink d5/db/d48/dc3/l140 0 2026-03-09T16:15:32.471 INFO:tasks.workunit.client.1.vm05.stdout:3/962: link d0/d9/d22/d5f/d7b/da8/f109 d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/db3/f141 0 2026-03-09T16:15:32.473 INFO:tasks.workunit.client.1.vm05.stdout:6/964: rename d17/d22/d27/d34/d4b/d7f/fc8 to d17/d22/d27/d44/f16b 0 2026-03-09T16:15:32.475 INFO:tasks.workunit.client.1.vm05.stdout:3/963: creat d0/d9/d22/d5f/d75/d76/d88/da3/df7/d4e/db3/f142 x:0 0 0 2026-03-09T16:15:32.476 INFO:tasks.workunit.client.1.vm05.stdout:6/965: creat d17/d22/d27/d34/d42/d53/d87/d104/f16c x:0 0 0 2026-03-09T16:15:32.477 INFO:tasks.workunit.client.1.vm05.stdout:6/966: readlink d17/d22/d27/d44/l5e 0 2026-03-09T16:15:32.480 INFO:tasks.workunit.client.1.vm05.stdout:2/929: getdents db/dd/d15/d1f/d20/d23 0 2026-03-09T16:15:32.481 INFO:tasks.workunit.client.1.vm05.stdout:3/964: symlink d0/d9/l143 0 2026-03-09T16:15:32.484 INFO:tasks.workunit.client.1.vm05.stdout:2/930: creat db/dd/d15/d3f/d5b/d60/d95/ddd/f12d x:0 0 0 2026-03-09T16:15:32.485 INFO:tasks.workunit.client.1.vm05.stdout:3/965: truncate d0/d9/d22/d5f/d90/fde 1005319 0 2026-03-09T16:15:32.490 INFO:tasks.workunit.client.1.vm05.stdout:6/967: link d17/d22/d27/d58/db8/c107 d17/d22/d27/d34/d42/d53/c16d 0 2026-03-09T16:15:32.494 INFO:tasks.workunit.client.1.vm05.stdout:2/931: getdents db/dd/d15/d46/df3 0 2026-03-09T16:15:32.494 INFO:tasks.workunit.client.1.vm05.stdout:6/968: write d17/d22/d9d/f140 [224098,29661] 0 2026-03-09T16:15:32.499 INFO:tasks.workunit.client.1.vm05.stdout:0/937: dwrite d5/d11/f9f [4194304,4194304] 0 2026-03-09T16:15:32.502 INFO:tasks.workunit.client.1.vm05.stdout:6/969: 
truncate d17/d22/d9d/f140 632823 0 2026-03-09T16:15:32.502 INFO:tasks.workunit.client.1.vm05.stdout:3/966: dread d0/d33/f36 [0,4194304] 0 2026-03-09T16:15:32.509 INFO:tasks.workunit.client.1.vm05.stdout:0/938: dread d5/d11/d4f/d70/fbf [0,4194304] 0 2026-03-09T16:15:32.509 INFO:tasks.workunit.client.1.vm05.stdout:2/932: getdents db/dd/d15/d3f/d5b/d60/d95/de7/d10f 0 2026-03-09T16:15:32.513 INFO:tasks.workunit.client.1.vm05.stdout:2/933: chown db/dd/d15/d3f/d5b/d60/f119 353 1 2026-03-09T16:15:32.514 INFO:tasks.workunit.client.1.vm05.stdout:2/934: fsync db/dd/d15/d3f/d5b/f69 0 2026-03-09T16:15:32.517 INFO:tasks.workunit.client.1.vm05.stdout:2/935: dwrite db/dd/f107 [0,4194304] 0 2026-03-09T16:15:32.523 INFO:tasks.workunit.client.1.vm05.stdout:3/967: write d0/d9/d97/fd4 [4387195,85969] 0 2026-03-09T16:15:32.529 INFO:tasks.workunit.client.1.vm05.stdout:6/970: unlink d17/d22/d27/d34/d42/d53/f11c 0 2026-03-09T16:15:32.530 INFO:tasks.workunit.client.1.vm05.stdout:6/971: write d17/d22/dce/fdf [4077340,2524] 0 2026-03-09T16:15:32.540 INFO:tasks.workunit.client.1.vm05.stdout:2/936: mkdir db/dd/d98/d12e 0 2026-03-09T16:15:32.541 INFO:tasks.workunit.client.1.vm05.stdout:3/968: creat d0/d9/d22/d5f/d75/d76/d88/da3/df7/d123/f144 x:0 0 0 2026-03-09T16:15:32.542 INFO:tasks.workunit.client.1.vm05.stdout:6/972: fdatasync d17/d22/d27/d34/d42/d53/f69 0 2026-03-09T16:15:32.545 INFO:tasks.workunit.client.1.vm05.stdout:6/973: symlink d17/d5d/d73/d83/d145/l16e 0 2026-03-09T16:15:32.547 INFO:tasks.workunit.client.1.vm05.stdout:2/937: truncate db/dd/d15/d1f/d21/d87/fbe 657995 0 2026-03-09T16:15:32.548 INFO:tasks.workunit.client.1.vm05.stdout:6/974: creat d17/d22/d27/d34/d42/d53/d87/d104/f16f x:0 0 0 2026-03-09T16:15:32.552 INFO:tasks.workunit.client.1.vm05.stdout:6/975: symlink d17/d22/d27/d34/d42/d65/d117/d158/l170 0 2026-03-09T16:15:32.555 INFO:tasks.workunit.client.1.vm05.stdout:0/939: write d5/d1b/d3b/dc2/fe1 [959489,128250] 0 2026-03-09T16:15:32.556 INFO:tasks.workunit.client.1.vm05.stdout:0/940: write d5/d11/f40 [2846232,30401] 0 2026-03-09T16:15:32.556 INFO:tasks.workunit.client.1.vm05.stdout:3/969: write d0/d9/d22/f2e [3111804,100658] 0 2026-03-09T16:15:32.558 INFO:tasks.workunit.client.1.vm05.stdout:2/938: write db/dd/d15/d3f/d5b/d60/f7c [2779466,23702] 0 2026-03-09T16:15:32.561 INFO:tasks.workunit.client.1.vm05.stdout:6/976: creat d17/d22/d27/d44/d125/f171 x:0 0 0 2026-03-09T16:15:32.565 INFO:tasks.workunit.client.1.vm05.stdout:3/970: dwrite d0/d9/d97/dac/f129 [0,4194304] 0 2026-03-09T16:15:32.569 INFO:tasks.workunit.client.1.vm05.stdout:0/941: truncate d5/db/d48/d66/f91 1817315 0 2026-03-09T16:15:32.583 INFO:tasks.workunit.client.1.vm05.stdout:2/939: rename db/dd/d15/d4c/d12b to db/dd/d15/d3f/d5b/d60/d12f 0 2026-03-09T16:15:32.594 INFO:tasks.workunit.client.1.vm05.stdout:0/942: rmdir d5/d2c/d49 39 2026-03-09T16:15:32.594 INFO:tasks.workunit.client.1.vm05.stdout:0/943: chown d5/db/def 24524 1 2026-03-09T16:15:32.597 INFO:tasks.workunit.client.1.vm05.stdout:2/940: sync 2026-03-09T16:15:32.599 INFO:tasks.workunit.client.1.vm05.stdout:0/944: symlink d5/d1b/d3b/dc2/l141 0 2026-03-09T16:15:32.601 INFO:tasks.workunit.client.1.vm05.stdout:6/977: write d17/f30 [1905672,33350] 0 2026-03-09T16:15:32.604 INFO:tasks.workunit.client.1.vm05.stdout:2/941: mkdir db/dd/d15/d3f/d130 0 2026-03-09T16:15:32.604 INFO:tasks.workunit.client.1.vm05.stdout:6/978: truncate d17/d22/d27/d34/d42/d53/d87/df6/f14b 1498396 0 2026-03-09T16:15:32.606 INFO:tasks.workunit.client.1.vm05.stdout:6/979: write d17/d22/d27/f6b [4020882,130169] 0 
2026-03-09T16:15:32.607 INFO:tasks.workunit.client.1.vm05.stdout:3/971: creat d0/d9/d97/dad/f145 x:0 0 0 2026-03-09T16:15:32.608 INFO:tasks.workunit.client.1.vm05.stdout:3/972: sync 2026-03-09T16:15:32.608 INFO:tasks.workunit.client.1.vm05.stdout:0/945: dread d5/d11/d4f/d11b/fbe [0,4194304] 0 2026-03-09T16:15:32.616 INFO:tasks.workunit.client.1.vm05.stdout:2/942: fdatasync db/dd/d15/d3f/f75 0 2026-03-09T16:15:32.618 INFO:tasks.workunit.client.1.vm05.stdout:2/943: write db/dd/d15/f48 [883258,68323] 0 2026-03-09T16:15:32.622 INFO:tasks.workunit.client.1.vm05.stdout:3/973: mknod d0/d9/d22/d5f/d7b/d99/c146 0 2026-03-09T16:15:32.623 INFO:tasks.workunit.client.1.vm05.stdout:3/974: write d0/d9/d22/d5f/d75/f100 [564489,53172] 0 2026-03-09T16:15:32.623 INFO:tasks.workunit.client.1.vm05.stdout:6/980: dread d17/d22/d9d/fb2 [0,4194304] 0 2026-03-09T16:15:32.626 INFO:tasks.workunit.client.1.vm05.stdout:3/975: write d0/d9/d22/d5f/d75/d76/d88/f9c [4619830,29732] 0 2026-03-09T16:15:32.628 INFO:tasks.workunit.client.1.vm05.stdout:3/976: write d0/d9/d22/d5f/d75/d76/d88/da3/df7/d123/f13f [97972,14416] 0 2026-03-09T16:15:32.630 INFO:tasks.workunit.client.1.vm05.stdout:3/977: chown d0/d9/d22/d5f/d75/d76/d88/da3/df7/d80/l8a 7 1 2026-03-09T16:15:32.633 INFO:tasks.workunit.client.1.vm05.stdout:6/981: dwrite d17/d22/d27/d34/d42/d53/d87/df6/f163 [0,4194304] 0 2026-03-09T16:15:32.636 INFO:tasks.workunit.client.1.vm05.stdout:3/978: truncate d0/d9/d22/d5f/d7b/da8/f109 612733 0 2026-03-09T16:15:32.640 INFO:tasks.workunit.client.1.vm05.stdout:6/982: sync 2026-03-09T16:15:32.641 INFO:tasks.workunit.client.1.vm05.stdout:2/944: write db/dd/d15/d1f/d20/d23/f7a [103276,58763] 0 2026-03-09T16:15:32.645 INFO:tasks.workunit.client.1.vm05.stdout:6/983: mkdir d17/d4f/d172 0 2026-03-09T16:15:32.646 INFO:tasks.workunit.client.1.vm05.stdout:6/984: truncate d17/d22/d9d/fca 355285 0 2026-03-09T16:15:32.646 INFO:tasks.workunit.client.1.vm05.stdout:3/979: rmdir d0/d9/d22/d5f/d90 39 2026-03-09T16:15:32.648 INFO:tasks.workunit.client.1.vm05.stdout:0/946: getdents d5/db/d5b/d82 0 2026-03-09T16:15:32.654 INFO:tasks.workunit.client.1.vm05.stdout:2/945: mkdir db/dd/d131 0 2026-03-09T16:15:32.655 INFO:tasks.workunit.client.1.vm05.stdout:3/980: sync 2026-03-09T16:15:32.655 INFO:tasks.workunit.client.1.vm05.stdout:6/985: dread d17/d22/d27/d34/d42/d53/f90 [0,4194304] 0 2026-03-09T16:15:32.656 INFO:tasks.workunit.client.1.vm05.stdout:2/946: write db/dd/d15/d4c/d56/f10c [3762170,56538] 0 2026-03-09T16:15:32.656 INFO:tasks.workunit.client.1.vm05.stdout:0/947: stat d5/db/ff7 0 2026-03-09T16:15:32.663 INFO:tasks.workunit.client.1.vm05.stdout:3/981: mkdir d0/d9/d22/d5f/d7b/da8/dd8/d147 0 2026-03-09T16:15:32.663 INFO:tasks.workunit.client.1.vm05.stdout:3/982: chown d0/c106 51 1 2026-03-09T16:15:32.665 INFO:tasks.workunit.client.1.vm05.stdout:2/947: creat db/dd/d15/d3f/d5b/d7e/f132 x:0 0 0 2026-03-09T16:15:32.667 INFO:tasks.workunit.client.1.vm05.stdout:3/983: dwrite d0/d9/d22/d5f/d75/d76/d88/d89/f112 [0,4194304] 0 2026-03-09T16:15:32.669 INFO:tasks.workunit.client.1.vm05.stdout:6/986: truncate d17/d4f/f70 1478466 0 2026-03-09T16:15:32.670 INFO:tasks.workunit.client.1.vm05.stdout:0/948: creat d5/d2c/d49/d83/f142 x:0 0 0 2026-03-09T16:15:32.675 INFO:tasks.workunit.client.1.vm05.stdout:3/984: dwrite d0/d9/d97/dc2/ffc [0,4194304] 0 2026-03-09T16:15:32.681 INFO:tasks.workunit.client.1.vm05.stdout:6/987: dwrite d17/d22/d27/d34/d42/d53/f136 [0,4194304] 0 2026-03-09T16:15:32.696 INFO:tasks.workunit.client.1.vm05.stdout:3/985: symlink d0/d33/l148 0 
2026-03-09T16:15:32.697 INFO:tasks.workunit.client.1.vm05.stdout:0/949: mknod d5/d1b/d3b/dfc/c143 0 2026-03-09T16:15:32.700 INFO:tasks.workunit.client.1.vm05.stdout:3/986: symlink d0/d9/d22/d5f/d75/d76/d88/da3/df7/d80/l149 0 2026-03-09T16:15:32.701 INFO:tasks.workunit.client.1.vm05.stdout:0/950: rename d5/d11/d4f/c8e to d5/d2c/d49/d83/d8b/dd5/c144 0 2026-03-09T16:15:32.704 INFO:tasks.workunit.client.1.vm05.stdout:3/987: mknod d0/da9/c14a 0 2026-03-09T16:15:32.704 INFO:tasks.workunit.client.1.vm05.stdout:0/951: creat d5/db/def/f145 x:0 0 0 2026-03-09T16:15:32.705 INFO:tasks.workunit.client.1.vm05.stdout:0/952: read - d5/d97/f127 zero size 2026-03-09T16:15:32.705 INFO:tasks.workunit.client.1.vm05.stdout:3/988: write d0/d9/d22/d5f/d75/f117 [1666079,128165] 0 2026-03-09T16:15:32.706 INFO:tasks.workunit.client.1.vm05.stdout:0/953: stat d5/db/d5f/la1 0 2026-03-09T16:15:32.708 INFO:tasks.workunit.client.1.vm05.stdout:3/989: unlink d0/d33/l138 0 2026-03-09T16:15:32.710 INFO:tasks.workunit.client.1.vm05.stdout:3/990: truncate d0/d9/d97/fd4 5116477 0 2026-03-09T16:15:32.715 INFO:tasks.workunit.client.1.vm05.stdout:3/991: mknod d0/d9/d22/c14b 0 2026-03-09T16:15:32.715 INFO:tasks.workunit.client.1.vm05.stdout:3/992: symlink d0/d33/l14c 0 2026-03-09T16:15:32.717 INFO:tasks.workunit.client.1.vm05.stdout:3/993: rename d0/d9/d22/d5f/d7b/d99/f9d to d0/da9/f14d 0 2026-03-09T16:15:32.725 INFO:tasks.workunit.client.1.vm05.stdout:3/994: creat d0/d9/d22/d5f/d90/f14e x:0 0 0 2026-03-09T16:15:32.727 INFO:tasks.workunit.client.1.vm05.stdout:2/948: write db/f17 [1224280,96555] 0 2026-03-09T16:15:32.729 INFO:tasks.workunit.client.1.vm05.stdout:2/949: chown db/dd/f6b 8 1 2026-03-09T16:15:32.730 INFO:tasks.workunit.client.1.vm05.stdout:2/950: write db/dd/d15/d3f/d5b/d60/da2/f116 [316760,24048] 0 2026-03-09T16:15:32.730 INFO:tasks.workunit.client.1.vm05.stdout:6/988: write d17/f3b [979813,48352] 0 2026-03-09T16:15:32.733 INFO:tasks.workunit.client.1.vm05.stdout:2/951: write db/dd/d15/d3f/d5b/d60/f119 [161344,78877] 0 2026-03-09T16:15:32.741 INFO:tasks.workunit.client.1.vm05.stdout:2/952: readlink db/dd/d15/d1f/lc7 0 2026-03-09T16:15:32.743 INFO:tasks.workunit.client.1.vm05.stdout:0/954: write d5/d2c/dff/f6d [1266627,1830] 0 2026-03-09T16:15:32.745 INFO:tasks.workunit.client.1.vm05.stdout:3/995: write d0/d9/d22/d5f/d75/d76/fed [816375,58873] 0 2026-03-09T16:15:32.750 INFO:tasks.workunit.client.1.vm05.stdout:2/953: rename db/dd/d15/d46/df3/f127 to db/dd/d15/d3f/d5b/d60/d95/d109/f133 0 2026-03-09T16:15:32.750 INFO:tasks.workunit.client.1.vm05.stdout:6/989: dwrite d17/d1d/f33 [0,4194304] 0 2026-03-09T16:15:32.756 INFO:tasks.workunit.client.1.vm05.stdout:0/955: dwrite d5/d2c/dff/f59 [8388608,4194304] 0 2026-03-09T16:15:32.761 INFO:tasks.workunit.client.1.vm05.stdout:3/996: creat d0/d9/d22/d5f/d75/d76/d88/d10c/f14f x:0 0 0 2026-03-09T16:15:32.828 INFO:tasks.workunit.client.1.vm05.stdout:6/990: mkdir d17/d22/dce/d11d/d173 0 2026-03-09T16:15:32.828 INFO:tasks.workunit.client.1.vm05.stdout:6/991: stat d17/f31 0 2026-03-09T16:15:32.829 INFO:tasks.workunit.client.1.vm05.stdout:6/992: fdatasync d17/d22/d27/d58/f97 0 2026-03-09T16:15:32.831 INFO:tasks.workunit.client.1.vm05.stdout:0/956: symlink d5/db/d5f/d13e/l146 0 2026-03-09T16:15:32.832 INFO:tasks.workunit.client.1.vm05.stdout:0/957: write d5/d2c/dff/f2e [1186052,91984] 0 2026-03-09T16:15:32.835 INFO:tasks.workunit.client.1.vm05.stdout:6/993: creat d17/d4f/d172/f174 x:0 0 0 2026-03-09T16:15:32.841 INFO:tasks.workunit.client.1.vm05.stdout:3/997: write 
d0/d9/d22/d5f/d75/d76/d88/da3/df7/d80/fbf [225437,70678] 0 2026-03-09T16:15:32.843 INFO:tasks.workunit.client.1.vm05.stdout:0/958: write d5/d11/d4f/d70/fbf [500229,88363] 0 2026-03-09T16:15:32.844 INFO:tasks.workunit.client.1.vm05.stdout:3/998: stat d0/d9/d22/d5f/d75/d76/d88/da3/df7/d80 0 2026-03-09T16:15:32.846 INFO:tasks.workunit.client.1.vm05.stdout:0/959: unlink d5/d1b/d3b/dfc/c143 0 2026-03-09T16:15:32.846 INFO:tasks.workunit.client.1.vm05.stdout:2/954: dwrite db/dd/d15/d3f/d5b/d60/da2/fa9 [0,4194304] 0 2026-03-09T16:15:32.850 INFO:tasks.workunit.client.1.vm05.stdout:0/960: write d5/d2c/dff/f2e [2041774,52316] 0 2026-03-09T16:15:32.850 INFO:tasks.workunit.client.1.vm05.stdout:0/961: write d5/d2c/dff/f6d [1399599,35178] 0 2026-03-09T16:15:32.857 INFO:tasks.workunit.client.1.vm05.stdout:2/955: creat db/dd/d15/d1f/dc1/d11a/f134 x:0 0 0 2026-03-09T16:15:32.861 INFO:tasks.workunit.client.1.vm05.stdout:3/999: dread d0/d9/d22/d5f/d75/d76/d88/f9c [0,4194304] 0 2026-03-09T16:15:32.879 INFO:tasks.workunit.client.1.vm05.stdout:2/956: read - db/dd/d15/d1f/f9c zero size 2026-03-09T16:15:32.883 INFO:tasks.workunit.client.1.vm05.stdout:6/994: dwrite d17/d5d/fef [0,4194304] 0 2026-03-09T16:15:32.886 INFO:tasks.workunit.client.1.vm05.stdout:2/957: rename db/dd/d15/d1f/d21/d87/f99 to db/dd/d15/d1f/dc1/f135 0 2026-03-09T16:15:32.886 INFO:tasks.workunit.client.1.vm05.stdout:6/995: mknod d17/d22/d27/d34/d42/d65/c175 0 2026-03-09T16:15:32.886 INFO:tasks.workunit.client.1.vm05.stdout:6/996: read - d17/d22/dce/d11d/f166 zero size 2026-03-09T16:15:32.898 INFO:tasks.workunit.client.1.vm05.stdout:6/997: creat d17/d4f/d172/f176 x:0 0 0 2026-03-09T16:15:32.898 INFO:tasks.workunit.client.1.vm05.stdout:0/962: dwrite d5/d1b/fbb [0,4194304] 0 2026-03-09T16:15:32.906 INFO:tasks.workunit.client.1.vm05.stdout:2/958: link db/dd/d15/d46/d67/ld0 db/dd/d98/l136 0 2026-03-09T16:15:32.907 INFO:tasks.workunit.client.1.vm05.stdout:0/963: symlink d5/d11/d4f/d68/l147 0 2026-03-09T16:15:32.911 INFO:tasks.workunit.client.1.vm05.stdout:0/964: fsync d5/db/ff7 0 2026-03-09T16:15:32.914 INFO:tasks.workunit.client.1.vm05.stdout:2/959: link db/dd/d15/d1f/d20/d23/c26 db/dd/d15/d46/df3/c137 0 2026-03-09T16:15:32.917 INFO:tasks.workunit.client.1.vm05.stdout:2/960: mknod db/dd/d15/d3f/d5b/d60/d95/c138 0 2026-03-09T16:15:32.922 INFO:tasks.workunit.client.1.vm05.stdout:2/961: symlink db/dd/d15/d4c/l139 0 2026-03-09T16:15:32.924 INFO:tasks.workunit.client.1.vm05.stdout:2/962: symlink db/dd/d15/d3f/d5b/d60/d6a/l13a 0 2026-03-09T16:15:32.927 INFO:tasks.workunit.client.1.vm05.stdout:2/963: dwrite db/dd/d15/d3f/d5b/d60/d95/f122 [0,4194304] 0 2026-03-09T16:15:32.944 INFO:tasks.workunit.client.1.vm05.stdout:6/998: dwrite d17/d22/d9d/da5/fcc [0,4194304] 0 2026-03-09T16:15:32.946 INFO:tasks.workunit.client.1.vm05.stdout:0/965: dwrite d5/d2c/d49/d83/fc9 [0,4194304] 0 2026-03-09T16:15:32.971 INFO:tasks.workunit.client.1.vm05.stdout:6/999: symlink d17/d22/d27/d8a/d8b/l177 0 2026-03-09T16:15:32.977 INFO:tasks.workunit.client.1.vm05.stdout:0/966: symlink d5/db/d77/df3/l148 0 2026-03-09T16:15:32.980 INFO:tasks.workunit.client.1.vm05.stdout:2/964: rmdir db/dd/d15/d3f/d5b/d60/d95/de7/d10f 0 2026-03-09T16:15:32.981 INFO:tasks.workunit.client.1.vm05.stdout:2/965: chown db/dd/d15/d46/d67/c74 1195454 1 2026-03-09T16:15:32.992 INFO:tasks.workunit.client.1.vm05.stdout:0/967: mknod d5/d1b/c149 0 2026-03-09T16:15:32.992 INFO:tasks.workunit.client.1.vm05.stdout:2/966: getdents db/dd/d15/d3f/d5b/d60/d12f 0 2026-03-09T16:15:32.995 
INFO:tasks.workunit.client.1.vm05.stdout:0/968: symlink d5/d11/d4f/d70/l14a 0 2026-03-09T16:15:33.009 INFO:tasks.workunit.client.1.vm05.stdout:0/969: sync 2026-03-09T16:15:33.009 INFO:tasks.workunit.client.1.vm05.stdout:0/970: truncate d5/d11/f90 698482 0 2026-03-09T16:15:33.011 INFO:tasks.workunit.client.1.vm05.stdout:2/967: write db/dd/d15/d1f/d20/d23/f9b [3897695,27742] 0 2026-03-09T16:15:33.014 INFO:tasks.workunit.client.1.vm05.stdout:0/971: truncate d5/d97/fe2 889302 0 2026-03-09T16:15:33.015 INFO:tasks.workunit.client.1.vm05.stdout:2/968: chown db/dd/d15/d3f/d5b/d7e/cd6 0 1 2026-03-09T16:15:33.018 INFO:tasks.workunit.client.1.vm05.stdout:2/969: creat db/dd/d15/d3f/d5b/d60/d95/d109/d11e/f13b x:0 0 0 2026-03-09T16:15:33.021 INFO:tasks.workunit.client.1.vm05.stdout:2/970: creat db/dd/d15/d3f/d5b/d60/d95/dd7/f13c x:0 0 0 2026-03-09T16:15:33.022 INFO:tasks.workunit.client.1.vm05.stdout:2/971: write db/dd/d15/d3f/f100 [1787472,55822] 0 2026-03-09T16:15:33.026 INFO:tasks.workunit.client.1.vm05.stdout:0/972: dwrite d5/d1b/d30/f29 [0,4194304] 0 2026-03-09T16:15:33.027 INFO:tasks.workunit.client.1.vm05.stdout:2/972: mkdir db/dd/d15/d3f/d130/d13d 0 2026-03-09T16:15:33.032 INFO:tasks.workunit.client.1.vm05.stdout:0/973: write d5/d1b/d30/f29 [2099096,64329] 0 2026-03-09T16:15:33.033 INFO:tasks.workunit.client.1.vm05.stdout:2/973: symlink db/dd/d15/d1f/d21/l13e 0 2026-03-09T16:15:33.053 INFO:tasks.workunit.client.1.vm05.stdout:0/974: write d5/db/d5f/f85 [143284,81974] 0 2026-03-09T16:15:33.054 INFO:tasks.workunit.client.1.vm05.stdout:2/974: link db/dd/d15/d1f/d21/l13e db/dd/d7b/l13f 0 2026-03-09T16:15:33.055 INFO:tasks.workunit.client.1.vm05.stdout:2/975: write db/dd/d15/d46/f4e [7237400,83569] 0 2026-03-09T16:15:33.057 INFO:tasks.workunit.client.1.vm05.stdout:2/976: sync 2026-03-09T16:15:33.060 INFO:tasks.workunit.client.1.vm05.stdout:0/975: dwrite d5/d1b/f25 [0,4194304] 0 2026-03-09T16:15:33.070 INFO:tasks.workunit.client.1.vm05.stdout:2/977: symlink db/dd/d15/d3f/d5b/d60/d95/d109/d11e/l140 0 2026-03-09T16:15:33.070 INFO:tasks.workunit.client.1.vm05.stdout:0/976: rename d5/db/d5f/cee to d5/db/d48/dc3/c14b 0 2026-03-09T16:15:33.075 INFO:tasks.workunit.client.1.vm05.stdout:0/977: dread d5/db/d5b/d82/f89 [0,4194304] 0 2026-03-09T16:15:33.083 INFO:tasks.workunit.client.1.vm05.stdout:2/978: mkdir db/dd/d15/d3f/d5b/d60/d6a/dea/dfc/d141 0 2026-03-09T16:15:33.085 INFO:tasks.workunit.client.1.vm05.stdout:0/978: unlink d5/db/d48/d66/l67 0 2026-03-09T16:15:33.088 INFO:tasks.workunit.client.1.vm05.stdout:2/979: creat db/dd/d131/f142 x:0 0 0 2026-03-09T16:15:33.088 INFO:tasks.workunit.client.1.vm05.stdout:2/980: stat db/dd/d15/dff/c11b 0 2026-03-09T16:15:33.089 INFO:tasks.workunit.client.1.vm05.stdout:2/981: read - db/dd/d15/d1f/f9c zero size 2026-03-09T16:15:33.093 INFO:tasks.workunit.client.1.vm05.stdout:2/982: unlink db/dd/d15/d1f/f2b 0 2026-03-09T16:15:33.094 INFO:tasks.workunit.client.1.vm05.stdout:0/979: write d5/d109/f118 [165256,68668] 0 2026-03-09T16:15:33.095 INFO:tasks.workunit.client.1.vm05.stdout:2/983: mkdir db/dd/d15/d46/d143 0 2026-03-09T16:15:33.096 INFO:tasks.workunit.client.1.vm05.stdout:0/980: sync 2026-03-09T16:15:33.097 INFO:tasks.workunit.client.1.vm05.stdout:0/981: fsync d5/d1b/f25 0 2026-03-09T16:15:33.098 INFO:tasks.workunit.client.1.vm05.stdout:0/982: sync 2026-03-09T16:15:33.099 INFO:tasks.workunit.client.1.vm05.stdout:2/984: mkdir db/dd/d144 0 2026-03-09T16:15:33.101 INFO:tasks.workunit.client.1.vm05.stdout:0/983: chown d5/db/c1c 1 1 2026-03-09T16:15:33.104 
INFO:tasks.workunit.client.1.vm05.stdout:0/984: creat d5/d97/f14c x:0 0 0 2026-03-09T16:15:33.108 INFO:tasks.workunit.client.1.vm05.stdout:0/985: creat d5/d11/d4f/ddc/d10a/f14d x:0 0 0 2026-03-09T16:15:33.114 INFO:tasks.workunit.client.1.vm05.stdout:2/985: link db/dd/d15/d3f/c96 db/dd/d15/d46/c145 0 2026-03-09T16:15:33.115 INFO:tasks.workunit.client.1.vm05.stdout:0/986: write d5/f73 [1824136,61191] 0 2026-03-09T16:15:33.116 INFO:tasks.workunit.client.1.vm05.stdout:0/987: mkdir d5/db/d77/df3/d14e 0 2026-03-09T16:15:33.120 INFO:tasks.workunit.client.1.vm05.stdout:0/988: dwrite d5/d109/f118 [0,4194304] 0 2026-03-09T16:15:33.121 INFO:tasks.workunit.client.1.vm05.stdout:2/986: rename db/dd/d15/d1f/d20/fee to db/dd/d15/d46/f146 0 2026-03-09T16:15:33.122 INFO:tasks.workunit.client.1.vm05.stdout:2/987: readlink db/dd/d15/d1f/d21/l40 0 2026-03-09T16:15:33.132 INFO:tasks.workunit.client.1.vm05.stdout:0/989: mkdir d5/d1b/d3b/dc2/d14f 0 2026-03-09T16:15:33.133 INFO:tasks.workunit.client.1.vm05.stdout:0/990: truncate d5/db/d5b/d82/f129 818989 0 2026-03-09T16:15:33.136 INFO:tasks.workunit.client.1.vm05.stdout:2/988: mkdir db/dd/d131/d147 0 2026-03-09T16:15:33.138 INFO:tasks.workunit.client.1.vm05.stdout:0/991: fsync d5/db/d5b/d82/f89 0 2026-03-09T16:15:33.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:32 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:15:33.147 INFO:tasks.workunit.client.1.vm05.stdout:0/992: link d5/d2c/dff/f2e d5/d11/d4f/d70/f150 0 2026-03-09T16:15:33.151 INFO:tasks.workunit.client.1.vm05.stdout:0/993: mkdir d5/d11/d151 0 2026-03-09T16:15:33.157 INFO:tasks.workunit.client.1.vm05.stdout:0/994: creat d5/db/d48/f152 x:0 0 0 2026-03-09T16:15:33.158 INFO:tasks.workunit.client.1.vm05.stdout:2/989: dread db/f12 [0,4194304] 0 2026-03-09T16:15:33.159 INFO:tasks.workunit.client.1.vm05.stdout:0/995: dread - d5/d1b/d3b/dc2/fea zero size 2026-03-09T16:15:33.163 INFO:tasks.workunit.client.1.vm05.stdout:2/990: mknod db/dd/d15/d4c/d56/c148 0 2026-03-09T16:15:33.163 INFO:tasks.workunit.client.1.vm05.stdout:2/991: dread - db/dd/d15/d3f/d5b/d7e/f11f zero size 2026-03-09T16:15:33.166 INFO:tasks.workunit.client.1.vm05.stdout:0/996: rmdir d5/db/def 39 2026-03-09T16:15:33.171 INFO:tasks.workunit.client.1.vm05.stdout:2/992: rmdir db/dd/d131 39 2026-03-09T16:15:33.178 INFO:tasks.workunit.client.1.vm05.stdout:2/993: mknod db/dd/d15/d3f/d5b/d60/d6a/c149 0 2026-03-09T16:15:33.182 INFO:tasks.workunit.client.1.vm05.stdout:0/997: truncate d5/d11/f9f 401889 0 2026-03-09T16:15:33.185 INFO:tasks.workunit.client.1.vm05.stdout:0/998: dread d5/d11/d4f/d70/f150 [4194304,4194304] 0 2026-03-09T16:15:33.214 INFO:tasks.workunit.client.1.vm05.stdout:2/994: rmdir db/dd/d98/d12e 0 2026-03-09T16:15:33.236 INFO:tasks.workunit.client.1.vm05.stdout:0/999: write d5/d97/f122 [81957,108184] 0 2026-03-09T16:15:33.249 INFO:tasks.workunit.client.1.vm05.stdout:2/995: dread db/dd/d15/d46/d67/ff1 [4194304,4194304] 0 2026-03-09T16:15:33.275 INFO:tasks.workunit.client.1.vm05.stdout:2/996: dwrite db/dd/d15/d1f/d21/d87/fbe [0,4194304] 0 2026-03-09T16:15:33.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:32 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:15:33.287 INFO:tasks.workunit.client.1.vm05.stdout:2/997: dread - db/dd/d15/d4c/fcb zero size 2026-03-09T16:15:33.315 
INFO:tasks.workunit.client.1.vm05.stdout:2/998: link db/lc db/l14a 0 2026-03-09T16:15:33.316 INFO:tasks.workunit.client.1.vm05.stdout:2/999: fdatasync db/dd/d15/d3f/ffe 0 2026-03-09T16:15:33.320 INFO:tasks.workunit.client.1.vm05.stderr:+ rm -rf -- ./tmp.ArsmWsNydV 2026-03-09T16:15:34.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:33 vm03.local ceph-mon[51019]: pgmap v26: 65 pgs: 65 active+clean; 3.1 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 31 MiB/s rd, 94 MiB/s wr, 219 op/s 2026-03-09T16:15:34.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:33 vm05.local ceph-mon[58702]: pgmap v26: 65 pgs: 65 active+clean; 3.1 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 31 MiB/s rd, 94 MiB/s wr, 219 op/s 2026-03-09T16:15:35.415 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:35 vm03.local ceph-mon[51019]: pgmap v27: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 54 MiB/s rd, 143 MiB/s wr, 351 op/s 2026-03-09T16:15:35.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:35 vm05.local ceph-mon[58702]: pgmap v27: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 54 MiB/s rd, 143 MiB/s wr, 351 op/s 2026-03-09T16:15:37.704 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:37 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:37.704 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:37 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:37.704 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:37 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:37.704 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:37 vm03.local ceph-mon[51019]: pgmap v28: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 35 MiB/s rd, 86 MiB/s wr, 225 op/s 2026-03-09T16:15:38.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:37 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:38.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:37 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:38.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:37 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:38.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:37 vm05.local ceph-mon[58702]: pgmap v28: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 35 MiB/s rd, 86 MiB/s wr, 225 op/s 2026-03-09T16:15:40.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:39 vm05.local ceph-mon[58702]: pgmap v29: 65 pgs: 65 active+clean; 2.7 GiB data, 9.4 GiB used, 111 GiB / 120 GiB avail; 36 MiB/s rd, 86 MiB/s wr, 273 op/s 2026-03-09T16:15:40.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:39 vm03.local ceph-mon[51019]: pgmap v29: 65 pgs: 65 active+clean; 2.7 GiB data, 9.4 GiB used, 111 GiB / 120 GiB avail; 36 MiB/s rd, 86 MiB/s wr, 273 op/s 2026-03-09T16:15:40.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:40 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:40.891 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:40 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:40.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:40 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:40.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:40 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:41.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:40 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:41.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:40 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:41.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:40 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:41.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:40 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:41.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:41 vm03.local ceph-mon[51019]: pgmap v30: 65 pgs: 65 active+clean; 2.7 GiB data, 9.4 GiB used, 111 GiB / 120 GiB avail; 23 MiB/s rd, 49 MiB/s wr, 179 op/s 2026-03-09T16:15:41.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:41 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:41.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:41 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:41.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:41 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:15:41.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:41 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:15:41.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:41 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:42.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:41 vm05.local ceph-mon[58702]: pgmap v30: 65 pgs: 65 active+clean; 2.7 GiB data, 9.4 GiB used, 111 GiB / 120 GiB avail; 23 MiB/s rd, 49 MiB/s wr, 179 op/s 2026-03-09T16:15:42.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:41 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:42.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:41 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:42.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:41 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:15:42.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:41 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", 
"entity": "client.admin"}]: dispatch 2026-03-09T16:15:42.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:41 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:42.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:15:42.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:42 vm03.local ceph-mon[51019]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:15:42.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:42.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:42.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:42.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:42.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:42.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:42.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:42.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:42.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:42.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:43.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:15:43.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:42 vm05.local ceph-mon[58702]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T16:15:43.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:43.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:43.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:43.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:43.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:43.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:43.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:43.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:43.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:43.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:43.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:43 vm03.local ceph-mon[51019]: Upgrade: Updating alertmanager.vm03 2026-03-09T16:15:43.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:43 vm03.local ceph-mon[51019]: Deploying daemon alertmanager.vm03 on vm03 2026-03-09T16:15:43.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:43 vm03.local ceph-mon[51019]: pgmap v31: 65 pgs: 65 active+clean; 2.7 GiB data, 9.4 GiB used, 111 GiB / 120 GiB avail; 23 MiB/s rd, 49 MiB/s wr, 179 op/s 2026-03-09T16:15:44.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:43 vm05.local ceph-mon[58702]: Upgrade: Updating alertmanager.vm03 2026-03-09T16:15:44.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:43 vm05.local ceph-mon[58702]: Deploying daemon alertmanager.vm03 on vm03 2026-03-09T16:15:44.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:43 vm05.local ceph-mon[58702]: pgmap v31: 65 pgs: 65 active+clean; 2.7 GiB data, 9.4 GiB used, 111 GiB / 120 GiB avail; 23 MiB/s rd, 49 MiB/s wr, 179 op/s 2026-03-09T16:15:44.900 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-09T16:15:44.900 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- 
/home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-09T16:15:45.172 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:45 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:45.172 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:45 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:45.172 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:45 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:45.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:45 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:45.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:45 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:45.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:45 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:46.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:46 vm03.local ceph-mon[51019]: pgmap v32: 65 pgs: 65 active+clean; 2.1 GiB data, 7.9 GiB used, 112 GiB / 120 GiB avail; 23 MiB/s rd, 50 MiB/s wr, 235 op/s 2026-03-09T16:15:46.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:46 vm05.local ceph-mon[58702]: pgmap v32: 65 pgs: 65 active+clean; 2.1 GiB data, 7.9 GiB used, 112 GiB / 120 GiB avail; 23 MiB/s rd, 50 MiB/s wr, 235 op/s 2026-03-09T16:15:47.432 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:47.432 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:47.432 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:47.432 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:47 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:47.432 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:47 vm03.local ceph-mon[51019]: pgmap v33: 65 pgs: 65 active+clean; 2.1 GiB data, 7.9 GiB used, 112 GiB / 120 GiB avail; 269 KiB/s rd, 868 KiB/s wr, 104 op/s 2026-03-09T16:15:47.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:47.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:47.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:47.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:47 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:47.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:47 vm05.local ceph-mon[58702]: pgmap v33: 65 pgs: 65 active+clean; 2.1 GiB 
data, 7.9 GiB used, 112 GiB / 120 GiB avail; 269 KiB/s rd, 868 KiB/s wr, 104 op/s 2026-03-09T16:15:48.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:48 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:15:48.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:48 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:49.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:49 vm03.local ceph-mon[51019]: pgmap v34: 65 pgs: 65 active+clean; 1.4 GiB data, 5.9 GiB used, 114 GiB / 120 GiB avail; 277 KiB/s rd, 1.2 MiB/s wr, 156 op/s 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", 
"entity": "client.admin"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:15:50.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:49 vm05.local ceph-mon[58702]: pgmap v34: 65 pgs: 65 active+clean; 1.4 GiB data, 5.9 GiB used, 114 GiB / 120 GiB avail; 277 KiB/s rd, 1.2 MiB/s wr, 156 op/s 2026-03-09T16:15:50.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:50 vm03.local ceph-mon[51019]: Upgrade: Updating grafana.vm03 2026-03-09T16:15:50.891 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:50 vm03.local ceph-mon[51019]: Deploying daemon grafana.vm03 on vm03 2026-03-09T16:15:51.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:50 vm05.local ceph-mon[58702]: Upgrade: Updating grafana.vm03 2026-03-09T16:15:51.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:50 vm05.local ceph-mon[58702]: Deploying daemon grafana.vm03 on vm03 2026-03-09T16:15:51.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:51 vm03.local ceph-mon[51019]: pgmap v35: 65 pgs: 65 active+clean; 1.4 GiB data, 5.9 GiB used, 114 GiB / 120 GiB avail; 15 KiB/s rd, 819 KiB/s wr, 108 op/s 2026-03-09T16:15:52.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:51 vm05.local ceph-mon[58702]: pgmap v35: 65 pgs: 65 active+clean; 1.4 GiB data, 5.9 GiB used, 114 GiB / 120 GiB avail; 15 KiB/s rd, 819 KiB/s wr, 108 op/s 2026-03-09T16:15:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:53 vm05.local ceph-mon[58702]: pgmap v36: 65 pgs: 65 active+clean; 1.4 GiB data, 5.9 GiB used, 114 GiB / 120 GiB avail; 15 KiB/s rd, 819 KiB/s wr, 108 op/s 2026-03-09T16:15:53.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:53 vm03.local ceph-mon[51019]: pgmap v36: 65 pgs: 65 active+clean; 1.4 GiB data, 5.9 GiB used, 114 GiB / 120 GiB avail; 15 KiB/s rd, 819 KiB/s wr, 108 op/s 2026-03-09T16:15:55.511 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:55 vm05.local ceph-mon[58702]: pgmap v37: 65 pgs: 65 active+clean; 917 MiB data, 4.6 GiB used, 115 GiB / 120 GiB avail; 28 KiB/s rd, 1.4 MiB/s wr, 157 op/s 2026-03-09T16:15:55.606 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:55 vm03.local ceph-mon[51019]: pgmap v37: 65 pgs: 65 active+clean; 917 MiB data, 4.6 GiB used, 115 GiB / 120 GiB avail; 28 KiB/s rd, 1.4 MiB/s wr, 157 op/s 2026-03-09T16:15:57.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:57 vm05.local ceph-mon[58702]: pgmap v38: 65 pgs: 65 active+clean; 917 MiB data, 4.6 GiB used, 115 GiB / 120 GiB avail; 21 KiB/s rd, 959 KiB/s wr, 100 op/s 2026-03-09T16:15:57.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:57 vm03.local ceph-mon[51019]: pgmap v38: 65 pgs: 65 active+clean; 917 MiB data, 4.6 GiB used, 115 GiB / 120 GiB avail; 21 KiB/s rd, 959 KiB/s wr, 100 op/s 2026-03-09T16:15:58.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.279+0000 7fb3e1e1d640 1 -- 192.168.123.103:0/4052797160 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3dc0719a0 msgr2=0x7fb3dc071da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:58.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.279+0000 7fb3e1e1d640 1 --2- 192.168.123.103:0/4052797160 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3dc0719a0 0x7fb3dc071da0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fb3cc0099b0 tx=0x7fb3cc02f240 comp rx=0 tx=0).stop 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.281+0000 7fb3e1e1d640 1 -- 192.168.123.103:0/4052797160 shutdown_connections 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.281+0000 7fb3e1e1d640 1 --2- 192.168.123.103:0/4052797160 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3dc0722e0 0x7fb3dc110d20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.281+0000 7fb3e1e1d640 1 --2- 
192.168.123.103:0/4052797160 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3dc0719a0 0x7fb3dc071da0 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.281+0000 7fb3e1e1d640 1 -- 192.168.123.103:0/4052797160 >> 192.168.123.103:0/4052797160 conn(0x7fb3dc06d4f0 msgr2=0x7fb3dc06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.288+0000 7fb3e1e1d640 1 -- 192.168.123.103:0/4052797160 shutdown_connections 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.288+0000 7fb3e1e1d640 1 -- 192.168.123.103:0/4052797160 wait complete. 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.288+0000 7fb3e1e1d640 1 Processor -- start 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.288+0000 7fb3e1e1d640 1 -- start start 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.289+0000 7fb3e1e1d640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3dc0719a0 0x7fb3dc1a2c70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.289+0000 7fb3e1e1d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3dc0722e0 0x7fb3dc1a31b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.289+0000 7fb3e1e1d640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb3dc1a37b0 con 0x7fb3dc0722e0 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.289+0000 7fb3e1e1d640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb3dc1a3920 con 0x7fb3dc0719a0 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.289+0000 7fb3e0e1b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3dc0719a0 0x7fb3dc1a2c70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.289+0000 7fb3e0e1b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3dc0719a0 0x7fb3dc1a2c70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47872/0 (socket says 192.168.123.103:47872) 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.289+0000 7fb3e0e1b640 1 -- 192.168.123.103:0/3449072775 learned_addr learned my addr 192.168.123.103:0/3449072775 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.289+0000 7fb3d3fff640 1 --2- 192.168.123.103:0/3449072775 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3dc0722e0 0x7fb3dc1a31b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.289+0000 
7fb3d3fff640 1 -- 192.168.123.103:0/3449072775 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3dc0719a0 msgr2=0x7fb3dc1a2c70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.289+0000 7fb3d3fff640 1 --2- 192.168.123.103:0/3449072775 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3dc0719a0 0x7fb3dc1a2c70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.289+0000 7fb3d3fff640 1 -- 192.168.123.103:0/3449072775 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb3cc009660 con 0x7fb3dc0722e0 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.290+0000 7fb3d3fff640 1 --2- 192.168.123.103:0/3449072775 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3dc0722e0 0x7fb3dc1a31b0 secure :-1 s=READY pgs=304 cs=0 l=1 rev1=1 crypto rx=0x7fb3c400b4d0 tx=0x7fb3c400b9a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.290+0000 7fb3d1ffb640 1 -- 192.168.123.103:0/3449072775 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb3c4004280 con 0x7fb3dc0722e0 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.290+0000 7fb3e1e1d640 1 -- 192.168.123.103:0/3449072775 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb3dc1a83b0 con 0x7fb3dc0722e0 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.290+0000 7fb3e1e1d640 1 -- 192.168.123.103:0/3449072775 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb3dc1a8980 con 0x7fb3dc0722e0 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.290+0000 7fb3d1ffb640 1 -- 192.168.123.103:0/3449072775 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb3c40043e0 con 0x7fb3dc0722e0 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.290+0000 7fb3d1ffb640 1 -- 192.168.123.103:0/3449072775 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb3c4010b80 con 0x7fb3dc0722e0 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.290+0000 7fb3e0e1b640 1 --2- 192.168.123.103:0/3449072775 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3dc0719a0 0x7fb3dc1a2c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.291+0000 7fb3e1e1d640 1 -- 192.168.123.103:0/3449072775 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb3a4005350 con 0x7fb3dc0722e0 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.291+0000 7fb3d1ffb640 1 -- 192.168.123.103:0/3449072775 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7fb3c4010ce0 con 0x7fb3dc0722e0 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.292+0000 7fb3d1ffb640 1 --2- 192.168.123.103:0/3449072775 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fb3ac077750 0x7fb3ac079c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:58.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.292+0000 7fb3d1ffb640 1 -- 192.168.123.103:0/3449072775 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fb3c4098c60 con 0x7fb3dc0722e0 2026-03-09T16:15:58.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.292+0000 7fb3e0e1b640 1 --2- 192.168.123.103:0/3449072775 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fb3ac077750 0x7fb3ac079c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:58.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.297+0000 7fb3d1ffb640 1 -- 192.168.123.103:0/3449072775 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7fb3c40625f0 con 0x7fb3dc0722e0 2026-03-09T16:15:58.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.298+0000 7fb3e0e1b640 1 --2- 192.168.123.103:0/3449072775 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fb3ac077750 0x7fb3ac079c10 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fb3cc002bf0 tx=0x7fb3cc03a040 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:58.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.444+0000 7fb3e1e1d640 1 -- 192.168.123.103:0/3449072775 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb3a4002bf0 con 0x7fb3ac077750 2026-03-09T16:15:58.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.449+0000 7fb3d1ffb640 1 -- 192.168.123.103:0/3449072775 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+364 (secure 0 0 0) 0x7fb3a4002bf0 con 0x7fb3ac077750 2026-03-09T16:15:58.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.462+0000 7fb3ab7fe640 1 -- 192.168.123.103:0/3449072775 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fb3ac077750 msgr2=0x7fb3ac079c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:58.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.462+0000 7fb3ab7fe640 1 --2- 192.168.123.103:0/3449072775 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fb3ac077750 0x7fb3ac079c10 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 
crypto rx=0x7fb3cc002bf0 tx=0x7fb3cc03a040 comp rx=0 tx=0).stop 2026-03-09T16:15:58.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.462+0000 7fb3ab7fe640 1 -- 192.168.123.103:0/3449072775 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3dc0722e0 msgr2=0x7fb3dc1a31b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:58.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.462+0000 7fb3ab7fe640 1 --2- 192.168.123.103:0/3449072775 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3dc0722e0 0x7fb3dc1a31b0 secure :-1 s=READY pgs=304 cs=0 l=1 rev1=1 crypto rx=0x7fb3c400b4d0 tx=0x7fb3c400b9a0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.463+0000 7fb3ab7fe640 1 -- 192.168.123.103:0/3449072775 shutdown_connections 2026-03-09T16:15:58.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.463+0000 7fb3ab7fe640 1 --2- 192.168.123.103:0/3449072775 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fb3ac077750 0x7fb3ac079c10 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.463+0000 7fb3ab7fe640 1 --2- 192.168.123.103:0/3449072775 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3dc0722e0 0x7fb3dc1a31b0 unknown :-1 s=CLOSED pgs=304 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.463+0000 7fb3ab7fe640 1 --2- 192.168.123.103:0/3449072775 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3dc0719a0 0x7fb3dc1a2c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.463+0000 7fb3ab7fe640 1 -- 192.168.123.103:0/3449072775 >> 192.168.123.103:0/3449072775 conn(0x7fb3dc06d4f0 msgr2=0x7fb3dc10f120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:58.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.464+0000 7fb3ab7fe640 1 -- 192.168.123.103:0/3449072775 shutdown_connections 2026-03-09T16:15:58.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.464+0000 7fb3ab7fe640 1 -- 192.168.123.103:0/3449072775 wait complete. 
2026-03-09T16:15:58.474 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:15:58.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.529+0000 7f280e7af640 1 -- 192.168.123.103:0/3930582359 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2808071a70 msgr2=0x7f2808071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:58.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.529+0000 7f280e7af640 1 --2- 192.168.123.103:0/3930582359 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2808071a70 0x7f2808071e70 secure :-1 s=READY pgs=305 cs=0 l=1 rev1=1 crypto rx=0x7f27f8007920 tx=0x7f27f802ffe0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.529+0000 7f280e7af640 1 -- 192.168.123.103:0/3930582359 shutdown_connections 2026-03-09T16:15:58.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.529+0000 7f280e7af640 1 --2- 192.168.123.103:0/3930582359 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2808072440 0x7f28080771b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.529+0000 7f280e7af640 1 --2- 192.168.123.103:0/3930582359 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2808071a70 0x7f2808071e70 unknown :-1 s=CLOSED pgs=305 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.529+0000 7f280e7af640 1 -- 192.168.123.103:0/3930582359 >> 192.168.123.103:0/3930582359 conn(0x7f280806d4f0 msgr2=0x7f280806f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:58.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.530+0000 7f280e7af640 1 -- 192.168.123.103:0/3930582359 shutdown_connections 2026-03-09T16:15:58.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.530+0000 7f280e7af640 1 -- 192.168.123.103:0/3930582359 wait complete. 
2026-03-09T16:15:58.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.530+0000 7f280e7af640 1 Processor -- start 2026-03-09T16:15:58.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.530+0000 7f280e7af640 1 -- start start 2026-03-09T16:15:58.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.530+0000 7f280e7af640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2808072440 0x7f2808131a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:58.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.530+0000 7f280e7af640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2808131f60 0x7f28081323e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:58.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.530+0000 7f280e7af640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f28081333d0 con 0x7f2808072440 2026-03-09T16:15:58.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.530+0000 7f280e7af640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2808133540 con 0x7f2808131f60 2026-03-09T16:15:58.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.530+0000 7f2807fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2808072440 0x7f2808131a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:58.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.530+0000 7f2807fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2808072440 0x7f2808131a20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47746/0 (socket says 192.168.123.103:47746) 2026-03-09T16:15:58.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.530+0000 7f2807fff640 1 -- 192.168.123.103:0/2794222740 learned_addr learned my addr 192.168.123.103:0/2794222740 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:15:58.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.530+0000 7f28077fe640 1 --2- 192.168.123.103:0/2794222740 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2808131f60 0x7f28081323e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:58.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.532+0000 7f28077fe640 1 -- 192.168.123.103:0/2794222740 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2808072440 msgr2=0x7f2808131a20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:58.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.532+0000 7f28077fe640 1 --2- 192.168.123.103:0/2794222740 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2808072440 0x7f2808131a20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.532+0000 7f28077fe640 1 -- 192.168.123.103:0/2794222740 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f27f80075d0 con 0x7f2808131f60 2026-03-09T16:15:58.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.533+0000 7f28077fe640 1 --2- 192.168.123.103:0/2794222740 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2808131f60 0x7f28081323e0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f280000bd70 tx=0x7f2800009f70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:58.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.533+0000 7f28057fa640 1 -- 192.168.123.103:0/2794222740 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2800002d10 con 0x7f2808131f60 2026-03-09T16:15:58.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.533+0000 7f280e7af640 1 -- 192.168.123.103:0/2794222740 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f280807fb70 con 0x7f2808131f60 2026-03-09T16:15:58.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.533+0000 7f280e7af640 1 -- 192.168.123.103:0/2794222740 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2808080040 con 0x7f2808131f60 2026-03-09T16:15:58.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.535+0000 7f28057fa640 1 -- 192.168.123.103:0/2794222740 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f2800002e70 con 0x7f2808131f60 2026-03-09T16:15:58.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.535+0000 7f280e7af640 1 -- 192.168.123.103:0/2794222740 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f27d4005350 con 0x7f2808131f60 2026-03-09T16:15:58.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.535+0000 7f28057fa640 1 -- 192.168.123.103:0/2794222740 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2800015680 con 0x7f2808131f60 2026-03-09T16:15:58.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.539+0000 7f28057fa640 1 -- 192.168.123.103:0/2794222740 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7f2800008030 con 0x7f2808131f60 2026-03-09T16:15:58.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.540+0000 7f28057fa640 1 --2- 192.168.123.103:0/2794222740 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f27d8077750 0x7f27d8079c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:58.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.540+0000 7f2807fff640 1 --2- 192.168.123.103:0/2794222740 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f27d8077750 0x7f27d8079c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:58.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.540+0000 7f28057fa640 1 -- 192.168.123.103:0/2794222740 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f2800099880 con 0x7f2808131f60 2026-03-09T16:15:58.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.540+0000 7f2807fff640 1 --2- 192.168.123.103:0/2794222740 >> 
[v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f27d8077750 0x7f27d8079c10 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f27f80075a0 tx=0x7f27f803b040 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:58.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.540+0000 7f28057fa640 1 -- 192.168.123.103:0/2794222740 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f28000c9630 con 0x7f2808131f60 2026-03-09T16:15:58.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.697+0000 7f280e7af640 1 -- 192.168.123.103:0/2794222740 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f27d4002bf0 con 0x7f27d8077750 2026-03-09T16:15:58.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.699+0000 7f28057fa640 1 -- 192.168.123.103:0/2794222740 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+364 (secure 0 0 0) 0x7f27d4002bf0 con 0x7f27d8077750 2026-03-09T16:15:58.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.701+0000 7f280e7af640 1 -- 192.168.123.103:0/2794222740 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f27d8077750 msgr2=0x7f27d8079c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:58.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.701+0000 7f280e7af640 1 --2- 192.168.123.103:0/2794222740 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f27d8077750 0x7f27d8079c10 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f27f80075a0 tx=0x7f27f803b040 comp rx=0 tx=0).stop 2026-03-09T16:15:58.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.702+0000 7f280e7af640 1 -- 192.168.123.103:0/2794222740 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2808131f60 msgr2=0x7f28081323e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:58.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.702+0000 7f280e7af640 1 --2- 192.168.123.103:0/2794222740 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2808131f60 0x7f28081323e0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f280000bd70 tx=0x7f2800009f70 comp rx=0 tx=0).stop 2026-03-09T16:15:58.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.702+0000 7f280e7af640 1 -- 192.168.123.103:0/2794222740 shutdown_connections 2026-03-09T16:15:58.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.702+0000 7f280e7af640 1 --2- 192.168.123.103:0/2794222740 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f27d8077750 0x7f27d8079c10 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.702+0000 7f280e7af640 1 --2- 192.168.123.103:0/2794222740 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2808131f60 0x7f28081323e0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.702+0000 7f280e7af640 1 --2- 192.168.123.103:0/2794222740 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2808072440 0x7f2808131a20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.702+0000 7f280e7af640 1 -- 192.168.123.103:0/2794222740 >> 192.168.123.103:0/2794222740 conn(0x7f280806d4f0 msgr2=0x7f2808075420 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:58.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.702+0000 7f280e7af640 1 -- 192.168.123.103:0/2794222740 shutdown_connections 2026-03-09T16:15:58.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.702+0000 7f280e7af640 1 -- 192.168.123.103:0/2794222740 wait complete. 2026-03-09T16:15:58.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.768+0000 7f23c74c3640 1 -- 192.168.123.103:0/3267515024 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f23c0071a70 msgr2=0x7f23c0071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:58.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.768+0000 7f23c74c3640 1 --2- 192.168.123.103:0/3267515024 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f23c0071a70 0x7f23c0071e70 secure :-1 s=READY pgs=306 cs=0 l=1 rev1=1 crypto rx=0x7f23bc010a30 tx=0x7f23bc033330 comp rx=0 tx=0).stop 2026-03-09T16:15:58.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.769+0000 7f23c74c3640 1 -- 192.168.123.103:0/3267515024 shutdown_connections 2026-03-09T16:15:58.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.769+0000 7f23c74c3640 1 --2- 192.168.123.103:0/3267515024 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23c0072440 0x7f23c00771b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.769+0000 7f23c74c3640 1 --2- 192.168.123.103:0/3267515024 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f23c0071a70 0x7f23c0071e70 unknown :-1 s=CLOSED pgs=306 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.769+0000 7f23c74c3640 1 -- 192.168.123.103:0/3267515024 >> 192.168.123.103:0/3267515024 conn(0x7f23c006d4f0 msgr2=0x7f23c006f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:58.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.770+0000 7f23c74c3640 1 -- 192.168.123.103:0/3267515024 shutdown_connections 2026-03-09T16:15:58.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.770+0000 7f23c74c3640 1 -- 192.168.123.103:0/3267515024 wait complete. 
2026-03-09T16:15:58.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.770+0000 7f23c74c3640 1 Processor -- start 2026-03-09T16:15:58.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.770+0000 7f23c74c3640 1 -- start start 2026-03-09T16:15:58.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.770+0000 7f23c74c3640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23c0072440 0x7f23c0081b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:58.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.770+0000 7f23c74c3640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f23c0080190 0x7f23c0080610 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:58.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.770+0000 7f23c74c3640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f23c0080b50 con 0x7f23c0080190 2026-03-09T16:15:58.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.770+0000 7f23c74c3640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f23c0080cc0 con 0x7f23c0072440 2026-03-09T16:15:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.770+0000 7f23c5238640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23c0072440 0x7f23c0081b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.770+0000 7f23c5238640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23c0072440 0x7f23c0081b40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:50540/0 (socket says 192.168.123.103:50540) 2026-03-09T16:15:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.770+0000 7f23c5238640 1 -- 192.168.123.103:0/1950252783 learned_addr learned my addr 192.168.123.103:0/1950252783 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:15:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.771+0000 7f23c5238640 1 -- 192.168.123.103:0/1950252783 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f23c0080190 msgr2=0x7f23c0080610 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.771+0000 7f23c5238640 1 --2- 192.168.123.103:0/1950252783 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f23c0080190 0x7f23c0080610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.771+0000 7f23c5238640 1 -- 192.168.123.103:0/1950252783 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f23bc0106e0 con 0x7f23c0072440 2026-03-09T16:15:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.771+0000 7f23c5238640 1 --2- 192.168.123.103:0/1950252783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23c0072440 0x7f23c0081b40 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f23bc002410 tx=0x7f23bc002c80 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.772+0000 7f23b67fc640 1 -- 192.168.123.103:0/1950252783 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f23bc0045d0 con 0x7f23c0072440 2026-03-09T16:15:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.772+0000 7f23c74c3640 1 -- 192.168.123.103:0/1950252783 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f23c0080f40 con 0x7f23c0072440 2026-03-09T16:15:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.772+0000 7f23c74c3640 1 -- 192.168.123.103:0/1950252783 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f23c01b5bc0 con 0x7f23c0072440 2026-03-09T16:15:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.772+0000 7f23b67fc640 1 -- 192.168.123.103:0/1950252783 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f23bc0104c0 con 0x7f23c0072440 2026-03-09T16:15:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.772+0000 7f23b67fc640 1 -- 192.168.123.103:0/1950252783 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f23bc043ba0 con 0x7f23c0072440 2026-03-09T16:15:58.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.774+0000 7f23b67fc640 1 -- 192.168.123.103:0/1950252783 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7f23bc043d00 con 0x7f23c0072440 2026-03-09T16:15:58.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.775+0000 7f23b67fc640 1 --2- 192.168.123.103:0/1950252783 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f23a0077750 0x7f23a0079c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:58.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.775+0000 7f23b67fc640 1 -- 192.168.123.103:0/1950252783 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f23bc00f070 con 0x7f23c0072440 2026-03-09T16:15:58.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.776+0000 7f23c74c3640 1 -- 192.168.123.103:0/1950252783 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2390005350 con 0x7f23c0072440 2026-03-09T16:15:58.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.780+0000 7f23c4a37640 1 --2- 192.168.123.103:0/1950252783 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f23a0077750 0x7f23a0079c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:58.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.781+0000 7f23b67fc640 1 -- 192.168.123.103:0/1950252783 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f23bc087bf0 con 0x7f23c0072440 2026-03-09T16:15:58.791 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.790+0000 7f23c4a37640 1 --2- 192.168.123.103:0/1950252783 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f23a0077750 0x7f23a0079c10 secure :-1 
s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f23c00818c0 tx=0x7f23b8009460 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T16:15:58.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.965+0000 7f23c74c3640 1 -- 192.168.123.103:0/1950252783 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f2390002bf0 con 0x7f23a0077750
2026-03-09T16:15:58.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.978+0000 7f23b67fc640 1 -- 192.168.123.103:0/1950252783 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f2390002bf0 con 0x7f23a0077750
2026-03-09T16:15:58.979 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T16:15:58.979 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (15s) 12s ago 5m 16.5M - 0.25.0 c8568f914cd2 61c29cd7a09d
2026-03-09T16:15:58.979 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (5m) 12s ago 5m 8992k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9
2026-03-09T16:15:58.979 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (5m) 34s ago 5m 8812k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd
2026-03-09T16:15:58.979 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (5m) 12s ago 5m 7625k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 05e9be717970
2026-03-09T16:15:58.979 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (5m) 34s ago 5m 7608k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 32f80ccecaa9
2026-03-09T16:15:58.979 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (5m) 12s ago 5m 90.4M - 9.4.7 954c08fa6188 9b9ef5226e00
2026-03-09T16:15:58.979 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (3m) 12s ago 3m 17.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8e7e3eb06891
2026-03-09T16:15:58.979 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (3m) 12s ago 3m 271M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 f23b1415c23e
2026-03-09T16:15:58.979 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (3m) 34s ago 3m 15.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 fbf69f4859f1
2026-03-09T16:15:58.980 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (3m) 34s ago 3m 17.2M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 e7155e6e0a47
2026-03-09T16:15:58.980 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:8443,9283,8765 running (83s) 12s ago 6m 606M - 19.2.3-678-ge911bdeb 654f31e6858e f10e9f43c355
2026-03-09T16:15:58.980 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (58s) 34s ago 5m 488M - 19.2.3-678-ge911bdeb 654f31e6858e 5276dc4902e9
2026-03-09T16:15:58.980 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (6m) 12s ago 6m 59.9M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 b86752d320b6
2026-03-09T16:15:58.980 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (5m) 34s ago 5m 48.2M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 90242efb0978
2026-03-09T16:15:58.980 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (45s) 12s ago 5m 8586k - 1.7.0 72c9c2088986 73da4350a8ed
2026-03-09T16:15:58.980 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (38s) 34s ago 5m 5356k - 1.7.0 72c9c2088986 0be807a191b0
2026-03-09T16:15:58.980 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (4m) 12s ago 4m 312M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2ea78f0d62f8
2026-03-09T16:15:58.980 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (4m) 12s ago 4m 317M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6169f9824413
2026-03-09T16:15:58.980 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (4m) 12s ago 4m 256M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 31188175e77b
2026-03-09T16:15:58.980 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (4m) 34s ago 4m 354M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d95aab347c9f
2026-03-09T16:15:58.980 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (4m) 34s ago 4m 304M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 5076005b452d
2026-03-09T16:15:58.980 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (4m) 34s ago 4m 266M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 56fb3849b087
2026-03-09T16:15:58.980 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (22s) 12s ago 5m 37.9M - 2.51.0 1d3b7f56885b ce88dd379864
2026-03-09T16:15:58.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.981+0000 7f238bfff640 1 -- 192.168.123.103:0/1950252783 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f23a0077750 msgr2=0x7f23a0079c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:15:58.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.981+0000 7f238bfff640 1 --2- 192.168.123.103:0/1950252783 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f23a0077750 0x7f23a0079c10 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f23c00818c0 tx=0x7f23b8009460 comp rx=0 tx=0).stop
2026-03-09T16:15:58.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.981+0000 7f238bfff640 1 -- 192.168.123.103:0/1950252783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23c0072440 msgr2=0x7f23c0081b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:15:58.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.981+0000 7f238bfff640 1 --2- 192.168.123.103:0/1950252783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23c0072440 0x7f23c0081b40 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f23bc002410 tx=0x7f23bc002c80 comp rx=0 tx=0).stop
2026-03-09T16:15:58.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.981+0000 7f238bfff640 1 -- 192.168.123.103:0/1950252783 shutdown_connections
2026-03-09T16:15:58.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.981+0000 7f238bfff640 1 --2- 192.168.123.103:0/1950252783 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f23a0077750 0x7f23a0079c10 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T16:15:58.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.981+0000 7f238bfff640 1 --2- 192.168.123.103:0/1950252783 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f23c0080190 0x7f23c0080610 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T16:15:58.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.981+0000 7f238bfff640 1 --2- 192.168.123.103:0/1950252783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23c0072440
0x7f23c0081b40 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:58.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.982+0000 7f238bfff640 1 -- 192.168.123.103:0/1950252783 >> 192.168.123.103:0/1950252783 conn(0x7f23c006d4f0 msgr2=0x7f23c0075a40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:58.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.983+0000 7f238bfff640 1 -- 192.168.123.103:0/1950252783 shutdown_connections 2026-03-09T16:15:58.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:58.983+0000 7f238bfff640 1 -- 192.168.123.103:0/1950252783 wait complete. 2026-03-09T16:15:59.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.090+0000 7f2c7ac8a640 1 -- 192.168.123.103:0/2397071834 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2c74072370 msgr2=0x7f2c7410c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:59.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.090+0000 7f2c7ac8a640 1 --2- 192.168.123.103:0/2397071834 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2c74072370 0x7f2c7410c590 secure :-1 s=READY pgs=307 cs=0 l=1 rev1=1 crypto rx=0x7f2c640099b0 tx=0x7f2c6402f220 comp rx=0 tx=0).stop 2026-03-09T16:15:59.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.091+0000 7f2c7ac8a640 1 -- 192.168.123.103:0/2397071834 shutdown_connections 2026-03-09T16:15:59.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.091+0000 7f2c7ac8a640 1 --2- 192.168.123.103:0/2397071834 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2c74072370 0x7f2c7410c590 unknown :-1 s=CLOSED pgs=307 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:59.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.091+0000 7f2c7ac8a640 1 --2- 192.168.123.103:0/2397071834 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c740719a0 0x7f2c74071da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:59.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.091+0000 7f2c7ac8a640 1 -- 192.168.123.103:0/2397071834 >> 192.168.123.103:0/2397071834 conn(0x7f2c7406d4f0 msgr2=0x7f2c7406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:59.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.091+0000 7f2c7ac8a640 1 -- 192.168.123.103:0/2397071834 shutdown_connections 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.092+0000 7f2c7ac8a640 1 -- 192.168.123.103:0/2397071834 wait complete. 
2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.092+0000 7f2c7ac8a640 1 Processor -- start 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.092+0000 7f2c7ac8a640 1 -- start start 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.092+0000 7f2c7ac8a640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c740719a0 0x7f2c741a7050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.092+0000 7f2c7ac8a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2c74072370 0x7f2c741a7590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.092+0000 7f2c7ac8a640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2c741a7bb0 con 0x7f2c74072370 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.092+0000 7f2c7ac8a640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2c741a7d20 con 0x7f2c740719a0 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.092+0000 7f2c79c88640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c740719a0 0x7f2c741a7050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.092+0000 7f2c79c88640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c740719a0 0x7f2c741a7050 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:50546/0 (socket says 192.168.123.103:50546) 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.092+0000 7f2c79c88640 1 -- 192.168.123.103:0/2501758222 learned_addr learned my addr 192.168.123.103:0/2501758222 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.093+0000 7f2c79c88640 1 -- 192.168.123.103:0/2501758222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2c74072370 msgr2=0x7f2c741a7590 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.093+0000 7f2c79c88640 1 --2- 192.168.123.103:0/2501758222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2c74072370 0x7f2c741a7590 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.093+0000 7f2c79c88640 1 -- 192.168.123.103:0/2501758222 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2c64009660 con 0x7f2c740719a0 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.093+0000 7f2c79c88640 1 --2- 192.168.123.103:0/2501758222 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c740719a0 0x7f2c741a7050 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f2c6c00efc0 tx=0x7f2c6c00c490 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.093+0000 7f2c62ffd640 1 -- 192.168.123.103:0/2501758222 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2c6c009280 con 0x7f2c740719a0 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.093+0000 7f2c7ac8a640 1 -- 192.168.123.103:0/2501758222 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2c740774d0 con 0x7f2c740719a0 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.093+0000 7f2c7ac8a640 1 -- 192.168.123.103:0/2501758222 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2c74077a20 con 0x7f2c740719a0 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.094+0000 7f2c7ac8a640 1 -- 192.168.123.103:0/2501758222 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2c3c005350 con 0x7f2c740719a0 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.094+0000 7f2c62ffd640 1 -- 192.168.123.103:0/2501758222 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f2c6c00f040 con 0x7f2c740719a0 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.094+0000 7f2c62ffd640 1 -- 192.168.123.103:0/2501758222 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2c6c004940 con 0x7f2c740719a0 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.095+0000 7f2c62ffd640 1 -- 192.168.123.103:0/2501758222 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7f2c6c007520 con 0x7f2c740719a0 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.096+0000 7f2c62ffd640 1 --2- 192.168.123.103:0/2501758222 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f2c5c077680 0x7f2c5c079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.096+0000 7f2c62ffd640 1 -- 192.168.123.103:0/2501758222 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f2c6c09abf0 con 0x7f2c740719a0 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.096+0000 7f2c79487640 1 --2- 192.168.123.103:0/2501758222 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f2c5c077680 0x7f2c5c079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:59.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.096+0000 7f2c79487640 1 --2- 192.168.123.103:0/2501758222 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f2c5c077680 0x7f2c5c079b40 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f2c64002af0 tx=0x7f2c640023d0 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:59.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.099+0000 7f2c62ffd640 1 -- 192.168.123.103:0/2501758222 <== mon.1 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f2c6c0636b0 con 0x7f2c740719a0
2026-03-09T16:15:59.296 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:59 vm03.local ceph-mon[51019]: from='client.14678 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T16:15:59.297 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:59 vm03.local ceph-mon[51019]: from='client.24473 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T16:15:59.297 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:59 vm03.local ceph-mon[51019]: from='client.24477 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T16:15:59.297 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:15:59 vm03.local ceph-mon[51019]: pgmap v39: 65 pgs: 65 active+clean; 635 MiB data, 3.6 GiB used, 116 GiB / 120 GiB avail; 27 KiB/s rd, 1.2 MiB/s wr, 130 op/s
2026-03-09T16:15:59.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.369+0000 7f2c7ac8a640 1 -- 192.168.123.103:0/2501758222 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f2c3c0058d0 con 0x7f2c740719a0
2026-03-09T16:15:59.371 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.370+0000 7f2c62ffd640 1 -- 192.168.123.103:0/2501758222 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+739 (secure 0 0 0) 0x7f2c6c0093e0 con 0x7f2c740719a0
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 12,
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T16:15:59.372 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T16:15:59.374
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.374+0000 7f2c60ff9640 1 -- 192.168.123.103:0/2501758222 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f2c5c077680 msgr2=0x7f2c5c079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:59.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.374+0000 7f2c60ff9640 1 --2- 192.168.123.103:0/2501758222 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f2c5c077680 0x7f2c5c079b40 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f2c64002af0 tx=0x7f2c640023d0 comp rx=0 tx=0).stop 2026-03-09T16:15:59.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.374+0000 7f2c60ff9640 1 -- 192.168.123.103:0/2501758222 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c740719a0 msgr2=0x7f2c741a7050 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:59.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.374+0000 7f2c60ff9640 1 --2- 192.168.123.103:0/2501758222 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c740719a0 0x7f2c741a7050 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f2c6c00efc0 tx=0x7f2c6c00c490 comp rx=0 tx=0).stop 2026-03-09T16:15:59.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.374+0000 7f2c60ff9640 1 -- 192.168.123.103:0/2501758222 shutdown_connections 2026-03-09T16:15:59.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.374+0000 7f2c60ff9640 1 --2- 192.168.123.103:0/2501758222 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f2c5c077680 0x7f2c5c079b40 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:59.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.374+0000 7f2c60ff9640 1 --2- 192.168.123.103:0/2501758222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2c74072370 0x7f2c741a7590 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:59.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.374+0000 7f2c60ff9640 1 --2- 192.168.123.103:0/2501758222 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c740719a0 0x7f2c741a7050 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:59.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.374+0000 7f2c60ff9640 1 -- 192.168.123.103:0/2501758222 >> 192.168.123.103:0/2501758222 conn(0x7f2c7406d4f0 msgr2=0x7f2c7410a770 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:59.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.375+0000 7f2c60ff9640 1 -- 192.168.123.103:0/2501758222 shutdown_connections 2026-03-09T16:15:59.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.375+0000 7f2c60ff9640 1 -- 192.168.123.103:0/2501758222 wait complete. 
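The `ceph versions` output above shows the staggered state this test expects at this point: only the two mgrs report squid 19.2.3, while the mons, OSDs, and MDS daemons are still on reef 18.2.7. As a minimal illustrative sketch (not part of the suite; the helper name is hypothetical), the same check could be scripted by parsing `ceph versions --format json` and counting daemons per release:

# Sketch only: summarize `ceph versions` by release, assuming the JSON shape
# shown in the log above ({"mon": {"ceph version ... reef (stable)": 2}, ...}).
import json
import subprocess

def daemons_per_release():
    out = subprocess.run(
        ["ceph", "versions", "--format", "json"],
        check=True, capture_output=True, text=True,
    ).stdout
    counts = {}
    for daemon_type, releases in json.loads(out).items():
        if daemon_type == "overall":
            continue
        for banner, n in releases.items():
            # banner ends with e.g. "... squid (stable)"; take the release name
            release = banner.split()[-2]
            counts.setdefault(release, {}).setdefault(daemon_type, 0)
            counts[release][daemon_type] += n
    return counts

Against the output logged above this would return {"reef": {"mon": 2, "osd": 6, "mds": 4}, "squid": {"mgr": 2}}, i.e. mgr daemons complete and everything else still pending.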
2026-03-09T16:15:59.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.453+0000 7f9b7a9a2640 1 -- 192.168.123.103:0/2513038710 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b740719a0 msgr2=0x7f9b74071da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:59.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.453+0000 7f9b7a9a2640 1 --2- 192.168.123.103:0/2513038710 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b740719a0 0x7f9b74071da0 secure :-1 s=READY pgs=308 cs=0 l=1 rev1=1 crypto rx=0x7f9b640099b0 tx=0x7f9b6402f240 comp rx=0 tx=0).stop 2026-03-09T16:15:59.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.454+0000 7f9b7a9a2640 1 -- 192.168.123.103:0/2513038710 shutdown_connections 2026-03-09T16:15:59.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.454+0000 7f9b7a9a2640 1 --2- 192.168.123.103:0/2513038710 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b74072370 0x7f9b7410c590 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:59.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.454+0000 7f9b7a9a2640 1 --2- 192.168.123.103:0/2513038710 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b740719a0 0x7f9b74071da0 unknown :-1 s=CLOSED pgs=308 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:59.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.454+0000 7f9b7a9a2640 1 -- 192.168.123.103:0/2513038710 >> 192.168.123.103:0/2513038710 conn(0x7f9b7406d4f0 msgr2=0x7f9b7406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:59.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.454+0000 7f9b7a9a2640 1 -- 192.168.123.103:0/2513038710 shutdown_connections 2026-03-09T16:15:59.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.454+0000 7f9b7a9a2640 1 -- 192.168.123.103:0/2513038710 wait complete. 
2026-03-09T16:15:59.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.454+0000 7f9b7a9a2640 1 Processor -- start 2026-03-09T16:15:59.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.455+0000 7f9b7a9a2640 1 -- start start 2026-03-09T16:15:59.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.455+0000 7f9b7a9a2640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b74072370 0x7f9b741a7140 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:59.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.455+0000 7f9b7a9a2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b741a7680 0x7f9b741ac6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:59.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.455+0000 7f9b7a9a2640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9b741a7b00 con 0x7f9b741a7680 2026-03-09T16:15:59.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.455+0000 7f9b7a9a2640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9b741a7c70 con 0x7f9b74072370 2026-03-09T16:15:59.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.455+0000 7f9b799a0640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b74072370 0x7f9b741a7140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:59.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.455+0000 7f9b799a0640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b74072370 0x7f9b741a7140 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:50570/0 (socket says 192.168.123.103:50570) 2026-03-09T16:15:59.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.455+0000 7f9b799a0640 1 -- 192.168.123.103:0/3917140825 learned_addr learned my addr 192.168.123.103:0/3917140825 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:15:59.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.456+0000 7f9b799a0640 1 -- 192.168.123.103:0/3917140825 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b741a7680 msgr2=0x7f9b741ac6f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:15:59.457 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.456+0000 7f9b799a0640 1 --2- 192.168.123.103:0/3917140825 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b741a7680 0x7f9b741ac6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:59.457 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.456+0000 7f9b799a0640 1 -- 192.168.123.103:0/3917140825 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9b64009660 con 0x7f9b74072370 2026-03-09T16:15:59.457 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.456+0000 7f9b799a0640 1 --2- 192.168.123.103:0/3917140825 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b74072370 0x7f9b741a7140 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f9b6402f750 tx=0x7f9b64002c40 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:59.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.463+0000 7f9b62ffd640 1 -- 192.168.123.103:0/3917140825 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9b6403d070 con 0x7f9b74072370 2026-03-09T16:15:59.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.464+0000 7f9b7a9a2640 1 -- 192.168.123.103:0/3917140825 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9b7410ed10 con 0x7f9b74072370 2026-03-09T16:15:59.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.464+0000 7f9b7a9a2640 1 -- 192.168.123.103:0/3917140825 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9b7410f1b0 con 0x7f9b74072370 2026-03-09T16:15:59.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.464+0000 7f9b62ffd640 1 -- 192.168.123.103:0/3917140825 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9b64004510 con 0x7f9b74072370 2026-03-09T16:15:59.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.464+0000 7f9b62ffd640 1 -- 192.168.123.103:0/3917140825 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9b64031070 con 0x7f9b74072370 2026-03-09T16:15:59.469 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.467+0000 7f9b62ffd640 1 -- 192.168.123.103:0/3917140825 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7f9b64049050 con 0x7f9b74072370 2026-03-09T16:15:59.469 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.467+0000 7f9b60ff9640 1 -- 192.168.123.103:0/3917140825 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9b741183e0 con 0x7f9b74072370 2026-03-09T16:15:59.469 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.468+0000 7f9b62ffd640 1 --2- 192.168.123.103:0/3917140825 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9b4c077680 0x7f9b4c079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:15:59.469 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.468+0000 7f9b7919f640 1 --2- 192.168.123.103:0/3917140825 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9b4c077680 0x7f9b4c079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:15:59.469 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.468+0000 7f9b62ffd640 1 -- 192.168.123.103:0/3917140825 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9b640bde90 con 0x7f9b74072370 2026-03-09T16:15:59.469 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.469+0000 7f9b7919f640 1 --2- 192.168.123.103:0/3917140825 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9b4c077680 0x7f9b4c079b40 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f9b68005fd0 tx=0x7f9b68005950 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:15:59.473 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.472+0000 7f9b62ffd640 1 -- 192.168.123.103:0/3917140825 <== mon.1 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f9b64086810 con 0x7f9b74072370 2026-03-09T16:15:59.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:59 vm05.local ceph-mon[58702]: from='client.14678 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:15:59.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:59 vm05.local ceph-mon[58702]: from='client.24473 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:15:59.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:59 vm05.local ceph-mon[58702]: from='client.24477 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:15:59.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:15:59 vm05.local ceph-mon[58702]: pgmap v39: 65 pgs: 65 active+clean; 635 MiB data, 3.6 GiB used, 116 GiB / 120 GiB avail; 27 KiB/s rd, 1.2 MiB/s wr, 130 op/s 2026-03-09T16:15:59.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.689+0000 7f9b60ff9640 1 -- 192.168.123.103:0/3917140825 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9b741a7e30 con 0x7f9b4c077680 2026-03-09T16:15:59.697 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.695+0000 7f9b62ffd640 1 -- 192.168.123.103:0/3917140825 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+364 (secure 0 0 0) 0x7f9b741a7e30 con 0x7f9b4c077680 2026-03-09T16:15:59.699 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:15:59.699 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T16:15:59.699 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T16:15:59.699 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-09T16:15:59.699 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-09T16:15:59.699 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-09T16:15:59.699 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-09T16:15:59.700 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "2/2 daemons upgraded", 2026-03-09T16:15:59.700 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading grafana daemons", 2026-03-09T16:15:59.700 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T16:15:59.700 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:15:59.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.700+0000 7f9b7a9a2640 1 -- 192.168.123.103:0/3917140825 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9b4c077680 msgr2=0x7f9b4c079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:59.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.700+0000 7f9b7a9a2640 1 --2- 192.168.123.103:0/3917140825 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9b4c077680 0x7f9b4c079b40 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f9b68005fd0 tx=0x7f9b68005950 comp rx=0 tx=0).stop 2026-03-09T16:15:59.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.700+0000 7f9b7a9a2640 1 -- 
192.168.123.103:0/3917140825 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b74072370 msgr2=0x7f9b741a7140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:15:59.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.700+0000 7f9b7a9a2640 1 --2- 192.168.123.103:0/3917140825 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b74072370 0x7f9b741a7140 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f9b6402f750 tx=0x7f9b64002c40 comp rx=0 tx=0).stop 2026-03-09T16:15:59.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.701+0000 7f9b7a9a2640 1 -- 192.168.123.103:0/3917140825 shutdown_connections 2026-03-09T16:15:59.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.701+0000 7f9b7a9a2640 1 --2- 192.168.123.103:0/3917140825 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9b4c077680 0x7f9b4c079b40 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:59.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.701+0000 7f9b7a9a2640 1 --2- 192.168.123.103:0/3917140825 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b741a7680 0x7f9b741ac6f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:59.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.701+0000 7f9b7a9a2640 1 --2- 192.168.123.103:0/3917140825 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9b74072370 0x7f9b741a7140 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:15:59.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.701+0000 7f9b7a9a2640 1 -- 192.168.123.103:0/3917140825 >> 192.168.123.103:0/3917140825 conn(0x7f9b7406d4f0 msgr2=0x7f9b74070430 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:15:59.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.701+0000 7f9b7a9a2640 1 -- 192.168.123.103:0/3917140825 shutdown_connections 2026-03-09T16:15:59.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:15:59.702+0000 7f9b7a9a2640 1 -- 192.168.123.103:0/3917140825 wait complete. 2026-03-09T16:16:00.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:00 vm05.local ceph-mon[58702]: from='client.? 192.168.123.103:0/2501758222' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:00.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:00 vm05.local ceph-mon[58702]: from='client.24485 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:00.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:00 vm03.local ceph-mon[51019]: from='client.? 
192.168.123.103:0/2501758222' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:00.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:00 vm03.local ceph-mon[51019]: from='client.24485 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:01.212 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-09T16:16:01.212 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1/tmp 2026-03-09T16:16:01.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:01 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:01.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:01 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:01.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:01 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:01.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:01 vm03.local ceph-mon[51019]: pgmap v40: 65 pgs: 65 active+clean; 635 MiB data, 3.6 GiB used, 116 GiB / 120 GiB avail; 19 KiB/s rd, 876 KiB/s wr, 78 op/s 2026-03-09T16:16:01.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:01 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:01.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:01 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:01.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:01 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:01.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:01 vm05.local ceph-mon[58702]: pgmap v40: 65 pgs: 65 active+clean; 635 MiB data, 3.6 GiB used, 116 GiB / 120 GiB avail; 19 KiB/s rd, 876 KiB/s wr, 78 op/s 2026-03-09T16:16:02.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:02 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:16:03.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:02 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:16:04.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:04 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:04.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:04 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:04.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:04 vm03.local ceph-mon[51019]: pgmap v41: 65 pgs: 65 active+clean; 635 MiB data, 3.6 GiB used, 116 GiB / 120 GiB avail; 19 KiB/s rd, 876 KiB/s wr, 78 op/s 2026-03-09T16:16:04.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:04 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:04.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:04 
vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:04.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:04 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:04.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:04 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:04.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:04 vm05.local ceph-mon[58702]: pgmap v41: 65 pgs: 65 active+clean; 635 MiB data, 3.6 GiB used, 116 GiB / 120 GiB avail; 19 KiB/s rd, 876 KiB/s wr, 78 op/s 2026-03-09T16:16:04.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:04 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:04.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:04 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:05.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:05 vm05.local ceph-mon[58702]: pgmap v42: 65 pgs: 65 active+clean; 341 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 1.3 MiB/s wr, 111 op/s 2026-03-09T16:16:05.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:05 vm03.local ceph-mon[51019]: pgmap v42: 65 pgs: 65 active+clean; 341 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 1.3 MiB/s wr, 111 op/s 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: Upgrade: Finalizing container_image settings 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:06.402 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local 
ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-09T16:16:06.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: Upgrade: Complete! 
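The `orch upgrade status` JSON earlier ("in_progress": true, "progress": "2/2 daemons upgraded", services_complete ["mgr"]) and the "Upgrade: Finalizing container_image settings" / "Upgrade: Complete!" cluster-log lines here are the signals the test polls for. A hedged sketch of such a poll loop (an assumed helper, not code from this run; `ceph orch upgrade status` already prints JSON, as seen in the stdout above):

# Sketch only: wait until cephadm reports the upgrade is no longer in progress.
import json
import subprocess
import time

def wait_for_upgrade(timeout=1800, interval=30):
    deadline = time.time() + timeout
    while time.time() < deadline:
        out = subprocess.run(
            ["ceph", "orch", "upgrade", "status"],
            check=True, capture_output=True, text=True,
        ).stdout
        status = json.loads(out)
        if not status.get("in_progress"):
            return status
        time.sleep(interval)
    raise TimeoutError(f"upgrade did not finish within {timeout}s")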
2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:16:06.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:06 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:06.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:06.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:06.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:16:06.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:16:06.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:06.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 
cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T16:16:06.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T16:16:06.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: Upgrade: Finalizing container_image settings 2026-03-09T16:16:06.527 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 
cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: Upgrade: Complete! 
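The finalization sequence mirrored on vm05 above removes the per-daemon container_image overrides (mgr, mon, osd, mds, client.*) that cephadm set while the staggered upgrade ran. As an illustrative follow-up check (an assumption, not something this run executes), one could confirm no per-daemon override is left behind by scanning `ceph config dump`; typically only a global container_image entry pointing at the new image should remain:

# Sketch only: list any non-global container_image overrides still present.
import json
import subprocess

def leftover_container_image_overrides():
    out = subprocess.run(
        ["ceph", "config", "dump", "--format", "json"],
        check=True, capture_output=True, text=True,
    ).stdout
    return [
        (opt["section"], opt["value"])
        for opt in json.loads(out)
        if opt["name"] == "container_image" and opt["section"] != "global"
    ]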
2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:16:06.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:06 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:07.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:07 vm05.local ceph-mon[58702]: pgmap v43: 65 pgs: 65 active+clean; 341 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 15 KiB/s rd, 720 KiB/s wr, 62 op/s 2026-03-09T16:16:07.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:07 vm03.local ceph-mon[51019]: pgmap v43: 65 pgs: 65 active+clean; 341 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 15 KiB/s rd, 720 KiB/s wr, 62 op/s 2026-03-09T16:16:09.463 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:09 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:09.463 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:09 vm03.local ceph-mon[51019]: pgmap v44: 65 pgs: 65 active+clean; 295 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.5 MiB/s wr, 110 op/s 2026-03-09T16:16:09.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:09 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:09.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:09 vm05.local ceph-mon[58702]: pgmap v44: 65 pgs: 65 active+clean; 295 MiB 
data, 2.7 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.5 MiB/s wr, 110 op/s 2026-03-09T16:16:11.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:11 vm03.local ceph-mon[51019]: pgmap v45: 65 pgs: 65 active+clean; 295 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.2 MiB/s wr, 81 op/s 2026-03-09T16:16:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:11 vm05.local ceph-mon[58702]: pgmap v45: 65 pgs: 65 active+clean; 295 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.2 MiB/s wr, 81 op/s 2026-03-09T16:16:13.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:13 vm03.local ceph-mon[51019]: pgmap v46: 65 pgs: 65 active+clean; 295 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.2 MiB/s wr, 81 op/s 2026-03-09T16:16:13.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:13 vm05.local ceph-mon[58702]: pgmap v46: 65 pgs: 65 active+clean; 295 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.2 MiB/s wr, 81 op/s 2026-03-09T16:16:16.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:15 vm05.local ceph-mon[58702]: pgmap v47: 65 pgs: 65 active+clean; 299 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 39 KiB/s rd, 2.0 MiB/s wr, 128 op/s 2026-03-09T16:16:16.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:15 vm03.local ceph-mon[51019]: pgmap v47: 65 pgs: 65 active+clean; 299 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 39 KiB/s rd, 2.0 MiB/s wr, 128 op/s 2026-03-09T16:16:17.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:17 vm05.local ceph-mon[58702]: pgmap v48: 65 pgs: 65 active+clean; 299 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 30 KiB/s rd, 1.6 MiB/s wr, 95 op/s 2026-03-09T16:16:17.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:17 vm03.local ceph-mon[51019]: pgmap v48: 65 pgs: 65 active+clean; 299 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 30 KiB/s rd, 1.6 MiB/s wr, 95 op/s 2026-03-09T16:16:18.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:18 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:16:18.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:18 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:16:19.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:19 vm05.local ceph-mon[58702]: pgmap v49: 65 pgs: 65 active+clean; 303 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 48 KiB/s rd, 2.8 MiB/s wr, 150 op/s 2026-03-09T16:16:19.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:19 vm03.local ceph-mon[51019]: pgmap v49: 65 pgs: 65 active+clean; 303 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 48 KiB/s rd, 2.8 MiB/s wr, 150 op/s 2026-03-09T16:16:21.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:20 vm03.local ceph-mon[51019]: mgrmap e31: vm03.gbgzmu(active, since 92s), standbys: vm05.dygxfv 2026-03-09T16:16:21.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:20 vm05.local ceph-mon[58702]: mgrmap e31: vm03.gbgzmu(active, since 92s), standbys: vm05.dygxfv 2026-03-09T16:16:22.070 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:21 vm05.local ceph-mon[58702]: pgmap v50: 65 pgs: 65 active+clean; 303 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 2.0 MiB/s wr, 102 op/s 2026-03-09T16:16:22.140 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:21 vm03.local ceph-mon[51019]: pgmap v50: 65 pgs: 65 active+clean; 303 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 2.0 MiB/s wr, 102 op/s 2026-03-09T16:16:23.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:23 vm05.local ceph-mon[58702]: pgmap v51: 65 pgs: 65 active+clean; 303 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 2.0 MiB/s wr, 102 op/s 2026-03-09T16:16:23.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:23 vm03.local ceph-mon[51019]: pgmap v51: 65 pgs: 65 active+clean; 303 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 2.0 MiB/s wr, 102 op/s 2026-03-09T16:16:25.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:25 vm05.local ceph-mon[58702]: pgmap v52: 65 pgs: 65 active+clean; 308 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 52 KiB/s rd, 2.8 MiB/s wr, 147 op/s 2026-03-09T16:16:25.792 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:25 vm03.local ceph-mon[51019]: pgmap v52: 65 pgs: 65 active+clean; 308 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 52 KiB/s rd, 2.8 MiB/s wr, 147 op/s 2026-03-09T16:16:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:27 vm05.local ceph-mon[58702]: pgmap v53: 65 pgs: 65 active+clean; 308 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 38 KiB/s rd, 2.0 MiB/s wr, 100 op/s 2026-03-09T16:16:27.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:27 vm03.local ceph-mon[51019]: pgmap v53: 65 pgs: 65 active+clean; 308 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 38 KiB/s rd, 2.0 MiB/s wr, 100 op/s 2026-03-09T16:16:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:29 vm05.local ceph-mon[58702]: pgmap v54: 65 pgs: 65 active+clean; 309 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 55 KiB/s rd, 2.8 MiB/s wr, 157 op/s 2026-03-09T16:16:29.792 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:29 vm03.local ceph-mon[51019]: pgmap v54: 65 pgs: 65 active+clean; 309 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 55 KiB/s rd, 2.8 MiB/s wr, 157 op/s 2026-03-09T16:16:29.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.791+0000 7f522b148640 1 -- 192.168.123.103:0/4284585251 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5224072440 msgr2=0x7f52240771b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:29.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.791+0000 7f522b148640 1 --2- 192.168.123.103:0/4284585251 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5224072440 0x7f52240771b0 secure :-1 s=READY pgs=309 cs=0 l=1 rev1=1 crypto rx=0x7f521c008090 tx=0x7f521c031ea0 comp rx=0 tx=0).stop 2026-03-09T16:16:29.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.792+0000 7f522b148640 1 -- 192.168.123.103:0/4284585251 shutdown_connections 2026-03-09T16:16:29.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.792+0000 7f522b148640 1 --2- 192.168.123.103:0/4284585251 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5224072440 0x7f52240771b0 unknown :-1 s=CLOSED pgs=309 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:29.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.792+0000 7f522b148640 1 --2- 192.168.123.103:0/4284585251 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5224071a70 0x7f5224071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:29.793 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.792+0000 7f522b148640 1 -- 192.168.123.103:0/4284585251 >> 192.168.123.103:0/4284585251 conn(0x7f522406d4f0 msgr2=0x7f522406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.793+0000 7f522b148640 1 -- 192.168.123.103:0/4284585251 shutdown_connections 2026-03-09T16:16:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.793+0000 7f522b148640 1 -- 192.168.123.103:0/4284585251 wait complete. 2026-03-09T16:16:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.793+0000 7f522b148640 1 Processor -- start 2026-03-09T16:16:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.793+0000 7f522b148640 1 -- start start 2026-03-09T16:16:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.793+0000 7f522b148640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5224071a70 0x7f5224084110 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.793+0000 7f522b148640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5224082760 0x7f5224082be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.793+0000 7f522b148640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5224084650 con 0x7f5224071a70 2026-03-09T16:16:29.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.793+0000 7f522b148640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5224083150 con 0x7f5224082760 2026-03-09T16:16:29.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.793+0000 7f5228ebd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5224071a70 0x7f5224084110 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:29.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.793+0000 7f5228ebd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5224071a70 0x7f5224084110 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39936/0 (socket says 192.168.123.103:39936) 2026-03-09T16:16:29.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.793+0000 7f5228ebd640 1 -- 192.168.123.103:0/4218210132 learned_addr learned my addr 192.168.123.103:0/4218210132 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:29.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.794+0000 7f5223fff640 1 --2- 192.168.123.103:0/4218210132 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5224082760 0x7f5224082be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:29.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.794+0000 7f5223fff640 1 -- 192.168.123.103:0/4218210132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5224071a70 msgr2=0x7f5224084110 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:29.795 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.794+0000 7f5223fff640 1 --2- 192.168.123.103:0/4218210132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5224071a70 0x7f5224084110 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:29.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.794+0000 7f5223fff640 1 -- 192.168.123.103:0/4218210132 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f521c007ce0 con 0x7f5224082760 2026-03-09T16:16:29.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.794+0000 7f5223fff640 1 --2- 192.168.123.103:0/4218210132 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5224082760 0x7f5224082be0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f521c008090 tx=0x7f521c002ea0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:29.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.794+0000 7f5221ffb640 1 -- 192.168.123.103:0/4218210132 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f521c032780 con 0x7f5224082760 2026-03-09T16:16:29.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.795+0000 7f5221ffb640 1 -- 192.168.123.103:0/4218210132 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f521c032da0 con 0x7f5224082760 2026-03-09T16:16:29.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.795+0000 7f5221ffb640 1 -- 192.168.123.103:0/4218210132 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f521c00e450 con 0x7f5224082760 2026-03-09T16:16:29.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.795+0000 7f522b148640 1 -- 192.168.123.103:0/4218210132 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f52240833d0 con 0x7f5224082760 2026-03-09T16:16:29.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.795+0000 7f522b148640 1 -- 192.168.123.103:0/4218210132 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f522412ef70 con 0x7f5224082760 2026-03-09T16:16:29.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.798+0000 7f5221ffb640 1 -- 192.168.123.103:0/4218210132 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7f521c04b050 con 0x7f5224082760 2026-03-09T16:16:29.799 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.798+0000 7f5221ffb640 1 --2- 192.168.123.103:0/4218210132 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f5204077630 0x7f5204079af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:29.799 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.798+0000 7f5221ffb640 1 -- 192.168.123.103:0/4218210132 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f521c0bfde0 con 0x7f5224082760 2026-03-09T16:16:29.799 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.798+0000 7f5228ebd640 1 --2- 192.168.123.103:0/4218210132 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f5204077630 0x7f5204079af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:29.799 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.799+0000 7f5228ebd640 1 --2- 192.168.123.103:0/4218210132 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f5204077630 0x7f5204079af0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f5214005fd0 tx=0x7f52140074e0 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:29.799 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.799+0000 7f522b148640 1 -- 192.168.123.103:0/4218210132 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f51e8005350 con 0x7f5224082760 2026-03-09T16:16:29.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.807+0000 7f5221ffb640 1 -- 192.168.123.103:0/4218210132 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f521c088980 con 0x7f5224082760 2026-03-09T16:16:29.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.968+0000 7f522b148640 1 -- 192.168.123.103:0/4218210132 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f51e8002bf0 con 0x7f5204077630 2026-03-09T16:16:29.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.969+0000 7f5221ffb640 1 -- 192.168.123.103:0/4218210132 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f51e8002bf0 con 0x7f5204077630 2026-03-09T16:16:29.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.973+0000 7f52037fe640 1 -- 192.168.123.103:0/4218210132 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f5204077630 msgr2=0x7f5204079af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:29.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.974+0000 7f52037fe640 1 --2- 192.168.123.103:0/4218210132 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f5204077630 0x7f5204079af0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f5214005fd0 tx=0x7f52140074e0 comp rx=0 tx=0).stop 2026-03-09T16:16:29.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.974+0000 7f52037fe640 1 -- 192.168.123.103:0/4218210132 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5224082760 msgr2=0x7f5224082be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:29.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.974+0000 7f52037fe640 1 --2- 192.168.123.103:0/4218210132 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5224082760 0x7f5224082be0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f521c008090 tx=0x7f521c002ea0 comp rx=0 tx=0).stop 2026-03-09T16:16:29.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.974+0000 7f52037fe640 1 -- 192.168.123.103:0/4218210132 shutdown_connections 2026-03-09T16:16:29.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.974+0000 7f52037fe640 1 --2- 192.168.123.103:0/4218210132 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f5204077630 0x7f5204079af0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T16:16:29.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.974+0000 7f52037fe640 1 --2- 192.168.123.103:0/4218210132 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5224082760 0x7f5224082be0 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:29.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.974+0000 7f52037fe640 1 --2- 192.168.123.103:0/4218210132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5224071a70 0x7f5224084110 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:29.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.974+0000 7f52037fe640 1 -- 192.168.123.103:0/4218210132 >> 192.168.123.103:0/4218210132 conn(0x7f522406d4f0 msgr2=0x7f5224073150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:29.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.974+0000 7f52037fe640 1 -- 192.168.123.103:0/4218210132 shutdown_connections 2026-03-09T16:16:29.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:29.974+0000 7f52037fe640 1 -- 192.168.123.103:0/4218210132 wait complete. 2026-03-09T16:16:30.159 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.mgr | length == 1'"'"'' 2026-03-09T16:16:30.408 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:16:30.744 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:30 vm03.local ceph-mon[51019]: from='client.24489 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:30.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.874+0000 7faad7fff640 1 -- 192.168.123.103:0/522379769 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faad8072120 msgr2=0x7faad8072500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:30.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.874+0000 7faad7fff640 1 --2- 192.168.123.103:0/522379769 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faad8072120 0x7faad8072500 secure :-1 s=READY pgs=310 cs=0 l=1 rev1=1 crypto rx=0x7faacc0099b0 tx=0x7faacc02f220 comp rx=0 tx=0).stop 2026-03-09T16:16:30.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.876+0000 7faad7fff640 1 -- 192.168.123.103:0/522379769 shutdown_connections 2026-03-09T16:16:30.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.876+0000 7faad7fff640 1 --2- 192.168.123.103:0/522379769 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faad8072a40 0x7faad810ca90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:30.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.876+0000 7faad7fff640 1 --2- 192.168.123.103:0/522379769 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faad8072120 0x7faad8072500 unknown :-1 s=CLOSED pgs=310 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:30.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.876+0000 7faad7fff640 1 -- 192.168.123.103:0/522379769 >> 
192.168.123.103:0/522379769 conn(0x7faad806c7d0 msgr2=0x7faad806cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:30.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.877+0000 7faad7fff640 1 -- 192.168.123.103:0/522379769 shutdown_connections 2026-03-09T16:16:30.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.877+0000 7faad7fff640 1 -- 192.168.123.103:0/522379769 wait complete. 2026-03-09T16:16:30.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.877+0000 7faad7fff640 1 Processor -- start 2026-03-09T16:16:30.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.878+0000 7faad7fff640 1 -- start start 2026-03-09T16:16:30.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.878+0000 7faad7fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faad8072120 0x7faad81a75e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:30.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.878+0000 7faad7fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faad8072a40 0x7faad81a7b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:30.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.878+0000 7faad7fff640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faad81a81b0 con 0x7faad8072a40 2026-03-09T16:16:30.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.878+0000 7faad7fff640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faad81abf20 con 0x7faad8072120 2026-03-09T16:16:30.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.878+0000 7faad6ffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faad8072120 0x7faad81a75e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:30.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.878+0000 7faad6ffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faad8072120 0x7faad81a75e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:46532/0 (socket says 192.168.123.103:46532) 2026-03-09T16:16:30.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.878+0000 7faad6ffd640 1 -- 192.168.123.103:0/458560123 learned_addr learned my addr 192.168.123.103:0/458560123 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:30.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.879+0000 7faad67fc640 1 --2- 192.168.123.103:0/458560123 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faad8072a40 0x7faad81a7b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:30.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.879+0000 7faad6ffd640 1 -- 192.168.123.103:0/458560123 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faad8072a40 msgr2=0x7faad81a7b20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:30.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.879+0000 7faad6ffd640 1 --2- 192.168.123.103:0/458560123 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faad8072a40 0x7faad81a7b20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:30.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.879+0000 7faad6ffd640 1 -- 192.168.123.103:0/458560123 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faacc009660 con 0x7faad8072120 2026-03-09T16:16:30.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.879+0000 7faad6ffd640 1 --2- 192.168.123.103:0/458560123 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faad8072120 0x7faad81a75e0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7faacc005ec0 tx=0x7faacc004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:30.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.879+0000 7faadc87b640 1 -- 192.168.123.103:0/458560123 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faacc03d070 con 0x7faad8072120 2026-03-09T16:16:30.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.879+0000 7faad7fff640 1 -- 192.168.123.103:0/458560123 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faad81ac1a0 con 0x7faad8072120 2026-03-09T16:16:30.881 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.879+0000 7faad7fff640 1 -- 192.168.123.103:0/458560123 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faad81ac690 con 0x7faad8072120 2026-03-09T16:16:30.881 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.879+0000 7faadc87b640 1 -- 192.168.123.103:0/458560123 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7faacc038730 con 0x7faad8072120 2026-03-09T16:16:30.881 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.879+0000 7faadc87b640 1 -- 192.168.123.103:0/458560123 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faacc041600 con 0x7faad8072120 2026-03-09T16:16:30.881 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.881+0000 7faad7fff640 1 -- 192.168.123.103:0/458560123 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faad8108570 con 0x7faad8072120 2026-03-09T16:16:30.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.883+0000 7faadc87b640 1 -- 192.168.123.103:0/458560123 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7faacc0388a0 con 0x7faad8072120 2026-03-09T16:16:30.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.883+0000 7faadc87b640 1 --2- 192.168.123.103:0/458560123 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7faaa8077510 0x7faaa80799d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:30.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.883+0000 7faadc87b640 1 -- 192.168.123.103:0/458560123 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7faacc0be450 con 0x7faad8072120 2026-03-09T16:16:30.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.884+0000 7faad67fc640 1 --2- 192.168.123.103:0/458560123 >> 
[v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7faaa8077510 0x7faaa80799d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:30.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.884+0000 7faad67fc640 1 --2- 192.168.123.103:0/458560123 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7faaa8077510 0x7faaa80799d0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7faad81a8b90 tx=0x7faac00073d0 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:30.886 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:30.885+0000 7faadc87b640 1 -- 192.168.123.103:0/458560123 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7faacc087130 con 0x7faad8072120 2026-03-09T16:16:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:30 vm05.local ceph-mon[58702]: from='client.24489 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:31.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.090+0000 7faad7fff640 1 -- 192.168.123.103:0/458560123 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7faad811c9c0 con 0x7faad8072120 2026-03-09T16:16:31.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.096+0000 7faadc87b640 1 -- 192.168.123.103:0/458560123 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+739 (secure 0 0 0) 0x7faacc086880 con 0x7faad8072120 2026-03-09T16:16:31.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.099+0000 7faab67fc640 1 -- 192.168.123.103:0/458560123 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7faaa8077510 msgr2=0x7faaa80799d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:31.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.099+0000 7faab67fc640 1 --2- 192.168.123.103:0/458560123 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7faaa8077510 0x7faaa80799d0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7faad81a8b90 tx=0x7faac00073d0 comp rx=0 tx=0).stop 2026-03-09T16:16:31.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.099+0000 7faab67fc640 1 -- 192.168.123.103:0/458560123 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faad8072120 msgr2=0x7faad81a75e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:31.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.099+0000 7faab67fc640 1 --2- 192.168.123.103:0/458560123 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faad8072120 0x7faad81a75e0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7faacc005ec0 tx=0x7faacc004290 comp rx=0 tx=0).stop 2026-03-09T16:16:31.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.099+0000 7faab67fc640 1 -- 192.168.123.103:0/458560123 shutdown_connections 2026-03-09T16:16:31.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.099+0000 7faab67fc640 1 --2- 192.168.123.103:0/458560123 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7faaa8077510 0x7faaa80799d0 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:31.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.099+0000 7faab67fc640 1 --2- 192.168.123.103:0/458560123 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faad8072a40 0x7faad81a7b20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:31.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.099+0000 7faab67fc640 1 --2- 192.168.123.103:0/458560123 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faad8072120 0x7faad81a75e0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:31.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.099+0000 7faab67fc640 1 -- 192.168.123.103:0/458560123 >> 192.168.123.103:0/458560123 conn(0x7faad806c7d0 msgr2=0x7faad810dde0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:31.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.101+0000 7faab67fc640 1 -- 192.168.123.103:0/458560123 shutdown_connections 2026-03-09T16:16:31.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.101+0000 7faab67fc640 1 -- 192.168.123.103:0/458560123 wait complete. 2026-03-09T16:16:31.115 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:16:31.200 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.mgr | keys'"'"' | grep $sha1' 2026-03-09T16:16:31.425 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:16:31.723 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:31 vm03.local ceph-mon[51019]: pgmap v55: 65 pgs: 65 active+clean; 309 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 1.5 MiB/s wr, 102 op/s 2026-03-09T16:16:31.723 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:31 vm03.local ceph-mon[51019]: from='client.? 
192.168.123.103:0/458560123' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:31.750 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.749+0000 7fe2b58c2640 1 -- 192.168.123.103:0/3872030383 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2b0072af0 msgr2=0x7fe2b010ba70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:31.750 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.749+0000 7fe2b58c2640 1 --2- 192.168.123.103:0/3872030383 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2b0072af0 0x7fe2b010ba70 secure :-1 s=READY pgs=311 cs=0 l=1 rev1=1 crypto rx=0x7fe2a800b600 tx=0x7fe2a8030670 comp rx=0 tx=0).stop 2026-03-09T16:16:31.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.750+0000 7fe2b58c2640 1 -- 192.168.123.103:0/3872030383 shutdown_connections 2026-03-09T16:16:31.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.750+0000 7fe2b58c2640 1 --2- 192.168.123.103:0/3872030383 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2b0072af0 0x7fe2b010ba70 unknown :-1 s=CLOSED pgs=311 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:31.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.750+0000 7fe2b58c2640 1 --2- 192.168.123.103:0/3872030383 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b0072140 0x7fe2b0072520 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:31.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.750+0000 7fe2b58c2640 1 -- 192.168.123.103:0/3872030383 >> 192.168.123.103:0/3872030383 conn(0x7fe2b006c7e0 msgr2=0x7fe2b006cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:31.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.750+0000 7fe2b58c2640 1 -- 192.168.123.103:0/3872030383 shutdown_connections 2026-03-09T16:16:31.752 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.750+0000 7fe2b58c2640 1 -- 192.168.123.103:0/3872030383 wait complete. 
2026-03-09T16:16:31.752 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.750+0000 7fe2b58c2640 1 Processor -- start 2026-03-09T16:16:31.752 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.751+0000 7fe2b58c2640 1 -- start start 2026-03-09T16:16:31.752 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.751+0000 7fe2b58c2640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b0072140 0x7fe2b0133230 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:31.753 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.751+0000 7fe2b58c2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2b0133770 0x7fe2b007e8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:31.753 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.751+0000 7fe2b58c2640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe2b0133d20 con 0x7fe2b0133770 2026-03-09T16:16:31.753 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.751+0000 7fe2b58c2640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe2b0133e90 con 0x7fe2b0072140 2026-03-09T16:16:31.753 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.751+0000 7fe2ae7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2b0133770 0x7fe2b007e8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:31.753 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.751+0000 7fe2ae7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2b0133770 0x7fe2b007e8a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39966/0 (socket says 192.168.123.103:39966) 2026-03-09T16:16:31.753 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.751+0000 7fe2ae7fc640 1 -- 192.168.123.103:0/109157308 learned_addr learned my addr 192.168.123.103:0/109157308 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:31.753 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.751+0000 7fe2aeffd640 1 --2- 192.168.123.103:0/109157308 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b0072140 0x7fe2b0133230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:31.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.751+0000 7fe2ae7fc640 1 -- 192.168.123.103:0/109157308 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b0072140 msgr2=0x7fe2b0133230 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:31.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.751+0000 7fe2ae7fc640 1 --2- 192.168.123.103:0/109157308 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b0072140 0x7fe2b0133230 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:31.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.751+0000 7fe2ae7fc640 1 -- 192.168.123.103:0/109157308 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe2a8009d00 con 
0x7fe2b0133770 2026-03-09T16:16:31.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.752+0000 7fe2ae7fc640 1 --2- 192.168.123.103:0/109157308 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2b0133770 0x7fe2b007e8a0 secure :-1 s=READY pgs=312 cs=0 l=1 rev1=1 crypto rx=0x7fe2a8009510 tx=0x7fe2a8009540 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:31.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.752+0000 7fe2b48c0640 1 -- 192.168.123.103:0/109157308 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe2a8007db0 con 0x7fe2b0133770 2026-03-09T16:16:31.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.752+0000 7fe2b48c0640 1 -- 192.168.123.103:0/109157308 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe2a8033040 con 0x7fe2b0133770 2026-03-09T16:16:31.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.752+0000 7fe2b48c0640 1 -- 192.168.123.103:0/109157308 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe2a8038c40 con 0x7fe2b0133770 2026-03-09T16:16:31.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.752+0000 7fe2b58c2640 1 -- 192.168.123.103:0/109157308 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe2b007ee40 con 0x7fe2b0133770 2026-03-09T16:16:31.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.752+0000 7fe2b58c2640 1 -- 192.168.123.103:0/109157308 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe2b007f390 con 0x7fe2b0133770 2026-03-09T16:16:31.755 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.754+0000 7fe2b58c2640 1 -- 192.168.123.103:0/109157308 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe288005350 con 0x7fe2b0133770 2026-03-09T16:16:31.755 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.754+0000 7fe2b48c0640 1 -- 192.168.123.103:0/109157308 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7fe2a8048050 con 0x7fe2b0133770 2026-03-09T16:16:31.755 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.755+0000 7fe2b48c0640 1 --2- 192.168.123.103:0/109157308 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fe2940776d0 0x7fe294079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:31.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.755+0000 7fe2b48c0640 1 -- 192.168.123.103:0/109157308 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fe2a80bef10 con 0x7fe2b0133770 2026-03-09T16:16:31.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.755+0000 7fe2aeffd640 1 --2- 192.168.123.103:0/109157308 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fe2940776d0 0x7fe294079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:31.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.758+0000 7fe2aeffd640 1 --2- 192.168.123.103:0/109157308 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] 
conn(0x7fe2940776d0 0x7fe294079b90 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fe2a00059c0 tx=0x7fe2a000a380 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:31.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.762+0000 7fe2b48c0640 1 -- 192.168.123.103:0/109157308 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7fe2a8087b40 con 0x7fe2b0133770 2026-03-09T16:16:31.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.953+0000 7fe2b58c2640 1 -- 192.168.123.103:0/109157308 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fe288005e10 con 0x7fe2b0133770 2026-03-09T16:16:31.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.956+0000 7fe2b48c0640 1 -- 192.168.123.103:0/109157308 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+739 (secure 0 0 0) 0x7fe2a8087290 con 0x7fe2b0133770 2026-03-09T16:16:31.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.959+0000 7fe2b58c2640 1 -- 192.168.123.103:0/109157308 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fe2940776d0 msgr2=0x7fe294079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:31.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.959+0000 7fe2b58c2640 1 --2- 192.168.123.103:0/109157308 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fe2940776d0 0x7fe294079b90 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fe2a00059c0 tx=0x7fe2a000a380 comp rx=0 tx=0).stop 2026-03-09T16:16:31.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.959+0000 7fe2b58c2640 1 -- 192.168.123.103:0/109157308 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2b0133770 msgr2=0x7fe2b007e8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:31.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.959+0000 7fe2b58c2640 1 --2- 192.168.123.103:0/109157308 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2b0133770 0x7fe2b007e8a0 secure :-1 s=READY pgs=312 cs=0 l=1 rev1=1 crypto rx=0x7fe2a8009510 tx=0x7fe2a8009540 comp rx=0 tx=0).stop 2026-03-09T16:16:31.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.959+0000 7fe2b58c2640 1 -- 192.168.123.103:0/109157308 shutdown_connections 2026-03-09T16:16:31.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.959+0000 7fe2b58c2640 1 --2- 192.168.123.103:0/109157308 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fe2940776d0 0x7fe294079b90 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:31.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.959+0000 7fe2b58c2640 1 --2- 192.168.123.103:0/109157308 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2b0133770 0x7fe2b007e8a0 unknown :-1 s=CLOSED pgs=312 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:31.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.959+0000 7fe2b58c2640 1 --2- 192.168.123.103:0/109157308 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b0072140 0x7fe2b0133230 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T16:16:31.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.959+0000 7fe2b58c2640 1 -- 192.168.123.103:0/109157308 >> 192.168.123.103:0/109157308 conn(0x7fe2b006c7e0 msgr2=0x7fe2b006f8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:31.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.960+0000 7fe2b58c2640 1 -- 192.168.123.103:0/109157308 shutdown_connections 2026-03-09T16:16:31.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:31.960+0000 7fe2b58c2640 1 -- 192.168.123.103:0/109157308 wait complete. 2026-03-09T16:16:31.972 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)" 2026-03-09T16:16:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:31 vm05.local ceph-mon[58702]: pgmap v55: 65 pgs: 65 active+clean; 309 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 1.5 MiB/s wr, 102 op/s 2026-03-09T16:16:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:31 vm05.local ceph-mon[58702]: from='client.? 192.168.123.103:0/458560123' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:32.057 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | length == 2'"'"'' 2026-03-09T16:16:32.365 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:16:33.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.190+0000 7f81ca8c5640 1 -- 192.168.123.103:0/3319643488 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4072a40 msgr2=0x7f81c410ca90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:33.191 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:32 vm03.local ceph-mon[51019]: from='client.? 
192.168.123.103:0/109157308' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:33.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.190+0000 7f81ca8c5640 1 --2- 192.168.123.103:0/3319643488 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4072a40 0x7f81c410ca90 secure :-1 s=READY pgs=313 cs=0 l=1 rev1=1 crypto rx=0x7f81b80099b0 tx=0x7f81b802f240 comp rx=0 tx=0).stop 2026-03-09T16:16:33.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.191+0000 7f81ca8c5640 1 -- 192.168.123.103:0/3319643488 shutdown_connections 2026-03-09T16:16:33.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.191+0000 7f81ca8c5640 1 --2- 192.168.123.103:0/3319643488 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4072a40 0x7f81c410ca90 unknown :-1 s=CLOSED pgs=313 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:33.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.191+0000 7f81ca8c5640 1 --2- 192.168.123.103:0/3319643488 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f81c4072120 0x7f81c4072500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:33.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.191+0000 7f81ca8c5640 1 -- 192.168.123.103:0/3319643488 >> 192.168.123.103:0/3319643488 conn(0x7f81c406c7d0 msgr2=0x7f81c406cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:33.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.191+0000 7f81ca8c5640 1 -- 192.168.123.103:0/3319643488 shutdown_connections 2026-03-09T16:16:33.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.191+0000 7f81ca8c5640 1 -- 192.168.123.103:0/3319643488 wait complete. 
2026-03-09T16:16:33.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.191+0000 7f81ca8c5640 1 Processor -- start 2026-03-09T16:16:33.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.191+0000 7f81ca8c5640 1 -- start start 2026-03-09T16:16:33.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.192+0000 7f81ca8c5640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f81c4072120 0x7f81c4112ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:33.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.192+0000 7f81ca8c5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4072a40 0x7f81c4113420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:33.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.192+0000 7f81ca8c5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f81c4113b00 con 0x7f81c4072a40 2026-03-09T16:16:33.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.192+0000 7f81ca8c5640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f81c4117840 con 0x7f81c4072120 2026-03-09T16:16:33.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.192+0000 7f81c98c3640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f81c4072120 0x7f81c4112ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:33.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.192+0000 7f81c98c3640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f81c4072120 0x7f81c4112ec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:46572/0 (socket says 192.168.123.103:46572) 2026-03-09T16:16:33.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.192+0000 7f81c98c3640 1 -- 192.168.123.103:0/72255672 learned_addr learned my addr 192.168.123.103:0/72255672 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:33.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.192+0000 7f81c98c3640 1 -- 192.168.123.103:0/72255672 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4072a40 msgr2=0x7f81c4113420 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:33.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.192+0000 7f81c98c3640 1 --2- 192.168.123.103:0/72255672 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4072a40 0x7f81c4113420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:33.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.192+0000 7f81c98c3640 1 -- 192.168.123.103:0/72255672 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f81b8009660 con 0x7f81c4072120 2026-03-09T16:16:33.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.192+0000 7f81c98c3640 1 --2- 192.168.123.103:0/72255672 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f81c4072120 0x7f81c4112ec0 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f81b400e990 tx=0x7f81b400ee60 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:33.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.192+0000 7f81b2ffd640 1 -- 192.168.123.103:0/72255672 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f81b400cd30 con 0x7f81c4072120 2026-03-09T16:16:33.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.192+0000 7f81ca8c5640 1 -- 192.168.123.103:0/72255672 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f81c4117b20 con 0x7f81c4072120 2026-03-09T16:16:33.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.193+0000 7f81b2ffd640 1 -- 192.168.123.103:0/72255672 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f81b400ce90 con 0x7f81c4072120 2026-03-09T16:16:33.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.193+0000 7f81ca8c5640 1 -- 192.168.123.103:0/72255672 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f81c4118070 con 0x7f81c4072120 2026-03-09T16:16:33.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.193+0000 7f81b2ffd640 1 -- 192.168.123.103:0/72255672 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f81b4004280 con 0x7f81c4072120 2026-03-09T16:16:33.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.194+0000 7f81b2ffd640 1 -- 192.168.123.103:0/72255672 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7f81b40043e0 con 0x7f81c4072120 2026-03-09T16:16:33.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.194+0000 7f81b2ffd640 1 --2- 192.168.123.103:0/72255672 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f81a0077560 0x7f81a0079a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:33.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.194+0000 7f81b2ffd640 1 -- 192.168.123.103:0/72255672 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f81b4014070 con 0x7f81c4072120 2026-03-09T16:16:33.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.195+0000 7f81c90c2640 1 --2- 192.168.123.103:0/72255672 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f81a0077560 0x7f81a0079a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:33.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.195+0000 7f81ca8c5640 1 -- 192.168.123.103:0/72255672 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f81c4108570 con 0x7f81c4072120 2026-03-09T16:16:33.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.197+0000 7f81c90c2640 1 --2- 192.168.123.103:0/72255672 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f81a0077560 0x7f81a0079a20 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f81c4114420 tx=0x7f81b80047c0 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:33.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.199+0000 7f81b2ffd640 1 -- 192.168.123.103:0/72255672 <== mon.1 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f81b4062320 con 0x7f81c4072120 2026-03-09T16:16:33.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:32 vm05.local ceph-mon[58702]: from='client.? 192.168.123.103:0/109157308' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:33.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.372+0000 7f81ca8c5640 1 -- 192.168.123.103:0/72255672 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f81c411c9c0 con 0x7f81c4072120 2026-03-09T16:16:33.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.374+0000 7f81b2ffd640 1 -- 192.168.123.103:0/72255672 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+739 (secure 0 0 0) 0x7f81b4061a70 con 0x7f81c4072120 2026-03-09T16:16:33.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.378+0000 7f81b0ff9640 1 -- 192.168.123.103:0/72255672 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f81a0077560 msgr2=0x7f81a0079a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:33.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.378+0000 7f81b0ff9640 1 --2- 192.168.123.103:0/72255672 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f81a0077560 0x7f81a0079a20 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f81c4114420 tx=0x7f81b80047c0 comp rx=0 tx=0).stop 2026-03-09T16:16:33.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.378+0000 7f81b0ff9640 1 -- 192.168.123.103:0/72255672 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f81c4072120 msgr2=0x7f81c4112ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:33.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.378+0000 7f81b0ff9640 1 --2- 192.168.123.103:0/72255672 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f81c4072120 0x7f81c4112ec0 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f81b400e990 tx=0x7f81b400ee60 comp rx=0 tx=0).stop 2026-03-09T16:16:33.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.379+0000 7f81b0ff9640 1 -- 192.168.123.103:0/72255672 shutdown_connections 2026-03-09T16:16:33.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.379+0000 7f81b0ff9640 1 --2- 192.168.123.103:0/72255672 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f81a0077560 0x7f81a0079a20 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:33.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.379+0000 7f81b0ff9640 1 --2- 192.168.123.103:0/72255672 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4072a40 0x7f81c4113420 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:33.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.379+0000 7f81b0ff9640 1 --2- 192.168.123.103:0/72255672 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f81c4072120 0x7f81c4112ec0 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:33.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.379+0000 7f81b0ff9640 1 -- 192.168.123.103:0/72255672 >> 192.168.123.103:0/72255672 conn(0x7f81c406c7d0 
msgr2=0x7f81c410dd80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:33.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.380+0000 7f81b0ff9640 1 -- 192.168.123.103:0/72255672 shutdown_connections 2026-03-09T16:16:33.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:33.380+0000 7f81b0ff9640 1 -- 192.168.123.103:0/72255672 wait complete. 2026-03-09T16:16:33.391 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:16:33.496 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade check quay.ceph.io/ceph-ci/ceph:$sha1 | jq -e '"'"'.up_to_date | length == 2'"'"'' 2026-03-09T16:16:33.696 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:16:34.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.033+0000 7f3d6412e640 1 -- 192.168.123.103:0/1297584269 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d5c072af0 msgr2=0x7f3d5c10ba70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:34.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.033+0000 7f3d6412e640 1 --2- 192.168.123.103:0/1297584269 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d5c072af0 0x7f3d5c10ba70 secure :-1 s=READY pgs=314 cs=0 l=1 rev1=1 crypto rx=0x7f3d54009040 tx=0x7f3d5402fc10 comp rx=0 tx=0).stop 2026-03-09T16:16:34.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.034+0000 7f3d6412e640 1 -- 192.168.123.103:0/1297584269 shutdown_connections 2026-03-09T16:16:34.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.034+0000 7f3d6412e640 1 --2- 192.168.123.103:0/1297584269 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d5c072af0 0x7f3d5c10ba70 unknown :-1 s=CLOSED pgs=314 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:34.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.034+0000 7f3d6412e640 1 --2- 192.168.123.103:0/1297584269 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d5c072140 0x7f3d5c072520 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:34.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.034+0000 7f3d6412e640 1 -- 192.168.123.103:0/1297584269 >> 192.168.123.103:0/1297584269 conn(0x7f3d5c06c7e0 msgr2=0x7f3d5c06cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:34.035 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:33 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:16:34.035 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:33 vm03.local ceph-mon[51019]: pgmap v56: 65 pgs: 65 active+clean; 309 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 1.5 MiB/s wr, 102 op/s 2026-03-09T16:16:34.035 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:33 vm03.local ceph-mon[51019]: from='client.? 
192.168.123.103:0/72255672' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:34.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.034+0000 7f3d6412e640 1 -- 192.168.123.103:0/1297584269 shutdown_connections 2026-03-09T16:16:34.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.034+0000 7f3d6412e640 1 -- 192.168.123.103:0/1297584269 wait complete. 2026-03-09T16:16:34.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.034+0000 7f3d6412e640 1 Processor -- start 2026-03-09T16:16:34.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.034+0000 7f3d6412e640 1 -- start start 2026-03-09T16:16:34.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.034+0000 7f3d6412e640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d5c072140 0x7f3d5c07d490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:34.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.034+0000 7f3d6412e640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d5c0843b0 0x7f3d5c07d9d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:34.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.034+0000 7f3d6412e640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d5c07e0b0 con 0x7f3d5c0843b0 2026-03-09T16:16:34.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.034+0000 7f3d6412e640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d5c07e220 con 0x7f3d5c072140 2026-03-09T16:16:34.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.035+0000 7f3d61ea3640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d5c072140 0x7f3d5c07d490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:34.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.035+0000 7f3d61ea3640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d5c072140 0x7f3d5c07d490 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:46598/0 (socket says 192.168.123.103:46598) 2026-03-09T16:16:34.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.035+0000 7f3d61ea3640 1 -- 192.168.123.103:0/1782588559 learned_addr learned my addr 192.168.123.103:0/1782588559 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:34.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.035+0000 7f3d61ea3640 1 -- 192.168.123.103:0/1782588559 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d5c0843b0 msgr2=0x7f3d5c07d9d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:34.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.035+0000 7f3d61ea3640 1 --2- 192.168.123.103:0/1782588559 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d5c0843b0 0x7f3d5c07d9d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:34.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.035+0000 7f3d61ea3640 1 -- 192.168.123.103:0/1782588559 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3d54008cf0 con 0x7f3d5c072140 2026-03-09T16:16:34.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.035+0000 7f3d61ea3640 1 --2- 192.168.123.103:0/1782588559 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d5c072140 0x7f3d5c07d490 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f3d58009870 tx=0x7f3d58009d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:34.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.035+0000 7f3d52ffd640 1 -- 192.168.123.103:0/1782588559 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d58010040 con 0x7f3d5c072140 2026-03-09T16:16:34.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.035+0000 7f3d6412e640 1 -- 192.168.123.103:0/1782588559 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3d5c081fb0 con 0x7f3d5c072140 2026-03-09T16:16:34.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.035+0000 7f3d6412e640 1 -- 192.168.123.103:0/1782588559 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3d5c082500 con 0x7f3d5c072140 2026-03-09T16:16:34.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.036+0000 7f3d52ffd640 1 -- 192.168.123.103:0/1782588559 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3d5800ecf0 con 0x7f3d5c072140 2026-03-09T16:16:34.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.036+0000 7f3d6412e640 1 -- 192.168.123.103:0/1782588559 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3d5c108570 con 0x7f3d5c072140 2026-03-09T16:16:34.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.036+0000 7f3d52ffd640 1 -- 192.168.123.103:0/1782588559 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d58002cf0 con 0x7f3d5c072140 2026-03-09T16:16:34.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.039+0000 7f3d52ffd640 1 -- 192.168.123.103:0/1782588559 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7f3d5801e3b0 con 0x7f3d5c072140 2026-03-09T16:16:34.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.040+0000 7f3d52ffd640 1 --2- 192.168.123.103:0/1782588559 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f3d40077560 0x7f3d40079a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:34.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.040+0000 7f3d52ffd640 1 -- 192.168.123.103:0/1782588559 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f3d5809a3f0 con 0x7f3d5c072140 2026-03-09T16:16:34.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.040+0000 7f3d616a2640 1 --2- 192.168.123.103:0/1782588559 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f3d40077560 0x7f3d40079a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:34.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.040+0000 7f3d52ffd640 1 -- 192.168.123.103:0/1782588559 <== mon.1 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f3d5809a870 con 0x7f3d5c072140 2026-03-09T16:16:34.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.041+0000 7f3d616a2640 1 --2- 192.168.123.103:0/1782588559 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f3d40077560 0x7f3d40079a20 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f3d54004bf0 tx=0x7f3d54004b40 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:34.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:34.213+0000 7f3d6412e640 1 -- 192.168.123.103:0/1782588559 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade check", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}) v1 -- 0x7f3d5c078ea0 con 0x7f3d40077560 2026-03-09T16:16:34.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:33 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:16:34.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:33 vm05.local ceph-mon[58702]: pgmap v56: 65 pgs: 65 active+clean; 309 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 1.5 MiB/s wr, 102 op/s 2026-03-09T16:16:34.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:33 vm05.local ceph-mon[58702]: from='client.? 192.168.123.103:0/72255672' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:35.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:35 vm05.local ceph-mon[58702]: from='client.24503 -' entity='client.admin' cmd=[{"prefix": "orch upgrade check", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:35.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:35 vm05.local ceph-mon[58702]: pgmap v57: 65 pgs: 65 active+clean; 308 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 52 KiB/s rd, 2.2 MiB/s wr, 193 op/s 2026-03-09T16:16:35.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:35 vm03.local ceph-mon[51019]: from='client.24503 -' entity='client.admin' cmd=[{"prefix": "orch upgrade check", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:35.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:35 vm03.local ceph-mon[51019]: pgmap v57: 65 pgs: 65 active+clean; 308 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 52 KiB/s rd, 2.2 MiB/s wr, 193 op/s 2026-03-09T16:16:36.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.112+0000 7f3d52ffd640 1 -- 192.168.123.103:0/1782588559 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+5673 (secure 0 0 0) 0x7f3d5c078ea0 con 0x7f3d40077560 2026-03-09T16:16:36.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.114+0000 7f3d50ff9640 1 -- 192.168.123.103:0/1782588559 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f3d40077560 msgr2=0x7f3d40079a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:36.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.114+0000 7f3d50ff9640 1 --2- 192.168.123.103:0/1782588559 >> 
[v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f3d40077560 0x7f3d40079a20 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f3d54004bf0 tx=0x7f3d54004b40 comp rx=0 tx=0).stop 2026-03-09T16:16:36.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.114+0000 7f3d50ff9640 1 -- 192.168.123.103:0/1782588559 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d5c072140 msgr2=0x7f3d5c07d490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:36.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.115+0000 7f3d50ff9640 1 --2- 192.168.123.103:0/1782588559 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d5c072140 0x7f3d5c07d490 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f3d58009870 tx=0x7f3d58009d40 comp rx=0 tx=0).stop 2026-03-09T16:16:36.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.115+0000 7f3d50ff9640 1 -- 192.168.123.103:0/1782588559 shutdown_connections 2026-03-09T16:16:36.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.115+0000 7f3d50ff9640 1 --2- 192.168.123.103:0/1782588559 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f3d40077560 0x7f3d40079a20 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:36.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.115+0000 7f3d50ff9640 1 --2- 192.168.123.103:0/1782588559 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d5c0843b0 0x7f3d5c07d9d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:36.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.115+0000 7f3d50ff9640 1 --2- 192.168.123.103:0/1782588559 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d5c072140 0x7f3d5c07d490 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:36.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.115+0000 7f3d50ff9640 1 -- 192.168.123.103:0/1782588559 >> 192.168.123.103:0/1782588559 conn(0x7f3d5c06c7e0 msgr2=0x7f3d5c071490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:36.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.115+0000 7f3d50ff9640 1 -- 192.168.123.103:0/1782588559 shutdown_connections 2026-03-09T16:16:36.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.115+0000 7f3d50ff9640 1 -- 192.168.123.103:0/1782588559 wait complete. 
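The shell dispatched above runs ceph orch upgrade check against the target image and pipes its JSON report through jq -e '.up_to_date | length == 2', so the command only exits zero when exactly two daemons already report the target image; per the ceph orch ps listing further below those are the two mgr daemons, which is what the staggered mgr-first upgrade step should leave up to date at this point. A minimal sketch of the same gate run by hand; the abbreviated JSON in the comment is an assumed illustration of the report shape, not output captured from this job.

    # Sketch only: the up-to-date gate, run by hand inside a cephadm shell.
    # The JSON in the comment below is an assumed illustration of the report
    # shape (up_to_date / needs_update keys), not captured from this run.
    TARGET=quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
    ceph orch upgrade check "$TARGET" | jq -e '.up_to_date | length == 2'
    #   {"up_to_date": ["mgr.vm03.gbgzmu", "mgr.vm05.dygxfv"], "needs_update": {...}, ...}
    # jq -e sets its exit status from the expression, so the teuthology task
    # fails fast if more or fewer than two daemons are on the target image.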
2026-03-09T16:16:36.125 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:16:36.173 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-09T16:16:36.370 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:16:36.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.703+0000 7f146b577640 1 -- 192.168.123.103:0/1223948961 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f146c108800 msgr2=0x7f146c108be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:36.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.703+0000 7f146b577640 1 --2- 192.168.123.103:0/1223948961 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f146c108800 0x7f146c108be0 secure :-1 s=READY pgs=315 cs=0 l=1 rev1=1 crypto rx=0x7f1460009a00 tx=0x7f146002f290 comp rx=0 tx=0).stop 2026-03-09T16:16:36.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.703+0000 7f146b577640 1 -- 192.168.123.103:0/1223948961 shutdown_connections 2026-03-09T16:16:36.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.703+0000 7f146b577640 1 --2- 192.168.123.103:0/1223948961 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f146c102800 0x7f146c102c60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:36.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.703+0000 7f146b577640 1 --2- 192.168.123.103:0/1223948961 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f146c108800 0x7f146c108be0 unknown :-1 s=CLOSED pgs=315 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:36.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.703+0000 7f146b577640 1 -- 192.168.123.103:0/1223948961 >> 192.168.123.103:0/1223948961 conn(0x7f146c0fe540 msgr2=0x7f146c100960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:36.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.705+0000 7f146b577640 1 -- 192.168.123.103:0/1223948961 shutdown_connections 2026-03-09T16:16:36.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.705+0000 7f146b577640 1 -- 192.168.123.103:0/1223948961 wait complete. 
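The 'true' printed above is the result of that jq gate; the next cephadm shell then runs ceph orch ps, whose daemon listing appears a few records further down. At this stage of the run the two mgr daemons report 19.2.3-678-ge911bdeb while the mon, mds, osd and monitoring daemons still report 18.2.7-1055-gab47f43c, the expected mixed state after the staggered mgr-only upgrade step. A small sketch of summarising that state programmatically follows; it assumes the orchestrator's JSON form exposes daemon_type and version fields per daemon.

    # Sketch: summarise running versions by daemon type after the mgr-only step.
    # Assumes 'ceph orch ps --format json' exposes daemon_type and version per daemon.
    ceph orch ps --format json \
      | jq -r '.[] | "\(.daemon_type) \(.version)"' \
      | sort | uniq -c
    # Expected mid-upgrade: only the mgr entries show the new build,
    # everything else is still on the reef build.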
2026-03-09T16:16:36.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.705+0000 7f146b577640 1 Processor -- start 2026-03-09T16:16:36.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.707+0000 7f146b577640 1 -- start start 2026-03-09T16:16:36.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.707+0000 7f146b577640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f146c102800 0x7f146c1a0620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:36.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.707+0000 7f146b577640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f146c108800 0x7f146c1a0b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:36.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.707+0000 7f146b577640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f146c19a680 con 0x7f146c108800 2026-03-09T16:16:36.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.707+0000 7f146b577640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f146c19a7c0 con 0x7f146c102800 2026-03-09T16:16:36.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.707+0000 7f1469d74640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f146c108800 0x7f146c1a0b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:36.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.707+0000 7f1469d74640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f146c108800 0x7f146c1a0b60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:40026/0 (socket says 192.168.123.103:40026) 2026-03-09T16:16:36.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.707+0000 7f1469d74640 1 -- 192.168.123.103:0/1631798083 learned_addr learned my addr 192.168.123.103:0/1631798083 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:36.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.707+0000 7f146a575640 1 --2- 192.168.123.103:0/1631798083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f146c102800 0x7f146c1a0620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:36.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.708+0000 7f146a575640 1 -- 192.168.123.103:0/1631798083 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f146c108800 msgr2=0x7f146c1a0b60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:36.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.708+0000 7f146a575640 1 --2- 192.168.123.103:0/1631798083 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f146c108800 0x7f146c1a0b60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:36.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.708+0000 7f146a575640 1 -- 192.168.123.103:0/1631798083 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f1460009660 con 0x7f146c102800 2026-03-09T16:16:36.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.708+0000 7f146a575640 1 --2- 192.168.123.103:0/1631798083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f146c102800 0x7f146c1a0620 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f146002f7a0 tx=0x7f1460004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:36.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.708+0000 7f14537fe640 1 -- 192.168.123.103:0/1631798083 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1460004400 con 0x7f146c102800 2026-03-09T16:16:36.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.708+0000 7f14537fe640 1 -- 192.168.123.103:0/1631798083 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f146002fd00 con 0x7f146c102800 2026-03-09T16:16:36.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.709+0000 7f14537fe640 1 -- 192.168.123.103:0/1631798083 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1460041a40 con 0x7f146c102800 2026-03-09T16:16:36.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.710+0000 7f146b577640 1 -- 192.168.123.103:0/1631798083 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f146c19aa40 con 0x7f146c102800 2026-03-09T16:16:36.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.710+0000 7f146b577640 1 -- 192.168.123.103:0/1631798083 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f146c19ae60 con 0x7f146c102800 2026-03-09T16:16:36.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.710+0000 7f146b577640 1 -- 192.168.123.103:0/1631798083 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1430005350 con 0x7f146c102800 2026-03-09T16:16:36.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.712+0000 7f14537fe640 1 -- 192.168.123.103:0/1631798083 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7f146003f070 con 0x7f146c102800 2026-03-09T16:16:36.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.712+0000 7f14537fe640 1 --2- 192.168.123.103:0/1631798083 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f1444077350 0x7f1444079810 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:36.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.712+0000 7f1469d74640 1 --2- 192.168.123.103:0/1631798083 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f1444077350 0x7f1444079810 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:36.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.713+0000 7f14537fe640 1 -- 192.168.123.103:0/1631798083 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f14600be450 con 0x7f146c102800 2026-03-09T16:16:36.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.714+0000 7f1469d74640 1 --2- 192.168.123.103:0/1631798083 >> 
[v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f1444077350 0x7f1444079810 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f146c19bf40 tx=0x7f1454009290 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T16:16:36.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.714+0000 7f14537fe640 1 -- 192.168.123.103:0/1631798083 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f14600870b0 con 0x7f146c102800
2026-03-09T16:16:36.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.841+0000 7f146b577640 1 -- 192.168.123.103:0/1631798083 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f1430002bf0 con 0x7f1444077350
2026-03-09T16:16:36.847 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (52s) 33s ago 6m 16.5M - 0.25.0 c8568f914cd2 61c29cd7a09d
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (6m) 33s ago 6m 9013k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (5m) 72s ago 5m 8812k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (6m) 33s ago 6m 7625k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 05e9be717970
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (5m) 72s ago 5m 7608k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 32f80ccecaa9
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (36s) 33s ago 6m 37.3M - 10.4.0 c8b91775d855 6f4f55eef4bb
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (4m) 33s ago 4m 17.5M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8e7e3eb06891
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (4m) 33s ago 4m 272M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 f23b1415c23e
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (4m) 72s ago 4m 15.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 fbf69f4859f1
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (4m) 72s ago 4m 17.2M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 e7155e6e0a47
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:8443,9283,8765 running (2m) 33s ago 7m 614M - 19.2.3-678-ge911bdeb 654f31e6858e f10e9f43c355
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (96s) 72s ago 5m 488M - 19.2.3-678-ge911bdeb 654f31e6858e 5276dc4902e9
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (7m) 33s ago 7m 60.6M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 b86752d320b6
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (5m) 72s ago 5m 48.2M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 90242efb0978
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (83s) 33s ago 6m 9110k - 1.7.0 72c9c2088986 73da4350a8ed
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (76s) 72s ago 5m 5356k - 1.7.0 72c9c2088986 0be807a191b0
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (5m) 33s ago 5m 297M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2ea78f0d62f8
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (5m) 33s ago 5m 300M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6169f9824413
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (5m) 33s ago 5m 251M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 31188175e77b
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (5m) 72s ago 5m 354M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d95aab347c9f
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (4m) 72s ago 4m 304M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 5076005b452d
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (4m) 72s ago 4m 266M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 56fb3849b087
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (60s) 33s ago 6m 40.2M - 2.51.0 1d3b7f56885b ce88dd379864
2026-03-09T16:16:36.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.847+0000 7f14537fe640 1 -- 192.168.123.103:0/1631798083 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f1430002bf0 con 0x7f1444077350
2026-03-09T16:16:36.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.850+0000 7f146b577640 1 -- 192.168.123.103:0/1631798083 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f1444077350 msgr2=0x7f1444079810 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:16:36.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.850+0000 7f146b577640 1 --2- 192.168.123.103:0/1631798083 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f1444077350 0x7f1444079810 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f146c19bf40 tx=0x7f1454009290 comp rx=0 tx=0).stop
2026-03-09T16:16:36.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.850+0000 7f146b577640 1 -- 192.168.123.103:0/1631798083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f146c102800 msgr2=0x7f146c1a0620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:16:36.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.850+0000 7f146b577640 1 --2- 192.168.123.103:0/1631798083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f146c102800 0x7f146c1a0620 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f146002f7a0 tx=0x7f1460004290 comp rx=0 tx=0).stop
2026-03-09T16:16:36.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.850+0000 7f146b577640 1 -- 192.168.123.103:0/1631798083 shutdown_connections
2026-03-09T16:16:36.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.850+0000 7f146b577640 1 --2- 192.168.123.103:0/1631798083 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f1444077350 0x7f1444079810 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T16:16:36.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.850+0000 7f146b577640 1 --2- 
192.168.123.103:0/1631798083 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f146c108800 0x7f146c1a0b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:36.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.850+0000 7f146b577640 1 --2- 192.168.123.103:0/1631798083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f146c102800 0x7f146c1a0620 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:36.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.850+0000 7f146b577640 1 -- 192.168.123.103:0/1631798083 >> 192.168.123.103:0/1631798083 conn(0x7f146c0fe540 msgr2=0x7f146c0fe920 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:36.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.851+0000 7f146b577640 1 -- 192.168.123.103:0/1631798083 shutdown_connections 2026-03-09T16:16:36.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:36.851+0000 7f146b577640 1 -- 192.168.123.103:0/1631798083 wait complete. 2026-03-09T16:16:36.903 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-09T16:16:36.903 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T16:16:36.903 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mgr mgr/orchestrator/fail_fs true' 2026-03-09T16:16:37.187 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:16:37.241 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:37 vm03.local ceph-mon[51019]: from='client.24505 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:37.241 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:37 vm03.local ceph-mon[51019]: pgmap v58: 65 pgs: 65 active+clean; 308 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1.5 MiB/s wr, 148 op/s 2026-03-09T16:16:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:37 vm05.local ceph-mon[58702]: from='client.24505 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:37 vm05.local ceph-mon[58702]: pgmap v58: 65 pgs: 65 active+clean; 308 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1.5 MiB/s wr, 148 op/s 2026-03-09T16:16:37.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.594+0000 7f36fc171640 1 -- 192.168.123.103:0/1174532910 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36f40ff170 msgr2=0x7f36f40ff5b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:37.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.594+0000 7f36fc171640 1 --2- 192.168.123.103:0/1174532910 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36f40ff170 0x7f36f40ff5b0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f36e80098e0 tx=0x7f36e802f1d0 comp rx=0 tx=0).stop 2026-03-09T16:16:37.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.594+0000 7f36fc171640 1 -- 192.168.123.103:0/1174532910 shutdown_connections 2026-03-09T16:16:37.596 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.594+0000 7f36fc171640 1 --2- 192.168.123.103:0/1174532910 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36f40ff170 0x7f36f40ff5b0 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:37.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.594+0000 7f36fc171640 1 --2- 192.168.123.103:0/1174532910 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f36f4101820 0x7f36f40fec30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:37.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.594+0000 7f36fc171640 1 -- 192.168.123.103:0/1174532910 >> 192.168.123.103:0/1174532910 conn(0x7f36f40faa80 msgr2=0x7f36f40fcea0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:37.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.595+0000 7f36fc171640 1 -- 192.168.123.103:0/1174532910 shutdown_connections 2026-03-09T16:16:37.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.595+0000 7f36fc171640 1 -- 192.168.123.103:0/1174532910 wait complete. 2026-03-09T16:16:37.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.595+0000 7f36fc171640 1 Processor -- start 2026-03-09T16:16:37.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.595+0000 7f36fc171640 1 -- start start 2026-03-09T16:16:37.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.595+0000 7f36fc171640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f36f40ff170 0x7f36f419a9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:37.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.596+0000 7f36fc171640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36f4101820 0x7f36f419af30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:37.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.596+0000 7f36fc171640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f36f419b610 con 0x7f36f40ff170 2026-03-09T16:16:37.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.596+0000 7f36fc171640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f36f419f360 con 0x7f36f4101820 2026-03-09T16:16:37.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.596+0000 7f36f96e5640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36f4101820 0x7f36f419af30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:37.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.596+0000 7f36f96e5640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36f4101820 0x7f36f419af30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:46636/0 (socket says 192.168.123.103:46636) 2026-03-09T16:16:37.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.596+0000 7f36f96e5640 1 -- 192.168.123.103:0/3387787362 learned_addr learned my addr 192.168.123.103:0/3387787362 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:37.598 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.596+0000 7f36f9ee6640 1 --2- 192.168.123.103:0/3387787362 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f36f40ff170 0x7f36f419a9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:37.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.597+0000 7f36f9ee6640 1 -- 192.168.123.103:0/3387787362 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36f4101820 msgr2=0x7f36f419af30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:37.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.597+0000 7f36f9ee6640 1 --2- 192.168.123.103:0/3387787362 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36f4101820 0x7f36f419af30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:37.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.597+0000 7f36f9ee6640 1 -- 192.168.123.103:0/3387787362 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f36e8009590 con 0x7f36f40ff170 2026-03-09T16:16:37.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.597+0000 7f36f9ee6640 1 --2- 192.168.123.103:0/3387787362 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f36f40ff170 0x7f36f419a9f0 secure :-1 s=READY pgs=316 cs=0 l=1 rev1=1 crypto rx=0x7f36e400d560 tx=0x7f36e400da30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:37.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.597+0000 7f36e2ffd640 1 -- 192.168.123.103:0/3387787362 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f36e4004a50 con 0x7f36f40ff170 2026-03-09T16:16:37.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.597+0000 7f36fc171640 1 -- 192.168.123.103:0/3387787362 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f36f419f640 con 0x7f36f40ff170 2026-03-09T16:16:37.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.598+0000 7f36fc171640 1 -- 192.168.123.103:0/3387787362 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f36f419fb90 con 0x7f36f40ff170 2026-03-09T16:16:37.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.598+0000 7f36e2ffd640 1 -- 192.168.123.103:0/3387787362 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f36e40054e0 con 0x7f36f40ff170 2026-03-09T16:16:37.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.598+0000 7f36e2ffd640 1 -- 192.168.123.103:0/3387787362 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f36e400fc20 con 0x7f36f40ff170 2026-03-09T16:16:37.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.599+0000 7f36fc171640 1 -- 192.168.123.103:0/3387787362 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f36bc005350 con 0x7f36f40ff170 2026-03-09T16:16:37.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.600+0000 7f36e2ffd640 1 -- 192.168.123.103:0/3387787362 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7f36e4016030 con 
0x7f36f40ff170 2026-03-09T16:16:37.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.600+0000 7f36e2ffd640 1 --2- 192.168.123.103:0/3387787362 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f36c8077510 0x7f36c80799d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:37.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.600+0000 7f36e2ffd640 1 -- 192.168.123.103:0/3387787362 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f36e4099570 con 0x7f36f40ff170 2026-03-09T16:16:37.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.603+0000 7f36f96e5640 1 --2- 192.168.123.103:0/3387787362 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f36c8077510 0x7f36c80799d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:37.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.604+0000 7f36e2ffd640 1 -- 192.168.123.103:0/3387787362 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f36e40621d0 con 0x7f36f40ff170 2026-03-09T16:16:37.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.624+0000 7f36f96e5640 1 --2- 192.168.123.103:0/3387787362 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f36c8077510 0x7f36c80799d0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f36f419bfa0 tx=0x7f36e803a040 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:37.731 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.730+0000 7f36fc171640 1 -- 192.168.123.103:0/3387787362 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mgr/orchestrator/fail_fs}] v 0) v1 -- 0x7f36bc0058d0 con 0x7f36f40ff170 2026-03-09T16:16:37.738 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.737+0000 7f36e2ffd640 1 -- 192.168.123.103:0/3387787362 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/orchestrator/fail_fs}]=0 v39)=0 v39) v1 ==== 125+0+0 (secure 0 0 0) 0x7f36e4061920 con 0x7f36f40ff170 2026-03-09T16:16:37.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.740+0000 7f36e0ff9640 1 -- 192.168.123.103:0/3387787362 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f36c8077510 msgr2=0x7f36c80799d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:37.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.740+0000 7f36e0ff9640 1 --2- 192.168.123.103:0/3387787362 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f36c8077510 0x7f36c80799d0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f36f419bfa0 tx=0x7f36e803a040 comp rx=0 tx=0).stop 2026-03-09T16:16:37.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.740+0000 7f36e0ff9640 1 -- 192.168.123.103:0/3387787362 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f36f40ff170 msgr2=0x7f36f419a9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:37.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.740+0000 7f36e0ff9640 1 --2- 192.168.123.103:0/3387787362 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f36f40ff170 0x7f36f419a9f0 secure :-1 s=READY pgs=316 cs=0 l=1 rev1=1 crypto rx=0x7f36e400d560 tx=0x7f36e400da30 comp rx=0 tx=0).stop 2026-03-09T16:16:37.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.740+0000 7f36e0ff9640 1 -- 192.168.123.103:0/3387787362 shutdown_connections 2026-03-09T16:16:37.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.740+0000 7f36e0ff9640 1 --2- 192.168.123.103:0/3387787362 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f36c8077510 0x7f36c80799d0 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:37.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.740+0000 7f36e0ff9640 1 --2- 192.168.123.103:0/3387787362 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36f4101820 0x7f36f419af30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:37.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.740+0000 7f36e0ff9640 1 --2- 192.168.123.103:0/3387787362 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f36f40ff170 0x7f36f419a9f0 unknown :-1 s=CLOSED pgs=316 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:37.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.740+0000 7f36e0ff9640 1 -- 192.168.123.103:0/3387787362 >> 192.168.123.103:0/3387787362 conn(0x7f36f40faa80 msgr2=0x7f36f40fae60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:37.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.741+0000 7f36e0ff9640 1 -- 192.168.123.103:0/3387787362 shutdown_connections 2026-03-09T16:16:37.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:37.741+0000 7f36e0ff9640 1 -- 192.168.123.103:0/3387787362 wait complete. 2026-03-09T16:16:37.818 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 
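The config set that just completed enables mgr/orchestrator/fail_fs for the remainder of the upgrade. Roughly, with this flag on, cephadm is expected to fail the CephFS filesystem (take its MDS ranks down) before redeploying the MDS daemons and bring it back afterwards, instead of scaling max_mds down to 1 and draining ranks; treat that summary as a paraphrase rather than a definitive statement of the option's behaviour. A minimal sketch of setting and reading back the flag:

    # Sketch: toggle the upgrade behaviour used by the fail_fs variant of this
    # suite and read it back (both are standard 'ceph config' invocations).
    ceph config set mgr mgr/orchestrator/fail_fs true
    ceph config get mgr mgr/orchestrator/fail_fs   # expect: true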
2026-03-09T16:16:37.818 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T16:16:37.818 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force' 2026-03-09T16:16:38.038 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:16:38.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.487+0000 7fef3c663640 1 -- 192.168.123.103:0/1890794322 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fef34072cf0 msgr2=0x7fef3410cd90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:38.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.487+0000 7fef3c663640 1 --2- 192.168.123.103:0/1890794322 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fef34072cf0 0x7fef3410cd90 secure :-1 s=READY pgs=317 cs=0 l=1 rev1=1 crypto rx=0x7fef280098e0 tx=0x7fef2802f1b0 comp rx=0 tx=0).stop 2026-03-09T16:16:38.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.487+0000 7fef3c663640 1 -- 192.168.123.103:0/1890794322 shutdown_connections 2026-03-09T16:16:38.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.487+0000 7fef3c663640 1 --2- 192.168.123.103:0/1890794322 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fef34072cf0 0x7fef3410cd90 unknown :-1 s=CLOSED pgs=317 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:38.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.487+0000 7fef3c663640 1 --2- 192.168.123.103:0/1890794322 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fef34072340 0x7fef34072720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:38.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.487+0000 7fef3c663640 1 -- 192.168.123.103:0/1890794322 >> 192.168.123.103:0/1890794322 conn(0x7fef3406b7f0 msgr2=0x7fef3406bc00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:38.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.487+0000 7fef3c663640 1 -- 192.168.123.103:0/1890794322 shutdown_connections 2026-03-09T16:16:38.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.487+0000 7fef3c663640 1 -- 192.168.123.103:0/1890794322 wait complete. 
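Every cephadm.shell command in this task goes through the same wrapper shown in the DEBUG lines: cephadm with the reef bootstrap image, the cluster's conf, keyring and fsid, and -e sha1=... so $sha1 is available inside the container (the upgrade-check command above relies on it). A hand-run equivalent using values from this run; the trailing 'ceph -s' is only a placeholder for whichever command the task supplies.

    # Sketch: the wrapper the cephadm.shell task uses for each command in this run.
    # 'ceph -s' is a placeholder; the task substitutes its own command after 'bash -c'.
    sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell \
        -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
        --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc \
        -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df \
        -- bash -c 'ceph -s'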
2026-03-09T16:16:38.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.488+0000 7fef3c663640 1 Processor -- start 2026-03-09T16:16:38.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.492+0000 7fef3c663640 1 -- start start 2026-03-09T16:16:38.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.492+0000 7fef3c663640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fef34072340 0x7fef341ad4f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:38.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.492+0000 7fef3c663640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fef34072cf0 0x7fef341ada30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:38.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.492+0000 7fef3c663640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fef341ae050 con 0x7fef34072340 2026-03-09T16:16:38.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.492+0000 7fef3c663640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fef341a7630 con 0x7fef34072cf0 2026-03-09T16:16:38.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.492+0000 7fef39bd7640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fef34072cf0 0x7fef341ada30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:38.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.492+0000 7fef39bd7640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fef34072cf0 0x7fef341ada30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:46654/0 (socket says 192.168.123.103:46654) 2026-03-09T16:16:38.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.492+0000 7fef39bd7640 1 -- 192.168.123.103:0/742422621 learned_addr learned my addr 192.168.123.103:0/742422621 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:38.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.492+0000 7fef3a3d8640 1 --2- 192.168.123.103:0/742422621 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fef34072340 0x7fef341ad4f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:38.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.492+0000 7fef39bd7640 1 -- 192.168.123.103:0/742422621 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fef34072340 msgr2=0x7fef341ad4f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:38.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.492+0000 7fef39bd7640 1 --2- 192.168.123.103:0/742422621 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fef34072340 0x7fef341ad4f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:38.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.492+0000 7fef39bd7640 1 -- 192.168.123.103:0/742422621 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fef28009590 con 
0x7fef34072cf0 2026-03-09T16:16:38.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.493+0000 7fef39bd7640 1 --2- 192.168.123.103:0/742422621 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fef34072cf0 0x7fef341ada30 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fef280044e0 tx=0x7fef28004510 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:38.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.493+0000 7fef237fe640 1 -- 192.168.123.103:0/742422621 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fef2803d070 con 0x7fef34072cf0 2026-03-09T16:16:38.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.493+0000 7fef237fe640 1 -- 192.168.123.103:0/742422621 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fef2802fbe0 con 0x7fef34072cf0 2026-03-09T16:16:38.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.493+0000 7fef237fe640 1 -- 192.168.123.103:0/742422621 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fef28041730 con 0x7fef34072cf0 2026-03-09T16:16:38.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.493+0000 7fef3c663640 1 -- 192.168.123.103:0/742422621 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fef341a7830 con 0x7fef34072cf0 2026-03-09T16:16:38.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.493+0000 7fef3c663640 1 -- 192.168.123.103:0/742422621 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fef341a7ca0 con 0x7fef34072cf0 2026-03-09T16:16:38.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.494+0000 7fef217fa640 1 -- 192.168.123.103:0/742422621 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fef00005350 con 0x7fef34072cf0 2026-03-09T16:16:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.495+0000 7fef237fe640 1 -- 192.168.123.103:0/742422621 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7fef28038550 con 0x7fef34072cf0 2026-03-09T16:16:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.496+0000 7fef237fe640 1 --2- 192.168.123.103:0/742422621 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fef10077350 0x7fef10079810 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.496+0000 7fef237fe640 1 -- 192.168.123.103:0/742422621 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fef280be410 con 0x7fef34072cf0 2026-03-09T16:16:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.496+0000 7fef3a3d8640 1 --2- 192.168.123.103:0/742422621 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fef10077350 0x7fef10079810 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.496+0000 7fef3a3d8640 1 --2- 192.168.123.103:0/742422621 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] 
conn(0x7fef10077350 0x7fef10079810 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7fef24009de0 tx=0x7fef24009340 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:38.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.498+0000 7fef237fe640 1 -- 192.168.123.103:0/742422621 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7fef280870f0 con 0x7fef34072cf0 2026-03-09T16:16:38.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.673+0000 7fef217fa640 1 -- 192.168.123.103:0/742422621 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}] v 0) v1 -- 0x7fef00005600 con 0x7fef34072cf0 2026-03-09T16:16:38.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.675+0000 7fef237fe640 1 -- 192.168.123.103:0/742422621 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}]=0 v39)=0 v39) v1 ==== 155+0+0 (secure 0 0 0) 0x7fef28086840 con 0x7fef34072cf0 2026-03-09T16:16:38.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.678+0000 7fef3c663640 1 -- 192.168.123.103:0/742422621 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fef10077350 msgr2=0x7fef10079810 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:38.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.678+0000 7fef3c663640 1 --2- 192.168.123.103:0/742422621 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fef10077350 0x7fef10079810 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7fef24009de0 tx=0x7fef24009340 comp rx=0 tx=0).stop 2026-03-09T16:16:38.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.678+0000 7fef3c663640 1 -- 192.168.123.103:0/742422621 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fef34072cf0 msgr2=0x7fef341ada30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:38.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.678+0000 7fef3c663640 1 --2- 192.168.123.103:0/742422621 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fef34072cf0 0x7fef341ada30 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fef280044e0 tx=0x7fef28004510 comp rx=0 tx=0).stop 2026-03-09T16:16:38.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.678+0000 7fef3c663640 1 -- 192.168.123.103:0/742422621 shutdown_connections 2026-03-09T16:16:38.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.678+0000 7fef3c663640 1 --2- 192.168.123.103:0/742422621 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fef10077350 0x7fef10079810 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:38.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.678+0000 7fef3c663640 1 --2- 192.168.123.103:0/742422621 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fef34072cf0 0x7fef341ada30 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:38.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.678+0000 7fef3c663640 1 --2- 192.168.123.103:0/742422621 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fef34072340 0x7fef341ad4f0 
unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:38.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.678+0000 7fef3c663640 1 -- 192.168.123.103:0/742422621 >> 192.168.123.103:0/742422621 conn(0x7fef3406b7f0 msgr2=0x7fef3410dfe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:38.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.678+0000 7fef3c663640 1 -- 192.168.123.103:0/742422621 shutdown_connections 2026-03-09T16:16:38.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:38.678+0000 7fef3c663640 1 -- 192.168.123.103:0/742422621 wait complete. 2026-03-09T16:16:38.780 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force' 2026-03-09T16:16:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:38 vm05.local ceph-mon[58702]: from='client.? 192.168.123.103:0/3387787362' entity='client.admin' 2026-03-09T16:16:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:38 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:38 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:16:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:38 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:16:39.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:38 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:39.059 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:16:39.096 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:38 vm03.local ceph-mon[51019]: from='client.? 
192.168.123.103:0/3387787362' entity='client.admin' 2026-03-09T16:16:39.096 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:38 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:39.096 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:38 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:16:39.096 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:38 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:16:39.096 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:38 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:39.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.407+0000 7f879eff2640 1 -- 192.168.123.103:0/3636090184 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8798104780 msgr2=0x7f8798104b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:39.410 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.407+0000 7f879eff2640 1 --2- 192.168.123.103:0/3636090184 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8798104780 0x7f8798104b60 secure :-1 s=READY pgs=318 cs=0 l=1 rev1=1 crypto rx=0x7f87880099b0 tx=0x7f878802f220 comp rx=0 tx=0).stop 2026-03-09T16:16:39.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.410+0000 7f879eff2640 1 -- 192.168.123.103:0/3636090184 shutdown_connections 2026-03-09T16:16:39.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.410+0000 7f879eff2640 1 --2- 192.168.123.103:0/3636090184 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87980fe780 0x7f87980febe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:39.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.410+0000 7f879eff2640 1 --2- 192.168.123.103:0/3636090184 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8798104780 0x7f8798104b60 unknown :-1 s=CLOSED pgs=318 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:39.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.410+0000 7f879eff2640 1 -- 192.168.123.103:0/3636090184 >> 192.168.123.103:0/3636090184 conn(0x7f87980fa4a0 msgr2=0x7f87980fc8c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:39.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.411+0000 7f879eff2640 1 -- 192.168.123.103:0/3636090184 shutdown_connections 2026-03-09T16:16:39.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.411+0000 7f879eff2640 1 -- 192.168.123.103:0/3636090184 wait complete. 
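The `ceph config set ... --force` changes in this stretch are each issued through a short-lived `cephadm shell` invocation; the messenger connect / `mark_down` / `shutdown_connections` chatter surrounding them is just that transient CLI client talking to the monitors and exiting. Unwrapped, the invocation from the DEBUG line above has the following shape (values copied from the log; only the trailing `ceph config set` arguments differ between the calls in this section):

    # run a one-off ceph CLI inside a cephadm-managed container (reef image),
    # using the cluster's admin keyring and fsid from the test node
    sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell \
        -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
        --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc \
        -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- \
        bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force'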
2026-03-09T16:16:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.411+0000 7f879eff2640 1 Processor -- start 2026-03-09T16:16:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.411+0000 7f879eff2640 1 -- start start 2026-03-09T16:16:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.411+0000 7f879eff2640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87980fe780 0x7f879819a6d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.411+0000 7f879eff2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8798104780 0x7f879819ac10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.411+0000 7f879eff2640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f879819b2a0 con 0x7f8798104780 2026-03-09T16:16:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.411+0000 7f879eff2640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f879819efc0 con 0x7f87980fe780 2026-03-09T16:16:39.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.412+0000 7f879dff0640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87980fe780 0x7f879819a6d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:39.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.412+0000 7f879dff0640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87980fe780 0x7f879819a6d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:50096/0 (socket says 192.168.123.103:50096) 2026-03-09T16:16:39.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.412+0000 7f879dff0640 1 -- 192.168.123.103:0/3650419649 learned_addr learned my addr 192.168.123.103:0/3650419649 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:39.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.412+0000 7f879d7ef640 1 --2- 192.168.123.103:0/3650419649 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8798104780 0x7f879819ac10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:39.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.413+0000 7f879dff0640 1 -- 192.168.123.103:0/3650419649 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8798104780 msgr2=0x7f879819ac10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:39.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.413+0000 7f879dff0640 1 --2- 192.168.123.103:0/3650419649 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8798104780 0x7f879819ac10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:39.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.413+0000 7f879dff0640 1 -- 192.168.123.103:0/3650419649 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f8788009660 con 0x7f87980fe780 2026-03-09T16:16:39.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.414+0000 7f879dff0640 1 --2- 192.168.123.103:0/3650419649 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87980fe780 0x7f879819a6d0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f8788009980 tx=0x7f8788031cd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:39.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.414+0000 7f8786ffd640 1 -- 192.168.123.103:0/3650419649 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f878803d070 con 0x7f87980fe780 2026-03-09T16:16:39.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.414+0000 7f879eff2640 1 -- 192.168.123.103:0/3650419649 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f879819f240 con 0x7f87980fe780 2026-03-09T16:16:39.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.414+0000 7f879eff2640 1 -- 192.168.123.103:0/3650419649 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f879819f730 con 0x7f87980fe780 2026-03-09T16:16:39.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.414+0000 7f8786ffd640 1 -- 192.168.123.103:0/3650419649 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f87880043d0 con 0x7f87980fe780 2026-03-09T16:16:39.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.414+0000 7f8786ffd640 1 -- 192.168.123.103:0/3650419649 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8788031280 con 0x7f87980fe780 2026-03-09T16:16:39.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.415+0000 7f879eff2640 1 -- 192.168.123.103:0/3650419649 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f87980ffec0 con 0x7f87980fe780 2026-03-09T16:16:39.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.419+0000 7f8786ffd640 1 -- 192.168.123.103:0/3650419649 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7f8788049050 con 0x7f87980fe780 2026-03-09T16:16:39.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.419+0000 7f8786ffd640 1 --2- 192.168.123.103:0/3650419649 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f8774077560 0x7f8774079a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:39.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.420+0000 7f8786ffd640 1 -- 192.168.123.103:0/3650419649 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f87880bdec0 con 0x7f87980fe780 2026-03-09T16:16:39.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.420+0000 7f879d7ef640 1 --2- 192.168.123.103:0/3650419649 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f8774077560 0x7f8774079a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:39.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.420+0000 7f8786ffd640 1 -- 192.168.123.103:0/3650419649 <== mon.1 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f87880c24d0 con 0x7f87980fe780 2026-03-09T16:16:39.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.420+0000 7f879d7ef640 1 --2- 192.168.123.103:0/3650419649 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f8774077560 0x7f8774079a20 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f878c005fd0 tx=0x7f878c00c040 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:39.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.550+0000 7f879eff2640 1 -- 192.168.123.103:0/3650419649 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}] v 0) v1 -- 0x7f87980febe0 con 0x7f87980fe780 2026-03-09T16:16:39.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.553+0000 7f8786ffd640 1 -- 192.168.123.103:0/3650419649 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}]=0 v39)=0 v39) v1 ==== 163+0+0 (secure 0 0 0) 0x7f8788086a70 con 0x7f87980fe780 2026-03-09T16:16:39.557 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.556+0000 7f8784ff9640 1 -- 192.168.123.103:0/3650419649 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f8774077560 msgr2=0x7f8774079a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:39.557 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.556+0000 7f8784ff9640 1 --2- 192.168.123.103:0/3650419649 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f8774077560 0x7f8774079a20 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f878c005fd0 tx=0x7f878c00c040 comp rx=0 tx=0).stop 2026-03-09T16:16:39.557 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.556+0000 7f8784ff9640 1 -- 192.168.123.103:0/3650419649 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87980fe780 msgr2=0x7f879819a6d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:39.557 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.556+0000 7f8784ff9640 1 --2- 192.168.123.103:0/3650419649 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87980fe780 0x7f879819a6d0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f8788009980 tx=0x7f8788031cd0 comp rx=0 tx=0).stop 2026-03-09T16:16:39.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.557+0000 7f8784ff9640 1 -- 192.168.123.103:0/3650419649 shutdown_connections 2026-03-09T16:16:39.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.557+0000 7f8784ff9640 1 --2- 192.168.123.103:0/3650419649 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f8774077560 0x7f8774079a20 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:39.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.557+0000 7f8784ff9640 1 --2- 192.168.123.103:0/3650419649 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8798104780 0x7f879819ac10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:39.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.557+0000 7f8784ff9640 1 --2- 192.168.123.103:0/3650419649 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f87980fe780 0x7f879819a6d0 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:39.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.557+0000 7f8784ff9640 1 -- 192.168.123.103:0/3650419649 >> 192.168.123.103:0/3650419649 conn(0x7f87980fa4a0 msgr2=0x7f8798108640 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:39.560 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.559+0000 7f8784ff9640 1 -- 192.168.123.103:0/3650419649 shutdown_connections 2026-03-09T16:16:39.560 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:39.560+0000 7f8784ff9640 1 -- 192.168.123.103:0/3650419649 wait complete. 2026-03-09T16:16:39.617 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set global log_to_journald false --force' 2026-03-09T16:16:39.869 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:16:39.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:39 vm03.local ceph-mon[51019]: pgmap v59: 65 pgs: 65 active+clean; 310 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 43 KiB/s rd, 2.2 MiB/s wr, 398 op/s 2026-03-09T16:16:40.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:39 vm05.local ceph-mon[58702]: pgmap v59: 65 pgs: 65 active+clean; 310 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 43 KiB/s rd, 2.2 MiB/s wr, 398 op/s 2026-03-09T16:16:40.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.270+0000 7fa4a6ca1640 1 -- 192.168.123.103:0/4141834404 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4a0072140 msgr2=0x7fa4a0072520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:40.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.270+0000 7fa4a6ca1640 1 --2- 192.168.123.103:0/4141834404 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4a0072140 0x7fa4a0072520 secure :-1 s=READY pgs=319 cs=0 l=1 rev1=1 crypto rx=0x7fa490009880 tx=0x7fa4900304e0 comp rx=0 tx=0).stop 2026-03-09T16:16:40.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.271+0000 7fa4a6ca1640 1 -- 192.168.123.103:0/4141834404 shutdown_connections 2026-03-09T16:16:40.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.271+0000 7fa4a6ca1640 1 --2- 192.168.123.103:0/4141834404 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4a0072af0 0x7fa4a010ba70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:40.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.271+0000 7fa4a6ca1640 1 --2- 192.168.123.103:0/4141834404 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4a0072140 0x7fa4a0072520 unknown :-1 s=CLOSED pgs=319 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:40.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.271+0000 7fa4a6ca1640 1 -- 192.168.123.103:0/4141834404 >> 192.168.123.103:0/4141834404 conn(0x7fa4a006c7e0 msgr2=0x7fa4a006cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:40.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.271+0000 7fa4a6ca1640 1 -- 
192.168.123.103:0/4141834404 shutdown_connections 2026-03-09T16:16:40.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.271+0000 7fa4a6ca1640 1 -- 192.168.123.103:0/4141834404 wait complete. 2026-03-09T16:16:40.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.272+0000 7fa4a6ca1640 1 Processor -- start 2026-03-09T16:16:40.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.272+0000 7fa4a6ca1640 1 -- start start 2026-03-09T16:16:40.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.272+0000 7fa4a6ca1640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4a0072af0 0x7fa4a007d360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:40.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.272+0000 7fa4a6ca1640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4a00842d0 0x7fa4a007d8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:40.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.272+0000 7fa4a6ca1640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa4a007df80 con 0x7fa4a0072af0 2026-03-09T16:16:40.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.272+0000 7fa4a6ca1640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa4a007e0f0 con 0x7fa4a00842d0 2026-03-09T16:16:40.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.272+0000 7fa4a4a16640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4a0072af0 0x7fa4a007d360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:40.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.272+0000 7fa4a4a16640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4a0072af0 0x7fa4a007d360 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58720/0 (socket says 192.168.123.103:58720) 2026-03-09T16:16:40.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.272+0000 7fa4a4a16640 1 -- 192.168.123.103:0/527485447 learned_addr learned my addr 192.168.123.103:0/527485447 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:40.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.272+0000 7fa49ffff640 1 --2- 192.168.123.103:0/527485447 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4a00842d0 0x7fa4a007d8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:40.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.272+0000 7fa4a4a16640 1 -- 192.168.123.103:0/527485447 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4a00842d0 msgr2=0x7fa4a007d8a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:40.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.272+0000 7fa4a4a16640 1 --2- 192.168.123.103:0/527485447 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4a00842d0 0x7fa4a007d8a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:40.274 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.272+0000 7fa4a4a16640 1 -- 192.168.123.103:0/527485447 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa490009560 con 0x7fa4a0072af0 2026-03-09T16:16:40.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.273+0000 7fa4a4a16640 1 --2- 192.168.123.103:0/527485447 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4a0072af0 0x7fa4a007d360 secure :-1 s=READY pgs=320 cs=0 l=1 rev1=1 crypto rx=0x7fa4900378c0 tx=0x7fa490037d20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:40.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.274+0000 7fa49dffb640 1 -- 192.168.123.103:0/527485447 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa490004200 con 0x7fa4a0072af0 2026-03-09T16:16:40.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.274+0000 7fa4a6ca1640 1 -- 192.168.123.103:0/527485447 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa4a0081e70 con 0x7fa4a0072af0 2026-03-09T16:16:40.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.274+0000 7fa4a6ca1640 1 -- 192.168.123.103:0/527485447 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa4a0082360 con 0x7fa4a0072af0 2026-03-09T16:16:40.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.275+0000 7fa4a6ca1640 1 -- 192.168.123.103:0/527485447 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa4a0108570 con 0x7fa4a0072af0 2026-03-09T16:16:40.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.275+0000 7fa49dffb640 1 -- 192.168.123.103:0/527485447 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa490050080 con 0x7fa4a0072af0 2026-03-09T16:16:40.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.275+0000 7fa49dffb640 1 -- 192.168.123.103:0/527485447 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa490002db0 con 0x7fa4a0072af0 2026-03-09T16:16:40.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.278+0000 7fa49dffb640 1 -- 192.168.123.103:0/527485447 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7fa4900026e0 con 0x7fa4a0072af0 2026-03-09T16:16:40.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.278+0000 7fa49dffb640 1 --2- 192.168.123.103:0/527485447 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fa48c077560 0x7fa48c079a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:40.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.279+0000 7fa49dffb640 1 -- 192.168.123.103:0/527485447 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fa4900be080 con 0x7fa4a0072af0 2026-03-09T16:16:40.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.279+0000 7fa49ffff640 1 --2- 192.168.123.103:0/527485447 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fa48c077560 0x7fa48c079a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-09T16:16:40.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.279+0000 7fa49ffff640 1 --2- 192.168.123.103:0/527485447 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fa48c077560 0x7fa48c079a20 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fa4a0072f60 tx=0x7fa498002d20 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:40.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.281+0000 7fa49dffb640 1 -- 192.168.123.103:0/527485447 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7fa490086d40 con 0x7fa4a0072af0 2026-03-09T16:16:40.399 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.398+0000 7fa4a6ca1640 1 -- 192.168.123.103:0/527485447 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=log_to_journald}] v 0) v1 -- 0x7fa4a010ba70 con 0x7fa4a0072af0 2026-03-09T16:16:40.399 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.398+0000 7fa49dffb640 1 -- 192.168.123.103:0/527485447 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=log_to_journald}]=0 v39)=0 v39) v1 ==== 135+0+0 (secure 0 0 0) 0x7fa490086490 con 0x7fa4a0072af0 2026-03-09T16:16:40.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.401+0000 7fa4a6ca1640 1 -- 192.168.123.103:0/527485447 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fa48c077560 msgr2=0x7fa48c079a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:40.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.401+0000 7fa4a6ca1640 1 --2- 192.168.123.103:0/527485447 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fa48c077560 0x7fa48c079a20 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fa4a0072f60 tx=0x7fa498002d20 comp rx=0 tx=0).stop 2026-03-09T16:16:40.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.401+0000 7fa4a6ca1640 1 -- 192.168.123.103:0/527485447 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4a0072af0 msgr2=0x7fa4a007d360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:40.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.401+0000 7fa4a6ca1640 1 --2- 192.168.123.103:0/527485447 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4a0072af0 0x7fa4a007d360 secure :-1 s=READY pgs=320 cs=0 l=1 rev1=1 crypto rx=0x7fa4900378c0 tx=0x7fa490037d20 comp rx=0 tx=0).stop 2026-03-09T16:16:40.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.401+0000 7fa4a6ca1640 1 -- 192.168.123.103:0/527485447 shutdown_connections 2026-03-09T16:16:40.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.401+0000 7fa4a6ca1640 1 --2- 192.168.123.103:0/527485447 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fa48c077560 0x7fa48c079a20 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:40.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.401+0000 7fa4a6ca1640 1 --2- 192.168.123.103:0/527485447 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4a00842d0 0x7fa4a007d8a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:40.402 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.401+0000 7fa4a6ca1640 1 --2- 192.168.123.103:0/527485447 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4a0072af0 0x7fa4a007d360 unknown :-1 s=CLOSED pgs=320 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:40.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.401+0000 7fa4a6ca1640 1 -- 192.168.123.103:0/527485447 >> 192.168.123.103:0/527485447 conn(0x7fa4a006c7e0 msgr2=0x7fa4a00714c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:40.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.402+0000 7fa4a6ca1640 1 -- 192.168.123.103:0/527485447 shutdown_connections 2026-03-09T16:16:40.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:40.402+0000 7fa4a6ca1640 1 -- 192.168.123.103:0/527485447 wait complete. 2026-03-09T16:16:40.536 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1' 2026-03-09T16:16:40.738 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:16:41.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.097+0000 7fc68f094640 1 -- 192.168.123.103:0/2228936346 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc688072a40 msgr2=0x7fc68810ca90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:41.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.097+0000 7fc68f094640 1 --2- 192.168.123.103:0/2228936346 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc688072a40 0x7fc68810ca90 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7fc684009f90 tx=0x7fc68402f440 comp rx=0 tx=0).stop 2026-03-09T16:16:41.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.097+0000 7fc68f094640 1 -- 192.168.123.103:0/2228936346 shutdown_connections 2026-03-09T16:16:41.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.097+0000 7fc68f094640 1 --2- 192.168.123.103:0/2228936346 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc688072a40 0x7fc68810ca90 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:41.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.097+0000 7fc68f094640 1 --2- 192.168.123.103:0/2228936346 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc688072120 0x7fc688072500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:41.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.097+0000 7fc68f094640 1 -- 192.168.123.103:0/2228936346 >> 192.168.123.103:0/2228936346 conn(0x7fc68806c7d0 msgr2=0x7fc68806cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:41.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.097+0000 7fc68f094640 1 -- 192.168.123.103:0/2228936346 shutdown_connections 2026-03-09T16:16:41.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.097+0000 7fc68f094640 1 -- 192.168.123.103:0/2228936346 wait complete. 
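The upgrade itself is started through the same wrapper. Note that `$sha1` is left unexpanded by the outer shell (the inner command is single-quoted) and is resolved inside the container from the `-e sha1=...` environment variable, which is why the mgr_command a few lines below carries the fully qualified image name. Unwrapped, from the DEBUG line above:

    # kick off a cephadm-orchestrated upgrade to the build under test;
    # $sha1 is supplied via the -e sha1=... environment passed to cephadm shell
    sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell \
        -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
        --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc \
        -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- \
        bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1'
    # the mgr then receives the expanded target, as logged below:
    #   {"prefix": "orch upgrade start",
    #    "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df",
    #    "target": ["mon-mgr", ""]}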
2026-03-09T16:16:41.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.098+0000 7fc68f094640 1 Processor -- start 2026-03-09T16:16:41.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.098+0000 7fc68f094640 1 -- start start 2026-03-09T16:16:41.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.098+0000 7fc68f094640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc688072120 0x7fc688112c00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:41.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.098+0000 7fc68f094640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc688113140 0x7fc6881b9ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:41.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.098+0000 7fc68f094640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6881136f0 con 0x7fc688072120 2026-03-09T16:16:41.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.098+0000 7fc68f094640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc688113860 con 0x7fc688113140 2026-03-09T16:16:41.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.099+0000 7fc68e092640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc688072120 0x7fc688112c00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:41.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.099+0000 7fc68e092640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc688072120 0x7fc688112c00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58740/0 (socket says 192.168.123.103:58740) 2026-03-09T16:16:41.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.099+0000 7fc68e092640 1 -- 192.168.123.103:0/3643472947 learned_addr learned my addr 192.168.123.103:0/3643472947 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:41.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.099+0000 7fc68d891640 1 --2- 192.168.123.103:0/3643472947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc688113140 0x7fc6881b9ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:41.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.100+0000 7fc68d891640 1 -- 192.168.123.103:0/3643472947 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc688072120 msgr2=0x7fc688112c00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:41.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.100+0000 7fc68d891640 1 --2- 192.168.123.103:0/3643472947 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc688072120 0x7fc688112c00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:41.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.100+0000 7fc68d891640 1 -- 192.168.123.103:0/3643472947 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fc684009c40 con 0x7fc688113140 2026-03-09T16:16:41.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.100+0000 7fc68d891640 1 --2- 192.168.123.103:0/3643472947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc688113140 0x7fc6881b9ad0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fc684009f90 tx=0x7fc6840041c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:41.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.103+0000 7fc6777fe640 1 -- 192.168.123.103:0/3643472947 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc684002c70 con 0x7fc688113140 2026-03-09T16:16:41.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.103+0000 7fc68f094640 1 -- 192.168.123.103:0/3643472947 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc6881ba070 con 0x7fc688113140 2026-03-09T16:16:41.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.104+0000 7fc68f094640 1 -- 192.168.123.103:0/3643472947 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc6881ba5c0 con 0x7fc688113140 2026-03-09T16:16:41.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.105+0000 7fc6777fe640 1 -- 192.168.123.103:0/3643472947 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc684002dd0 con 0x7fc688113140 2026-03-09T16:16:41.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.105+0000 7fc6777fe640 1 -- 192.168.123.103:0/3643472947 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc6840407e0 con 0x7fc688113140 2026-03-09T16:16:41.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.106+0000 7fc6777fe640 1 -- 192.168.123.103:0/3643472947 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7fc684007ac0 con 0x7fc688113140 2026-03-09T16:16:41.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.107+0000 7fc6777fe640 1 --2- 192.168.123.103:0/3643472947 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fc660077630 0x7fc660079af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:41.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.107+0000 7fc68e092640 1 --2- 192.168.123.103:0/3643472947 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fc660077630 0x7fc660079af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:41.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.107+0000 7fc6777fe640 1 -- 192.168.123.103:0/3643472947 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fc6840bdbd0 con 0x7fc688113140 2026-03-09T16:16:41.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.107+0000 7fc68e092640 1 --2- 192.168.123.103:0/3643472947 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fc660077630 0x7fc660079af0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fc678009ea0 tx=0x7fc678009340 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:41.108 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.107+0000 7fc68f094640 1 -- 192.168.123.103:0/3643472947 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc650005350 con 0x7fc688113140 2026-03-09T16:16:41.111 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.110+0000 7fc6777fe640 1 -- 192.168.123.103:0/3643472947 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7fc6840868b0 con 0x7fc688113140 2026-03-09T16:16:41.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.234+0000 7fc68f094640 1 -- 192.168.123.103:0/3643472947 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}) v1 -- 0x7fc650002bf0 con 0x7fc660077630 2026-03-09T16:16:41.262 INFO:teuthology.orchestra.run.vm03.stdout:Initiating upgrade to quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:16:41.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.259+0000 7fc6777fe640 1 -- 192.168.123.103:0/3643472947 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+89 (secure 0 0 0) 0x7fc650002bf0 con 0x7fc660077630 2026-03-09T16:16:41.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.261+0000 7fc68f094640 1 -- 192.168.123.103:0/3643472947 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fc660077630 msgr2=0x7fc660079af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:41.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.261+0000 7fc68f094640 1 --2- 192.168.123.103:0/3643472947 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fc660077630 0x7fc660079af0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fc678009ea0 tx=0x7fc678009340 comp rx=0 tx=0).stop 2026-03-09T16:16:41.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.261+0000 7fc68f094640 1 -- 192.168.123.103:0/3643472947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc688113140 msgr2=0x7fc6881b9ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:41.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.261+0000 7fc68f094640 1 --2- 192.168.123.103:0/3643472947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc688113140 0x7fc6881b9ad0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fc684009f90 tx=0x7fc6840041c0 comp rx=0 tx=0).stop 2026-03-09T16:16:41.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.261+0000 7fc68f094640 1 -- 192.168.123.103:0/3643472947 shutdown_connections 2026-03-09T16:16:41.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.261+0000 7fc68f094640 1 --2- 192.168.123.103:0/3643472947 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fc660077630 0x7fc660079af0 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:41.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.261+0000 7fc68f094640 1 --2- 192.168.123.103:0/3643472947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc688113140 0x7fc6881b9ad0 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:41.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.261+0000 7fc68f094640 1 --2- 192.168.123.103:0/3643472947 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc688072120 0x7fc688112c00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:41.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.261+0000 7fc68f094640 1 -- 192.168.123.103:0/3643472947 >> 192.168.123.103:0/3643472947 conn(0x7fc68806c7d0 msgr2=0x7fc688070eb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:41.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.261+0000 7fc68f094640 1 -- 192.168.123.103:0/3643472947 shutdown_connections 2026-03-09T16:16:41.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:41.262+0000 7fc68f094640 1 -- 192.168.123.103:0/3643472947 wait complete. 2026-03-09T16:16:41.315 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:41 vm03.local ceph-mon[51019]: pgmap v60: 65 pgs: 65 active+clean; 310 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.4 MiB/s wr, 340 op/s 2026-03-09T16:16:41.366 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-09T16:16:41.366 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T16:16:41.366 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'while ceph orch upgrade status | jq '"'"'.in_progress'"'"' | grep true && ! ceph orch upgrade status | jq '"'"'.message'"'"' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done' 2026-03-09T16:16:41.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:41 vm05.local ceph-mon[58702]: pgmap v60: 65 pgs: 65 active+clean; 310 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.4 MiB/s wr, 340 op/s 2026-03-09T16:16:41.820 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:16:42.428 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:42 vm03.local ceph-mon[51019]: from='client.24525 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:42.428 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:42 vm03.local ceph-mon[51019]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:16:42.428 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:42.428 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:42.428 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:16:42.428 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 
16:16:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:16:42.428 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:42 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:42.428 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:42 vm03.local ceph-mon[51019]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T16:16:42.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.440+0000 7fccb6507640 1 -- 192.168.123.103:0/1169012385 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca80a5310 msgr2=0x7fcca80b75b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:42.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.440+0000 7fccb6507640 1 --2- 192.168.123.103:0/1169012385 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca80a5310 0x7fcca80b75b0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7fcca40098e0 tx=0x7fcca402f200 comp rx=0 tx=0).stop 2026-03-09T16:16:42.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.440+0000 7fccb6507640 1 -- 192.168.123.103:0/1169012385 shutdown_connections 2026-03-09T16:16:42.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.440+0000 7fccb6507640 1 --2- 192.168.123.103:0/1169012385 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca80a5310 0x7fcca80b75b0 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.440+0000 7fccb6507640 1 --2- 192.168.123.103:0/1169012385 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcca80a49f0 0x7fcca80a4dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.440+0000 7fccb6507640 1 -- 192.168.123.103:0/1169012385 >> 192.168.123.103:0/1169012385 conn(0x7fcca801a3c0 msgr2=0x7fcca801a7d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:42.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.441+0000 7fccb6507640 1 -- 192.168.123.103:0/1169012385 shutdown_connections 2026-03-09T16:16:42.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.441+0000 7fccb6507640 1 -- 192.168.123.103:0/1169012385 wait complete. 
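The long-running wait command queued by the sequential cephadm.shell task at 16:16:41.366 is hard to read through teuthology's shell escaping (the `'"'"'` runs are escaped single quotes). With the quoting removed and reindented, the inner command is the following loop, transcribed directly from the DEBUG line above:

    # poll until the orchestrator reports the upgrade finished or errored,
    # dumping daemon, version, fs and health state every 30 s for the archive
    while ceph orch upgrade status | jq '.in_progress' | grep true && \
          ! ceph orch upgrade status | jq '.message' | grep Error ; do
        ceph orch ps
        ceph versions
        ceph fs dump
        ceph orch upgrade status
        ceph health detail
        sleep 30
    done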
2026-03-09T16:16:42.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.442+0000 7fccb6507640 1 Processor -- start 2026-03-09T16:16:42.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.442+0000 7fccb6507640 1 -- start start 2026-03-09T16:16:42.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.442+0000 7fccb6507640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcca80a49f0 0x7fcca8144f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:42.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.442+0000 7fccb6507640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca80a5310 0x7fcca8145470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:42.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.442+0000 7fccb6507640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcca8145b00 con 0x7fcca80a49f0 2026-03-09T16:16:42.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.442+0000 7fccb6507640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcca8149870 con 0x7fcca80a5310 2026-03-09T16:16:42.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.442+0000 7fccb5505640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcca80a49f0 0x7fcca8144f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:42.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.442+0000 7fccb5505640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcca80a49f0 0x7fcca8144f30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58754/0 (socket says 192.168.123.103:58754) 2026-03-09T16:16:42.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.442+0000 7fccb5505640 1 -- 192.168.123.103:0/514182145 learned_addr learned my addr 192.168.123.103:0/514182145 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:42.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.442+0000 7fccb4d04640 1 --2- 192.168.123.103:0/514182145 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca80a5310 0x7fcca8145470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:42.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.442+0000 7fccb5505640 1 -- 192.168.123.103:0/514182145 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca80a5310 msgr2=0x7fcca8145470 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:42.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.442+0000 7fccb5505640 1 --2- 192.168.123.103:0/514182145 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca80a5310 0x7fcca8145470 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.442+0000 7fccb5505640 1 -- 192.168.123.103:0/514182145 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcca4009590 con 
0x7fcca80a49f0 2026-03-09T16:16:42.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.443+0000 7fccb5505640 1 --2- 192.168.123.103:0/514182145 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcca80a49f0 0x7fcca8144f30 secure :-1 s=READY pgs=321 cs=0 l=1 rev1=1 crypto rx=0x7fccac00e990 tx=0x7fccac00ee60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:42.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.443+0000 7fcc9e7fc640 1 -- 192.168.123.103:0/514182145 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fccac00cd30 con 0x7fcca80a49f0 2026-03-09T16:16:42.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.443+0000 7fccb6507640 1 -- 192.168.123.103:0/514182145 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcca8149b50 con 0x7fcca80a49f0 2026-03-09T16:16:42.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.443+0000 7fccb6507640 1 -- 192.168.123.103:0/514182145 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcca814a0a0 con 0x7fcca80a49f0 2026-03-09T16:16:42.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.443+0000 7fcc9e7fc640 1 -- 192.168.123.103:0/514182145 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fccac00ce90 con 0x7fcca80a49f0 2026-03-09T16:16:42.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.443+0000 7fcc9e7fc640 1 -- 192.168.123.103:0/514182145 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fccac002b40 con 0x7fcca80a49f0 2026-03-09T16:16:42.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.444+0000 7fccb6507640 1 -- 192.168.123.103:0/514182145 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcc78005350 con 0x7fcca80a49f0 2026-03-09T16:16:42.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.445+0000 7fcc9e7fc640 1 -- 192.168.123.103:0/514182145 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7fccac0040d0 con 0x7fcca80a49f0 2026-03-09T16:16:42.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.445+0000 7fcc9e7fc640 1 --2- 192.168.123.103:0/514182145 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fcc88077560 0x7fcc88079a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:42.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.445+0000 7fcc9e7fc640 1 -- 192.168.123.103:0/514182145 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fccac014070 con 0x7fcca80a49f0 2026-03-09T16:16:42.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.446+0000 7fccb4d04640 1 --2- 192.168.123.103:0/514182145 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fcc88077560 0x7fcc88079a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:42.448 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.446+0000 7fccb4d04640 1 --2- 192.168.123.103:0/514182145 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] 
conn(0x7fcc88077560 0x7fcc88079a20 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fcca81464e0 tx=0x7fcca403a040 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T16:16:42.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.448+0000 7fcc9e7fc640 1 -- 192.168.123.103:0/514182145 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7fccac062860 con 0x7fcca80a49f0
2026-03-09T16:16:42.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:42 vm05.local ceph-mon[58702]: from='client.24525 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T16:16:42.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:42 vm05.local ceph-mon[58702]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T16:16:42.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu'
2026-03-09T16:16:42.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T16:16:42.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T16:16:42.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T16:16:42.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:42 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu'
2026-03-09T16:16:42.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:42 vm05.local ceph-mon[58702]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T16:16:42.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.567+0000 7fccb6507640 1 -- 192.168.123.103:0/514182145 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fcc78002bf0 con 0x7fcc88077560
2026-03-09T16:16:42.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.569+0000 7fcc9e7fc640 1 -- 192.168.123.103:0/514182145 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7fcc78002bf0 con 0x7fcc88077560
2026-03-09T16:16:42.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.573+0000 7fcc7ffff640 1 -- 192.168.123.103:0/514182145 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fcc88077560 msgr2=0x7fcc88079a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:16:42.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.573+0000 7fcc7ffff640 1 --2- 192.168.123.103:0/514182145 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fcc88077560 0x7fcc88079a20 secure :-1 s=READY pgs=47 cs=0
l=1 rev1=1 crypto rx=0x7fcca81464e0 tx=0x7fcca403a040 comp rx=0 tx=0).stop 2026-03-09T16:16:42.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.573+0000 7fcc7ffff640 1 -- 192.168.123.103:0/514182145 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcca80a49f0 msgr2=0x7fcca8144f30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:42.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.573+0000 7fcc7ffff640 1 --2- 192.168.123.103:0/514182145 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcca80a49f0 0x7fcca8144f30 secure :-1 s=READY pgs=321 cs=0 l=1 rev1=1 crypto rx=0x7fccac00e990 tx=0x7fccac00ee60 comp rx=0 tx=0).stop 2026-03-09T16:16:42.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.575+0000 7fcc7ffff640 1 -- 192.168.123.103:0/514182145 shutdown_connections 2026-03-09T16:16:42.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.575+0000 7fcc7ffff640 1 --2- 192.168.123.103:0/514182145 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7fcc88077560 0x7fcc88079a20 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.575+0000 7fcc7ffff640 1 --2- 192.168.123.103:0/514182145 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca80a5310 0x7fcca8145470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.575+0000 7fcc7ffff640 1 --2- 192.168.123.103:0/514182145 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcca80a49f0 0x7fcca8144f30 unknown :-1 s=CLOSED pgs=321 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.575+0000 7fcc7ffff640 1 -- 192.168.123.103:0/514182145 >> 192.168.123.103:0/514182145 conn(0x7fcca801a3c0 msgr2=0x7fcca80a39e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:42.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.575+0000 7fcc7ffff640 1 -- 192.168.123.103:0/514182145 shutdown_connections 2026-03-09T16:16:42.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.575+0000 7fcc7ffff640 1 -- 192.168.123.103:0/514182145 wait complete. 
2026-03-09T16:16:42.584 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:16:42.636 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.635+0000 7f18bb596640 1 -- 192.168.123.103:0/956217955 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b00a5310 msgr2=0x7f18b00b75b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:42.636 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.635+0000 7f18bb596640 1 --2- 192.168.123.103:0/956217955 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b00a5310 0x7f18b00b75b0 secure :-1 s=READY pgs=322 cs=0 l=1 rev1=1 crypto rx=0x7f18b4066a00 tx=0x7f18b4091d70 comp rx=0 tx=0).stop 2026-03-09T16:16:42.636 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.635+0000 7f18bb596640 1 -- 192.168.123.103:0/956217955 shutdown_connections 2026-03-09T16:16:42.636 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.635+0000 7f18bb596640 1 --2- 192.168.123.103:0/956217955 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b00a5310 0x7f18b00b75b0 unknown :-1 s=CLOSED pgs=322 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.636 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.635+0000 7f18bb596640 1 --2- 192.168.123.103:0/956217955 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18b00a49f0 0x7f18b00a4dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.636 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.635+0000 7f18bb596640 1 -- 192.168.123.103:0/956217955 >> 192.168.123.103:0/956217955 conn(0x7f18b001a3c0 msgr2=0x7f18b001a7d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:42.636 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.635+0000 7f18bb596640 1 -- 192.168.123.103:0/956217955 shutdown_connections 2026-03-09T16:16:42.636 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.635+0000 7f18bb596640 1 -- 192.168.123.103:0/956217955 wait complete. 
2026-03-09T16:16:42.636 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.636+0000 7f18bb596640 1 Processor -- start 2026-03-09T16:16:42.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.636+0000 7f18bb596640 1 -- start start 2026-03-09T16:16:42.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.636+0000 7f18bb596640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18b00a49f0 0x7f18b00b0b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:42.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.636+0000 7f18bb596640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b00b10c0 0x7f18b0157480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:42.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.636+0000 7f18bb596640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f18b00b16c0 con 0x7f18b00b10c0 2026-03-09T16:16:42.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.636+0000 7f18bb596640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f18b00b1830 con 0x7f18b00a49f0 2026-03-09T16:16:42.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.637+0000 7f18ba594640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18b00a49f0 0x7f18b00b0b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:42.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.637+0000 7f18ba594640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18b00a49f0 0x7f18b00b0b80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:50154/0 (socket says 192.168.123.103:50154) 2026-03-09T16:16:42.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.637+0000 7f18ba594640 1 -- 192.168.123.103:0/999095598 learned_addr learned my addr 192.168.123.103:0/999095598 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:42.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.637+0000 7f18b9d93640 1 --2- 192.168.123.103:0/999095598 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b00b10c0 0x7f18b0157480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:42.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.637+0000 7f18b9d93640 1 -- 192.168.123.103:0/999095598 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18b00a49f0 msgr2=0x7f18b00b0b80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:42.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.637+0000 7f18b9d93640 1 --2- 192.168.123.103:0/999095598 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18b00a49f0 0x7f18b00b0b80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.637+0000 7f18b9d93640 1 -- 192.168.123.103:0/999095598 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f18b404f090 con 
0x7f18b00b10c0 2026-03-09T16:16:42.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.637+0000 7f18b9d93640 1 --2- 192.168.123.103:0/999095598 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b00b10c0 0x7f18b0157480 secure :-1 s=READY pgs=323 cs=0 l=1 rev1=1 crypto rx=0x7f18b40645d0 tx=0x7f18b4067800 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:42.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.638+0000 7f18a77fe640 1 -- 192.168.123.103:0/999095598 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f18b40a0070 con 0x7f18b00b10c0 2026-03-09T16:16:42.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.638+0000 7f18bb596640 1 -- 192.168.123.103:0/999095598 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f18b0157a20 con 0x7f18b00b10c0 2026-03-09T16:16:42.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.638+0000 7f18bb596640 1 -- 192.168.123.103:0/999095598 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f18b0157f40 con 0x7f18b00b10c0 2026-03-09T16:16:42.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.639+0000 7f18a77fe640 1 -- 192.168.123.103:0/999095598 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f18b409b470 con 0x7f18b00b10c0 2026-03-09T16:16:42.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.639+0000 7f18a77fe640 1 -- 192.168.123.103:0/999095598 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f18b4094280 con 0x7f18b00b10c0 2026-03-09T16:16:42.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.640+0000 7f18a77fe640 1 -- 192.168.123.103:0/999095598 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7f18b40927e0 con 0x7f18b00b10c0 2026-03-09T16:16:42.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.641+0000 7f18a77fe640 1 --2- 192.168.123.103:0/999095598 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f188c077630 0x7f188c079af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:42.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.642+0000 7f18ba594640 1 --2- 192.168.123.103:0/999095598 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f188c077630 0x7f188c079af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:42.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.642+0000 7f18a77fe640 1 -- 192.168.123.103:0/999095598 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f18b4121f50 con 0x7f18b00b10c0 2026-03-09T16:16:42.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.642+0000 7f18ba594640 1 --2- 192.168.123.103:0/999095598 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f188c077630 0x7f188c079af0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f18a00059c0 tx=0x7f18a000a430 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:42.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.642+0000 
7f18bb596640 1 -- 192.168.123.103:0/999095598 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f18b00a5de0 con 0x7f18b00b10c0 2026-03-09T16:16:42.649 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.646+0000 7f18a77fe640 1 -- 192.168.123.103:0/999095598 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f18b40eab80 con 0x7f18b00b10c0 2026-03-09T16:16:42.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.755+0000 7f18bb596640 1 -- 192.168.123.103:0/999095598 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f18b00b1eb0 con 0x7f188c077630 2026-03-09T16:16:42.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.760+0000 7f18a77fe640 1 -- 192.168.123.103:0/999095598 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f18b00b1eb0 con 0x7f188c077630 2026-03-09T16:16:42.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.764+0000 7f18a57fa640 1 -- 192.168.123.103:0/999095598 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f188c077630 msgr2=0x7f188c079af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:42.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.764+0000 7f18a57fa640 1 --2- 192.168.123.103:0/999095598 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f188c077630 0x7f188c079af0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f18a00059c0 tx=0x7f18a000a430 comp rx=0 tx=0).stop 2026-03-09T16:16:42.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.764+0000 7f18a57fa640 1 -- 192.168.123.103:0/999095598 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b00b10c0 msgr2=0x7f18b0157480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:42.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.764+0000 7f18a57fa640 1 --2- 192.168.123.103:0/999095598 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b00b10c0 0x7f18b0157480 secure :-1 s=READY pgs=323 cs=0 l=1 rev1=1 crypto rx=0x7f18b40645d0 tx=0x7f18b4067800 comp rx=0 tx=0).stop 2026-03-09T16:16:42.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.765+0000 7f18a57fa640 1 -- 192.168.123.103:0/999095598 shutdown_connections 2026-03-09T16:16:42.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.765+0000 7f18a57fa640 1 --2- 192.168.123.103:0/999095598 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f188c077630 0x7f188c079af0 secure :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f18a00059c0 tx=0x7f18a000a430 comp rx=0 tx=0).stop 2026-03-09T16:16:42.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.765+0000 7f18a57fa640 1 --2- 192.168.123.103:0/999095598 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b00b10c0 0x7f18b0157480 unknown :-1 s=CLOSED pgs=323 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.765+0000 7f18a57fa640 1 --2- 192.168.123.103:0/999095598 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18b00a49f0 0x7f18b00b0b80 unknown :-1 s=CLOSED pgs=0 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.765+0000 7f18a57fa640 1 -- 192.168.123.103:0/999095598 >> 192.168.123.103:0/999095598 conn(0x7f18b001a3c0 msgr2=0x7f18b00b6770 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:42.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.767+0000 7f18a57fa640 1 -- 192.168.123.103:0/999095598 shutdown_connections 2026-03-09T16:16:42.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.767+0000 7f18a57fa640 1 -- 192.168.123.103:0/999095598 wait complete. 2026-03-09T16:16:42.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.832+0000 7f30b2abd640 1 -- 192.168.123.103:0/3301936034 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30ac072af0 msgr2=0x7f30ac10ba70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:42.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.832+0000 7f30b2abd640 1 --2- 192.168.123.103:0/3301936034 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30ac072af0 0x7f30ac10ba70 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f30a4008030 tx=0x7f30a4031e80 comp rx=0 tx=0).stop 2026-03-09T16:16:42.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.832+0000 7f30b2abd640 1 -- 192.168.123.103:0/3301936034 shutdown_connections 2026-03-09T16:16:42.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.832+0000 7f30b2abd640 1 --2- 192.168.123.103:0/3301936034 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30ac072af0 0x7f30ac10ba70 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.832+0000 7f30b2abd640 1 --2- 192.168.123.103:0/3301936034 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30ac072140 0x7f30ac072520 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.832+0000 7f30b2abd640 1 -- 192.168.123.103:0/3301936034 >> 192.168.123.103:0/3301936034 conn(0x7f30ac06c7e0 msgr2=0x7f30ac06cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:42.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.832+0000 7f30b2abd640 1 -- 192.168.123.103:0/3301936034 shutdown_connections 2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.832+0000 7f30b2abd640 1 -- 192.168.123.103:0/3301936034 wait complete. 
2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.833+0000 7f30b2abd640 1 Processor -- start 2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.833+0000 7f30b2abd640 1 -- start start 2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.833+0000 7f30b2abd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30ac072140 0x7f30ac07d3d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.833+0000 7f30b2abd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30ac084340 0x7f30ac07d910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.833+0000 7f30b2abd640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f30ac07dff0 con 0x7f30ac084340 2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.833+0000 7f30b2abd640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f30ac07e160 con 0x7f30ac072140 2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.833+0000 7f30b0832640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30ac072140 0x7f30ac07d3d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.833+0000 7f30abfff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30ac084340 0x7f30ac07d910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.833+0000 7f30b0832640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30ac072140 0x7f30ac07d3d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:50180/0 (socket says 192.168.123.103:50180) 2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.833+0000 7f30b0832640 1 -- 192.168.123.103:0/2801023611 learned_addr learned my addr 192.168.123.103:0/2801023611 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.834+0000 7f30abfff640 1 -- 192.168.123.103:0/2801023611 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30ac072140 msgr2=0x7f30ac07d3d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.834+0000 7f30abfff640 1 --2- 192.168.123.103:0/2801023611 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30ac072140 0x7f30ac07d3d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.834+0000 7f30abfff640 1 -- 192.168.123.103:0/2801023611 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f30a4007ce0 con 0x7f30ac084340 
2026-03-09T16:16:42.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.834+0000 7f30abfff640 1 --2- 192.168.123.103:0/2801023611 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30ac084340 0x7f30ac07d910 secure :-1 s=READY pgs=324 cs=0 l=1 rev1=1 crypto rx=0x7f30a4008030 tx=0x7f30a4002ea0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:42.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.835+0000 7f30a9ffb640 1 -- 192.168.123.103:0/2801023611 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f30a40102e0 con 0x7f30ac084340 2026-03-09T16:16:42.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.835+0000 7f30a9ffb640 1 -- 192.168.123.103:0/2801023611 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f30a4032ca0 con 0x7f30ac084340 2026-03-09T16:16:42.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.835+0000 7f30a9ffb640 1 -- 192.168.123.103:0/2801023611 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f30a403be10 con 0x7f30ac084340 2026-03-09T16:16:42.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.835+0000 7f30b2abd640 1 -- 192.168.123.103:0/2801023611 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f30ac081ee0 con 0x7f30ac084340 2026-03-09T16:16:42.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.835+0000 7f30b2abd640 1 -- 192.168.123.103:0/2801023611 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f30ac082380 con 0x7f30ac084340 2026-03-09T16:16:42.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.836+0000 7f30b2abd640 1 -- 192.168.123.103:0/2801023611 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f30ac108570 con 0x7f30ac084340 2026-03-09T16:16:42.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.837+0000 7f30a9ffb640 1 -- 192.168.123.103:0/2801023611 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7f30a400e450 con 0x7f30ac084340 2026-03-09T16:16:42.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.838+0000 7f30a9ffb640 1 --2- 192.168.123.103:0/2801023611 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f308c077630 0x7f308c079af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:42.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.838+0000 7f30a9ffb640 1 -- 192.168.123.103:0/2801023611 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f30a40bfaf0 con 0x7f30ac084340 2026-03-09T16:16:42.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.841+0000 7f30b0832640 1 --2- 192.168.123.103:0/2801023611 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f308c077630 0x7f308c079af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:42.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.842+0000 7f30b0832640 1 --2- 192.168.123.103:0/2801023611 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f308c077630 
0x7f308c079af0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f309c009e10 tx=0x7f309c009290 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T16:16:42.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.843+0000 7f30a9ffb640 1 -- 192.168.123.103:0/2801023611 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f30a4088720 con 0x7f30ac084340
2026-03-09T16:16:42.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.956+0000 7f30b2abd640 1 -- 192.168.123.103:0/2801023611 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f30ac078e00 con 0x7f308c077630
2026-03-09T16:16:42.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.961+0000 7f30a9ffb640 1 -- 192.168.123.103:0/2801023611 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3600 (secure 0 0 0) 0x7f30ac078e00 con 0x7f308c077630
2026-03-09T16:16:42.962 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (59s) 39s ago 6m 16.5M - 0.25.0 c8568f914cd2 61c29cd7a09d
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (6m) 39s ago 6m 9013k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (6m) 78s ago 6m 8812k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (6m) 39s ago 6m 7625k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 05e9be717970
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (6m) 78s ago 6m 7608k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 32f80ccecaa9
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (42s) 39s ago 6m 37.3M - 10.4.0 c8b91775d855 6f4f55eef4bb
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (4m) 39s ago 4m 17.5M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8e7e3eb06891
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (4m) 39s ago 4m 272M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 f23b1415c23e
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (4m) 78s ago 4m 15.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 fbf69f4859f1
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (4m) 78s ago 4m 17.2M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 e7155e6e0a47
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:8443,9283,8765 running (2m) 39s ago 7m 614M - 19.2.3-678-ge911bdeb 654f31e6858e f10e9f43c355
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (102s) 78s ago 5m 488M - 19.2.3-678-ge911bdeb 654f31e6858e 5276dc4902e9
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (7m) 39s ago 7m 60.6M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 b86752d320b6
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (5m) 78s ago 5m 48.2M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 90242efb0978
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (89s) 39s ago 6m 9110k - 1.7.0 72c9c2088986 73da4350a8ed
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (82s) 78s ago 6m 5356k - 1.7.0 72c9c2088986 0be807a191b0
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (5m) 39s ago 5m 297M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2ea78f0d62f8
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (5m) 39s ago 5m 300M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6169f9824413
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (5m) 39s ago 5m 251M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 31188175e77b
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (5m) 78s ago 5m 354M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d95aab347c9f
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (4m) 78s ago 4m 304M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 5076005b452d
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (4m) 78s ago 4m 266M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 56fb3849b087
2026-03-09T16:16:42.963 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (66s) 39s ago 6m 40.2M - 2.51.0 1d3b7f56885b ce88dd379864
2026-03-09T16:16:42.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.968+0000 7f30b2abd640 1 -- 192.168.123.103:0/2801023611 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f308c077630 msgr2=0x7f308c079af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:16:42.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.968+0000 7f30b2abd640 1 --2- 192.168.123.103:0/2801023611 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f308c077630 0x7f308c079af0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f309c009e10 tx=0x7f309c009290 comp rx=0 tx=0).stop
2026-03-09T16:16:42.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.968+0000 7f30b2abd640 1 -- 192.168.123.103:0/2801023611 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30ac084340 msgr2=0x7f30ac07d910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:16:42.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.968+0000 7f30b2abd640 1 --2- 192.168.123.103:0/2801023611 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30ac084340 0x7f30ac07d910 secure :-1 s=READY pgs=324 cs=0 l=1 rev1=1 crypto rx=0x7f30a4008030 tx=0x7f30a4002ea0 comp rx=0 tx=0).stop
2026-03-09T16:16:42.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.968+0000 7f30b2abd640 1 -- 192.168.123.103:0/2801023611 shutdown_connections
2026-03-09T16:16:42.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.968+0000 7f30b2abd640 1 --2- 192.168.123.103:0/2801023611 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f308c077630 0x7f308c079af0 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T16:16:42.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.968+0000 7f30b2abd640 1 --2- 192.168.123.103:0/2801023611 >>
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30ac084340 0x7f30ac07d910 unknown :-1 s=CLOSED pgs=324 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.968+0000 7f30b2abd640 1 --2- 192.168.123.103:0/2801023611 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30ac072140 0x7f30ac07d3d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:42.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.968+0000 7f30b2abd640 1 -- 192.168.123.103:0/2801023611 >> 192.168.123.103:0/2801023611 conn(0x7f30ac06c7e0 msgr2=0x7f30ac07b0f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:42.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.969+0000 7f30b2abd640 1 -- 192.168.123.103:0/2801023611 shutdown_connections 2026-03-09T16:16:42.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:42.969+0000 7f30b2abd640 1 -- 192.168.123.103:0/2801023611 wait complete. 2026-03-09T16:16:43.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.030+0000 7f9ebb5d7640 1 -- 192.168.123.103:0/1235715503 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9eb40720b0 msgr2=0x7f9eb4072490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:43.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.030+0000 7f9ebb5d7640 1 --2- 192.168.123.103:0/1235715503 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9eb40720b0 0x7f9eb4072490 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f9ea400b0a0 tx=0x7f9ea402f550 comp rx=0 tx=0).stop 2026-03-09T16:16:43.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.030+0000 7f9ebb5d7640 1 -- 192.168.123.103:0/1235715503 shutdown_connections 2026-03-09T16:16:43.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.030+0000 7f9ebb5d7640 1 --2- 192.168.123.103:0/1235715503 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9eb40729d0 0x7f9eb410b9f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.030+0000 7f9ebb5d7640 1 --2- 192.168.123.103:0/1235715503 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9eb40720b0 0x7f9eb4072490 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.030+0000 7f9ebb5d7640 1 -- 192.168.123.103:0/1235715503 >> 192.168.123.103:0/1235715503 conn(0x7f9eb406c7e0 msgr2=0x7f9eb406cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:43.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.036+0000 7f9ebb5d7640 1 -- 192.168.123.103:0/1235715503 shutdown_connections 2026-03-09T16:16:43.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.036+0000 7f9ebb5d7640 1 -- 192.168.123.103:0/1235715503 wait complete. 
2026-03-09T16:16:43.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.036+0000 7f9ebb5d7640 1 Processor -- start 2026-03-09T16:16:43.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.036+0000 7f9ebb5d7640 1 -- start start 2026-03-09T16:16:43.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.036+0000 7f9ebb5d7640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9eb40720b0 0x7f9eb41ad420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:43.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.036+0000 7f9ebb5d7640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9eb40729d0 0x7f9eb41ad960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:43.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.036+0000 7f9ebb5d7640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9eb41adff0 con 0x7f9eb40729d0 2026-03-09T16:16:43.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.036+0000 7f9ebb5d7640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9eb41a75e0 con 0x7f9eb40720b0 2026-03-09T16:16:43.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.037+0000 7f9eb934c640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9eb40720b0 0x7f9eb41ad420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:43.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.037+0000 7f9eb934c640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9eb40720b0 0x7f9eb41ad420 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:50202/0 (socket says 192.168.123.103:50202) 2026-03-09T16:16:43.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.037+0000 7f9eb934c640 1 -- 192.168.123.103:0/2688822661 learned_addr learned my addr 192.168.123.103:0/2688822661 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:43.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.037+0000 7f9eb8b4b640 1 --2- 192.168.123.103:0/2688822661 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9eb40729d0 0x7f9eb41ad960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:43.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.037+0000 7f9eb934c640 1 -- 192.168.123.103:0/2688822661 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9eb40729d0 msgr2=0x7f9eb41ad960 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:43.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.037+0000 7f9eb934c640 1 --2- 192.168.123.103:0/2688822661 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9eb40729d0 0x7f9eb41ad960 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.037+0000 7f9eb934c640 1 -- 192.168.123.103:0/2688822661 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f9ea4009d00 con 0x7f9eb40720b0 2026-03-09T16:16:43.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.037+0000 7f9eb934c640 1 --2- 192.168.123.103:0/2688822661 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9eb40720b0 0x7f9eb41ad420 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f9ea4009cd0 tx=0x7f9ea4009510 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:43.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.038+0000 7f9eb27fc640 1 -- 192.168.123.103:0/2688822661 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ea403d070 con 0x7f9eb40720b0 2026-03-09T16:16:43.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.038+0000 7f9ebb5d7640 1 -- 192.168.123.103:0/2688822661 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9eb41a77e0 con 0x7f9eb40720b0 2026-03-09T16:16:43.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.038+0000 7f9ebb5d7640 1 -- 192.168.123.103:0/2688822661 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9eb41a7d00 con 0x7f9eb40720b0 2026-03-09T16:16:43.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.038+0000 7f9eb27fc640 1 -- 192.168.123.103:0/2688822661 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9ea4032070 con 0x7f9eb40720b0 2026-03-09T16:16:43.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.039+0000 7f9eb27fc640 1 -- 192.168.123.103:0/2688822661 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ea4037b90 con 0x7f9eb40720b0 2026-03-09T16:16:43.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.040+0000 7f9eb27fc640 1 -- 192.168.123.103:0/2688822661 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7f9ea4037cf0 con 0x7f9eb40720b0 2026-03-09T16:16:43.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.041+0000 7f9eb27fc640 1 --2- 192.168.123.103:0/2688822661 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9e88077560 0x7f9e88079a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:43.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.041+0000 7f9eb8b4b640 1 --2- 192.168.123.103:0/2688822661 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9e88077560 0x7f9e88079a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:43.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.041+0000 7f9eb27fc640 1 -- 192.168.123.103:0/2688822661 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9ea40be4d0 con 0x7f9eb40720b0 2026-03-09T16:16:43.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.041+0000 7f9ebb5d7640 1 -- 192.168.123.103:0/2688822661 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9eb4108570 con 0x7f9eb40720b0 2026-03-09T16:16:43.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.041+0000 7f9eb8b4b640 1 --2- 192.168.123.103:0/2688822661 >> 
[v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9e88077560 0x7f9e88079a20 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f9eb40714c0 tx=0x7f9e9c008040 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T16:16:43.046 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.045+0000 7f9eb27fc640 1 -- 192.168.123.103:0/2688822661 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f9ea4087080 con 0x7f9eb40720b0
2026-03-09T16:16:43.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.225+0000 7f9ebb5d7640 1 -- 192.168.123.103:0/2688822661 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f9eb41a8b10 con 0x7f9eb40720b0
2026-03-09T16:16:43.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.226+0000 7f9eb27fc640 1 -- 192.168.123.103:0/2688822661 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+739 (secure 0 0 0) 0x7f9ea4046070 con 0x7f9eb40720b0
2026-03-09T16:16:43.228 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T16:16:43.228 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T16:16:43.228 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2
2026-03-09T16:16:43.228 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:16:43.228 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T16:16:43.228 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T16:16:43.228 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:16:43.228 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T16:16:43.228 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6
2026-03-09T16:16:43.228 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:16:43.228 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T16:16:43.229 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4
2026-03-09T16:16:43.229 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:16:43.229 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T16:16:43.229 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 12,
2026-03-09T16:16:43.229 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T16:16:43.229 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T16:16:43.229 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T16:16:43.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.234+0000 7f9e83fff640 1 -- 192.168.123.103:0/2688822661 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9e88077560 msgr2=0x7f9e88079a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:16:43.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.234+0000 7f9e83fff640 1 --2- 192.168.123.103:0/2688822661 >>
[v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9e88077560 0x7f9e88079a20 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f9eb40714c0 tx=0x7f9e9c008040 comp rx=0 tx=0).stop 2026-03-09T16:16:43.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.234+0000 7f9e83fff640 1 -- 192.168.123.103:0/2688822661 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9eb40720b0 msgr2=0x7f9eb41ad420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:43.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.234+0000 7f9e83fff640 1 --2- 192.168.123.103:0/2688822661 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9eb40720b0 0x7f9eb41ad420 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f9ea4009cd0 tx=0x7f9ea4009510 comp rx=0 tx=0).stop 2026-03-09T16:16:43.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.238+0000 7f9e83fff640 1 -- 192.168.123.103:0/2688822661 shutdown_connections 2026-03-09T16:16:43.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.238+0000 7f9e83fff640 1 --2- 192.168.123.103:0/2688822661 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f9e88077560 0x7f9e88079a20 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.238+0000 7f9e83fff640 1 --2- 192.168.123.103:0/2688822661 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9eb40729d0 0x7f9eb41ad960 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.238+0000 7f9e83fff640 1 --2- 192.168.123.103:0/2688822661 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9eb40720b0 0x7f9eb41ad420 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.238+0000 7f9e83fff640 1 -- 192.168.123.103:0/2688822661 >> 192.168.123.103:0/2688822661 conn(0x7f9eb406c7e0 msgr2=0x7f9eb4070480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:43.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.238+0000 7f9e83fff640 1 -- 192.168.123.103:0/2688822661 shutdown_connections 2026-03-09T16:16:43.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.238+0000 7f9e83fff640 1 -- 192.168.123.103:0/2688822661 wait complete. 
2026-03-09T16:16:43.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.351+0000 7f6d13fff640 1 -- 192.168.123.103:0/3240388052 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d14072ad0 msgr2=0x7f6d1410b9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:43.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.351+0000 7f6d13fff640 1 --2- 192.168.123.103:0/3240388052 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d14072ad0 0x7f6d1410b9a0 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f6d0c009040 tx=0x7f6d0c02fc10 comp rx=0 tx=0).stop 2026-03-09T16:16:43.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.352+0000 7f6d13fff640 1 -- 192.168.123.103:0/3240388052 shutdown_connections 2026-03-09T16:16:43.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.352+0000 7f6d13fff640 1 --2- 192.168.123.103:0/3240388052 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d14072ad0 0x7f6d1410b9a0 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.352+0000 7f6d13fff640 1 --2- 192.168.123.103:0/3240388052 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d14072120 0x7f6d14072500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.352+0000 7f6d13fff640 1 -- 192.168.123.103:0/3240388052 >> 192.168.123.103:0/3240388052 conn(0x7f6d1406c7d0 msgr2=0x7f6d1406cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.352+0000 7f6d13fff640 1 -- 192.168.123.103:0/3240388052 shutdown_connections 2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.352+0000 7f6d13fff640 1 -- 192.168.123.103:0/3240388052 wait complete. 
2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.352+0000 7f6d13fff640 1 Processor -- start 2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.352+0000 7f6d13fff640 1 -- start start 2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.353+0000 7f6d13fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d14072120 0x7f6d1407d580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.353+0000 7f6d13fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d1407dac0 0x7f6d1407df20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.353+0000 7f6d13fff640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6d14084530 con 0x7f6d1407dac0 2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.353+0000 7f6d13fff640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6d14082040 con 0x7f6d14072120 2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.353+0000 7f6d12ffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d14072120 0x7f6d1407d580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.353+0000 7f6d12ffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d14072120 0x7f6d1407d580 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:50230/0 (socket says 192.168.123.103:50230) 2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.353+0000 7f6d12ffd640 1 -- 192.168.123.103:0/1721380195 learned_addr learned my addr 192.168.123.103:0/1721380195 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.353+0000 7f6d12ffd640 1 -- 192.168.123.103:0/1721380195 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d1407dac0 msgr2=0x7f6d1407df20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.353+0000 7f6d12ffd640 1 --2- 192.168.123.103:0/1721380195 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d1407dac0 0x7f6d1407df20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.353+0000 7f6d12ffd640 1 -- 192.168.123.103:0/1721380195 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6d0c008cf0 con 0x7f6d14072120 2026-03-09T16:16:43.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.353+0000 7f6d12ffd640 1 --2- 192.168.123.103:0/1721380195 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d14072120 0x7f6d1407d580 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f6d040079d0 tx=0x7f6d04007ea0 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:43.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.353+0000 7f6cf3fff640 1 -- 192.168.123.103:0/1721380195 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6d04010070 con 0x7f6d14072120 2026-03-09T16:16:43.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.354+0000 7f6d13fff640 1 -- 192.168.123.103:0/1721380195 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6d14082320 con 0x7f6d14072120 2026-03-09T16:16:43.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.354+0000 7f6cf3fff640 1 -- 192.168.123.103:0/1721380195 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6d0400ad90 con 0x7f6d14072120 2026-03-09T16:16:43.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.354+0000 7f6cf3fff640 1 -- 192.168.123.103:0/1721380195 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6d040153d0 con 0x7f6d14072120 2026-03-09T16:16:43.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.354+0000 7f6d13fff640 1 -- 192.168.123.103:0/1721380195 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6d14082870 con 0x7f6d14072120 2026-03-09T16:16:43.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.355+0000 7f6cf3fff640 1 -- 192.168.123.103:0/1721380195 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7f6d0401d050 con 0x7f6d14072120 2026-03-09T16:16:43.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.355+0000 7f6cf3fff640 1 --2- 192.168.123.103:0/1721380195 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f6d00077560 0x7f6d00079a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:43.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.356+0000 7f6d127fc640 1 --2- 192.168.123.103:0/1721380195 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f6d00077560 0x7f6d00079a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:43.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.356+0000 7f6cf3fff640 1 -- 192.168.123.103:0/1721380195 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f6d04099bd0 con 0x7f6d14072120 2026-03-09T16:16:43.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.364+0000 7f6d127fc640 1 --2- 192.168.123.103:0/1721380195 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f6d00077560 0x7f6d00079a20 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f6d1410db60 tx=0x7f6d0c004b40 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:43.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.364+0000 7f6d13fff640 1 -- 192.168.123.103:0/1721380195 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6ce0005350 con 0x7f6d14072120 2026-03-09T16:16:43.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.378+0000 7f6cf3fff640 1 -- 192.168.123.103:0/1721380195 <== mon.1 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f6d04062780 con 0x7f6d14072120 2026-03-09T16:16:43.537 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:43 vm03.local ceph-mon[51019]: from='client.14732 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:43.537 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:43 vm03.local ceph-mon[51019]: from='client.14736 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:43.537 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:43 vm03.local ceph-mon[51019]: from='client.14740 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:43.537 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:43 vm03.local ceph-mon[51019]: pgmap v61: 65 pgs: 65 active+clean; 310 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.4 MiB/s wr, 340 op/s 2026-03-09T16:16:43.537 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:43 vm03.local ceph-mon[51019]: from='client.? 192.168.123.103:0/2688822661' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:43.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.612+0000 7f6d13fff640 1 -- 192.168.123.103:0/1721380195 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f6ce0005e10 con 0x7f6d14072120 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:e12 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:epoch 12 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T16:12:12.560035+0000 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T16:12:21.661284+0000 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-09T16:16:43.617 
INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 41 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:in 0 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14476} 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kygyjl{0:14476} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons: 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kntrco{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.sqhria{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.jgzfvu{-1:24291} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6824/1621230713,v1:192.168.123.105:6825/1621230713] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:16:43.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.614+0000 7f6cf3fff640 1 -- 192.168.123.103:0/1721380195 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1849 (secure 0 0 0) 0x7f6d04061ed0 con 0x7f6d14072120 2026-03-09T16:16:43.618 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 12 2026-03-09T16:16:43.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.617+0000 7f6cf27fc640 1 -- 192.168.123.103:0/1721380195 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f6d00077560 msgr2=0x7f6d00079a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:43.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.617+0000 7f6cf27fc640 1 --2- 192.168.123.103:0/1721380195 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] 
conn(0x7f6d00077560 0x7f6d00079a20 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f6d1410db60 tx=0x7f6d0c004b40 comp rx=0 tx=0).stop 2026-03-09T16:16:43.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.617+0000 7f6cf27fc640 1 -- 192.168.123.103:0/1721380195 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d14072120 msgr2=0x7f6d1407d580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:43.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.617+0000 7f6cf27fc640 1 --2- 192.168.123.103:0/1721380195 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d14072120 0x7f6d1407d580 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f6d040079d0 tx=0x7f6d04007ea0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.617+0000 7f6cf27fc640 1 -- 192.168.123.103:0/1721380195 shutdown_connections 2026-03-09T16:16:43.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.617+0000 7f6cf27fc640 1 --2- 192.168.123.103:0/1721380195 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f6d00077560 0x7f6d00079a20 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.617+0000 7f6cf27fc640 1 --2- 192.168.123.103:0/1721380195 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6d1407dac0 0x7f6d1407df20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.617+0000 7f6cf27fc640 1 --2- 192.168.123.103:0/1721380195 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d14072120 0x7f6d1407d580 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.617+0000 7f6cf27fc640 1 -- 192.168.123.103:0/1721380195 >> 192.168.123.103:0/1721380195 conn(0x7f6d1406c7d0 msgr2=0x7f6d1407b5b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:43.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.617+0000 7f6cf27fc640 1 -- 192.168.123.103:0/1721380195 shutdown_connections 2026-03-09T16:16:43.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.619+0000 7f6cf27fc640 1 -- 192.168.123.103:0/1721380195 wait complete. 
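The `fs dump` output above is the pre-upgrade fsmap: epoch 12, max_mds 1, inline_data enabled, one active rank (mds.cephfs.vm03.kygyjl) and three standbys pinned to join_fscid 1. A minimal sketch of the same check against the JSON form of the dump, assuming the `ceph` CLI and an admin keyring are reachable from the host; the field names ("filesystems", "mdsmap", "standbys") are assumptions based on the usual fs dump JSON layout and should be verified against the running release:

    # Sketch only: read `ceph fs dump` as JSON and assert the fsmap state
    # that the plain-text dump above shows. Field names are assumptions.
    import json
    import subprocess

    dump = json.loads(subprocess.check_output(
        ["ceph", "fs", "dump", "--format", "json"]))

    mdsmap = next(f["mdsmap"] for f in dump["filesystems"]
                  if f["mdsmap"]["fs_name"] == "cephfs")

    assert mdsmap["max_mds"] == 1                  # 1-ranks layout in this job
    assert len(mdsmap["in"]) == 1                  # exactly one active rank
    assert len(dump.get("standbys", [])) >= 1      # standbys available during the upgrade
    print("epoch", dump.get("epoch"), "active", mdsmap.get("up"))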
2026-03-09T16:16:43.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.704+0000 7f03ff577640 1 -- 192.168.123.103:0/1875065444 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0400072ad0 msgr2=0x7f040010b9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:43.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.704+0000 7f03ff577640 1 --2- 192.168.123.103:0/1875065444 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0400072ad0 0x7f040010b9a0 secure :-1 s=READY pgs=325 cs=0 l=1 rev1=1 crypto rx=0x7f03f8009040 tx=0x7f03f8031a90 comp rx=0 tx=0).stop 2026-03-09T16:16:43.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.705+0000 7f03ff577640 1 -- 192.168.123.103:0/1875065444 shutdown_connections 2026-03-09T16:16:43.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.705+0000 7f03ff577640 1 --2- 192.168.123.103:0/1875065444 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0400072ad0 0x7f040010b9a0 unknown :-1 s=CLOSED pgs=325 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.705+0000 7f03ff577640 1 --2- 192.168.123.103:0/1875065444 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0400072120 0x7f0400072500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.705+0000 7f03ff577640 1 -- 192.168.123.103:0/1875065444 >> 192.168.123.103:0/1875065444 conn(0x7f040006c7d0 msgr2=0x7f040006cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:43.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.705+0000 7f03ff577640 1 -- 192.168.123.103:0/1875065444 shutdown_connections 2026-03-09T16:16:43.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.705+0000 7f03ff577640 1 -- 192.168.123.103:0/1875065444 wait complete. 
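The block above — mark_down on each monitor connection, shutdown_connections, wait complete — is the RADOS client messenger teardown that every short-lived `ceph` CLI invocation in this log performs on exit; the matching Processor/connect/banner/hello/ready lines are the next invocation's session setup. When reading a log like this, filtering that chatter down to command dispatches and acks makes the upgrade sequence much easier to follow. A rough filter sketch; the regexes are assumptions tuned to the line shapes visible here, not an official teuthology tool:

    # Sketch only: keep command traffic (mon/mgr commands, acks, journalctl
    # cmd dispatches, cephadm "Upgrade:" progress) and drop the
    # per-invocation messenger lifecycle noise.
    import re
    import sys

    KEEP = re.compile(r"mon_command|mgr_command|cmd=\[|Upgrade:")
    DROP = re.compile(r"mark_down|shutdown_connections|wait complete|"
                      r"_handle_peer_banner|learned_addr|mon_subscribe")

    for line in sys.stdin:
        if KEEP.search(line) and not DROP.search(line):
            sys.stdout.write(line)

Piped over the teuthology log, this keeps the orch upgrade status / fs dump / health calls and the monitors' Upgrade: progress messages while dropping the connection lifecycle lines.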
2026-03-09T16:16:43.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.705+0000 7f03ff577640 1 Processor -- start 2026-03-09T16:16:43.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.705+0000 7f03ff577640 1 -- start start 2026-03-09T16:16:43.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.705+0000 7f03ff577640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0400072120 0x7f040007d4c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:43.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.705+0000 7f03ff577640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f040007da00 0x7f040007de60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:43.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.705+0000 7f03ff577640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0400084470 con 0x7f0400072120 2026-03-09T16:16:43.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.705+0000 7f03ff577640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f04000845e0 con 0x7f040007da00 2026-03-09T16:16:43.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.706+0000 7f03fdd74640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f040007da00 0x7f040007de60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:43.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.706+0000 7f03fdd74640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f040007da00 0x7f040007de60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:50236/0 (socket says 192.168.123.103:50236) 2026-03-09T16:16:43.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.706+0000 7f03fdd74640 1 -- 192.168.123.103:0/2281822841 learned_addr learned my addr 192.168.123.103:0/2281822841 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:43.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.707+0000 7f03fdd74640 1 -- 192.168.123.103:0/2281822841 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0400072120 msgr2=0x7f040007d4c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:43.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.707+0000 7f03fdd74640 1 --2- 192.168.123.103:0/2281822841 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0400072120 0x7f040007d4c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.707+0000 7f03fdd74640 1 -- 192.168.123.103:0/2281822841 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f03f8008cf0 con 0x7f040007da00 2026-03-09T16:16:43.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.707+0000 7f03fdd74640 1 --2- 192.168.123.103:0/2281822841 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f040007da00 0x7f040007de60 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f03f8009170 tx=0x7f03f8004060 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:43.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.707+0000 7f03ef7fe640 1 -- 192.168.123.103:0/2281822841 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f03f8007ce0 con 0x7f040007da00 2026-03-09T16:16:43.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.707+0000 7f03ff577640 1 -- 192.168.123.103:0/2281822841 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0400081fe0 con 0x7f040007da00 2026-03-09T16:16:43.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.707+0000 7f03ff577640 1 -- 192.168.123.103:0/2281822841 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0400082530 con 0x7f040007da00 2026-03-09T16:16:43.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.710+0000 7f03ef7fe640 1 -- 192.168.123.103:0/2281822841 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f03f8007e40 con 0x7f040007da00 2026-03-09T16:16:43.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.710+0000 7f03ef7fe640 1 -- 192.168.123.103:0/2281822841 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f03f8008240 con 0x7f040007da00 2026-03-09T16:16:43.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.710+0000 7f03ef7fe640 1 -- 192.168.123.103:0/2281822841 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7f03f803e070 con 0x7f040007da00 2026-03-09T16:16:43.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.711+0000 7f03ef7fe640 1 --2- 192.168.123.103:0/2281822841 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f03e4077700 0x7f03e4079bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:43.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.713+0000 7f03ef7fe640 1 -- 192.168.123.103:0/2281822841 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f03f8033040 con 0x7f040007da00 2026-03-09T16:16:43.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.713+0000 7f03ff577640 1 -- 192.168.123.103:0/2281822841 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f03cc005350 con 0x7f040007da00 2026-03-09T16:16:43.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.714+0000 7f03fe575640 1 --2- 192.168.123.103:0/2281822841 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f03e4077700 0x7f03e4079bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:43.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.716+0000 7f03fe575640 1 --2- 192.168.123.103:0/2281822841 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f03e4077700 0x7f03e4079bc0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f03f0005fd0 tx=0x7f03f0005950 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:43.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.718+0000 7f03ef7fe640 1 -- 192.168.123.103:0/2281822841 <== mon.1 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f03f8092690 con 0x7f040007da00 2026-03-09T16:16:43.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:43 vm05.local ceph-mon[58702]: from='client.14732 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:43.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:43 vm05.local ceph-mon[58702]: from='client.14736 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:43.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:43 vm05.local ceph-mon[58702]: from='client.14740 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:43.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:43 vm05.local ceph-mon[58702]: pgmap v61: 65 pgs: 65 active+clean; 310 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.4 MiB/s wr, 340 op/s 2026-03-09T16:16:43.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:43 vm05.local ceph-mon[58702]: from='client.? 192.168.123.103:0/2688822661' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:43.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.904+0000 7f03ff577640 1 -- 192.168.123.103:0/2281822841 --> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f03cc002bf0 con 0x7f03e4077700 2026-03-09T16:16:43.913 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:16:43.913 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T16:16:43.913 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T16:16:43.913 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T16:16:43.913 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-09T16:16:43.913 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-09T16:16:43.913 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-09T16:16:43.913 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "2/23 daemons upgraded", 2026-03-09T16:16:43.913 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading mon daemons", 2026-03-09T16:16:43.913 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T16:16:43.913 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:16:43.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.910+0000 7f03ef7fe640 1 -- 192.168.123.103:0/2281822841 <== mgr.24403 v2:192.168.123.103:6800/808004487 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+368 (secure 0 0 0) 0x7f03cc002bf0 con 0x7f03e4077700 2026-03-09T16:16:43.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.917+0000 7f03ed7fa640 1 -- 192.168.123.103:0/2281822841 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f03e4077700 msgr2=0x7f03e4079bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:43.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.917+0000 7f03ed7fa640 1 --2- 192.168.123.103:0/2281822841 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f03e4077700 
0x7f03e4079bc0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f03f0005fd0 tx=0x7f03f0005950 comp rx=0 tx=0).stop 2026-03-09T16:16:43.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.917+0000 7f03ed7fa640 1 -- 192.168.123.103:0/2281822841 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f040007da00 msgr2=0x7f040007de60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:43.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.917+0000 7f03ed7fa640 1 --2- 192.168.123.103:0/2281822841 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f040007da00 0x7f040007de60 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f03f8009170 tx=0x7f03f8004060 comp rx=0 tx=0).stop 2026-03-09T16:16:43.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.918+0000 7f03ed7fa640 1 -- 192.168.123.103:0/2281822841 shutdown_connections 2026-03-09T16:16:43.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.918+0000 7f03ed7fa640 1 --2- 192.168.123.103:0/2281822841 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f03e4077700 0x7f03e4079bc0 secure :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f03f0005fd0 tx=0x7f03f0005950 comp rx=0 tx=0).stop 2026-03-09T16:16:43.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.918+0000 7f03ed7fa640 1 --2- 192.168.123.103:0/2281822841 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f040007da00 0x7f040007de60 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.918+0000 7f03ed7fa640 1 --2- 192.168.123.103:0/2281822841 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0400072120 0x7f040007d4c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:43.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.918+0000 7f03ed7fa640 1 -- 192.168.123.103:0/2281822841 >> 192.168.123.103:0/2281822841 conn(0x7f040006c7d0 msgr2=0x7f040006fd30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:43.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.918+0000 7f03ed7fa640 1 -- 192.168.123.103:0/2281822841 shutdown_connections 2026-03-09T16:16:43.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:43.918+0000 7f03ed7fa640 1 -- 192.168.123.103:0/2281822841 wait complete. 
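The status JSON above ("2/23 daemons upgraded", "Currently upgrading mon daemons", services_complete: ["mgr"]) is what the repeated `orch upgrade status` calls in this log return while cephadm works through the daemon types in order. A small polling sketch of the same check, assuming the `ceph` CLI is on PATH; the key names are taken directly from the output shown above:

    # Sketch only: poll the upgrade status the way the test's repeated
    # status calls do, stopping once the orchestrator reports the upgrade
    # is no longer in progress. Keys mirror the JSON printed above.
    import json
    import subprocess
    import time

    def upgrade_status():
        out = subprocess.check_output(["ceph", "orch", "upgrade", "status"])
        return json.loads(out)

    while True:
        st = upgrade_status()
        print(st.get("progress"), "-", st.get("message"))
        if not st.get("in_progress"):
            break
        time.sleep(30)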
2026-03-09T16:16:44.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.073+0000 7f67e52a2640 1 -- 192.168.123.103:0/782148682 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f67e0072a40 msgr2=0x7f67e010ca90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:44.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.073+0000 7f67e52a2640 1 --2- 192.168.123.103:0/782148682 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f67e0072a40 0x7f67e010ca90 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f67d40099b0 tx=0x7f67d402f220 comp rx=0 tx=0).stop 2026-03-09T16:16:44.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.073+0000 7f67e52a2640 1 -- 192.168.123.103:0/782148682 shutdown_connections 2026-03-09T16:16:44.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.073+0000 7f67e52a2640 1 --2- 192.168.123.103:0/782148682 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f67e0072a40 0x7f67e010ca90 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:44.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.073+0000 7f67e52a2640 1 --2- 192.168.123.103:0/782148682 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67e0072120 0x7f67e0072500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:44.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.073+0000 7f67e52a2640 1 -- 192.168.123.103:0/782148682 >> 192.168.123.103:0/782148682 conn(0x7f67e006c7d0 msgr2=0x7f67e006cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:44.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.073+0000 7f67e52a2640 1 -- 192.168.123.103:0/782148682 shutdown_connections 2026-03-09T16:16:44.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.075+0000 7f67e52a2640 1 -- 192.168.123.103:0/782148682 wait complete. 
2026-03-09T16:16:44.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.076+0000 7f67e52a2640 1 Processor -- start 2026-03-09T16:16:44.077 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.077+0000 7f67e52a2640 1 -- start start 2026-03-09T16:16:44.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.077+0000 7f67e52a2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67e0072120 0x7f67e011a2a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:44.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.077+0000 7f67e52a2640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f67e0072a40 0x7f67e01162d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:44.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.077+0000 7f67e52a2640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f67e01168a0 con 0x7f67e0072120 2026-03-09T16:16:44.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.077+0000 7f67e52a2640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f67e0116a10 con 0x7f67e0072a40 2026-03-09T16:16:44.082 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.081+0000 7f67dffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67e0072120 0x7f67e011a2a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:44.082 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.081+0000 7f67dffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67e0072120 0x7f67e011a2a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58850/0 (socket says 192.168.123.103:58850) 2026-03-09T16:16:44.082 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.081+0000 7f67dffff640 1 -- 192.168.123.103:0/206089558 learned_addr learned my addr 192.168.123.103:0/206089558 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:16:44.082 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.081+0000 7f67df7fe640 1 --2- 192.168.123.103:0/206089558 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f67e0072a40 0x7f67e01162d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:44.083 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.082+0000 7f67df7fe640 1 -- 192.168.123.103:0/206089558 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67e0072120 msgr2=0x7f67e011a2a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:44.083 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.082+0000 7f67df7fe640 1 --2- 192.168.123.103:0/206089558 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67e0072120 0x7f67e011a2a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:44.083 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.082+0000 7f67df7fe640 1 -- 192.168.123.103:0/206089558 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f67d4009660 con 
0x7f67e0072a40 2026-03-09T16:16:44.083 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.082+0000 7f67df7fe640 1 --2- 192.168.123.103:0/206089558 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f67e0072a40 0x7f67e01162d0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f67d40099b0 tx=0x7f67d4004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:44.083 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.083+0000 7f67dd7fa640 1 -- 192.168.123.103:0/206089558 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f67d403d070 con 0x7f67e0072a40 2026-03-09T16:16:44.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.083+0000 7f67e52a2640 1 -- 192.168.123.103:0/206089558 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f67e0116c90 con 0x7f67e0072a40 2026-03-09T16:16:44.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.083+0000 7f67e52a2640 1 -- 192.168.123.103:0/206089558 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f67e01ba470 con 0x7f67e0072a40 2026-03-09T16:16:44.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.084+0000 7f67dd7fa640 1 -- 192.168.123.103:0/206089558 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f67d4038520 con 0x7f67e0072a40 2026-03-09T16:16:44.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.084+0000 7f67dd7fa640 1 -- 192.168.123.103:0/206089558 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f67d4041620 con 0x7f67e0072a40 2026-03-09T16:16:44.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.085+0000 7f67e52a2640 1 -- 192.168.123.103:0/206089558 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f67ac005350 con 0x7f67e0072a40 2026-03-09T16:16:44.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.086+0000 7f67dd7fa640 1 -- 192.168.123.103:0/206089558 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99976+0+0 (secure 0 0 0) 0x7f67d404b430 con 0x7f67e0072a40 2026-03-09T16:16:44.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.086+0000 7f67dd7fa640 1 --2- 192.168.123.103:0/206089558 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f67b0077560 0x7f67b0079a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:16:44.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.086+0000 7f67dffff640 1 --2- 192.168.123.103:0/206089558 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f67b0077560 0x7f67b0079a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:16:44.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.087+0000 7f67dd7fa640 1 -- 192.168.123.103:0/206089558 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f67d40bdce0 con 0x7f67e0072a40 2026-03-09T16:16:44.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.087+0000 7f67dffff640 1 --2- 192.168.123.103:0/206089558 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] 
conn(0x7f67b0077560 0x7f67b0079a20 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f67d0009c10 tx=0x7f67d000b040 comp rx=0 tx=0).ready entity=mgr.24403 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:16:44.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.089+0000 7f67dd7fa640 1 -- 192.168.123.103:0/206089558 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f67d4086940 con 0x7f67e0072a40 2026-03-09T16:16:44.372 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.371+0000 7f67e52a2640 1 -- 192.168.123.103:0/206089558 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f67ac0051c0 con 0x7f67e0072a40 2026-03-09T16:16:44.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.372+0000 7f67dd7fa640 1 -- 192.168.123.103:0/206089558 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7f67d4086090 con 0x7f67e0072a40 2026-03-09T16:16:44.377 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T16:16:44.377 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T16:16:44.377 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled. 2026-03-09T16:16:44.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.381+0000 7f67e52a2640 1 -- 192.168.123.103:0/206089558 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f67b0077560 msgr2=0x7f67b0079a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:44.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.381+0000 7f67e52a2640 1 --2- 192.168.123.103:0/206089558 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f67b0077560 0x7f67b0079a20 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f67d0009c10 tx=0x7f67d000b040 comp rx=0 tx=0).stop 2026-03-09T16:16:44.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.381+0000 7f67e52a2640 1 -- 192.168.123.103:0/206089558 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f67e0072a40 msgr2=0x7f67e01162d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:16:44.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.381+0000 7f67e52a2640 1 --2- 192.168.123.103:0/206089558 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f67e0072a40 0x7f67e01162d0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f67d40099b0 tx=0x7f67d4004290 comp rx=0 tx=0).stop 2026-03-09T16:16:44.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.383+0000 7f67e52a2640 1 -- 192.168.123.103:0/206089558 shutdown_connections 2026-03-09T16:16:44.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.383+0000 7f67e52a2640 1 --2- 192.168.123.103:0/206089558 >> [v2:192.168.123.103:6800/808004487,v1:192.168.123.103:6801/808004487] conn(0x7f67b0077560 0x7f67b0079a20 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:44.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.383+0000 7f67e52a2640 1 --2- 192.168.123.103:0/206089558 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f67e0072a40 0x7f67e01162d0 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:44.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.383+0000 7f67e52a2640 1 --2- 192.168.123.103:0/206089558 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67e0072120 0x7f67e011a2a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:16:44.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.383+0000 7f67e52a2640 1 -- 192.168.123.103:0/206089558 >> 192.168.123.103:0/206089558 conn(0x7f67e006c7d0 msgr2=0x7f67e0070300 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:16:44.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.383+0000 7f67e52a2640 1 -- 192.168.123.103:0/206089558 shutdown_connections 2026-03-09T16:16:44.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:16:44.384+0000 7f67e52a2640 1 -- 192.168.123.103:0/206089558 wait complete. 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: Upgrade: Target is version 19.2.3-678-ge911bdeb (squid) 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: Upgrade: Setting container_image for all mgr 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: from='client.? 
192.168.123.103:0/1721380195' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: from='client.24547 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:16:44.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local ceph-mon[51019]: from='client.? 192.168.123.103:0/206089558' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:16:44.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:44 vm03.local systemd[1]: Stopping Ceph mon.vm03 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 2026-03-09T16:16:45.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:45.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: Upgrade: Target is version 19.2.3-678-ge911bdeb (squid) 2026-03-09T16:16:45.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-09T16:16:45.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:45.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:16:45.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: Upgrade: Setting container_image for all mgr 2026-03-09T16:16:45.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:45.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: from='client.? 
192.168.123.103:0/1721380195' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:16:45.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-09T16:16:45.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: from='client.24547 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:16:45.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:45.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T16:16:45.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T16:16:45.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:16:45.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:44 vm05.local ceph-mon[58702]: from='client.? 192.168.123.103:0/206089558' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:16:45.262 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03[51015]: 2026-03-09T16:16:45.024+0000 7f3a601f4640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm03 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:16:45.263 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03[51015]: 2026-03-09T16:16:45.024+0000 7f3a601f4640 -1 mon.vm03@0(leader) e2 *** Got Signal Terminated *** 2026-03-09T16:16:45.263 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local podman[133839]: 2026-03-09 16:16:45.122300045 +0000 UTC m=+0.161308841 container died b86752d320b61b3ceca5109a3888bfe85ef5a66fbb23f1bd16a00fa292da0bd4 (image=quay.ceph.io/ceph-ci/ceph:reef, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.label-schema.vendor=CentOS) 2026-03-09T16:16:45.263 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 
vm03.local podman[133839]: 2026-03-09 16:16:45.196606659 +0000 UTC m=+0.235615446 container remove b86752d320b61b3ceca5109a3888bfe85ef5a66fbb23f1bd16a00fa292da0bd4 (image=quay.ceph.io/ceph-ci/ceph:reef, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:16:45.263 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local bash[133839]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03 2026-03-09T16:16:45.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm03.service: Deactivated successfully. 2026-03-09T16:16:45.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local systemd[1]: Stopped Ceph mon.vm03 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 2026-03-09T16:16:45.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm03.service: Consumed 6.631s CPU time. 2026-03-09T16:16:45.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local systemd[1]: Starting Ceph mon.vm03 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local podman[133959]: 2026-03-09 16:16:45.681338669 +0000 UTC m=+0.033330246 container create f90a2e8dc7517dba6ec5d9aa63bf5b5f5eeaee7d63077f32b50d4ceada9cd908 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local podman[133959]: 2026-03-09 16:16:45.73076332 +0000 UTC m=+0.082754907 container init f90a2e8dc7517dba6ec5d9aa63bf5b5f5eeaee7d63077f32b50d4ceada9cd908 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS) 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local podman[133959]: 2026-03-09 16:16:45.734919849 +0000 UTC m=+0.086911426 container start f90a2e8dc7517dba6ec5d9aa63bf5b5f5eeaee7d63077f32b50d4ceada9cd908 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid) 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local bash[133959]: f90a2e8dc7517dba6ec5d9aa63bf5b5f5eeaee7d63077f32b50d4ceada9cd908 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local podman[133959]: 2026-03-09 16:16:45.668239591 +0000 UTC m=+0.020231168 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local systemd[1]: Started Ceph mon.vm03 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 
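The journalctl sequence above is cephadm upgrading mon.vm03 in place: systemd stops the unit, podman removes the reef container (CEPH_REF=reef, CEPH_SHA1=ab47f43c...), and a new container is created and started from the squid target digest (CEPH_SHA1=e911bdeb...), after which ceph-mon 19.2.3-678-ge911bdeb comes up. One way to confirm, per daemon, which image is actually running after such a swap is to compare the digests reported by `ceph orch ps`; a sketch, with the field name container_image_digests assumed from typical cephadm JSON output:

    # Sketch only: list daemons whose running image digest does not yet
    # match the upgrade target. "container_image_digests" is assumed to be
    # present in `ceph orch ps --format json`; verify on the target release.
    import json
    import subprocess

    TARGET = ("quay.ceph.io/ceph-ci/ceph@sha256:"
              "8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc")

    daemons = json.loads(subprocess.check_output(
        ["ceph", "orch", "ps", "--format", "json"]))

    for d in daemons:
        name = d.get("daemon_name") or f"{d.get('daemon_type')}.{d.get('daemon_id')}"
        if TARGET not in (d.get("container_image_digests") or []):
            print("pending:", name, d.get("version"))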
2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: set uid:gid to 167:167 (ceph:ceph) 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: pidfile_write: ignore empty --pid-file 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: load: jerasure load: lrc 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: RocksDB version: 7.9.2 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Git sha 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Compile date 2026-02-25 18:11:04 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: DB SUMMARY 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: DB Session ID: N1JEPQKYL89UN93XIW2D 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: CURRENT file: CURRENT 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: IDENTITY file: IDENTITY 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: MANIFEST file: MANIFEST-000015 size: 764 Bytes 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm03/store.db dir, Total Num: 1, files: 000023.sst 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm03/store.db: 000021.log size: 3414747 ; 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.error_if_exists: 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.create_if_missing: 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.paranoid_checks: 1 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.env: 0x55de0de73dc0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.fs: PosixFileSystem 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 
vm03.local ceph-mon[133973]: rocksdb: Options.info_log: 0x55de0fc9ed40 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_file_opening_threads: 16 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.statistics: (nil) 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.use_fsync: 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_log_file_size: 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.keep_log_file_num: 1000 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.recycle_log_file_num: 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.allow_fallocate: 1 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.allow_mmap_reads: 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.allow_mmap_writes: 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.use_direct_reads: 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.create_missing_column_families: 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.db_log_dir: 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.wal_dir: 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-09T16:16:46.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: 
Options.advise_random_on_open: 1 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.db_write_buffer_size: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.write_buffer_manager: 0x55de0fca3900 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.rate_limiter: (nil) 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.wal_recovery_mode: 2 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.enable_thread_tracking: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.enable_pipelined_write: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.unordered_write: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.row_cache: None 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.wal_filter: None 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.avoid_flush_during_recovery: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.allow_ingest_behind: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.two_write_queues: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.manual_wal_flush: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.wal_compression: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: 
rocksdb: Options.atomic_flush: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.log_readahead_size: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.best_efforts_recovery: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.allow_data_in_errors: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.db_host_id: __hostname__ 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_background_jobs: 2 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_background_compactions: -1 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_subcompactions: 1 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_total_wal_size: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-09T16:16:46.143 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_open_files: -1 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.bytes_per_sync: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compaction_readahead_size: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_background_flushes: -1 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Compression algorithms supported: 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: kZSTD supported: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: kXpressCompression supported: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: kBZip2Compression supported: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: kLZ4Compression supported: 1 2026-03-09T16:16:46.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: kZlibCompression supported: 1 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: kLZ4HCCompression supported: 1 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: kSnappyCompression supported: 1 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm03/store.db/MANIFEST-000015 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.merge_operator: 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compaction_filter: None 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local 
ceph-mon[133973]: rocksdb: Options.compaction_filter_factory: None 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.sst_partitioner_factory: None 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55de0fc9e660) 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: cache_index_and_filter_blocks: 1 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: pin_top_level_index_and_filter: 1 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_type: 0 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: data_block_index_type: 0 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_shortening: 1 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: checksum: 4 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: no_block_cache: 0 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache: 0x55de0fcc3350 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_name: BinnedLRUCache 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_options: 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: capacity : 536870912 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: num_shard_bits : 4 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: strict_capacity_limit : 0 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: high_pri_pool_ratio: 0.000 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_compressed: (nil) 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: persistent_cache: (nil) 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_size: 4096 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_size_deviation: 10 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_restart_interval: 16 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_block_restart_interval: 1 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: metadata_block_size: 4096 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: partition_filters: 0 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: use_delta_encoding: 1 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: filter_policy: bloomfilter 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: whole_key_filtering: 1 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: verify_compression: 0 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 
read_amp_bytes_per_bit: 0 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: format_version: 5 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: enable_index_compression: 1 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_align: 0 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: max_auto_readahead_size: 262144 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: prepopulate_block_cache: 0 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: initial_auto_readahead_size: 8192 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout: num_file_reads_for_auto_readahead: 2 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.write_buffer_size: 33554432 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_write_buffer_number: 2 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compression: NoCompression 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.bottommost_compression: Disabled 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.prefix_extractor: nullptr 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.num_levels: 7 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-09T16:16:46.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-09T16:16:46.145 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compression_opts.level: 32767 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compression_opts.strategy: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compression_opts.enabled: false 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.target_file_size_base: 67108864 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 
vm03.local ceph-mon[133973]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.arena_block_size: 1048576 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.disable_auto_compactions: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 
1073741824 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.inplace_update_support: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.bloom_locality: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.max_successive_merges: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.paranoid_file_checks: 0 2026-03-09T16:16:46.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.force_consistency_checks: 1 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.report_bg_io_stats: 0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.ttl: 2592000 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.enable_blob_files: false 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.min_blob_size: 0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.blob_file_size: 268435456 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: 
Options.enable_blob_garbage_collection: false 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.blob_file_starting_level: 0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm03/store.db/MANIFEST-000015 succeeded,manifest_file_number is 15, next_file_number is 25, last_sequence is 7506, log_number is 21,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 21 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 21 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 0b715cc5-6ec8-4c80-a4cb-77ec8267e53d 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773073005788588, "job": 1, "event": "recovery_started", "wal_files": [21]} 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #21 mode 2 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773073005803940, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 26, "file_size": 2913511, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7507, "largest_seqno": 9416, "table_properties": {"data_size": 2904183, "index_size": 6041, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2245, "raw_key_size": 20611, "raw_average_key_size": 23, "raw_value_size": 2885203, "raw_average_value_size": 3312, "num_data_blocks": 286, "num_entries": 871, "num_filter_entries": 871, "num_deletions": 8, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; 
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773073005, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0b715cc5-6ec8-4c80-a4cb-77ec8267e53d", "db_session_id": "N1JEPQKYL89UN93XIW2D", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773073005804051, "job": 1, "event": "recovery_finished"} 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: [db/version_set.cc:5047] Creating manifest 28 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm03/store.db/000021.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55de0fcc4e00 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: DB pointer 0x55de0fdd0000 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ** DB Stats ** 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ** Compaction Stats [default] ** 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: L0 1/0 2.78 MB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 278.0 0.01 0.00 1 0.010 0 0 0.0 0.0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: L6 1/0 7.51 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Sum 2/0 10.29 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 278.0 0.01 0.00 1 0.010 0 0 0.0 0.0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 278.0 0.01 0.00 1 0.010 0 0 0.0 0.0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ** Compaction Stats [default] ** 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 278.0 0.01 0.00 1 0.010 0 0 0.0 0.0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Flush(GB): cumulative 0.003, interval 0.003 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Cumulative compaction: 0.00 GB write, 113.38 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T16:16:46.146 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Interval compaction: 0.00 GB write, 113.38 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Block cache BinnedLRUCache@0x55de0fcc3350#2 capacity: 512.00 MB usage: 32.47 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1e-05 secs_since: 0 2026-03-09T16:16:46.147 
INFO:journalctl@ceph.mon.vm03.vm03.stdout: Block cache entry stats(count,size,portion): FilterBlock(2,10.31 KB,0.00196695%) IndexBlock(2,22.16 KB,0.00422597%) Misc(1,0.00 KB,0%) 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: starting mon.vm03 rank 0 at public addrs [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] at bind addrs [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon_data /var/lib/ceph/mon/ceph-vm03 fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: mon.vm03@-1(???) e2 preinit fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: mon.vm03@-1(???).mds e12 new map 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: mon.vm03@-1(???).mds e12 print_map 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: e12 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: legacy client fscid: 1 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Filesystem 'cephfs' (1) 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: fs_name cephfs 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: epoch 12 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: created 2026-03-09T16:12:12.560035+0000 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: modified 2026-03-09T16:12:21.661284+0000 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: tableserver 0 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: root 0 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: session_timeout 60 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: session_autoclose 300 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: max_file_size 1099511627776 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: max_xattr_size 65536 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: required_client_features {} 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: last_failure 0 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: last_failure_osd_epoch 41 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned 
encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: max_mds 1 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: in 0 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: up {0=14476} 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: failed 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: damaged 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: stopped 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: data_pools [3] 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: metadata_pool 2 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: inline_data enabled 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: balancer 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: bal_rank_mask -1 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: standby_count_wanted 1 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: qdb_cluster leader: 0 members: 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: [mds.cephfs.vm03.kygyjl{0:14476} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Standby daemons: 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: [mds.cephfs.vm03.kntrco{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: [mds.cephfs.vm05.sqhria{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout: [mds.cephfs.vm05.jgzfvu{-1:24291} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6824/1621230713,v1:192.168.123.105:6825/1621230713] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: mon.vm03@-1(???).osd e45 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: mon.vm03@-1(???).osd e45 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: mon.vm03@-1(???).osd e45 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: mon.vm03@-1(???).osd e45 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T16:16:46.147 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:45 vm03.local ceph-mon[133973]: mon.vm03@-1(???).paxosservice(auth 1..21) refresh upgraded, format 0 -> 3 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: Deploying daemon 
mon.vm03 on vm03 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: pgmap v62: 65 pgs: 65 active+clean; 305 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.9 MiB/s wr, 427 op/s 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: pgmap v63: 65 pgs: 65 active+clean; 305 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 19 KiB/s rd, 1.2 MiB/s wr, 336 op/s 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: mon.vm03 calling monitor election 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: mon.vm03 is new leader, mons vm03,vm05 in quorum (ranks 0,1) 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: monmap epoch 2 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: last_changed 2026-03-09T16:10:44.821492+0000 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: created 2026-03-09T16:09:32.695561+0000 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: min_mon_release 18 (reef) 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: election_strategy: 1 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: 0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.vm03 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: 1: [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon.vm05 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: fsmap cephfs:1 {0=cephfs.vm03.kygyjl=up:active} 3 up:standby 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: osdmap e45: 6 total, 6 up, 6 in 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: mgrmap e31: vm03.gbgzmu(active, since 2m), standbys: vm05.dygxfv 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem 
with deprecated feature inline_data 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: fs cephfs has deprecated feature inline_data enabled. 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: from='mgr.24403 ' entity='' 2026-03-09T16:16:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:48 vm05.local ceph-mon[58702]: mgrmap e32: vm03.gbgzmu(active, since 2m), standbys: vm05.dygxfv 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: Deploying daemon mon.vm03 on vm03 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: pgmap v62: 65 pgs: 65 active+clean; 305 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.9 MiB/s wr, 427 op/s 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: pgmap v63: 65 pgs: 65 active+clean; 305 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 19 KiB/s rd, 1.2 MiB/s wr, 336 op/s 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: mon.vm03 calling monitor election 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: from='mgr.24403 192.168.123.103:0/4220713137' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: mon.vm03 is new leader, mons vm03,vm05 in quorum (ranks 0,1) 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: monmap epoch 2 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: last_changed 2026-03-09T16:10:44.821492+0000 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: created 2026-03-09T16:09:32.695561+0000 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: min_mon_release 18 (reef) 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: election_strategy: 1 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: 0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.vm03 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: 1: [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon.vm05 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: fsmap cephfs:1 {0=cephfs.vm03.kygyjl=up:active} 3 up:standby 
2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: osdmap e45: 6 total, 6 up, 6 in 2026-03-09T16:16:48.381 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: mgrmap e31: vm03.gbgzmu(active, since 2m), standbys: vm05.dygxfv 2026-03-09T16:16:48.382 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T16:16:48.382 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T16:16:48.382 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: fs cephfs has deprecated feature inline_data enabled. 2026-03-09T16:16:48.382 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: from='mgr.24403 ' entity='' 2026-03-09T16:16:48.382 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:48 vm03.local ceph-mon[133973]: mgrmap e32: vm03.gbgzmu(active, since 2m), standbys: vm05.dygxfv 2026-03-09T16:16:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:53 vm05.local ceph-mon[58702]: Standby manager daemon vm05.dygxfv restarted 2026-03-09T16:16:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:53 vm05.local ceph-mon[58702]: Standby manager daemon vm05.dygxfv started 2026-03-09T16:16:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:53 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.105:0/903063182' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/crt"}]: dispatch 2026-03-09T16:16:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:53 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.105:0/903063182' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T16:16:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:53 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.105:0/903063182' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/key"}]: dispatch 2026-03-09T16:16:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:53 vm05.local ceph-mon[58702]: from='mgr.? 192.168.123.105:0/903063182' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T16:16:53.575 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:53 vm03.local ceph-mon[133973]: Standby manager daemon vm05.dygxfv restarted 2026-03-09T16:16:53.575 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:53 vm03.local ceph-mon[133973]: Standby manager daemon vm05.dygxfv started 2026-03-09T16:16:53.575 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:53 vm03.local ceph-mon[133973]: from='mgr.? 192.168.123.105:0/903063182' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/crt"}]: dispatch 2026-03-09T16:16:53.575 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:53 vm03.local ceph-mon[133973]: from='mgr.? 192.168.123.105:0/903063182' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T16:16:53.575 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:53 vm03.local ceph-mon[133973]: from='mgr.? 
192.168.123.105:0/903063182' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.dygxfv/key"}]: dispatch 2026-03-09T16:16:53.575 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:53 vm03.local ceph-mon[133973]: from='mgr.? 192.168.123.105:0/903063182' entity='mgr.vm05.dygxfv' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: mgrmap e33: vm03.gbgzmu(active, since 2m), standbys: vm05.dygxfv 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: Active manager daemon vm03.gbgzmu restarted 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: Activating manager daemon vm03.gbgzmu 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: osdmap e46: 6 total, 6 up, 6 in 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: mgrmap e34: vm03.gbgzmu(active, starting, since 0.0526341s), standbys: vm05.dygxfv 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kygyjl"}]: dispatch 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kntrco"}]: dispatch 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.sqhria"}]: dispatch 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.jgzfvu"}]: dispatch 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr metadata", "who": "vm03.gbgzmu", "id": "vm03.gbgzmu"}]: dispatch 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr metadata", "who": "vm05.dygxfv", "id": "vm05.dygxfv"}]: dispatch 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: 
from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:16:54.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:16:54.393 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T16:16:54.393 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T16:16:54.393 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T16:16:54.393 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: Manager daemon vm03.gbgzmu is now available 2026-03-09T16:16:54.393 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:54.393 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:16:54.393 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/mirror_snapshot_schedule"}]: dispatch 2026-03-09T16:16:54.393 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/trash_purge_schedule"}]: dispatch 2026-03-09T16:16:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: mgrmap e33: vm03.gbgzmu(active, since 2m), standbys: vm05.dygxfv 2026-03-09T16:16:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: Active manager daemon vm03.gbgzmu restarted 2026-03-09T16:16:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: Activating manager daemon vm03.gbgzmu 2026-03-09T16:16:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: osdmap e46: 6 total, 6 up, 6 in 
2026-03-09T16:16:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: mgrmap e34: vm03.gbgzmu(active, starting, since 0.0526341s), standbys: vm05.dygxfv 2026-03-09T16:16:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:16:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:16:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kygyjl"}]: dispatch 2026-03-09T16:16:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kntrco"}]: dispatch 2026-03-09T16:16:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.sqhria"}]: dispatch 2026-03-09T16:16:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.jgzfvu"}]: dispatch 2026-03-09T16:16:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr metadata", "who": "vm03.gbgzmu", "id": "vm03.gbgzmu"}]: dispatch 2026-03-09T16:16:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr metadata", "who": "vm05.dygxfv", "id": "vm05.dygxfv"}]: dispatch 2026-03-09T16:16:54.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:16:54.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:16:54.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:16:54.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:16:54.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:16:54.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", 
"id": 5}]: dispatch 2026-03-09T16:16:54.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T16:16:54.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T16:16:54.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T16:16:54.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: Manager daemon vm03.gbgzmu is now available 2026-03-09T16:16:54.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:16:54.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:16:54.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/mirror_snapshot_schedule"}]: dispatch 2026-03-09T16:16:54.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:54 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.gbgzmu/trash_purge_schedule"}]: dispatch 2026-03-09T16:16:56.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:56 vm03.local ceph-mon[133973]: mgrmap e35: vm03.gbgzmu(active, since 1.25492s), standbys: vm05.dygxfv 2026-03-09T16:16:56.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:56 vm03.local ceph-mon[133973]: pgmap v3: 65 pgs: 65 active+clean; 288 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail 2026-03-09T16:16:56.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:56 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:56.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:56 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:56.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:56 vm03.local ceph-mon[133973]: pgmap v4: 65 pgs: 65 active+clean; 288 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail 2026-03-09T16:16:56.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:56 vm05.local ceph-mon[58702]: mgrmap e35: vm03.gbgzmu(active, since 1.25492s), standbys: vm05.dygxfv 2026-03-09T16:16:56.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:56 vm05.local ceph-mon[58702]: pgmap v3: 65 pgs: 65 active+clean; 288 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail 2026-03-09T16:16:56.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:56 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:56.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:56 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' 
entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:56.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:56 vm05.local ceph-mon[58702]: pgmap v4: 65 pgs: 65 active+clean; 288 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail 2026-03-09T16:16:57.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:57 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:57.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:57 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:57.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:57 vm03.local ceph-mon[133973]: mgrmap e36: vm03.gbgzmu(active, since 2s), standbys: vm05.dygxfv 2026-03-09T16:16:57.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:57 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:57.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:57 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:58.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:57 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:58.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:57 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:58.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:57 vm05.local ceph-mon[58702]: mgrmap e36: vm03.gbgzmu(active, since 2s), standbys: vm05.dygxfv 2026-03-09T16:16:58.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:57 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:58.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:57 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:58.896 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:58 vm03.local ceph-mon[133973]: [09/Mar/2026:16:16:57] ENGINE Bus STARTING 2026-03-09T16:16:58.896 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:58 vm03.local ceph-mon[133973]: Detected new or changed devices on vm05 2026-03-09T16:16:58.896 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:58 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:58.896 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:58 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:58.896 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:58 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:16:58.896 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:58 vm03.local ceph-mon[133973]: [09/Mar/2026:16:16:57] ENGINE Serving on http://192.168.123.103:8765 2026-03-09T16:16:58.896 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:58 vm03.local ceph-mon[133973]: [09/Mar/2026:16:16:57] ENGINE Client ('192.168.123.103', 39386) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T16:16:58.896 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:58 vm03.local ceph-mon[133973]: [09/Mar/2026:16:16:57] 
ENGINE Serving on https://192.168.123.103:7150 2026-03-09T16:16:58.897 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:58 vm03.local ceph-mon[133973]: [09/Mar/2026:16:16:57] ENGINE Bus STARTED 2026-03-09T16:16:58.897 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:58 vm03.local ceph-mon[133973]: pgmap v5: 65 pgs: 65 active+clean; 288 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail 2026-03-09T16:16:58.897 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:58 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:58.897 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:58 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:59.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:58 vm05.local ceph-mon[58702]: [09/Mar/2026:16:16:57] ENGINE Bus STARTING 2026-03-09T16:16:59.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:58 vm05.local ceph-mon[58702]: Detected new or changed devices on vm05 2026-03-09T16:16:59.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:58 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:59.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:58 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:59.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:58 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:16:59.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:58 vm05.local ceph-mon[58702]: [09/Mar/2026:16:16:57] ENGINE Serving on http://192.168.123.103:8765 2026-03-09T16:16:59.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:58 vm05.local ceph-mon[58702]: [09/Mar/2026:16:16:57] ENGINE Client ('192.168.123.103', 39386) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T16:16:59.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:58 vm05.local ceph-mon[58702]: [09/Mar/2026:16:16:57] ENGINE Serving on https://192.168.123.103:7150 2026-03-09T16:16:59.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:58 vm05.local ceph-mon[58702]: [09/Mar/2026:16:16:57] ENGINE Bus STARTED 2026-03-09T16:16:59.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:58 vm05.local ceph-mon[58702]: pgmap v5: 65 pgs: 65 active+clean; 288 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail 2026-03-09T16:16:59.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:58 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:16:59.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:58 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:00.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:59 vm03.local ceph-mon[133973]: mgrmap e37: vm03.gbgzmu(active, since 4s), standbys: vm05.dygxfv 2026-03-09T16:17:00.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:59 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:00.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:59 vm03.local ceph-mon[133973]: from='mgr.34104 
192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:00.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:59 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:17:00.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:59 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:00.144 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:16:59 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:17:00.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:59 vm05.local ceph-mon[58702]: mgrmap e37: vm03.gbgzmu(active, since 4s), standbys: vm05.dygxfv 2026-03-09T16:17:00.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:59 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:00.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:59 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:00.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:59 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T16:17:00.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:59 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:00.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:16:59 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:17:01.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: Detected new or changed devices on vm03 2026-03-09T16:17:01.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T16:17:01.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: Updating vm05:/etc/ceph/ceph.conf 2026-03-09T16:17:01.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:17:01.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:17:01.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:17:01.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:17:01.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: pgmap v6: 65 pgs: 65 active+clean; 288 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail 2026-03-09T16:17:01.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: Updating 
vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:17:01.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:17:01.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:01.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:01.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:01.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:01.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:00 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:00 vm05.local ceph-mon[58702]: Detected new or changed devices on vm03 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:00 vm05.local ceph-mon[58702]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:00 vm05.local ceph-mon[58702]: Updating vm05:/etc/ceph/ceph.conf 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:00 vm05.local ceph-mon[58702]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:00 vm05.local ceph-mon[58702]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.conf 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:00 vm05.local ceph-mon[58702]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:00 vm05.local ceph-mon[58702]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:00 vm05.local ceph-mon[58702]: pgmap v6: 65 pgs: 65 active+clean; 288 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:00 vm05.local ceph-mon[58702]: Updating vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:00 vm05.local ceph-mon[58702]: Updating vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/config/ceph.client.admin.keyring 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:00 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:00 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:00 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 
16:17:00 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:01.145 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:00 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:01.895 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:01 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:02.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:01 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:02.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:01 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-09T16:17:02.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:01 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:02.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:01 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T16:17:02.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:01 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T16:17:02.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:01 vm05.local ceph-mon[58702]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:02.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:01 vm05.local systemd[1]: Stopping Ceph mon.vm05 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 
2026-03-09T16:17:02.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm05[58698]: 2026-03-09T16:17:02.028+0000 7f829c425640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm05 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:17:02.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm05[58698]: 2026-03-09T16:17:02.028+0000 7f829c425640 -1 mon.vm05@1(peon) e2 *** Got Signal Terminated *** 2026-03-09T16:17:02.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:02.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:02.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-09T16:17:02.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:02.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T16:17:02.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T16:17:02.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:02.583 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local podman[108419]: 2026-03-09 16:17:02.276842264 +0000 UTC m=+0.267146329 container died 90242efb09784bfee2afb9ed5e8f08c7776dfae4c820437324ea7efcd15c25ce (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm05, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:17:02.583 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local podman[108419]: 2026-03-09 
16:17:02.295858508 +0000 UTC m=+0.286162592 container remove 90242efb09784bfee2afb9ed5e8f08c7776dfae4c820437324ea7efcd15c25ce (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm05, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_REF=reef, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T16:17:02.583 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local bash[108419]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm05 2026-03-09T16:17:02.583 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm05.service: Deactivated successfully. 2026-03-09T16:17:02.583 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local systemd[1]: Stopped Ceph mon.vm05 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 2026-03-09T16:17:02.583 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm05.service: Consumed 4.161s CPU time. 2026-03-09T16:17:02.835 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local systemd[1]: Starting Ceph mon.vm05 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 
2026-03-09T16:17:02.835 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local podman[108528]: 2026-03-09 16:17:02.791570539 +0000 UTC m=+0.026729381 container create b6d6af84a66daf439a819a594bf59d3b645350890a0cf600f1a98a172826883b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm05, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True) 2026-03-09T16:17:02.835 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local podman[108528]: 2026-03-09 16:17:02.830696463 +0000 UTC m=+0.065855316 container init b6d6af84a66daf439a819a594bf59d3b645350890a0cf600f1a98a172826883b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm05, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2) 2026-03-09T16:17:03.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local podman[108528]: 2026-03-09 16:17:02.835530141 +0000 UTC m=+0.070688994 container start b6d6af84a66daf439a819a594bf59d3b645350890a0cf600f1a98a172826883b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm05, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default) 2026-03-09T16:17:03.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local bash[108528]: b6d6af84a66daf439a819a594bf59d3b645350890a0cf600f1a98a172826883b 2026-03-09T16:17:03.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local podman[108528]: 2026-03-09 16:17:02.780547896 +0000 UTC m=+0.015706740 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c 
quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local systemd[1]: Started Ceph mon.vm05 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: set uid:gid to 167:167 (ceph:ceph) 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: pidfile_write: ignore empty --pid-file 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: load: jerasure load: lrc 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: RocksDB version: 7.9.2 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Git sha 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Compile date 2026-02-25 18:11:04 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: DB SUMMARY 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: DB Session ID: ZBCEXBLS9641LCQW7OU1 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: CURRENT file: CURRENT 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: IDENTITY file: IDENTITY 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: MANIFEST file: MANIFEST-000010 size: 896 Bytes 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm05/store.db dir, Total Num: 1, files: 000021.sst 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm05/store.db: 000019.log size: 134819 ; 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.error_if_exists: 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.create_if_missing: 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.paranoid_checks: 1 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: 
rocksdb: Options.env: 0x55dbdc52adc0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.fs: PosixFileSystem 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.info_log: 0x55dbddd0d900 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_file_opening_threads: 16 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.statistics: (nil) 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.use_fsync: 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_log_file_size: 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.keep_log_file_num: 1000 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.recycle_log_file_num: 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.allow_fallocate: 1 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.allow_mmap_reads: 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.allow_mmap_writes: 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.use_direct_reads: 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.create_missing_column_families: 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.db_log_dir: 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.wal_dir: 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.manifest_preallocation_size: 4194304 
2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.advise_random_on_open: 1 2026-03-09T16:17:03.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.db_write_buffer_size: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.write_buffer_manager: 0x55dbddd11900 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.rate_limiter: (nil) 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.wal_recovery_mode: 2 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.enable_thread_tracking: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.enable_pipelined_write: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.unordered_write: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.row_cache: None 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.wal_filter: None 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.avoid_flush_during_recovery: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.allow_ingest_behind: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.two_write_queues: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: 
Options.manual_wal_flush: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.wal_compression: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.atomic_flush: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.log_readahead_size: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.best_efforts_recovery: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.allow_data_in_errors: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.db_host_id: __hostname__ 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_background_jobs: 2 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_background_compactions: -1 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_subcompactions: 1 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_total_wal_size: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 
16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_open_files: -1 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.bytes_per_sync: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compaction_readahead_size: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_background_flushes: -1 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Compression algorithms supported: 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: kZSTD supported: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: kXpressCompression supported: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: kBZip2Compression supported: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: kLZ4Compression supported: 1 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: kZlibCompression supported: 1 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: kLZ4HCCompression supported: 1 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: kSnappyCompression supported: 1 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-09T16:17:03.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm05/store.db/MANIFEST-000010 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: 
Options.merge_operator: 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compaction_filter: None 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compaction_filter_factory: None 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.sst_partitioner_factory: None 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55dbddd0d580) 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: cache_index_and_filter_blocks: 1 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: pin_top_level_index_and_filter: 1 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_type: 0 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: data_block_index_type: 0 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_shortening: 1 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: checksum: 4 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: no_block_cache: 0 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache: 0x55dbddd309b0 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_name: BinnedLRUCache 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_options: 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: capacity : 536870912 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: num_shard_bits : 4 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: strict_capacity_limit : 0 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: high_pri_pool_ratio: 0.000 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_compressed: (nil) 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: persistent_cache: (nil) 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_size: 4096 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_size_deviation: 10 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_restart_interval: 16 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_block_restart_interval: 1 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: metadata_block_size: 4096 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: partition_filters: 0 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: use_delta_encoding: 1 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: filter_policy: 
bloomfilter 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: whole_key_filtering: 1 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: verify_compression: 0 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: read_amp_bytes_per_bit: 0 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: format_version: 5 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: enable_index_compression: 1 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_align: 0 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: max_auto_readahead_size: 262144 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: prepopulate_block_cache: 0 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: initial_auto_readahead_size: 8192 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout: num_file_reads_for_auto_readahead: 2 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.write_buffer_size: 33554432 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_write_buffer_number: 2 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compression: NoCompression 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.bottommost_compression: Disabled 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.prefix_extractor: nullptr 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.num_levels: 7 2026-03-09T16:17:03.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: 
Options.bottommost_compression_opts.parallel_threads: 1 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compression_opts.level: 32767 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compression_opts.strategy: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compression_opts.enabled: false 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.target_file_size_base: 67108864 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 
2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.arena_block_size: 1048576 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.disable_auto_compactions: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: 
Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.inplace_update_support: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.bloom_locality: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.max_successive_merges: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.paranoid_file_checks: 0 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.force_consistency_checks: 1 2026-03-09T16:17:03.280 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.report_bg_io_stats: 0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.ttl: 2592000 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.enable_blob_files: false 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.min_blob_size: 0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.blob_file_size: 268435456 2026-03-09T16:17:03.281 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.blob_file_starting_level: 0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm05/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 23, last_sequence is 10158, log_number is 19,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 19 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 19 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: c060cd5f-9607-4af3-8dde-0dc24b7e724a 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773073022895967, "job": 1, "event": "recovery_started", "wal_files": [19]} 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #19 mode 2 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773073022898946, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 24, "file_size": 90497, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10163, "largest_seqno": 10229, "table_properties": {"data_size": 89093, "index_size": 237, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 133, "raw_key_size": 1279, "raw_average_key_size": 27, "raw_value_size": 87961, "raw_average_value_size": 1871, "num_data_blocks": 10, "num_entries": 47, "num_filter_entries": 47, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": 
"leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773073022, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c060cd5f-9607-4af3-8dde-0dc24b7e724a", "db_session_id": "ZBCEXBLS9641LCQW7OU1", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773073022899117, "job": 1, "event": "recovery_finished"} 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: [db/version_set.cc:5047] Creating manifest 26 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm05/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55dbddd32e00 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: DB pointer 0x55dbddd42000 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** DB Stats ** 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** Compaction Stats [default] ** 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Level Files Size 
Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: L0 1/0 88.38 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 62.4 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: L6 1/0 10.33 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Sum 2/0 10.41 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 62.4 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 62.4 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** Compaction Stats [default] ** 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 62.4 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Flush(GB): cumulative 0.000, interval 0.000 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-09T16:17:03.281 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative compaction: 0.00 GB write, 11.07 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval compaction: 0.00 GB write, 11.07 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Block 
cache BinnedLRUCache@0x55dbddd309b0#2 capacity: 512.00 MB usage: 0.48 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 7e-06 secs_since: 0 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Block cache entry stats(count,size,portion): FilterBlock(1,0.17 KB,3.27826e-05%) IndexBlock(1,0.31 KB,5.96046e-05%) Misc(1,0.00 KB,0%) 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: starting mon.vm05 rank 1 at public addrs [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] at bind addrs [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon_data /var/lib/ceph/mon/ceph-vm05 fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: mon.vm05@-1(???) e2 preinit fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: mon.vm05@-1(???).mds e12 new map 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: mon.vm05@-1(???).mds e12 print_map 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: e12 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: legacy client fscid: 1 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Filesystem 'cephfs' (1) 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: fs_name cephfs 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: epoch 12 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: created 2026-03-09T16:12:12.560035+0000 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: modified 2026-03-09T16:12:21.661284+0000 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: tableserver 0 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: root 0 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: session_timeout 60 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: session_autoclose 300 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: max_file_size 1099511627776 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: max_xattr_size 65536 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: required_client_features {} 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: last_failure 0 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: last_failure_osd_epoch 41 2026-03-09T16:17:03.282 
INFO:journalctl@ceph.mon.vm05.vm05.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: max_mds 1 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: in 0 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: up {0=14476} 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: failed 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: damaged 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: stopped 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: data_pools [3] 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: metadata_pool 2 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: inline_data enabled 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: balancer 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: bal_rank_mask -1 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: standby_count_wanted 1 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: qdb_cluster leader: 0 members: 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: [mds.cephfs.vm03.kygyjl{0:14476} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:17:03.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:17:03.283 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Standby daemons: 2026-03-09T16:17:03.283 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-09T16:17:03.283 INFO:journalctl@ceph.mon.vm05.vm05.stdout: [mds.cephfs.vm03.kntrco{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:17:03.283 INFO:journalctl@ceph.mon.vm05.vm05.stdout: [mds.cephfs.vm05.sqhria{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:17:03.283 INFO:journalctl@ceph.mon.vm05.vm05.stdout: [mds.cephfs.vm05.jgzfvu{-1:24291} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6824/1621230713,v1:192.168.123.105:6825/1621230713] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:17:03.283 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: mon.vm05@-1(???).osd e46 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-09T16:17:03.283 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: mon.vm05@-1(???).osd e46 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T16:17:03.283 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: mon.vm05@-1(???).osd e46 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T16:17:03.283 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: mon.vm05@-1(???).osd e46 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T16:17:03.283 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:02 vm05.local ceph-mon[108543]: 
mon.vm05@-1(???).paxosservice(auth 1..22) refresh upgraded, format 0 -> 3 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: mon.vm03 calling monitor election 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: mon.vm05 calling monitor election 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: mon.vm03 is new leader, mons vm03,vm05 in quorum (ranks 0,1) 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: monmap epoch 3 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: last_changed 2026-03-09T16:17:03.518251+0000 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: created 2026-03-09T16:09:32.695561+0000 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: min_mon_release 19 (squid) 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: election_strategy: 1 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: 0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.vm03 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: 1: [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon.vm05 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: fsmap cephfs:1 {0=cephfs.vm03.kygyjl=up:active} 3 up:standby 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: osdmap e46: 6 total, 6 up, 6 in 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: mgrmap e37: vm03.gbgzmu(active, since 9s), standbys: vm05.dygxfv 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: fs cephfs has deprecated feature inline_data enabled. 
2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:04.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:04 vm03.local ceph-mon[133973]: pgmap v8: 65 pgs: 65 active+clean; 285 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 29 KiB/s rd, 392 KiB/s wr, 63 op/s 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: mon.vm03 calling monitor election 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: mon.vm05 calling monitor election 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: mon.vm03 is new leader, mons vm03,vm05 in quorum (ranks 0,1) 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: monmap epoch 3 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: last_changed 2026-03-09T16:17:03.518251+0000 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: created 2026-03-09T16:09:32.695561+0000 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: min_mon_release 19 (squid) 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: election_strategy: 1 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: 0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.vm03 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: 1: [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon.vm05 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: fsmap cephfs:1 {0=cephfs.vm03.kygyjl=up:active} 3 up:standby 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: osdmap e46: 6 total, 6 up, 6 in 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: mgrmap e37: vm03.gbgzmu(active, since 9s), standbys: vm05.dygxfv 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 
vm05.local ceph-mon[108543]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: fs cephfs has deprecated feature inline_data enabled. 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:05.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:04 vm05.local ceph-mon[108543]: pgmap v8: 65 pgs: 65 active+clean; 285 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 29 KiB/s rd, 392 KiB/s wr, 63 op/s 2026-03-09T16:17:06.339 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:06 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:06.339 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:06 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:06.339 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:06 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:06.339 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:06 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:06.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:06 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:06.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:06 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:06.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:06 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:06.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:06 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:07.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:07 vm03.local ceph-mon[133973]: pgmap v9: 65 pgs: 65 active+clean; 289 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 36 KiB/s rd, 1.0 MiB/s wr, 132 op/s 2026-03-09T16:17:07.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:07 vm05.local ceph-mon[108543]: pgmap v9: 65 pgs: 65 active+clean; 289 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 36 KiB/s rd, 1.0 MiB/s wr, 132 op/s 2026-03-09T16:17:08.155 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.0... 
2026-03-09T16:17:08.155 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0 2026-03-09T16:17:08.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:08.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:08.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:08.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:17:08.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:08.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: Reconfiguring mon.vm03 (monmap changed)... 2026-03-09T16:17:08.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T16:17:08.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T16:17:08.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:08.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: Reconfiguring daemon mon.vm03 on vm03 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: Reconfiguring mgr.vm03.gbgzmu (monmap changed)... 
2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.gbgzmu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: Reconfiguring daemon mgr.vm03.gbgzmu on vm03 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: pgmap v10: 65 pgs: 65 active+clean; 289 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 946 KiB/s wr, 117 op/s 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: Reconfiguring ceph-exporter.vm03 (monmap changed)... 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: Unable to update caps for client.ceph-exporter.vm03 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm03"}]: dispatch 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:08.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:08 vm05.local ceph-mon[108543]: Reconfiguring 
daemon ceph-exporter.vm03 on vm03 2026-03-09T16:17:08.534 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:08.534 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:08.534 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:08.534 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:17:08.534 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:08.534 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: Reconfiguring mon.vm03 (monmap changed)... 2026-03-09T16:17:08.534 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T16:17:08.534 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T16:17:08.534 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:08.534 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: Reconfiguring daemon mon.vm03 on vm03 2026-03-09T16:17:08.534 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:08.534 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:08.534 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: Reconfiguring mgr.vm03.gbgzmu (monmap changed)... 
2026-03-09T16:17:08.534 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.gbgzmu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:17:08.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:17:08.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:08.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: Reconfiguring daemon mgr.vm03.gbgzmu on vm03 2026-03-09T16:17:08.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: pgmap v10: 65 pgs: 65 active+clean; 289 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 946 KiB/s wr, 117 op/s 2026-03-09T16:17:08.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:08.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:08.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: Reconfiguring ceph-exporter.vm03 (monmap changed)... 2026-03-09T16:17:08.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:17:08.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:17:08.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T16:17:08.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: Unable to update caps for client.ceph-exporter.vm03 2026-03-09T16:17:08.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm03"}]: dispatch 2026-03-09T16:17:08.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:08.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:08 vm03.local ceph-mon[133973]: Reconfiguring 
daemon ceph-exporter.vm03 on vm03 2026-03-09T16:17:08.739 DEBUG:teuthology.parallel:result is None 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: Reconfiguring crash.vm03 (monmap changed)... 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: Reconfiguring daemon crash.vm03 on vm03 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-09T16:17:10.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:10.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:10.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: from='mgr.34104 
192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:10.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: Reconfiguring crash.vm03 (monmap changed)... 2026-03-09T16:17:10.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:17:10.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:10.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: Reconfiguring daemon crash.vm03 on vm03 2026-03-09T16:17:10.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:17:10.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:10.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:10.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T16:17:10.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:10.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:10.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:10.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-09T16:17:10.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: Reconfiguring osd.0 (monmap changed)... 2026-03-09T16:17:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: Reconfiguring daemon osd.0 on vm03 2026-03-09T16:17:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: Reconfiguring osd.1 (monmap changed)... 
2026-03-09T16:17:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: Reconfiguring daemon osd.1 on vm03 2026-03-09T16:17:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: pgmap v11: 65 pgs: 65 active+clean; 289 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 946 KiB/s wr, 117 op/s 2026-03-09T16:17:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: Reconfiguring osd.2 (monmap changed)... 2026-03-09T16:17:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-09T16:17:11.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:11.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: Reconfiguring daemon osd.2 on vm03 2026-03-09T16:17:11.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:11.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:11.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kygyjl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:17:11.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:11.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:11.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:11.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kntrco", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:17:11.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: Reconfiguring osd.0 (monmap changed)... 
2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: Reconfiguring daemon osd.0 on vm03 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: Reconfiguring osd.1 (monmap changed)... 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: Reconfiguring daemon osd.1 on vm03 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: pgmap v11: 65 pgs: 65 active+clean; 289 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 946 KiB/s wr, 117 op/s 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: Reconfiguring osd.2 (monmap changed)... 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: Reconfiguring daemon osd.2 on vm03 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kygyjl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kntrco", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:17:11.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:11 vm05.local 
ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: Reconfiguring mds.cephfs.vm03.kygyjl (monmap changed)... 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: Reconfiguring daemon mds.cephfs.vm03.kygyjl on vm03 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: Reconfiguring mds.cephfs.vm03.kntrco (monmap changed)... 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: Reconfiguring daemon mds.cephfs.vm03.kntrco on vm03 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm05"}]: dispatch 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 
192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.dygxfv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:17:12.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:12.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: Reconfiguring mds.cephfs.vm03.kygyjl (monmap changed)... 2026-03-09T16:17:12.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: Reconfiguring daemon mds.cephfs.vm03.kygyjl on vm03 2026-03-09T16:17:12.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: Reconfiguring mds.cephfs.vm03.kntrco (monmap changed)... 2026-03-09T16:17:12.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: Reconfiguring daemon mds.cephfs.vm03.kntrco on vm03 2026-03-09T16:17:12.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:12.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:12.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:17:12.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:17:12.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T16:17:12.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", 
"entity": "client.ceph-exporter.vm05"}]: dispatch 2026-03-09T16:17:12.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:12.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:12.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:12.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:17:12.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:12.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:12.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:12.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.dygxfv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T16:17:12.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T16:17:12.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:13.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: Reconfiguring ceph-exporter.vm05 (monmap changed)... 2026-03-09T16:17:13.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: Unable to update caps for client.ceph-exporter.vm05 2026-03-09T16:17:13.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: Reconfiguring daemon ceph-exporter.vm05 on vm05 2026-03-09T16:17:13.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: Reconfiguring crash.vm05 (monmap changed)... 2026-03-09T16:17:13.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: Reconfiguring daemon crash.vm05 on vm05 2026-03-09T16:17:13.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: Reconfiguring mgr.vm05.dygxfv (monmap changed)... 
2026-03-09T16:17:13.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: Reconfiguring daemon mgr.vm05.dygxfv on vm05 2026-03-09T16:17:13.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: pgmap v12: 65 pgs: 65 active+clean; 273 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 45 KiB/s rd, 1.3 MiB/s wr, 191 op/s 2026-03-09T16:17:13.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:13.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:13.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T16:17:13.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T16:17:13.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:13.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:13.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:13.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T16:17:13.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:13 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: Reconfiguring ceph-exporter.vm05 (monmap changed)... 2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: Unable to update caps for client.ceph-exporter.vm05 2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: Reconfiguring daemon ceph-exporter.vm05 on vm05 2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: Reconfiguring crash.vm05 (monmap changed)... 2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: Reconfiguring daemon crash.vm05 on vm05 2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: Reconfiguring mgr.vm05.dygxfv (monmap changed)... 
2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: Reconfiguring daemon mgr.vm05.dygxfv on vm05 2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: pgmap v12: 65 pgs: 65 active+clean; 273 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 45 KiB/s rd, 1.3 MiB/s wr, 191 op/s 2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:13.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T16:17:13.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:13 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:14.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:14 vm03.local ceph-mon[133973]: Reconfiguring mon.vm05 (monmap changed)... 2026-03-09T16:17:14.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:14 vm03.local ceph-mon[133973]: Reconfiguring daemon mon.vm05 on vm05 2026-03-09T16:17:14.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:14 vm03.local ceph-mon[133973]: Reconfiguring osd.3 (monmap changed)... 2026-03-09T16:17:14.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:14 vm03.local ceph-mon[133973]: Reconfiguring daemon osd.3 on vm05 2026-03-09T16:17:14.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:14 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:14.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:14 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:14.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:14 vm03.local ceph-mon[133973]: Reconfiguring osd.4 (monmap changed)... 
2026-03-09T16:17:14.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:14 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T16:17:14.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:14 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:14.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:14 vm03.local ceph-mon[133973]: Reconfiguring daemon osd.4 on vm05 2026-03-09T16:17:14.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:14 vm03.local ceph-mon[133973]: pgmap v13: 65 pgs: 65 active+clean; 273 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.0 MiB/s wr, 145 op/s 2026-03-09T16:17:14.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.480+0000 7fbd9dbb6640 1 -- 192.168.123.103:0/1657698504 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd98108800 msgr2=0x7fbd98108be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:14.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.480+0000 7fbd9dbb6640 1 --2- 192.168.123.103:0/1657698504 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd98108800 0x7fbd98108be0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fbd840099b0 tx=0x7fbd8402f240 comp rx=0 tx=0).stop 2026-03-09T16:17:14.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.480+0000 7fbd9dbb6640 1 -- 192.168.123.103:0/1657698504 shutdown_connections 2026-03-09T16:17:14.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.480+0000 7fbd9dbb6640 1 --2- 192.168.123.103:0/1657698504 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd98102800 0x7fbd98102c60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.480+0000 7fbd9dbb6640 1 --2- 192.168.123.103:0/1657698504 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd98108800 0x7fbd98108be0 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.480+0000 7fbd9dbb6640 1 -- 192.168.123.103:0/1657698504 >> 192.168.123.103:0/1657698504 conn(0x7fbd980fe540 msgr2=0x7fbd98100960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:14.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.480+0000 7fbd9dbb6640 1 -- 192.168.123.103:0/1657698504 shutdown_connections 2026-03-09T16:17:14.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.480+0000 7fbd9dbb6640 1 -- 192.168.123.103:0/1657698504 wait complete. 
2026-03-09T16:17:14.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.482+0000 7fbd9dbb6640 1 Processor -- start 2026-03-09T16:17:14.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.482+0000 7fbd9dbb6640 1 -- start start 2026-03-09T16:17:14.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.482+0000 7fbd9dbb6640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd98102800 0x7fbd98072a80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:14.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.482+0000 7fbd9dbb6640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd98072fc0 0x7fbd98075700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:14.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.482+0000 7fbd9dbb6640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd98079780 con 0x7fbd98072fc0 2026-03-09T16:17:14.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.482+0000 7fbd9dbb6640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd980798f0 con 0x7fbd98102800 2026-03-09T16:17:14.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.482+0000 7fbd9cbb4640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd98102800 0x7fbd98072a80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:14.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.483+0000 7fbd97fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd98072fc0 0x7fbd98075700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:14.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.483+0000 7fbd97fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd98072fc0 0x7fbd98075700 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47166/0 (socket says 192.168.123.103:47166) 2026-03-09T16:17:14.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.483+0000 7fbd97fff640 1 -- 192.168.123.103:0/3663465231 learned_addr learned my addr 192.168.123.103:0/3663465231 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:17:14.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.483+0000 7fbd97fff640 1 -- 192.168.123.103:0/3663465231 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd98102800 msgr2=0x7fbd98072a80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:14.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.483+0000 7fbd97fff640 1 --2- 192.168.123.103:0/3663465231 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd98102800 0x7fbd98072a80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.483+0000 7fbd97fff640 1 -- 192.168.123.103:0/3663465231 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbd84009660 con 0x7fbd98072fc0 
2026-03-09T16:17:14.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.484+0000 7fbd97fff640 1 --2- 192.168.123.103:0/3663465231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd98072fc0 0x7fbd98075700 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fbd8800e990 tx=0x7fbd8800ee60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:14.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.484+0000 7fbd95ffb640 1 -- 192.168.123.103:0/3663465231 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbd8800cd30 con 0x7fbd98072fc0 2026-03-09T16:17:14.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.484+0000 7fbd9dbb6640 1 -- 192.168.123.103:0/3663465231 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbd98075d00 con 0x7fbd98072fc0 2026-03-09T16:17:14.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.484+0000 7fbd9dbb6640 1 -- 192.168.123.103:0/3663465231 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbd98076220 con 0x7fbd98072fc0 2026-03-09T16:17:14.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.484+0000 7fbd95ffb640 1 -- 192.168.123.103:0/3663465231 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbd8800ce90 con 0x7fbd98072fc0 2026-03-09T16:17:14.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.484+0000 7fbd95ffb640 1 -- 192.168.123.103:0/3663465231 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbd88010640 con 0x7fbd98072fc0 2026-03-09T16:17:14.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.486+0000 7fbd95ffb640 1 -- 192.168.123.103:0/3663465231 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fbd880107a0 con 0x7fbd98072fc0 2026-03-09T16:17:14.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.486+0000 7fbd95ffb640 1 --2- 192.168.123.103:0/3663465231 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbd70077ad0 0x7fbd70079f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:14.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.486+0000 7fbd95ffb640 1 -- 192.168.123.103:0/3663465231 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(46..46 src has 1..46) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fbd88014070 con 0x7fbd98072fc0 2026-03-09T16:17:14.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.486+0000 7fbd9cbb4640 1 --2- 192.168.123.103:0/3663465231 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbd70077ad0 0x7fbd70079f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:14.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.486+0000 7fbd9cbb4640 1 --2- 192.168.123.103:0/3663465231 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbd70077ad0 0x7fbd70079f90 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fbd84002410 tx=0x7fbd8403a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:14.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.488+0000 
7fbd9dbb6640 1 -- 192.168.123.103:0/3663465231 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbd60005350 con 0x7fbd98072fc0 2026-03-09T16:17:14.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.492+0000 7fbd95ffb640 1 -- 192.168.123.103:0/3663465231 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbd880639d0 con 0x7fbd98072fc0 2026-03-09T16:17:14.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:14 vm05.local ceph-mon[108543]: Reconfiguring mon.vm05 (monmap changed)... 2026-03-09T16:17:14.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:14 vm05.local ceph-mon[108543]: Reconfiguring daemon mon.vm05 on vm05 2026-03-09T16:17:14.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:14 vm05.local ceph-mon[108543]: Reconfiguring osd.3 (monmap changed)... 2026-03-09T16:17:14.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:14 vm05.local ceph-mon[108543]: Reconfiguring daemon osd.3 on vm05 2026-03-09T16:17:14.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:14 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:14.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:14 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:14.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:14 vm05.local ceph-mon[108543]: Reconfiguring osd.4 (monmap changed)... 2026-03-09T16:17:14.520 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:14 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T16:17:14.520 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:14 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:14.520 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:14 vm05.local ceph-mon[108543]: Reconfiguring daemon osd.4 on vm05 2026-03-09T16:17:14.520 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:14 vm05.local ceph-mon[108543]: pgmap v13: 65 pgs: 65 active+clean; 273 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.0 MiB/s wr, 145 op/s 2026-03-09T16:17:14.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.621+0000 7fbd9dbb6640 1 -- 192.168.123.103:0/3663465231 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fbd60002bf0 con 0x7fbd70077ad0 2026-03-09T16:17:14.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.623+0000 7fbd95ffb640 1 -- 192.168.123.103:0/3663465231 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+383 (secure 0 0 0) 0x7fbd60002bf0 con 0x7fbd70077ad0 2026-03-09T16:17:14.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.626+0000 7fbd9dbb6640 1 -- 192.168.123.103:0/3663465231 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbd70077ad0 msgr2=0x7fbd70079f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:14.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.627+0000 7fbd9dbb6640 1 --2- 
192.168.123.103:0/3663465231 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbd70077ad0 0x7fbd70079f90 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fbd84002410 tx=0x7fbd8403a040 comp rx=0 tx=0).stop 2026-03-09T16:17:14.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.627+0000 7fbd9dbb6640 1 -- 192.168.123.103:0/3663465231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd98072fc0 msgr2=0x7fbd98075700 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:14.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.627+0000 7fbd9dbb6640 1 --2- 192.168.123.103:0/3663465231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd98072fc0 0x7fbd98075700 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fbd8800e990 tx=0x7fbd8800ee60 comp rx=0 tx=0).stop 2026-03-09T16:17:14.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.627+0000 7fbd9dbb6640 1 -- 192.168.123.103:0/3663465231 shutdown_connections 2026-03-09T16:17:14.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.627+0000 7fbd9dbb6640 1 --2- 192.168.123.103:0/3663465231 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbd70077ad0 0x7fbd70079f90 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.627+0000 7fbd9dbb6640 1 --2- 192.168.123.103:0/3663465231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd98072fc0 0x7fbd98075700 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.627+0000 7fbd9dbb6640 1 --2- 192.168.123.103:0/3663465231 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd98102800 0x7fbd98072a80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.627+0000 7fbd9dbb6640 1 -- 192.168.123.103:0/3663465231 >> 192.168.123.103:0/3663465231 conn(0x7fbd980fe540 msgr2=0x7fbd980ff4f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:14.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.627+0000 7fbd9dbb6640 1 -- 192.168.123.103:0/3663465231 shutdown_connections 2026-03-09T16:17:14.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.627+0000 7fbd9dbb6640 1 -- 192.168.123.103:0/3663465231 wait complete. 
2026-03-09T16:17:14.637 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:17:14.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.734+0000 7ff742da5640 1 -- 192.168.123.103:0/603478785 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff73c0729d0 msgr2=0x7ff73c10b9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:14.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.734+0000 7ff742da5640 1 --2- 192.168.123.103:0/603478785 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff73c0729d0 0x7ff73c10b9f0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7ff73400b0a0 tx=0x7ff73402f550 comp rx=0 tx=0).stop 2026-03-09T16:17:14.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.734+0000 7ff742da5640 1 -- 192.168.123.103:0/603478785 shutdown_connections 2026-03-09T16:17:14.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.734+0000 7ff742da5640 1 --2- 192.168.123.103:0/603478785 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff73c0729d0 0x7ff73c10b9f0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.734+0000 7ff742da5640 1 --2- 192.168.123.103:0/603478785 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff73c0720b0 0x7ff73c072490 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.734+0000 7ff742da5640 1 -- 192.168.123.103:0/603478785 >> 192.168.123.103:0/603478785 conn(0x7ff73c06c7e0 msgr2=0x7ff73c06cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:14.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.735+0000 7ff742da5640 1 -- 192.168.123.103:0/603478785 shutdown_connections 2026-03-09T16:17:14.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.736+0000 7ff742da5640 1 -- 192.168.123.103:0/603478785 wait complete. 
2026-03-09T16:17:14.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.736+0000 7ff742da5640 1 Processor -- start 2026-03-09T16:17:14.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.736+0000 7ff742da5640 1 -- start start 2026-03-09T16:17:14.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.736+0000 7ff742da5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff73c0720b0 0x7ff73c1ad510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:14.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.736+0000 7ff742da5640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff73c0729d0 0x7ff73c1ada50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:14.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.736+0000 7ff742da5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff73c1ae070 con 0x7ff73c0720b0 2026-03-09T16:17:14.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.736+0000 7ff742da5640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff73c1a7600 con 0x7ff73c0729d0 2026-03-09T16:17:14.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.737+0000 7ff73bfff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff73c0729d0 0x7ff73c1ada50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:14.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.737+0000 7ff73bfff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff73c0729d0 0x7ff73c1ada50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:46238/0 (socket says 192.168.123.103:46238) 2026-03-09T16:17:14.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.737+0000 7ff73bfff640 1 -- 192.168.123.103:0/1422839137 learned_addr learned my addr 192.168.123.103:0/1422839137 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:17:14.738 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.737+0000 7ff73bfff640 1 -- 192.168.123.103:0/1422839137 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff73c0720b0 msgr2=0x7ff73c1ad510 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:14.738 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.737+0000 7ff73bfff640 1 --2- 192.168.123.103:0/1422839137 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff73c0720b0 0x7ff73c1ad510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.738 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.737+0000 7ff73bfff640 1 -- 192.168.123.103:0/1422839137 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff734009d00 con 0x7ff73c0729d0 2026-03-09T16:17:14.738 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.738+0000 7ff73bfff640 1 --2- 192.168.123.103:0/1422839137 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff73c0729d0 0x7ff73c1ada50 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7ff7340060b0 tx=0x7ff7340092e0 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:14.738 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.738+0000 7ff739ffb640 1 -- 192.168.123.103:0/1422839137 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff73403d070 con 0x7ff73c0729d0 2026-03-09T16:17:14.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.738+0000 7ff742da5640 1 -- 192.168.123.103:0/1422839137 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff73c1a7800 con 0x7ff73c0729d0 2026-03-09T16:17:14.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.738+0000 7ff742da5640 1 -- 192.168.123.103:0/1422839137 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff73c1a7d50 con 0x7ff73c0729d0 2026-03-09T16:17:14.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.739+0000 7ff739ffb640 1 -- 192.168.123.103:0/1422839137 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff734032030 con 0x7ff73c0729d0 2026-03-09T16:17:14.740 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.739+0000 7ff739ffb640 1 -- 192.168.123.103:0/1422839137 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff734037a40 con 0x7ff73c0729d0 2026-03-09T16:17:14.740 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.739+0000 7ff742da5640 1 -- 192.168.123.103:0/1422839137 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff708005350 con 0x7ff73c0729d0 2026-03-09T16:17:14.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.740+0000 7ff739ffb640 1 -- 192.168.123.103:0/1422839137 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7ff734049050 con 0x7ff73c0729d0 2026-03-09T16:17:14.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.740+0000 7ff739ffb640 1 --2- 192.168.123.103:0/1422839137 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff70c0777f0 0x7ff70c079cb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:14.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.740+0000 7ff739ffb640 1 -- 192.168.123.103:0/1422839137 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(46..46 src has 1..46) v4 ==== 6394+0+0 (secure 0 0 0) 0x7ff7340be710 con 0x7ff73c0729d0 2026-03-09T16:17:14.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.744+0000 7ff740b1a640 1 --2- 192.168.123.103:0/1422839137 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff70c0777f0 0x7ff70c079cb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:14.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.745+0000 7ff740b1a640 1 --2- 192.168.123.103:0/1422839137 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff70c0777f0 0x7ff70c079cb0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7ff72c0059c0 tx=0x7ff72c005950 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:14.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.746+0000 7ff739ffb640 1 -- 192.168.123.103:0/1422839137 <== 
mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff734086c20 con 0x7ff73c0729d0 2026-03-09T16:17:14.891 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.890+0000 7ff742da5640 1 -- 192.168.123.103:0/1422839137 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff708002bf0 con 0x7ff70c0777f0 2026-03-09T16:17:14.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.892+0000 7ff739ffb640 1 -- 192.168.123.103:0/1422839137 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+383 (secure 0 0 0) 0x7ff708002bf0 con 0x7ff70c0777f0 2026-03-09T16:17:14.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.896+0000 7ff71b7fe640 1 -- 192.168.123.103:0/1422839137 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff70c0777f0 msgr2=0x7ff70c079cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:14.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.896+0000 7ff71b7fe640 1 --2- 192.168.123.103:0/1422839137 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff70c0777f0 0x7ff70c079cb0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7ff72c0059c0 tx=0x7ff72c005950 comp rx=0 tx=0).stop 2026-03-09T16:17:14.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.896+0000 7ff71b7fe640 1 -- 192.168.123.103:0/1422839137 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff73c0729d0 msgr2=0x7ff73c1ada50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:14.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.897+0000 7ff71b7fe640 1 --2- 192.168.123.103:0/1422839137 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff73c0729d0 0x7ff73c1ada50 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7ff7340060b0 tx=0x7ff7340092e0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.897+0000 7ff71b7fe640 1 -- 192.168.123.103:0/1422839137 shutdown_connections 2026-03-09T16:17:14.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.897+0000 7ff71b7fe640 1 --2- 192.168.123.103:0/1422839137 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff70c0777f0 0x7ff70c079cb0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.897+0000 7ff71b7fe640 1 --2- 192.168.123.103:0/1422839137 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff73c0729d0 0x7ff73c1ada50 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.897+0000 7ff71b7fe640 1 --2- 192.168.123.103:0/1422839137 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff73c0720b0 0x7ff73c1ad510 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.897+0000 7ff71b7fe640 1 -- 192.168.123.103:0/1422839137 >> 192.168.123.103:0/1422839137 conn(0x7ff73c06c7e0 msgr2=0x7ff73c070680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:14.898 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.897+0000 7ff71b7fe640 1 -- 192.168.123.103:0/1422839137 shutdown_connections 2026-03-09T16:17:14.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.897+0000 7ff71b7fe640 1 -- 192.168.123.103:0/1422839137 wait complete. 2026-03-09T16:17:14.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.988+0000 7f1cdfdb4640 1 -- 192.168.123.103:0/3631020535 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1cd8072140 msgr2=0x7f1cd8072520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:14.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.988+0000 7f1cdfdb4640 1 --2- 192.168.123.103:0/3631020535 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1cd8072140 0x7f1cd8072520 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f1cd400bb70 tx=0x7f1cd4030ff0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.989+0000 7f1cdfdb4640 1 -- 192.168.123.103:0/3631020535 shutdown_connections 2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.989+0000 7f1cdfdb4640 1 --2- 192.168.123.103:0/3631020535 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cd8072af0 0x7f1cd810ba70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.989+0000 7f1cdfdb4640 1 --2- 192.168.123.103:0/3631020535 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1cd8072140 0x7f1cd8072520 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.989+0000 7f1cdfdb4640 1 -- 192.168.123.103:0/3631020535 >> 192.168.123.103:0/3631020535 conn(0x7f1cd806c7e0 msgr2=0x7f1cd806cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.989+0000 7f1cdfdb4640 1 -- 192.168.123.103:0/3631020535 shutdown_connections 2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.989+0000 7f1cdfdb4640 1 -- 192.168.123.103:0/3631020535 wait complete. 
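The burst above is a single short-lived `ceph orch upgrade status` invocation: the client bootstraps against a mon, pulls the mgrmap, forwards the command to the active mgr, and then tears every connection down before the next client starts. The test repeats this poll between upgrade steps. A minimal sketch of such a poll loop, assuming the JSON form of the command exposes an `in_progress` flag (field name not confirmed by this log):

import json
import subprocess
import time

def upgrade_status():
    # `ceph orch upgrade status --format json`; exact fields vary by release.
    out = subprocess.check_output(
        ["ceph", "orch", "upgrade", "status", "--format", "json"])
    return json.loads(out)

def wait_for_upgrade(timeout=1800, interval=30):
    # Poll until the orchestrator reports no upgrade in progress.
    deadline = time.time() + timeout
    while time.time() < deadline:
        if not upgrade_status().get("in_progress", False):
            return
        time.sleep(interval)
    raise TimeoutError("upgrade still in progress after %ss" % timeout)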
2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.989+0000 7f1cdfdb4640 1 Processor -- start 2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.989+0000 7f1cdfdb4640 1 -- start start 2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.989+0000 7f1cdfdb4640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cd8072140 0x7f1cd807d490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.989+0000 7f1cdfdb4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1cd8072af0 0x7f1cd807d9d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.989+0000 7f1cdfdb4640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1cd80845c0 con 0x7f1cd8072af0 2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.989+0000 7f1cdfdb4640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1cd807df10 con 0x7f1cd8072140 2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.990+0000 7f1cddb29640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cd8072140 0x7f1cd807d490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.990+0000 7f1cddb29640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cd8072140 0x7f1cd807d490 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:46264/0 (socket says 192.168.123.103:46264) 2026-03-09T16:17:14.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.990+0000 7f1cddb29640 1 -- 192.168.123.103:0/4249099113 learned_addr learned my addr 192.168.123.103:0/4249099113 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:17:14.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.990+0000 7f1cddb29640 1 -- 192.168.123.103:0/4249099113 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1cd8072af0 msgr2=0x7f1cd807d9d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:14.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.990+0000 7f1cddb29640 1 --2- 192.168.123.103:0/4249099113 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1cd8072af0 0x7f1cd807d9d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:14.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.990+0000 7f1cddb29640 1 -- 192.168.123.103:0/4249099113 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1cd400b820 con 0x7f1cd8072140 2026-03-09T16:17:14.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.990+0000 7f1cddb29640 1 --2- 192.168.123.103:0/4249099113 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cd8072140 0x7f1cd807d490 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f1cd4031500 tx=0x7f1cd4002750 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:14.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.991+0000 7f1cceffd640 1 -- 192.168.123.103:0/4249099113 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1cd4031aa0 con 0x7f1cd8072140 2026-03-09T16:17:14.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.991+0000 7f1cdfdb4640 1 -- 192.168.123.103:0/4249099113 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1cd807e190 con 0x7f1cd8072140 2026-03-09T16:17:14.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.991+0000 7f1cdfdb4640 1 -- 192.168.123.103:0/4249099113 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1cd8082170 con 0x7f1cd8072140 2026-03-09T16:17:14.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.991+0000 7f1cceffd640 1 -- 192.168.123.103:0/4249099113 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f1cd4031c00 con 0x7f1cd8072140 2026-03-09T16:17:14.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.991+0000 7f1cdfdb4640 1 -- 192.168.123.103:0/4249099113 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1cd8108570 con 0x7f1cd8072140 2026-03-09T16:17:14.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.991+0000 7f1cceffd640 1 -- 192.168.123.103:0/4249099113 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1cd403a510 con 0x7f1cd8072140 2026-03-09T16:17:14.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.995+0000 7f1cceffd640 1 -- 192.168.123.103:0/4249099113 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f1cd4043a30 con 0x7f1cd8072140 2026-03-09T16:17:14.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.996+0000 7f1cceffd640 1 --2- 192.168.123.103:0/4249099113 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1cb4077ad0 0x7f1cb4079f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:14.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.996+0000 7f1cceffd640 1 -- 192.168.123.103:0/4249099113 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(46..46 src has 1..46) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f1cd40bef20 con 0x7f1cd8072140 2026-03-09T16:17:14.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.996+0000 7f1cdd328640 1 --2- 192.168.123.103:0/4249099113 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1cb4077ad0 0x7f1cb4079f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:14.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.997+0000 7f1cdd328640 1 --2- 192.168.123.103:0/4249099113 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1cb4077ad0 0x7f1cb4079f90 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f1cd00062a0 tx=0x7f1cd0006210 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:15.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:14.999+0000 7f1cceffd640 1 -- 192.168.123.103:0/4249099113 <== 
mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1cd40875c0 con 0x7f1cd8072140 2026-03-09T16:17:15.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.122+0000 7f1cdfdb4640 1 -- 192.168.123.103:0/4249099113 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f1cd8082c50 con 0x7f1cb4077ad0 2026-03-09T16:17:15.129 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.128+0000 7f1cceffd640 1 -- 192.168.123.103:0/4249099113 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3600 (secure 0 0 0) 0x7f1cd8082c50 con 0x7f1cb4077ad0 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (91s) 17s ago 7m 16.7M - 0.25.0 c8568f914cd2 61c29cd7a09d 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (7m) 17s ago 7m 9080k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (6m) 10s ago 6m 9839k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (7m) 17s ago 7m 7625k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 05e9be717970 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (6m) 10s ago 6m 7608k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 32f80ccecaa9 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (74s) 17s ago 6m 74.4M - 10.4.0 c8b91775d855 6f4f55eef4bb 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (4m) 17s ago 4m 18.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8e7e3eb06891 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (5m) 17s ago 5m 228M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 f23b1415c23e 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (5m) 10s ago 5m 16.6M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 fbf69f4859f1 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (4m) 10s ago 4m 18.1M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 e7155e6e0a47 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:8443,9283,8765 running (2m) 17s ago 7m 560M - 19.2.3-678-ge911bdeb 654f31e6858e f10e9f43c355 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (2m) 10s ago 6m 494M - 19.2.3-678-ge911bdeb 654f31e6858e 5276dc4902e9 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (29s) 17s ago 7m 44.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f90a2e8dc751 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (12s) 10s ago 6m 32.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b6d6af84a66d 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (2m) 17s ago 7m 9437k - 1.7.0 72c9c2088986 73da4350a8ed 2026-03-09T16:17:15.130 
INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (114s) 10s ago 6m 9466k - 1.7.0 72c9c2088986 0be807a191b0 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (6m) 17s ago 6m 337M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2ea78f0d62f8 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (6m) 17s ago 6m 347M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6169f9824413 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (5m) 17s ago 5m 289M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 31188175e77b 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (5m) 10s ago 5m 447M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d95aab347c9f 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (5m) 10s ago 5m 369M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 5076005b452d 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (5m) 10s ago 5m 318M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 56fb3849b087 2026-03-09T16:17:15.130 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (98s) 17s ago 6m 41.0M - 2.51.0 1d3b7f56885b ce88dd379864 2026-03-09T16:17:15.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.132+0000 7f1cccff9640 1 -- 192.168.123.103:0/4249099113 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1cb4077ad0 msgr2=0x7f1cb4079f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.132+0000 7f1cccff9640 1 --2- 192.168.123.103:0/4249099113 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1cb4077ad0 0x7f1cb4079f90 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f1cd00062a0 tx=0x7f1cd0006210 comp rx=0 tx=0).stop 2026-03-09T16:17:15.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.132+0000 7f1cccff9640 1 -- 192.168.123.103:0/4249099113 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cd8072140 msgr2=0x7f1cd807d490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.132+0000 7f1cccff9640 1 --2- 192.168.123.103:0/4249099113 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cd8072140 0x7f1cd807d490 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f1cd4031500 tx=0x7f1cd4002750 comp rx=0 tx=0).stop 2026-03-09T16:17:15.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.132+0000 7f1cccff9640 1 -- 192.168.123.103:0/4249099113 shutdown_connections 2026-03-09T16:17:15.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.132+0000 7f1cccff9640 1 --2- 192.168.123.103:0/4249099113 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1cb4077ad0 0x7f1cb4079f90 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.132+0000 7f1cccff9640 1 --2- 192.168.123.103:0/4249099113 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1cd8072af0 0x7f1cd807d9d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.132+0000 7f1cccff9640 1 --2- 192.168.123.103:0/4249099113 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cd8072140 0x7f1cd807d490 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.132+0000 7f1cccff9640 1 -- 192.168.123.103:0/4249099113 >> 192.168.123.103:0/4249099113 conn(0x7f1cd806c7e0 msgr2=0x7f1cd80716f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:15.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.132+0000 7f1cccff9640 1 -- 192.168.123.103:0/4249099113 shutdown_connections 2026-03-09T16:17:15.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.132+0000 7f1cccff9640 1 -- 192.168.123.103:0/4249099113 wait complete. 2026-03-09T16:17:15.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.202+0000 7f671289e640 1 -- 192.168.123.103:0/3971943508 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f670c1029d0 msgr2=0x7f670c102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.202+0000 7f671289e640 1 --2- 192.168.123.103:0/3971943508 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f670c1029d0 0x7f670c102e30 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f66fc0098e0 tx=0x7f66fc02f1b0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.202+0000 7f671289e640 1 -- 192.168.123.103:0/3971943508 shutdown_connections 2026-03-09T16:17:15.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.202+0000 7f671289e640 1 --2- 192.168.123.103:0/3971943508 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f670c1029d0 0x7f670c102e30 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.202+0000 7f671289e640 1 --2- 192.168.123.103:0/3971943508 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f670c1089d0 0x7f670c108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.202+0000 7f671289e640 1 -- 192.168.123.103:0/3971943508 >> 192.168.123.103:0/3971943508 conn(0x7f670c0fe710 msgr2=0x7f670c100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:15.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.203+0000 7f671289e640 1 -- 192.168.123.103:0/3971943508 shutdown_connections 2026-03-09T16:17:15.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.203+0000 7f671289e640 1 -- 192.168.123.103:0/3971943508 wait complete. 
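The `ceph orch ps` listing above shows the staggered state mid-run: both mon and both mgr daemons already report 19.2.3 (squid) from image 654f31e6858e, while the osd, mds, crash and ceph-exporter daemons are still on 18.2.7 (reef) from image b6fe7eb6a9d0, and the monitoring stack keeps its own versions. A small tally over the JSON form of the same command makes that easy to assert, assuming `ceph orch ps --format json` exposes `daemon_type` and `version` per daemon:

import collections
import json
import subprocess

def versions_by_daemon_type():
    out = subprocess.check_output(["ceph", "orch", "ps", "--format", "json"])
    tally = collections.defaultdict(collections.Counter)
    for d in json.loads(out):
        tally[d["daemon_type"]][d.get("version", "unknown")] += 1
    return tally

# At this point one would expect every mon/mgr entry on the 19.2.x build
# while osd/mds entries still report the 18.2.x build.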
2026-03-09T16:17:15.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.203+0000 7f671289e640 1 Processor -- start 2026-03-09T16:17:15.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.203+0000 7f671289e640 1 -- start start 2026-03-09T16:17:15.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.203+0000 7f671289e640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f670c1029d0 0x7f670c06b6b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:15.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.203+0000 7f671289e640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f670c1089d0 0x7f670c06bbf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:15.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.203+0000 7f671289e640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f670c06c1c0 con 0x7f670c1089d0 2026-03-09T16:17:15.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.203+0000 7f671289e640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f670c06c330 con 0x7f670c1029d0 2026-03-09T16:17:15.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.203+0000 7f670bfff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f670c1029d0 0x7f670c06b6b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:15.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.204+0000 7f670bfff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f670c1029d0 0x7f670c06b6b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:46288/0 (socket says 192.168.123.103:46288) 2026-03-09T16:17:15.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.204+0000 7f670bfff640 1 -- 192.168.123.103:0/3846469616 learned_addr learned my addr 192.168.123.103:0/3846469616 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:17:15.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.204+0000 7f670b7fe640 1 --2- 192.168.123.103:0/3846469616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f670c1089d0 0x7f670c06bbf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:15.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.204+0000 7f670b7fe640 1 -- 192.168.123.103:0/3846469616 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f670c1029d0 msgr2=0x7f670c06b6b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.204+0000 7f670b7fe640 1 --2- 192.168.123.103:0/3846469616 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f670c1029d0 0x7f670c06b6b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.204+0000 7f670b7fe640 1 -- 192.168.123.103:0/3846469616 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f66fc009590 con 0x7f670c1089d0 2026-03-09T16:17:15.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.204+0000 7f670b7fe640 1 --2- 192.168.123.103:0/3846469616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f670c1089d0 0x7f670c06bbf0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f66fc0098e0 tx=0x7f66fc031c70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:15.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.204+0000 7f67097fa640 1 -- 192.168.123.103:0/3846469616 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f66fc03d070 con 0x7f670c1089d0 2026-03-09T16:17:15.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.205+0000 7f67097fa640 1 -- 192.168.123.103:0/3846469616 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f66fc02fe90 con 0x7f670c1089d0 2026-03-09T16:17:15.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.206+0000 7f67097fa640 1 -- 192.168.123.103:0/3846469616 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f66fc031700 con 0x7f670c1089d0 2026-03-09T16:17:15.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.206+0000 7f671289e640 1 -- 192.168.123.103:0/3846469616 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f670c110b90 con 0x7f670c1089d0 2026-03-09T16:17:15.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.206+0000 7f671289e640 1 -- 192.168.123.103:0/3846469616 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f670c111050 con 0x7f670c1089d0 2026-03-09T16:17:15.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.208+0000 7f67097fa640 1 -- 192.168.123.103:0/3846469616 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f66fc02f9d0 con 0x7f670c1089d0 2026-03-09T16:17:15.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.208+0000 7f671289e640 1 -- 192.168.123.103:0/3846469616 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f670c104110 con 0x7f670c1089d0 2026-03-09T16:17:15.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.208+0000 7f67097fa640 1 --2- 192.168.123.103:0/3846469616 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f66e0077840 0x7f66e0079d00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:15.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.209+0000 7f67097fa640 1 -- 192.168.123.103:0/3846469616 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(46..46 src has 1..46) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f66fc0be470 con 0x7f670c1089d0 2026-03-09T16:17:15.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.212+0000 7f670bfff640 1 --2- 192.168.123.103:0/3846469616 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f66e0077840 0x7f66e0079d00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:15.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.212+0000 7f670bfff640 1 --2- 192.168.123.103:0/3846469616 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f66e0077840 0x7f66e0079d00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f66f8009ea0 tx=0x7f66f8009340 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:15.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.213+0000 7f67097fa640 1 -- 192.168.123.103:0/3846469616 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f66fc086ab0 con 0x7f670c1089d0 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: Reconfiguring osd.5 (monmap changed)... 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: Reconfiguring daemon osd.5 on vm05 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: from='client.34130 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: Reconfiguring mds.cephfs.vm05.jgzfvu (monmap changed)... 
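The journal entries above are cephadm reacting to the monmap change after the mons were redeployed: for each affected daemon it refreshes the keyring (`auth get` / `auth get-or-create`), regenerates the minimal ceph.conf (`config generate-minimal-conf`), and reconfigures the daemon on its host. Expressed as CLI calls, the mds case looks roughly like the sketch below; cephadm actually issues these as mon commands from the mgr, so this is only an illustration:

import subprocess

def refresh_daemon_config(entity, caps):
    # Refresh the daemon keyring with the caps seen in the audit log ...
    subprocess.check_output(["ceph", "auth", "get-or-create", entity] + caps)
    # ... and regenerate the minimal ceph.conf handed to the daemon.
    return subprocess.check_output(["ceph", "config", "generate-minimal-conf"])

refresh_daemon_config(
    "mds.cephfs.vm05.jgzfvu",
    ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"])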
2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.jgzfvu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: Reconfiguring daemon mds.cephfs.vm05.jgzfvu on vm05 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: from='client.44105 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.sqhria", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:17:15.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:15 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:15.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.356+0000 7f671289e640 1 -- 192.168.123.103:0/3846469616 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f670c102e60 con 0x7f670c1089d0 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: Reconfiguring osd.5 (monmap changed)... 
2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: Reconfiguring daemon osd.5 on vm05 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: from='client.34130 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: Reconfiguring mds.cephfs.vm05.jgzfvu (monmap changed)... 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.jgzfvu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: Reconfiguring daemon mds.cephfs.vm05.jgzfvu on vm05 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: from='client.44105 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:15.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:15.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.sqhria", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:17:15.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:15 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.357+0000 7f67097fa640 1 -- 192.168.123.103:0/3846469616 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+739 
(secure 0 0 0) 0x7f66fc08d300 con 0x7f670c1089d0 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 10, 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T16:17:15.361 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:17:15.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.363+0000 7f671289e640 1 -- 192.168.123.103:0/3846469616 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f66e0077840 msgr2=0x7f66e0079d00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.363+0000 7f671289e640 1 --2- 192.168.123.103:0/3846469616 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f66e0077840 0x7f66e0079d00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f66f8009ea0 tx=0x7f66f8009340 comp rx=0 tx=0).stop 2026-03-09T16:17:15.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.363+0000 7f671289e640 1 -- 192.168.123.103:0/3846469616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f670c1089d0 msgr2=0x7f670c06bbf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.363+0000 7f671289e640 1 --2- 192.168.123.103:0/3846469616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f670c1089d0 0x7f670c06bbf0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f66fc0098e0 tx=0x7f66fc031c70 comp rx=0 tx=0).stop 2026-03-09T16:17:15.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.363+0000 7f671289e640 1 -- 192.168.123.103:0/3846469616 shutdown_connections 2026-03-09T16:17:15.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.363+0000 7f671289e640 1 --2- 192.168.123.103:0/3846469616 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f66e0077840 0x7f66e0079d00 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.363+0000 7f671289e640 1 --2- 192.168.123.103:0/3846469616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f670c1089d0 0x7f670c06bbf0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.363+0000 7f671289e640 1 --2- 192.168.123.103:0/3846469616 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f670c1029d0 0x7f670c06b6b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.363+0000 7f671289e640 1 -- 192.168.123.103:0/3846469616 >> 192.168.123.103:0/3846469616 conn(0x7f670c0fe710 msgr2=0x7f670c077830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:15.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.363+0000 7f671289e640 1 -- 192.168.123.103:0/3846469616 shutdown_connections 2026-03-09T16:17:15.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.363+0000 7f671289e640 1 -- 192.168.123.103:0/3846469616 wait complete. 2026-03-09T16:17:15.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.442+0000 7ffb5159e640 1 -- 192.168.123.103:0/846849132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb4c100780 msgr2=0x7ffb4c100be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.442+0000 7ffb5159e640 1 --2- 192.168.123.103:0/846849132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb4c100780 0x7ffb4c100be0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7ffb340099b0 tx=0x7ffb3402f220 comp rx=0 tx=0).stop 2026-03-09T16:17:15.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.443+0000 7ffb5159e640 1 -- 192.168.123.103:0/846849132 shutdown_connections 2026-03-09T16:17:15.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.443+0000 7ffb5159e640 1 --2- 192.168.123.103:0/846849132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb4c100780 0x7ffb4c100be0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.443+0000 7ffb5159e640 1 --2- 192.168.123.103:0/846849132 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb4c106780 0x7ffb4c106b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.443+0000 7ffb5159e640 1 -- 192.168.123.103:0/846849132 >> 192.168.123.103:0/846849132 conn(0x7ffb4c0fc460 msgr2=0x7ffb4c0fe880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:15.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.444+0000 7ffb5159e640 1 -- 192.168.123.103:0/846849132 shutdown_connections 2026-03-09T16:17:15.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.444+0000 7ffb5159e640 1 -- 192.168.123.103:0/846849132 wait complete. 
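The `ceph versions` breakdown above is the checkpoint that matters for a staggered upgrade: 2 mons and 2 mgrs already on 19.2.3 (squid), all 6 OSDs and 4 MDS daemons still on 18.2.7 (reef), 10 old plus 4 new overall. The command already emits JSON keyed by daemon type, so a check can be written directly against it; the parsing below only assumes each version banner ends with "<release> (stable)" as in the output above:

import json
import subprocess

vers = json.loads(subprocess.check_output(["ceph", "versions"]))

def release_counts(daemon):
    out = {}
    for banner, n in vers.get(daemon, {}).items():
        release = banner.rsplit(" ", 2)[-2]   # e.g. "squid" or "reef"
        out[release] = out.get(release, 0) + n
    return out

assert release_counts("mon") == {"squid": 2}
assert release_counts("mgr") == {"squid": 2}
assert set(release_counts("osd")) == {"reef"}   # OSDs not upgraded yet
assert set(release_counts("mds")) == {"reef"}   # MDS daemons not upgraded yet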
2026-03-09T16:17:15.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.444+0000 7ffb5159e640 1 Processor -- start 2026-03-09T16:17:15.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.445+0000 7ffb5159e640 1 -- start start 2026-03-09T16:17:15.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.445+0000 7ffb5159e640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb4c100780 0x7ffb4c073470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:15.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.445+0000 7ffb5159e640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb4c106780 0x7ffb4c0739b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:15.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.445+0000 7ffb5159e640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffb4c075480 con 0x7ffb4c100780 2026-03-09T16:17:15.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.445+0000 7ffb5159e640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffb4c073ef0 con 0x7ffb4c106780 2026-03-09T16:17:15.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.445+0000 7ffb4affd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb4c100780 0x7ffb4c073470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:15.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.445+0000 7ffb4affd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb4c100780 0x7ffb4c073470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47232/0 (socket says 192.168.123.103:47232) 2026-03-09T16:17:15.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.445+0000 7ffb4affd640 1 -- 192.168.123.103:0/3210485820 learned_addr learned my addr 192.168.123.103:0/3210485820 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:17:15.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.445+0000 7ffb4affd640 1 -- 192.168.123.103:0/3210485820 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb4c106780 msgr2=0x7ffb4c0739b0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T16:17:15.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.446+0000 7ffb4a7fc640 1 --2- 192.168.123.103:0/3210485820 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb4c106780 0x7ffb4c0739b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:15.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.446+0000 7ffb4affd640 1 --2- 192.168.123.103:0/3210485820 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb4c106780 0x7ffb4c0739b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.446+0000 7ffb4affd640 1 -- 192.168.123.103:0/3210485820 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ffb34009660 con 
0x7ffb4c100780 2026-03-09T16:17:15.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.446+0000 7ffb4affd640 1 --2- 192.168.123.103:0/3210485820 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb4c100780 0x7ffb4c073470 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7ffb4000da40 tx=0x7ffb4000df10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:15.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.446+0000 7ffb2bfff640 1 -- 192.168.123.103:0/3210485820 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffb40004280 con 0x7ffb4c100780 2026-03-09T16:17:15.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.446+0000 7ffb2bfff640 1 -- 192.168.123.103:0/3210485820 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ffb4000be10 con 0x7ffb4c100780 2026-03-09T16:17:15.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.446+0000 7ffb2bfff640 1 -- 192.168.123.103:0/3210485820 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffb40005230 con 0x7ffb4c100780 2026-03-09T16:17:15.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.446+0000 7ffb5159e640 1 -- 192.168.123.103:0/3210485820 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ffb4c0741d0 con 0x7ffb4c100780 2026-03-09T16:17:15.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.446+0000 7ffb5159e640 1 -- 192.168.123.103:0/3210485820 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ffb4c1a4c00 con 0x7ffb4c100780 2026-03-09T16:17:15.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.450+0000 7ffb5159e640 1 -- 192.168.123.103:0/3210485820 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ffb10005350 con 0x7ffb4c100780 2026-03-09T16:17:15.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.450+0000 7ffb2bfff640 1 -- 192.168.123.103:0/3210485820 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7ffb40004930 con 0x7ffb4c100780 2026-03-09T16:17:15.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.451+0000 7ffb2bfff640 1 --2- 192.168.123.103:0/3210485820 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ffb20077a00 0x7ffb20079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:15.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.451+0000 7ffb2bfff640 1 -- 192.168.123.103:0/3210485820 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(46..46 src has 1..46) v4 ==== 6394+0+0 (secure 0 0 0) 0x7ffb40099ba0 con 0x7ffb4c100780 2026-03-09T16:17:15.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.453+0000 7ffb4a7fc640 1 --2- 192.168.123.103:0/3210485820 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ffb20077a00 0x7ffb20079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:15.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.453+0000 7ffb2bfff640 1 -- 192.168.123.103:0/3210485820 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ffb400620b0 con 0x7ffb4c100780 2026-03-09T16:17:15.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.453+0000 7ffb4a7fc640 1 --2- 192.168.123.103:0/3210485820 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ffb20077a00 0x7ffb20079ec0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7ffb4c074ba0 tx=0x7ffb34005990 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:15.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.585+0000 7ffb5159e640 1 -- 192.168.123.103:0/3210485820 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7ffb100058d0 con 0x7ffb4c100780 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.588+0000 7ffb2bfff640 1 -- 192.168.123.103:0/3210485820 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1919 (secure 0 0 0) 0x7ffb40061800 con 0x7ffb4c100780 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:e12 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:epoch 12 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T16:12:12.560035+0000 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T16:12:21.661284+0000 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-09T16:17:15.589 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 41 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor 
table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:in 0 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14476} 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members: 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kygyjl{0:14476} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons: 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kntrco{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.sqhria{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.jgzfvu{-1:24291} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6824/1621230713,v1:192.168.123.105:6825/1621230713] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:17:15.590 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 12 2026-03-09T16:17:15.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.592+0000 7ffb29ffb640 1 -- 192.168.123.103:0/3210485820 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ffb20077a00 msgr2=0x7ffb20079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.592+0000 7ffb29ffb640 1 --2- 192.168.123.103:0/3210485820 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ffb20077a00 0x7ffb20079ec0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7ffb4c074ba0 tx=0x7ffb34005990 comp rx=0 tx=0).stop 2026-03-09T16:17:15.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.592+0000 7ffb29ffb640 1 -- 192.168.123.103:0/3210485820 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb4c100780 msgr2=0x7ffb4c073470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.592+0000 7ffb29ffb640 1 --2- 192.168.123.103:0/3210485820 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb4c100780 0x7ffb4c073470 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7ffb4000da40 tx=0x7ffb4000df10 comp rx=0 tx=0).stop 2026-03-09T16:17:15.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.592+0000 7ffb29ffb640 1 -- 192.168.123.103:0/3210485820 shutdown_connections 2026-03-09T16:17:15.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.592+0000 7ffb29ffb640 1 --2- 192.168.123.103:0/3210485820 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ffb20077a00 0x7ffb20079ec0 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.592+0000 7ffb29ffb640 1 --2- 192.168.123.103:0/3210485820 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb4c106780 0x7ffb4c0739b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.592+0000 7ffb29ffb640 1 --2- 192.168.123.103:0/3210485820 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb4c100780 0x7ffb4c073470 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.592+0000 7ffb29ffb640 1 -- 192.168.123.103:0/3210485820 >> 192.168.123.103:0/3210485820 conn(0x7ffb4c0fc460 msgr2=0x7ffb4c10a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:15.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.592+0000 7ffb29ffb640 1 -- 192.168.123.103:0/3210485820 shutdown_connections 2026-03-09T16:17:15.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.592+0000 7ffb29ffb640 1 -- 192.168.123.103:0/3210485820 wait complete. 
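The `fs dump` above confirms the filesystem state before the MDS daemons are touched: max_mds 1 with a single active rank held by mds.cephfs.vm03.kygyjl, three standbys available, and inline_data still enabled. A hedged sketch of checking the same fields programmatically, assuming `ceph fs dump --format json` exposes them under filesystems[].mdsmap (field names inferred from the text dump, not confirmed here):

import json
import subprocess

dump = json.loads(subprocess.check_output(
    ["ceph", "fs", "dump", "--format", "json"]))
mdsmap = dump["filesystems"][0]["mdsmap"]

assert mdsmap["max_mds"] == 1
assert len(mdsmap["up"]) == 1            # exactly one active rank
assert mdsmap.get("inline_data", False)  # inline data left enabled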
2026-03-09T16:17:15.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.661+0000 7f5706ccd640 1 -- 192.168.123.103:0/3853350564 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5700072140 msgr2=0x7f5700072520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.661+0000 7f5706ccd640 1 --2- 192.168.123.103:0/3853350564 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5700072140 0x7f5700072520 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f56f00098e0 tx=0x7f56f002f1b0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.661+0000 7f5706ccd640 1 -- 192.168.123.103:0/3853350564 shutdown_connections 2026-03-09T16:17:15.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.661+0000 7f5706ccd640 1 --2- 192.168.123.103:0/3853350564 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5700072af0 0x7f570010ba70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.661+0000 7f5706ccd640 1 --2- 192.168.123.103:0/3853350564 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5700072140 0x7f5700072520 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.661+0000 7f5706ccd640 1 -- 192.168.123.103:0/3853350564 >> 192.168.123.103:0/3853350564 conn(0x7f570006c7e0 msgr2=0x7f570006cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:15.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.661+0000 7f5706ccd640 1 -- 192.168.123.103:0/3853350564 shutdown_connections 2026-03-09T16:17:15.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.661+0000 7f5706ccd640 1 -- 192.168.123.103:0/3853350564 wait complete. 
2026-03-09T16:17:15.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.661+0000 7f5706ccd640 1 Processor -- start 2026-03-09T16:17:15.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.661+0000 7f5706ccd640 1 -- start start 2026-03-09T16:17:15.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.662+0000 7f5706ccd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5700072af0 0x7f570007d490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:15.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.662+0000 7f5706ccd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57000843b0 0x7f570007d9d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:15.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.662+0000 7f5706ccd640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f570007e0b0 con 0x7f5700072af0 2026-03-09T16:17:15.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.662+0000 7f5706ccd640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f570007e220 con 0x7f57000843b0 2026-03-09T16:17:15.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.662+0000 7f56fffff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57000843b0 0x7f570007d9d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:15.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.662+0000 7f56fffff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57000843b0 0x7f570007d9d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:46308/0 (socket says 192.168.123.103:46308) 2026-03-09T16:17:15.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.662+0000 7f56fffff640 1 -- 192.168.123.103:0/900250306 learned_addr learned my addr 192.168.123.103:0/900250306 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:17:15.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.662+0000 7f56fffff640 1 -- 192.168.123.103:0/900250306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5700072af0 msgr2=0x7f570007d490 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.662+0000 7f56fffff640 1 --2- 192.168.123.103:0/900250306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5700072af0 0x7f570007d490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.662+0000 7f56fffff640 1 -- 192.168.123.103:0/900250306 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f56f0009590 con 0x7f57000843b0 2026-03-09T16:17:15.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.663+0000 7f56fffff640 1 --2- 192.168.123.103:0/900250306 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57000843b0 0x7f570007d9d0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f56f800ea40 tx=0x7f56f800ef10 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:15.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.663+0000 7f56fdffb640 1 -- 192.168.123.103:0/900250306 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f56f800ce60 con 0x7f57000843b0 2026-03-09T16:17:15.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.663+0000 7f5706ccd640 1 -- 192.168.123.103:0/900250306 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5700081f50 con 0x7f57000843b0 2026-03-09T16:17:15.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.663+0000 7f5706ccd640 1 -- 192.168.123.103:0/900250306 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f57000824a0 con 0x7f57000843b0 2026-03-09T16:17:15.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.664+0000 7f56fdffb640 1 -- 192.168.123.103:0/900250306 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f56f80040d0 con 0x7f57000843b0 2026-03-09T16:17:15.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.664+0000 7f56fdffb640 1 -- 192.168.123.103:0/900250306 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f56f8016710 con 0x7f57000843b0 2026-03-09T16:17:15.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.664+0000 7f5706ccd640 1 -- 192.168.123.103:0/900250306 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5700108570 con 0x7f57000843b0 2026-03-09T16:17:15.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.666+0000 7f56fdffb640 1 -- 192.168.123.103:0/900250306 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f56f8004240 con 0x7f57000843b0 2026-03-09T16:17:15.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.666+0000 7f56fdffb640 1 --2- 192.168.123.103:0/900250306 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f56d0077ad0 0x7f56d0079f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:15.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.666+0000 7f56fdffb640 1 -- 192.168.123.103:0/900250306 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(46..46 src has 1..46) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f56f809a970 con 0x7f57000843b0 2026-03-09T16:17:15.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.667+0000 7f5704a42640 1 --2- 192.168.123.103:0/900250306 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f56d0077ad0 0x7f56d0079f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:15.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.667+0000 7f5704a42640 1 --2- 192.168.123.103:0/900250306 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f56d0077ad0 0x7f56d0079f90 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f56f002f6c0 tx=0x7f56f00094d0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:15.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.668+0000 7f56fdffb640 1 -- 192.168.123.103:0/900250306 <== mon.1 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f56f8062f00 con 0x7f57000843b0 2026-03-09T16:17:15.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.837+0000 7f5706ccd640 1 -- 192.168.123.103:0/900250306 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f5700079820 con 0x7f56d0077ad0 2026-03-09T16:17:15.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.839+0000 7f56fdffb640 1 -- 192.168.123.103:0/900250306 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+385 (secure 0 0 0) 0x7f5700079820 con 0x7f56d0077ad0 2026-03-09T16:17:15.840 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:17:15.840 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T16:17:15.840 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T16:17:15.840 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T16:17:15.840 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-09T16:17:15.840 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-09T16:17:15.840 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-09T16:17:15.840 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-09T16:17:15.840 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "4/23 daemons upgraded", 2026-03-09T16:17:15.840 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading crash daemons", 2026-03-09T16:17:15.840 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T16:17:15.841 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:17:15.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.843+0000 7f56df7fe640 1 -- 192.168.123.103:0/900250306 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f56d0077ad0 msgr2=0x7f56d0079f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.843+0000 7f56df7fe640 1 --2- 192.168.123.103:0/900250306 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f56d0077ad0 0x7f56d0079f90 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f56f002f6c0 tx=0x7f56f00094d0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.843+0000 7f56df7fe640 1 -- 192.168.123.103:0/900250306 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57000843b0 msgr2=0x7f570007d9d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.843+0000 7f56df7fe640 1 --2- 192.168.123.103:0/900250306 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57000843b0 0x7f570007d9d0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f56f800ea40 tx=0x7f56f800ef10 comp rx=0 tx=0).stop 2026-03-09T16:17:15.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.843+0000 7f56df7fe640 1 -- 192.168.123.103:0/900250306 shutdown_connections 2026-03-09T16:17:15.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.843+0000 7f56df7fe640 1 --2- 192.168.123.103:0/900250306 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f56d0077ad0 0x7f56d0079f90 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.843+0000 7f56df7fe640 1 --2- 192.168.123.103:0/900250306 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57000843b0 0x7f570007d9d0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.843+0000 7f56df7fe640 1 --2- 192.168.123.103:0/900250306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5700072af0 0x7f570007d490 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.843+0000 7f56df7fe640 1 -- 192.168.123.103:0/900250306 >> 192.168.123.103:0/900250306 conn(0x7f570006c7e0 msgr2=0x7f570006f360 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:15.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.843+0000 7f56df7fe640 1 -- 192.168.123.103:0/900250306 shutdown_connections 2026-03-09T16:17:15.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.843+0000 7f56df7fe640 1 -- 192.168.123.103:0/900250306 wait complete. 2026-03-09T16:17:15.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.967+0000 7fcd6e027640 1 -- 192.168.123.103:0/2806570918 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd68073e30 msgr2=0x7fcd6810c990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.967+0000 7fcd6e027640 1 --2- 192.168.123.103:0/2806570918 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd68073e30 0x7fcd6810c990 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fcd500099b0 tx=0x7fcd5002f2d0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.969+0000 7fcd6e027640 1 -- 192.168.123.103:0/2806570918 shutdown_connections 2026-03-09T16:17:15.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.969+0000 7fcd6e027640 1 --2- 192.168.123.103:0/2806570918 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd68073e30 0x7fcd6810c990 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.969+0000 7fcd6e027640 1 --2- 192.168.123.103:0/2806570918 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd68073510 0x7fcd680738f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.969+0000 7fcd6e027640 1 -- 192.168.123.103:0/2806570918 >> 192.168.123.103:0/2806570918 conn(0x7fcd680fc290 msgr2=0x7fcd680fe6b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:15.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.970+0000 7fcd6e027640 1 -- 192.168.123.103:0/2806570918 shutdown_connections 2026-03-09T16:17:15.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.970+0000 7fcd6e027640 1 -- 192.168.123.103:0/2806570918 wait complete. 
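The JSON block above is the reply to `ceph orch upgrade status`: mon and mgr are already complete, 4/23 daemons are upgraded, and cephadm is currently working through the crash daemons. A small illustrative polling loop follows; the field names ("in_progress", "services_complete", "progress", "message", "is_paused") are taken from the reply shown in the log, while the polling interval is an arbitrary choice.

```python
# Illustrative polling loop (not from the test): wait until `ceph orch upgrade
# status` reports that no upgrade is in progress, echoing progress meanwhile.
import json
import subprocess
import time

def wait_for_upgrade(poll_seconds=30):
    while True:
        # The command already returns JSON, as seen in the log output above.
        raw = subprocess.check_output(["ceph", "orch", "upgrade", "status"])
        status = json.loads(raw)
        if not status.get("in_progress"):
            print("upgrade complete (or not running)")
            return
        state = "paused" if status.get("is_paused") else "running"
        print(f'{status.get("progress")} - {status.get("message")} ({state})')
        time.sleep(poll_seconds)
```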
2026-03-09T16:17:15.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.970+0000 7fcd6e027640 1 Processor -- start 2026-03-09T16:17:15.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.970+0000 7fcd6e027640 1 -- start start 2026-03-09T16:17:15.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.970+0000 7fcd6e027640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd68073510 0x7fcd6819a770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:15.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.970+0000 7fcd6e027640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd68073e30 0x7fcd6819acb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:15.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.970+0000 7fcd6e027640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd6819b390 con 0x7fcd68073510 2026-03-09T16:17:15.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.970+0000 7fcd6e027640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd6819e000 con 0x7fcd68073e30 2026-03-09T16:17:15.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.970+0000 7fcd6c824640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd68073e30 0x7fcd6819acb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:15.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.970+0000 7fcd6c824640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd68073e30 0x7fcd6819acb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:46314/0 (socket says 192.168.123.103:46314) 2026-03-09T16:17:15.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.970+0000 7fcd6c824640 1 -- 192.168.123.103:0/3728415784 learned_addr learned my addr 192.168.123.103:0/3728415784 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:17:15.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.971+0000 7fcd6c824640 1 -- 192.168.123.103:0/3728415784 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd68073510 msgr2=0x7fcd6819a770 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:15.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.971+0000 7fcd6c824640 1 --2- 192.168.123.103:0/3728415784 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd68073510 0x7fcd6819a770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:15.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.971+0000 7fcd6c824640 1 -- 192.168.123.103:0/3728415784 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcd50009660 con 0x7fcd68073e30 2026-03-09T16:17:15.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.971+0000 7fcd6c824640 1 --2- 192.168.123.103:0/3728415784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd68073e30 0x7fcd6819acb0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fcd50002410 tx=0x7fcd50004290 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:15.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.972+0000 7fcd5e7fc640 1 -- 192.168.123.103:0/3728415784 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd5003d070 con 0x7fcd68073e30 2026-03-09T16:17:15.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.972+0000 7fcd6e027640 1 -- 192.168.123.103:0/3728415784 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcd6819e280 con 0x7fcd68073e30 2026-03-09T16:17:15.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.972+0000 7fcd6e027640 1 -- 192.168.123.103:0/3728415784 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcd6819e770 con 0x7fcd68073e30 2026-03-09T16:17:15.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.972+0000 7fcd5e7fc640 1 -- 192.168.123.103:0/3728415784 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fcd50031ec0 con 0x7fcd68073e30 2026-03-09T16:17:15.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.972+0000 7fcd5e7fc640 1 -- 192.168.123.103:0/3728415784 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd50038470 con 0x7fcd68073e30 2026-03-09T16:17:15.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.974+0000 7fcd6e027640 1 -- 192.168.123.103:0/3728415784 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcd68074c50 con 0x7fcd68073e30 2026-03-09T16:17:15.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.974+0000 7fcd5e7fc640 1 -- 192.168.123.103:0/3728415784 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fcd50049050 con 0x7fcd68073e30 2026-03-09T16:17:15.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.974+0000 7fcd5e7fc640 1 --2- 192.168.123.103:0/3728415784 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fcd44077ad0 0x7fcd44079f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:15.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.974+0000 7fcd5e7fc640 1 -- 192.168.123.103:0/3728415784 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(46..46 src has 1..46) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fcd500be6c0 con 0x7fcd68073e30 2026-03-09T16:17:15.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.980+0000 7fcd6d025640 1 --2- 192.168.123.103:0/3728415784 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fcd44077ad0 0x7fcd44079f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:15.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.980+0000 7fcd6d025640 1 --2- 192.168.123.103:0/3728415784 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fcd44077ad0 0x7fcd44079f90 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fcd580098a0 tx=0x7fcd58006d90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:15.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:15.980+0000 7fcd5e7fc640 1 -- 192.168.123.103:0/3728415784 <== 
mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcd50086c50 con 0x7fcd68073e30 2026-03-09T16:17:16.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:16.235+0000 7fcd6e027640 1 -- 192.168.123.103:0/3728415784 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fcd6810fb70 con 0x7fcd68073e30 2026-03-09T16:17:16.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:16.244+0000 7fcd5e7fc640 1 -- 192.168.123.103:0/3728415784 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7fcd500863a0 con 0x7fcd68073e30 2026-03-09T16:17:16.245 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T16:17:16.245 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T16:17:16.245 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled. 2026-03-09T16:17:16.249 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:16.248+0000 7fcd37fff640 1 -- 192.168.123.103:0/3728415784 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fcd44077ad0 msgr2=0x7fcd44079f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:16.249 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:16.248+0000 7fcd37fff640 1 --2- 192.168.123.103:0/3728415784 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fcd44077ad0 0x7fcd44079f90 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fcd580098a0 tx=0x7fcd58006d90 comp rx=0 tx=0).stop 2026-03-09T16:17:16.249 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:16.248+0000 7fcd37fff640 1 -- 192.168.123.103:0/3728415784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd68073e30 msgr2=0x7fcd6819acb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:16.249 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:16.248+0000 7fcd37fff640 1 --2- 192.168.123.103:0/3728415784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd68073e30 0x7fcd6819acb0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fcd50002410 tx=0x7fcd50004290 comp rx=0 tx=0).stop 2026-03-09T16:17:16.252 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:16.252+0000 7fcd37fff640 1 -- 192.168.123.103:0/3728415784 shutdown_connections 2026-03-09T16:17:16.252 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:16.252+0000 7fcd37fff640 1 --2- 192.168.123.103:0/3728415784 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fcd44077ad0 0x7fcd44079f90 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:16.252 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:16.252+0000 7fcd37fff640 1 --2- 192.168.123.103:0/3728415784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd68073e30 0x7fcd6819acb0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:16.252 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:16.252+0000 7fcd37fff640 1 --2- 192.168.123.103:0/3728415784 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcd68073510 0x7fcd6819a770 
unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:16.252 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:16.252+0000 7fcd37fff640 1 -- 192.168.123.103:0/3728415784 >> 192.168.123.103:0/3728415784 conn(0x7fcd680fc290 msgr2=0x7fcd6810bf70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:16.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:16.253+0000 7fcd37fff640 1 -- 192.168.123.103:0/3728415784 shutdown_connections 2026-03-09T16:17:16.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:16.253+0000 7fcd37fff640 1 -- 192.168.123.103:0/3728415784 wait complete. 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: from='client.44109 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: Reconfiguring mds.cephfs.vm05.sqhria (monmap changed)... 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: Reconfiguring daemon mds.cephfs.vm05.sqhria on vm05 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/3846469616' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: from='client.? 
192.168.123.103:0/3210485820' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: Upgrade: Setting container_image for all mon 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm03"}]: dispatch 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm03"}]': finished 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm05"}]: dispatch 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm05"}]': finished 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: from='client.44119 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:16.525 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:16 vm03.local ceph-mon[133973]: pgmap v14: 65 pgs: 65 active+clean; 262 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.4 MiB/s wr, 212 op/s 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: from='client.44109 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: Reconfiguring mds.cephfs.vm05.sqhria (monmap changed)... 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: Reconfiguring daemon mds.cephfs.vm05.sqhria on vm05 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: from='client.? 
192.168.123.103:0/3846469616' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/3210485820' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: Upgrade: Setting container_image for all mon 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm03"}]: dispatch 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm03"}]': finished 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm05"}]: dispatch 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm05"}]': finished 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: from='client.44119 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:16 vm05.local ceph-mon[108543]: pgmap v14: 65 pgs: 65 active+clean; 262 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.4 MiB/s wr, 212 op/s 2026-03-09T16:17:17.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:17 vm03.local ceph-mon[133973]: from='client.? 
192.168.123.103:0/3728415784' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:17:17.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:17 vm03.local ceph-mon[133973]: Upgrade: Updating crash.vm03 (1/2) 2026-03-09T16:17:17.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:17.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:17:17.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:17.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:17 vm03.local ceph-mon[133973]: Deploying daemon crash.vm03 on vm03 2026-03-09T16:17:17.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:17 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/3728415784' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:17:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:17 vm05.local ceph-mon[108543]: Upgrade: Updating crash.vm03 (1/2) 2026-03-09T16:17:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:17:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:17 vm05.local ceph-mon[108543]: Deploying daemon crash.vm03 on vm03 2026-03-09T16:17:18.772 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.1... 
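Between the health check and the workunit shutdown, cephadm moves on to the crash daemons: the `health detail` reply above (16:17:16) shows only the FS_INLINE_DATA_DEPRECATED warning that follows from inline_data being enabled on cephfs, and crash.vm03 / crash.vm05 are then redeployed on the target image. A hedged sketch of the same kind of health gate is below; the JSON schema of `ceph health detail --format json` ("status", "checks") is assumed from recent releases, and the allow-list is illustrative.

```python
# Illustrative health gate (not teuthology code): fetch `ceph health detail`
# as JSON and accept only the warning actually present in the log above.
import json
import subprocess

ALLOWED_CHECKS = {"FS_INLINE_DATA_DEPRECATED"}  # the single warning seen here

def health_ok() -> bool:
    # Schema assumption: {"status": "...", "checks": {"CHECK_NAME": {...}, ...}}
    raw = subprocess.check_output(["ceph", "health", "detail", "--format", "json"])
    health = json.loads(raw)
    unexpected = set(health.get("checks", {})) - ALLOWED_CHECKS
    return health.get("status") != "HEALTH_ERR" and not unexpected

if __name__ == "__main__":
    print("health gate passed:", health_ok())
```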
2026-03-09T16:17:18.772 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.1 /home/ubuntu/cephtest/clone.client.1 2026-03-09T16:17:18.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:18 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:18.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:18 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:18.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:18 vm05.local ceph-mon[108543]: pgmap v15: 65 pgs: 65 active+clean; 262 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 737 KiB/s wr, 140 op/s 2026-03-09T16:17:18.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:18 vm05.local ceph-mon[108543]: Upgrade: Updating crash.vm05 (2/2) 2026-03-09T16:17:18.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:18 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:18.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:18 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:17:18.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:18 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:18.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:18 vm05.local ceph-mon[108543]: Deploying daemon crash.vm05 on vm05 2026-03-09T16:17:18.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:18 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:18.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:18 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:18.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:18 vm03.local ceph-mon[133973]: pgmap v15: 65 pgs: 65 active+clean; 262 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 737 KiB/s wr, 140 op/s 2026-03-09T16:17:18.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:18 vm03.local ceph-mon[133973]: Upgrade: Updating crash.vm05 (2/2) 2026-03-09T16:17:18.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:18 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:18.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:18 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T16:17:18.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:18 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:18.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:18 vm03.local ceph-mon[133973]: Deploying daemon crash.vm05 on vm05 2026-03-09T16:17:19.237 DEBUG:teuthology.parallel:result is None 2026-03-09T16:17:19.237 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- 
/home/ubuntu/cephtest/mnt.0/client.0 2026-03-09T16:17:19.270 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0 2026-03-09T16:17:19.271 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1 2026-03-09T16:17:19.310 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.1/client.1 2026-03-09T16:17:19.310 DEBUG:teuthology.parallel:result is None 2026-03-09T16:17:20.467 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:20 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:20.467 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:20 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:20.467 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:20 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:20.467 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:20 vm05.local ceph-mon[108543]: pgmap v16: 65 pgs: 65 active+clean; 262 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 737 KiB/s wr, 140 op/s 2026-03-09T16:17:20.551 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:20 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:20.551 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:20 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:20.551 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:20 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:20.551 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:20 vm03.local ceph-mon[133973]: pgmap v16: 65 pgs: 65 active+clean; 262 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 737 KiB/s wr, 140 op/s 2026-03-09T16:17:21.812 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:21 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:21.812 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:21 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:21.812 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:21 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:21.812 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:21 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:21.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:21 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:21.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:21 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:21.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:21 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:21.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:21 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:22.739 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:22 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:22.739 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:22 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:22.739 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:22 vm03.local ceph-mon[133973]: pgmap v17: 65 pgs: 65 active+clean; 258 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s rd, 1.1 MiB/s wr, 193 op/s 2026-03-09T16:17:22.739 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:22 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:22.739 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:22 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:22.739 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:22 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:22.739 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:22 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:22.740 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:22 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:22.740 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:22 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:22.740 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:22 vm05.local ceph-mon[108543]: pgmap v17: 65 pgs: 65 active+clean; 258 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s rd, 1.1 MiB/s wr, 193 op/s 2026-03-09T16:17:22.740 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:22 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:22.740 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:22 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:22.740 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:22 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:22.740 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:22 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: 
from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: Upgrade: Setting container_image for all crash 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm03"}]: dispatch 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm03"}]': finished 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm05"}]: dispatch 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm05"}]': finished 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: Upgrade: osd.0 is safe to restart 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: Upgrade: Updating osd.0 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: Deploying daemon osd.0 on vm03 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: pgmap v18: 65 pgs: 65 active+clean; 258 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 15 KiB/s rd, 676 KiB/s wr, 120 op/s 2026-03-09T16:17:24.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: Upgrade: Setting container_image for all crash 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm03"}]: dispatch 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm03"}]': finished 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm05"}]: dispatch 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm05"}]': finished 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: Upgrade: osd.0 is safe to restart 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: Upgrade: Updating osd.0 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: Deploying daemon osd.0 on vm03 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: pgmap v18: 65 pgs: 65 active+clean; 258 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 15 KiB/s rd, 676 KiB/s wr, 120 op/s 2026-03-09T16:17:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:17:24.817 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:24 vm03.local systemd[1]: Stopping Ceph osd.0 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 
2026-03-09T16:17:24.817 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0[70412]: 2026-03-09T16:17:24.634+0000 7f7f85029640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:17:24.817 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0[70412]: 2026-03-09T16:17:24.634+0000 7f7f85029640 -1 osd.0 46 *** Got signal Terminated *** 2026-03-09T16:17:24.817 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:24 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0[70412]: 2026-03-09T16:17:24.634+0000 7f7f85029640 -1 osd.0 46 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T16:17:25.488 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local podman[142561]: 2026-03-09 16:17:25.295012692 +0000 UTC m=+0.675079129 container died 2ea78f0d62f8b4041bd5b8a37314047f356e005e1dbe5926bdd5c5dee3ff7456 (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T16:17:25.488 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local podman[142561]: 2026-03-09 16:17:25.316469743 +0000 UTC m=+0.696536180 container remove 2ea78f0d62f8b4041bd5b8a37314047f356e005e1dbe5926bdd5c5dee3ff7456 (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True) 2026-03-09T16:17:25.488 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local bash[142561]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0 2026-03-09T16:17:25.488 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local podman[142627]: 2026-03-09 16:17:25.460715984 +0000 UTC m=+0.016971727 container create 602c984d2751793fd47a8a7242738f864c992b5a7765829ab9d9fc255f3581cc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-deactivate, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2) 2026-03-09T16:17:25.488 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:25 vm03.local ceph-mon[133973]: osd.0 marked itself down and dead 2026-03-09T16:17:25.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:25 vm05.local ceph-mon[108543]: osd.0 marked itself down and dead 2026-03-09T16:17:25.783 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local podman[142627]: 2026-03-09 16:17:25.506482396 +0000 UTC m=+0.062738139 container init 602c984d2751793fd47a8a7242738f864c992b5a7765829ab9d9fc255f3581cc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True) 2026-03-09T16:17:25.783 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local podman[142627]: 2026-03-09 16:17:25.509632613 +0000 UTC m=+0.065888356 container start 602c984d2751793fd47a8a7242738f864c992b5a7765829ab9d9fc255f3581cc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-deactivate, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T16:17:25.783 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local podman[142627]: 2026-03-09 16:17:25.511964256 +0000 UTC m=+0.068219999 container attach 602c984d2751793fd47a8a7242738f864c992b5a7765829ab9d9fc255f3581cc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, 
org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2) 2026-03-09T16:17:25.783 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local podman[142627]: 2026-03-09 16:17:25.453028304 +0000 UTC m=+0.009284047 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:17:25.783 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local conmon[142639]: conmon 602c984d2751793fd47a : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-602c984d2751793fd47a8a7242738f864c992b5a7765829ab9d9fc255f3581cc.scope/container/memory.events 2026-03-09T16:17:25.783 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local podman[142627]: 2026-03-09 16:17:25.640196593 +0000 UTC m=+0.196452336 container died 602c984d2751793fd47a8a7242738f864c992b5a7765829ab9d9fc255f3581cc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-deactivate, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T16:17:25.783 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local podman[142627]: 2026-03-09 16:17:25.684866194 +0000 UTC m=+0.241121937 container remove 602c984d2751793fd47a8a7242738f864c992b5a7765829ab9d9fc255f3581cc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2) 2026-03-09T16:17:25.783 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.0.service: Deactivated successfully. 
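[annotation] At this point osd.0's old reef container has exited, the -deactivate helper container has run and been removed, and systemd is about to restart the unit on the squid image; the OSD_DOWN and PG_DEGRADED health checks logged just below are the expected transient noise of a rolling OSD restart and are on this test's ignorelist. A hedged sketch of how an operator would watch the same staggered upgrade converge from the admin node (standard ceph CLI commands, not taken verbatim from this log):

    # Overall progress of the cephadm upgrade loop.
    ceph orch upgrade status

    # Which daemons are still running the old (reef) build vs. the target (squid) build.
    ceph versions
    ceph orch ps --daemon-type osd

    # Transient warnings (osds down, degraded PGs) while each OSD restarts.
    ceph health detail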
2026-03-09T16:17:25.783 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local systemd[1]: Stopped Ceph osd.0 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 2026-03-09T16:17:25.783 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.0.service: Consumed 33.494s CPU time. 2026-03-09T16:17:26.215 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-mon[133973]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T16:17:26.216 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-mon[133973]: osdmap e47: 6 total, 5 up, 6 in 2026-03-09T16:17:26.216 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-mon[133973]: pgmap v20: 65 pgs: 9 stale+active+clean, 56 active+clean; 254 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 14 KiB/s rd, 413 KiB/s wr, 80 op/s 2026-03-09T16:17:26.216 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local systemd[1]: Starting Ceph osd.0 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 2026-03-09T16:17:26.216 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:25 vm03.local podman[142731]: 2026-03-09 16:17:25.989057128 +0000 UTC m=+0.016779658 container create fe635dfcb9a753a33fdd31d27a8594136d61ab1f172e4bc254b8ee243d115fe7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:17:26.216 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local podman[142731]: 2026-03-09 16:17:26.030779041 +0000 UTC m=+0.058501581 container init fe635dfcb9a753a33fdd31d27a8594136d61ab1f172e4bc254b8ee243d115fe7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T16:17:26.216 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local podman[142731]: 2026-03-09 16:17:26.033801588 +0000 UTC m=+0.061524128 container start fe635dfcb9a753a33fdd31d27a8594136d61ab1f172e4bc254b8ee243d115fe7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T16:17:26.216 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local podman[142731]: 2026-03-09 16:17:26.037075796 +0000 UTC m=+0.064798337 container attach fe635dfcb9a753a33fdd31d27a8594136d61ab1f172e4bc254b8ee243d115fe7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T16:17:26.216 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local podman[142731]: 2026-03-09 16:17:25.981599469 +0000 UTC m=+0.009322019 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:17:26.216 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate[142741]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:26.216 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local bash[142731]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:26.216 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate[142741]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:26.216 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local bash[142731]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:26.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:26 vm05.local ceph-mon[108543]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T16:17:26.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:26 vm05.local ceph-mon[108543]: osdmap e47: 6 total, 5 up, 6 in 2026-03-09T16:17:26.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:26 vm05.local ceph-mon[108543]: pgmap v20: 65 pgs: 9 stale+active+clean, 56 active+clean; 254 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 14 KiB/s rd, 413 KiB/s wr, 80 op/s 2026-03-09T16:17:26.890 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate[142741]: --> Failed to activate 
via raw: did not find any matching OSD to activate 2026-03-09T16:17:26.891 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local bash[142731]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T16:17:26.891 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local bash[142731]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:26.891 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate[142741]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:26.891 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate[142741]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:26.891 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local bash[142731]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:26.891 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate[142741]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-09T16:17:26.891 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local bash[142731]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-09T16:17:26.891 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate[142741]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-b0f8edfc-d750-4849-a607-887ee2c4b08c/osd-block-d36e00ca-e7bc-4475-866a-be22243d455f --path /var/lib/ceph/osd/ceph-0 --no-mon-config 2026-03-09T16:17:26.891 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local bash[142731]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-b0f8edfc-d750-4849-a607-887ee2c4b08c/osd-block-d36e00ca-e7bc-4475-866a-be22243d455f --path /var/lib/ceph/osd/ceph-0 --no-mon-config 2026-03-09T16:17:27.229 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:27 vm03.local ceph-mon[133973]: osdmap e48: 6 total, 5 up, 6 in 2026-03-09T16:17:27.229 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:27.229 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:27.229 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate[142741]: Running command: /usr/bin/ln -snf /dev/ceph-b0f8edfc-d750-4849-a607-887ee2c4b08c/osd-block-d36e00ca-e7bc-4475-866a-be22243d455f /var/lib/ceph/osd/ceph-0/block 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local bash[142731]: Running command: /usr/bin/ln -snf /dev/ceph-b0f8edfc-d750-4849-a607-887ee2c4b08c/osd-block-d36e00ca-e7bc-4475-866a-be22243d455f /var/lib/ceph/osd/ceph-0/block 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate[142741]: 
Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local bash[142731]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate[142741]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local bash[142731]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate[142741]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local bash[142731]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate[142741]: --> ceph-volume lvm activate successful for osd ID: 0 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local bash[142731]: --> ceph-volume lvm activate successful for osd ID: 0 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local conmon[142741]: conmon fe635dfcb9a753a33fdd : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-fe635dfcb9a753a33fdd31d27a8594136d61ab1f172e4bc254b8ee243d115fe7.scope/container/memory.events 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local podman[142731]: 2026-03-09 16:17:26.944496106 +0000 UTC m=+0.972218646 container died fe635dfcb9a753a33fdd31d27a8594136d61ab1f172e4bc254b8ee243d115fe7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:26 vm03.local podman[142731]: 2026-03-09 16:17:26.971405983 +0000 UTC m=+0.999128513 container remove fe635dfcb9a753a33fdd31d27a8594136d61ab1f172e4bc254b8ee243d115fe7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-activate, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0) 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:27 vm03.local podman[142988]: 2026-03-09 16:17:27.063859031 +0000 UTC m=+0.015778263 container create fba6e40f54d4f4cf3c4cbbceb541f0f1a126b91176cffbe0663d024a4e3bd9ca (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.41.3) 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:27 vm03.local podman[142988]: 2026-03-09 16:17:27.104364096 +0000 UTC m=+0.056283328 container init fba6e40f54d4f4cf3c4cbbceb541f0f1a126b91176cffbe0663d024a4e3bd9ca (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:27 vm03.local podman[142988]: 2026-03-09 16:17:27.107351357 +0000 UTC m=+0.059270589 container start fba6e40f54d4f4cf3c4cbbceb541f0f1a126b91176cffbe0663d024a4e3bd9ca (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True) 2026-03-09T16:17:27.229 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:27 vm03.local bash[142988]: fba6e40f54d4f4cf3c4cbbceb541f0f1a126b91176cffbe0663d024a4e3bd9ca 2026-03-09T16:17:27.230 
INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:27 vm03.local podman[142988]: 2026-03-09 16:17:27.05704132 +0000 UTC m=+0.008960562 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:17:27.230 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:27 vm03.local systemd[1]: Started Ceph osd.0 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 2026-03-09T16:17:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:27 vm05.local ceph-mon[108543]: osdmap e48: 6 total, 5 up, 6 in 2026-03-09T16:17:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:28.069 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:27 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0[142998]: 2026-03-09T16:17:27.977+0000 7f0d0e5a7740 -1 Falling back to public interface 2026-03-09T16:17:28.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:28 vm03.local ceph-mon[133973]: pgmap v22: 65 pgs: 9 stale+active+clean, 56 active+clean; 254 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 17 KiB/s rd, 517 KiB/s wr, 100 op/s 2026-03-09T16:17:28.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:28 vm05.local ceph-mon[108543]: pgmap v22: 65 pgs: 9 stale+active+clean, 56 active+clean; 254 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 17 KiB/s rd, 517 KiB/s wr, 100 op/s 2026-03-09T16:17:29.504 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:29 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:29.504 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:29 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:29.504 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:29 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:29.504 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:29 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:29 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:29 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:29 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:29 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:30.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:30 vm03.local ceph-mon[133973]: pgmap v23: 65 pgs: 9 stale+active+clean, 56 active+clean; 
254 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 5.7 KiB/s rd, 6.7 KiB/s wr, 20 op/s 2026-03-09T16:17:30.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:30 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:30.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:30 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:30.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:30 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:30.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:30 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:17:30.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:30 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:30.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:30 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:30.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:30 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:30.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:30 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:30.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:30 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:30.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:30 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T16:17:30.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:30 vm05.local ceph-mon[108543]: pgmap v23: 65 pgs: 9 stale+active+clean, 56 active+clean; 254 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 5.7 KiB/s rd, 6.7 KiB/s wr, 20 op/s 2026-03-09T16:17:30.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:30 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:30.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:30 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:30.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:30 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:30.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:30 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:17:30.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:30 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:30.776 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:30 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:30.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:30 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:30.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:30 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:30.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:30 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:30.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:30 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T16:17:31.624 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:31 vm03.local systemd[1]: Stopping Ceph osd.1 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 2026-03-09T16:17:31.624 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:31 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1[77484]: 2026-03-09T16:17:31.427+0000 7f9dab2af640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:17:31.624 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:31 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1[77484]: 2026-03-09T16:17:31.427+0000 7f9dab2af640 -1 osd.1 48 *** Got signal Terminated *** 2026-03-09T16:17:31.624 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:31 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1[77484]: 2026-03-09T16:17:31.427+0000 7f9dab2af640 -1 osd.1 48 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T16:17:31.624 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:31 vm03.local ceph-mon[133973]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T16:17:31.624 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:31 vm03.local ceph-mon[133973]: Upgrade: osd.1 is safe to restart 2026-03-09T16:17:31.624 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:31 vm03.local ceph-mon[133973]: Upgrade: Updating osd.1 2026-03-09T16:17:31.624 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:31 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:31.624 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:31 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-09T16:17:31.624 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:31 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:31.624 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:31 vm03.local ceph-mon[133973]: Deploying daemon osd.1 on vm03 2026-03-09T16:17:31.624 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:31 vm03.local ceph-mon[133973]: osd.1 marked itself down and dead 2026-03-09T16:17:31.890 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:31 vm03.local podman[147128]: 2026-03-09 16:17:31.623773508 +0000 UTC m=+0.212052237 container died 6169f982441385746caf968acad95e6fecb89acae28e137058b061e6b7be9b67 (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_REF=reef) 2026-03-09T16:17:31.890 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:31 vm03.local podman[147128]: 2026-03-09 16:17:31.660374545 +0000 UTC m=+0.248653264 container remove 6169f982441385746caf968acad95e6fecb89acae28e137058b061e6b7be9b67 (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T16:17:31.890 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:31 vm03.local bash[147128]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1 2026-03-09T16:17:32.026 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:31 vm05.local ceph-mon[108543]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T16:17:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:31 vm05.local ceph-mon[108543]: Upgrade: osd.1 is safe to restart 2026-03-09T16:17:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:31 vm05.local ceph-mon[108543]: Upgrade: Updating osd.1 2026-03-09T16:17:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:31 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:31 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-09T16:17:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:31 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:31 vm05.local ceph-mon[108543]: Deploying daemon osd.1 on vm03 2026-03-09T16:17:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:31 vm05.local ceph-mon[108543]: osd.1 marked itself down and dead 2026-03-09T16:17:32.164 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:31 vm03.local podman[147378]: 2026-03-09 16:17:31.906327143 +0000 UTC m=+0.037304445 container create f94e6e86c3791031abf3145d24f5f52c8b1c91557cae475999ae0b3743b8025c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3) 2026-03-09T16:17:32.164 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:31 vm03.local podman[147378]: 2026-03-09 16:17:31.976813765 +0000 UTC m=+0.107791077 container init f94e6e86c3791031abf3145d24f5f52c8b1c91557cae475999ae0b3743b8025c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-deactivate, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2) 2026-03-09T16:17:32.164 
INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:31 vm03.local podman[147378]: 2026-03-09 16:17:31.983946577 +0000 UTC m=+0.114923879 container start f94e6e86c3791031abf3145d24f5f52c8b1c91557cae475999ae0b3743b8025c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid) 2026-03-09T16:17:32.164 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:31 vm03.local podman[147378]: 2026-03-09 16:17:31.889297288 +0000 UTC m=+0.020274600 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:17:32.164 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:31 vm03.local podman[147378]: 2026-03-09 16:17:31.99392321 +0000 UTC m=+0.124900512 container attach f94e6e86c3791031abf3145d24f5f52c8b1c91557cae475999ae0b3743b8025c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, CEPH_REF=squid, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T16:17:32.164 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local podman[147378]: 2026-03-09 16:17:32.145691329 +0000 UTC m=+0.276668631 container died f94e6e86c3791031abf3145d24f5f52c8b1c91557cae475999ae0b3743b8025c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid) 2026-03-09T16:17:32.419 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local podman[147378]: 
2026-03-09 16:17:32.173150906 +0000 UTC m=+0.304128208 container remove f94e6e86c3791031abf3145d24f5f52c8b1c91557cae475999ae0b3743b8025c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-deactivate, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223) 2026-03-09T16:17:32.419 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.1.service: Deactivated successfully. 2026-03-09T16:17:32.419 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local systemd[1]: Stopped Ceph osd.1 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 2026-03-09T16:17:32.419 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.1.service: Consumed 36.626s CPU time. 2026-03-09T16:17:32.419 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local systemd[1]: Starting Ceph osd.1 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 2026-03-09T16:17:32.732 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local podman[147482]: 2026-03-09 16:17:32.571694393 +0000 UTC m=+0.030500069 container create e0cf3e16ee2762e26d250a9982b891a3e4a8bd974f17a7167fcdafe3e3840083 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True) 2026-03-09T16:17:32.732 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local podman[147482]: 2026-03-09 16:17:32.640334269 +0000 UTC m=+0.099139945 container init e0cf3e16ee2762e26d250a9982b891a3e4a8bd974f17a7167fcdafe3e3840083 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T16:17:32.732 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local podman[147482]: 2026-03-09 16:17:32.648852563 +0000 UTC m=+0.107658239 container start e0cf3e16ee2762e26d250a9982b891a3e4a8bd974f17a7167fcdafe3e3840083 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T16:17:32.732 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local podman[147482]: 2026-03-09 16:17:32.650608711 +0000 UTC m=+0.109414387 container attach e0cf3e16ee2762e26d250a9982b891a3e4a8bd974f17a7167fcdafe3e3840083 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate, org.label-schema.build-date=20260223, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T16:17:32.732 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local podman[147482]: 2026-03-09 16:17:32.557278 +0000 UTC m=+0.016083676 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:17:32.733 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate[147493]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:32.733 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:32 vm03.local ceph-mon[133973]: Health check update: 2 osds down (OSD_DOWN) 2026-03-09T16:17:32.733 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:32 vm03.local ceph-mon[133973]: osdmap e49: 6 total, 4 up, 6 in 2026-03-09T16:17:32.733 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:32 vm03.local ceph-mon[133973]: pgmap v25: 65 pgs: 1 stale+active+undersized, 14 active+undersized, 4 stale+active+undersized+degraded, 7 stale+active+clean, 15 active+undersized+degraded, 24 active+clean; 254 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 57 
KiB/s wr, 4 op/s; 51/264 objects degraded (19.318%) 2026-03-09T16:17:32.940 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:32 vm05.local ceph-mon[108543]: Health check update: 2 osds down (OSD_DOWN) 2026-03-09T16:17:32.940 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:32 vm05.local ceph-mon[108543]: osdmap e49: 6 total, 4 up, 6 in 2026-03-09T16:17:32.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:32 vm05.local ceph-mon[108543]: pgmap v25: 65 pgs: 1 stale+active+undersized, 14 active+undersized, 4 stale+active+undersized+degraded, 7 stale+active+clean, 15 active+undersized+degraded, 24 active+clean; 254 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 57 KiB/s wr, 4 op/s; 51/264 objects degraded (19.318%) 2026-03-09T16:17:33.141 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:32 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0[142998]: 2026-03-09T16:17:32.885+0000 7f0d0e5a7740 -1 osd.0 46 log_to_monitors true 2026-03-09T16:17:33.141 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local bash[147482]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:33.141 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate[147493]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:33.141 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:32 vm03.local bash[147482]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:33.568 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:33 vm03.local ceph-mon[133973]: Health check failed: Degraded data redundancy: 51/264 objects degraded (19.318%), 19 pgs degraded (PG_DEGRADED) 2026-03-09T16:17:33.568 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate[147493]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T16:17:33.568 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate[147493]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:33.568 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local bash[147482]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T16:17:33.568 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local bash[147482]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:33.568 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate[147493]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:33.568 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local bash[147482]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:33.568 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate[147493]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-09T16:17:33.568 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local bash[147482]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-09T16:17:33.568 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate[147493]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev 
/dev/ceph-3d189c7b-bade-4f10-ae59-a0275e25e467/osd-block-77efea00-570c-4571-a7a6-968cc4097343 --path /var/lib/ceph/osd/ceph-1 --no-mon-config 2026-03-09T16:17:33.568 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local bash[147482]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-3d189c7b-bade-4f10-ae59-a0275e25e467/osd-block-77efea00-570c-4571-a7a6-968cc4097343 --path /var/lib/ceph/osd/ceph-1 --no-mon-config 2026-03-09T16:17:33.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:33 vm03.local ceph-mon[133973]: osdmap e50: 6 total, 4 up, 6 in 2026-03-09T16:17:33.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:33 vm03.local ceph-mon[133973]: from='osd.0 [v2:192.168.123.103:6802/2827570344,v1:192.168.123.103:6803/2827570344]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-09T16:17:33.817 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:17:33 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0[142998]: 2026-03-09T16:17:33.585+0000 7f0d06341640 -1 osd.0 46 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate[147493]: Running command: /usr/bin/ln -snf /dev/ceph-3d189c7b-bade-4f10-ae59-a0275e25e467/osd-block-77efea00-570c-4571-a7a6-968cc4097343 /var/lib/ceph/osd/ceph-1/block 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local bash[147482]: Running command: /usr/bin/ln -snf /dev/ceph-3d189c7b-bade-4f10-ae59-a0275e25e467/osd-block-77efea00-570c-4571-a7a6-968cc4097343 /var/lib/ceph/osd/ceph-1/block 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local bash[147482]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate[147493]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate[147493]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local bash[147482]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate[147493]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local bash[147482]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate[147493]: --> ceph-volume lvm activate successful for osd ID: 1 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local bash[147482]: --> ceph-volume lvm activate successful for osd ID: 1 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local podman[147482]: 2026-03-09 16:17:33.640670993 +0000 UTC m=+1.099476669 container died 
e0cf3e16ee2762e26d250a9982b891a3e4a8bd974f17a7167fcdafe3e3840083 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3) 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local podman[147482]: 2026-03-09 16:17:33.658378427 +0000 UTC m=+1.117184103 container remove e0cf3e16ee2762e26d250a9982b891a3e4a8bd974f17a7167fcdafe3e3840083 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-activate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2) 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local podman[147777]: 2026-03-09 16:17:33.771622479 +0000 UTC m=+0.018422412 container create 9e86c92fc9cd9f12dd8c5176dd68cab6f7a21eb72e2409f58c6f60a2c3f2455c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid) 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local podman[147777]: 2026-03-09 16:17:33.81349286 +0000 UTC m=+0.060292803 container init 9e86c92fc9cd9f12dd8c5176dd68cab6f7a21eb72e2409f58c6f60a2c3f2455c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.41.3) 2026-03-09T16:17:33.818 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local podman[147777]: 2026-03-09 16:17:33.816728726 +0000 UTC m=+0.063528659 container start 9e86c92fc9cd9f12dd8c5176dd68cab6f7a21eb72e2409f58c6f60a2c3f2455c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) 2026-03-09T16:17:34.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:33 vm05.local ceph-mon[108543]: Health check failed: Degraded data redundancy: 51/264 objects degraded (19.318%), 19 pgs degraded (PG_DEGRADED) 2026-03-09T16:17:34.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:33 vm05.local ceph-mon[108543]: osdmap e50: 6 total, 4 up, 6 in 2026-03-09T16:17:34.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:33 vm05.local ceph-mon[108543]: from='osd.0 [v2:192.168.123.103:6802/2827570344,v1:192.168.123.103:6803/2827570344]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-09T16:17:34.140 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local bash[147777]: 9e86c92fc9cd9f12dd8c5176dd68cab6f7a21eb72e2409f58c6f60a2c3f2455c 2026-03-09T16:17:34.140 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local podman[147777]: 2026-03-09 16:17:33.764178877 +0000 UTC m=+0.010978810 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:17:34.140 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:33 vm03.local systemd[1]: Started Ceph osd.1 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 
2026-03-09T16:17:34.805 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:34 vm03.local ceph-mon[133973]: from='osd.0 [v2:192.168.123.103:6802/2827570344,v1:192.168.123.103:6803/2827570344]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-09T16:17:34.805 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:34 vm03.local ceph-mon[133973]: osdmap e51: 6 total, 4 up, 6 in 2026-03-09T16:17:34.805 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:34 vm03.local ceph-mon[133973]: from='osd.0 [v2:192.168.123.103:6802/2827570344,v1:192.168.123.103:6803/2827570344]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T16:17:34.805 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:34 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:34.805 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:34 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:34.805 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:34 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:34.805 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:34 vm03.local ceph-mon[133973]: pgmap v28: 65 pgs: 2 stale+active+undersized, 13 active+undersized, 4 stale+active+undersized+degraded, 7 stale+active+clean, 15 active+undersized+degraded, 24 active+clean; 254 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 64 KiB/s wr, 5 op/s; 51/264 objects degraded (19.318%) 2026-03-09T16:17:34.805 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:34 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1[147788]: 2026-03-09T16:17:34.650+0000 7fcaa3800740 -1 Falling back to public interface 2026-03-09T16:17:35.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:34 vm05.local ceph-mon[108543]: from='osd.0 [v2:192.168.123.103:6802/2827570344,v1:192.168.123.103:6803/2827570344]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-09T16:17:35.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:34 vm05.local ceph-mon[108543]: osdmap e51: 6 total, 4 up, 6 in 2026-03-09T16:17:35.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:34 vm05.local ceph-mon[108543]: from='osd.0 [v2:192.168.123.103:6802/2827570344,v1:192.168.123.103:6803/2827570344]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T16:17:35.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:34 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:35.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:34 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:35.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:34 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:35.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:34 vm05.local ceph-mon[108543]: pgmap v28: 65 pgs: 2 stale+active+undersized, 13 active+undersized, 4 
stale+active+undersized+degraded, 7 stale+active+clean, 15 active+undersized+degraded, 24 active+clean; 254 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 64 KiB/s wr, 5 op/s; 51/264 objects degraded (19.318%) 2026-03-09T16:17:35.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:35 vm03.local ceph-mon[133973]: osd.0 [v2:192.168.123.103:6802/2827570344,v1:192.168.123.103:6803/2827570344] boot 2026-03-09T16:17:35.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:35 vm03.local ceph-mon[133973]: osdmap e52: 6 total, 5 up, 6 in 2026-03-09T16:17:35.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:35 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:17:35.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:35 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:35.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:35 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:35.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:35 vm03.local ceph-mon[133973]: osdmap e53: 6 total, 5 up, 6 in 2026-03-09T16:17:35.903 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:35 vm05.local ceph-mon[108543]: osd.0 [v2:192.168.123.103:6802/2827570344,v1:192.168.123.103:6803/2827570344] boot 2026-03-09T16:17:35.903 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:35 vm05.local ceph-mon[108543]: osdmap e52: 6 total, 5 up, 6 in 2026-03-09T16:17:35.903 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:35 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T16:17:35.903 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:35 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:35.903 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:35 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:35.904 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:35 vm05.local ceph-mon[108543]: osdmap e53: 6 total, 5 up, 6 in 2026-03-09T16:17:37.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:37 vm03.local ceph-mon[133973]: pgmap v31: 65 pgs: 22 peering, 2 stale+active+undersized, 20 active+undersized, 2 stale+active+undersized+degraded, 7 active+undersized+degraded, 12 active+clean; 254 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 18/264 objects degraded (6.818%) 2026-03-09T16:17:37.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:37 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:37.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:37 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:37 vm05.local ceph-mon[108543]: pgmap v31: 65 pgs: 22 peering, 2 stale+active+undersized, 20 active+undersized, 2 stale+active+undersized+degraded, 7 active+undersized+degraded, 12 active+clean; 254 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 18/264 objects degraded (6.818%) 2026-03-09T16:17:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:37 vm05.local ceph-mon[108543]: from='mgr.34104 
192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:37 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:38.640 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:38 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1[147788]: 2026-03-09T16:17:38.381+0000 7fcaa3800740 -1 osd.1 48 log_to_monitors true 2026-03-09T16:17:38.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:38 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:38.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:38 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:38.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:38 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:38.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:38 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:17:38.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:38 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:38.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:38 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:38.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:38 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:38.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:38 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:38.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:38 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:38.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:38 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T16:17:38.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:38 vm03.local ceph-mon[133973]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T16:17:38.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:38 vm03.local ceph-mon[133973]: Upgrade: unsafe to stop osd(s) at this time (20 PGs are or would become offline) 2026-03-09T16:17:38.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:38 vm03.local ceph-mon[133973]: pgmap v32: 65 pgs: 22 peering, 2 stale+active+undersized, 20 active+undersized, 2 stale+active+undersized+degraded, 7 active+undersized+degraded, 12 active+clean; 254 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 18/264 objects degraded (6.818%) 2026-03-09T16:17:38.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:38 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:38.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:38 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:38.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:38 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:38.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:38 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:17:38.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:38 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:38.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:38 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:38.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:38 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:38.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:38 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:38.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:38 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:38.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:38 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T16:17:38.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:38 vm05.local ceph-mon[108543]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T16:17:38.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:38 vm05.local ceph-mon[108543]: Upgrade: unsafe to stop osd(s) at this time (20 PGs are or would become offline) 2026-03-09T16:17:38.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:38 vm05.local ceph-mon[108543]: pgmap v32: 65 pgs: 22 peering, 2 stale+active+undersized, 20 active+undersized, 2 stale+active+undersized+degraded, 7 active+undersized+degraded, 12 active+clean; 254 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 18/264 objects degraded (6.818%) 2026-03-09T16:17:39.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:39 vm03.local ceph-mon[133973]: from='osd.1 [v2:192.168.123.103:6810/2424605719,v1:192.168.123.103:6811/2424605719]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-09T16:17:39.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:39 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:39.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:39 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:17:39.640 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:17:39 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1[147788]: 2026-03-09T16:17:39.339+0000 7fca9b59a640 -1 osd.1 48 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T16:17:39.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:39 vm05.local ceph-mon[108543]: from='osd.1 [v2:192.168.123.103:6810/2424605719,v1:192.168.123.103:6811/2424605719]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-09T16:17:39.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:39 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:39.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:39 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:17:40.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:40 vm03.local ceph-mon[133973]: from='osd.1 [v2:192.168.123.103:6810/2424605719,v1:192.168.123.103:6811/2424605719]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-09T16:17:40.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:40 vm03.local ceph-mon[133973]: osdmap e54: 6 total, 5 up, 6 in 2026-03-09T16:17:40.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:40 vm03.local ceph-mon[133973]: from='osd.1 [v2:192.168.123.103:6810/2424605719,v1:192.168.123.103:6811/2424605719]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T16:17:40.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:40 vm03.local ceph-mon[133973]: pgmap v34: 65 pgs: 22 peering, 18 active+undersized, 6 active+undersized+degraded, 19 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 11/264 objects degraded (4.167%); 99 B/s, 0 objects/s recovering 2026-03-09T16:17:40.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:40 
vm05.local ceph-mon[108543]: from='osd.1 [v2:192.168.123.103:6810/2424605719,v1:192.168.123.103:6811/2424605719]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-09T16:17:40.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:40 vm05.local ceph-mon[108543]: osdmap e54: 6 total, 5 up, 6 in 2026-03-09T16:17:40.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:40 vm05.local ceph-mon[108543]: from='osd.1 [v2:192.168.123.103:6810/2424605719,v1:192.168.123.103:6811/2424605719]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T16:17:40.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:40 vm05.local ceph-mon[108543]: pgmap v34: 65 pgs: 22 peering, 18 active+undersized, 6 active+undersized+degraded, 19 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 11/264 objects degraded (4.167%); 99 B/s, 0 objects/s recovering 2026-03-09T16:17:41.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:41 vm03.local ceph-mon[133973]: Health check update: Degraded data redundancy: 11/264 objects degraded (4.167%), 6 pgs degraded (PG_DEGRADED) 2026-03-09T16:17:41.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:41 vm03.local ceph-mon[133973]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T16:17:41.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:41 vm03.local ceph-mon[133973]: osd.1 [v2:192.168.123.103:6810/2424605719,v1:192.168.123.103:6811/2424605719] boot 2026-03-09T16:17:41.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:41 vm03.local ceph-mon[133973]: osdmap e55: 6 total, 6 up, 6 in 2026-03-09T16:17:41.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:41 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:17:41.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:41 vm05.local ceph-mon[108543]: Health check update: Degraded data redundancy: 11/264 objects degraded (4.167%), 6 pgs degraded (PG_DEGRADED) 2026-03-09T16:17:41.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:41 vm05.local ceph-mon[108543]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T16:17:41.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:41 vm05.local ceph-mon[108543]: osd.1 [v2:192.168.123.103:6810/2424605719,v1:192.168.123.103:6811/2424605719] boot 2026-03-09T16:17:41.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:41 vm05.local ceph-mon[108543]: osdmap e55: 6 total, 6 up, 6 in 2026-03-09T16:17:41.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:41 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T16:17:42.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:42 vm05.local ceph-mon[108543]: osdmap e56: 6 total, 6 up, 6 in 2026-03-09T16:17:42.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:42 vm05.local ceph-mon[108543]: pgmap v37: 65 pgs: 21 active+undersized, 13 active+undersized+degraded, 31 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 34/264 objects degraded (12.879%); 99 B/s, 0 objects/s recovering 2026-03-09T16:17:42.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:42 vm03.local ceph-mon[133973]: osdmap e56: 6 total, 6 up, 6 in 2026-03-09T16:17:42.890 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:42 vm03.local ceph-mon[133973]: pgmap v37: 65 pgs: 21 active+undersized, 13 active+undersized+degraded, 31 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 34/264 objects degraded (12.879%); 99 B/s, 0 objects/s recovering 2026-03-09T16:17:45.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:45 vm03.local ceph-mon[133973]: pgmap v38: 65 pgs: 19 active+undersized, 13 active+undersized+degraded, 33 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 34/264 objects degraded (12.879%); 99 B/s, 0 objects/s recovering 2026-03-09T16:17:45.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:45 vm05.local ceph-mon[108543]: pgmap v38: 65 pgs: 19 active+undersized, 13 active+undersized+degraded, 33 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 34/264 objects degraded (12.879%); 99 B/s, 0 objects/s recovering 2026-03-09T16:17:46.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:46 vm03.local ceph-mon[133973]: Health check update: Degraded data redundancy: 34/264 objects degraded (12.879%), 13 pgs degraded (PG_DEGRADED) 2026-03-09T16:17:46.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.315+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2586495844 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabbc075720 msgr2=0x7fabbc075b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:46.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.315+0000 7fabc1f5e640 1 --2- 192.168.123.103:0/2586495844 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabbc075720 0x7fabbc075b00 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fabac0099b0 tx=0x7fabac02f220 comp rx=0 tx=0).stop 2026-03-09T16:17:46.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.316+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2586495844 shutdown_connections 2026-03-09T16:17:46.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.316+0000 7fabc1f5e640 1 --2- 192.168.123.103:0/2586495844 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabbc076040 0x7fabbc111330 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.316+0000 7fabc1f5e640 1 --2- 192.168.123.103:0/2586495844 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabbc075720 0x7fabbc075b00 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.316+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2586495844 >> 192.168.123.103:0/2586495844 conn(0x7fabbc0fe710 msgr2=0x7fabbc100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:46.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.316+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2586495844 shutdown_connections 2026-03-09T16:17:46.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.316+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2586495844 wait complete. 
2026-03-09T16:17:46.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.316+0000 7fabc1f5e640 1 Processor -- start 2026-03-09T16:17:46.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.317+0000 7fabc1f5e640 1 -- start start 2026-03-09T16:17:46.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.317+0000 7fabc1f5e640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabbc075720 0x7fabbc19eea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:46.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.317+0000 7fabc1f5e640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabbc076040 0x7fabbc19f3e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:46.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.317+0000 7fabc1f5e640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fabbc19fa70 con 0x7fabbc075720 2026-03-09T16:17:46.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.317+0000 7fabc1f5e640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fabbc1a37e0 con 0x7fabbc076040 2026-03-09T16:17:46.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.317+0000 7fabbb7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabbc075720 0x7fabbc19eea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:46.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.317+0000 7fabbb7fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabbc075720 0x7fabbc19eea0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41790/0 (socket says 192.168.123.103:41790) 2026-03-09T16:17:46.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.317+0000 7fabbb7fe640 1 -- 192.168.123.103:0/2732666822 learned_addr learned my addr 192.168.123.103:0/2732666822 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:17:46.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.317+0000 7fabbb7fe640 1 -- 192.168.123.103:0/2732666822 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabbc076040 msgr2=0x7fabbc19f3e0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T16:17:46.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.317+0000 7fabbb7fe640 1 --2- 192.168.123.103:0/2732666822 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabbc076040 0x7fabbc19f3e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.317+0000 7fabbb7fe640 1 -- 192.168.123.103:0/2732666822 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fabac009660 con 0x7fabbc075720 2026-03-09T16:17:46.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.318+0000 7fabbb7fe640 1 --2- 192.168.123.103:0/2732666822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabbc075720 0x7fabbc19eea0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fabac002410 tx=0x7fabac004290 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:46.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.318+0000 7fabb8ff9640 1 -- 192.168.123.103:0/2732666822 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fabac03d070 con 0x7fabbc075720 2026-03-09T16:17:46.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.318+0000 7fabb8ff9640 1 -- 192.168.123.103:0/2732666822 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fabac0043b0 con 0x7fabbc075720 2026-03-09T16:17:46.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.318+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2732666822 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fabbc1a3a60 con 0x7fabbc075720 2026-03-09T16:17:46.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.318+0000 7fabb8ff9640 1 -- 192.168.123.103:0/2732666822 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fabac041880 con 0x7fabbc075720 2026-03-09T16:17:46.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.318+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2732666822 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fabbc1a3f20 con 0x7fabbc075720 2026-03-09T16:17:46.320 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.320+0000 7fabb8ff9640 1 -- 192.168.123.103:0/2732666822 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fabac02fc90 con 0x7fabbc075720 2026-03-09T16:17:46.320 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.320+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2732666822 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fab80005350 con 0x7fabbc075720 2026-03-09T16:17:46.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.321+0000 7fabb8ff9640 1 --2- 192.168.123.103:0/2732666822 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fab90077840 0x7fab90079d00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:46.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.321+0000 7fabb8ff9640 1 -- 192.168.123.103:0/2732666822 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fabac0be7c0 con 0x7fabbc075720 2026-03-09T16:17:46.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.323+0000 7fabbaffd640 1 --2- 192.168.123.103:0/2732666822 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fab90077840 0x7fab90079d00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:46.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.323+0000 7fabb8ff9640 1 -- 192.168.123.103:0/2732666822 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fabac086d50 con 0x7fabbc075720 2026-03-09T16:17:46.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.324+0000 7fabbaffd640 1 --2- 192.168.123.103:0/2732666822 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fab90077840 0x7fab90079d00 secure :-1 s=READY 
pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fabbc1a0450 tx=0x7faba8006d20 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:46.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.423+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2732666822 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fab80002bf0 con 0x7fab90077840 2026-03-09T16:17:46.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.425+0000 7fabb8ff9640 1 -- 192.168.123.103:0/2732666822 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fab80002bf0 con 0x7fab90077840 2026-03-09T16:17:46.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.427+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2732666822 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fab90077840 msgr2=0x7fab90079d00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:46.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.427+0000 7fabc1f5e640 1 --2- 192.168.123.103:0/2732666822 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fab90077840 0x7fab90079d00 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fabbc1a0450 tx=0x7faba8006d20 comp rx=0 tx=0).stop 2026-03-09T16:17:46.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.427+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2732666822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabbc075720 msgr2=0x7fabbc19eea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:46.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.427+0000 7fabc1f5e640 1 --2- 192.168.123.103:0/2732666822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabbc075720 0x7fabbc19eea0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fabac002410 tx=0x7fabac004290 comp rx=0 tx=0).stop 2026-03-09T16:17:46.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.427+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2732666822 shutdown_connections 2026-03-09T16:17:46.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.427+0000 7fabc1f5e640 1 --2- 192.168.123.103:0/2732666822 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fab90077840 0x7fab90079d00 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.427+0000 7fabc1f5e640 1 --2- 192.168.123.103:0/2732666822 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabbc076040 0x7fabbc19f3e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.427+0000 7fabc1f5e640 1 --2- 192.168.123.103:0/2732666822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabbc075720 0x7fabbc19eea0 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.427+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2732666822 >> 192.168.123.103:0/2732666822 conn(0x7fabbc0fe710 msgr2=0x7fabbc0ffdf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:46.428 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.428+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2732666822 shutdown_connections 2026-03-09T16:17:46.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.428+0000 7fabc1f5e640 1 -- 192.168.123.103:0/2732666822 wait complete. 2026-03-09T16:17:46.436 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:17:46.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.480+0000 7f37f7577640 1 -- 192.168.123.103:0/1588820395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f37f8076040 msgr2=0x7f37f8111160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:46.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.480+0000 7f37f7577640 1 --2- 192.168.123.103:0/1588820395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f37f8076040 0x7f37f8111160 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f37ec0099b0 tx=0x7f37ec02f220 comp rx=0 tx=0).stop 2026-03-09T16:17:46.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.481+0000 7f37f7577640 1 -- 192.168.123.103:0/1588820395 shutdown_connections 2026-03-09T16:17:46.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.481+0000 7f37f7577640 1 --2- 192.168.123.103:0/1588820395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f37f8076040 0x7f37f8111160 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.481+0000 7f37f7577640 1 --2- 192.168.123.103:0/1588820395 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37f8075720 0x7f37f8075b00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.481+0000 7f37f7577640 1 -- 192.168.123.103:0/1588820395 >> 192.168.123.103:0/1588820395 conn(0x7f37f80fe540 msgr2=0x7f37f8100960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:46.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.481+0000 7f37f7577640 1 -- 192.168.123.103:0/1588820395 shutdown_connections 2026-03-09T16:17:46.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.481+0000 7f37f7577640 1 -- 192.168.123.103:0/1588820395 wait complete. 
2026-03-09T16:17:46.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.481+0000 7f37f7577640 1 Processor -- start 2026-03-09T16:17:46.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.482+0000 7f37f7577640 1 -- start start 2026-03-09T16:17:46.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.482+0000 7f37f7577640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37f8075720 0x7f37f81a49b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:46.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.482+0000 7f37f7577640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f37f8076040 0x7f37f81a4ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:46.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.482+0000 7f37f7577640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f37f819eaa0 con 0x7f37f8076040 2026-03-09T16:17:46.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.482+0000 7f37f7577640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f37f819ec10 con 0x7f37f8075720 2026-03-09T16:17:46.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.482+0000 7f37f5d74640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f37f8076040 0x7f37f81a4ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:46.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.482+0000 7f37f5d74640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f37f8076040 0x7f37f81a4ef0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41812/0 (socket says 192.168.123.103:41812) 2026-03-09T16:17:46.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.482+0000 7f37f5d74640 1 -- 192.168.123.103:0/2447992046 learned_addr learned my addr 192.168.123.103:0/2447992046 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:17:46.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.483+0000 7f37f5d74640 1 -- 192.168.123.103:0/2447992046 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37f8075720 msgr2=0x7f37f81a49b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:46.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.483+0000 7f37f6575640 1 --2- 192.168.123.103:0/2447992046 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37f8075720 0x7f37f81a49b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:46.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.483+0000 7f37f5d74640 1 --2- 192.168.123.103:0/2447992046 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37f8075720 0x7f37f81a49b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.483+0000 7f37f5d74640 1 -- 192.168.123.103:0/2447992046 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f37ec009660 con 0x7f37f8076040 2026-03-09T16:17:46.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.484+0000 7f37f5d74640 1 --2- 192.168.123.103:0/2447992046 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f37f8076040 0x7f37f81a4ef0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f37ec002410 tx=0x7f37ec031cd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:46.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.484+0000 7f37df7fe640 1 -- 192.168.123.103:0/2447992046 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f37ec03d070 con 0x7f37f8076040 2026-03-09T16:17:46.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.484+0000 7f37df7fe640 1 -- 192.168.123.103:0/2447992046 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f37ec004440 con 0x7f37f8076040 2026-03-09T16:17:46.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.484+0000 7f37df7fe640 1 -- 192.168.123.103:0/2447992046 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f37ec031260 con 0x7f37f8076040 2026-03-09T16:17:46.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.484+0000 7f37f7577640 1 -- 192.168.123.103:0/2447992046 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f37f819ee90 con 0x7f37f8076040 2026-03-09T16:17:46.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.485+0000 7f37f7577640 1 -- 192.168.123.103:0/2447992046 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f37f819f300 con 0x7f37f8076040 2026-03-09T16:17:46.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.486+0000 7f37f7577640 1 -- 192.168.123.103:0/2447992046 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f37c4005350 con 0x7f37f8076040 2026-03-09T16:17:46.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.487+0000 7f37df7fe640 1 -- 192.168.123.103:0/2447992046 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f37ec038730 con 0x7f37f8076040 2026-03-09T16:17:46.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.487+0000 7f37df7fe640 1 --2- 192.168.123.103:0/2447992046 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f37c8077840 0x7f37c8079d00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:46.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.487+0000 7f37df7fe640 1 -- 192.168.123.103:0/2447992046 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f37ec0be280 con 0x7f37f8076040 2026-03-09T16:17:46.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.489+0000 7f37f6575640 1 --2- 192.168.123.103:0/2447992046 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f37c8077840 0x7f37c8079d00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:46.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.489+0000 7f37f6575640 1 --2- 192.168.123.103:0/2447992046 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f37c8077840 0x7f37c8079d00 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f37e000a8b0 tx=0x7f37e0008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:46.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.490+0000 7f37df7fe640 1 -- 192.168.123.103:0/2447992046 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f37ec0c3050 con 0x7f37f8076040 2026-03-09T16:17:46.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:46 vm05.local ceph-mon[108543]: Health check update: Degraded data redundancy: 34/264 objects degraded (12.879%), 13 pgs degraded (PG_DEGRADED) 2026-03-09T16:17:46.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.589+0000 7f37f7577640 1 -- 192.168.123.103:0/2447992046 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f37c4002bf0 con 0x7f37c8077840 2026-03-09T16:17:46.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.594+0000 7f37df7fe640 1 -- 192.168.123.103:0/2447992046 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f37c4002bf0 con 0x7f37c8077840 2026-03-09T16:17:46.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.596+0000 7f37f7577640 1 -- 192.168.123.103:0/2447992046 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f37c8077840 msgr2=0x7f37c8079d00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:46.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.596+0000 7f37f7577640 1 --2- 192.168.123.103:0/2447992046 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f37c8077840 0x7f37c8079d00 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f37e000a8b0 tx=0x7f37e0008040 comp rx=0 tx=0).stop 2026-03-09T16:17:46.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.596+0000 7f37f7577640 1 -- 192.168.123.103:0/2447992046 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f37f8076040 msgr2=0x7f37f81a4ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:46.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.596+0000 7f37f7577640 1 --2- 192.168.123.103:0/2447992046 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f37f8076040 0x7f37f81a4ef0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f37ec002410 tx=0x7f37ec031cd0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.597+0000 7f37f7577640 1 -- 192.168.123.103:0/2447992046 shutdown_connections 2026-03-09T16:17:46.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.597+0000 7f37f7577640 1 --2- 192.168.123.103:0/2447992046 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f37c8077840 0x7f37c8079d00 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.597+0000 7f37f7577640 1 --2- 192.168.123.103:0/2447992046 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f37f8076040 0x7f37f81a4ef0 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.597+0000 7f37f7577640 1 --2- 192.168.123.103:0/2447992046 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37f8075720 0x7f37f81a49b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.597+0000 7f37f7577640 1 -- 192.168.123.103:0/2447992046 >> 192.168.123.103:0/2447992046 conn(0x7f37f80fe540 msgr2=0x7f37f80ffab0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:46.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.597+0000 7f37f7577640 1 -- 192.168.123.103:0/2447992046 shutdown_connections 2026-03-09T16:17:46.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.597+0000 7f37f7577640 1 -- 192.168.123.103:0/2447992046 wait complete. 2026-03-09T16:17:46.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.649+0000 7fa47796b640 1 -- 192.168.123.103:0/2289160945 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4701029d0 msgr2=0x7fa470102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:46.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.649+0000 7fa47796b640 1 --2- 192.168.123.103:0/2289160945 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4701029d0 0x7fa470102e30 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fa460009a00 tx=0x7fa46002f290 comp rx=0 tx=0).stop 2026-03-09T16:17:46.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.650+0000 7fa47796b640 1 -- 192.168.123.103:0/2289160945 shutdown_connections 2026-03-09T16:17:46.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.650+0000 7fa47796b640 1 --2- 192.168.123.103:0/2289160945 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4701029d0 0x7fa470102e30 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.650+0000 7fa47796b640 1 --2- 192.168.123.103:0/2289160945 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4701089d0 0x7fa470108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.650+0000 7fa47796b640 1 -- 192.168.123.103:0/2289160945 >> 192.168.123.103:0/2289160945 conn(0x7fa4700fe710 msgr2=0x7fa470100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:46.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.650+0000 7fa47796b640 1 -- 192.168.123.103:0/2289160945 shutdown_connections 2026-03-09T16:17:46.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.650+0000 7fa47796b640 1 -- 192.168.123.103:0/2289160945 wait complete. 
2026-03-09T16:17:46.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.650+0000 7fa47796b640 1 Processor -- start 2026-03-09T16:17:46.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.650+0000 7fa47796b640 1 -- start start 2026-03-09T16:17:46.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.651+0000 7fa47796b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4701029d0 0x7fa47019a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:46.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.651+0000 7fa47796b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4701089d0 0x7fa47019af00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:46.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.651+0000 7fa47796b640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa47019b5e0 con 0x7fa4701029d0 2026-03-09T16:17:46.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.651+0000 7fa47796b640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa47019f330 con 0x7fa4701089d0 2026-03-09T16:17:46.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.651+0000 7fa4756e0640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4701029d0 0x7fa47019a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:46.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.651+0000 7fa4756e0640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4701029d0 0x7fa47019a9c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41830/0 (socket says 192.168.123.103:41830) 2026-03-09T16:17:46.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.651+0000 7fa4756e0640 1 -- 192.168.123.103:0/244328853 learned_addr learned my addr 192.168.123.103:0/244328853 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:17:46.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.651+0000 7fa474edf640 1 --2- 192.168.123.103:0/244328853 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4701089d0 0x7fa47019af00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:46.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.652+0000 7fa4756e0640 1 -- 192.168.123.103:0/244328853 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4701089d0 msgr2=0x7fa47019af00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:46.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.652+0000 7fa4756e0640 1 --2- 192.168.123.103:0/244328853 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4701089d0 0x7fa47019af00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.652+0000 7fa4756e0640 1 -- 192.168.123.103:0/244328853 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa460009660 con 
0x7fa4701029d0 2026-03-09T16:17:46.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.652+0000 7fa4756e0640 1 --2- 192.168.123.103:0/244328853 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4701029d0 0x7fa47019a9c0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7fa46400b750 tx=0x7fa46400bc20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:46.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.652+0000 7fa45e7fc640 1 -- 192.168.123.103:0/244328853 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa464004070 con 0x7fa4701029d0 2026-03-09T16:17:46.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.652+0000 7fa45e7fc640 1 -- 192.168.123.103:0/244328853 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa4640027a0 con 0x7fa4701029d0 2026-03-09T16:17:46.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.652+0000 7fa45e7fc640 1 -- 192.168.123.103:0/244328853 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa46400cad0 con 0x7fa4701029d0 2026-03-09T16:17:46.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.652+0000 7fa47796b640 1 -- 192.168.123.103:0/244328853 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa47019f610 con 0x7fa4701029d0 2026-03-09T16:17:46.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.652+0000 7fa47796b640 1 -- 192.168.123.103:0/244328853 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa47019fa30 con 0x7fa4701029d0 2026-03-09T16:17:46.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.654+0000 7fa45e7fc640 1 -- 192.168.123.103:0/244328853 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fa46400cc30 con 0x7fa4701029d0 2026-03-09T16:17:46.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.654+0000 7fa47796b640 1 -- 192.168.123.103:0/244328853 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa438005350 con 0x7fa4701029d0 2026-03-09T16:17:46.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.655+0000 7fa45e7fc640 1 --2- 192.168.123.103:0/244328853 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa44c0779b0 0x7fa44c079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:46.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.655+0000 7fa474edf640 1 --2- 192.168.123.103:0/244328853 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa44c0779b0 0x7fa44c079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:46.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.655+0000 7fa474edf640 1 --2- 192.168.123.103:0/244328853 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa44c0779b0 0x7fa44c079e70 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fa47019bf70 tx=0x7fa460005950 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:46.657 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.655+0000 7fa45e7fc640 1 -- 192.168.123.103:0/244328853 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fa464099100 con 0x7fa4701029d0
2026-03-09T16:17:46.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.657+0000 7fa45e7fc640 1 -- 192.168.123.103:0/244328853 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa464061720 con 0x7fa4701029d0
2026-03-09T16:17:46.752 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.751+0000 7fa47796b640 1 -- 192.168.123.103:0/244328853 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fa438002bf0 con 0x7fa44c0779b0
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (2m) 11s ago 7m 25.2M - 0.25.0 c8568f914cd2 61c29cd7a09d
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (7m) 11s ago 7m 9374k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (7m) 26s ago 7m 9861k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (29s) 11s ago 7m 7830k - 19.2.3-678-ge911bdeb 654f31e6858e 03c86bd1bf32
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (27s) 26s ago 7m 7830k - 19.2.3-678-ge911bdeb 654f31e6858e 192f6dbc3145
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (106s) 11s ago 7m 89.7M - 10.4.0 c8b91775d855 6f4f55eef4bb
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (5m) 11s ago 5m 18.8M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8e7e3eb06891
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (5m) 11s ago 5m 191M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 f23b1415c23e
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (5m) 26s ago 5m 16.7M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 fbf69f4859f1
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (5m) 26s ago 5m 18.2M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 e7155e6e0a47
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:8443,9283,8765 running (3m) 11s ago 8m 602M - 19.2.3-678-ge911bdeb 654f31e6858e f10e9f43c355
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (2m) 26s ago 7m 494M - 19.2.3-678-ge911bdeb 654f31e6858e 5276dc4902e9
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (61s) 11s ago 8m 60.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f90a2e8dc751
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (43s) 26s ago 7m 46.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b6d6af84a66d
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (2m) 11s ago 7m 9508k - 1.7.0 72c9c2088986 73da4350a8ed
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (2m) 26s ago 7m 9479k - 1.7.0 72c9c2088986 0be807a191b0
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (19s) 11s ago 6m 114M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fba6e40f54d4
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (12s) 11s ago 6m 12.1M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9e86c92fc9cd
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (6m) 11s ago 6m 309M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 31188175e77b
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (6m) 26s ago 6m 455M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d95aab347c9f
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (6m) 26s ago 6m 380M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 5076005b452d
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (5m) 26s ago 5m 331M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 56fb3849b087
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (2m) 11s ago 7m 43.9M - 2.51.0 1d3b7f56885b ce88dd379864
2026-03-09T16:17:46.758 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.756+0000 7fa45e7fc640 1 -- 192.168.123.103:0/244328853 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3600 (secure 0 0 0) 0x7fa438002bf0 con 0x7fa44c0779b0
2026-03-09T16:17:46.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.759+0000 7fa47796b640 1 -- 192.168.123.103:0/244328853 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa44c0779b0 msgr2=0x7fa44c079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:17:46.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.759+0000 7fa47796b640 1 --2- 192.168.123.103:0/244328853 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa44c0779b0 0x7fa44c079e70 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fa47019bf70 tx=0x7fa460005950 comp rx=0 tx=0).stop
2026-03-09T16:17:46.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.759+0000 7fa47796b640 1 -- 192.168.123.103:0/244328853 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4701029d0 msgr2=0x7fa47019a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:17:46.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.759+0000 7fa47796b640 1 --2- 192.168.123.103:0/244328853 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4701029d0 0x7fa47019a9c0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7fa46400b750 tx=0x7fa46400bc20 comp rx=0 tx=0).stop
2026-03-09T16:17:46.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.759+0000 7fa47796b640 1 -- 192.168.123.103:0/244328853 shutdown_connections
2026-03-09T16:17:46.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.759+0000 7fa47796b640 1 --2- 192.168.123.103:0/244328853 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa44c0779b0 0x7fa44c079e70 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T16:17:46.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.759+0000 7fa47796b640 1 --2- 192.168.123.103:0/244328853
>> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4701089d0 0x7fa47019af00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.759+0000 7fa47796b640 1 --2- 192.168.123.103:0/244328853 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa4701029d0 0x7fa47019a9c0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.759+0000 7fa47796b640 1 -- 192.168.123.103:0/244328853 >> 192.168.123.103:0/244328853 conn(0x7fa4700fe710 msgr2=0x7fa470100130 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:46.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.760+0000 7fa47796b640 1 -- 192.168.123.103:0/244328853 shutdown_connections 2026-03-09T16:17:46.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.760+0000 7fa47796b640 1 -- 192.168.123.103:0/244328853 wait complete. 2026-03-09T16:17:46.813 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.812+0000 7f8df9082640 1 -- 192.168.123.103:0/2521600742 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8df4102800 msgr2=0x7f8df4102c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:46.813 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.812+0000 7f8df9082640 1 --2- 192.168.123.103:0/2521600742 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8df4102800 0x7f8df4102c60 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f8de00099b0 tx=0x7f8de002f220 comp rx=0 tx=0).stop 2026-03-09T16:17:46.813 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.812+0000 7f8df9082640 1 -- 192.168.123.103:0/2521600742 shutdown_connections 2026-03-09T16:17:46.813 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.812+0000 7f8df9082640 1 --2- 192.168.123.103:0/2521600742 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8df4102800 0x7f8df4102c60 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.813 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.812+0000 7f8df9082640 1 --2- 192.168.123.103:0/2521600742 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8df4108800 0x7f8df4108be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.813 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.812+0000 7f8df9082640 1 -- 192.168.123.103:0/2521600742 >> 192.168.123.103:0/2521600742 conn(0x7f8df40fe540 msgr2=0x7f8df4100960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:46.813 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.812+0000 7f8df9082640 1 -- 192.168.123.103:0/2521600742 shutdown_connections 2026-03-09T16:17:46.813 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.813+0000 7f8df9082640 1 -- 192.168.123.103:0/2521600742 wait complete. 
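The `ceph orch ps` listing above shows the cluster mid-upgrade: mon, mgr and crash daemons already report 19.2.3 (squid) while the mds daemons and four of the six osds are still on 18.2.7 (reef). Purely as an illustration and not code from this suite, the same split can be summarised from the JSON form of the command; the field names below assume cephadm's usual `orch ps --format json` dump.

# Sketch only: group cephadm daemons by reported version, e.g. to see which
# services are still on the reef build during a staggered upgrade.
import json
import subprocess
from collections import defaultdict

def daemon_versions():
    out = subprocess.check_output(["ceph", "orch", "ps", "--format", "json"])
    by_version = defaultdict(list)
    for d in json.loads(out):
        # Mid-upgrade this yields e.g. "19.2.3-678-ge911bdeb" and "18.2.7-1055-gab47f43c".
        by_version[d.get("version", "unknown")].append(
            f"{d.get('daemon_type')}.{d.get('daemon_id')}")
    return dict(by_version)

if __name__ == "__main__":
    for version, daemons in daemon_versions().items():
        print(version, "->", ", ".join(sorted(daemons)))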
2026-03-09T16:17:46.813 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.813+0000 7f8df9082640 1 Processor -- start 2026-03-09T16:17:46.814 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.813+0000 7f8df9082640 1 -- start start 2026-03-09T16:17:46.814 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.813+0000 7f8df9082640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8df4102800 0x7f8df419a6e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:46.814 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.813+0000 7f8df9082640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8df4108800 0x7f8df419ac20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:46.814 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.813+0000 7f8df9082640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8df419b2b0 con 0x7f8df4108800 2026-03-09T16:17:46.814 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.813+0000 7f8df9082640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8df419f020 con 0x7f8df4102800 2026-03-09T16:17:46.814 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.813+0000 7f8df37fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8df4108800 0x7f8df419ac20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:46.814 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.813+0000 7f8df37fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8df4108800 0x7f8df419ac20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41856/0 (socket says 192.168.123.103:41856) 2026-03-09T16:17:46.814 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.813+0000 7f8df37fe640 1 -- 192.168.123.103:0/1754295712 learned_addr learned my addr 192.168.123.103:0/1754295712 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:17:46.814 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.814+0000 7f8df3fff640 1 --2- 192.168.123.103:0/1754295712 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8df4102800 0x7f8df419a6e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:46.815 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.814+0000 7f8df37fe640 1 -- 192.168.123.103:0/1754295712 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8df4102800 msgr2=0x7f8df419a6e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:46.815 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.814+0000 7f8df37fe640 1 --2- 192.168.123.103:0/1754295712 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8df4102800 0x7f8df419a6e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.815 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.814+0000 7f8df37fe640 1 -- 192.168.123.103:0/1754295712 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f8de0009660 con 0x7f8df4108800 2026-03-09T16:17:46.815 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.814+0000 7f8df37fe640 1 --2- 192.168.123.103:0/1754295712 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8df4108800 0x7f8df419ac20 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f8de002f730 tx=0x7f8de0031cd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:46.815 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.814+0000 7f8df17fa640 1 -- 192.168.123.103:0/1754295712 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8de003d070 con 0x7f8df4108800 2026-03-09T16:17:46.815 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.814+0000 7f8df17fa640 1 -- 192.168.123.103:0/1754295712 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8de0031f00 con 0x7f8df4108800 2026-03-09T16:17:46.815 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.815+0000 7f8df17fa640 1 -- 192.168.123.103:0/1754295712 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8de0031310 con 0x7f8df4108800 2026-03-09T16:17:46.815 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.815+0000 7f8df9082640 1 -- 192.168.123.103:0/1754295712 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8df419f2a0 con 0x7f8df4108800 2026-03-09T16:17:46.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.815+0000 7f8df9082640 1 -- 192.168.123.103:0/1754295712 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8df419f710 con 0x7f8df4108800 2026-03-09T16:17:46.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.816+0000 7f8dceffd640 1 -- 192.168.123.103:0/1754295712 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8db4005350 con 0x7f8df4108800 2026-03-09T16:17:46.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.819+0000 7f8df17fa640 1 -- 192.168.123.103:0/1754295712 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f8de002fc90 con 0x7f8df4108800 2026-03-09T16:17:46.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.819+0000 7f8df17fa640 1 --2- 192.168.123.103:0/1754295712 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8dc80777f0 0x7f8dc8079cb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:46.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.819+0000 7f8df3fff640 1 --2- 192.168.123.103:0/1754295712 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8dc80777f0 0x7f8dc8079cb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:46.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.820+0000 7f8df3fff640 1 --2- 192.168.123.103:0/1754295712 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8dc80777f0 0x7f8dc8079cb0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f8de400a900 tx=0x7f8de4008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:46.820 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.820+0000 7f8df17fa640 1 -- 192.168.123.103:0/1754295712 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f8de00bec40 con 0x7f8df4108800
2026-03-09T16:17:46.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.820+0000 7f8df17fa640 1 -- 192.168.123.103:0/1754295712 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8de00c2280 con 0x7f8df4108800
2026-03-09T16:17:46.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.954+0000 7f8dceffd640 1 -- 192.168.123.103:0/1754295712 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f8db40058d0 con 0x7f8df4108800
2026-03-09T16:17:46.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.955+0000 7f8df17fa640 1 -- 192.168.123.103:0/1754295712 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+844 (secure 0 0 0) 0x7f8de00871d0 con 0x7f8df4108800
2026-03-09T16:17:46.955 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T16:17:46.955 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4,
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 8,
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T16:17:46.956 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T16:17:46.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.957+0000 7f8dceffd640 1 -- 192.168.123.103:0/1754295712 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8dc80777f0 msgr2=0x7f8dc8079cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:46.958
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.957+0000 7f8dceffd640 1 --2- 192.168.123.103:0/1754295712 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8dc80777f0 0x7f8dc8079cb0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f8de400a900 tx=0x7f8de4008040 comp rx=0 tx=0).stop 2026-03-09T16:17:46.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.957+0000 7f8dceffd640 1 -- 192.168.123.103:0/1754295712 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8df4108800 msgr2=0x7f8df419ac20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:46.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.957+0000 7f8dceffd640 1 --2- 192.168.123.103:0/1754295712 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8df4108800 0x7f8df419ac20 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f8de002f730 tx=0x7f8de0031cd0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.957+0000 7f8dceffd640 1 -- 192.168.123.103:0/1754295712 shutdown_connections 2026-03-09T16:17:46.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.957+0000 7f8dceffd640 1 --2- 192.168.123.103:0/1754295712 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8dc80777f0 0x7f8dc8079cb0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.957+0000 7f8dceffd640 1 --2- 192.168.123.103:0/1754295712 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8df4108800 0x7f8df419ac20 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.957+0000 7f8dceffd640 1 --2- 192.168.123.103:0/1754295712 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8df4102800 0x7f8df419a6e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:46.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.957+0000 7f8dceffd640 1 -- 192.168.123.103:0/1754295712 >> 192.168.123.103:0/1754295712 conn(0x7f8df40fe540 msgr2=0x7f8df40fe920 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:46.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.958+0000 7f8dceffd640 1 -- 192.168.123.103:0/1754295712 shutdown_connections 2026-03-09T16:17:46.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:46.958+0000 7f8dceffd640 1 -- 192.168.123.103:0/1754295712 wait complete. 
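The `ceph versions` output above confirms the staggered ordering: both mons and both mgrs already report the squid build, the osds are split 4 reef / 2 squid, and all four mds daemons are still on reef. As a hedged sketch only (not the suite's actual check; the target string is simply taken from the output above), such an ordering could be asserted from that JSON like this:

# Sketch: assert mon and mgr are fully on the target release while other
# daemon types may still be mid-upgrade. Assumes `ceph versions` JSON output.
import json
import subprocess

def assert_mon_mgr_upgraded(target="19.2.3"):
    versions = json.loads(subprocess.check_output(["ceph", "versions"]))
    for svc in ("mon", "mgr"):
        for ver_str, count in versions.get(svc, {}).items():
            # ver_str looks like "ceph version 19.2.3-678-ge911bdeb (...) squid (stable)"
            assert target in ver_str, f"{count} {svc} daemon(s) still on {ver_str}"

assert_mon_mgr_upgraded()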
2026-03-09T16:17:47.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.010+0000 7fe0bc48d640 1 -- 192.168.123.103:0/4246975810 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b41089d0 msgr2=0x7fe0b4108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:47.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.010+0000 7fe0bc48d640 1 --2- 192.168.123.103:0/4246975810 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b41089d0 0x7fe0b4108db0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fe0a8009a00 tx=0x7fe0a802f280 comp rx=0 tx=0).stop 2026-03-09T16:17:47.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.010+0000 7fe0bc48d640 1 -- 192.168.123.103:0/4246975810 shutdown_connections 2026-03-09T16:17:47.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.010+0000 7fe0bc48d640 1 --2- 192.168.123.103:0/4246975810 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0b41029d0 0x7fe0b4102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.010+0000 7fe0bc48d640 1 --2- 192.168.123.103:0/4246975810 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b41089d0 0x7fe0b4108db0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.010+0000 7fe0bc48d640 1 -- 192.168.123.103:0/4246975810 >> 192.168.123.103:0/4246975810 conn(0x7fe0b40fe710 msgr2=0x7fe0b4100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:47.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.010+0000 7fe0bc48d640 1 -- 192.168.123.103:0/4246975810 shutdown_connections 2026-03-09T16:17:47.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.011+0000 7fe0bc48d640 1 -- 192.168.123.103:0/4246975810 wait complete. 
2026-03-09T16:17:47.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.011+0000 7fe0bc48d640 1 Processor -- start 2026-03-09T16:17:47.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.011+0000 7fe0bc48d640 1 -- start start 2026-03-09T16:17:47.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.011+0000 7fe0bc48d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b41029d0 0x7fe0b4072780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:47.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.011+0000 7fe0bc48d640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0b4072cc0 0x7fe0b406b860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:47.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.011+0000 7fe0bc48d640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe0b406bda0 con 0x7fe0b41029d0 2026-03-09T16:17:47.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.011+0000 7fe0bc48d640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe0b406bf10 con 0x7fe0b4072cc0 2026-03-09T16:17:47.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.011+0000 7fe0b9a01640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0b4072cc0 0x7fe0b406b860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:47.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.011+0000 7fe0b9a01640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0b4072cc0 0x7fe0b406b860 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:54016/0 (socket says 192.168.123.103:54016) 2026-03-09T16:17:47.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.011+0000 7fe0b9a01640 1 -- 192.168.123.103:0/530731110 learned_addr learned my addr 192.168.123.103:0/530731110 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:17:47.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.012+0000 7fe0ba202640 1 --2- 192.168.123.103:0/530731110 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b41029d0 0x7fe0b4072780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:47.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.012+0000 7fe0b9a01640 1 -- 192.168.123.103:0/530731110 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b41029d0 msgr2=0x7fe0b4072780 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:47.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.012+0000 7fe0b9a01640 1 --2- 192.168.123.103:0/530731110 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b41029d0 0x7fe0b4072780 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.012+0000 7fe0b9a01640 1 -- 192.168.123.103:0/530731110 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe0a8009660 con 
0x7fe0b4072cc0 2026-03-09T16:17:47.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.012+0000 7fe0ba202640 1 --2- 192.168.123.103:0/530731110 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b41029d0 0x7fe0b4072780 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:17:47.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.012+0000 7fe0b9a01640 1 --2- 192.168.123.103:0/530731110 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0b4072cc0 0x7fe0b406b860 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fe0a400e9b0 tx=0x7fe0a400ee80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:47.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.012+0000 7fe0a37fe640 1 -- 192.168.123.103:0/530731110 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe0a400cd90 con 0x7fe0b4072cc0 2026-03-09T16:17:47.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.012+0000 7fe0bc48d640 1 -- 192.168.123.103:0/530731110 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe0b406c1f0 con 0x7fe0b4072cc0 2026-03-09T16:17:47.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.012+0000 7fe0a37fe640 1 -- 192.168.123.103:0/530731110 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe0a4004590 con 0x7fe0b4072cc0 2026-03-09T16:17:47.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.012+0000 7fe0a37fe640 1 -- 192.168.123.103:0/530731110 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe0a4010640 con 0x7fe0b4072cc0 2026-03-09T16:17:47.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.013+0000 7fe0bc48d640 1 -- 192.168.123.103:0/530731110 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe0b4070320 con 0x7fe0b4072cc0 2026-03-09T16:17:47.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.013+0000 7fe0bc48d640 1 -- 192.168.123.103:0/530731110 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe084005350 con 0x7fe0b4072cc0 2026-03-09T16:17:47.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.014+0000 7fe0a37fe640 1 -- 192.168.123.103:0/530731110 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fe0a40040d0 con 0x7fe0b4072cc0 2026-03-09T16:17:47.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.015+0000 7fe0a37fe640 1 --2- 192.168.123.103:0/530731110 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe094077a00 0x7fe094079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:47.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.015+0000 7fe0a37fe640 1 -- 192.168.123.103:0/530731110 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fe0a4014070 con 0x7fe0b4072cc0 2026-03-09T16:17:47.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.015+0000 7fe0ba202640 1 --2- 192.168.123.103:0/530731110 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe094077a00 0x7fe094079ec0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T16:17:47.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.016+0000 7fe0ba202640 1 --2- 192.168.123.103:0/530731110 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe094077a00 0x7fe094079ec0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fe0a80040c0 tx=0x7fe0a8002d80 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T16:17:47.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.018+0000 7fe0a37fe640 1 -- 192.168.123.103:0/530731110 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe0a4062030 con 0x7fe0b4072cc0
2026-03-09T16:17:47.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.131+0000 7fe0bc48d640 1 -- 192.168.123.103:0/530731110 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fe0840058d0 con 0x7fe0b4072cc0
2026-03-09T16:17:47.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.131+0000 7fe0a37fe640 1 -- 192.168.123.103:0/530731110 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1919 (secure 0 0 0) 0x7fe0a4061780 con 0x7fe0b4072cc0
2026-03-09T16:17:47.132 INFO:teuthology.orchestra.run.vm03.stdout:e12
2026-03-09T16:17:47.132 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000
2026-03-09T16:17:47.132 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T16:17:47.132 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T16:17:47.132 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-09T16:17:47.132 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:epoch 12
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T16:12:12.560035+0000
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T16:12:21.661284+0000
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 41
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:in 0
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14476}
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members:
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kygyjl{0:14476} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons:
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kntrco{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.sqhria{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.jgzfvu{-1:24291} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6824/1621230713,v1:192.168.123.105:6825/1621230713] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T16:17:47.133 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 12
2026-03-09T16:17:47.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.134+0000 7fe0bc48d640 1 -- 192.168.123.103:0/530731110 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe094077a00 msgr2=0x7fe094079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:17:47.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.134+0000 7fe0bc48d640 1 --2- 192.168.123.103:0/530731110 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe094077a00 0x7fe094079ec0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fe0a80040c0 tx=0x7fe0a8002d80 comp rx=0 tx=0).stop 2026-03-09T16:17:47.135
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.134+0000 7fe0bc48d640 1 -- 192.168.123.103:0/530731110 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0b4072cc0 msgr2=0x7fe0b406b860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:47.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.134+0000 7fe0bc48d640 1 --2- 192.168.123.103:0/530731110 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0b4072cc0 0x7fe0b406b860 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fe0a400e9b0 tx=0x7fe0a400ee80 comp rx=0 tx=0).stop 2026-03-09T16:17:47.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.134+0000 7fe0bc48d640 1 -- 192.168.123.103:0/530731110 shutdown_connections 2026-03-09T16:17:47.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.134+0000 7fe0bc48d640 1 --2- 192.168.123.103:0/530731110 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe094077a00 0x7fe094079ec0 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.134+0000 7fe0bc48d640 1 --2- 192.168.123.103:0/530731110 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0b4072cc0 0x7fe0b406b860 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.134+0000 7fe0bc48d640 1 --2- 192.168.123.103:0/530731110 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b41029d0 0x7fe0b4072780 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.134+0000 7fe0bc48d640 1 -- 192.168.123.103:0/530731110 >> 192.168.123.103:0/530731110 conn(0x7fe0b40fe710 msgr2=0x7fe0b4077970 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:47.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.135+0000 7fe0bc48d640 1 -- 192.168.123.103:0/530731110 shutdown_connections 2026-03-09T16:17:47.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.135+0000 7fe0bc48d640 1 -- 192.168.123.103:0/530731110 wait complete. 
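The `fs dump` above is the state the upgrade has to preserve for cephfs: max_mds 1, a single rank 0 held by mds.cephfs.vm03.kygyjl in up:active, three up:standby daemons, and inline_data still enabled. As a rough sketch only, the same facts could be pulled from the JSON form of the dump; the field names used below (`filesystems`, `mdsmap.info`, a top-level `standbys` list) are assumptions about the usual fs dump layout, not something verified against this run.

# Sketch: summarise active/standby MDS counts from `ceph fs dump --format json`.
import json
import subprocess

def fsmap_summary(fs_name="cephfs"):
    dump = json.loads(subprocess.check_output(["ceph", "fs", "dump", "--format", "json"]))
    standbys = len(dump.get("standbys", []))
    for fs in dump.get("filesystems", []):
        mdsmap = fs["mdsmap"]
        if mdsmap.get("fs_name") != fs_name:
            continue
        active = sum(1 for info in mdsmap.get("info", {}).values()
                     if info.get("state") == "up:active")
        return {"max_mds": mdsmap.get("max_mds"), "active": active, "standbys": standbys}
    raise ValueError(f"filesystem {fs_name!r} not found in fs dump")

print(fsmap_summary())  # expected for the dump above: max_mds 1, one active, three standbys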
2026-03-09T16:17:47.188 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.187+0000 7f9942fff640 1 -- 192.168.123.103:0/3842474341 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f993c068f10 msgr2=0x7f993c069370 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:47.188 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.187+0000 7f9942fff640 1 --2- 192.168.123.103:0/3842474341 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f993c068f10 0x7f993c069370 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f992c0099b0 tx=0x7f992c02f220 comp rx=0 tx=0).stop 2026-03-09T16:17:47.188 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.188+0000 7f9942fff640 1 -- 192.168.123.103:0/3842474341 shutdown_connections 2026-03-09T16:17:47.189 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.188+0000 7f9942fff640 1 --2- 192.168.123.103:0/3842474341 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f993c068f10 0x7f993c069370 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.189 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.188+0000 7f9942fff640 1 --2- 192.168.123.103:0/3842474341 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993c1067f0 0x7f993c106bd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.189 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.188+0000 7f9942fff640 1 -- 192.168.123.103:0/3842474341 >> 192.168.123.103:0/3842474341 conn(0x7f993c0fb3d0 msgr2=0x7f993c0fd7f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:47.189 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.188+0000 7f9942fff640 1 -- 192.168.123.103:0/3842474341 shutdown_connections 2026-03-09T16:17:47.189 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.188+0000 7f9942fff640 1 -- 192.168.123.103:0/3842474341 wait complete. 
2026-03-09T16:17:47.189 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.188+0000 7f9942fff640 1 Processor -- start 2026-03-09T16:17:47.189 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.188+0000 7f9942fff640 1 -- start start 2026-03-09T16:17:47.189 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.189+0000 7f9942fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993c068f10 0x7f993c1a05e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:47.189 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.189+0000 7f9942fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f993c1067f0 0x7f993c1a0b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:47.189 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.189+0000 7f9942fff640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f993c1a10b0 con 0x7f993c1067f0 2026-03-09T16:17:47.189 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.189+0000 7f9942fff640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f993c19a6d0 con 0x7f993c068f10 2026-03-09T16:17:47.190 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.189+0000 7f9940d74640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993c068f10 0x7f993c1a05e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:47.190 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.189+0000 7f9940d74640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993c068f10 0x7f993c1a05e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:54032/0 (socket says 192.168.123.103:54032) 2026-03-09T16:17:47.190 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.189+0000 7f9940d74640 1 -- 192.168.123.103:0/1935741231 learned_addr learned my addr 192.168.123.103:0/1935741231 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:17:47.190 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.189+0000 7f9933fff640 1 --2- 192.168.123.103:0/1935741231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f993c1067f0 0x7f993c1a0b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:47.190 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.189+0000 7f9940d74640 1 -- 192.168.123.103:0/1935741231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f993c1067f0 msgr2=0x7f993c1a0b20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:47.190 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.189+0000 7f9940d74640 1 --2- 192.168.123.103:0/1935741231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f993c1067f0 0x7f993c1a0b20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.190 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.189+0000 7f9940d74640 1 -- 192.168.123.103:0/1935741231 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f992c009660 con 0x7f993c068f10 2026-03-09T16:17:47.190 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.189+0000 7f9933fff640 1 --2- 192.168.123.103:0/1935741231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f993c1067f0 0x7f993c1a0b20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T16:17:47.190 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.190+0000 7f9940d74640 1 --2- 192.168.123.103:0/1935741231 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993c068f10 0x7f993c1a05e0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f992400ca30 tx=0x7f992400cf00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.190+0000 7f9931ffb640 1 -- 192.168.123.103:0/1935741231 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9924004430 con 0x7f993c068f10 2026-03-09T16:17:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.190+0000 7f9931ffb640 1 -- 192.168.123.103:0/1935741231 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9924004590 con 0x7f993c068f10 2026-03-09T16:17:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.190+0000 7f9931ffb640 1 -- 192.168.123.103:0/1935741231 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f992400f660 con 0x7f993c068f10 2026-03-09T16:17:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.190+0000 7f9942fff640 1 -- 192.168.123.103:0/1935741231 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f993c19a9b0 con 0x7f993c068f10 2026-03-09T16:17:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.190+0000 7f9942fff640 1 -- 192.168.123.103:0/1935741231 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f993c19adc0 con 0x7f993c068f10 2026-03-09T16:17:47.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.191+0000 7f9931ffb640 1 -- 192.168.123.103:0/1935741231 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f9924002870 con 0x7f993c068f10 2026-03-09T16:17:47.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.191+0000 7f9942fff640 1 -- 192.168.123.103:0/1935741231 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9908005350 con 0x7f993c068f10 2026-03-09T16:17:47.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.192+0000 7f9931ffb640 1 --2- 192.168.123.103:0/1935741231 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f990c0779b0 0x7f990c079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:47.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.192+0000 7f9931ffb640 1 -- 192.168.123.103:0/1935741231 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f9924099d40 con 0x7f993c068f10 2026-03-09T16:17:47.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.192+0000 7f9933fff640 1 --2- 192.168.123.103:0/1935741231 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f990c0779b0 
0x7f990c079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:47.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.192+0000 7f9933fff640 1 --2- 192.168.123.103:0/1935741231 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f990c0779b0 0x7f990c079e70 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f993c19bb40 tx=0x7f992c03a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:47.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.194+0000 7f9931ffb640 1 -- 192.168.123.103:0/1935741231 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9924062250 con 0x7f993c068f10 2026-03-09T16:17:47.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.295+0000 7f9942fff640 1 -- 192.168.123.103:0/1935741231 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9908002bf0 con 0x7f990c0779b0 2026-03-09T16:17:47.297 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:47 vm03.local ceph-mon[133973]: pgmap v39: 65 pgs: 65 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:17:47.297 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:47 vm03.local ceph-mon[133973]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 34/264 objects degraded (12.879%), 13 pgs degraded) 2026-03-09T16:17:47.297 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:47 vm03.local ceph-mon[133973]: from='client.? 
192.168.123.103:0/1754295712' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T16:17:47.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.299+0000 7f9931ffb640 1 -- 192.168.123.103:0/1935741231 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f9908002bf0 con 0x7f990c0779b0
2026-03-09T16:17:47.301 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T16:17:47.301 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T16:17:47.301 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T16:17:47.301 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-09T16:17:47.301 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [
2026-03-09T16:17:47.301 INFO:teuthology.orchestra.run.vm03.stdout: "mon",
2026-03-09T16:17:47.301 INFO:teuthology.orchestra.run.vm03.stdout: "crash",
2026-03-09T16:17:47.301 INFO:teuthology.orchestra.run.vm03.stdout: "mgr"
2026-03-09T16:17:47.301 INFO:teuthology.orchestra.run.vm03.stdout: ],
2026-03-09T16:17:47.301 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "8/23 daemons upgraded",
2026-03-09T16:17:47.301 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons",
2026-03-09T16:17:47.301 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T16:17:47.301 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T16:17:47.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.303+0000 7f9942fff640 1 -- 192.168.123.103:0/1935741231 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f990c0779b0 msgr2=0x7f990c079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:17:47.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.304+0000 7f9942fff640 1 --2- 192.168.123.103:0/1935741231 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f990c0779b0 0x7f990c079e70 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f993c19bb40 tx=0x7f992c03a040 comp rx=0 tx=0).stop
2026-03-09T16:17:47.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.304+0000 7f9942fff640 1 -- 192.168.123.103:0/1935741231 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993c068f10 msgr2=0x7f993c1a05e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:17:47.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.304+0000 7f9942fff640 1 --2- 192.168.123.103:0/1935741231 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993c068f10 0x7f993c1a05e0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f992400ca30 tx=0x7f992400cf00 comp rx=0 tx=0).stop
2026-03-09T16:17:47.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.304+0000 7f9942fff640 1 -- 192.168.123.103:0/1935741231 shutdown_connections
2026-03-09T16:17:47.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.304+0000 7f9942fff640 1 --2- 192.168.123.103:0/1935741231 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f990c0779b0 0x7f990c079e70 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T16:17:47.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.304+0000 7f9942fff640 1 --2- 192.168.123.103:0/1935741231 
>> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f993c1067f0 0x7f993c1a0b20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.304+0000 7f9942fff640 1 --2- 192.168.123.103:0/1935741231 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993c068f10 0x7f993c1a05e0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.304+0000 7f9942fff640 1 -- 192.168.123.103:0/1935741231 >> 192.168.123.103:0/1935741231 conn(0x7f993c0fb3d0 msgr2=0x7f993c0ff5a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:47.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.304+0000 7f9942fff640 1 -- 192.168.123.103:0/1935741231 shutdown_connections 2026-03-09T16:17:47.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.305+0000 7f9942fff640 1 -- 192.168.123.103:0/1935741231 wait complete. 2026-03-09T16:17:47.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.355+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1954253179 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff0a00ff190 msgr2=0x7ff0a010c7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:47.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.355+0000 7ff0a6f5e640 1 --2- 192.168.123.103:0/1954253179 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff0a00ff190 0x7ff0a010c7f0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7ff094009a30 tx=0x7ff09402f380 comp rx=0 tx=0).stop 2026-03-09T16:17:47.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.356+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1954253179 shutdown_connections 2026-03-09T16:17:47.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.356+0000 7ff0a6f5e640 1 --2- 192.168.123.103:0/1954253179 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff0a00ff190 0x7ff0a010c7f0 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.356+0000 7ff0a6f5e640 1 --2- 192.168.123.103:0/1954253179 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0a00fe870 0x7ff0a00fec50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.356+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1954253179 >> 192.168.123.103:0/1954253179 conn(0x7ff0a00fa4a0 msgr2=0x7ff0a00fc8c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:47.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.356+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1954253179 shutdown_connections 2026-03-09T16:17:47.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.356+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1954253179 wait complete. 
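The "orch upgrade status" JSON above shows the staggered upgrade at 8/23 daemons, with mon, crash, and mgr complete and OSDs in progress. A minimal sketch of polling that same command until cephadm no longer reports the upgrade as in progress, assuming the ceph CLI and an admin keyring are available on the node; the wait_for_upgrade helper and the 30-second interval are illustrative, not part of the test:

#!/usr/bin/env python3
# Minimal sketch: poll "ceph orch upgrade status" (the same command issued in the
# log above) until cephadm no longer reports the upgrade as in progress.
# Assumptions: the ceph CLI and an admin keyring are available on this node;
# the helper name and the 30-second interval are illustrative, not from the test.
import json
import subprocess
import time

def upgrade_status() -> dict:
    # The command prints JSON with the fields seen above:
    # in_progress, which, services_complete, progress, message, is_paused.
    out = subprocess.check_output(["ceph", "orch", "upgrade", "status"])
    return json.loads(out)

def wait_for_upgrade(interval: int = 30) -> None:
    while True:
        status = upgrade_status()
        print(status.get("progress"), "-", status.get("message"))
        if not status.get("in_progress"):
            return
        time.sleep(interval)

if __name__ == "__main__":
    wait_for_upgrade()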
2026-03-09T16:17:47.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.357+0000 7ff0a6f5e640 1 Processor -- start 2026-03-09T16:17:47.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.357+0000 7ff0a6f5e640 1 -- start start 2026-03-09T16:17:47.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.357+0000 7ff0a6f5e640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff0a00fe870 0x7ff0a019a710 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:47.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.357+0000 7ff0a6f5e640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0a00ff190 0x7ff0a019ac50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:47.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.357+0000 7ff0a6f5e640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff0a019b330 con 0x7ff0a00fe870 2026-03-09T16:17:47.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.357+0000 7ff0a6f5e640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff0a019dfa0 con 0x7ff0a00ff190 2026-03-09T16:17:47.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.357+0000 7ff0a575b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0a00ff190 0x7ff0a019ac50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:47.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.357+0000 7ff0a575b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0a00ff190 0x7ff0a019ac50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:54052/0 (socket says 192.168.123.103:54052) 2026-03-09T16:17:47.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.357+0000 7ff0a575b640 1 -- 192.168.123.103:0/1363401109 learned_addr learned my addr 192.168.123.103:0/1363401109 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:17:47.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.357+0000 7ff0a5f5c640 1 --2- 192.168.123.103:0/1363401109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff0a00fe870 0x7ff0a019a710 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:47.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.358+0000 7ff0a575b640 1 -- 192.168.123.103:0/1363401109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff0a00fe870 msgr2=0x7ff0a019a710 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:47.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.358+0000 7ff0a575b640 1 --2- 192.168.123.103:0/1363401109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff0a00fe870 0x7ff0a019a710 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.358+0000 7ff0a575b640 1 -- 192.168.123.103:0/1363401109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7ff094009660 con 0x7ff0a00ff190 2026-03-09T16:17:47.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.358+0000 7ff0a5f5c640 1 --2- 192.168.123.103:0/1363401109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff0a00fe870 0x7ff0a019a710 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T16:17:47.359 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.358+0000 7ff0a575b640 1 --2- 192.168.123.103:0/1363401109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0a00ff190 0x7ff0a019ac50 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7ff094005ec0 tx=0x7ff094004830 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:47.359 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.358+0000 7ff08effd640 1 -- 192.168.123.103:0/1363401109 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff094004410 con 0x7ff0a00ff190 2026-03-09T16:17:47.359 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.358+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1363401109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff0a019e220 con 0x7ff0a00ff190 2026-03-09T16:17:47.359 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.359+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1363401109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff0a019e710 con 0x7ff0a00ff190 2026-03-09T16:17:47.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.359+0000 7ff08effd640 1 -- 192.168.123.103:0/1363401109 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff09402fdb0 con 0x7ff0a00ff190 2026-03-09T16:17:47.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.359+0000 7ff08effd640 1 -- 192.168.123.103:0/1363401109 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff0940319b0 con 0x7ff0a00ff190 2026-03-09T16:17:47.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.359+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1363401109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff068005350 con 0x7ff0a00ff190 2026-03-09T16:17:47.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.360+0000 7ff08effd640 1 -- 192.168.123.103:0/1363401109 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7ff09403f070 con 0x7ff0a00ff190 2026-03-09T16:17:47.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.360+0000 7ff08effd640 1 --2- 192.168.123.103:0/1363401109 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff07c0777f0 0x7ff07c079cb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:17:47.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.361+0000 7ff0a5f5c640 1 --2- 192.168.123.103:0/1363401109 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff07c0777f0 0x7ff07c079cb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:17:47.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.361+0000 7ff0a5f5c640 1 --2- 
192.168.123.103:0/1363401109 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff07c0777f0 0x7ff07c079cb0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7ff090009fd0 tx=0x7ff090009290 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:17:47.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.361+0000 7ff08effd640 1 -- 192.168.123.103:0/1363401109 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6394+0+0 (secure 0 0 0) 0x7ff0940be450 con 0x7ff0a00ff190 2026-03-09T16:17:47.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.363+0000 7ff08effd640 1 -- 192.168.123.103:0/1363401109 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff094086a10 con 0x7ff0a00ff190 2026-03-09T16:17:47.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.491+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1363401109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7ff0680058d0 con 0x7ff0a00ff190 2026-03-09T16:17:47.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.492+0000 7ff08effd640 1 -- 192.168.123.103:0/1363401109 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7ff094086160 con 0x7ff0a00ff190 2026-03-09T16:17:47.493 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T16:17:47.493 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T16:17:47.493 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled. 
2026-03-09T16:17:47.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.495+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1363401109 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff07c0777f0 msgr2=0x7ff07c079cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:47.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.495+0000 7ff0a6f5e640 1 --2- 192.168.123.103:0/1363401109 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff07c0777f0 0x7ff07c079cb0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7ff090009fd0 tx=0x7ff090009290 comp rx=0 tx=0).stop 2026-03-09T16:17:47.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.495+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1363401109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0a00ff190 msgr2=0x7ff0a019ac50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:17:47.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.495+0000 7ff0a6f5e640 1 --2- 192.168.123.103:0/1363401109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0a00ff190 0x7ff0a019ac50 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7ff094005ec0 tx=0x7ff094004830 comp rx=0 tx=0).stop 2026-03-09T16:17:47.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.495+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1363401109 shutdown_connections 2026-03-09T16:17:47.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.495+0000 7ff0a6f5e640 1 --2- 192.168.123.103:0/1363401109 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff07c0777f0 0x7ff07c079cb0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.495+0000 7ff0a6f5e640 1 --2- 192.168.123.103:0/1363401109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0a00ff190 0x7ff0a019ac50 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.495+0000 7ff0a6f5e640 1 --2- 192.168.123.103:0/1363401109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff0a00fe870 0x7ff0a019a710 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:17:47.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.495+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1363401109 >> 192.168.123.103:0/1363401109 conn(0x7ff0a00fa4a0 msgr2=0x7ff0a00fbb80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:17:47.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.495+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1363401109 shutdown_connections 2026-03-09T16:17:47.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:17:47.495+0000 7ff0a6f5e640 1 -- 192.168.123.103:0/1363401109 wait complete. 
2026-03-09T16:17:47.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:47 vm05.local ceph-mon[108543]: pgmap v39: 65 pgs: 65 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:17:47.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:47 vm05.local ceph-mon[108543]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 34/264 objects degraded (12.879%), 13 pgs degraded) 2026-03-09T16:17:47.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:47 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/1754295712' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:17:48.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:48 vm03.local ceph-mon[133973]: from='client.34158 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:48.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:48 vm03.local ceph-mon[133973]: from='client.34162 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:48.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:48 vm03.local ceph-mon[133973]: from='client.34166 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:48.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:48 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/530731110' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:17:48.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:48 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/1363401109' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:17:48.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:48 vm05.local ceph-mon[108543]: from='client.34158 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:48.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:48 vm05.local ceph-mon[108543]: from='client.34162 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:48.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:48 vm05.local ceph-mon[108543]: from='client.34166 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:48.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:48 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/530731110' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:17:48.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:48 vm05.local ceph-mon[108543]: from='client.? 
192.168.123.103:0/1363401109' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:17:49.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:49 vm03.local ceph-mon[133973]: from='client.44137 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:49.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:49 vm03.local ceph-mon[133973]: pgmap v40: 65 pgs: 65 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:17:49.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:49 vm05.local ceph-mon[108543]: from='client.44137 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:17:49.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:49 vm05.local ceph-mon[108543]: pgmap v40: 65 pgs: 65 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:17:51.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:51 vm03.local ceph-mon[133973]: pgmap v41: 65 pgs: 65 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:17:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:51 vm05.local ceph-mon[108543]: pgmap v41: 65 pgs: 65 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:17:53.205 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:53 vm03.local ceph-mon[133973]: pgmap v42: 65 pgs: 65 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:17:53.205 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:53 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T16:17:53.205 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:53 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:53.205 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:53 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-09T16:17:53.205 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:53 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:17:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:53 vm05.local ceph-mon[108543]: pgmap v42: 65 pgs: 65 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:17:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:53 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T16:17:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:53 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:53 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-09T16:17:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:53 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config 
generate-minimal-conf"}]: dispatch 2026-03-09T16:17:53.891 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:53 vm03.local systemd[1]: Stopping Ceph osd.2 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 2026-03-09T16:17:53.891 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:53 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2[85204]: 2026-03-09T16:17:53.590+0000 7fa3e9735640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:17:53.891 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:53 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2[85204]: 2026-03-09T16:17:53.590+0000 7fa3e9735640 -1 osd.2 56 *** Got signal Terminated *** 2026-03-09T16:17:53.891 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:53 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2[85204]: 2026-03-09T16:17:53.590+0000 7fa3e9735640 -1 osd.2 56 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T16:17:54.584 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:54 vm03.local podman[152176]: 2026-03-09 16:17:54.429338912 +0000 UTC m=+0.852024160 container died 31188175e77b1f277fb900a48503ec62001d55a79a6b741f8d0c4c2b7c5c883e (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default) 2026-03-09T16:17:54.584 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:54 vm03.local podman[152176]: 2026-03-09 16:17:54.583938591 +0000 UTC m=+1.006623848 container remove 31188175e77b1f277fb900a48503ec62001d55a79a6b741f8d0c4c2b7c5c883e (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:17:54.584 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:54 vm03.local ceph-mon[133973]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T16:17:54.584 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:54 vm03.local ceph-mon[133973]: Upgrade: osd.2 is safe to restart 2026-03-09T16:17:54.584 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:54 vm03.local ceph-mon[133973]: Upgrade: Updating osd.2 2026-03-09T16:17:54.584 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:54 vm03.local ceph-mon[133973]: Deploying daemon osd.2 on vm03 2026-03-09T16:17:54.584 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:54 vm03.local ceph-mon[133973]: osd.2 marked itself down and dead 2026-03-09T16:17:54.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:54 vm05.local ceph-mon[108543]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T16:17:54.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:54 vm05.local ceph-mon[108543]: Upgrade: osd.2 is safe to restart 2026-03-09T16:17:54.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:54 vm05.local ceph-mon[108543]: Upgrade: Updating osd.2 2026-03-09T16:17:54.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:54 vm05.local ceph-mon[108543]: Deploying daemon osd.2 on vm03 2026-03-09T16:17:54.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:54 vm05.local ceph-mon[108543]: osd.2 marked itself down and dead 2026-03-09T16:17:54.841 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:54 vm03.local bash[152176]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2 2026-03-09T16:17:55.092 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:54 vm03.local podman[152242]: 2026-03-09 16:17:54.841202448 +0000 UTC m=+0.106805120 container create 59ce90fb861064f7d09da91ab1de5e41a41d904bb5eb7bf005cd9fc9b963234e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-deactivate, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:17:55.092 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:54 vm03.local podman[152242]: 2026-03-09 16:17:54.753939436 +0000 UTC m=+0.019542119 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:17:55.092 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:54 vm03.local podman[152242]: 2026-03-09 16:17:54.916454992 +0000 UTC m=+0.182057664 container init 59ce90fb861064f7d09da91ab1de5e41a41d904bb5eb7bf005cd9fc9b963234e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, 
org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T16:17:55.092 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:54 vm03.local podman[152242]: 2026-03-09 16:17:54.921035635 +0000 UTC m=+0.186638307 container start 59ce90fb861064f7d09da91ab1de5e41a41d904bb5eb7bf005cd9fc9b963234e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-deactivate, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T16:17:55.092 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:54 vm03.local podman[152242]: 2026-03-09 16:17:54.940172584 +0000 UTC m=+0.205775246 container attach 59ce90fb861064f7d09da91ab1de5e41a41d904bb5eb7bf005cd9fc9b963234e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:17:55.395 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:55 vm03.local podman[152261]: 2026-03-09 16:17:55.091566453 +0000 UTC m=+0.010261017 container died 59ce90fb861064f7d09da91ab1de5e41a41d904bb5eb7bf005cd9fc9b963234e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-deactivate, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:17:55.396 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:55 vm03.local podman[152261]: 2026-03-09 16:17:55.289603785 +0000 UTC m=+0.208298349 container remove 59ce90fb861064f7d09da91ab1de5e41a41d904bb5eb7bf005cd9fc9b963234e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-deactivate, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True) 2026-03-09T16:17:55.396 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:55 vm03.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.2.service: Deactivated successfully. 2026-03-09T16:17:55.396 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:55 vm03.local systemd[1]: Stopped Ceph osd.2 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 2026-03-09T16:17:55.396 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:55 vm03.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.2.service: Consumed 30.592s CPU time. 2026-03-09T16:17:55.772 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:55 vm03.local ceph-mon[133973]: pgmap v43: 65 pgs: 65 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:17:55.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:55 vm03.local ceph-mon[133973]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T16:17:55.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:55 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:55.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:55 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:55.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:55 vm03.local ceph-mon[133973]: osdmap e57: 6 total, 5 up, 6 in 2026-03-09T16:17:55.773 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:55 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:17:55.773 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:55 vm03.local systemd[1]: Starting Ceph osd.2 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 
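The entries above show the gate cephadm applies before touching each OSD during the upgrade: the mgr asks the monitors "osd ok-to-stop" for osd.2, the monitors answer that it is safe to restart, and only then is the daemon stopped, its old container removed, and a new one deployed from the target image. A minimal sketch of that pre-check, assuming the ceph CLI and an admin keyring are available; restarting via "ceph orch daemon restart" is an illustrative stand-in, not the exact code path cephadm runs internally:

#!/usr/bin/env python3
# Minimal sketch of the safety gate visible above: ask the monitors
# "osd ok-to-stop" before restarting a single OSD.
# Assumptions: ceph CLI and admin keyring available; "ceph orch daemon restart"
# is an illustrative stand-in for cephadm's internal redeploy of the daemon.
import subprocess

def ok_to_stop(osd_id: int) -> bool:
    # ok-to-stop exits non-zero when stopping this OSD would leave PGs unavailable.
    result = subprocess.run(["ceph", "osd", "ok-to-stop", str(osd_id)])
    return result.returncode == 0

def restart_osd(osd_id: int) -> None:
    if not ok_to_stop(osd_id):
        raise RuntimeError(f"osd.{osd_id} is not safe to stop right now")
    subprocess.run(["ceph", "orch", "daemon", "restart", f"osd.{osd_id}"], check=True)

if __name__ == "__main__":
    restart_osd(2)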
2026-03-09T16:17:55.854 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:55 vm05.local ceph-mon[108543]: pgmap v43: 65 pgs: 65 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:17:55.854 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:55 vm05.local ceph-mon[108543]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T16:17:55.854 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:55 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:55.854 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:55 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:55.854 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:55 vm05.local ceph-mon[108543]: osdmap e57: 6 total, 5 up, 6 in 2026-03-09T16:17:55.854 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:55 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:17:56.034 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:55 vm03.local podman[152353]: 2026-03-09 16:17:55.772054242 +0000 UTC m=+0.073782001 container create a0b56f7705a3b0aa3a5bfec3dbbdd266cf573ded464e750c88da95d5e7b0a90e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T16:17:56.035 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:55 vm03.local podman[152353]: 2026-03-09 16:17:55.708490968 +0000 UTC m=+0.010218737 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:17:56.035 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:55 vm03.local podman[152353]: 2026-03-09 16:17:55.896041933 +0000 UTC m=+0.197769701 container init a0b56f7705a3b0aa3a5bfec3dbbdd266cf573ded464e750c88da95d5e7b0a90e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_REF=squid) 2026-03-09T16:17:56.035 
INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:55 vm03.local podman[152353]: 2026-03-09 16:17:55.909531822 +0000 UTC m=+0.211259581 container start a0b56f7705a3b0aa3a5bfec3dbbdd266cf573ded464e750c88da95d5e7b0a90e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:17:56.035 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:55 vm03.local podman[152353]: 2026-03-09 16:17:55.945854869 +0000 UTC m=+0.247582628 container attach a0b56f7705a3b0aa3a5bfec3dbbdd266cf573ded464e750c88da95d5e7b0a90e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.41.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T16:17:56.035 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:55 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate[152364]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:56.035 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:55 vm03.local bash[152353]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:56.035 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate[152364]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:56.391 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local bash[152353]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:56.822 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:56 vm03.local ceph-mon[133973]: osdmap e58: 6 total, 5 up, 6 in 2026-03-09T16:17:56.822 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:56 vm03.local ceph-mon[133973]: pgmap v46: 65 pgs: 12 peering, 7 stale+active+clean, 46 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:17:56.823 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate[152364]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T16:17:56.823 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local 
ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate[152364]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:56.823 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local bash[152353]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T16:17:56.823 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local bash[152353]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:56.823 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate[152364]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:56.823 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local bash[152353]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:17:56.823 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate[152364]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-09T16:17:56.823 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local bash[152353]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-09T16:17:56.823 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate[152364]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-ec02378b-a883-47ae-bde2-eabc224a76f3/osd-block-5f4a9aed-e670-4b8f-b945-c157bdccafca --path /var/lib/ceph/osd/ceph-2 --no-mon-config 2026-03-09T16:17:56.823 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local bash[152353]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-ec02378b-a883-47ae-bde2-eabc224a76f3/osd-block-5f4a9aed-e670-4b8f-b945-c157bdccafca --path /var/lib/ceph/osd/ceph-2 --no-mon-config 2026-03-09T16:17:57.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:56 vm05.local ceph-mon[108543]: osdmap e58: 6 total, 5 up, 6 in 2026-03-09T16:17:57.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:56 vm05.local ceph-mon[108543]: pgmap v46: 65 pgs: 12 peering, 7 stale+active+clean, 46 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:17:57.119 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate[152364]: Running command: /usr/bin/ln -snf /dev/ceph-ec02378b-a883-47ae-bde2-eabc224a76f3/osd-block-5f4a9aed-e670-4b8f-b945-c157bdccafca /var/lib/ceph/osd/ceph-2/block 2026-03-09T16:17:57.120 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local bash[152353]: Running command: /usr/bin/ln -snf /dev/ceph-ec02378b-a883-47ae-bde2-eabc224a76f3/osd-block-5f4a9aed-e670-4b8f-b945-c157bdccafca /var/lib/ceph/osd/ceph-2/block 2026-03-09T16:17:57.120 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate[152364]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block 2026-03-09T16:17:57.120 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local bash[152353]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block 2026-03-09T16:17:57.120 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate[152364]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-09T16:17:57.120 
INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local bash[152353]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-09T16:17:57.120 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate[152364]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-09T16:17:57.120 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local bash[152353]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-09T16:17:57.120 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate[152364]: --> ceph-volume lvm activate successful for osd ID: 2 2026-03-09T16:17:57.120 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local bash[152353]: --> ceph-volume lvm activate successful for osd ID: 2 2026-03-09T16:17:57.120 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:56 vm03.local podman[152353]: 2026-03-09 16:17:56.984233902 +0000 UTC m=+1.285961662 container died a0b56f7705a3b0aa3a5bfec3dbbdd266cf573ded464e750c88da95d5e7b0a90e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, ceph=True) 2026-03-09T16:17:57.387 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:57 vm03.local podman[152353]: 2026-03-09 16:17:57.183279241 +0000 UTC m=+1.485007000 container remove a0b56f7705a3b0aa3a5bfec3dbbdd266cf573ded464e750c88da95d5e7b0a90e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-activate, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS) 2026-03-09T16:17:57.387 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:57 vm03.local podman[152627]: 2026-03-09 16:17:57.310448738 +0000 UTC m=+0.032665011 container create 2e666ccd4bf7899b3ebe55258b3f8d81df7ebf8f6b6c4cae6ba5fa2559bcae7c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True) 2026-03-09T16:17:57.387 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:57 vm03.local podman[152627]: 2026-03-09 16:17:57.288255549 +0000 UTC m=+0.010471822 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:17:57.640 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:57 vm03.local podman[152627]: 2026-03-09 16:17:57.394349218 +0000 UTC m=+0.116565490 container init 2e666ccd4bf7899b3ebe55258b3f8d81df7ebf8f6b6c4cae6ba5fa2559bcae7c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_REF=squid) 2026-03-09T16:17:57.640 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:57 vm03.local podman[152627]: 2026-03-09 16:17:57.399079521 +0000 UTC m=+0.121295784 container start 2e666ccd4bf7899b3ebe55258b3f8d81df7ebf8f6b6c4cae6ba5fa2559bcae7c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2) 2026-03-09T16:17:57.640 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:57 vm03.local bash[152627]: 2e666ccd4bf7899b3ebe55258b3f8d81df7ebf8f6b6c4cae6ba5fa2559bcae7c 2026-03-09T16:17:57.640 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:57 vm03.local systemd[1]: Started Ceph osd.2 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 
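With osd.2 redeployed and started on the new image, the monitors briefly flag OSD_DOWN and peering/degraded PGs (visible above and below) before those checks clear again. A minimal sketch of waiting for such transient checks to clear before proceeding to the next daemon, assuming the ceph CLI and an admin keyring are available; the helper name, timeout, and interval are illustrative, not taken from the test:

#!/usr/bin/env python3
# Minimal sketch: after one OSD is redeployed, wait for the transient
# OSD_DOWN / PG_AVAILABILITY / PG_DEGRADED checks to clear before moving on.
# Assumptions: ceph CLI and admin keyring available; names and timing illustrative.
import subprocess
import time

TRANSIENT_CHECKS = ("OSD_DOWN", "PG_AVAILABILITY", "PG_DEGRADED")

def wait_for_clean(timeout: int = 600, interval: int = 10) -> None:
    deadline = time.time() + timeout
    while time.time() < deadline:
        # "ceph health detail" is the same query the client issues in the log above.
        detail = subprocess.check_output(["ceph", "health", "detail"], text=True)
        if not any(check in detail for check in TRANSIENT_CHECKS):
            return
        time.sleep(interval)
    raise TimeoutError("health checks did not clear within the timeout")

if __name__ == "__main__":
    wait_for_clean()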
2026-03-09T16:17:58.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:57 vm05.local ceph-mon[108543]: Health check failed: Reduced data availability: 1 pg peering (PG_AVAILABILITY) 2026-03-09T16:17:58.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:57 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:58.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:57 vm03.local ceph-mon[133973]: Health check failed: Reduced data availability: 1 pg peering (PG_AVAILABILITY) 2026-03-09T16:17:58.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:57 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:58.640 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:17:58 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2[152639]: 2026-03-09T16:17:58.231+0000 7fdda572b740 -1 Falling back to public interface 2026-03-09T16:17:59.031 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:58 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:59.031 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:58 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:59.031 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:17:58 vm03.local ceph-mon[133973]: pgmap v47: 65 pgs: 12 peering, 7 stale+active+clean, 46 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:17:59.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:58 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:17:59.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:58 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:17:59.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:17:58 vm05.local ceph-mon[108543]: pgmap v47: 65 pgs: 12 peering, 7 stale+active+clean, 46 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:00.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:00 vm03.local ceph-mon[133973]: Health check failed: Degraded data redundancy: 24/264 objects degraded (9.091%), 7 pgs degraded (PG_DEGRADED) 2026-03-09T16:18:00.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:00 vm05.local ceph-mon[108543]: Health check failed: Degraded data redundancy: 24/264 objects degraded (9.091%), 7 pgs degraded (PG_DEGRADED) 2026-03-09T16:18:01.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:01 vm03.local ceph-mon[133973]: pgmap v48: 65 pgs: 8 active+undersized, 12 peering, 7 active+undersized+degraded, 38 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 24/264 objects degraded (9.091%) 2026-03-09T16:18:01.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:01.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:01.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 
2026-03-09T16:18:01.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:01 vm05.local ceph-mon[108543]: pgmap v48: 65 pgs: 8 active+undersized, 12 peering, 7 active+undersized+degraded, 38 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 24/264 objects degraded (9.091%) 2026-03-09T16:18:01.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:01.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:01.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:02.663 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:02 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:02.663 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:02 vm03.local ceph-mon[133973]: pgmap v49: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 34/264 objects degraded (12.879%) 2026-03-09T16:18:02.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:02 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:02.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:02 vm05.local ceph-mon[108543]: pgmap v49: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 34/264 objects degraded (12.879%) 2026-03-09T16:18:03.517 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:18:03 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2[152639]: 2026-03-09T16:18:03.388+0000 7fdda572b740 -1 osd.2 56 log_to_monitors true 2026-03-09T16:18:03.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:03 vm03.local ceph-mon[133973]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg peering) 2026-03-09T16:18:03.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:03.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:03.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:18:03.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:18:03.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:03.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:18:03.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:03 vm03.local ceph-mon[133973]: 
from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:03.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:03.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:03.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T16:18:03.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:03 vm03.local ceph-mon[133973]: from='osd.2 [v2:192.168.123.103:6818/1751250843,v1:192.168.123.103:6819/1751250843]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-09T16:18:04.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:03 vm05.local ceph-mon[108543]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg peering) 2026-03-09T16:18:04.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:04.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:04.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:18:04.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:18:04.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:04.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:18:04.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:04.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:04.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:04.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T16:18:04.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:03 vm05.local 
ceph-mon[108543]: from='osd.2 [v2:192.168.123.103:6818/1751250843,v1:192.168.123.103:6819/1751250843]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-09T16:18:04.640 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:18:04 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2[152639]: 2026-03-09T16:18:04.193+0000 7fdd9d4c5640 -1 osd.2 56 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T16:18:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:04 vm05.local ceph-mon[108543]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T16:18:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:04 vm05.local ceph-mon[108543]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-09T16:18:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:04 vm05.local ceph-mon[108543]: pgmap v50: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 34/264 objects degraded (12.879%) 2026-03-09T16:18:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:04 vm05.local ceph-mon[108543]: from='osd.2 [v2:192.168.123.103:6818/1751250843,v1:192.168.123.103:6819/1751250843]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-09T16:18:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:04 vm05.local ceph-mon[108543]: osdmap e59: 6 total, 5 up, 6 in 2026-03-09T16:18:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:04 vm05.local ceph-mon[108543]: from='osd.2 [v2:192.168.123.103:6818/1751250843,v1:192.168.123.103:6819/1751250843]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T16:18:05.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:04 vm03.local ceph-mon[133973]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T16:18:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:04 vm03.local ceph-mon[133973]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-09T16:18:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:04 vm03.local ceph-mon[133973]: pgmap v50: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 254 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 34/264 objects degraded (12.879%) 2026-03-09T16:18:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:04 vm03.local ceph-mon[133973]: from='osd.2 [v2:192.168.123.103:6818/1751250843,v1:192.168.123.103:6819/1751250843]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-09T16:18:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:04 vm03.local ceph-mon[133973]: osdmap e59: 6 total, 5 up, 6 in 2026-03-09T16:18:05.142 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:04 vm03.local ceph-mon[133973]: from='osd.2 [v2:192.168.123.103:6818/1751250843,v1:192.168.123.103:6819/1751250843]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T16:18:06.135 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:05 vm03.local ceph-mon[133973]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T16:18:06.135 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:05 vm03.local ceph-mon[133973]: osd.2 [v2:192.168.123.103:6818/1751250843,v1:192.168.123.103:6819/1751250843] boot 2026-03-09T16:18:06.135 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:05 vm03.local ceph-mon[133973]: osdmap e60: 6 total, 6 up, 6 in 2026-03-09T16:18:06.135 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:05 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:18:06.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:05 vm05.local ceph-mon[108543]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T16:18:06.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:05 vm05.local ceph-mon[108543]: osd.2 [v2:192.168.123.103:6818/1751250843,v1:192.168.123.103:6819/1751250843] boot 2026-03-09T16:18:06.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:05 vm05.local ceph-mon[108543]: osdmap e60: 6 total, 6 up, 6 in 2026-03-09T16:18:06.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:05 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T16:18:07.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:06 vm03.local ceph-mon[133973]: Health check update: Degraded data redundancy: 34/264 objects degraded (12.879%), 12 pgs degraded (PG_DEGRADED) 2026-03-09T16:18:07.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:06 vm03.local ceph-mon[133973]: pgmap v53: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 34/264 objects degraded (12.879%) 2026-03-09T16:18:07.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:06 vm03.local ceph-mon[133973]: osdmap e61: 6 total, 6 up, 6 in 2026-03-09T16:18:07.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:06 vm05.local ceph-mon[108543]: Health 
check update: Degraded data redundancy: 34/264 objects degraded (12.879%), 12 pgs degraded (PG_DEGRADED) 2026-03-09T16:18:07.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:06 vm05.local ceph-mon[108543]: pgmap v53: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 34/264 objects degraded (12.879%) 2026-03-09T16:18:07.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:06 vm05.local ceph-mon[108543]: osdmap e61: 6 total, 6 up, 6 in 2026-03-09T16:18:08.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:08 vm05.local ceph-mon[108543]: pgmap v55: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 34/264 objects degraded (12.879%) 2026-03-09T16:18:08.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:08 vm03.local ceph-mon[133973]: pgmap v55: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 34/264 objects degraded (12.879%) 2026-03-09T16:18:10.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:10.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:18:10.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:10 vm03.local ceph-mon[133973]: pgmap v56: 65 pgs: 6 active+undersized, 4 active+undersized+degraded, 55 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 9/264 objects degraded (3.409%) 2026-03-09T16:18:10.777 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:10.777 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:18:10.777 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:10 vm05.local ceph-mon[108543]: pgmap v56: 65 pgs: 6 active+undersized, 4 active+undersized+degraded, 55 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 9/264 objects degraded (3.409%) 2026-03-09T16:18:11.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:11 vm03.local ceph-mon[133973]: Health check update: Degraded data redundancy: 9/264 objects degraded (3.409%), 4 pgs degraded (PG_DEGRADED) 2026-03-09T16:18:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:11 vm05.local ceph-mon[108543]: Health check update: Degraded data redundancy: 9/264 objects degraded (3.409%), 4 pgs degraded (PG_DEGRADED) 2026-03-09T16:18:12.777 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:12 vm05.local ceph-mon[108543]: pgmap v57: 65 pgs: 65 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:12.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:12 vm03.local ceph-mon[133973]: pgmap v57: 65 pgs: 65 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:13.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:13 vm03.local ceph-mon[133973]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 9/264 objects 
degraded (3.409%), 4 pgs degraded) 2026-03-09T16:18:14.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:13 vm05.local ceph-mon[108543]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 9/264 objects degraded (3.409%), 4 pgs degraded) 2026-03-09T16:18:14.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:14 vm03.local ceph-mon[133973]: pgmap v58: 65 pgs: 65 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:15.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:14 vm05.local ceph-mon[108543]: pgmap v58: 65 pgs: 65 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:16 vm05.local ceph-mon[108543]: pgmap v59: 65 pgs: 65 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:16.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:16 vm03.local ceph-mon[133973]: pgmap v59: 65 pgs: 65 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:17.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.560+0000 7f56a2bca640 1 -- 192.168.123.103:0/1630005971 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f569c104630 msgr2=0x7f569c073510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:17.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.560+0000 7f56a2bca640 1 --2- 192.168.123.103:0/1630005971 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f569c104630 0x7f569c073510 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f56840099e0 tx=0x7f568402f2f0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.560+0000 7f56a2bca640 1 -- 192.168.123.103:0/1630005971 shutdown_connections 2026-03-09T16:18:17.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.560+0000 7f56a2bca640 1 --2- 192.168.123.103:0/1630005971 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f569c073a50 0x7f569c073e90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.560+0000 7f56a2bca640 1 --2- 192.168.123.103:0/1630005971 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f569c104630 0x7f569c073510 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.560+0000 7f56a2bca640 1 -- 192.168.123.103:0/1630005971 >> 192.168.123.103:0/1630005971 conn(0x7f569c0fc480 msgr2=0x7f569c0fe8a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:17.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.561+0000 7f56a2bca640 1 -- 192.168.123.103:0/1630005971 shutdown_connections 2026-03-09T16:18:17.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.561+0000 7f56a2bca640 1 -- 192.168.123.103:0/1630005971 wait complete. 
2026-03-09T16:18:17.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.561+0000 7f56a2bca640 1 Processor -- start 2026-03-09T16:18:17.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.561+0000 7f56a2bca640 1 -- start start 2026-03-09T16:18:17.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.561+0000 7f56a2bca640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f569c073a50 0x7f569c19eeb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:17.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.561+0000 7f56a2bca640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f569c104630 0x7f569c19f3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:17.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.561+0000 7f56a2bca640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f569c19fa80 con 0x7f569c104630 2026-03-09T16:18:17.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.561+0000 7f56a2bca640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f569c1a37f0 con 0x7f569c073a50 2026-03-09T16:18:17.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.562+0000 7f56a093f640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f569c073a50 0x7f569c19eeb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:17.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.562+0000 7f5693fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f569c104630 0x7f569c19f3f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:17.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.562+0000 7f5693fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f569c104630 0x7f569c19f3f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56660/0 (socket says 192.168.123.103:56660) 2026-03-09T16:18:17.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.562+0000 7f5693fff640 1 -- 192.168.123.103:0/304250481 learned_addr learned my addr 192.168.123.103:0/304250481 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:18:17.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.562+0000 7f5693fff640 1 -- 192.168.123.103:0/304250481 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f569c073a50 msgr2=0x7f569c19eeb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:17.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.562+0000 7f5693fff640 1 --2- 192.168.123.103:0/304250481 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f569c073a50 0x7f569c19eeb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.562+0000 7f5693fff640 1 -- 192.168.123.103:0/304250481 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5684009660 con 0x7f569c104630 
2026-03-09T16:18:17.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.562+0000 7f5693fff640 1 --2- 192.168.123.103:0/304250481 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f569c104630 0x7f569c19f3f0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f568c00b790 tx=0x7f568c00bc60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:17.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.563+0000 7f5691ffb640 1 -- 192.168.123.103:0/304250481 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f568c004070 con 0x7f569c104630 2026-03-09T16:18:17.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.563+0000 7f5691ffb640 1 -- 192.168.123.103:0/304250481 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f568c0026e0 con 0x7f569c104630 2026-03-09T16:18:17.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.563+0000 7f5691ffb640 1 -- 192.168.123.103:0/304250481 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f568c00cb30 con 0x7f569c104630 2026-03-09T16:18:17.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.563+0000 7f56a2bca640 1 -- 192.168.123.103:0/304250481 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f569c1a3ad0 con 0x7f569c104630 2026-03-09T16:18:17.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.563+0000 7f56a2bca640 1 -- 192.168.123.103:0/304250481 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f569c1a4020 con 0x7f569c104630 2026-03-09T16:18:17.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.564+0000 7f56a2bca640 1 -- 192.168.123.103:0/304250481 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5664005350 con 0x7f569c104630 2026-03-09T16:18:17.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.565+0000 7f5691ffb640 1 -- 192.168.123.103:0/304250481 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f568c00cc90 con 0x7f569c104630 2026-03-09T16:18:17.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.565+0000 7f5691ffb640 1 --2- 192.168.123.103:0/304250481 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5678077a00 0x7f5678079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:17.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.566+0000 7f5691ffb640 1 -- 192.168.123.103:0/304250481 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f568c098d60 con 0x7f569c104630 2026-03-09T16:18:17.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.567+0000 7f56a093f640 1 --2- 192.168.123.103:0/304250481 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5678077a00 0x7f5678079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:17.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.567+0000 7f56a093f640 1 --2- 192.168.123.103:0/304250481 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5678077a00 
0x7f5678079ec0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f5684002410 tx=0x7f568403a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:17.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.569+0000 7f5691ffb640 1 -- 192.168.123.103:0/304250481 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f568c061380 con 0x7f569c104630 2026-03-09T16:18:17.684 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.683+0000 7f56a2bca640 1 -- 192.168.123.103:0/304250481 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f5664002bf0 con 0x7f5678077a00 2026-03-09T16:18:17.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.685+0000 7f5691ffb640 1 -- 192.168.123.103:0/304250481 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f5664002bf0 con 0x7f5678077a00 2026-03-09T16:18:17.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.688+0000 7f56a2bca640 1 -- 192.168.123.103:0/304250481 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5678077a00 msgr2=0x7f5678079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:17.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.688+0000 7f56a2bca640 1 --2- 192.168.123.103:0/304250481 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5678077a00 0x7f5678079ec0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f5684002410 tx=0x7f568403a040 comp rx=0 tx=0).stop 2026-03-09T16:18:17.690 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.689+0000 7f56a2bca640 1 -- 192.168.123.103:0/304250481 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f569c104630 msgr2=0x7f569c19f3f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:17.690 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.689+0000 7f56a2bca640 1 --2- 192.168.123.103:0/304250481 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f569c104630 0x7f569c19f3f0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f568c00b790 tx=0x7f568c00bc60 comp rx=0 tx=0).stop 2026-03-09T16:18:17.690 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.689+0000 7f56a2bca640 1 -- 192.168.123.103:0/304250481 shutdown_connections 2026-03-09T16:18:17.690 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.689+0000 7f56a2bca640 1 --2- 192.168.123.103:0/304250481 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5678077a00 0x7f5678079ec0 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.690 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.690+0000 7f56a2bca640 1 --2- 192.168.123.103:0/304250481 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f569c104630 0x7f569c19f3f0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.690+0000 7f56a2bca640 1 --2- 192.168.123.103:0/304250481 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f569c073a50 0x7f569c19eeb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.690+0000 7f56a2bca640 1 -- 192.168.123.103:0/304250481 >> 192.168.123.103:0/304250481 conn(0x7f569c0fc480 msgr2=0x7f569c0fd710 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:17.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.690+0000 7f56a2bca640 1 -- 192.168.123.103:0/304250481 shutdown_connections 2026-03-09T16:18:17.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.690+0000 7f56a2bca640 1 -- 192.168.123.103:0/304250481 wait complete. 2026-03-09T16:18:17.700 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:18:17.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.753+0000 7f0d40359640 1 -- 192.168.123.103:0/265917172 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0d38100770 msgr2=0x7f0d38100bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:17.758 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.753+0000 7f0d40359640 1 --2- 192.168.123.103:0/265917172 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0d38100770 0x7f0d38100bd0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f0d28009a00 tx=0x7f0d2802f280 comp rx=0 tx=0).stop 2026-03-09T16:18:17.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.753+0000 7f0d40359640 1 -- 192.168.123.103:0/265917172 shutdown_connections 2026-03-09T16:18:17.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.753+0000 7f0d40359640 1 --2- 192.168.123.103:0/265917172 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0d38100770 0x7f0d38100bd0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.753+0000 7f0d40359640 1 --2- 192.168.123.103:0/265917172 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d38106770 0x7f0d38106b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.753+0000 7f0d40359640 1 -- 192.168.123.103:0/265917172 >> 192.168.123.103:0/265917172 conn(0x7f0d380fc470 msgr2=0x7f0d380fe890 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.753+0000 7f0d40359640 1 -- 192.168.123.103:0/265917172 shutdown_connections 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.753+0000 7f0d40359640 1 -- 192.168.123.103:0/265917172 wait complete. 
2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.754+0000 7f0d40359640 1 Processor -- start 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.754+0000 7f0d40359640 1 -- start start 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.754+0000 7f0d40359640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0d38106770 0x7f0d3819f070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.754+0000 7f0d40359640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d3819f5b0 0x7f0d381a3960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.754+0000 7f0d40359640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0d3819fbb0 con 0x7f0d38106770 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.754+0000 7f0d40359640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0d3819fd20 con 0x7f0d3819f5b0 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.754+0000 7f0d3e0ce640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0d38106770 0x7f0d3819f070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.754+0000 7f0d3e0ce640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0d38106770 0x7f0d3819f070 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56678/0 (socket says 192.168.123.103:56678) 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.754+0000 7f0d3e0ce640 1 -- 192.168.123.103:0/4033360744 learned_addr learned my addr 192.168.123.103:0/4033360744 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.755+0000 7f0d3e0ce640 1 -- 192.168.123.103:0/4033360744 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d3819f5b0 msgr2=0x7f0d381a3960 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.755+0000 7f0d3e0ce640 1 --2- 192.168.123.103:0/4033360744 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d3819f5b0 0x7f0d381a3960 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.755+0000 7f0d3e0ce640 1 -- 192.168.123.103:0/4033360744 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0d28009660 con 0x7f0d38106770 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.755+0000 7f0d3e0ce640 1 --2- 192.168.123.103:0/4033360744 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0d38106770 0x7f0d3819f070 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f0d2c009a90 tx=0x7f0d2c009f60 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.755+0000 7f0d277fe640 1 -- 192.168.123.103:0/4033360744 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0d2c00cd00 con 0x7f0d38106770 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.755+0000 7f0d277fe640 1 -- 192.168.123.103:0/4033360744 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0d2c004590 con 0x7f0d38106770 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.756+0000 7f0d277fe640 1 -- 192.168.123.103:0/4033360744 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0d2c019670 con 0x7f0d38106770 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.756+0000 7f0d40359640 1 -- 192.168.123.103:0/4033360744 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0d381a3f60 con 0x7f0d38106770 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.756+0000 7f0d40359640 1 -- 192.168.123.103:0/4033360744 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0d381a4480 con 0x7f0d38106770 2026-03-09T16:18:17.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.757+0000 7f0d257fa640 1 -- 192.168.123.103:0/4033360744 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0d3810c370 con 0x7f0d38106770 2026-03-09T16:18:17.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.759+0000 7f0d277fe640 1 -- 192.168.123.103:0/4033360744 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f0d2c0026e0 con 0x7f0d38106770 2026-03-09T16:18:17.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.760+0000 7f0d277fe640 1 --2- 192.168.123.103:0/4033360744 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f0d0c0778c0 0x7f0d0c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:17.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.760+0000 7f0d277fe640 1 -- 192.168.123.103:0/4033360744 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f0d2c09a590 con 0x7f0d38106770 2026-03-09T16:18:17.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.762+0000 7f0d3d8cd640 1 --2- 192.168.123.103:0/4033360744 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f0d0c0778c0 0x7f0d0c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:17.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.762+0000 7f0d277fe640 1 -- 192.168.123.103:0/4033360744 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0d2c062b20 con 0x7f0d38106770 2026-03-09T16:18:17.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.762+0000 7f0d3d8cd640 1 --2- 192.168.123.103:0/4033360744 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f0d0c0778c0 0x7f0d0c079d80 secure :-1 s=READY 
pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f0d280040c0 tx=0x7f0d280023d0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:17.886 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.885+0000 7f0d257fa640 1 -- 192.168.123.103:0/4033360744 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f0d381a0380 con 0x7f0d0c0778c0 2026-03-09T16:18:17.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.890+0000 7f0d277fe640 1 -- 192.168.123.103:0/4033360744 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f0d381a0380 con 0x7f0d0c0778c0 2026-03-09T16:18:17.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.892+0000 7f0d257fa640 1 -- 192.168.123.103:0/4033360744 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f0d0c0778c0 msgr2=0x7f0d0c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:17.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.892+0000 7f0d257fa640 1 --2- 192.168.123.103:0/4033360744 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f0d0c0778c0 0x7f0d0c079d80 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f0d280040c0 tx=0x7f0d280023d0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.892+0000 7f0d257fa640 1 -- 192.168.123.103:0/4033360744 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0d38106770 msgr2=0x7f0d3819f070 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:17.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.892+0000 7f0d257fa640 1 --2- 192.168.123.103:0/4033360744 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0d38106770 0x7f0d3819f070 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f0d2c009a90 tx=0x7f0d2c009f60 comp rx=0 tx=0).stop 2026-03-09T16:18:17.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.892+0000 7f0d257fa640 1 -- 192.168.123.103:0/4033360744 shutdown_connections 2026-03-09T16:18:17.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.892+0000 7f0d257fa640 1 --2- 192.168.123.103:0/4033360744 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f0d0c0778c0 0x7f0d0c079d80 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.892+0000 7f0d257fa640 1 --2- 192.168.123.103:0/4033360744 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d3819f5b0 0x7f0d381a3960 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.892+0000 7f0d257fa640 1 --2- 192.168.123.103:0/4033360744 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0d38106770 0x7f0d3819f070 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.892+0000 7f0d257fa640 1 -- 192.168.123.103:0/4033360744 >> 192.168.123.103:0/4033360744 conn(0x7f0d380fc470 msgr2=0x7f0d380fc850 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:17.893 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.893+0000 7f0d257fa640 1 -- 192.168.123.103:0/4033360744 shutdown_connections 2026-03-09T16:18:17.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.893+0000 7f0d257fa640 1 -- 192.168.123.103:0/4033360744 wait complete. 2026-03-09T16:18:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.956+0000 7fbbb83c9640 1 -- 192.168.123.103:0/1920902126 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbb01029d0 msgr2=0x7fbbb0102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.956+0000 7fbbb83c9640 1 --2- 192.168.123.103:0/1920902126 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbb01029d0 0x7fbbb0102e30 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fbba00099b0 tx=0x7fbba002f220 comp rx=0 tx=0).stop 2026-03-09T16:18:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.956+0000 7fbbb83c9640 1 -- 192.168.123.103:0/1920902126 shutdown_connections 2026-03-09T16:18:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.956+0000 7fbbb83c9640 1 --2- 192.168.123.103:0/1920902126 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbb01029d0 0x7fbbb0102e30 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.956+0000 7fbbb83c9640 1 --2- 192.168.123.103:0/1920902126 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbb01089d0 0x7fbbb0108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.956+0000 7fbbb83c9640 1 -- 192.168.123.103:0/1920902126 >> 192.168.123.103:0/1920902126 conn(0x7fbbb00fe710 msgr2=0x7fbbb0100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.957+0000 7fbbb83c9640 1 -- 192.168.123.103:0/1920902126 shutdown_connections 2026-03-09T16:18:17.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.957+0000 7fbbb83c9640 1 -- 192.168.123.103:0/1920902126 wait complete. 
2026-03-09T16:18:17.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.957+0000 7fbbb83c9640 1 Processor -- start 2026-03-09T16:18:17.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.957+0000 7fbbb83c9640 1 -- start start 2026-03-09T16:18:17.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.957+0000 7fbbb83c9640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbb01029d0 0x7fbbb01a0640 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:17.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.957+0000 7fbbb83c9640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbb01089d0 0x7fbbb01a0b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:17.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.958+0000 7fbbb83c9640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbbb01a1180 con 0x7fbbb01029d0 2026-03-09T16:18:17.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.958+0000 7fbbb83c9640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbbb019a750 con 0x7fbbb01089d0 2026-03-09T16:18:17.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.958+0000 7fbbb593d640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbb01089d0 0x7fbbb01a0b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:17.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.958+0000 7fbbb593d640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbb01089d0 0x7fbbb01a0b80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:58498/0 (socket says 192.168.123.103:58498) 2026-03-09T16:18:17.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.958+0000 7fbbb593d640 1 -- 192.168.123.103:0/569729767 learned_addr learned my addr 192.168.123.103:0/569729767 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:18:17.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.958+0000 7fbbb613e640 1 --2- 192.168.123.103:0/569729767 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbb01029d0 0x7fbbb01a0640 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:17.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.958+0000 7fbbb593d640 1 -- 192.168.123.103:0/569729767 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbb01029d0 msgr2=0x7fbbb01a0640 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:17.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.958+0000 7fbbb593d640 1 --2- 192.168.123.103:0/569729767 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbb01029d0 0x7fbbb01a0640 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:17.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.958+0000 7fbbb593d640 1 -- 192.168.123.103:0/569729767 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbba0009660 con 
0x7fbbb01089d0 2026-03-09T16:18:17.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.958+0000 7fbbb613e640 1 --2- 192.168.123.103:0/569729767 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbb01029d0 0x7fbbb01a0640 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T16:18:17.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.959+0000 7fbbb593d640 1 --2- 192.168.123.103:0/569729767 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbb01089d0 0x7fbbb01a0b80 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7fbba0002410 tx=0x7fbba0004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:17.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.959+0000 7fbba77fe640 1 -- 192.168.123.103:0/569729767 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbba003d070 con 0x7fbbb01089d0 2026-03-09T16:18:17.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.959+0000 7fbbb83c9640 1 -- 192.168.123.103:0/569729767 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbbb019a9d0 con 0x7fbbb01089d0 2026-03-09T16:18:17.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.959+0000 7fbbb83c9640 1 -- 192.168.123.103:0/569729767 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbbb019aec0 con 0x7fbbb01089d0 2026-03-09T16:18:17.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.960+0000 7fbba77fe640 1 -- 192.168.123.103:0/569729767 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbba00043b0 con 0x7fbbb01089d0 2026-03-09T16:18:17.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.960+0000 7fbba77fe640 1 -- 192.168.123.103:0/569729767 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbba0041740 con 0x7fbbb01089d0 2026-03-09T16:18:17.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.961+0000 7fbbb83c9640 1 -- 192.168.123.103:0/569729767 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbb78005350 con 0x7fbbb01089d0 2026-03-09T16:18:17.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.961+0000 7fbba77fe640 1 -- 192.168.123.103:0/569729767 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fbba0038730 con 0x7fbbb01089d0 2026-03-09T16:18:17.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.962+0000 7fbba77fe640 1 --2- 192.168.123.103:0/569729767 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbb8c0777f0 0x7fbb8c079cb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:17.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.962+0000 7fbbb613e640 1 --2- 192.168.123.103:0/569729767 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbb8c0777f0 0x7fbb8c079cb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:17.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.962+0000 7fbba77fe640 1 -- 192.168.123.103:0/569729767 <== mon.1 
v2:192.168.123.105:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fbba00be7e0 con 0x7fbbb01089d0 2026-03-09T16:18:17.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.962+0000 7fbbb613e640 1 --2- 192.168.123.103:0/569729767 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbb8c0777f0 0x7fbb8c079cb0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fbb98005f10 tx=0x7fbb98005ea0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:17.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:17.965+0000 7fbba77fe640 1 -- 192.168.123.103:0/569729767 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbba00866d0 con 0x7fbbb01089d0 2026-03-09T16:18:18.082 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.080+0000 7fbbb83c9640 1 -- 192.168.123.103:0/569729767 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fbb78002bf0 con 0x7fbb8c0777f0 2026-03-09T16:18:18.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.086+0000 7fbba77fe640 1 -- 192.168.123.103:0/569729767 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fbb78002bf0 con 0x7fbb8c0777f0 2026-03-09T16:18:18.093 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (2m) 17s ago 8m 25.3M - 0.25.0 c8568f914cd2 61c29cd7a09d 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (8m) 17s ago 8m 9743k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (7m) 57s ago 7m 9861k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (60s) 17s ago 8m 7830k - 19.2.3-678-ge911bdeb 654f31e6858e 03c86bd1bf32 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (58s) 57s ago 7m 7830k - 19.2.3-678-ge911bdeb 654f31e6858e 192f6dbc3145 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (2m) 17s ago 7m 83.9M - 10.4.0 c8b91775d855 6f4f55eef4bb 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (6m) 17s ago 6m 19.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8e7e3eb06891 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (6m) 17s ago 6m 190M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 f23b1415c23e 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (6m) 57s ago 6m 16.7M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 fbf69f4859f1 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (6m) 57s ago 6m 18.2M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 e7155e6e0a47 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:8443,9283,8765 running (3m) 17s ago 8m 604M - 19.2.3-678-ge911bdeb 654f31e6858e f10e9f43c355 2026-03-09T16:18:18.094 
INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (3m) 57s ago 7m 494M - 19.2.3-678-ge911bdeb 654f31e6858e 5276dc4902e9 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (92s) 17s ago 8m 62.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f90a2e8dc751 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (75s) 57s ago 7m 46.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b6d6af84a66d 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (3m) 17s ago 8m 9517k - 1.7.0 72c9c2088986 73da4350a8ed 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (2m) 57s ago 7m 9479k - 1.7.0 72c9c2088986 0be807a191b0 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (51s) 17s ago 7m 119M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fba6e40f54d4 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (44s) 17s ago 7m 99.9M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9e86c92fc9cd 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (20s) 17s ago 6m 13.1M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 2e666ccd4bf7 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (6m) 57s ago 6m 455M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d95aab347c9f 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (6m) 57s ago 6m 380M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 5076005b452d 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (6m) 57s ago 6m 331M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 56fb3849b087 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (2m) 17s ago 7m 45.1M - 2.51.0 1d3b7f56885b ce88dd379864 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.089+0000 7fbbb83c9640 1 -- 192.168.123.103:0/569729767 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbb8c0777f0 msgr2=0x7fbb8c079cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.089+0000 7fbbb83c9640 1 --2- 192.168.123.103:0/569729767 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbb8c0777f0 0x7fbb8c079cb0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fbb98005f10 tx=0x7fbb98005ea0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.089+0000 7fbbb83c9640 1 -- 192.168.123.103:0/569729767 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbb01089d0 msgr2=0x7fbbb01a0b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.089+0000 7fbbb83c9640 1 --2- 192.168.123.103:0/569729767 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbb01089d0 0x7fbbb01a0b80 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7fbba0002410 tx=0x7fbba0004290 comp rx=0 tx=0).stop 2026-03-09T16:18:18.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.089+0000 7fbbb83c9640 1 -- 192.168.123.103:0/569729767 shutdown_connections 2026-03-09T16:18:18.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.089+0000 7fbbb83c9640 1 --2- 192.168.123.103:0/569729767 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbb8c0777f0 0x7fbb8c079cb0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.090+0000 7fbbb83c9640 1 --2- 192.168.123.103:0/569729767 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbb01089d0 0x7fbbb01a0b80 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.090+0000 7fbbb83c9640 1 --2- 192.168.123.103:0/569729767 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbb01029d0 0x7fbbb01a0640 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.090+0000 7fbbb83c9640 1 -- 192.168.123.103:0/569729767 >> 192.168.123.103:0/569729767 conn(0x7fbbb00fe710 msgr2=0x7fbbb010c9e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:18.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.090+0000 7fbbb83c9640 1 -- 192.168.123.103:0/569729767 shutdown_connections 2026-03-09T16:18:18.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.090+0000 7fbbb83c9640 1 -- 192.168.123.103:0/569729767 wait complete. 2026-03-09T16:18:18.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.160+0000 7f924c669640 1 -- 192.168.123.103:0/372590046 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9244102a00 msgr2=0x7f9244102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.160+0000 7f924c669640 1 --2- 192.168.123.103:0/372590046 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9244102a00 0x7f9244102e60 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f922c009a00 tx=0x7f922c02f280 comp rx=0 tx=0).stop 2026-03-09T16:18:18.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.160+0000 7f924c669640 1 -- 192.168.123.103:0/372590046 shutdown_connections 2026-03-09T16:18:18.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.160+0000 7f924c669640 1 --2- 192.168.123.103:0/372590046 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9244102a00 0x7f9244102e60 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.160+0000 7f924c669640 1 --2- 192.168.123.103:0/372590046 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9244108a00 0x7f9244108de0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.160+0000 7f924c669640 1 -- 192.168.123.103:0/372590046 >> 192.168.123.103:0/372590046 conn(0x7f92440fe700 msgr2=0x7f9244100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:18.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.161+0000 7f924c669640 1 -- 192.168.123.103:0/372590046 shutdown_connections 2026-03-09T16:18:18.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.161+0000 7f924c669640 1 -- 192.168.123.103:0/372590046 wait complete. 
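The "ceph orch ps" table above catches the cluster mid-upgrade: the mgr, mon, and crash daemons already report 19.2.3 (squid), while the mds daemons and osd.3-5 still report 18.2.7 (reef). Below is a minimal sketch, not part of the original run, of how such a tally could be made programmatically; it assumes the orchestrator's --format json output and a reachable cluster, and the helper name is illustrative only.

import json
import subprocess
from collections import Counter

def daemon_versions():
    # Hypothetical helper, not part of teuthology: "ceph orch ps --format json"
    # lists one entry per managed daemon; each entry carries daemon_type and
    # the version that daemon currently reports.
    out = subprocess.run(
        ["ceph", "orch", "ps", "--format", "json"],
        check=True, capture_output=True, text=True,
    ).stdout
    return Counter(
        (d.get("daemon_type", "?"), d.get("version") or "unknown")
        for d in json.loads(out)
    )

if __name__ == "__main__":
    for (dtype, version), count in sorted(daemon_versions().items()):
        print(f"{dtype:15} {version:25} x{count}")

During a staggered upgrade such a tally would show two version strings per daemon class until the class has been fully converted, matching the mixed 18.2.7/19.2.3 rows in the table above.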
2026-03-09T16:18:18.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.161+0000 7f924c669640 1 Processor -- start 2026-03-09T16:18:18.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.161+0000 7f924c669640 1 -- start start 2026-03-09T16:18:18.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.162+0000 7f924c669640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9244102a00 0x7f92441a06b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:18.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.162+0000 7f924c669640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9244108a00 0x7f92441a0bf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:18.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.162+0000 7f924c669640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f924419a7a0 con 0x7f9244108a00 2026-03-09T16:18:18.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.162+0000 7f924c669640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f924419a910 con 0x7f9244102a00 2026-03-09T16:18:18.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.162+0000 7f924a3de640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9244102a00 0x7f92441a06b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:18.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.162+0000 7f924a3de640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9244102a00 0x7f92441a06b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:58526/0 (socket says 192.168.123.103:58526) 2026-03-09T16:18:18.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.162+0000 7f924a3de640 1 -- 192.168.123.103:0/474293927 learned_addr learned my addr 192.168.123.103:0/474293927 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:18:18.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.162+0000 7f924a3de640 1 -- 192.168.123.103:0/474293927 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9244108a00 msgr2=0x7f92441a0bf0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.162+0000 7f9249bdd640 1 --2- 192.168.123.103:0/474293927 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9244108a00 0x7f92441a0bf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:18.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.162+0000 7f924a3de640 1 --2- 192.168.123.103:0/474293927 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9244108a00 0x7f92441a0bf0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.163+0000 7f924a3de640 1 -- 192.168.123.103:0/474293927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f922c009660 con 
0x7f9244102a00 2026-03-09T16:18:18.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.163+0000 7f9249bdd640 1 --2- 192.168.123.103:0/474293927 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9244108a00 0x7f92441a0bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:18:18.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.163+0000 7f924a3de640 1 --2- 192.168.123.103:0/474293927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9244102a00 0x7f92441a06b0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f923400e9b0 tx=0x7f923400ee80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:18.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.172+0000 7f923b7fe640 1 -- 192.168.123.103:0/474293927 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f923400cd90 con 0x7f9244102a00 2026-03-09T16:18:18.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.172+0000 7f924c669640 1 -- 192.168.123.103:0/474293927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f924419abf0 con 0x7f9244102a00 2026-03-09T16:18:18.174 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.173+0000 7f924c669640 1 -- 192.168.123.103:0/474293927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f924419b140 con 0x7f9244102a00 2026-03-09T16:18:18.174 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.173+0000 7f923b7fe640 1 -- 192.168.123.103:0/474293927 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9234004590 con 0x7f9244102a00 2026-03-09T16:18:18.174 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.173+0000 7f923b7fe640 1 -- 192.168.123.103:0/474293927 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9234010640 con 0x7f9244102a00 2026-03-09T16:18:18.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.174+0000 7f924c669640 1 -- 192.168.123.103:0/474293927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f920c005350 con 0x7f9244102a00 2026-03-09T16:18:18.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.175+0000 7f923b7fe640 1 -- 192.168.123.103:0/474293927 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f92340040d0 con 0x7f9244102a00 2026-03-09T16:18:18.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.175+0000 7f923b7fe640 1 --2- 192.168.123.103:0/474293927 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f921c077a00 0x7f921c079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:18.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.175+0000 7f923b7fe640 1 -- 192.168.123.103:0/474293927 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f9234014070 con 0x7f9244102a00 2026-03-09T16:18:18.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.176+0000 7f9249bdd640 1 --2- 192.168.123.103:0/474293927 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f921c077a00 0x7f921c079ec0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:18.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.177+0000 7f9249bdd640 1 --2- 192.168.123.103:0/474293927 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f921c077a00 0x7f921c079ec0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f924419bc10 tx=0x7f922c0023d0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:18.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.181+0000 7f923b7fe640 1 -- 192.168.123.103:0/474293927 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9234062ac0 con 0x7f9244102a00 2026-03-09T16:18:18.334 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.333+0000 7f924c669640 1 -- 192.168.123.103:0/474293927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f920c0058d0 con 0x7f9244102a00 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.334+0000 7f923b7fe640 1 -- 192.168.123.103:0/474293927 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+844 (secure 0 0 0) 0x7f9234062210 con 0x7f9244102a00 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 3, 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 3 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 7, 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 7 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T16:18:18.335 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:18:18.338 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.337+0000 7f924c669640 1 -- 192.168.123.103:0/474293927 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f921c077a00 msgr2=0x7f921c079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.337+0000 7f924c669640 1 --2- 192.168.123.103:0/474293927 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f921c077a00 0x7f921c079ec0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f924419bc10 tx=0x7f922c0023d0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.337+0000 7f924c669640 1 -- 192.168.123.103:0/474293927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9244102a00 msgr2=0x7f92441a06b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.337+0000 7f924c669640 1 --2- 192.168.123.103:0/474293927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9244102a00 0x7f92441a06b0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f923400e9b0 tx=0x7f923400ee80 comp rx=0 tx=0).stop 2026-03-09T16:18:18.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.337+0000 7f924c669640 1 -- 192.168.123.103:0/474293927 shutdown_connections 2026-03-09T16:18:18.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.337+0000 7f924c669640 1 --2- 192.168.123.103:0/474293927 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f921c077a00 0x7f921c079ec0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.337+0000 7f924c669640 1 --2- 192.168.123.103:0/474293927 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9244108a00 0x7f92441a0bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.337+0000 7f924c669640 1 --2- 192.168.123.103:0/474293927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9244102a00 0x7f92441a06b0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.337+0000 7f924c669640 1 -- 192.168.123.103:0/474293927 >> 192.168.123.103:0/474293927 conn(0x7f92440fe700 msgr2=0x7f924410c950 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:18.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.338+0000 7f924c669640 1 -- 192.168.123.103:0/474293927 shutdown_connections 2026-03-09T16:18:18.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.338+0000 7f924c669640 1 -- 192.168.123.103:0/474293927 wait complete. 
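The "ceph versions" JSON above is the same snapshot from the mon's point of view: mon and mgr fully on 19.2.3, the osds split 3/3 between reef and squid, and all four mds still on reef. A minimal sketch of deciding convergence from that output follows; it is illustrative rather than the test's actual check, and assumes the field layout shown in the log.

import json
import subprocess

def all_on_target(target: str) -> bool:
    # Hypothetical check, not part of the original test: "ceph versions"
    # already prints JSON (as in the log above); its "overall" section
    # aggregates every daemon class in the cluster.
    out = subprocess.run(
        ["ceph", "versions"],
        check=True, capture_output=True, text=True,
    ).stdout
    overall = json.loads(out).get("overall", {})
    # Converged when exactly one version string remains and it contains
    # the target build (e.g. "19.2.3-678-ge911bdeb").
    return len(overall) == 1 and all(target in k for k in overall)

if __name__ == "__main__":
    print(all_on_target("19.2.3-678-ge911bdeb"))

At the point captured above this returns False, since "overall" still lists both the reef and the squid build with 7 daemons each.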
2026-03-09T16:18:18.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.413+0000 7fae25119640 1 -- 192.168.123.103:0/4149982859 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae200ffe80 msgr2=0x7fae2010cd50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.413+0000 7fae25119640 1 --2- 192.168.123.103:0/4149982859 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae200ffe80 0x7fae2010cd50 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fae14009a00 tx=0x7fae1402f290 comp rx=0 tx=0).stop 2026-03-09T16:18:18.414 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:18 vm03.local ceph-mon[133973]: from='client.34188 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:18.414 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:18 vm03.local ceph-mon[133973]: from='client.34192 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:18.414 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:18 vm03.local ceph-mon[133973]: pgmap v60: 65 pgs: 65 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:18.414 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:18 vm03.local ceph-mon[133973]: from='client.44151 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:18.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.416+0000 7fae25119640 1 -- 192.168.123.103:0/4149982859 shutdown_connections 2026-03-09T16:18:18.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.416+0000 7fae25119640 1 --2- 192.168.123.103:0/4149982859 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae200ffe80 0x7fae2010cd50 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.416+0000 7fae25119640 1 --2- 192.168.123.103:0/4149982859 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fae200ff560 0x7fae200ff940 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.416+0000 7fae25119640 1 -- 192.168.123.103:0/4149982859 >> 192.168.123.103:0/4149982859 conn(0x7fae200fb3d0 msgr2=0x7fae200fd7f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:18.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.416+0000 7fae25119640 1 -- 192.168.123.103:0/4149982859 shutdown_connections 2026-03-09T16:18:18.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.416+0000 7fae25119640 1 -- 192.168.123.103:0/4149982859 wait complete. 
2026-03-09T16:18:18.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.417+0000 7fae25119640 1 Processor -- start 2026-03-09T16:18:18.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.417+0000 7fae25119640 1 -- start start 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.417+0000 7fae25119640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae200ff560 0x7fae2010c520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.417+0000 7fae25119640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fae200ffe80 0x7fae2010ca80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.417+0000 7fae25119640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fae20105600 con 0x7fae200ff560 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.417+0000 7fae25119640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fae20105770 con 0x7fae200ffe80 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.418+0000 7fae1ed76640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae200ff560 0x7fae2010c520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.418+0000 7fae1e575640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fae200ffe80 0x7fae2010ca80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.418+0000 7fae1e575640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fae200ffe80 0x7fae2010ca80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:58532/0 (socket says 192.168.123.103:58532) 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.418+0000 7fae1e575640 1 -- 192.168.123.103:0/453755319 learned_addr learned my addr 192.168.123.103:0/453755319 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.419+0000 7fae1e575640 1 -- 192.168.123.103:0/453755319 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae200ff560 msgr2=0x7fae2010c520 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.419+0000 7fae1e575640 1 --2- 192.168.123.103:0/453755319 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae200ff560 0x7fae2010c520 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.419+0000 7fae1e575640 1 -- 192.168.123.103:0/453755319 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fae14009660 con 0x7fae200ffe80 
2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.419+0000 7fae1ed76640 1 --2- 192.168.123.103:0/453755319 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae200ff560 0x7fae2010c520 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.420+0000 7fae1e575640 1 --2- 192.168.123.103:0/453755319 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fae200ffe80 0x7fae2010ca80 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fae1402f7a0 tx=0x7fae14031d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.420+0000 7fadfffff640 1 -- 192.168.123.103:0/453755319 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fae14002a50 con 0x7fae200ffe80 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.420+0000 7fae25119640 1 -- 192.168.123.103:0/453755319 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fae20105910 con 0x7fae200ffe80 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.420+0000 7fae25119640 1 -- 192.168.123.103:0/453755319 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fae20105db0 con 0x7fae200ffe80 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.421+0000 7fadfffff640 1 -- 192.168.123.103:0/453755319 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fae14031eb0 con 0x7fae200ffe80 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.421+0000 7fadfffff640 1 -- 192.168.123.103:0/453755319 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fae14038470 con 0x7fae200ffe80 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.422+0000 7fadfffff640 1 -- 192.168.123.103:0/453755319 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fae1403f070 con 0x7fae200ffe80 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.422+0000 7fae25119640 1 -- 192.168.123.103:0/453755319 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fae200696a0 con 0x7fae200ffe80 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.422+0000 7fadfffff640 1 --2- 192.168.123.103:0/453755319 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fadf4077a00 0x7fadf4079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.422+0000 7fadfffff640 1 -- 192.168.123.103:0/453755319 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fae140be670 con 0x7fae200ffe80 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.423+0000 7fae1ed76640 1 --2- 192.168.123.103:0/453755319 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fadf4077a00 0x7fadf4079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 
cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.423+0000 7fae1ed76640 1 --2- 192.168.123.103:0/453755319 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fadf4077a00 0x7fadf4079ec0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fae08004750 tx=0x7fae080091c0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:18.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.426+0000 7fadfffff640 1 -- 192.168.123.103:0/453755319 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fae14086cb0 con 0x7fae200ffe80 2026-03-09T16:18:18.497 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:18 vm05.local ceph-mon[108543]: from='client.34188 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:18.497 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:18 vm05.local ceph-mon[108543]: from='client.34192 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:18.497 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:18 vm05.local ceph-mon[108543]: pgmap v60: 65 pgs: 65 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:18.497 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:18 vm05.local ceph-mon[108543]: from='client.44151 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:18.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.548+0000 7fae25119640 1 -- 192.168.123.103:0/453755319 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fae20102c30 con 0x7fae200ffe80 2026-03-09T16:18:18.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.550+0000 7fadfffff640 1 -- 192.168.123.103:0/453755319 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1919 (secure 0 0 0) 0x7fae14046790 con 0x7fae200ffe80 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:e12 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:epoch 12 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:created 
2026-03-09T16:12:12.560035+0000 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T16:12:21.661284+0000 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 41 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:in 0 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14476} 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members: 2026-03-09T16:18:18.553 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kygyjl{0:14476} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:18:18.554 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:18:18.554 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:18:18.554 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons: 2026-03-09T16:18:18.554 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:18:18.554 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kntrco{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:18:18.554 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.sqhria{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:18:18.554 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.jgzfvu{-1:24291} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6824/1621230713,v1:192.168.123.105:6825/1621230713] compat {c=[1],r=[1],i=[7ff]}] 
2026-03-09T16:18:18.554 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 12 2026-03-09T16:18:18.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.560+0000 7fae25119640 1 -- 192.168.123.103:0/453755319 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fadf4077a00 msgr2=0x7fadf4079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.560+0000 7fae25119640 1 --2- 192.168.123.103:0/453755319 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fadf4077a00 0x7fadf4079ec0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fae08004750 tx=0x7fae080091c0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.560+0000 7fae25119640 1 -- 192.168.123.103:0/453755319 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fae200ffe80 msgr2=0x7fae2010ca80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.561+0000 7fae25119640 1 --2- 192.168.123.103:0/453755319 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fae200ffe80 0x7fae2010ca80 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fae1402f7a0 tx=0x7fae14031d40 comp rx=0 tx=0).stop 2026-03-09T16:18:18.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.561+0000 7fae25119640 1 -- 192.168.123.103:0/453755319 shutdown_connections 2026-03-09T16:18:18.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.561+0000 7fae25119640 1 --2- 192.168.123.103:0/453755319 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fadf4077a00 0x7fadf4079ec0 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.561+0000 7fae25119640 1 --2- 192.168.123.103:0/453755319 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fae200ffe80 0x7fae2010ca80 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.561+0000 7fae25119640 1 --2- 192.168.123.103:0/453755319 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae200ff560 0x7fae2010c520 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.561+0000 7fae25119640 1 -- 192.168.123.103:0/453755319 >> 192.168.123.103:0/453755319 conn(0x7fae200fb3d0 msgr2=0x7fae200fbc40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:18.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.561+0000 7fae25119640 1 -- 192.168.123.103:0/453755319 shutdown_connections 2026-03-09T16:18:18.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.562+0000 7fae25119640 1 -- 192.168.123.103:0/453755319 wait complete. 
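The "fs dump" above shows fsmap epoch 12: max_mds 1, a single active rank held by mds.cephfs.vm03.kygyjl, three standby daemons, and inline_data still enabled (this run uses the inline/yes facet). A minimal sketch of summarizing MDS state from the JSON form of the same dump follows; the JSON layout is assumed from the plain dump above and the helper is illustrative, not part of the test.

import json
import subprocess

def mds_summary():
    # Hypothetical helper, not part of the original test. Assumed layout of
    # "ceph fs dump --format json": a "filesystems" list whose entries hold
    # an "mdsmap" with per-rank "info" records, plus a top-level "standbys"
    # list (mirroring the plain-text dump above).
    out = subprocess.run(
        ["ceph", "fs", "dump", "--format", "json"],
        check=True, capture_output=True, text=True,
    ).stdout
    fsmap = json.loads(out)
    active = sum(
        1
        for fs in fsmap.get("filesystems", [])
        for info in fs.get("mdsmap", {}).get("info", {}).values()
        if info.get("state") == "up:active"
    )
    return active, len(fsmap.get("standbys", []))

if __name__ == "__main__":
    active, standby = mds_summary()
    print(f"active ranks: {active}, standby daemons: {standby}")

For the epoch shown above this would report 1 active rank and 3 standbys, which is the expected shape for a 1-rank filesystem before the mds daemons are redeployed on the new release.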
2026-03-09T16:18:18.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.627+0000 7ff706f3f640 1 -- 192.168.123.103:0/2727795068 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff7000fe630 msgr2=0x7ff7000fea70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.627+0000 7ff706f3f640 1 --2- 192.168.123.103:0/2727795068 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff7000fe630 0x7ff7000fea70 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7ff6f00098e0 tx=0x7ff6f002f1e0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.627+0000 7ff706f3f640 1 -- 192.168.123.103:0/2727795068 shutdown_connections 2026-03-09T16:18:18.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.627+0000 7ff706f3f640 1 --2- 192.168.123.103:0/2727795068 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff7000fe630 0x7ff7000fea70 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.627+0000 7ff706f3f640 1 --2- 192.168.123.103:0/2727795068 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7001056e0 0x7ff700105ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.627+0000 7ff706f3f640 1 -- 192.168.123.103:0/2727795068 >> 192.168.123.103:0/2727795068 conn(0x7ff7000fa4a0 msgr2=0x7ff7000fc8c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:18.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.627+0000 7ff706f3f640 1 -- 192.168.123.103:0/2727795068 shutdown_connections 2026-03-09T16:18:18.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.627+0000 7ff706f3f640 1 -- 192.168.123.103:0/2727795068 wait complete. 
2026-03-09T16:18:18.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.628+0000 7ff706f3f640 1 Processor -- start 2026-03-09T16:18:18.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.628+0000 7ff706f3f640 1 -- start start 2026-03-09T16:18:18.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.628+0000 7ff706f3f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7000fe630 0x7ff70019a690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:18.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.628+0000 7ff706f3f640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff7001056e0 0x7ff70019abd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:18.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.628+0000 7ff705f3d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7000fe630 0x7ff70019a690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:18.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.628+0000 7ff705f3d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7000fe630 0x7ff70019a690 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56758/0 (socket says 192.168.123.103:56758) 2026-03-09T16:18:18.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.629+0000 7ff705f3d640 1 -- 192.168.123.103:0/2085950964 learned_addr learned my addr 192.168.123.103:0/2085950964 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:18:18.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.629+0000 7ff706f3f640 1 -- 192.168.123.103:0/2085950964 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff70019b260 con 0x7ff7000fe630 2026-03-09T16:18:18.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.629+0000 7ff706f3f640 1 -- 192.168.123.103:0/2085950964 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff70019efd0 con 0x7ff7001056e0 2026-03-09T16:18:18.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.629+0000 7ff70573c640 1 --2- 192.168.123.103:0/2085950964 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff7001056e0 0x7ff70019abd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:18.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.629+0000 7ff70573c640 1 -- 192.168.123.103:0/2085950964 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7000fe630 msgr2=0x7ff70019a690 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.629+0000 7ff70573c640 1 --2- 192.168.123.103:0/2085950964 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7000fe630 0x7ff70019a690 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.629+0000 7ff70573c640 1 -- 192.168.123.103:0/2085950964 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
-- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6f4009660 con 0x7ff7001056e0 2026-03-09T16:18:18.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.629+0000 7ff705f3d640 1 --2- 192.168.123.103:0/2085950964 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7000fe630 0x7ff70019a690 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T16:18:18.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.629+0000 7ff70573c640 1 --2- 192.168.123.103:0/2085950964 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff7001056e0 0x7ff70019abd0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7ff6f00098b0 tx=0x7ff6f00043d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:18.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.630+0000 7ff6eeffd640 1 -- 192.168.123.103:0/2085950964 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff6f003d070 con 0x7ff7001056e0 2026-03-09T16:18:18.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.630+0000 7ff706f3f640 1 -- 192.168.123.103:0/2085950964 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff6f0009590 con 0x7ff7001056e0 2026-03-09T16:18:18.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.630+0000 7ff706f3f640 1 -- 192.168.123.103:0/2085950964 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff70019f5b0 con 0x7ff7001056e0 2026-03-09T16:18:18.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.630+0000 7ff6eeffd640 1 -- 192.168.123.103:0/2085950964 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff6f002fd10 con 0x7ff7001056e0 2026-03-09T16:18:18.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.630+0000 7ff6eeffd640 1 -- 192.168.123.103:0/2085950964 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff6f0041600 con 0x7ff7001056e0 2026-03-09T16:18:18.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.631+0000 7ff706f3f640 1 -- 192.168.123.103:0/2085950964 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff7000ffa60 con 0x7ff7001056e0 2026-03-09T16:18:18.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.632+0000 7ff6eeffd640 1 -- 192.168.123.103:0/2085950964 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7ff6f0038a30 con 0x7ff7001056e0 2026-03-09T16:18:18.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.632+0000 7ff6eeffd640 1 --2- 192.168.123.103:0/2085950964 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff6d80779b0 0x7ff6d8079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:18.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.633+0000 7ff705f3d640 1 --2- 192.168.123.103:0/2085950964 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff6d80779b0 0x7ff6d8079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:18.634 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.633+0000 7ff6eeffd640 1 -- 192.168.123.103:0/2085950964 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6394+0+0 (secure 0 0 0) 0x7ff6f00be540 con 0x7ff7001056e0 2026-03-09T16:18:18.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.633+0000 7ff705f3d640 1 --2- 192.168.123.103:0/2085950964 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff6d80779b0 0x7ff6d8079e70 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7ff6f4009d50 tx=0x7ff6f4009340 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:18.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.634+0000 7ff6eeffd640 1 -- 192.168.123.103:0/2085950964 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff6f0086b80 con 0x7ff7001056e0 2026-03-09T16:18:18.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.746+0000 7ff706f3f640 1 -- 192.168.123.103:0/2085950964 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff70010d860 con 0x7ff6d80779b0 2026-03-09T16:18:18.749 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.748+0000 7ff6eeffd640 1 -- 192.168.123.103:0/2085950964 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7ff70010d860 con 0x7ff6d80779b0 2026-03-09T16:18:18.751 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:18:18.751 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T16:18:18.751 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T16:18:18.751 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T16:18:18.751 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-09T16:18:18.751 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-09T16:18:18.751 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-09T16:18:18.751 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-09T16:18:18.751 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-09T16:18:18.751 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "9/23 daemons upgraded", 2026-03-09T16:18:18.751 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T16:18:18.751 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T16:18:18.751 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:18:18.753 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.752+0000 7ff706f3f640 1 -- 192.168.123.103:0/2085950964 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff6d80779b0 msgr2=0x7ff6d8079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.753 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.752+0000 7ff706f3f640 1 --2- 192.168.123.103:0/2085950964 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff6d80779b0 0x7ff6d8079e70 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7ff6f4009d50 tx=0x7ff6f4009340 comp rx=0 tx=0).stop 2026-03-09T16:18:18.753 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.753+0000 7ff706f3f640 1 -- 192.168.123.103:0/2085950964 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff7001056e0 msgr2=0x7ff70019abd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.753 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.753+0000 7ff706f3f640 1 --2- 192.168.123.103:0/2085950964 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff7001056e0 0x7ff70019abd0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7ff6f00098b0 tx=0x7ff6f00043d0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.753+0000 7ff706f3f640 1 -- 192.168.123.103:0/2085950964 shutdown_connections 2026-03-09T16:18:18.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.753+0000 7ff706f3f640 1 --2- 192.168.123.103:0/2085950964 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7ff6d80779b0 0x7ff6d8079e70 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.753+0000 7ff706f3f640 1 --2- 192.168.123.103:0/2085950964 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff7001056e0 0x7ff70019abd0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.753+0000 7ff706f3f640 1 --2- 192.168.123.103:0/2085950964 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7000fe630 0x7ff70019a690 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.753+0000 7ff706f3f640 1 -- 192.168.123.103:0/2085950964 >> 192.168.123.103:0/2085950964 conn(0x7ff7000fa4a0 msgr2=0x7ff7000fb710 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:18.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.754+0000 7ff706f3f640 1 -- 192.168.123.103:0/2085950964 shutdown_connections 2026-03-09T16:18:18.754 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.754+0000 7ff706f3f640 1 -- 192.168.123.103:0/2085950964 wait complete. 
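The "orch upgrade status" JSON above reports the staggered upgrade still in progress: the mon, crash, and mgr services are complete, 9 of 23 daemons have been upgraded, and cephadm is currently working through the osds. A minimal polling sketch that waits on that status follows; the timeout and interval values are hypothetical and the loop is illustrative, not the mechanism the suite itself uses.

import json
import subprocess
import time

def wait_for_upgrade(timeout=3600.0, interval=30.0):
    # Hypothetical wait loop, not part of teuthology: poll
    # "ceph orch upgrade status" (JSON output, as in the log above) until
    # in_progress turns false or the timeout expires; timeout and interval
    # are illustrative values, not taken from the test.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        out = subprocess.run(
            ["ceph", "orch", "upgrade", "status"],
            check=True, capture_output=True, text=True,
        ).stdout
        status = json.loads(out)
        if not status.get("in_progress"):
            return True
        print(status.get("progress"), "-", status.get("message"))
        time.sleep(interval)
    return False

if __name__ == "__main__":
    print("upgrade finished:", wait_for_upgrade())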
2026-03-09T16:18:18.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.824+0000 7f77a03b2640 1 -- 192.168.123.103:0/1400111158 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7798076040 msgr2=0x7f7798111330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.824+0000 7f77a03b2640 1 --2- 192.168.123.103:0/1400111158 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7798076040 0x7f7798111330 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f77880099b0 tx=0x7f778802f240 comp rx=0 tx=0).stop 2026-03-09T16:18:18.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.824+0000 7f77a03b2640 1 -- 192.168.123.103:0/1400111158 shutdown_connections 2026-03-09T16:18:18.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.824+0000 7f77a03b2640 1 --2- 192.168.123.103:0/1400111158 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7798076040 0x7f7798111330 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.824+0000 7f77a03b2640 1 --2- 192.168.123.103:0/1400111158 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7798075720 0x7f7798075b00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.824+0000 7f77a03b2640 1 -- 192.168.123.103:0/1400111158 >> 192.168.123.103:0/1400111158 conn(0x7f77980fe710 msgr2=0x7f7798100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:18.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.825+0000 7f77a03b2640 1 -- 192.168.123.103:0/1400111158 shutdown_connections 2026-03-09T16:18:18.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.825+0000 7f77a03b2640 1 -- 192.168.123.103:0/1400111158 wait complete. 
2026-03-09T16:18:18.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.825+0000 7f77a03b2640 1 Processor -- start 2026-03-09T16:18:18.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.825+0000 7f77a03b2640 1 -- start start 2026-03-09T16:18:18.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.826+0000 7f77a03b2640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7798075720 0x7f779819ed80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:18.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.826+0000 7f779e127640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7798075720 0x7f779819ed80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:18.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.826+0000 7f779e127640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7798075720 0x7f779819ed80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:45106/0 (socket says 192.168.123.103:45106) 2026-03-09T16:18:18.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.826+0000 7f779e127640 1 -- 192.168.123.103:0/248719719 learned_addr learned my addr 192.168.123.103:0/248719719 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:18:18.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.826+0000 7f77a03b2640 1 --2- 192.168.123.103:0/248719719 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7798076040 0x7f779819f2c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:18.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.826+0000 7f77a03b2640 1 -- 192.168.123.103:0/248719719 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f779819f950 con 0x7f7798075720 2026-03-09T16:18:18.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.826+0000 7f779d926640 1 --2- 192.168.123.103:0/248719719 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7798076040 0x7f779819f2c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:18.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.826+0000 7f77a03b2640 1 -- 192.168.123.103:0/248719719 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f77981a36c0 con 0x7f7798076040 2026-03-09T16:18:18.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.827+0000 7f779e127640 1 -- 192.168.123.103:0/248719719 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7798076040 msgr2=0x7f779819f2c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.827+0000 7f779e127640 1 --2- 192.168.123.103:0/248719719 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7798076040 0x7f779819f2c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.827+0000 7f779e127640 1 -- 192.168.123.103:0/248719719 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f778c009590 con 0x7f7798075720 2026-03-09T16:18:18.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.827+0000 7f779e127640 1 --2- 192.168.123.103:0/248719719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7798075720 0x7f779819ed80 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f778c002990 tx=0x7f778c002e60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:18.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.833+0000 7f77877fe640 1 -- 192.168.123.103:0/248719719 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f778c00eba0 con 0x7f7798075720 2026-03-09T16:18:18.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.833+0000 7f77877fe640 1 -- 192.168.123.103:0/248719719 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f778c00ed00 con 0x7f7798075720 2026-03-09T16:18:18.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.834+0000 7f77877fe640 1 -- 192.168.123.103:0/248719719 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f778c018730 con 0x7f7798075720 2026-03-09T16:18:18.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.835+0000 7f77a03b2640 1 -- 192.168.123.103:0/248719719 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7788009660 con 0x7f7798075720 2026-03-09T16:18:18.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.835+0000 7f77a03b2640 1 -- 192.168.123.103:0/248719719 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f77981a3ce0 con 0x7f7798075720 2026-03-09T16:18:18.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.839+0000 7f77a03b2640 1 -- 192.168.123.103:0/248719719 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7798076e60 con 0x7f7798075720 2026-03-09T16:18:18.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.840+0000 7f77877fe640 1 -- 192.168.123.103:0/248719719 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f778c016020 con 0x7f7798075720 2026-03-09T16:18:18.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.841+0000 7f77877fe640 1 --2- 192.168.123.103:0/248719719 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f776c0777a0 0x7f776c079c60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:18.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.841+0000 7f77877fe640 1 -- 192.168.123.103:0/248719719 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f778c09a140 con 0x7f7798075720 2026-03-09T16:18:18.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.842+0000 7f779d926640 1 --2- 192.168.123.103:0/248719719 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f776c0777a0 0x7f776c079c60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:18.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.842+0000 7f779d926640 1 
--2- 192.168.123.103:0/248719719 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f776c0777a0 0x7f776c079c60 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f77981a0330 tx=0x7f7788002f10 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:18.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.844+0000 7f77877fe640 1 -- 192.168.123.103:0/248719719 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f778c062650 con 0x7f7798075720 2026-03-09T16:18:18.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.984+0000 7f77a03b2640 1 -- 192.168.123.103:0/248719719 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f77981a0110 con 0x7f7798075720 2026-03-09T16:18:18.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.986+0000 7f77877fe640 1 -- 192.168.123.103:0/248719719 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7f778c061da0 con 0x7f7798075720 2026-03-09T16:18:18.987 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T16:18:18.987 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T16:18:18.987 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled. 2026-03-09T16:18:18.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.991+0000 7f77a03b2640 1 -- 192.168.123.103:0/248719719 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f776c0777a0 msgr2=0x7f776c079c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.991+0000 7f77a03b2640 1 --2- 192.168.123.103:0/248719719 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f776c0777a0 0x7f776c079c60 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f77981a0330 tx=0x7f7788002f10 comp rx=0 tx=0).stop 2026-03-09T16:18:18.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.992+0000 7f77a03b2640 1 -- 192.168.123.103:0/248719719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7798075720 msgr2=0x7f779819ed80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:18.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.992+0000 7f77a03b2640 1 --2- 192.168.123.103:0/248719719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7798075720 0x7f779819ed80 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f778c002990 tx=0x7f778c002e60 comp rx=0 tx=0).stop 2026-03-09T16:18:18.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.992+0000 7f77a03b2640 1 -- 192.168.123.103:0/248719719 shutdown_connections 2026-03-09T16:18:18.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.992+0000 7f77a03b2640 1 --2- 192.168.123.103:0/248719719 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f776c0777a0 0x7f776c079c60 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.992+0000 
7f77a03b2640 1 --2- 192.168.123.103:0/248719719 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7798076040 0x7f779819f2c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.992+0000 7f77a03b2640 1 --2- 192.168.123.103:0/248719719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7798075720 0x7f779819ed80 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:18.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.992+0000 7f77a03b2640 1 -- 192.168.123.103:0/248719719 >> 192.168.123.103:0/248719719 conn(0x7f77980fe710 msgr2=0x7f77980ffd20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:18.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.993+0000 7f77a03b2640 1 -- 192.168.123.103:0/248719719 shutdown_connections 2026-03-09T16:18:18.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:18.993+0000 7f77a03b2640 1 -- 192.168.123.103:0/248719719 wait complete. 2026-03-09T16:18:19.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:19 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T16:18:19.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:19 vm05.local ceph-mon[108543]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T16:18:19.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:19 vm05.local ceph-mon[108543]: Upgrade: osd.3 is safe to restart 2026-03-09T16:18:19.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:19 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/474293927' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:19.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:19 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/453755319' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:18:19.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:19 vm05.local ceph-mon[108543]: from='client.44161 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:19.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:19 vm05.local ceph-mon[108543]: Upgrade: Updating osd.3 2026-03-09T16:18:19.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:19 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:19.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:19 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T16:18:19.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:19 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:18:19.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:19 vm05.local ceph-mon[108543]: Deploying daemon osd.3 on vm05 2026-03-09T16:18:19.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:19 vm05.local ceph-mon[108543]: from='client.? 
192.168.123.103:0/248719719' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:18:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:19 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T16:18:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:19 vm03.local ceph-mon[133973]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T16:18:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:19 vm03.local ceph-mon[133973]: Upgrade: osd.3 is safe to restart 2026-03-09T16:18:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:19 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/474293927' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:19 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/453755319' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:18:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:19 vm03.local ceph-mon[133973]: from='client.44161 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:19 vm03.local ceph-mon[133973]: Upgrade: Updating osd.3 2026-03-09T16:18:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:19 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:19 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T16:18:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:19 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:18:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:19 vm03.local ceph-mon[133973]: Deploying daemon osd.3 on vm05 2026-03-09T16:18:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:19 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/248719719' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:18:20.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:20 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:20.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:20 vm05.local ceph-mon[108543]: pgmap v61: 65 pgs: 65 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:20.776 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:20 vm05.local systemd[1]: Stopping Ceph osd.3 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 
2026-03-09T16:18:20.776 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:20 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3[64527]: 2026-03-09T16:18:20.472+0000 7f8c9bee7640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:18:20.776 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:20 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3[64527]: 2026-03-09T16:18:20.472+0000 7f8c9bee7640 -1 osd.3 61 *** Got signal Terminated *** 2026-03-09T16:18:20.776 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:20 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3[64527]: 2026-03-09T16:18:20.472+0000 7f8c9bee7640 -1 osd.3 61 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T16:18:20.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:20 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:20.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:20 vm03.local ceph-mon[133973]: pgmap v61: 65 pgs: 65 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:22.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:21 vm05.local ceph-mon[108543]: osd.3 marked itself down and dead 2026-03-09T16:18:22.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:21 vm05.local ceph-mon[108543]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T16:18:22.028 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:21 vm05.local podman[115767]: 2026-03-09 16:18:21.67363015 +0000 UTC m=+1.271500015 container died d95aab347c9f52456642cf90031e627aac878eff5801ac09a95af70cac70852f (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T16:18:22.028 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:21 vm05.local podman[115767]: 2026-03-09 16:18:21.883864979 +0000 UTC m=+1.481734833 container remove d95aab347c9f52456642cf90031e627aac878eff5801ac09a95af70cac70852f (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=reef) 2026-03-09T16:18:22.028 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:21 vm05.local bash[115767]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3 2026-03-09T16:18:22.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:21 vm03.local ceph-mon[133973]: osd.3 marked itself down and dead 2026-03-09T16:18:22.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:21 vm03.local ceph-mon[133973]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T16:18:22.403 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:22 vm05.local podman[115846]: 2026-03-09 16:18:22.165103655 +0000 UTC m=+0.050188910 container create e3740f552fc7cecf7e81ef4546a9700a76a1a99ff94b43844771002b42ffaa19 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-deactivate, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T16:18:22.403 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:22 vm05.local podman[115846]: 2026-03-09 16:18:22.22429829 +0000 UTC m=+0.109383555 container init e3740f552fc7cecf7e81ef4546a9700a76a1a99ff94b43844771002b42ffaa19 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , ceph=True) 2026-03-09T16:18:22.403 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:22 vm05.local podman[115846]: 2026-03-09 16:18:22.229874808 +0000 UTC m=+0.114960063 container start e3740f552fc7cecf7e81ef4546a9700a76a1a99ff94b43844771002b42ffaa19 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, 
FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3) 2026-03-09T16:18:22.403 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:22 vm05.local podman[115846]: 2026-03-09 16:18:22.142118483 +0000 UTC m=+0.027203749 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:18:22.403 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:22 vm05.local podman[115846]: 2026-03-09 16:18:22.261394212 +0000 UTC m=+0.146479467 container attach e3740f552fc7cecf7e81ef4546a9700a76a1a99ff94b43844771002b42ffaa19 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T16:18:22.747 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:22 vm05.local conmon[115857]: conmon e3740f552fc7cecf7e81 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e3740f552fc7cecf7e81ef4546a9700a76a1a99ff94b43844771002b42ffaa19.scope/container/memory.events 2026-03-09T16:18:22.747 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:22 vm05.local podman[115846]: 2026-03-09 16:18:22.40393657 +0000 UTC m=+0.289021825 container died e3740f552fc7cecf7e81ef4546a9700a76a1a99ff94b43844771002b42ffaa19 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-deactivate, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid) 2026-03-09T16:18:23.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:22 vm05.local ceph-mon[108543]: osdmap e62: 6 total, 5 up, 6 in 2026-03-09T16:18:23.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:22 vm05.local ceph-mon[108543]: pgmap v63: 65 pgs: 16 stale+active+clean, 49 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:23.026 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:22 vm05.local podman[115846]: 2026-03-09 16:18:22.747370397 +0000 UTC m=+0.632455643 container remove e3740f552fc7cecf7e81ef4546a9700a76a1a99ff94b43844771002b42ffaa19 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS) 2026-03-09T16:18:23.026 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:22 vm05.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.3.service: Deactivated successfully. 2026-03-09T16:18:23.026 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:22 vm05.local systemd[1]: Stopped Ceph osd.3 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 2026-03-09T16:18:23.027 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:22 vm05.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.3.service: Consumed 42.235s CPU time. 2026-03-09T16:18:23.027 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:22 vm05.local systemd[1]: Starting Ceph osd.3 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 2026-03-09T16:18:23.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:22 vm03.local ceph-mon[133973]: osdmap e62: 6 total, 5 up, 6 in 2026-03-09T16:18:23.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:22 vm03.local ceph-mon[133973]: pgmap v63: 65 pgs: 16 stale+active+clean, 49 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:23.385 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local podman[115950]: 2026-03-09 16:18:23.131105597 +0000 UTC m=+0.059215936 container create 6cb9d83cb4c8903976928d9be32f6170131c58b7c7357f752686e62acad07748 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:18:23.385 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local podman[115950]: 2026-03-09 16:18:23.081607671 +0000 UTC m=+0.009718021 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:18:23.385 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local podman[115950]: 2026-03-09 16:18:23.19653433 +0000 UTC m=+0.124644659 container init 6cb9d83cb4c8903976928d9be32f6170131c58b7c7357f752686e62acad07748 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default) 2026-03-09T16:18:23.385 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local podman[115950]: 2026-03-09 16:18:23.203754834 +0000 UTC m=+0.131865173 container start 6cb9d83cb4c8903976928d9be32f6170131c58b7c7357f752686e62acad07748 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) 2026-03-09T16:18:23.385 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local podman[115950]: 2026-03-09 16:18:23.243617056 +0000 UTC m=+0.171727395 container attach 6cb9d83cb4c8903976928d9be32f6170131c58b7c7357f752686e62acad07748 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0) 2026-03-09T16:18:23.385 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate[115962]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:23.385 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local bash[115950]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:23.385 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate[115962]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:23.386 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 
vm05.local bash[115950]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:24.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:23 vm03.local ceph-mon[133973]: osdmap e63: 6 total, 5 up, 6 in 2026-03-09T16:18:24.201 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:23 vm05.local ceph-mon[108543]: osdmap e63: 6 total, 5 up, 6 in 2026-03-09T16:18:24.201 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate[115962]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T16:18:24.201 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate[115962]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:24.201 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local bash[115950]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T16:18:24.201 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local bash[115950]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:24.201 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate[115962]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:24.201 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local bash[115950]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:24.201 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate[115962]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T16:18:24.201 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local bash[115950]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T16:18:24.201 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate[115962]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-2572d668-4736-4205-9b79-df0f906f872c/osd-block-aa64c4f2-8110-40fd-928c-4df2efafc82e --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-09T16:18:24.201 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:23 vm05.local bash[115950]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-2572d668-4736-4205-9b79-df0f906f872c/osd-block-aa64c4f2-8110-40fd-928c-4df2efafc82e --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate[115962]: Running command: /usr/bin/ln -snf /dev/ceph-2572d668-4736-4205-9b79-df0f906f872c/osd-block-aa64c4f2-8110-40fd-928c-4df2efafc82e /var/lib/ceph/osd/ceph-3/block 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local bash[115950]: Running command: /usr/bin/ln -snf /dev/ceph-2572d668-4736-4205-9b79-df0f906f872c/osd-block-aa64c4f2-8110-40fd-928c-4df2efafc82e /var/lib/ceph/osd/ceph-3/block 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate[115962]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local bash[115950]: Running 
command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate[115962]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local bash[115950]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate[115962]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local bash[115950]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate[115962]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local bash[115950]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local podman[115950]: 2026-03-09 16:18:24.229445066 +0000 UTC m=+1.157555405 container died 6cb9d83cb4c8903976928d9be32f6170131c58b7c7357f752686e62acad07748 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, ceph=True) 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local podman[115950]: 2026-03-09 16:18:24.247092266 +0000 UTC m=+1.175202605 container remove 6cb9d83cb4c8903976928d9be32f6170131c58b7c7357f752686e62acad07748 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-activate, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local podman[116209]: 2026-03-09 16:18:24.347165114 +0000 UTC m=+0.020277565 container create c052610d74d5d164c540ff0275b7dc94403a3b29408868e729a32cd7e6882091 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, io.buildah.version=1.41.3) 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local podman[116209]: 2026-03-09 16:18:24.394176145 +0000 UTC m=+0.067288596 container init c052610d74d5d164c540ff0275b7dc94403a3b29408868e729a32cd7e6882091 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20260223) 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local podman[116209]: 2026-03-09 16:18:24.397649996 +0000 UTC m=+0.070762457 container start c052610d74d5d164c540ff0275b7dc94403a3b29408868e729a32cd7e6882091 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local bash[116209]: c052610d74d5d164c540ff0275b7dc94403a3b29408868e729a32cd7e6882091 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local podman[116209]: 2026-03-09 16:18:24.338488863 +0000 UTC m=+0.011601314 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:18:24.528 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:24 vm05.local systemd[1]: Started Ceph osd.3 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 
2026-03-09T16:18:25.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:25 vm03.local ceph-mon[133973]: pgmap v65: 65 pgs: 4 active+undersized, 13 stale+active+clean, 3 active+undersized+degraded, 45 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 10/264 objects degraded (3.788%) 2026-03-09T16:18:25.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:25 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:25.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:25 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:18:25.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:25 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:25.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:25 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:25.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:25 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:18:25.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:25 vm03.local ceph-mon[133973]: Health check failed: Degraded data redundancy: 10/264 objects degraded (3.788%), 3 pgs degraded (PG_DEGRADED) 2026-03-09T16:18:25.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:25 vm05.local ceph-mon[108543]: pgmap v65: 65 pgs: 4 active+undersized, 13 stale+active+clean, 3 active+undersized+degraded, 45 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 10/264 objects degraded (3.788%) 2026-03-09T16:18:25.278 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:25 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:25.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:25 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:18:25.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:25 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:25.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:25 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:25.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:25 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:18:25.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:25 vm05.local ceph-mon[108543]: Health check failed: Degraded data redundancy: 10/264 objects degraded (3.788%), 3 pgs degraded (PG_DEGRADED) 2026-03-09T16:18:25.279 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:25 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3[116220]: 2026-03-09T16:18:25.025+0000 7f7eaaa7c740 -1 Falling back to public interface 2026-03-09T16:18:26.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:26 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:26.526 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:26 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:26.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:26 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:26.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:26 vm05.local ceph-mon[108543]: pgmap v66: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 127 B/s wr, 0 op/s; 59/264 objects degraded (22.348%) 2026-03-09T16:18:26.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:26 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:26.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:26 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:26.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:26 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:26.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:26 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:26.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:26 vm03.local ceph-mon[133973]: pgmap v66: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 127 B/s wr, 0 op/s; 59/264 objects degraded (22.348%) 2026-03-09T16:18:26.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:26 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:27.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:27 vm05.local ceph-mon[108543]: mgrmap e38: vm03.gbgzmu(active, since 92s), standbys: vm05.dygxfv 2026-03-09T16:18:27.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:27.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:27.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:18:27.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:18:27.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:27.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:18:27.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:27.776 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:27.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:27.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T16:18:27.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:27 vm03.local ceph-mon[133973]: mgrmap e38: vm03.gbgzmu(active, since 92s), standbys: vm05.dygxfv 2026-03-09T16:18:27.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:27.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:27.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:18:27.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:18:27.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:27.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:18:27.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:27.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:27.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:27.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T16:18:28.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:28 vm05.local ceph-mon[108543]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T16:18:28.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:28 vm05.local ceph-mon[108543]: Upgrade: unsafe to stop osd(s) at this time (18 PGs are or would become offline) 2026-03-09T16:18:28.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:28 vm05.local ceph-mon[108543]: pgmap v67: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 127 B/s wr, 0 op/s; 59/264 objects degraded (22.348%) 2026-03-09T16:18:28.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:28 vm03.local ceph-mon[133973]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T16:18:28.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:28 vm03.local ceph-mon[133973]: Upgrade: unsafe to stop osd(s) at this time (18 PGs are or would become offline) 2026-03-09T16:18:28.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:28 vm03.local ceph-mon[133973]: pgmap v67: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 254 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 127 B/s wr, 0 op/s; 59/264 objects degraded (22.348%) 2026-03-09T16:18:29.517 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:29 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3[116220]: 2026-03-09T16:18:29.172+0000 7f7eaaa7c740 -1 osd.3 61 log_to_monitors true 2026-03-09T16:18:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:29 vm05.local ceph-mon[108543]: from='osd.3 [v2:192.168.123.105:6800/79995398,v1:192.168.123.105:6801/79995398]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T16:18:29.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:29 vm05.local ceph-mon[108543]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T16:18:29.776 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:18:29 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3[116220]: 2026-03-09T16:18:29.539+0000 7f7ea2816640 -1 osd.3 61 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T16:18:29.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:29 vm03.local ceph-mon[133973]: from='osd.3 [v2:192.168.123.105:6800/79995398,v1:192.168.123.105:6801/79995398]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T16:18:29.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:29 vm03.local ceph-mon[133973]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T16:18:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:30 vm03.local ceph-mon[133973]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T16:18:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:30 vm03.local ceph-mon[133973]: osdmap e64: 6 total, 5 up, 6 in 2026-03-09T16:18:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:30 vm03.local ceph-mon[133973]: from='osd.3 [v2:192.168.123.105:6800/79995398,v1:192.168.123.105:6801/79995398]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 
2026-03-09T16:18:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:30 vm03.local ceph-mon[133973]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:18:30.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:30 vm03.local ceph-mon[133973]: pgmap v69: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 383 B/s wr, 0 op/s; 59/264 objects degraded (22.348%) 2026-03-09T16:18:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:30 vm05.local ceph-mon[108543]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T16:18:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:30 vm05.local ceph-mon[108543]: osdmap e64: 6 total, 5 up, 6 in 2026-03-09T16:18:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:30 vm05.local ceph-mon[108543]: from='osd.3 [v2:192.168.123.105:6800/79995398,v1:192.168.123.105:6801/79995398]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:18:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:30 vm05.local ceph-mon[108543]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:18:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:30 vm05.local ceph-mon[108543]: pgmap v69: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 383 B/s wr, 0 op/s; 59/264 objects degraded (22.348%) 2026-03-09T16:18:31.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:31 vm03.local ceph-mon[133973]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T16:18:31.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:31 vm03.local ceph-mon[133973]: osd.3 [v2:192.168.123.105:6800/79995398,v1:192.168.123.105:6801/79995398] boot 2026-03-09T16:18:31.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:31 vm03.local ceph-mon[133973]: osdmap e65: 6 total, 6 up, 6 in 2026-03-09T16:18:31.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:31 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:18:31.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:31 vm03.local ceph-mon[133973]: Health check update: Degraded data redundancy: 59/264 objects degraded (22.348%), 18 pgs degraded (PG_DEGRADED) 2026-03-09T16:18:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:31 vm05.local ceph-mon[108543]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T16:18:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:31 vm05.local ceph-mon[108543]: osd.3 [v2:192.168.123.105:6800/79995398,v1:192.168.123.105:6801/79995398] boot 2026-03-09T16:18:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:31 vm05.local ceph-mon[108543]: osdmap e65: 6 total, 6 up, 6 in 2026-03-09T16:18:32.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:31 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T16:18:32.026 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:31 vm05.local ceph-mon[108543]: Health check update: Degraded data redundancy: 59/264 objects degraded (22.348%), 18 pgs degraded (PG_DEGRADED) 2026-03-09T16:18:32.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:32 vm03.local ceph-mon[133973]: osdmap e66: 6 total, 6 up, 6 in 2026-03-09T16:18:32.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:32 vm03.local ceph-mon[133973]: pgmap v72: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 341 B/s wr, 0 op/s; 59/264 objects degraded (22.348%) 2026-03-09T16:18:33.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:32 vm05.local ceph-mon[108543]: osdmap e66: 6 total, 6 up, 6 in 2026-03-09T16:18:33.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:32 vm05.local ceph-mon[108543]: pgmap v72: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 341 B/s wr, 0 op/s; 59/264 objects degraded (22.348%) 2026-03-09T16:18:35.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:35 vm03.local ceph-mon[133973]: pgmap v73: 65 pgs: 17 active+undersized, 17 active+undersized+degraded, 31 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 341 B/s wr, 0 op/s; 56/264 objects degraded (21.212%) 2026-03-09T16:18:35.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:35 vm05.local ceph-mon[108543]: pgmap v73: 65 pgs: 17 active+undersized, 17 active+undersized+degraded, 31 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 341 B/s wr, 0 op/s; 56/264 objects degraded (21.212%) 2026-03-09T16:18:36.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:36 vm03.local ceph-mon[133973]: Health check update: Degraded data redundancy: 56/264 objects degraded (21.212%), 17 pgs degraded (PG_DEGRADED) 2026-03-09T16:18:36.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:36 vm05.local ceph-mon[108543]: Health check update: Degraded data redundancy: 56/264 objects degraded (21.212%), 17 pgs degraded (PG_DEGRADED) 2026-03-09T16:18:37.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:37 vm03.local ceph-mon[133973]: pgmap v74: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 13 B/s, 0 objects/s recovering 2026-03-09T16:18:37.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:37 vm03.local ceph-mon[133973]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 56/264 objects degraded (21.212%), 17 pgs degraded) 2026-03-09T16:18:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:37 vm05.local ceph-mon[108543]: pgmap v74: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 13 B/s, 0 objects/s recovering 2026-03-09T16:18:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:37 vm05.local ceph-mon[108543]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 56/264 objects degraded (21.212%), 17 pgs degraded) 2026-03-09T16:18:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:39 vm03.local ceph-mon[133973]: pgmap v75: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 11 B/s, 0 objects/s recovering 2026-03-09T16:18:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:39 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:39.390 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:39 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:18:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:39 vm05.local ceph-mon[108543]: pgmap v75: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 11 B/s, 0 objects/s recovering 2026-03-09T16:18:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:39 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:39 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:18:41.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:41 vm03.local ceph-mon[133973]: pgmap v76: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 9 B/s, 0 objects/s recovering 2026-03-09T16:18:41.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:41 vm05.local ceph-mon[108543]: pgmap v76: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 9 B/s, 0 objects/s recovering 2026-03-09T16:18:42.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:42 vm05.local ceph-mon[108543]: pgmap v77: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 8 B/s, 0 objects/s recovering 2026-03-09T16:18:42.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:42 vm03.local ceph-mon[133973]: pgmap v77: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 8 B/s, 0 objects/s recovering 2026-03-09T16:18:43.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:43 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T16:18:43.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:43 vm03.local ceph-mon[133973]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T16:18:43.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:43 vm03.local ceph-mon[133973]: Upgrade: osd.4 is safe to restart 2026-03-09T16:18:43.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:43 vm03.local ceph-mon[133973]: Upgrade: Updating osd.4 2026-03-09T16:18:43.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:43 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:43.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:43 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T16:18:43.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:43 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:18:43.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:43 vm03.local ceph-mon[133973]: Deploying daemon osd.4 on vm05 2026-03-09T16:18:43.699 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:43 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T16:18:43.699 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:43 vm05.local ceph-mon[108543]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T16:18:43.699 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:43 vm05.local ceph-mon[108543]: Upgrade: osd.4 is safe to restart 2026-03-09T16:18:43.699 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:43 vm05.local ceph-mon[108543]: Upgrade: Updating osd.4 2026-03-09T16:18:43.699 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:43 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:43.699 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:43 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T16:18:43.699 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:43 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:18:43.699 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:43 vm05.local ceph-mon[108543]: Deploying daemon osd.4 on vm05 2026-03-09T16:18:44.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:44 vm05.local ceph-mon[108543]: pgmap v78: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 7 B/s, 0 objects/s recovering 2026-03-09T16:18:44.817 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:44 vm03.local ceph-mon[133973]: pgmap v78: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 7 B/s, 0 objects/s recovering 2026-03-09T16:18:45.118 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:45 vm05.local systemd[1]: Stopping Ceph osd.4 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 
2026-03-09T16:18:45.526 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:45 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4[70826]: 2026-03-09T16:18:45.115+0000 7f8fe5546640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:18:45.526 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:45 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4[70826]: 2026-03-09T16:18:45.115+0000 7f8fe5546640 -1 osd.4 66 *** Got signal Terminated *** 2026-03-09T16:18:45.526 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:45 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4[70826]: 2026-03-09T16:18:45.115+0000 7f8fe5546640 -1 osd.4 66 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T16:18:45.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:45 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:45.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:45 vm03.local ceph-mon[133973]: osd.4 marked itself down and dead 2026-03-09T16:18:46.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:45 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:46.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:45 vm05.local ceph-mon[108543]: osd.4 marked itself down and dead 2026-03-09T16:18:46.026 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:45 vm05.local podman[120100]: 2026-03-09 16:18:45.827294541 +0000 UTC m=+0.732469479 container died 5076005b452d33b4f3b0f46ed1581d6ae8c4854a1526f799e40dc9a6ea7012ff (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2) 2026-03-09T16:18:46.026 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:45 vm05.local podman[120100]: 2026-03-09 16:18:45.885485358 +0000 UTC m=+0.790660296 container remove 5076005b452d33b4f3b0f46ed1581d6ae8c4854a1526f799e40dc9a6ea7012ff (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, io.buildah.version=1.41.3) 2026-03-09T16:18:46.026 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:45 vm05.local bash[120100]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4 2026-03-09T16:18:46.450 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:46 vm05.local podman[120169]: 2026-03-09 16:18:46.167108543 +0000 UTC m=+0.098794425 container create 19247f15f482e271d39a7b5fd445f862e7e953d1cd016ce0ad284b48332208b8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:18:46.451 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:46 vm05.local podman[120169]: 2026-03-09 16:18:46.078585585 +0000 UTC m=+0.010271467 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:18:46.451 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:46 vm05.local podman[120169]: 2026-03-09 16:18:46.288578479 +0000 UTC m=+0.220264362 container init 19247f15f482e271d39a7b5fd445f862e7e953d1cd016ce0ad284b48332208b8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T16:18:46.451 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:46 vm05.local podman[120169]: 2026-03-09 16:18:46.292937348 +0000 UTC m=+0.224623230 container start 19247f15f482e271d39a7b5fd445f862e7e953d1cd016ce0ad284b48332208b8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-deactivate, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223) 2026-03-09T16:18:46.451 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:46 vm05.local podman[120169]: 2026-03-09 16:18:46.342733822 +0000 UTC m=+0.274419694 container attach 19247f15f482e271d39a7b5fd445f862e7e953d1cd016ce0ad284b48332208b8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-deactivate, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T16:18:46.707 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:46 vm05.local ceph-mon[108543]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T16:18:46.707 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:46 vm05.local ceph-mon[108543]: osdmap e67: 6 total, 5 up, 6 in 2026-03-09T16:18:46.707 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:46 vm05.local ceph-mon[108543]: pgmap v80: 65 pgs: 6 stale+active+clean, 59 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:46.707 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:46 vm05.local conmon[120180]: conmon 19247f15f482e271d39a : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-19247f15f482e271d39a7b5fd445f862e7e953d1cd016ce0ad284b48332208b8.scope/container/memory.events 2026-03-09T16:18:46.707 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:46 vm05.local podman[120169]: 2026-03-09 16:18:46.451312299 +0000 UTC m=+0.382998181 container died 19247f15f482e271d39a7b5fd445f862e7e953d1cd016ce0ad284b48332208b8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True) 2026-03-09T16:18:46.708 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:46 vm05.local podman[120169]: 2026-03-09 16:18:46.671525574 +0000 UTC m=+0.603211456 container remove 19247f15f482e271d39a7b5fd445f862e7e953d1cd016ce0ad284b48332208b8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-deactivate, OSD_FLAVOR=default, org.label-schema.build-date=20260223, 
org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:18:46.708 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:46 vm05.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.4.service: Deactivated successfully. 2026-03-09T16:18:46.708 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:46 vm05.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.4.service: Unit process 120180 (conmon) remains running after unit stopped. 2026-03-09T16:18:46.708 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:46 vm05.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.4.service: Unit process 120188 (podman) remains running after unit stopped. 2026-03-09T16:18:46.708 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:46 vm05.local systemd[1]: Stopped Ceph osd.4 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 2026-03-09T16:18:46.708 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:46 vm05.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.4.service: Consumed 36.610s CPU time, 787.7M memory peak. 2026-03-09T16:18:46.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:46 vm03.local ceph-mon[133973]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T16:18:46.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:46 vm03.local ceph-mon[133973]: osdmap e67: 6 total, 5 up, 6 in 2026-03-09T16:18:46.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:46 vm03.local ceph-mon[133973]: pgmap v80: 65 pgs: 6 stale+active+clean, 59 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:47.026 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:46 vm05.local systemd[1]: Starting Ceph osd.4 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 
2026-03-09T16:18:47.528 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local podman[120271]: 2026-03-09 16:18:47.055735271 +0000 UTC m=+0.037944151 container create 99a30a0808ef7f0212f8c2aa8532c40b1cd4ab2a51751c6dc35855383cfba418 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T16:18:47.528 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local podman[120271]: 2026-03-09 16:18:47.119812024 +0000 UTC m=+0.102020915 container init 99a30a0808ef7f0212f8c2aa8532c40b1cd4ab2a51751c6dc35855383cfba418 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default) 2026-03-09T16:18:47.528 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local podman[120271]: 2026-03-09 16:18:47.124328388 +0000 UTC m=+0.106537268 container start 99a30a0808ef7f0212f8c2aa8532c40b1cd4ab2a51751c6dc35855383cfba418 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2) 2026-03-09T16:18:47.528 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local podman[120271]: 2026-03-09 16:18:47.030766387 +0000 UTC m=+0.012975257 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:18:47.528 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 
vm05.local podman[120271]: 2026-03-09 16:18:47.175953384 +0000 UTC m=+0.158162264 container attach 99a30a0808ef7f0212f8c2aa8532c40b1cd4ab2a51751c6dc35855383cfba418 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_REF=squid, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True) 2026-03-09T16:18:47.528 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate[120282]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:47.528 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local bash[120271]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:47.528 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate[120282]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:47.528 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local bash[120271]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:48.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:47 vm05.local ceph-mon[108543]: osdmap e68: 6 total, 5 up, 6 in 2026-03-09T16:18:48.026 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate[120282]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T16:18:48.026 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate[120282]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:48.026 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local bash[120271]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T16:18:48.026 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local bash[120271]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:48.026 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate[120282]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:48.026 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local bash[120271]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:18:48.026 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate[120282]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-09T16:18:48.026 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local bash[120271]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-09T16:18:48.026 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local 
ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate[120282]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-4e3cae96-8d9b-49ae-9db8-e7e653c8f7ed/osd-block-1567921f-08ce-4412-84d0-a4474c4e6ac0 --path /var/lib/ceph/osd/ceph-4 --no-mon-config 2026-03-09T16:18:48.026 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:47 vm05.local bash[120271]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-4e3cae96-8d9b-49ae-9db8-e7e653c8f7ed/osd-block-1567921f-08ce-4412-84d0-a4474c4e6ac0 --path /var/lib/ceph/osd/ceph-4 --no-mon-config 2026-03-09T16:18:48.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:47 vm03.local ceph-mon[133973]: osdmap e68: 6 total, 5 up, 6 in 2026-03-09T16:18:48.338 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate[120282]: Running command: /usr/bin/ln -snf /dev/ceph-4e3cae96-8d9b-49ae-9db8-e7e653c8f7ed/osd-block-1567921f-08ce-4412-84d0-a4474c4e6ac0 /var/lib/ceph/osd/ceph-4/block 2026-03-09T16:18:48.338 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local bash[120271]: Running command: /usr/bin/ln -snf /dev/ceph-4e3cae96-8d9b-49ae-9db8-e7e653c8f7ed/osd-block-1567921f-08ce-4412-84d0-a4474c4e6ac0 /var/lib/ceph/osd/ceph-4/block 2026-03-09T16:18:48.338 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate[120282]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-09T16:18:48.338 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local bash[120271]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-09T16:18:48.338 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate[120282]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-09T16:18:48.338 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local bash[120271]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-09T16:18:48.338 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate[120282]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-09T16:18:48.338 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local bash[120271]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-09T16:18:48.338 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate[120282]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-09T16:18:48.338 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local bash[120271]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-09T16:18:48.338 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local podman[120271]: 2026-03-09 16:18:48.091443282 +0000 UTC m=+1.073652162 container died 99a30a0808ef7f0212f8c2aa8532c40b1cd4ab2a51751c6dc35855383cfba418 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20260223, CEPH_REF=squid, io.buildah.version=1.41.3, 
org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2) 2026-03-09T16:18:48.588 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local podman[120271]: 2026-03-09 16:18:48.337526498 +0000 UTC m=+1.319735388 container remove 99a30a0808ef7f0212f8c2aa8532c40b1cd4ab2a51751c6dc35855383cfba418 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-activate, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T16:18:48.589 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local podman[120537]: 2026-03-09 16:18:48.48713974 +0000 UTC m=+0.054262305 container create 4115e4720b892c0ebdaf2ba23a5cd3d7b508713d063d0c1af354be329d564f20 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2) 2026-03-09T16:18:48.589 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local podman[120537]: 2026-03-09 16:18:48.448133051 +0000 UTC m=+0.015255605 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:18:48.873 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local podman[120537]: 2026-03-09 16:18:48.588569386 +0000 UTC m=+0.155691940 container init 4115e4720b892c0ebdaf2ba23a5cd3d7b508713d063d0c1af354be329d564f20 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , 
FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T16:18:48.873 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local podman[120537]: 2026-03-09 16:18:48.592057255 +0000 UTC m=+0.159179809 container start 4115e4720b892c0ebdaf2ba23a5cd3d7b508713d063d0c1af354be329d564f20 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T16:18:48.873 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local bash[120537]: 4115e4720b892c0ebdaf2ba23a5cd3d7b508713d063d0c1af354be329d564f20 2026-03-09T16:18:48.873 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:48 vm05.local systemd[1]: Started Ceph osd.4 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 2026-03-09T16:18:48.873 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:48 vm05.local ceph-mon[108543]: pgmap v82: 65 pgs: 6 stale+active+clean, 59 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:48.873 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:48 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:48.873 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:48 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:48.873 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:48 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:18:49.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.055+0000 7fa00aa4d640 1 -- 192.168.123.103:0/4166223092 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa004075720 msgr2=0x7fa004075b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.055+0000 7fa00aa4d640 1 --2- 192.168.123.103:0/4166223092 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa004075720 0x7fa004075b00 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f9fe80099b0 tx=0x7f9fe802f220 comp rx=0 tx=0).stop 2026-03-09T16:18:49.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.056+0000 7fa00aa4d640 1 -- 192.168.123.103:0/4166223092 shutdown_connections 2026-03-09T16:18:49.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.056+0000 7fa00aa4d640 1 --2- 192.168.123.103:0/4166223092 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7fa004076040 0x7fa004111330 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.056+0000 7fa00aa4d640 1 --2- 192.168.123.103:0/4166223092 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa004075720 0x7fa004075b00 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.056+0000 7fa00aa4d640 1 -- 192.168.123.103:0/4166223092 >> 192.168.123.103:0/4166223092 conn(0x7fa0040fe710 msgr2=0x7fa004100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:49.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.056+0000 7fa00aa4d640 1 -- 192.168.123.103:0/4166223092 shutdown_connections 2026-03-09T16:18:49.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.057+0000 7fa00aa4d640 1 -- 192.168.123.103:0/4166223092 wait complete. 2026-03-09T16:18:49.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.057+0000 7fa00aa4d640 1 Processor -- start 2026-03-09T16:18:49.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.057+0000 7fa00aa4d640 1 -- start start 2026-03-09T16:18:49.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.058+0000 7fa00aa4d640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa004075720 0x7fa00419eea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.058+0000 7fa00aa4d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa004076040 0x7fa00419f3e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.058+0000 7fa00aa4d640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa00419fa70 con 0x7fa004076040 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.058+0000 7fa00aa4d640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa0041a37e0 con 0x7fa004075720 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.058+0000 7fa003fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa004075720 0x7fa00419eea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.058+0000 7fa0037fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa004076040 0x7fa00419f3e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.058+0000 7fa0037fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa004076040 0x7fa00419f3e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44570/0 (socket says 192.168.123.103:44570) 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.058+0000 7fa0037fe640 1 -- 192.168.123.103:0/3125424111 
learned_addr learned my addr 192.168.123.103:0/3125424111 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.058+0000 7fa0037fe640 1 -- 192.168.123.103:0/3125424111 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa004075720 msgr2=0x7fa00419eea0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.058+0000 7fa0037fe640 1 --2- 192.168.123.103:0/3125424111 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa004075720 0x7fa00419eea0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.058+0000 7fa0037fe640 1 -- 192.168.123.103:0/3125424111 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9fe8009660 con 0x7fa004076040 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.059+0000 7fa0037fe640 1 --2- 192.168.123.103:0/3125424111 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa004076040 0x7fa00419f3e0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f9ff000b790 tx=0x7f9ff000bc60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.059+0000 7fa0017fa640 1 -- 192.168.123.103:0/3125424111 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ff0004070 con 0x7fa004076040 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.059+0000 7fa00aa4d640 1 -- 192.168.123.103:0/3125424111 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa0041a3ac0 con 0x7fa004076040 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.059+0000 7fa00aa4d640 1 -- 192.168.123.103:0/3125424111 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa0041a4010 con 0x7fa004076040 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.060+0000 7fa0017fa640 1 -- 192.168.123.103:0/3125424111 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9ff00026e0 con 0x7fa004076040 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.060+0000 7fa0017fa640 1 -- 192.168.123.103:0/3125424111 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ff000cb30 con 0x7fa004076040 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.061+0000 7fa00aa4d640 1 -- 192.168.123.103:0/3125424111 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9fcc005350 con 0x7fa004076040 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.061+0000 7fa0017fa640 1 -- 192.168.123.103:0/3125424111 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9ff000cc90 con 0x7fa004076040 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.063+0000 7fa0017fa640 1 --2- 192.168.123.103:0/3125424111 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9fc8077890 
0x7f9fc8079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.063+0000 7fa0017fa640 1 -- 192.168.123.103:0/3125424111 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(68..68 src has 1..68) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f9ff00980e0 con 0x7fa004076040 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.065+0000 7fa003fff640 1 --2- 192.168.123.103:0/3125424111 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9fc8077890 0x7f9fc8079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.065+0000 7fa003fff640 1 --2- 192.168.123.103:0/3125424111 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9fc8077890 0x7f9fc8079d50 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f9fe8002410 tx=0x7f9fe803a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:49.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.066+0000 7fa0017fa640 1 -- 192.168.123.103:0/3125424111 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9ff009d050 con 0x7fa004076040 2026-03-09T16:18:49.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:48 vm03.local ceph-mon[133973]: pgmap v82: 65 pgs: 6 stale+active+clean, 59 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:18:49.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:48 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:49.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:48 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:49.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:48 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:18:49.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.172+0000 7fa00aa4d640 1 -- 192.168.123.103:0/3125424111 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9fcc002bf0 con 0x7f9fc8077890 2026-03-09T16:18:49.174 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.173+0000 7fa0017fa640 1 -- 192.168.123.103:0/3125424111 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f9fcc002bf0 con 0x7f9fc8077890 2026-03-09T16:18:49.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.175+0000 7fa00aa4d640 1 -- 192.168.123.103:0/3125424111 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9fc8077890 msgr2=0x7f9fc8079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.175+0000 7fa00aa4d640 1 --2- 192.168.123.103:0/3125424111 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9fc8077890 0x7f9fc8079d50 secure :-1 
s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f9fe8002410 tx=0x7f9fe803a040 comp rx=0 tx=0).stop 2026-03-09T16:18:49.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.175+0000 7fa00aa4d640 1 -- 192.168.123.103:0/3125424111 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa004076040 msgr2=0x7fa00419f3e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.175+0000 7fa00aa4d640 1 --2- 192.168.123.103:0/3125424111 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa004076040 0x7fa00419f3e0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f9ff000b790 tx=0x7f9ff000bc60 comp rx=0 tx=0).stop 2026-03-09T16:18:49.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.176+0000 7fa00aa4d640 1 -- 192.168.123.103:0/3125424111 shutdown_connections 2026-03-09T16:18:49.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.176+0000 7fa00aa4d640 1 --2- 192.168.123.103:0/3125424111 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9fc8077890 0x7f9fc8079d50 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.176+0000 7fa00aa4d640 1 --2- 192.168.123.103:0/3125424111 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa004076040 0x7fa00419f3e0 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.176+0000 7fa00aa4d640 1 --2- 192.168.123.103:0/3125424111 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa004075720 0x7fa00419eea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.176+0000 7fa00aa4d640 1 -- 192.168.123.103:0/3125424111 >> 192.168.123.103:0/3125424111 conn(0x7fa0040fe710 msgr2=0x7fa0040ffdf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:49.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.176+0000 7fa00aa4d640 1 -- 192.168.123.103:0/3125424111 shutdown_connections 2026-03-09T16:18:49.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.176+0000 7fa00aa4d640 1 -- 192.168.123.103:0/3125424111 wait complete. 
2026-03-09T16:18:49.186 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:18:49.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.245+0000 7fe649a84640 1 -- 192.168.123.103:0/1949329544 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe64410bf60 msgr2=0x7fe64410c340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.245+0000 7fe649a84640 1 --2- 192.168.123.103:0/1949329544 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe64410bf60 0x7fe64410c340 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fe6380099b0 tx=0x7fe63802f220 comp rx=0 tx=0).stop 2026-03-09T16:18:49.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.245+0000 7fe649a84640 1 -- 192.168.123.103:0/1949329544 shutdown_connections 2026-03-09T16:18:49.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.245+0000 7fe649a84640 1 --2- 192.168.123.103:0/1949329544 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe644107320 0x7fe644107780 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.245+0000 7fe649a84640 1 --2- 192.168.123.103:0/1949329544 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe64410bf60 0x7fe64410c340 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.245+0000 7fe649a84640 1 -- 192.168.123.103:0/1949329544 >> 192.168.123.103:0/1949329544 conn(0x7fe644076740 msgr2=0x7fe644078b60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:49.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.245+0000 7fe649a84640 1 -- 192.168.123.103:0/1949329544 shutdown_connections 2026-03-09T16:18:49.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.245+0000 7fe649a84640 1 -- 192.168.123.103:0/1949329544 wait complete. 
2026-03-09T16:18:49.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.246+0000 7fe649a84640 1 Processor -- start 2026-03-09T16:18:49.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.246+0000 7fe649a84640 1 -- start start 2026-03-09T16:18:49.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.246+0000 7fe649a84640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe644107320 0x7fe64419edb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.246+0000 7fe649a84640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe64410bf60 0x7fe64419f2f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.246+0000 7fe649a84640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe64419f980 con 0x7fe64410bf60 2026-03-09T16:18:49.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.246+0000 7fe649a84640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe6441a36f0 con 0x7fe644107320 2026-03-09T16:18:49.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.247+0000 7fe6427fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe64410bf60 0x7fe64419f2f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:49.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.247+0000 7fe642ffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe644107320 0x7fe64419edb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:49.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.247+0000 7fe6427fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe64410bf60 0x7fe64419f2f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44592/0 (socket says 192.168.123.103:44592) 2026-03-09T16:18:49.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.247+0000 7fe6427fc640 1 -- 192.168.123.103:0/1770251021 learned_addr learned my addr 192.168.123.103:0/1770251021 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:18:49.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.247+0000 7fe6427fc640 1 -- 192.168.123.103:0/1770251021 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe644107320 msgr2=0x7fe64419edb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.247+0000 7fe6427fc640 1 --2- 192.168.123.103:0/1770251021 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe644107320 0x7fe64419edb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.247+0000 7fe6427fc640 1 -- 192.168.123.103:0/1770251021 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe638009660 con 0x7fe64410bf60 
2026-03-09T16:18:49.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.247+0000 7fe642ffd640 1 --2- 192.168.123.103:0/1770251021 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe644107320 0x7fe64419edb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T16:18:49.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.247+0000 7fe6427fc640 1 --2- 192.168.123.103:0/1770251021 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe64410bf60 0x7fe64419f2f0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fe62c00b790 tx=0x7fe62c00bc60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:49.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.248+0000 7fe648a82640 1 -- 192.168.123.103:0/1770251021 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe62c004070 con 0x7fe64410bf60 2026-03-09T16:18:49.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.248+0000 7fe648a82640 1 -- 192.168.123.103:0/1770251021 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe62c0026e0 con 0x7fe64410bf60 2026-03-09T16:18:49.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.248+0000 7fe648a82640 1 -- 192.168.123.103:0/1770251021 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe62c00ca10 con 0x7fe64410bf60 2026-03-09T16:18:49.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.248+0000 7fe649a84640 1 -- 192.168.123.103:0/1770251021 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe6441a39d0 con 0x7fe64410bf60 2026-03-09T16:18:49.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.248+0000 7fe649a84640 1 -- 192.168.123.103:0/1770251021 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe6441a3fa0 con 0x7fe64410bf60 2026-03-09T16:18:49.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.250+0000 7fe648a82640 1 -- 192.168.123.103:0/1770251021 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe62c00cb90 con 0x7fe64410bf60 2026-03-09T16:18:49.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.250+0000 7fe648a82640 1 --2- 192.168.123.103:0/1770251021 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe6180778e0 0x7fe618079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.250+0000 7fe642ffd640 1 --2- 192.168.123.103:0/1770251021 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe6180778e0 0x7fe618079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:49.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.250+0000 7fe648a82640 1 -- 192.168.123.103:0/1770251021 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(68..68 src has 1..68) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fe62c099e30 con 0x7fe64410bf60 2026-03-09T16:18:49.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.250+0000 7fe649a84640 1 -- 192.168.123.103:0/1770251021 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe608005350 con 0x7fe64410bf60 2026-03-09T16:18:49.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.254+0000 7fe642ffd640 1 --2- 192.168.123.103:0/1770251021 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe6180778e0 0x7fe618079da0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fe638002410 tx=0x7fe63803a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:49.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.254+0000 7fe648a82640 1 -- 192.168.123.103:0/1770251021 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe62c062670 con 0x7fe64410bf60 2026-03-09T16:18:49.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.362+0000 7fe649a84640 1 -- 192.168.123.103:0/1770251021 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe608002bf0 con 0x7fe6180778e0 2026-03-09T16:18:49.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.364+0000 7fe648a82640 1 -- 192.168.123.103:0/1770251021 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7fe608002bf0 con 0x7fe6180778e0 2026-03-09T16:18:49.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.366+0000 7fe649a84640 1 -- 192.168.123.103:0/1770251021 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe6180778e0 msgr2=0x7fe618079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.366+0000 7fe649a84640 1 --2- 192.168.123.103:0/1770251021 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe6180778e0 0x7fe618079da0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fe638002410 tx=0x7fe63803a040 comp rx=0 tx=0).stop 2026-03-09T16:18:49.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.366+0000 7fe649a84640 1 -- 192.168.123.103:0/1770251021 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe64410bf60 msgr2=0x7fe64419f2f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.366+0000 7fe649a84640 1 --2- 192.168.123.103:0/1770251021 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe64410bf60 0x7fe64419f2f0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fe62c00b790 tx=0x7fe62c00bc60 comp rx=0 tx=0).stop 2026-03-09T16:18:49.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.366+0000 7fe649a84640 1 -- 192.168.123.103:0/1770251021 shutdown_connections 2026-03-09T16:18:49.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.366+0000 7fe649a84640 1 --2- 192.168.123.103:0/1770251021 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe6180778e0 0x7fe618079da0 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.366+0000 7fe649a84640 1 --2- 192.168.123.103:0/1770251021 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7fe64410bf60 0x7fe64419f2f0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.366+0000 7fe649a84640 1 --2- 192.168.123.103:0/1770251021 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe644107320 0x7fe64419edb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.366+0000 7fe649a84640 1 -- 192.168.123.103:0/1770251021 >> 192.168.123.103:0/1770251021 conn(0x7fe644076740 msgr2=0x7fe644110e00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:49.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.366+0000 7fe649a84640 1 -- 192.168.123.103:0/1770251021 shutdown_connections 2026-03-09T16:18:49.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.366+0000 7fe649a84640 1 -- 192.168.123.103:0/1770251021 wait complete. 2026-03-09T16:18:49.430 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.429+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/3863556571 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3f4073a50 msgr2=0x7fa3f4073e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.430 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.429+0000 7fa3f9fd0640 1 --2- 192.168.123.103:0/3863556571 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3f4073a50 0x7fa3f4073e90 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7fa3e0009a30 tx=0x7fa3e002f360 comp rx=0 tx=0).stop 2026-03-09T16:18:49.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.431+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/3863556571 shutdown_connections 2026-03-09T16:18:49.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.431+0000 7fa3f9fd0640 1 --2- 192.168.123.103:0/3863556571 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3f4073a50 0x7fa3f4073e90 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.431+0000 7fa3f9fd0640 1 --2- 192.168.123.103:0/3863556571 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3f4104650 0x7fa3f4073510 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.431+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/3863556571 >> 192.168.123.103:0/3863556571 conn(0x7fa3f40fc460 msgr2=0x7fa3f40fe880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:49.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.431+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/3863556571 shutdown_connections 2026-03-09T16:18:49.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.431+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/3863556571 wait complete. 
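The client run above (192.168.123.103:0/1770251021) sends {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]} to the active mgr, i.e. the CLI equivalent of `ceph orch upgrade status`, before tearing its connections down again. A minimal sketch of polling that status outside the test harness, assuming the command accepts `--format json` and that its output carries an `in_progress` flag and a `message` string (those field names and the `poll_upgrade_status` helper are illustrative, not teuthology code):

import json
import subprocess
import time

def poll_upgrade_status(interval=30):
    # Illustrative helper: field names (in_progress, message) are assumptions
    # about cephadm's `orch upgrade status` JSON output, not taken from this log.
    while True:
        out = subprocess.check_output(
            ["ceph", "orch", "upgrade", "status", "--format", "json"])
        status = json.loads(out)
        print(status.get("message") or status)
        if not status.get("in_progress", False):
            return status
        time.sleep(interval)

if __name__ == "__main__":
    poll_upgrade_status()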
2026-03-09T16:18:49.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.432+0000 7fa3f9fd0640 1 Processor -- start 2026-03-09T16:18:49.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.432+0000 7fa3f9fd0640 1 -- start start 2026-03-09T16:18:49.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.432+0000 7fa3f9fd0640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3f4073a50 0x7fa3f419ee70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.432+0000 7fa3f9fd0640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3f4104650 0x7fa3f419f3b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.432+0000 7fa3f9fd0640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa3f419fa40 con 0x7fa3f4104650 2026-03-09T16:18:49.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.432+0000 7fa3f9fd0640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa3f41a37b0 con 0x7fa3f4073a50 2026-03-09T16:18:49.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.432+0000 7fa3f37fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3f4073a50 0x7fa3f419ee70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:49.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.433+0000 7fa3f37fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3f4073a50 0x7fa3f419ee70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47636/0 (socket says 192.168.123.103:47636) 2026-03-09T16:18:49.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.433+0000 7fa3f37fe640 1 -- 192.168.123.103:0/2872332882 learned_addr learned my addr 192.168.123.103:0/2872332882 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:18:49.434 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.433+0000 7fa3f37fe640 1 -- 192.168.123.103:0/2872332882 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3f4104650 msgr2=0x7fa3f419f3b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.434 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.433+0000 7fa3f2ffd640 1 --2- 192.168.123.103:0/2872332882 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3f4104650 0x7fa3f419f3b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:49.434 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.433+0000 7fa3f37fe640 1 --2- 192.168.123.103:0/2872332882 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3f4104650 0x7fa3f419f3b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.434 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.433+0000 7fa3f37fe640 1 -- 192.168.123.103:0/2872332882 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fa3e0009660 con 0x7fa3f4073a50 2026-03-09T16:18:49.434 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.433+0000 7fa3f2ffd640 1 --2- 192.168.123.103:0/2872332882 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3f4104650 0x7fa3f419f3b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:18:49.434 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.434+0000 7fa3f37fe640 1 --2- 192.168.123.103:0/2872332882 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3f4073a50 0x7fa3f419ee70 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fa3e400e990 tx=0x7fa3e400ee60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:49.434 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.434+0000 7fa3f0ff9640 1 -- 192.168.123.103:0/2872332882 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa3e400cd30 con 0x7fa3f4073a50 2026-03-09T16:18:49.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.434+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/2872332882 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa3f41a39b0 con 0x7fa3f4073a50 2026-03-09T16:18:49.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.434+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/2872332882 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa3f41a3f00 con 0x7fa3f4073a50 2026-03-09T16:18:49.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.434+0000 7fa3f0ff9640 1 -- 192.168.123.103:0/2872332882 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa3e400ce90 con 0x7fa3f4073a50 2026-03-09T16:18:49.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.434+0000 7fa3f0ff9640 1 -- 192.168.123.103:0/2872332882 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa3e4010640 con 0x7fa3f4073a50 2026-03-09T16:18:49.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.435+0000 7fa3f0ff9640 1 -- 192.168.123.103:0/2872332882 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa3e40107a0 con 0x7fa3f4073a50 2026-03-09T16:18:49.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.436+0000 7fa3f0ff9640 1 --2- 192.168.123.103:0/2872332882 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa3d40778e0 0x7fa3d4079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.436+0000 7fa3f2ffd640 1 --2- 192.168.123.103:0/2872332882 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa3d40778e0 0x7fa3d4079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:49.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.437+0000 7fa3f0ff9640 1 -- 192.168.123.103:0/2872332882 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(68..68 src has 1..68) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fa3e4014070 con 0x7fa3f4073a50 2026-03-09T16:18:49.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.437+0000 7fa3f2ffd640 1 --2- 192.168.123.103:0/2872332882 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa3d40778e0 0x7fa3d4079da0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fa3f41a0420 tx=0x7fa3e00023d0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:49.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.438+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/2872332882 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa3c0005350 con 0x7fa3f4073a50 2026-03-09T16:18:49.448 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.445+0000 7fa3f0ff9640 1 -- 192.168.123.103:0/2872332882 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa3e4063a30 con 0x7fa3f4073a50 2026-03-09T16:18:49.568 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.567+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/2872332882 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fa3c0002bf0 con 0x7fa3d40778e0 2026-03-09T16:18:49.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.574+0000 7fa3f0ff9640 1 -- 192.168.123.103:0/2872332882 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3600 (secure 0 0 0) 0x7fa3c0002bf0 con 0x7fa3d40778e0 2026-03-09T16:18:49.574 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T16:18:49.574 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (3m) 49s ago 8m 25.3M - 0.25.0 c8568f914cd2 61c29cd7a09d 2026-03-09T16:18:49.574 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (8m) 49s ago 8m 9743k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9 2026-03-09T16:18:49.574 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (8m) 24s ago 8m 9945k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd 2026-03-09T16:18:49.574 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (92s) 49s ago 8m 7830k - 19.2.3-678-ge911bdeb 654f31e6858e 03c86bd1bf32 2026-03-09T16:18:49.574 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (90s) 24s ago 8m 7830k - 19.2.3-678-ge911bdeb 654f31e6858e 192f6dbc3145 2026-03-09T16:18:49.574 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (2m) 49s ago 8m 83.9M - 10.4.0 c8b91775d855 6f4f55eef4bb 2026-03-09T16:18:49.574 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (6m) 49s ago 6m 19.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8e7e3eb06891 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (6m) 49s ago 6m 190M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 f23b1415c23e 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (6m) 24s ago 6m 17.2M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 fbf69f4859f1 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (6m) 24s ago 6m 18.7M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 e7155e6e0a47 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:8443,9283,8765 running (4m) 49s ago 9m 604M - 19.2.3-678-ge911bdeb 654f31e6858e f10e9f43c355 
2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (3m) 24s ago 8m 496M - 19.2.3-678-ge911bdeb 654f31e6858e 5276dc4902e9 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (2m) 49s ago 9m 62.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f90a2e8dc751 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (106s) 24s ago 8m 51.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b6d6af84a66d 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (3m) 49s ago 8m 9517k - 1.7.0 72c9c2088986 73da4350a8ed 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (3m) 24s ago 8m 9605k - 1.7.0 72c9c2088986 0be807a191b0 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (82s) 49s ago 7m 119M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fba6e40f54d4 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (75s) 49s ago 7m 99.9M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9e86c92fc9cd 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (52s) 49s ago 7m 13.1M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 2e666ccd4bf7 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (25s) 24s ago 7m 30.7M 4096M 19.2.3-678-ge911bdeb 654f31e6858e c052610d74d5 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 starting - - - 4096M 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (6m) 24s ago 6m 333M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 56fb3849b087 2026-03-09T16:18:49.575 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (3m) 49s ago 8m 45.1M - 2.51.0 1d3b7f56885b ce88dd379864 2026-03-09T16:18:49.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.577+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/2872332882 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa3d40778e0 msgr2=0x7fa3d4079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.577+0000 7fa3f9fd0640 1 --2- 192.168.123.103:0/2872332882 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa3d40778e0 0x7fa3d4079da0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fa3f41a0420 tx=0x7fa3e00023d0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.577+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/2872332882 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3f4073a50 msgr2=0x7fa3f419ee70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.577+0000 7fa3f9fd0640 1 --2- 192.168.123.103:0/2872332882 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3f4073a50 0x7fa3f419ee70 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fa3e400e990 tx=0x7fa3e400ee60 comp rx=0 tx=0).stop 2026-03-09T16:18:49.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.577+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/2872332882 shutdown_connections 2026-03-09T16:18:49.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.578+0000 7fa3f9fd0640 1 --2- 192.168.123.103:0/2872332882 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa3d40778e0 0x7fa3d4079da0 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.578+0000 7fa3f9fd0640 1 --2- 192.168.123.103:0/2872332882 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3f4104650 0x7fa3f419f3b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.578+0000 7fa3f9fd0640 1 --2- 192.168.123.103:0/2872332882 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3f4073a50 0x7fa3f419ee70 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.578+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/2872332882 >> 192.168.123.103:0/2872332882 conn(0x7fa3f40fc460 msgr2=0x7fa3f40fd6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:49.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.578+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/2872332882 shutdown_connections 2026-03-09T16:18:49.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.578+0000 7fa3f9fd0640 1 -- 192.168.123.103:0/2872332882 wait complete. 2026-03-09T16:18:49.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.641+0000 7fbbfa34d640 1 -- 192.168.123.103:0/4049751358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbf4106770 msgr2=0x7fbbf4106b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.641+0000 7fbbfa34d640 1 --2- 192.168.123.103:0/4049751358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbf4106770 0x7fbbf4106b50 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fbbd8009a00 tx=0x7fbbd802f280 comp rx=0 tx=0).stop 2026-03-09T16:18:49.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.641+0000 7fbbfa34d640 1 -- 192.168.123.103:0/4049751358 shutdown_connections 2026-03-09T16:18:49.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.641+0000 7fbbfa34d640 1 --2- 192.168.123.103:0/4049751358 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbf4100770 0x7fbbf4100bd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.641+0000 7fbbfa34d640 1 --2- 192.168.123.103:0/4049751358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbf4106770 0x7fbbf4106b50 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.641+0000 7fbbfa34d640 1 -- 192.168.123.103:0/4049751358 >> 192.168.123.103:0/4049751358 conn(0x7fbbf40fc470 msgr2=0x7fbbf40fe890 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:49.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.641+0000 7fbbfa34d640 1 -- 192.168.123.103:0/4049751358 shutdown_connections 2026-03-09T16:18:49.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.642+0000 7fbbfa34d640 1 -- 192.168.123.103:0/4049751358 wait complete. 
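The next client run (192.168.123.103:0/2872332882) issues `orch ps` against mgr.34104; the table it prints shows the cluster mid staggered upgrade, with mons, mgrs and most OSDs already on 19.2.3 (squid) while the MDS daemons and osd.5 still report 18.2.7 (reef). A small sketch of grouping daemons by version the same way, assuming `ceph orch ps --format json` reports `daemon_type`, `daemon_id` and `version` fields per daemon (field names are assumptions, not taken from this log):

import json
import subprocess
from collections import defaultdict

def daemons_by_version():
    # Illustrative helper: the JSON field names used here are assumptions
    # about `ceph orch ps --format json`.
    out = subprocess.check_output(["ceph", "orch", "ps", "--format", "json"])
    by_version = defaultdict(list)
    for d in json.loads(out):
        name = "{}.{}".format(d.get("daemon_type", "?"), d.get("daemon_id", "?"))
        by_version[d.get("version") or "unknown"].append(name)
    return dict(by_version)

if __name__ == "__main__":
    for version, names in sorted(daemons_by_version().items()):
        print(version, len(names), ", ".join(sorted(names)))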
2026-03-09T16:18:49.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.642+0000 7fbbfa34d640 1 Processor -- start 2026-03-09T16:18:49.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.642+0000 7fbbfa34d640 1 -- start start 2026-03-09T16:18:49.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.643+0000 7fbbfa34d640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbf4100770 0x7fbbf41a09e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.643+0000 7fbbfa34d640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbf41a0f20 0x7fbbf419aa40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.643+0000 7fbbfa34d640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbbf41a14f0 con 0x7fbbf41a0f20 2026-03-09T16:18:49.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.643+0000 7fbbfa34d640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbbf419af80 con 0x7fbbf4100770 2026-03-09T16:18:49.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.643+0000 7fbbf37fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbf41a0f20 0x7fbbf419aa40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:49.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.643+0000 7fbbf37fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbf41a0f20 0x7fbbf419aa40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44636/0 (socket says 192.168.123.103:44636) 2026-03-09T16:18:49.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.643+0000 7fbbf37fe640 1 -- 192.168.123.103:0/2612053395 learned_addr learned my addr 192.168.123.103:0/2612053395 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:18:49.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.643+0000 7fbbf37fe640 1 -- 192.168.123.103:0/2612053395 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbf4100770 msgr2=0x7fbbf41a09e0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T16:18:49.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.643+0000 7fbbf37fe640 1 --2- 192.168.123.103:0/2612053395 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbf4100770 0x7fbbf41a09e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.643+0000 7fbbf37fe640 1 -- 192.168.123.103:0/2612053395 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbbd8009660 con 0x7fbbf41a0f20 2026-03-09T16:18:49.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.643+0000 7fbbf37fe640 1 --2- 192.168.123.103:0/2612053395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbf41a0f20 0x7fbbf419aa40 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fbbe000e9b0 tx=0x7fbbe000ee80 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:49.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.643+0000 7fbbf17fa640 1 -- 192.168.123.103:0/2612053395 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbbe000cd90 con 0x7fbbf41a0f20 2026-03-09T16:18:49.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.643+0000 7fbbf17fa640 1 -- 192.168.123.103:0/2612053395 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbbe0004590 con 0x7fbbf41a0f20 2026-03-09T16:18:49.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.644+0000 7fbbf17fa640 1 -- 192.168.123.103:0/2612053395 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbbe0010640 con 0x7fbbf41a0f20 2026-03-09T16:18:49.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.644+0000 7fbbfa34d640 1 -- 192.168.123.103:0/2612053395 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbbf419b210 con 0x7fbbf41a0f20 2026-03-09T16:18:49.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.644+0000 7fbbfa34d640 1 -- 192.168.123.103:0/2612053395 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbbf419b690 con 0x7fbbf41a0f20 2026-03-09T16:18:49.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.646+0000 7fbbfa34d640 1 -- 192.168.123.103:0/2612053395 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbbb8005350 con 0x7fbbf41a0f20 2026-03-09T16:18:49.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.646+0000 7fbbf17fa640 1 -- 192.168.123.103:0/2612053395 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fbbe00026e0 con 0x7fbbf41a0f20 2026-03-09T16:18:49.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.647+0000 7fbbf17fa640 1 --2- 192.168.123.103:0/2612053395 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbbc80776d0 0x7fbbc8079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.647+0000 7fbbf17fa640 1 -- 192.168.123.103:0/2612053395 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(68..68 src has 1..68) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fbbe0014070 con 0x7fbbf41a0f20 2026-03-09T16:18:49.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.647+0000 7fbbf3fff640 1 --2- 192.168.123.103:0/2612053395 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbbc80776d0 0x7fbbc8079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:49.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.648+0000 7fbbf3fff640 1 --2- 192.168.123.103:0/2612053395 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbbc80776d0 0x7fbbc8079b90 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fbbd8008000 tx=0x7fbbd80023d0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:49.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.649+0000 7fbbf17fa640 1 -- 192.168.123.103:0/2612053395 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbbe009e050 con 0x7fbbf41a0f20 2026-03-09T16:18:49.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.800+0000 7fbbfa34d640 1 -- 192.168.123.103:0/2612053395 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fbbb80058d0 con 0x7fbbf41a0f20 2026-03-09T16:18:49.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.801+0000 7fbbf17fa640 1 -- 192.168.123.103:0/2612053395 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+844 (secure 0 0 0) 0x7fbbe0061840 con 0x7fbbf41a0f20 2026-03-09T16:18:49.804 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 1, 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 5, 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 8 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T16:18:49.834 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:18:49.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.805+0000 7fbbfa34d640 1 -- 192.168.123.103:0/2612053395 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbbc80776d0 msgr2=0x7fbbc8079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.805+0000 7fbbfa34d640 1 --2- 192.168.123.103:0/2612053395 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbbc80776d0 0x7fbbc8079b90 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fbbd8008000 tx=0x7fbbd80023d0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.805+0000 7fbbfa34d640 1 -- 
192.168.123.103:0/2612053395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbf41a0f20 msgr2=0x7fbbf419aa40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.805+0000 7fbbfa34d640 1 --2- 192.168.123.103:0/2612053395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbf41a0f20 0x7fbbf419aa40 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fbbe000e9b0 tx=0x7fbbe000ee80 comp rx=0 tx=0).stop 2026-03-09T16:18:49.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.805+0000 7fbbfa34d640 1 -- 192.168.123.103:0/2612053395 shutdown_connections 2026-03-09T16:18:49.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.805+0000 7fbbfa34d640 1 --2- 192.168.123.103:0/2612053395 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbbc80776d0 0x7fbbc8079b90 secure :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fbbd8008000 tx=0x7fbbd80023d0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.805+0000 7fbbfa34d640 1 --2- 192.168.123.103:0/2612053395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbf41a0f20 0x7fbbf419aa40 secure :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fbbe000e9b0 tx=0x7fbbe000ee80 comp rx=0 tx=0).stop 2026-03-09T16:18:49.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.805+0000 7fbbfa34d640 1 --2- 192.168.123.103:0/2612053395 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbf4100770 0x7fbbf41a09e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.805+0000 7fbbfa34d640 1 -- 192.168.123.103:0/2612053395 >> 192.168.123.103:0/2612053395 conn(0x7fbbf40fc470 msgr2=0x7fbbf40733a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:49.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.806+0000 7fbbfa34d640 1 -- 192.168.123.103:0/2612053395 shutdown_connections 2026-03-09T16:18:49.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.806+0000 7fbbfa34d640 1 -- 192.168.123.103:0/2612053395 wait complete. 
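The `ceph versions` output captured above already has the JSON shape {"mon": {...}, "mgr": {...}, "osd": {...}, "mds": {...}, "overall": {...}}, each section mapping a full version string to a daemon count; at this point both mons and both mgrs report 19.2.3 (squid) while all four MDS and one OSD are still on 18.2.7 (reef). A sketch of the kind of convergence check a wrapper script could run against that output (the `lagging_daemons` helper and the choice of daemon types are illustrative):

import json
import subprocess

def lagging_daemons(target_prefix, daemon_types=("mon", "mgr")):
    # `ceph versions` prints JSON shaped like the block captured above:
    # {"mon": {...}, "mgr": {...}, "osd": {...}, "mds": {...}, "overall": {...}}.
    versions = json.loads(subprocess.check_output(["ceph", "versions"]))
    behind = {}
    for dtype in daemon_types:
        for ver, count in versions.get(dtype, {}).items():
            if target_prefix not in ver:
                behind.setdefault(dtype, {})[ver] = count
    return behind

if __name__ == "__main__":
    behind = lagging_daemons("19.2.3")
    print("all checked daemon types on target version" if not behind else behind)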
2026-03-09T16:18:49.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.863+0000 7fc6bb672640 1 -- 192.168.123.103:0/1748169796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc6b41089d0 msgr2=0x7fc6b4108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.863+0000 7fc6bb672640 1 --2- 192.168.123.103:0/1748169796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc6b41089d0 0x7fc6b4108db0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fc69c009a00 tx=0x7fc69c02f270 comp rx=0 tx=0).stop 2026-03-09T16:18:49.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.864+0000 7fc6bb672640 1 -- 192.168.123.103:0/1748169796 shutdown_connections 2026-03-09T16:18:49.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.864+0000 7fc6bb672640 1 --2- 192.168.123.103:0/1748169796 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc6b41029d0 0x7fc6b4102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.864+0000 7fc6bb672640 1 --2- 192.168.123.103:0/1748169796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc6b41089d0 0x7fc6b4108db0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.864+0000 7fc6bb672640 1 -- 192.168.123.103:0/1748169796 >> 192.168.123.103:0/1748169796 conn(0x7fc6b40fe710 msgr2=0x7fc6b4100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:49.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.864+0000 7fc6bb672640 1 -- 192.168.123.103:0/1748169796 shutdown_connections 2026-03-09T16:18:49.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.864+0000 7fc6bb672640 1 -- 192.168.123.103:0/1748169796 wait complete. 
2026-03-09T16:18:49.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.865+0000 7fc6bb672640 1 Processor -- start 2026-03-09T16:18:49.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.865+0000 7fc6bb672640 1 -- start start 2026-03-09T16:18:49.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.865+0000 7fc6bb672640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc6b41029d0 0x7fc6b41a0870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.865+0000 7fc6b93e7640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc6b41029d0 0x7fc6b41a0870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:49.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.866+0000 7fc6b93e7640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc6b41029d0 0x7fc6b41a0870 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44648/0 (socket says 192.168.123.103:44648) 2026-03-09T16:18:49.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.866+0000 7fc6bb672640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc6b41a0db0 0x7fc6b419a960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.866+0000 7fc6bb672640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6b41a12c0 con 0x7fc6b41029d0 2026-03-09T16:18:49.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.866+0000 7fc6bb672640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6b41a1400 con 0x7fc6b41a0db0 2026-03-09T16:18:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.866+0000 7fc6b93e7640 1 -- 192.168.123.103:0/3107397821 learned_addr learned my addr 192.168.123.103:0/3107397821 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:18:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.866+0000 7fc6b8be6640 1 --2- 192.168.123.103:0/3107397821 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc6b41a0db0 0x7fc6b419a960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.866+0000 7fc6b93e7640 1 -- 192.168.123.103:0/3107397821 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc6b41a0db0 msgr2=0x7fc6b419a960 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.866+0000 7fc6b93e7640 1 --2- 192.168.123.103:0/3107397821 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc6b41a0db0 0x7fc6b419a960 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.867+0000 7fc6b93e7640 1 -- 192.168.123.103:0/3107397821 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fc69c009660 con 0x7fc6b41029d0 2026-03-09T16:18:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.867+0000 7fc6b93e7640 1 --2- 192.168.123.103:0/3107397821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc6b41029d0 0x7fc6b41a0870 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fc69c002bf0 tx=0x7fc69c031af0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:49.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.867+0000 7fc6aa7fc640 1 -- 192.168.123.103:0/3107397821 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc69c031c60 con 0x7fc6b41029d0 2026-03-09T16:18:49.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.867+0000 7fc6aa7fc640 1 -- 192.168.123.103:0/3107397821 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc69c031dc0 con 0x7fc6b41029d0 2026-03-09T16:18:49.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.867+0000 7fc6aa7fc640 1 -- 192.168.123.103:0/3107397821 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc69c038680 con 0x7fc6b41029d0 2026-03-09T16:18:49.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.867+0000 7fc6bb672640 1 -- 192.168.123.103:0/3107397821 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc6b419af60 con 0x7fc6b41029d0 2026-03-09T16:18:49.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.867+0000 7fc6bb672640 1 -- 192.168.123.103:0/3107397821 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc6b419b3f0 con 0x7fc6b41029d0 2026-03-09T16:18:49.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.870+0000 7fc6aa7fc640 1 -- 192.168.123.103:0/3107397821 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc69c03f030 con 0x7fc6b41029d0 2026-03-09T16:18:49.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.870+0000 7fc6bb672640 1 -- 192.168.123.103:0/3107397821 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc6b4104110 con 0x7fc6b41029d0 2026-03-09T16:18:49.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.871+0000 7fc6aa7fc640 1 --2- 192.168.123.103:0/3107397821 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc68c077890 0x7fc68c079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:49.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.871+0000 7fc6aa7fc640 1 -- 192.168.123.103:0/3107397821 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(68..68 src has 1..68) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fc69c0bde20 con 0x7fc6b41029d0 2026-03-09T16:18:49.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.871+0000 7fc6b8be6640 1 --2- 192.168.123.103:0/3107397821 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc68c077890 0x7fc68c079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:49.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.871+0000 7fc6b8be6640 1 --2- 192.168.123.103:0/3107397821 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc68c077890 0x7fc68c079d50 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fc6a4007c40 tx=0x7fc6a40073d0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:49.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:49.903+0000 7fc6aa7fc640 1 -- 192.168.123.103:0/3107397821 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc69c086610 con 0x7fc6b41029d0 2026-03-09T16:18:49.973 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:49 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4[120548]: 2026-03-09T16:18:49.731+0000 7f83db85b740 -1 Falling back to public interface 2026-03-09T16:18:49.973 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:49 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/2612053395' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:50.046 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.045+0000 7fc6bb672640 1 -- 192.168.123.103:0/3107397821 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fc6b419b720 con 0x7fc6b41029d0 2026-03-09T16:18:50.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.046+0000 7fc6aa7fc640 1 -- 192.168.123.103:0/3107397821 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1919 (secure 0 0 0) 0x7fc69c085d60 con 0x7fc6b41029d0 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:e12 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:epoch 12 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T16:12:12.560035+0000 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T16:12:21.661284+0000 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-09T16:18:50.048 
INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 41 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:in 0 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14476} 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled 2026-03-09T16:18:50.048 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-09T16:18:50.049 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-09T16:18:50.049 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-09T16:18:50.049 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members: 2026-03-09T16:18:50.049 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kygyjl{0:14476} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/1622851291,v1:192.168.123.103:6827/1622851291] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:18:50.049 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:18:50.049 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:18:50.049 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons: 2026-03-09T16:18:50.049 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:18:50.049 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kntrco{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:18:50.049 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.sqhria{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:18:50.049 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.jgzfvu{-1:24291} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6824/1621230713,v1:192.168.123.105:6825/1621230713] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T16:18:50.049 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 12 2026-03-09T16:18:50.050 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.049+0000 7fc6bb672640 1 -- 192.168.123.103:0/3107397821 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc68c077890 msgr2=0x7fc68c079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:50.050 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.049+0000 7fc6bb672640 1 --2- 192.168.123.103:0/3107397821 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc68c077890 0x7fc68c079d50 secure :-1 s=READY 
pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fc6a4007c40 tx=0x7fc6a40073d0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.050 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.049+0000 7fc6bb672640 1 -- 192.168.123.103:0/3107397821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc6b41029d0 msgr2=0x7fc6b41a0870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:50.051 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:49 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/2612053395' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:50.051 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.049+0000 7fc6bb672640 1 --2- 192.168.123.103:0/3107397821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc6b41029d0 0x7fc6b41a0870 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fc69c002bf0 tx=0x7fc69c031af0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.051 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.050+0000 7fc6bb672640 1 -- 192.168.123.103:0/3107397821 shutdown_connections 2026-03-09T16:18:50.051 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.050+0000 7fc6bb672640 1 --2- 192.168.123.103:0/3107397821 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc68c077890 0x7fc68c079d50 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.051 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.050+0000 7fc6bb672640 1 --2- 192.168.123.103:0/3107397821 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc6b41a0db0 0x7fc6b419a960 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.052 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.050+0000 7fc6bb672640 1 --2- 192.168.123.103:0/3107397821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc6b41029d0 0x7fc6b41a0870 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.052 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.050+0000 7fc6bb672640 1 -- 192.168.123.103:0/3107397821 >> 192.168.123.103:0/3107397821 conn(0x7fc6b40fe710 msgr2=0x7fc6b40ff980 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:50.052 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.050+0000 7fc6bb672640 1 -- 192.168.123.103:0/3107397821 shutdown_connections 2026-03-09T16:18:50.052 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.050+0000 7fc6bb672640 1 -- 192.168.123.103:0/3107397821 wait complete. 
2026-03-09T16:18:50.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.111+0000 7faabfe4b640 1 -- 192.168.123.103:0/2040198947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faab81029d0 msgr2=0x7faab8102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:50.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.111+0000 7faabfe4b640 1 --2- 192.168.123.103:0/2040198947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faab81029d0 0x7faab8102e30 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7faaa80098e0 tx=0x7faaa802f190 comp rx=0 tx=0).stop 2026-03-09T16:18:50.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.112+0000 7faabfe4b640 1 -- 192.168.123.103:0/2040198947 shutdown_connections 2026-03-09T16:18:50.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.112+0000 7faabfe4b640 1 --2- 192.168.123.103:0/2040198947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faab81029d0 0x7faab8102e30 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.112+0000 7faabfe4b640 1 --2- 192.168.123.103:0/2040198947 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faab81089d0 0x7faab8108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.112+0000 7faabfe4b640 1 -- 192.168.123.103:0/2040198947 >> 192.168.123.103:0/2040198947 conn(0x7faab80fe710 msgr2=0x7faab8100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:50.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.112+0000 7faabfe4b640 1 -- 192.168.123.103:0/2040198947 shutdown_connections 2026-03-09T16:18:50.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.112+0000 7faabfe4b640 1 -- 192.168.123.103:0/2040198947 wait complete. 
2026-03-09T16:18:50.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.113+0000 7faabfe4b640 1 Processor -- start 2026-03-09T16:18:50.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.113+0000 7faabfe4b640 1 -- start start 2026-03-09T16:18:50.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.113+0000 7faabfe4b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faab81029d0 0x7faab81a05e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:50.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.113+0000 7faabfe4b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faab81089d0 0x7faab81a0b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:50.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.114+0000 7faabd3bf640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faab81089d0 0x7faab81a0b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:50.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.114+0000 7faabd3bf640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faab81089d0 0x7faab81a0b20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44676/0 (socket says 192.168.123.103:44676) 2026-03-09T16:18:50.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.114+0000 7faabd3bf640 1 -- 192.168.123.103:0/2483072184 learned_addr learned my addr 192.168.123.103:0/2483072184 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:18:50.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.114+0000 7faabdbc0640 1 --2- 192.168.123.103:0/2483072184 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faab81029d0 0x7faab81a05e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:50.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.113+0000 7faabfe4b640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faab81a1140 con 0x7faab81089d0 2026-03-09T16:18:50.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.114+0000 7faabfe4b640 1 -- 192.168.123.103:0/2483072184 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faab819a6d0 con 0x7faab81029d0 2026-03-09T16:18:50.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.115+0000 7faabd3bf640 1 -- 192.168.123.103:0/2483072184 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faab81029d0 msgr2=0x7faab81a05e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:50.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.115+0000 7faabd3bf640 1 --2- 192.168.123.103:0/2483072184 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faab81029d0 0x7faab81a05e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.115+0000 7faabd3bf640 1 -- 192.168.123.103:0/2483072184 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faaac009660 con 0x7faab81089d0 2026-03-09T16:18:50.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.115+0000 7faabd3bf640 1 --2- 192.168.123.103:0/2483072184 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faab81089d0 0x7faab81a0b20 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7faaa80098b0 tx=0x7faaa8004550 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:50.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.115+0000 7faaa6ffd640 1 -- 192.168.123.103:0/2483072184 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faaa803d070 con 0x7faab81089d0 2026-03-09T16:18:50.118 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.115+0000 7faaa6ffd640 1 -- 192.168.123.103:0/2483072184 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7faaa802fbc0 con 0x7faab81089d0 2026-03-09T16:18:50.118 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.115+0000 7faabfe4b640 1 -- 192.168.123.103:0/2483072184 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faaa8009590 con 0x7faab81089d0 2026-03-09T16:18:50.118 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.116+0000 7faabfe4b640 1 -- 192.168.123.103:0/2483072184 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faab819acb0 con 0x7faab81089d0 2026-03-09T16:18:50.118 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.116+0000 7faaa6ffd640 1 -- 192.168.123.103:0/2483072184 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faaa8038700 con 0x7faab81089d0 2026-03-09T16:18:50.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.118+0000 7faabfe4b640 1 -- 192.168.123.103:0/2483072184 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faa80005350 con 0x7faab81089d0 2026-03-09T16:18:50.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.118+0000 7faaa6ffd640 1 -- 192.168.123.103:0/2483072184 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7faaa80389c0 con 0x7faab81089d0 2026-03-09T16:18:50.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.118+0000 7faaa6ffd640 1 --2- 192.168.123.103:0/2483072184 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7faa940778e0 0x7faa94079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:50.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.118+0000 7faaa6ffd640 1 -- 192.168.123.103:0/2483072184 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(68..68 src has 1..68) v4 ==== 6394+0+0 (secure 0 0 0) 0x7faaa80be770 con 0x7faab81089d0 2026-03-09T16:18:50.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.119+0000 7faabdbc0640 1 --2- 192.168.123.103:0/2483072184 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7faa940778e0 0x7faa94079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:50.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.119+0000 7faabdbc0640 1 --2- 192.168.123.103:0/2483072184 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7faa940778e0 0x7faa94079da0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7faaac0046c0 tx=0x7faaac009340 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:50.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.121+0000 7faaa6ffd640 1 -- 192.168.123.103:0/2483072184 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7faaa8086ea0 con 0x7faab81089d0 2026-03-09T16:18:50.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.242+0000 7faabfe4b640 1 -- 192.168.123.103:0/2483072184 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7faa80002bf0 con 0x7faa940778e0 2026-03-09T16:18:50.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.243+0000 7faaa6ffd640 1 -- 192.168.123.103:0/2483072184 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7faa80002bf0 con 0x7faa940778e0 2026-03-09T16:18:50.245 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:18:50.245 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T16:18:50.245 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T16:18:50.245 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T16:18:50.245 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-09T16:18:50.245 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-09T16:18:50.245 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-09T16:18:50.245 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-09T16:18:50.245 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-09T16:18:50.245 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "10/23 daemons upgraded", 2026-03-09T16:18:50.245 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T16:18:50.245 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T16:18:50.245 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:18:50.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.246+0000 7faabfe4b640 1 -- 192.168.123.103:0/2483072184 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7faa940778e0 msgr2=0x7faa94079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:50.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.246+0000 7faabfe4b640 1 --2- 192.168.123.103:0/2483072184 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7faa940778e0 0x7faa94079da0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7faaac0046c0 tx=0x7faaac009340 comp rx=0 tx=0).stop 2026-03-09T16:18:50.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.246+0000 7faabfe4b640 1 -- 192.168.123.103:0/2483072184 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faab81089d0 msgr2=0x7faab81a0b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:50.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.246+0000 7faabfe4b640 1 --2- 
192.168.123.103:0/2483072184 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faab81089d0 0x7faab81a0b20 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7faaa80098b0 tx=0x7faaa8004550 comp rx=0 tx=0).stop 2026-03-09T16:18:50.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.246+0000 7faabfe4b640 1 -- 192.168.123.103:0/2483072184 shutdown_connections 2026-03-09T16:18:50.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.246+0000 7faabfe4b640 1 --2- 192.168.123.103:0/2483072184 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7faa940778e0 0x7faa94079da0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.246+0000 7faabfe4b640 1 --2- 192.168.123.103:0/2483072184 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faab81089d0 0x7faab81a0b20 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.246+0000 7faabfe4b640 1 --2- 192.168.123.103:0/2483072184 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faab81029d0 0x7faab81a05e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.246+0000 7faabfe4b640 1 -- 192.168.123.103:0/2483072184 >> 192.168.123.103:0/2483072184 conn(0x7faab80fe710 msgr2=0x7faab810c990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:50.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.247+0000 7faabfe4b640 1 -- 192.168.123.103:0/2483072184 shutdown_connections 2026-03-09T16:18:50.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.247+0000 7faabfe4b640 1 -- 192.168.123.103:0/2483072184 wait complete. 
2026-03-09T16:18:50.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.306+0000 7f882bcff640 1 -- 192.168.123.103:0/137435387 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88241022a0 msgr2=0x7f882410a790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:50.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.306+0000 7f882bcff640 1 --2- 192.168.123.103:0/137435387 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88241022a0 0x7f882410a790 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f88140099b0 tx=0x7f881402f2b0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.306+0000 7f882bcff640 1 -- 192.168.123.103:0/137435387 shutdown_connections 2026-03-09T16:18:50.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.306+0000 7f882bcff640 1 --2- 192.168.123.103:0/137435387 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88241022a0 0x7f882410a790 secure :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f88140099b0 tx=0x7f881402f2b0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.306+0000 7f882bcff640 1 --2- 192.168.123.103:0/137435387 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8824101980 0x7f8824101d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.306+0000 7f882bcff640 1 -- 192.168.123.103:0/137435387 >> 192.168.123.103:0/137435387 conn(0x7f88240fb340 msgr2=0x7f88240fd760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:50.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.309+0000 7f882bcff640 1 -- 192.168.123.103:0/137435387 shutdown_connections 2026-03-09T16:18:50.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.310+0000 7f882bcff640 1 -- 192.168.123.103:0/137435387 wait complete. 
2026-03-09T16:18:50.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.310+0000 7f882bcff640 1 Processor -- start 2026-03-09T16:18:50.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.310+0000 7f882bcff640 1 -- start start 2026-03-09T16:18:50.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.311+0000 7f882bcff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8824101980 0x7f88240ff490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:50.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.311+0000 7f882bcff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88240ff9d0 0x7f88240ffe30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:50.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.311+0000 7f882bcff640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8824101450 con 0x7f88240ff9d0 2026-03-09T16:18:50.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.311+0000 7f882bcff640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8824101590 con 0x7f8824101980 2026-03-09T16:18:50.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.311+0000 7f8829a74640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8824101980 0x7f88240ff490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:50.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.311+0000 7f8829a74640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8824101980 0x7f88240ff490 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:47718/0 (socket says 192.168.123.103:47718) 2026-03-09T16:18:50.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.311+0000 7f8829a74640 1 -- 192.168.123.103:0/160387367 learned_addr learned my addr 192.168.123.103:0/160387367 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:18:50.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.311+0000 7f8829a74640 1 -- 192.168.123.103:0/160387367 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88240ff9d0 msgr2=0x7f88240ffe30 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:18:50.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.311+0000 7f8829a74640 1 --2- 192.168.123.103:0/160387367 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88240ff9d0 0x7f88240ffe30 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.311+0000 7f8829a74640 1 -- 192.168.123.103:0/160387367 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8818009590 con 0x7f8824101980 2026-03-09T16:18:50.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.311+0000 7f8829a74640 1 --2- 192.168.123.103:0/160387367 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8824101980 0x7f88240ff490 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f8818002760 tx=0x7f8818002c30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:50.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.311+0000 7f8812ffd640 1 -- 192.168.123.103:0/160387367 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f881800ecf0 con 0x7f8824101980 2026-03-09T16:18:50.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.312+0000 7f8812ffd640 1 -- 192.168.123.103:0/160387367 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8818002e90 con 0x7f8824101980 2026-03-09T16:18:50.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.312+0000 7f8812ffd640 1 -- 192.168.123.103:0/160387367 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f881800f6f0 con 0x7f8824101980 2026-03-09T16:18:50.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.312+0000 7f882bcff640 1 -- 192.168.123.103:0/160387367 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8814009660 con 0x7f8824101980 2026-03-09T16:18:50.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.312+0000 7f882bcff640 1 -- 192.168.123.103:0/160387367 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8824071d90 con 0x7f8824101980 2026-03-09T16:18:50.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.313+0000 7f882bcff640 1 -- 192.168.123.103:0/160387367 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f87ec005350 con 0x7f8824101980 2026-03-09T16:18:50.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.314+0000 7f8812ffd640 1 -- 192.168.123.103:0/160387367 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8818016020 con 0x7f8824101980 2026-03-09T16:18:50.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.315+0000 7f8812ffd640 1 --2- 192.168.123.103:0/160387367 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f88000776d0 0x7f8800079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:18:50.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.315+0000 7f8812ffd640 1 -- 192.168.123.103:0/160387367 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(68..68 src has 1..68) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f8818099c20 con 0x7f8824101980 2026-03-09T16:18:50.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.315+0000 7f8829273640 1 --2- 192.168.123.103:0/160387367 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f88000776d0 0x7f8800079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:18:50.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.316+0000 7f8829273640 1 --2- 192.168.123.103:0/160387367 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f88000776d0 0x7f8800079b90 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f881402f7c0 tx=0x7f8814005c50 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:18:50.320 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.319+0000 7f8812ffd640 1 -- 192.168.123.103:0/160387367 <== mon.1 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8818061c00 con 0x7f8824101980 2026-03-09T16:18:50.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.465+0000 7f882bcff640 1 -- 192.168.123.103:0/160387367 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f87ec005600 con 0x7f8824101980 2026-03-09T16:18:50.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.467+0000 7f8812ffd640 1 -- 192.168.123.103:0/160387367 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+285 (secure 0 0 0) 0x7f8818061a20 con 0x7f8824101980 2026-03-09T16:18:50.468 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data; 1 osds down 2026-03-09T16:18:50.469 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T16:18:50.469 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled. 2026-03-09T16:18:50.469 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] OSD_DOWN: 1 osds down 2026-03-09T16:18:50.469 INFO:teuthology.orchestra.run.vm03.stdout: osd.4 (root=default,host=vm05) is down 2026-03-09T16:18:50.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.470+0000 7f882bcff640 1 -- 192.168.123.103:0/160387367 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f88000776d0 msgr2=0x7f8800079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:50.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.470+0000 7f882bcff640 1 --2- 192.168.123.103:0/160387367 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f88000776d0 0x7f8800079b90 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f881402f7c0 tx=0x7f8814005c50 comp rx=0 tx=0).stop 2026-03-09T16:18:50.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.471+0000 7f882bcff640 1 -- 192.168.123.103:0/160387367 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8824101980 msgr2=0x7f88240ff490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:18:50.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.471+0000 7f882bcff640 1 --2- 192.168.123.103:0/160387367 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8824101980 0x7f88240ff490 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f8818002760 tx=0x7f8818002c30 comp rx=0 tx=0).stop 2026-03-09T16:18:50.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.471+0000 7f882bcff640 1 -- 192.168.123.103:0/160387367 shutdown_connections 2026-03-09T16:18:50.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.471+0000 7f882bcff640 1 --2- 192.168.123.103:0/160387367 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f88000776d0 0x7f8800079b90 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.471+0000 7f882bcff640 1 --2- 192.168.123.103:0/160387367 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88240ff9d0 0x7f88240ffe30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.472 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.471+0000 7f882bcff640 1 --2- 192.168.123.103:0/160387367 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8824101980 0x7f88240ff490 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:18:50.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.472+0000 7f882bcff640 1 -- 192.168.123.103:0/160387367 >> 192.168.123.103:0/160387367 conn(0x7f88240fb340 msgr2=0x7f8824109e20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:18:50.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.472+0000 7f882bcff640 1 -- 192.168.123.103:0/160387367 shutdown_connections 2026-03-09T16:18:50.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:18:50.472+0000 7f882bcff640 1 -- 192.168.123.103:0/160387367 wait complete. 2026-03-09T16:18:51.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:51 vm03.local ceph-mon[133973]: from='client.34220 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:51.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:51 vm03.local ceph-mon[133973]: from='client.34224 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:51.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:51 vm03.local ceph-mon[133973]: from='client.44179 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:51.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:51 vm03.local ceph-mon[133973]: pgmap v83: 65 pgs: 16 active+undersized, 1 stale+active+clean, 13 active+undersized+degraded, 35 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 40/264 objects degraded (15.152%) 2026-03-09T16:18:51.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:51 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/3107397821' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:18:51.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:51 vm03.local ceph-mon[133973]: from='client.? 
192.168.123.103:0/160387367' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:18:51.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:51 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:51 vm05.local ceph-mon[108543]: from='client.34220 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:51 vm05.local ceph-mon[108543]: from='client.34224 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:51 vm05.local ceph-mon[108543]: from='client.44179 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:51 vm05.local ceph-mon[108543]: pgmap v83: 65 pgs: 16 active+undersized, 1 stale+active+clean, 13 active+undersized+degraded, 35 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 40/264 objects degraded (15.152%) 2026-03-09T16:18:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:51 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/3107397821' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:18:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:51 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/160387367' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:18:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:51 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:52.384 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:52 vm05.local ceph-mon[108543]: from='client.34238 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:52.384 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:52 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:52.384 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:52 vm05.local ceph-mon[108543]: Health check failed: Degraded data redundancy: 40/264 objects degraded (15.152%), 13 pgs degraded (PG_DEGRADED) 2026-03-09T16:18:52.384 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:52 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:52.384 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:52 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:52.384 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:52 vm05.local ceph-mon[108543]: pgmap v84: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 46/264 objects degraded (17.424%) 2026-03-09T16:18:52.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:52 vm03.local ceph-mon[133973]: from='client.34238 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:18:52.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:52 vm03.local ceph-mon[133973]: from='mgr.34104 
192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:52.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:52 vm03.local ceph-mon[133973]: Health check failed: Degraded data redundancy: 40/264 objects degraded (15.152%), 13 pgs degraded (PG_DEGRADED) 2026-03-09T16:18:52.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:52 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:52.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:52 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:52.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:52 vm03.local ceph-mon[133973]: pgmap v84: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 46/264 objects degraded (17.424%) 2026-03-09T16:18:54.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:54.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:54.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:18:54.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:18:54.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:54.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:18:54.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:54.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:54.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:54.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T16:18:54.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T16:18:54.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-09T16:18:54.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: pgmap v85: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 46/264 objects degraded (17.424%) 2026-03-09T16:18:54.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:54.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: pgmap v85: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 46/264 objects degraded (17.424%) 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:18:54.816 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:18:55.664 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:55 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4[120548]: 2026-03-09T16:18:55.396+0000 7f83db85b740 -1 osd.4 66 log_to_monitors true 2026-03-09T16:18:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:55 vm05.local ceph-mon[108543]: from='osd.4 [v2:192.168.123.105:6808/222393727,v1:192.168.123.105:6809/222393727]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T16:18:56.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:55 vm05.local ceph-mon[108543]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T16:18:56.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:55 vm03.local ceph-mon[133973]: from='osd.4 [v2:192.168.123.105:6808/222393727,v1:192.168.123.105:6809/222393727]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T16:18:56.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:55 vm03.local ceph-mon[133973]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T16:18:57.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:56 vm03.local ceph-mon[133973]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T16:18:57.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:56 vm03.local ceph-mon[133973]: osdmap e69: 6 total, 5 up, 6 in 2026-03-09T16:18:57.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:56 vm03.local ceph-mon[133973]: from='osd.4 [v2:192.168.123.105:6808/222393727,v1:192.168.123.105:6809/222393727]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:18:57.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:56 vm03.local ceph-mon[133973]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:18:57.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:56 vm03.local ceph-mon[133973]: pgmap v87: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 46/264 objects degraded (17.424%) 2026-03-09T16:18:57.276 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:56 vm05.local ceph-mon[108543]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T16:18:57.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:56 vm05.local ceph-mon[108543]: osdmap e69: 6 total, 5 up, 6 in 2026-03-09T16:18:57.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:56 vm05.local ceph-mon[108543]: from='osd.4 [v2:192.168.123.105:6808/222393727,v1:192.168.123.105:6809/222393727]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:18:57.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:56 vm05.local ceph-mon[108543]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:18:57.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:56 vm05.local ceph-mon[108543]: pgmap v87: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 46/264 objects degraded (17.424%) 2026-03-09T16:18:58.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:18:58 vm05.local ceph-mon[108543]: pgmap v88: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 46/264 objects degraded (17.424%) 2026-03-09T16:18:58.642 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:18:58 vm03.local ceph-mon[133973]: pgmap v88: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 46/264 objects degraded (17.424%) 2026-03-09T16:18:59.526 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:18:59 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4[120548]: 2026-03-09T16:18:59.218+0000 7f83d2df4640 -1 osd.4 66 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T16:19:00.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:00 vm03.local ceph-mon[133973]: from='osd.4 ' entity='osd.4' 2026-03-09T16:19:00.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:00 vm03.local ceph-mon[133973]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T16:19:00.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:00 vm03.local ceph-mon[133973]: osd.4 [v2:192.168.123.105:6808/222393727,v1:192.168.123.105:6809/222393727] boot 2026-03-09T16:19:00.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:00 vm03.local ceph-mon[133973]: osdmap e70: 6 total, 6 up, 6 in 2026-03-09T16:19:00.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:00 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:19:00.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:00 vm03.local ceph-mon[133973]: pgmap v90: 65 pgs: 10 peering, 14 active+undersized, 10 active+undersized+degraded, 31 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 23/264 objects degraded (8.712%) 2026-03-09T16:19:00.777 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:00 vm05.local ceph-mon[108543]: from='osd.4 ' entity='osd.4' 2026-03-09T16:19:00.777 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:00 vm05.local ceph-mon[108543]: Health check cleared: OSD_DOWN (was: 1 
osds down) 2026-03-09T16:19:00.777 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:00 vm05.local ceph-mon[108543]: osd.4 [v2:192.168.123.105:6808/222393727,v1:192.168.123.105:6809/222393727] boot 2026-03-09T16:19:00.777 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:00 vm05.local ceph-mon[108543]: osdmap e70: 6 total, 6 up, 6 in 2026-03-09T16:19:00.777 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:00 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T16:19:00.777 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:00 vm05.local ceph-mon[108543]: pgmap v90: 65 pgs: 10 peering, 14 active+undersized, 10 active+undersized+degraded, 31 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 23/264 objects degraded (8.712%) 2026-03-09T16:19:01.642 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:01 vm03.local ceph-mon[133973]: Health check update: Degraded data redundancy: 23/264 objects degraded (8.712%), 10 pgs degraded (PG_DEGRADED) 2026-03-09T16:19:01.642 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:01 vm03.local ceph-mon[133973]: osdmap e71: 6 total, 6 up, 6 in 2026-03-09T16:19:01.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:01 vm05.local ceph-mon[108543]: Health check update: Degraded data redundancy: 23/264 objects degraded (8.712%), 10 pgs degraded (PG_DEGRADED) 2026-03-09T16:19:01.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:01 vm05.local ceph-mon[108543]: osdmap e71: 6 total, 6 up, 6 in 2026-03-09T16:19:02.777 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:02 vm05.local ceph-mon[108543]: pgmap v92: 65 pgs: 14 peering, 7 active+undersized, 7 active+undersized+degraded, 37 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 14/264 objects degraded (5.303%) 2026-03-09T16:19:02.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:02 vm03.local ceph-mon[133973]: pgmap v92: 65 pgs: 14 peering, 7 active+undersized, 7 active+undersized+degraded, 37 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 14/264 objects degraded (5.303%) 2026-03-09T16:19:05.355 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:05 vm05.local ceph-mon[108543]: pgmap v93: 65 pgs: 14 peering, 4 active+undersized, 4 active+undersized+degraded, 43 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 7/264 objects degraded (2.652%) 2026-03-09T16:19:05.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:05 vm03.local ceph-mon[133973]: pgmap v93: 65 pgs: 14 peering, 4 active+undersized, 4 active+undersized+degraded, 43 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 7/264 objects degraded (2.652%) 2026-03-09T16:19:06.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:06 vm03.local ceph-mon[133973]: Health check update: Degraded data redundancy: 7/264 objects degraded (2.652%), 4 pgs degraded (PG_DEGRADED) 2026-03-09T16:19:06.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:06 vm03.local ceph-mon[133973]: pgmap v94: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:19:06.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:06 vm05.local ceph-mon[108543]: Health check update: Degraded data redundancy: 7/264 objects degraded (2.652%), 4 pgs degraded (PG_DEGRADED) 2026-03-09T16:19:06.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:06 vm05.local ceph-mon[108543]: pgmap v94: 65 
pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:19:07.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:07 vm05.local ceph-mon[108543]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 7/264 objects degraded (2.652%), 4 pgs degraded) 2026-03-09T16:19:07.892 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:07 vm03.local ceph-mon[133973]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 7/264 objects degraded (2.652%), 4 pgs degraded) 2026-03-09T16:19:08.816 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:08 vm05.local ceph-mon[108543]: pgmap v95: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:19:08.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:08 vm03.local ceph-mon[133973]: pgmap v95: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:19:09.772 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T16:19:09.772 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:09 vm05.local ceph-mon[108543]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T16:19:09.773 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:09 vm05.local ceph-mon[108543]: Upgrade: osd.5 is safe to restart 2026-03-09T16:19:09.773 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:09.773 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:19:09.773 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:09.773 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T16:19:09.773 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:09.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T16:19:09.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:09 vm03.local ceph-mon[133973]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T16:19:09.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:09 vm03.local ceph-mon[133973]: Upgrade: osd.5 is safe to restart 2026-03-09T16:19:09.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:09.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:19:09.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:09.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T16:19:09.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:10.953 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:10 vm05.local ceph-mon[108543]: Upgrade: Updating osd.5 2026-03-09T16:19:10.953 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:10 vm05.local ceph-mon[108543]: Deploying daemon osd.5 on vm05 2026-03-09T16:19:10.953 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:10 vm05.local ceph-mon[108543]: pgmap v96: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:19:10.954 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:10 vm05.local systemd[1]: Stopping Ceph osd.5 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 
2026-03-09T16:19:11.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:10 vm03.local ceph-mon[133973]: Upgrade: Updating osd.5 2026-03-09T16:19:11.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:10 vm03.local ceph-mon[133973]: Deploying daemon osd.5 on vm05 2026-03-09T16:19:11.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:10 vm03.local ceph-mon[133973]: pgmap v96: 65 pgs: 65 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:19:11.276 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:10 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[77364]: 2026-03-09T16:19:10.952+0000 7fb97778e640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:19:11.276 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:10 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[77364]: 2026-03-09T16:19:10.952+0000 7fb97778e640 -1 osd.5 71 *** Got signal Terminated *** 2026-03-09T16:19:11.276 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:10 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[77364]: 2026-03-09T16:19:10.952+0000 7fb97778e640 -1 osd.5 71 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T16:19:12.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:11 vm05.local ceph-mon[108543]: osd.5 marked itself down and dead 2026-03-09T16:19:12.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:11 vm05.local podman[124316]: 2026-03-09 16:19:11.829580468 +0000 UTC m=+0.937157999 container died 56fb3849b087f6730931423dad21d75ee904180e62f1c56e7dfd1f5e59255545 (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T16:19:12.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:11 vm05.local podman[124316]: 2026-03-09 16:19:11.848768011 +0000 UTC m=+0.956345531 container remove 56fb3849b087f6730931423dad21d75ee904180e62f1c56e7dfd1f5e59255545 (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5, org.label-schema.build-date=20260223, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , 
FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T16:19:12.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:11 vm05.local bash[124316]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5 2026-03-09T16:19:12.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local podman[124380]: 2026-03-09 16:19:12.001526603 +0000 UTC m=+0.016079496 container create 86eb656002fbff202056c9c95737d1e76e9fce478ece969965d17ad007cb9786 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-deactivate, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T16:19:12.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:11 vm03.local ceph-mon[133973]: osd.5 marked itself down and dead 2026-03-09T16:19:12.301 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local podman[124380]: 2026-03-09 16:19:12.050763858 +0000 UTC m=+0.065316751 container init 86eb656002fbff202056c9c95737d1e76e9fce478ece969965d17ad007cb9786 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_REF=squid) 2026-03-09T16:19:12.301 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local podman[124380]: 2026-03-09 16:19:12.053838934 +0000 UTC m=+0.068391827 container start 86eb656002fbff202056c9c95737d1e76e9fce478ece969965d17ad007cb9786 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-deactivate, org.label-schema.build-date=20260223, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T16:19:12.301 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 
16:19:12 vm05.local podman[124380]: 2026-03-09 16:19:12.058108634 +0000 UTC m=+0.072661538 container attach 86eb656002fbff202056c9c95737d1e76e9fce478ece969965d17ad007cb9786 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-deactivate, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) 2026-03-09T16:19:12.301 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local podman[124380]: 2026-03-09 16:19:11.995267188 +0000 UTC m=+0.009820081 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:19:12.301 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local podman[124380]: 2026-03-09 16:19:12.195955436 +0000 UTC m=+0.210508329 container died 86eb656002fbff202056c9c95737d1e76e9fce478ece969965d17ad007cb9786 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-deactivate, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T16:19:12.301 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local podman[124380]: 2026-03-09 16:19:12.215756337 +0000 UTC m=+0.230309230 container remove 86eb656002fbff202056c9c95737d1e76e9fce478ece969965d17ad007cb9786 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-deactivate, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS) 2026-03-09T16:19:12.301 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local systemd[1]: 
ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.5.service: Deactivated successfully. 2026-03-09T16:19:12.301 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local systemd[1]: Stopped Ceph osd.5 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 2026-03-09T16:19:12.301 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.5.service: Consumed 34.671s CPU time. 2026-03-09T16:19:12.759 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:12 vm05.local ceph-mon[108543]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T16:19:12.759 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:12 vm05.local ceph-mon[108543]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-09T16:19:12.760 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:12 vm05.local ceph-mon[108543]: osdmap e72: 6 total, 5 up, 6 in 2026-03-09T16:19:12.760 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:12 vm05.local ceph-mon[108543]: pgmap v98: 65 pgs: 13 stale+active+clean, 52 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:19:12.760 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local systemd[1]: Starting Ceph osd.5 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 2026-03-09T16:19:12.760 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local podman[124482]: 2026-03-09 16:19:12.491902288 +0000 UTC m=+0.016412529 container create a0df0e93e021b01e15d7045b1a0491f44745de9a3b04b2a115962d56a089e252 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) 2026-03-09T16:19:12.760 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local podman[124482]: 2026-03-09 16:19:12.528698237 +0000 UTC m=+0.053208478 container init a0df0e93e021b01e15d7045b1a0491f44745de9a3b04b2a115962d56a089e252 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T16:19:12.760 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local podman[124482]: 2026-03-09 
16:19:12.53435785 +0000 UTC m=+0.058868081 container start a0df0e93e021b01e15d7045b1a0491f44745de9a3b04b2a115962d56a089e252 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3) 2026-03-09T16:19:12.760 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local podman[124482]: 2026-03-09 16:19:12.535480141 +0000 UTC m=+0.059990382 container attach a0df0e93e021b01e15d7045b1a0491f44745de9a3b04b2a115962d56a089e252 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, io.buildah.version=1.41.3) 2026-03-09T16:19:12.760 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local podman[124482]: 2026-03-09 16:19:12.485245018 +0000 UTC m=+0.009755259 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:19:12.760 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate[124492]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:19:12.760 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local bash[124482]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:19:12.760 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate[124492]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:19:12.760 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:12 vm05.local bash[124482]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:19:13.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:12 vm03.local ceph-mon[133973]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T16:19:13.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:12 vm03.local ceph-mon[133973]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-09T16:19:13.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:12 vm03.local ceph-mon[133973]: osdmap e72: 6 
total, 5 up, 6 in 2026-03-09T16:19:13.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:12 vm03.local ceph-mon[133973]: pgmap v98: 65 pgs: 13 stale+active+clean, 52 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:19:13.527 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate[124492]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T16:19:13.527 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate[124492]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:19:13.527 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local bash[124482]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T16:19:13.527 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local bash[124482]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:19:13.527 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate[124492]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:19:13.527 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local bash[124482]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T16:19:13.527 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate[124492]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T16:19:13.527 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local bash[124482]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T16:19:13.527 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate[124492]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-4a406ece-904b-46b0-a694-953c243d1352/osd-block-c322dd19-66a4-4f40-abd7-54565e63f71b --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-09T16:19:13.527 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local bash[124482]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-4a406ece-904b-46b0-a694-953c243d1352/osd-block-c322dd19-66a4-4f40-abd7-54565e63f71b --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-09T16:19:13.527 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate[124492]: Running command: /usr/bin/ln -snf /dev/ceph-4a406ece-904b-46b0-a694-953c243d1352/osd-block-c322dd19-66a4-4f40-abd7-54565e63f71b /var/lib/ceph/osd/ceph-5/block 2026-03-09T16:19:13.527 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local bash[124482]: Running command: /usr/bin/ln -snf /dev/ceph-4a406ece-904b-46b0-a694-953c243d1352/osd-block-c322dd19-66a4-4f40-abd7-54565e63f71b /var/lib/ceph/osd/ceph-5/block 2026-03-09T16:19:13.886 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-mon[108543]: osdmap e73: 6 total, 5 up, 6 in 2026-03-09T16:19:13.886 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:13.886 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' 
entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:13.886 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate[124492]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local bash[124482]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate[124492]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local bash[124482]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate[124492]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local bash[124482]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate[124492]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local bash[124482]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local podman[124482]: 2026-03-09 16:19:13.564078233 +0000 UTC m=+1.088588474 container died a0df0e93e021b01e15d7045b1a0491f44745de9a3b04b2a115962d56a089e252 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local podman[124482]: 2026-03-09 16:19:13.583518549 +0000 UTC m=+1.108028780 container remove a0df0e93e021b01e15d7045b1a0491f44745de9a3b04b2a115962d56a089e252 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local podman[124731]: 2026-03-09 16:19:13.67771888 +0000 UTC m=+0.016377834 container create d93569840b13eade2a9a2c481bc6891f0a7b9e7d517d37373f009afcca5a64cb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid) 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local podman[124731]: 2026-03-09 16:19:13.715682435 +0000 UTC m=+0.054341389 container init d93569840b13eade2a9a2c481bc6891f0a7b9e7d517d37373f009afcca5a64cb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.41.3) 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local podman[124731]: 2026-03-09 16:19:13.718544802 +0000 UTC m=+0.057203756 container start d93569840b13eade2a9a2c481bc6891f0a7b9e7d517d37373f009afcca5a64cb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local 
bash[124731]: d93569840b13eade2a9a2c481bc6891f0a7b9e7d517d37373f009afcca5a64cb 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local podman[124731]: 2026-03-09 16:19:13.671624563 +0000 UTC m=+0.010283527 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local systemd[1]: Started Ceph osd.5 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 2026-03-09T16:19:13.886 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:13 vm05.local ceph-osd[124746]: -- 192.168.123.105:0/2737313569 <== mon.1 v2:192.168.123.105:3300/0 4 ==== auth_reply(proto 2 0 (0) Success) ==== 194+0+0 (secure 0 0 0) 0x55f379c46960 con 0x55f379df6000 2026-03-09T16:19:14.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:13 vm03.local ceph-mon[133973]: osdmap e73: 6 total, 5 up, 6 in 2026-03-09T16:19:14.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:13 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:14.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:13 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:14.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:13 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:15.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:14 vm03.local ceph-mon[133973]: pgmap v100: 65 pgs: 3 active+undersized, 11 stale+active+clean, 51 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:19:15.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:14 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:15.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:14 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:15.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:14 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:15.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:14 vm05.local ceph-mon[108543]: pgmap v100: 65 pgs: 3 active+undersized, 11 stale+active+clean, 51 active+clean; 254 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail 2026-03-09T16:19:15.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:14 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:15.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:14 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:15.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:14 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:15.276 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:14 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[124742]: 2026-03-09T16:19:14.807+0000 7fcc5ed1d740 -1 Falling back to public interface 2026-03-09T16:19:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' 
entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:16 vm05.local ceph-mon[108543]: pgmap v101: 65 pgs: 15 active+undersized, 13 active+undersized+degraded, 37 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 40/264 objects degraded (15.152%) 2026-03-09T16:19:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:19:16.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:16 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:16.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:16.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:16.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:16 vm03.local ceph-mon[133973]: pgmap v101: 65 pgs: 15 active+undersized, 13 active+undersized+degraded, 37 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 40/264 objects degraded (15.152%) 2026-03-09T16:19:16.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:16.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:16.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:16.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:19:16.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:16 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: Health check failed: Degraded data redundancy: 40/264 objects degraded (15.152%), 13 pgs degraded (PG_DEGRADED) 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' 
entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: Upgrade: Setting container_image for all osd 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' 
entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: Upgrade: Setting require_osd_release to 19 squid 2026-03-09T16:19:17.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:17 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch 2026-03-09T16:19:17.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: Health check failed: Degraded data redundancy: 40/264 objects degraded (15.152%), 13 pgs degraded (PG_DEGRADED) 2026-03-09T16:19:17.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:17.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:17.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: Upgrade: Setting container_image for all osd 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished 2026-03-09T16:19:17.527 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: Upgrade: Setting require_osd_release to 19 squid 2026-03-09T16:19:17.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:17 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch 2026-03-09T16:19:18.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:18 vm03.local ceph-mon[133973]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid) 2026-03-09T16:19:18.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:18 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished 2026-03-09T16:19:18.391 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:18 vm03.local ceph-mon[133973]: osdmap e74: 6 total, 5 up, 6 in 2026-03-09T16:19:18.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:18 vm03.local ceph-mon[133973]: Upgrade: Updating mds.cephfs.vm03.kygyjl 2026-03-09T16:19:18.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:18 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:18.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:18 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kygyjl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:19:18.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:18 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:18.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:18 vm03.local ceph-mon[133973]: Deploying daemon mds.cephfs.vm03.kygyjl on vm03 2026-03-09T16:19:18.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:18 vm03.local ceph-mon[133973]: pgmap v103: 65 pgs: 15 active+undersized, 13 active+undersized+degraded, 37 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 40/264 objects degraded (15.152%) 2026-03-09T16:19:18.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:18 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03[133969]: 2026-03-09T16:19:18.202+0000 7f80f28e8640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T16:19:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:18 vm05.local ceph-mon[108543]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid) 2026-03-09T16:19:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:18 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished 2026-03-09T16:19:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:18 vm05.local ceph-mon[108543]: osdmap e74: 6 total, 5 up, 6 in 2026-03-09T16:19:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:18 vm05.local ceph-mon[108543]: Upgrade: Updating mds.cephfs.vm03.kygyjl 2026-03-09T16:19:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:18 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:18 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kygyjl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:19:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:18 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:18.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:18 vm05.local ceph-mon[108543]: Deploying daemon mds.cephfs.vm03.kygyjl on vm03 2026-03-09T16:19:18.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:18 
vm05.local ceph-mon[108543]: pgmap v103: 65 pgs: 15 active+undersized, 13 active+undersized+degraded, 37 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 40/264 objects degraded (15.152%) 2026-03-09T16:19:18.527 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:18 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[124742]: 2026-03-09T16:19:18.175+0000 7fcc5ed1d740 -1 osd.5 71 log_to_monitors true 2026-03-09T16:19:19.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:19 vm05.local ceph-mon[108543]: from='osd.5 [v2:192.168.123.105:6816/1912058956,v1:192.168.123.105:6817/1912058956]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T16:19:19.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:19 vm05.local ceph-mon[108543]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T16:19:19.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:19 vm05.local ceph-mon[108543]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T16:19:19.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:19 vm05.local ceph-mon[108543]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T16:19:19.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:19 vm05.local ceph-mon[108543]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-09T16:19:19.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:19 vm05.local ceph-mon[108543]: osdmap e75: 6 total, 5 up, 6 in 2026-03-09T16:19:19.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:19 vm05.local ceph-mon[108543]: Standby daemon mds.cephfs.vm03.kntrco assigned to filesystem cephfs as rank 0 2026-03-09T16:19:19.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:19 vm05.local ceph-mon[108543]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T16:19:19.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:19 vm05.local ceph-mon[108543]: from='osd.5 [v2:192.168.123.105:6816/1912058956,v1:192.168.123.105:6817/1912058956]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:19:19.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:19 vm05.local ceph-mon[108543]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T16:19:19.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:19 vm05.local ceph-mon[108543]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:19:19.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:19 vm05.local ceph-mon[108543]: fsmap cephfs:1/1 {0=cephfs.vm03.kntrco=up:replay} 2 up:standby 2026-03-09T16:19:19.526 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:19:19 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[124742]: 2026-03-09T16:19:19.207+0000 7fcc562b6640 -1 osd.5 71 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T16:19:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:19 vm03.local ceph-mon[133973]: from='osd.5 [v2:192.168.123.105:6816/1912058956,v1:192.168.123.105:6817/1912058956]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 
2026-03-09T16:19:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:19 vm03.local ceph-mon[133973]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T16:19:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:19 vm03.local ceph-mon[133973]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T16:19:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:19 vm03.local ceph-mon[133973]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T16:19:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:19 vm03.local ceph-mon[133973]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-09T16:19:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:19 vm03.local ceph-mon[133973]: osdmap e75: 6 total, 5 up, 6 in 2026-03-09T16:19:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:19 vm03.local ceph-mon[133973]: Standby daemon mds.cephfs.vm03.kntrco assigned to filesystem cephfs as rank 0 2026-03-09T16:19:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:19 vm03.local ceph-mon[133973]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T16:19:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:19 vm03.local ceph-mon[133973]: from='osd.5 [v2:192.168.123.105:6816/1912058956,v1:192.168.123.105:6817/1912058956]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:19:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:19 vm03.local ceph-mon[133973]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T16:19:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:19 vm03.local ceph-mon[133973]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-09T16:19:19.641 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:19 vm03.local ceph-mon[133973]: fsmap cephfs:1/1 {0=cephfs.vm03.kntrco=up:replay} 2 up:standby 2026-03-09T16:19:20.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:20 vm05.local ceph-mon[108543]: from='osd.5 ' entity='osd.5' 2026-03-09T16:19:20.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:20 vm05.local ceph-mon[108543]: pgmap v105: 65 pgs: 15 active+undersized, 13 active+undersized+degraded, 37 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 5 op/s; 40/264 objects degraded (15.152%) 2026-03-09T16:19:20.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.538+0000 7f2f4d0eb640 1 -- 192.168.123.103:0/2235598331 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f48072af0 msgr2=0x7f2f4810ba70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:20.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.538+0000 7f2f4d0eb640 1 --2- 192.168.123.103:0/2235598331 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f48072af0 0x7f2f4810ba70 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f2f38007920 tx=0x7f2f38030040 comp rx=0 tx=0).stop 2026-03-09T16:19:20.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.539+0000 7f2f4d0eb640 1 -- 192.168.123.103:0/2235598331 shutdown_connections 2026-03-09T16:19:20.540 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.539+0000 7f2f4d0eb640 1 --2- 192.168.123.103:0/2235598331 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f48072af0 0x7f2f4810ba70 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.539+0000 7f2f4d0eb640 1 --2- 192.168.123.103:0/2235598331 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f48072140 0x7f2f48072520 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.539+0000 7f2f4d0eb640 1 -- 192.168.123.103:0/2235598331 >> 192.168.123.103:0/2235598331 conn(0x7f2f4806c7e0 msgr2=0x7f2f4806cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:20.540 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:20 vm03.local ceph-mon[133973]: from='osd.5 ' entity='osd.5' 2026-03-09T16:19:20.540 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:20 vm03.local ceph-mon[133973]: pgmap v105: 65 pgs: 15 active+undersized, 13 active+undersized+degraded, 37 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 5 op/s; 40/264 objects degraded (15.152%) 2026-03-09T16:19:20.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.539+0000 7f2f4d0eb640 1 -- 192.168.123.103:0/2235598331 shutdown_connections 2026-03-09T16:19:20.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.539+0000 7f2f4d0eb640 1 -- 192.168.123.103:0/2235598331 wait complete. 2026-03-09T16:19:20.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.540+0000 7f2f4d0eb640 1 Processor -- start 2026-03-09T16:19:20.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.540+0000 7f2f4d0eb640 1 -- start start 2026-03-09T16:19:20.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.540+0000 7f2f4d0eb640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f48072140 0x7f2f481ba150 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:20.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.540+0000 7f2f4d0eb640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f48072af0 0x7f2f481ba690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:20.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.540+0000 7f2f4d0eb640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2f4807a840 con 0x7f2f48072af0 2026-03-09T16:19:20.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.540+0000 7f2f4d0eb640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2f4807a9b0 con 0x7f2f48072140 2026-03-09T16:19:20.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.540+0000 7f2f46d76640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f48072140 0x7f2f481ba150 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:20.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.540+0000 7f2f46d76640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f48072140 0x7f2f481ba150 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:55014/0 (socket says 192.168.123.103:55014) 2026-03-09T16:19:20.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.540+0000 7f2f46d76640 1 -- 192.168.123.103:0/1417986903 learned_addr learned my addr 192.168.123.103:0/1417986903 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:19:20.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.541+0000 7f2f46d76640 1 -- 192.168.123.103:0/1417986903 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f48072af0 msgr2=0x7f2f481ba690 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:20.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.541+0000 7f2f46d76640 1 --2- 192.168.123.103:0/1417986903 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f48072af0 0x7f2f481ba690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.541+0000 7f2f46d76640 1 -- 192.168.123.103:0/1417986903 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2f380075d0 con 0x7f2f48072140 2026-03-09T16:19:20.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.541+0000 7f2f46d76640 1 --2- 192.168.123.103:0/1417986903 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f48072140 0x7f2f481ba150 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f2f4000ef30 tx=0x7f2f4000c560 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:20.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.541+0000 7f2f27fff640 1 -- 192.168.123.103:0/1417986903 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2f40018070 con 0x7f2f48072140 2026-03-09T16:19:20.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.541+0000 7f2f4d0eb640 1 -- 192.168.123.103:0/1417986903 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2f4807ac30 con 0x7f2f48072140 2026-03-09T16:19:20.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.541+0000 7f2f4d0eb640 1 -- 192.168.123.103:0/1417986903 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2f4807b180 con 0x7f2f48072140 2026-03-09T16:19:20.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.542+0000 7f2f27fff640 1 -- 192.168.123.103:0/1417986903 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f2f4000f040 con 0x7f2f48072140 2026-03-09T16:19:20.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.542+0000 7f2f27fff640 1 -- 192.168.123.103:0/1417986903 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2f40014690 con 0x7f2f48072140 2026-03-09T16:19:20.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.543+0000 7f2f27fff640 1 -- 192.168.123.103:0/1417986903 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2f40007500 con 0x7f2f48072140 2026-03-09T16:19:20.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.544+0000 7f2f27fff640 1 --2- 192.168.123.103:0/1417986903 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f2f280778e0 0x7f2f28079da0 
unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:20.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.544+0000 7f2f27fff640 1 -- 192.168.123.103:0/1417986903 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(76..76 src has 1..76) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f2f40099f60 con 0x7f2f48072140 2026-03-09T16:19:20.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.544+0000 7f2f4d0eb640 1 -- 192.168.123.103:0/1417986903 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2f48108570 con 0x7f2f48072140 2026-03-09T16:19:20.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.544+0000 7f2f46575640 1 --2- 192.168.123.103:0/1417986903 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f2f280778e0 0x7f2f28079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:20.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.548+0000 7f2f27fff640 1 -- 192.168.123.103:0/1417986903 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2f400ca9f0 con 0x7f2f48072140 2026-03-09T16:19:20.568 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.568+0000 7f2f46575640 1 --2- 192.168.123.103:0/1417986903 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f2f280778e0 0x7f2f28079da0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f2f38033040 tx=0x7f2f380045d0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:20.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.652+0000 7f2f4d0eb640 1 -- 192.168.123.103:0/1417986903 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f2f481bb390 con 0x7f2f280778e0 2026-03-09T16:19:20.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.655+0000 7f2f27fff640 1 -- 192.168.123.103:0/1417986903 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f2f481bb390 con 0x7f2f280778e0 2026-03-09T16:19:20.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.657+0000 7f2f25ffb640 1 -- 192.168.123.103:0/1417986903 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f2f280778e0 msgr2=0x7f2f28079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:20.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.657+0000 7f2f25ffb640 1 --2- 192.168.123.103:0/1417986903 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f2f280778e0 0x7f2f28079da0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f2f38033040 tx=0x7f2f380045d0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.657+0000 7f2f25ffb640 1 -- 192.168.123.103:0/1417986903 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f48072140 msgr2=0x7f2f481ba150 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:20.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.660+0000 7f2f25ffb640 1 --2- 192.168.123.103:0/1417986903 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f48072140 0x7f2f481ba150 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f2f4000ef30 tx=0x7f2f4000c560 comp rx=0 tx=0).stop 2026-03-09T16:19:20.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.660+0000 7f2f25ffb640 1 -- 192.168.123.103:0/1417986903 shutdown_connections 2026-03-09T16:19:20.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.660+0000 7f2f25ffb640 1 --2- 192.168.123.103:0/1417986903 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f2f280778e0 0x7f2f28079da0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.660+0000 7f2f25ffb640 1 --2- 192.168.123.103:0/1417986903 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f48072af0 0x7f2f481ba690 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.660+0000 7f2f25ffb640 1 --2- 192.168.123.103:0/1417986903 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f48072140 0x7f2f481ba150 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.660+0000 7f2f25ffb640 1 -- 192.168.123.103:0/1417986903 >> 192.168.123.103:0/1417986903 conn(0x7f2f4806c7e0 msgr2=0x7f2f4806fb00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:20.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.662+0000 7f2f25ffb640 1 -- 192.168.123.103:0/1417986903 shutdown_connections 2026-03-09T16:19:20.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.663+0000 7f2f25ffb640 1 -- 192.168.123.103:0/1417986903 wait complete. 
2026-03-09T16:19:20.674 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:19:20.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.726+0000 7f35c3b44640 1 -- 192.168.123.103:0/3467273881 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f35bc0720b0 msgr2=0x7f35bc072490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:20.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.726+0000 7f35c3b44640 1 --2- 192.168.123.103:0/3467273881 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f35bc0720b0 0x7f35bc072490 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f35b00099b0 tx=0x7f35b002f2b0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.726+0000 7f35c3b44640 1 -- 192.168.123.103:0/3467273881 shutdown_connections 2026-03-09T16:19:20.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.726+0000 7f35c3b44640 1 --2- 192.168.123.103:0/3467273881 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35bc0729d0 0x7f35bc10b9f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.726+0000 7f35c3b44640 1 --2- 192.168.123.103:0/3467273881 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f35bc0720b0 0x7f35bc072490 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.726+0000 7f35c3b44640 1 -- 192.168.123.103:0/3467273881 >> 192.168.123.103:0/3467273881 conn(0x7f35bc06c7e0 msgr2=0x7f35bc06cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:20.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.726+0000 7f35c3b44640 1 -- 192.168.123.103:0/3467273881 shutdown_connections 2026-03-09T16:19:20.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.726+0000 7f35c3b44640 1 -- 192.168.123.103:0/3467273881 wait complete. 
2026-03-09T16:19:20.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.727+0000 7f35c3b44640 1 Processor -- start 2026-03-09T16:19:20.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.727+0000 7f35c3b44640 1 -- start start 2026-03-09T16:19:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.727+0000 7f35c3b44640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35bc0729d0 0x7f35bc1a7520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.727+0000 7f35c3b44640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f35bc1a7a60 0x7f35bc1abe60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.727+0000 7f35c3b44640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f35bc1a8060 con 0x7f35bc1a7a60 2026-03-09T16:19:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.727+0000 7f35c3b44640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f35bc1a81d0 con 0x7f35bc0729d0 2026-03-09T16:19:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.727+0000 7f35c10b8640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f35bc1a7a60 0x7f35bc1abe60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.727+0000 7f35c10b8640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f35bc1a7a60 0x7f35bc1abe60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60708/0 (socket says 192.168.123.103:60708) 2026-03-09T16:19:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.727+0000 7f35c10b8640 1 -- 192.168.123.103:0/1556093143 learned_addr learned my addr 192.168.123.103:0/1556093143 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:19:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.728+0000 7f35c10b8640 1 -- 192.168.123.103:0/1556093143 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35bc0729d0 msgr2=0x7f35bc1a7520 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:19:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.728+0000 7f35c10b8640 1 --2- 192.168.123.103:0/1556093143 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35bc0729d0 0x7f35bc1a7520 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.728+0000 7f35c10b8640 1 -- 192.168.123.103:0/1556093143 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f35b0009660 con 0x7f35bc1a7a60 2026-03-09T16:19:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.728+0000 7f35c10b8640 1 --2- 192.168.123.103:0/1556093143 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f35bc1a7a60 0x7f35bc1abe60 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f35b400efc0 tx=0x7f35b400c490 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:20.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.730+0000 7f35aaffd640 1 -- 192.168.123.103:0/1556093143 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f35b4009280 con 0x7f35bc1a7a60 2026-03-09T16:19:20.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.730+0000 7f35aaffd640 1 -- 192.168.123.103:0/1556093143 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f35b400f040 con 0x7f35bc1a7a60 2026-03-09T16:19:20.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.730+0000 7f35aaffd640 1 -- 192.168.123.103:0/1556093143 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f35b4004910 con 0x7f35bc1a7a60 2026-03-09T16:19:20.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.730+0000 7f35c3b44640 1 -- 192.168.123.103:0/1556093143 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f35bc1ac460 con 0x7f35bc1a7a60 2026-03-09T16:19:20.731 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.730+0000 7f35c3b44640 1 -- 192.168.123.103:0/1556093143 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f35bc1ac9b0 con 0x7f35bc1a7a60 2026-03-09T16:19:20.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.732+0000 7f35aaffd640 1 -- 192.168.123.103:0/1556093143 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f35b40040d0 con 0x7f35bc1a7a60 2026-03-09T16:19:20.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.732+0000 7f35aaffd640 1 --2- 192.168.123.103:0/1556093143 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f35980778e0 0x7f3598079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:20.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.732+0000 7f35aaffd640 1 -- 192.168.123.103:0/1556093143 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(76..76 src has 1..76) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f35b4015e30 con 0x7f35bc1a7a60 2026-03-09T16:19:20.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.732+0000 7f35c18b9640 1 --2- 192.168.123.103:0/1556093143 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f35980778e0 0x7f3598079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:20.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.733+0000 7f35c3b44640 1 -- 192.168.123.103:0/1556093143 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f35bc1acc80 con 0x7f35bc1a7a60 2026-03-09T16:19:20.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.736+0000 7f35aaffd640 1 -- 192.168.123.103:0/1556093143 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f35b409e050 con 0x7f35bc1a7a60 2026-03-09T16:19:20.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.736+0000 7f35c18b9640 1 --2- 192.168.123.103:0/1556093143 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f35980778e0 0x7f3598079da0 secure :-1 s=READY 
pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f35b00099b0 tx=0x7f35b00023d0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:20.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.839+0000 7f35c3b44640 1 -- 192.168.123.103:0/1556093143 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f35bc1a8880 con 0x7f35980778e0 2026-03-09T16:19:20.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.843+0000 7f35aaffd640 1 -- 192.168.123.103:0/1556093143 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f35bc1a8880 con 0x7f35980778e0 2026-03-09T16:19:20.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.847+0000 7f35c3b44640 1 -- 192.168.123.103:0/1556093143 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f35980778e0 msgr2=0x7f3598079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:20.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.847+0000 7f35c3b44640 1 --2- 192.168.123.103:0/1556093143 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f35980778e0 0x7f3598079da0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f35b00099b0 tx=0x7f35b00023d0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.847+0000 7f35c3b44640 1 -- 192.168.123.103:0/1556093143 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f35bc1a7a60 msgr2=0x7f35bc1abe60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:20.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.847+0000 7f35c3b44640 1 --2- 192.168.123.103:0/1556093143 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f35bc1a7a60 0x7f35bc1abe60 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f35b400efc0 tx=0x7f35b400c490 comp rx=0 tx=0).stop 2026-03-09T16:19:20.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.847+0000 7f35c3b44640 1 -- 192.168.123.103:0/1556093143 shutdown_connections 2026-03-09T16:19:20.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.847+0000 7f35c3b44640 1 --2- 192.168.123.103:0/1556093143 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f35980778e0 0x7f3598079da0 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.847+0000 7f35c3b44640 1 --2- 192.168.123.103:0/1556093143 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f35bc1a7a60 0x7f35bc1abe60 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.847+0000 7f35c3b44640 1 --2- 192.168.123.103:0/1556093143 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35bc0729d0 0x7f35bc1a7520 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.847+0000 7f35c3b44640 1 -- 192.168.123.103:0/1556093143 >> 192.168.123.103:0/1556093143 conn(0x7f35bc06c7e0 msgr2=0x7f35bc070160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:20.848 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.847+0000 7f35c3b44640 1 -- 192.168.123.103:0/1556093143 shutdown_connections 2026-03-09T16:19:20.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.848+0000 7f35c3b44640 1 -- 192.168.123.103:0/1556093143 wait complete. 2026-03-09T16:19:20.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.913+0000 7eff20ae9640 1 -- 192.168.123.103:0/2578437003 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff1c072af0 msgr2=0x7eff1c10ba70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:20.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.913+0000 7eff20ae9640 1 --2- 192.168.123.103:0/2578437003 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff1c072af0 0x7eff1c10ba70 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7eff1400caa0 tx=0x7eff140305a0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.913+0000 7eff20ae9640 1 -- 192.168.123.103:0/2578437003 shutdown_connections 2026-03-09T16:19:20.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.913+0000 7eff20ae9640 1 --2- 192.168.123.103:0/2578437003 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff1c072af0 0x7eff1c10ba70 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.913+0000 7eff20ae9640 1 --2- 192.168.123.103:0/2578437003 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff1c072140 0x7eff1c072520 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.913+0000 7eff20ae9640 1 -- 192.168.123.103:0/2578437003 >> 192.168.123.103:0/2578437003 conn(0x7eff1c06c7e0 msgr2=0x7eff1c06cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:20.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.913+0000 7eff20ae9640 1 -- 192.168.123.103:0/2578437003 shutdown_connections 2026-03-09T16:19:20.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.913+0000 7eff20ae9640 1 -- 192.168.123.103:0/2578437003 wait complete. 
2026-03-09T16:19:20.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.913+0000 7eff20ae9640 1 Processor -- start 2026-03-09T16:19:20.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.913+0000 7eff20ae9640 1 -- start start 2026-03-09T16:19:20.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.914+0000 7eff20ae9640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff1c072140 0x7eff1c07d570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:20.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.914+0000 7eff20ae9640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff1c07dab0 0x7eff1c07df10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:20.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.914+0000 7eff20ae9640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7eff1c0844d0 con 0x7eff1c07dab0 2026-03-09T16:19:20.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.914+0000 7eff20ae9640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7eff1c084640 con 0x7eff1c072140 2026-03-09T16:19:20.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.914+0000 7eff1a575640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff1c072140 0x7eff1c07d570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:20.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.914+0000 7eff1a575640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff1c072140 0x7eff1c07d570 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:55046/0 (socket says 192.168.123.103:55046) 2026-03-09T16:19:20.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.914+0000 7eff1a575640 1 -- 192.168.123.103:0/2313550756 learned_addr learned my addr 192.168.123.103:0/2313550756 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:19:20.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.914+0000 7eff19d74640 1 --2- 192.168.123.103:0/2313550756 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff1c07dab0 0x7eff1c07df10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:20.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.914+0000 7eff1a575640 1 -- 192.168.123.103:0/2313550756 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff1c07dab0 msgr2=0x7eff1c07df10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:20.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.914+0000 7eff1a575640 1 --2- 192.168.123.103:0/2313550756 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff1c07dab0 0x7eff1c07df10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:20.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.914+0000 7eff1a575640 1 -- 192.168.123.103:0/2313550756 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7eff14009d00 con 0x7eff1c072140 2026-03-09T16:19:20.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.915+0000 7eff1a575640 1 --2- 192.168.123.103:0/2313550756 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff1c072140 0x7eff1c07d570 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7eff0c00b700 tx=0x7eff0c00bbd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:20.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.915+0000 7eff0b7fe640 1 -- 192.168.123.103:0/2313550756 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7eff0c00be90 con 0x7eff1c072140 2026-03-09T16:19:20.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.915+0000 7eff20ae9640 1 -- 192.168.123.103:0/2313550756 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7eff1c082100 con 0x7eff1c072140 2026-03-09T16:19:20.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.915+0000 7eff20ae9640 1 -- 192.168.123.103:0/2313550756 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7eff1c082650 con 0x7eff1c072140 2026-03-09T16:19:20.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.916+0000 7eff0b7fe640 1 -- 192.168.123.103:0/2313550756 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7eff0c002ba0 con 0x7eff1c072140 2026-03-09T16:19:20.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.916+0000 7eff0b7fe640 1 -- 192.168.123.103:0/2313550756 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7eff0c00cab0 con 0x7eff1c072140 2026-03-09T16:19:20.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.916+0000 7eff20ae9640 1 -- 192.168.123.103:0/2313550756 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7eff1c108570 con 0x7eff1c072140 2026-03-09T16:19:20.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.917+0000 7eff0b7fe640 1 -- 192.168.123.103:0/2313550756 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7eff0c004380 con 0x7eff1c072140 2026-03-09T16:19:20.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.917+0000 7eff0b7fe640 1 --2- 192.168.123.103:0/2313550756 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efefc0778e0 0x7efefc079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:20.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.918+0000 7eff19d74640 1 --2- 192.168.123.103:0/2313550756 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efefc0778e0 0x7efefc079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:20.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.918+0000 7eff0b7fe640 1 -- 192.168.123.103:0/2313550756 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(76..76 src has 1..76) v4 ==== 6480+0+0 (secure 0 0 0) 0x7eff0c099000 con 0x7eff1c072140 2026-03-09T16:19:20.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.918+0000 7eff19d74640 1 --2- 192.168.123.103:0/2313550756 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efefc0778e0 0x7efefc079da0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7eff1400ca70 tx=0x7eff1403e040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:20.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:20.919+0000 7eff0b7fe640 1 -- 192.168.123.103:0/2313550756 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7eff0c061890 con 0x7eff1c072140 2026-03-09T16:19:21.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.017+0000 7eff20ae9640 1 -- 192.168.123.103:0/2313550756 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7eff1c079810 con 0x7efefc0778e0 2026-03-09T16:19:21.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.022+0000 7eff0b7fe640 1 -- 192.168.123.103:0/2313550756 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3600 (secure 0 0 0) 0x7eff1c079810 con 0x7efefc0778e0 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (3m) 80s ago 9m 25.3M - 0.25.0 c8568f914cd2 61c29cd7a09d 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (9m) 80s ago 9m 9743k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (8m) 6s ago 8m 10.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (2m) 80s ago 9m 7830k - 19.2.3-678-ge911bdeb 654f31e6858e 03c86bd1bf32 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (2m) 6s ago 8m 7830k - 19.2.3-678-ge911bdeb 654f31e6858e 192f6dbc3145 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (3m) 80s ago 8m 83.9M - 10.4.0 c8b91775d855 6f4f55eef4bb 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (7m) 80s ago 7m 19.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8e7e3eb06891 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (7m) 80s ago 7m 190M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 f23b1415c23e 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (7m) 6s ago 7m 17.5M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 fbf69f4859f1 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (7m) 6s ago 7m 19.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 e7155e6e0a47 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:8443,9283,8765 running (4m) 80s ago 9m 604M - 19.2.3-678-ge911bdeb 654f31e6858e f10e9f43c355 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (4m) 6s ago 8m 497M - 19.2.3-678-ge911bdeb 654f31e6858e 5276dc4902e9 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (2m) 80s ago 9m 62.6M 2048M 19.2.3-678-ge911bdeb 
654f31e6858e f90a2e8dc751 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (2m) 6s ago 8m 49.0M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b6d6af84a66d 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (4m) 80s ago 9m 9517k - 1.7.0 72c9c2088986 73da4350a8ed 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (4m) 6s ago 8m 9671k - 1.7.0 72c9c2088986 0be807a191b0 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (113s) 80s ago 8m 119M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fba6e40f54d4 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (107s) 80s ago 8m 99.9M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9e86c92fc9cd 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (83s) 80s ago 7m 13.1M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 2e666ccd4bf7 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (56s) 6s ago 7m 145M 4096M 19.2.3-678-ge911bdeb 654f31e6858e c052610d74d5 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (32s) 6s ago 7m 120M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4115e4720b89 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (7s) 6s ago 7m 13.5M 4096M 19.2.3-678-ge911bdeb 654f31e6858e d93569840b13 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (3m) 80s ago 8m 45.1M - 2.51.0 1d3b7f56885b ce88dd379864 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.025+0000 7eff097fa640 1 -- 192.168.123.103:0/2313550756 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efefc0778e0 msgr2=0x7efefc079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.025+0000 7eff097fa640 1 --2- 192.168.123.103:0/2313550756 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efefc0778e0 0x7efefc079da0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7eff1400ca70 tx=0x7eff1403e040 comp rx=0 tx=0).stop 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.025+0000 7eff097fa640 1 -- 192.168.123.103:0/2313550756 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff1c072140 msgr2=0x7eff1c07d570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.025+0000 7eff097fa640 1 --2- 192.168.123.103:0/2313550756 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff1c072140 0x7eff1c07d570 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7eff0c00b700 tx=0x7eff0c00bbd0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.026+0000 7eff097fa640 1 -- 192.168.123.103:0/2313550756 shutdown_connections 2026-03-09T16:19:21.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.026+0000 7eff097fa640 1 --2- 192.168.123.103:0/2313550756 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efefc0778e0 0x7efefc079da0 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.026+0000 7eff097fa640 1 
--2- 192.168.123.103:0/2313550756 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff1c07dab0 0x7eff1c07df10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.026+0000 7eff097fa640 1 --2- 192.168.123.103:0/2313550756 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff1c072140 0x7eff1c07d570 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.026+0000 7eff097fa640 1 -- 192.168.123.103:0/2313550756 >> 192.168.123.103:0/2313550756 conn(0x7eff1c06c7e0 msgr2=0x7eff1c10a9e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:21.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.028+0000 7eff097fa640 1 -- 192.168.123.103:0/2313550756 shutdown_connections 2026-03-09T16:19:21.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.028+0000 7eff097fa640 1 -- 192.168.123.103:0/2313550756 wait complete. 2026-03-09T16:19:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.098+0000 7f8f8667a640 1 -- 192.168.123.103:0/2684253090 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8f800729d0 msgr2=0x7f8f8010b9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.098+0000 7f8f8667a640 1 --2- 192.168.123.103:0/2684253090 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8f800729d0 0x7f8f8010b9f0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f8f7800b0a0 tx=0x7f8f7802f550 comp rx=0 tx=0).stop 2026-03-09T16:19:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.098+0000 7f8f8667a640 1 -- 192.168.123.103:0/2684253090 shutdown_connections 2026-03-09T16:19:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.098+0000 7f8f8667a640 1 --2- 192.168.123.103:0/2684253090 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8f800729d0 0x7f8f8010b9f0 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.098+0000 7f8f8667a640 1 --2- 192.168.123.103:0/2684253090 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f800720b0 0x7f8f80072490 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.098+0000 7f8f8667a640 1 -- 192.168.123.103:0/2684253090 >> 192.168.123.103:0/2684253090 conn(0x7f8f8006c7e0 msgr2=0x7f8f8006cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.098+0000 7f8f8667a640 1 -- 192.168.123.103:0/2684253090 shutdown_connections 2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.098+0000 7f8f8667a640 1 -- 192.168.123.103:0/2684253090 wait complete. 
2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.098+0000 7f8f8667a640 1 Processor -- start 2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.099+0000 7f8f8667a640 1 -- start start 2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.099+0000 7f8f8667a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8f800720b0 0x7f8f80112bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.099+0000 7f8f8667a640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f80113100 0x7f8f801a8770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.099+0000 7f8f8667a640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8f801135f0 con 0x7f8f800720b0 2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.099+0000 7f8f8667a640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8f80113730 con 0x7f8f80113100 2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.099+0000 7f8f7ffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8f800720b0 0x7f8f80112bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.099+0000 7f8f7ffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8f800720b0 0x7f8f80112bc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60740/0 (socket says 192.168.123.103:60740) 2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.099+0000 7f8f7ffff640 1 -- 192.168.123.103:0/1932825985 learned_addr learned my addr 192.168.123.103:0/1932825985 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.099+0000 7f8f7ffff640 1 -- 192.168.123.103:0/1932825985 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f80113100 msgr2=0x7f8f801a8770 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.099+0000 7f8f7ffff640 1 --2- 192.168.123.103:0/1932825985 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f80113100 0x7f8f801a8770 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.099+0000 7f8f7ffff640 1 -- 192.168.123.103:0/1932825985 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8f78009d00 con 0x7f8f800720b0 2026-03-09T16:19:21.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.099+0000 7f8f7ffff640 1 --2- 192.168.123.103:0/1932825985 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8f800720b0 0x7f8f80112bc0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f8f7000c910 tx=0x7f8f7000cde0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:21.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.101+0000 7f8f7d7fa640 1 -- 192.168.123.103:0/1932825985 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8f70007c20 con 0x7f8f800720b0 2026-03-09T16:19:21.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.101+0000 7f8f7d7fa640 1 -- 192.168.123.103:0/1932825985 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8f70007d80 con 0x7f8f800720b0 2026-03-09T16:19:21.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.101+0000 7f8f7d7fa640 1 -- 192.168.123.103:0/1932825985 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8f7000f450 con 0x7f8f800720b0 2026-03-09T16:19:21.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.103+0000 7f8f8667a640 1 -- 192.168.123.103:0/1932825985 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8f801a8d10 con 0x7f8f800720b0 2026-03-09T16:19:21.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.103+0000 7f8f8667a640 1 -- 192.168.123.103:0/1932825985 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8f801a9190 con 0x7f8f800720b0 2026-03-09T16:19:21.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.103+0000 7f8f8667a640 1 -- 192.168.123.103:0/1932825985 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8f44005350 con 0x7f8f800720b0 2026-03-09T16:19:21.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.105+0000 7f8f7d7fa640 1 -- 192.168.123.103:0/1932825985 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8f70016030 con 0x7f8f800720b0 2026-03-09T16:19:21.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.106+0000 7f8f7d7fa640 1 --2- 192.168.123.103:0/1932825985 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8f5c0776d0 0x7f8f5c079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:21.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.106+0000 7f8f7d7fa640 1 -- 192.168.123.103:0/1932825985 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(76..76 src has 1..76) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f8f70099a70 con 0x7f8f800720b0 2026-03-09T16:19:21.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.111+0000 7f8f7f7fe640 1 --2- 192.168.123.103:0/1932825985 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8f5c0776d0 0x7f8f5c079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:21.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.111+0000 7f8f7d7fa640 1 -- 192.168.123.103:0/1932825985 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8f700621f0 con 0x7f8f800720b0 2026-03-09T16:19:21.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.114+0000 7f8f7f7fe640 1 --2- 192.168.123.103:0/1932825985 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8f5c0776d0 0x7f8f5c079b90 secure :-1 s=READY 
pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f8f7800b070 tx=0x7f8f78002750 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:21.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.250+0000 7f8f8667a640 1 -- 192.168.123.103:0/1932825985 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f8f44005e10 con 0x7f8f800720b0 2026-03-09T16:19:21.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.297+0000 7f8f7d7fa640 1 -- 192.168.123.103:0/1932825985 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+739 (secure 0 0 0) 0x7f8f70005000 con 0x7f8f800720b0 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 3 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 3, 2026-03-09T16:19:21.300 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 10 2026-03-09T16:19:21.301 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T16:19:21.301 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:19:21.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.300+0000 7f8f8667a640 1 -- 192.168.123.103:0/1932825985 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8f5c0776d0 msgr2=0x7f8f5c079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:21.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.300+0000 7f8f8667a640 1 --2- 192.168.123.103:0/1932825985 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8f5c0776d0 0x7f8f5c079b90 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f8f7800b070 tx=0x7f8f78002750 comp rx=0 tx=0).stop 2026-03-09T16:19:21.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.300+0000 7f8f8667a640 1 -- 192.168.123.103:0/1932825985 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8f800720b0 msgr2=0x7f8f80112bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-09T16:19:21.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.300+0000 7f8f8667a640 1 --2- 192.168.123.103:0/1932825985 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8f800720b0 0x7f8f80112bc0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f8f7000c910 tx=0x7f8f7000cde0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.300+0000 7f8f8667a640 1 -- 192.168.123.103:0/1932825985 shutdown_connections 2026-03-09T16:19:21.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.300+0000 7f8f8667a640 1 --2- 192.168.123.103:0/1932825985 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8f5c0776d0 0x7f8f5c079b90 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.300+0000 7f8f8667a640 1 --2- 192.168.123.103:0/1932825985 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f80113100 0x7f8f801a8770 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.300+0000 7f8f8667a640 1 --2- 192.168.123.103:0/1932825985 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8f800720b0 0x7f8f80112bc0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.300+0000 7f8f8667a640 1 -- 192.168.123.103:0/1932825985 >> 192.168.123.103:0/1932825985 conn(0x7f8f8006c7e0 msgr2=0x7f8f8006eec0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:21.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.300+0000 7f8f8667a640 1 -- 192.168.123.103:0/1932825985 shutdown_connections 2026-03-09T16:19:21.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.300+0000 7f8f8667a640 1 -- 192.168.123.103:0/1932825985 wait complete. 
2026-03-09T16:19:21.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.413+0000 7f585cbdd640 1 -- 192.168.123.103:0/2533201401 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5858102800 msgr2=0x7f5858102c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:21.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.413+0000 7f585cbdd640 1 --2- 192.168.123.103:0/2533201401 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5858102800 0x7f5858102c60 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f58480099b0 tx=0x7f584802f220 comp rx=0 tx=0).stop 2026-03-09T16:19:21.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.413+0000 7f585cbdd640 1 -- 192.168.123.103:0/2533201401 shutdown_connections 2026-03-09T16:19:21.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.413+0000 7f585cbdd640 1 --2- 192.168.123.103:0/2533201401 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5858102800 0x7f5858102c60 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.413+0000 7f585cbdd640 1 --2- 192.168.123.103:0/2533201401 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5858108800 0x7f5858108be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.413+0000 7f585cbdd640 1 -- 192.168.123.103:0/2533201401 >> 192.168.123.103:0/2533201401 conn(0x7f58580fe540 msgr2=0x7f5858100960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:21.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.413+0000 7f585cbdd640 1 -- 192.168.123.103:0/2533201401 shutdown_connections 2026-03-09T16:19:21.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.413+0000 7f585cbdd640 1 -- 192.168.123.103:0/2533201401 wait complete. 
2026-03-09T16:19:21.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.414+0000 7f585cbdd640 1 Processor -- start 2026-03-09T16:19:21.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.414+0000 7f585cbdd640 1 -- start start 2026-03-09T16:19:21.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.414+0000 7f585cbdd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5858102800 0x7f58581a04f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:21.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.414+0000 7f585cbdd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5858108800 0x7f58581a0a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:21.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.414+0000 7f585cbdd640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f58581a1050 con 0x7f5858108800 2026-03-09T16:19:21.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.414+0000 7f585cbdd640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f585819a5e0 con 0x7f5858102800 2026-03-09T16:19:21.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.416+0000 7f58577fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5858102800 0x7f58581a04f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:21.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.416+0000 7f58577fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5858102800 0x7f58581a04f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:55078/0 (socket says 192.168.123.103:55078) 2026-03-09T16:19:21.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.416+0000 7f58577fe640 1 -- 192.168.123.103:0/2644543110 learned_addr learned my addr 192.168.123.103:0/2644543110 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:19:21.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.416+0000 7f58577fe640 1 -- 192.168.123.103:0/2644543110 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5858108800 msgr2=0x7f58581a0a30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:21.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.416+0000 7f58577fe640 1 --2- 192.168.123.103:0/2644543110 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5858108800 0x7f58581a0a30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.416+0000 7f58577fe640 1 -- 192.168.123.103:0/2644543110 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5848009660 con 0x7f5858102800 2026-03-09T16:19:21.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.416+0000 7f58577fe640 1 --2- 192.168.123.103:0/2644543110 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5858102800 0x7f58581a04f0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f584c00ef10 tx=0x7f584c00c560 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:21.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.416+0000 7f5854ff9640 1 -- 192.168.123.103:0/2644543110 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f584c009280 con 0x7f5858102800 2026-03-09T16:19:21.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.417+0000 7f585cbdd640 1 -- 192.168.123.103:0/2644543110 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f585819a860 con 0x7f5858102800 2026-03-09T16:19:21.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.417+0000 7f585cbdd640 1 -- 192.168.123.103:0/2644543110 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f585819adb0 con 0x7f5858102800 2026-03-09T16:19:21.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.417+0000 7f5854ff9640 1 -- 192.168.123.103:0/2644543110 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f584c00f040 con 0x7f5858102800 2026-03-09T16:19:21.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.417+0000 7f5854ff9640 1 -- 192.168.123.103:0/2644543110 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f584c004940 con 0x7f5858102800 2026-03-09T16:19:21.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.418+0000 7f585cbdd640 1 -- 192.168.123.103:0/2644543110 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5824005350 con 0x7f5858102800 2026-03-09T16:19:21.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.418+0000 7f5854ff9640 1 -- 192.168.123.103:0/2644543110 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f584c004160 con 0x7f5858102800 2026-03-09T16:19:21.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.419+0000 7f5854ff9640 1 --2- 192.168.123.103:0/2644543110 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f58200778e0 0x7f5820079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:21.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.419+0000 7f5854ff9640 1 -- 192.168.123.103:0/2644543110 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(77..77 src has 1..77) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f584c0996c0 con 0x7f5858102800 2026-03-09T16:19:21.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.421+0000 7f5856ffd640 1 --2- 192.168.123.103:0/2644543110 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f58200778e0 0x7f5820079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:21.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.421+0000 7f5854ff9640 1 -- 192.168.123.103:0/2644543110 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f584c061e90 con 0x7f5858102800 2026-03-09T16:19:21.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.422+0000 7f5856ffd640 1 --2- 192.168.123.103:0/2644543110 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f58200778e0 0x7f5820079da0 
secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f5848005d20 tx=0x7f5848005c50 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T16:19:21.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.550+0000 7f585cbdd640 1 -- 192.168.123.103:0/2644543110 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f5824005e10 con 0x7f5858102800
2026-03-09T16:19:21.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.552+0000 7f5854ff9640 1 -- 192.168.123.103:0/2644543110 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 14 v14) v1 ==== 76+0+1742 (secure 0 0 0) 0x7f584c0615e0 con 0x7f5858102800
2026-03-09T16:19:21.552 INFO:teuthology.orchestra.run.vm03.stdout:e14
2026-03-09T16:19:21.552 INFO:teuthology.orchestra.run.vm03.stdout:btime 2026-03-09T16:19:18:213414+0000
2026-03-09T16:19:21.552 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:epoch 14
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T16:12:12.560035+0000
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T16:19:18.213355+0000
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 75
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:in 0
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14492}
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members:
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kntrco{0:14492} state up:replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons:
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.sqhria{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.jgzfvu{-1:24291} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6824/1621230713,v1:192.168.123.105:6825/1621230713] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T16:19:21.553 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 14
2026-03-09T16:19:21.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.558+0000 7f583a7fc640 1 -- 192.168.123.103:0/2644543110 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f58200778e0 msgr2=0x7f5820079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:19:21.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.558+0000 7f583a7fc640 1 --2- 192.168.123.103:0/2644543110 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f58200778e0 0x7f5820079da0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f5848005d20 tx=0x7f5848005c50 comp rx=0 tx=0).stop
2026-03-09T16:19:21.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.558+0000 7f583a7fc640 1 -- 192.168.123.103:0/2644543110 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5858102800 msgr2=0x7f58581a04f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:19:21.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.558+0000 7f583a7fc640 1 --2- 192.168.123.103:0/2644543110 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5858102800 0x7f58581a04f0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f584c00ef10 tx=0x7f584c00c560 comp rx=0 tx=0).stop
2026-03-09T16:19:21.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.558+0000 7f583a7fc640 1 -- 192.168.123.103:0/2644543110 shutdown_connections
2026-03-09T16:19:21.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.558+0000 7f583a7fc640 1 --2- 192.168.123.103:0/2644543110 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f58200778e0 0x7f5820079da0 unknown
:-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.558+0000 7f583a7fc640 1 --2- 192.168.123.103:0/2644543110 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5858108800 0x7f58581a0a30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.558+0000 7f583a7fc640 1 --2- 192.168.123.103:0/2644543110 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5858102800 0x7f58581a04f0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.558+0000 7f583a7fc640 1 -- 192.168.123.103:0/2644543110 >> 192.168.123.103:0/2644543110 conn(0x7f58580fe540 msgr2=0x7f585810c7c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:21.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.558+0000 7f583a7fc640 1 -- 192.168.123.103:0/2644543110 shutdown_connections 2026-03-09T16:19:21.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.558+0000 7f583a7fc640 1 -- 192.168.123.103:0/2644543110 wait complete. 2026-03-09T16:19:21.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.616+0000 7f7099f67640 1 -- 192.168.123.103:0/2875661135 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7094072af0 msgr2=0x7f709410ba70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:21.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.616+0000 7f7099f67640 1 --2- 192.168.123.103:0/2875661135 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7094072af0 0x7f709410ba70 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f708c00b0a0 tx=0x7f708c02f4c0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.616+0000 7f7099f67640 1 -- 192.168.123.103:0/2875661135 shutdown_connections 2026-03-09T16:19:21.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.616+0000 7f7099f67640 1 --2- 192.168.123.103:0/2875661135 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7094072af0 0x7f709410ba70 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.616+0000 7f7099f67640 1 --2- 192.168.123.103:0/2875661135 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7094072140 0x7f7094072520 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.616+0000 7f7099f67640 1 -- 192.168.123.103:0/2875661135 >> 192.168.123.103:0/2875661135 conn(0x7f709406c7e0 msgr2=0x7f709406cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.616+0000 7f7099f67640 1 -- 192.168.123.103:0/2875661135 shutdown_connections 2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.616+0000 7f7099f67640 1 -- 192.168.123.103:0/2875661135 wait complete. 
2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.617+0000 7f7099f67640 1 Processor -- start 2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.617+0000 7f7099f67640 1 -- start start 2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.617+0000 7f7099f67640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7094072140 0x7f709407d410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.617+0000 7f7099f67640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7094084380 0x7f709407d950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.617+0000 7f7099f67640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f709407e030 con 0x7f7094072140 2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.617+0000 7f7099f67640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f709407e170 con 0x7f7094084380 2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.617+0000 7f7092ffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7094084380 0x7f709407d950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.617+0000 7f7092ffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7094084380 0x7f709407d950 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:55096/0 (socket says 192.168.123.103:55096) 2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.617+0000 7f7092ffd640 1 -- 192.168.123.103:0/2461387102 learned_addr learned my addr 192.168.123.103:0/2461387102 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.617+0000 7f7092ffd640 1 -- 192.168.123.103:0/2461387102 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7094072140 msgr2=0x7f709407d410 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.617+0000 7f7092ffd640 1 --2- 192.168.123.103:0/2461387102 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7094072140 0x7f709407d410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.617+0000 7f7092ffd640 1 -- 192.168.123.103:0/2461387102 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7084009590 con 0x7f7094084380 2026-03-09T16:19:21.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.617+0000 7f7092ffd640 1 --2- 192.168.123.103:0/2461387102 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7094084380 0x7f709407d950 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f708c009fd0 tx=0x7f708c009300 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:21.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.619+0000 7f7090ff9640 1 -- 192.168.123.103:0/2461387102 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f708c002c70 con 0x7f7094084380 2026-03-09T16:19:21.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.619+0000 7f7099f67640 1 -- 192.168.123.103:0/2461387102 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f708c009d00 con 0x7f7094084380 2026-03-09T16:19:21.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.619+0000 7f7099f67640 1 -- 192.168.123.103:0/2461387102 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f70940822b0 con 0x7f7094084380 2026-03-09T16:19:21.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.619+0000 7f7090ff9640 1 -- 192.168.123.103:0/2461387102 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f708c002dd0 con 0x7f7094084380 2026-03-09T16:19:21.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.619+0000 7f7090ff9640 1 -- 192.168.123.103:0/2461387102 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f708c0408f0 con 0x7f7094084380 2026-03-09T16:19:21.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.620+0000 7f7099f67640 1 -- 192.168.123.103:0/2461387102 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7094108570 con 0x7f7094084380 2026-03-09T16:19:21.621 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.620+0000 7f7090ff9640 1 -- 192.168.123.103:0/2461387102 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f708c007ca0 con 0x7f7094084380 2026-03-09T16:19:21.621 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.621+0000 7f7090ff9640 1 --2- 192.168.123.103:0/2461387102 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f70740779b0 0x7f7074079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:21.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.621+0000 7f70937fe640 1 --2- 192.168.123.103:0/2461387102 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f70740779b0 0x7f7074079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:21.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.621+0000 7f7090ff9640 1 -- 192.168.123.103:0/2461387102 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(77..77 src has 1..77) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f708c0bea30 con 0x7f7094084380 2026-03-09T16:19:21.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.622+0000 7f70937fe640 1 --2- 192.168.123.103:0/2461387102 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f70740779b0 0x7f7074079e70 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f7084005e00 tx=0x7f7084005d50 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:21.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.625+0000 7f7090ff9640 1 -- 192.168.123.103:0/2461387102 <== 
mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f708c0871b0 con 0x7f7094084380
2026-03-09T16:19:21.734 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:21 vm03.local ceph-mon[133973]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-09T16:19:21.734 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:21 vm03.local ceph-mon[133973]: osd.5 [v2:192.168.123.105:6816/1912058956,v1:192.168.123.105:6817/1912058956] boot
2026-03-09T16:19:21.734 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:21 vm03.local ceph-mon[133973]: osdmap e76: 6 total, 6 up, 6 in
2026-03-09T16:19:21.734 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:21 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-09T16:19:21.734 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:21 vm03.local ceph-mon[133973]: from='client.44203 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T16:19:21.734 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:21 vm03.local ceph-mon[133973]: from='client.34250 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T16:19:21.734 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:21 vm03.local ceph-mon[133973]: from='client.44207 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T16:19:21.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.732+0000 7f7099f67640 1 -- 192.168.123.103:0/2461387102 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7094079790 con 0x7f70740779b0
2026-03-09T16:19:21.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.741+0000 7f7090ff9640 1 -- 192.168.123.103:0/2461387102 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f7094079790 con 0x7f70740779b0
2026-03-09T16:19:21.742 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T16:19:21.742 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T16:19:21.742 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T16:19:21.742 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-09T16:19:21.742 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [
2026-03-09T16:19:21.742 INFO:teuthology.orchestra.run.vm03.stdout: "osd",
2026-03-09T16:19:21.742 INFO:teuthology.orchestra.run.vm03.stdout: "mon",
2026-03-09T16:19:21.742 INFO:teuthology.orchestra.run.vm03.stdout: "crash",
2026-03-09T16:19:21.742 INFO:teuthology.orchestra.run.vm03.stdout: "mgr"
2026-03-09T16:19:21.742 INFO:teuthology.orchestra.run.vm03.stdout: ],
2026-03-09T16:19:21.742 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "12/23 daemons upgraded",
2026-03-09T16:19:21.742 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading mds daemons",
2026-03-09T16:19:21.742 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T16:19:21.743 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T16:19:21.747
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.745+0000 7f7099f67640 1 -- 192.168.123.103:0/2461387102 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f70740779b0 msgr2=0x7f7074079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:21.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.745+0000 7f7099f67640 1 --2- 192.168.123.103:0/2461387102 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f70740779b0 0x7f7074079e70 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f7084005e00 tx=0x7f7084005d50 comp rx=0 tx=0).stop 2026-03-09T16:19:21.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.745+0000 7f7099f67640 1 -- 192.168.123.103:0/2461387102 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7094084380 msgr2=0x7f709407d950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:21.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.745+0000 7f7099f67640 1 --2- 192.168.123.103:0/2461387102 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7094084380 0x7f709407d950 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f708c009fd0 tx=0x7f708c009300 comp rx=0 tx=0).stop 2026-03-09T16:19:21.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.747+0000 7f7099f67640 1 -- 192.168.123.103:0/2461387102 shutdown_connections 2026-03-09T16:19:21.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.747+0000 7f7099f67640 1 --2- 192.168.123.103:0/2461387102 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f70740779b0 0x7f7074079e70 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.747+0000 7f7099f67640 1 --2- 192.168.123.103:0/2461387102 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7094084380 0x7f709407d950 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.747+0000 7f7099f67640 1 --2- 192.168.123.103:0/2461387102 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7094072140 0x7f709407d410 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.747+0000 7f7099f67640 1 -- 192.168.123.103:0/2461387102 >> 192.168.123.103:0/2461387102 conn(0x7f709406c7e0 msgr2=0x7f709406f8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:21.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.747+0000 7f7099f67640 1 -- 192.168.123.103:0/2461387102 shutdown_connections 2026-03-09T16:19:21.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.747+0000 7f7099f67640 1 -- 192.168.123.103:0/2461387102 wait complete. 
2026-03-09T16:19:21.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:21 vm05.local ceph-mon[108543]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T16:19:21.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:21 vm05.local ceph-mon[108543]: osd.5 [v2:192.168.123.105:6816/1912058956,v1:192.168.123.105:6817/1912058956] boot 2026-03-09T16:19:21.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:21 vm05.local ceph-mon[108543]: osdmap e76: 6 total, 6 up, 6 in 2026-03-09T16:19:21.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:21 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T16:19:21.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:21 vm05.local ceph-mon[108543]: from='client.44203 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:19:21.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:21 vm05.local ceph-mon[108543]: from='client.34250 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:19:21.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:21 vm05.local ceph-mon[108543]: from='client.44207 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:19:21.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.808+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/1311755275 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9c4072ad0 msgr2=0x7fe9c410b9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:21.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.808+0000 7fe9cb1e3640 1 --2- 192.168.123.103:0/1311755275 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9c4072ad0 0x7fe9c410b9a0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fe9bc008090 tx=0x7fe9bc031ea0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.808+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/1311755275 shutdown_connections 2026-03-09T16:19:21.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.808+0000 7fe9cb1e3640 1 --2- 192.168.123.103:0/1311755275 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9c4072ad0 0x7fe9c410b9a0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.808+0000 7fe9cb1e3640 1 --2- 192.168.123.103:0/1311755275 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9c4072120 0x7fe9c4072500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.808+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/1311755275 >> 192.168.123.103:0/1311755275 conn(0x7fe9c406c7d0 msgr2=0x7fe9c406cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:21.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.808+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/1311755275 shutdown_connections 2026-03-09T16:19:21.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.808+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/1311755275 wait complete. 
2026-03-09T16:19:21.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.809+0000 7fe9cb1e3640 1 Processor -- start 2026-03-09T16:19:21.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.809+0000 7fe9cb1e3640 1 -- start start 2026-03-09T16:19:21.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.809+0000 7fe9cb1e3640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9c4072120 0x7fe9c407d500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:21.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.809+0000 7fe9cb1e3640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9c407da40 0x7fe9c407dea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:21.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.809+0000 7fe9cb1e3640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe9c4084500 con 0x7fe9c4072120 2026-03-09T16:19:21.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.809+0000 7fe9cb1e3640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe9c4084670 con 0x7fe9c407da40 2026-03-09T16:19:21.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.809+0000 7fe9c99e0640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9c407da40 0x7fe9c407dea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:21.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.809+0000 7fe9c99e0640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9c407da40 0x7fe9c407dea0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:55120/0 (socket says 192.168.123.103:55120) 2026-03-09T16:19:21.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.809+0000 7fe9c99e0640 1 -- 192.168.123.103:0/3498111072 learned_addr learned my addr 192.168.123.103:0/3498111072 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:19:21.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.809+0000 7fe9c99e0640 1 -- 192.168.123.103:0/3498111072 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9c4072120 msgr2=0x7fe9c407d500 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:21.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.809+0000 7fe9c99e0640 1 --2- 192.168.123.103:0/3498111072 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9c4072120 0x7fe9c407d500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:21.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.809+0000 7fe9c99e0640 1 -- 192.168.123.103:0/3498111072 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe9bc007ce0 con 0x7fe9c407da40 2026-03-09T16:19:21.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.810+0000 7fe9c99e0640 1 --2- 192.168.123.103:0/3498111072 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9c407da40 0x7fe9c407dea0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fe9bc002f40 tx=0x7fe9bc002f70 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:21.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.810+0000 7fe9bb7fe640 1 -- 192.168.123.103:0/3498111072 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe9bc032800 con 0x7fe9c407da40 2026-03-09T16:19:21.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.811+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/3498111072 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe9c4082100 con 0x7fe9c407da40 2026-03-09T16:19:21.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.811+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/3498111072 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe9c40825f0 con 0x7fe9c407da40 2026-03-09T16:19:21.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.811+0000 7fe9bb7fe640 1 -- 192.168.123.103:0/3498111072 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe9bc032e20 con 0x7fe9c407da40 2026-03-09T16:19:21.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.811+0000 7fe9bb7fe640 1 -- 192.168.123.103:0/3498111072 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe9bc00e450 con 0x7fe9c407da40 2026-03-09T16:19:21.813 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.812+0000 7fe9bb7fe640 1 -- 192.168.123.103:0/3498111072 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe9bc00e5b0 con 0x7fe9c407da40 2026-03-09T16:19:21.815 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.814+0000 7fe9bb7fe640 1 --2- 192.168.123.103:0/3498111072 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe9980779b0 0x7fe998079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:21.815 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.815+0000 7fe9bb7fe640 1 -- 192.168.123.103:0/3498111072 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(77..77 src has 1..77) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fe9bc0c0160 con 0x7fe9c407da40 2026-03-09T16:19:21.815 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.815+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/3498111072 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe9c4108570 con 0x7fe9c407da40 2026-03-09T16:19:21.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.816+0000 7fe9ca1e1640 1 --2- 192.168.123.103:0/3498111072 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe9980779b0 0x7fe998079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:21.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.817+0000 7fe9ca1e1640 1 --2- 192.168.123.103:0/3498111072 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe9980779b0 0x7fe998079e70 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fe9c0005fd0 tx=0x7fe9c0005950 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:21.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.818+0000 7fe9bb7fe640 1 -- 192.168.123.103:0/3498111072 <== 
mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe9bc0888e0 con 0x7fe9c407da40
2026-03-09T16:19:21.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.991+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/3498111072 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fe9c407ea30 con 0x7fe9c407da40
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:21.991+0000 7fe9bb7fe640 1 -- 192.168.123.103:0/3498111072 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1194 (secure 0 0 0) 0x7fe9bc088030 con 0x7fe9c407da40
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem is degraded; 1 filesystem with deprecated feature inline_data; Degraded data redundancy: 40/264 objects degraded (15.152%), 13 pgs degraded
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_DEGRADED: 1 filesystem is degraded
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs is degraded
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled.
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 40/264 objects degraded (15.152%), 13 pgs degraded
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.2 is active+undersized+degraded, acting [1,0]
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.3 is active+undersized+degraded, acting [2,1]
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.8 is active+undersized+degraded, acting [3,0]
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.b is active+undersized+degraded, acting [3,4]
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.f is active+undersized+degraded, acting [4,0]
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.14 is active+undersized+degraded, acting [3,4]
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.16 is active+undersized+degraded, acting [3,2]
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.18 is active+undersized+degraded, acting [4,3]
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.1a is active+undersized+degraded, acting [3,4]
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.1b is active+undersized+degraded, acting [1,0]
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.1c is active+undersized+degraded, acting [4,2]
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.1d is active+undersized+degraded, acting [3,0]
2026-03-09T16:19:21.993 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.1e is active+undersized+degraded, acting [2,0]
2026-03-09T16:19:22.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:22.001+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/3498111072 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe9980779b0 msgr2=0x7fe998079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:19:22.001
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:22.001+0000 7fe9cb1e3640 1 --2- 192.168.123.103:0/3498111072 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe9980779b0 0x7fe998079e70 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fe9c0005fd0 tx=0x7fe9c0005950 comp rx=0 tx=0).stop 2026-03-09T16:19:22.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:22.001+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/3498111072 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9c407da40 msgr2=0x7fe9c407dea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:22.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:22.001+0000 7fe9cb1e3640 1 --2- 192.168.123.103:0/3498111072 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9c407da40 0x7fe9c407dea0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fe9bc002f40 tx=0x7fe9bc002f70 comp rx=0 tx=0).stop 2026-03-09T16:19:22.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:22.001+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/3498111072 shutdown_connections 2026-03-09T16:19:22.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:22.001+0000 7fe9cb1e3640 1 --2- 192.168.123.103:0/3498111072 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe9980779b0 0x7fe998079e70 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:22.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:22.001+0000 7fe9cb1e3640 1 --2- 192.168.123.103:0/3498111072 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9c407da40 0x7fe9c407dea0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:22.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:22.001+0000 7fe9cb1e3640 1 --2- 192.168.123.103:0/3498111072 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9c4072120 0x7fe9c407d500 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:22.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:22.001+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/3498111072 >> 192.168.123.103:0/3498111072 conn(0x7fe9c406c7d0 msgr2=0x7fe9c407b5b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:22.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:22.001+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/3498111072 shutdown_connections 2026-03-09T16:19:22.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:22.002+0000 7fe9cb1e3640 1 -- 192.168.123.103:0/3498111072 wait complete. 2026-03-09T16:19:22.642 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:22 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/1932825985' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:22.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:22 vm03.local ceph-mon[133973]: osdmap e77: 6 total, 6 up, 6 in 2026-03-09T16:19:22.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:22 vm03.local ceph-mon[133973]: from='client.? 
192.168.123.103:0/2644543110' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:19:22.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:22 vm03.local ceph-mon[133973]: from='client.44215 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:19:22.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:22 vm03.local ceph-mon[133973]: pgmap v108: 65 pgs: 3 peering, 14 active+undersized, 11 active+undersized+degraded, 37 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 8 op/s; 35/264 objects degraded (13.258%) 2026-03-09T16:19:22.643 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:22 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/3498111072' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:19:22.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:22 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/1932825985' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:22.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:22 vm05.local ceph-mon[108543]: osdmap e77: 6 total, 6 up, 6 in 2026-03-09T16:19:22.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:22 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/2644543110' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:19:22.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:22 vm05.local ceph-mon[108543]: from='client.44215 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:19:22.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:22 vm05.local ceph-mon[108543]: pgmap v108: 65 pgs: 3 peering, 14 active+undersized, 11 active+undersized+degraded, 37 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 8 op/s; 35/264 objects degraded (13.258%) 2026-03-09T16:19:22.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:22 vm05.local ceph-mon[108543]: from='client.? 
192.168.123.103:0/3498111072' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T16:19:23.567 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:23 vm03.local ceph-mon[133973]: Health check update: Degraded data redundancy: 35/264 objects degraded (13.258%), 11 pgs degraded (PG_DEGRADED)
2026-03-09T16:19:23.567 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:23 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu'
2026-03-09T16:19:23.568 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:23 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu'
2026-03-09T16:19:23.568 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:23 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T16:19:23.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:23 vm05.local ceph-mon[108543]: Health check update: Degraded data redundancy: 35/264 objects degraded (13.258%), 11 pgs degraded (PG_DEGRADED)
2026-03-09T16:19:23.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:23 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu'
2026-03-09T16:19:23.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:23 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu'
2026-03-09T16:19:23.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:23 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T16:19:24.830 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:24 vm03.local ceph-mon[133973]: mds.?
[v2:192.168.123.103:6826/892320051,v1:192.168.123.103:6827/892320051] up:boot 2026-03-09T16:19:24.831 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:24 vm03.local ceph-mon[133973]: fsmap cephfs:1/1 {0=cephfs.vm03.kntrco=up:replay} 3 up:standby 2026-03-09T16:19:24.831 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kygyjl"}]: dispatch 2026-03-09T16:19:24.831 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:24 vm03.local ceph-mon[133973]: pgmap v109: 65 pgs: 3 peering, 4 active+undersized, 7 active+undersized+degraded, 51 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 6 op/s; 24/264 objects degraded (9.091%) 2026-03-09T16:19:24.831 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:24.831 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:19:24.831 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:24.831 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:25.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:24 vm05.local ceph-mon[108543]: mds.? [v2:192.168.123.103:6826/892320051,v1:192.168.123.103:6827/892320051] up:boot 2026-03-09T16:19:25.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:24 vm05.local ceph-mon[108543]: fsmap cephfs:1/1 {0=cephfs.vm03.kntrco=up:replay} 3 up:standby 2026-03-09T16:19:25.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kygyjl"}]: dispatch 2026-03-09T16:19:25.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:24 vm05.local ceph-mon[108543]: pgmap v109: 65 pgs: 3 peering, 4 active+undersized, 7 active+undersized+degraded, 51 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 6 op/s; 24/264 objects degraded (9.091%) 2026-03-09T16:19:25.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:25.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:19:25.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:25.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:26.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:26 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:26.276 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:26 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:26.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:26 vm05.local ceph-mon[108543]: mds.? [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] up:reconnect 2026-03-09T16:19:26.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:26 vm05.local ceph-mon[108543]: fsmap cephfs:1/1 {0=cephfs.vm03.kntrco=up:reconnect} 3 up:standby 2026-03-09T16:19:26.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:26 vm05.local ceph-mon[108543]: reconnect by client.24311 192.168.144.1:0/4162745798 after 0.001 2026-03-09T16:19:26.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:26 vm05.local ceph-mon[108543]: reconnect by client.14522 192.168.144.1:0/2261305767 after 0.00600001 2026-03-09T16:19:26.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:26 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:26.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:26 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:26.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:26 vm03.local ceph-mon[133973]: mds.? [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] up:reconnect 2026-03-09T16:19:26.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:26 vm03.local ceph-mon[133973]: fsmap cephfs:1/1 {0=cephfs.vm03.kntrco=up:reconnect} 3 up:standby 2026-03-09T16:19:26.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:26 vm03.local ceph-mon[133973]: reconnect by client.24311 192.168.144.1:0/4162745798 after 0.001 2026-03-09T16:19:26.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:26 vm03.local ceph-mon[133973]: reconnect by client.14522 192.168.144.1:0/2261305767 after 0.00600001 2026-03-09T16:19:27.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 vm03.local ceph-mon[133973]: pgmap v110: 65 pgs: 65 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 6 op/s 2026-03-09T16:19:27.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:27.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:27.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:27.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:19:27.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:27.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:27.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 
vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:27.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:27.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:27.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:27.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 vm03.local ceph-mon[133973]: mds.? [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] up:rejoin 2026-03-09T16:19:27.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 vm03.local ceph-mon[133973]: fsmap cephfs:1/1 {0=cephfs.vm03.kntrco=up:rejoin} 3 up:standby 2026-03-09T16:19:27.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 vm03.local ceph-mon[133973]: daemon mds.cephfs.vm03.kntrco is now active in filesystem cephfs as rank 0 2026-03-09T16:19:27.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:27 vm03.local ceph-mon[133973]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 24/264 objects degraded (9.091%), 7 pgs degraded) 2026-03-09T16:19:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: pgmap v110: 65 pgs: 65 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 6 op/s 2026-03-09T16:19:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:19:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:27.526 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: mds.? [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] up:rejoin 2026-03-09T16:19:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: fsmap cephfs:1/1 {0=cephfs.vm03.kntrco=up:rejoin} 3 up:standby 2026-03-09T16:19:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: daemon mds.cephfs.vm03.kntrco is now active in filesystem cephfs as rank 0 2026-03-09T16:19:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:27 vm05.local ceph-mon[108543]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 24/264 objects degraded (9.091%), 7 pgs degraded) 2026-03-09T16:19:28.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:28 vm03.local ceph-mon[133973]: Upgrade: Waiting for mds.cephfs.vm03.kntrco to be up:active (currently up:reconnect) 2026-03-09T16:19:28.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:28 vm03.local ceph-mon[133973]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T16:19:28.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:28 vm03.local ceph-mon[133973]: mds.? [v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] up:active 2026-03-09T16:19:28.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:28 vm03.local ceph-mon[133973]: fsmap cephfs:1 {0=cephfs.vm03.kntrco=up:active} 3 up:standby 2026-03-09T16:19:28.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:28 vm05.local ceph-mon[108543]: Upgrade: Waiting for mds.cephfs.vm03.kntrco to be up:active (currently up:reconnect) 2026-03-09T16:19:28.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:28 vm05.local ceph-mon[108543]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T16:19:28.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:28 vm05.local ceph-mon[108543]: mds.? 
[v2:192.168.123.103:6828/3419491835,v1:192.168.123.103:6829/3419491835] up:active 2026-03-09T16:19:28.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:28 vm05.local ceph-mon[108543]: fsmap cephfs:1 {0=cephfs.vm03.kntrco=up:active} 3 up:standby 2026-03-09T16:19:29.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:29 vm03.local ceph-mon[133973]: pgmap v111: 65 pgs: 65 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 6.8 MiB/s rd, 1 op/s 2026-03-09T16:19:29.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:29 vm05.local ceph-mon[108543]: pgmap v111: 65 pgs: 65 active+clean; 254 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 6.8 MiB/s rd, 1 op/s 2026-03-09T16:19:31.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:30 vm05.local ceph-mon[108543]: pgmap v112: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 5.6 MiB/s rd, 734 B/s wr, 5 op/s 2026-03-09T16:19:31.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:30 vm03.local ceph-mon[133973]: pgmap v112: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 5.6 MiB/s rd, 734 B/s wr, 5 op/s 2026-03-09T16:19:33.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:33 vm03.local ceph-mon[133973]: pgmap v113: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 5.1 MiB/s rd, 672 B/s wr, 5 op/s 2026-03-09T16:19:33.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:33 vm05.local ceph-mon[108543]: pgmap v113: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 5.1 MiB/s rd, 672 B/s wr, 5 op/s 2026-03-09T16:19:35.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:35 vm03.local ceph-mon[133973]: pgmap v114: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 4.6 MiB/s rd, 4.8 KiB/s wr, 6 op/s 2026-03-09T16:19:35.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:35 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:35.471 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:35 vm05.local ceph-mon[108543]: pgmap v114: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 4.6 MiB/s rd, 4.8 KiB/s wr, 6 op/s 2026-03-09T16:19:35.471 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:35 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:37.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:37 vm03.local ceph-mon[133973]: pgmap v115: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 4.7 MiB/s rd, 4.8 KiB/s wr, 6 op/s 2026-03-09T16:19:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:37 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:37 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:37 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:19:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:37 vm03.local ceph-mon[133973]: from='mgr.34104 
192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:37 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:37 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:37 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:37 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:37 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:37 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:37 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kntrco", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:19:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:37 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:37.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:37 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03[133969]: 2026-03-09T16:19:37.067+0000 7f80f28e8640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T16:19:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:37 vm05.local ceph-mon[108543]: pgmap v115: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 4.7 MiB/s rd, 4.8 KiB/s wr, 6 op/s 2026-03-09T16:19:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:37 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:37 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:37 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:19:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:37 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:37 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": 
"config dump", "format": "json"}]: dispatch 2026-03-09T16:19:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:37 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:37 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:37 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:37 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:37 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:37 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.kntrco", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:19:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:37 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:38.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:38 vm03.local ceph-mon[133973]: Upgrade: Updating mds.cephfs.vm03.kntrco 2026-03-09T16:19:38.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:38 vm03.local ceph-mon[133973]: Deploying daemon mds.cephfs.vm03.kntrco on vm03 2026-03-09T16:19:38.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:38 vm03.local ceph-mon[133973]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T16:19:38.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:38 vm03.local ceph-mon[133973]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T16:19:38.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:38 vm03.local ceph-mon[133973]: osdmap e78: 6 total, 6 up, 6 in 2026-03-09T16:19:38.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:38 vm03.local ceph-mon[133973]: Standby daemon mds.cephfs.vm05.sqhria assigned to filesystem cephfs as rank 0 2026-03-09T16:19:38.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:38 vm03.local ceph-mon[133973]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T16:19:38.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:38 vm03.local ceph-mon[133973]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T16:19:38.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:38 vm03.local ceph-mon[133973]: fsmap cephfs:1/1 {0=cephfs.vm05.sqhria=up:replay} 2 up:standby 2026-03-09T16:19:38.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:38 vm05.local ceph-mon[108543]: Upgrade: Updating mds.cephfs.vm03.kntrco 2026-03-09T16:19:38.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:38 vm05.local ceph-mon[108543]: Deploying daemon mds.cephfs.vm03.kntrco on vm03 2026-03-09T16:19:38.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 
16:19:38 vm05.local ceph-mon[108543]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T16:19:38.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:38 vm05.local ceph-mon[108543]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T16:19:38.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:38 vm05.local ceph-mon[108543]: osdmap e78: 6 total, 6 up, 6 in 2026-03-09T16:19:38.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:38 vm05.local ceph-mon[108543]: Standby daemon mds.cephfs.vm05.sqhria assigned to filesystem cephfs as rank 0 2026-03-09T16:19:38.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:38 vm05.local ceph-mon[108543]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T16:19:38.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:38 vm05.local ceph-mon[108543]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T16:19:38.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:38 vm05.local ceph-mon[108543]: fsmap cephfs:1/1 {0=cephfs.vm05.sqhria=up:replay} 2 up:standby 2026-03-09T16:19:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:39 vm03.local ceph-mon[133973]: pgmap v117: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 4.3 KiB/s rd, 5.8 KiB/s wr, 7 op/s 2026-03-09T16:19:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:39 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:39 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:19:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:39 vm05.local ceph-mon[108543]: pgmap v117: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 4.3 KiB/s rd, 5.8 KiB/s wr, 7 op/s 2026-03-09T16:19:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:39 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:39 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:19:40.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:40 vm05.local ceph-mon[108543]: pgmap v118: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 5.1 KiB/s wr, 8 op/s 2026-03-09T16:19:40.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:40 vm03.local ceph-mon[133973]: pgmap v118: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 5.1 KiB/s wr, 8 op/s 2026-03-09T16:19:42.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:42 vm05.local ceph-mon[108543]: mds.? 
[v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] up:reconnect 2026-03-09T16:19:42.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:42 vm05.local ceph-mon[108543]: fsmap cephfs:1/1 {0=cephfs.vm05.sqhria=up:reconnect} 2 up:standby 2026-03-09T16:19:42.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:42 vm05.local ceph-mon[108543]: reconnect by client.24311 192.168.144.1:0/4162745798 after 0 2026-03-09T16:19:42.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:42 vm05.local ceph-mon[108543]: reconnect by client.14522 192.168.144.1:0/2261305767 after 0.002 2026-03-09T16:19:42.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:42 vm05.local ceph-mon[108543]: pgmap v119: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 5.1 KiB/s wr, 8 op/s 2026-03-09T16:19:42.829 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:42 vm03.local ceph-mon[133973]: mds.? [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] up:reconnect 2026-03-09T16:19:42.829 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:42 vm03.local ceph-mon[133973]: fsmap cephfs:1/1 {0=cephfs.vm05.sqhria=up:reconnect} 2 up:standby 2026-03-09T16:19:42.829 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:42 vm03.local ceph-mon[133973]: reconnect by client.24311 192.168.144.1:0/4162745798 after 0 2026-03-09T16:19:42.829 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:42 vm03.local ceph-mon[133973]: reconnect by client.14522 192.168.144.1:0/2261305767 after 0.002 2026-03-09T16:19:42.829 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:42 vm03.local ceph-mon[133973]: pgmap v119: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 5.1 KiB/s wr, 8 op/s 2026-03-09T16:19:43.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:43 vm05.local ceph-mon[108543]: mds.? [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] up:rejoin 2026-03-09T16:19:43.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:43 vm05.local ceph-mon[108543]: fsmap cephfs:1/1 {0=cephfs.vm05.sqhria=up:rejoin} 2 up:standby 2026-03-09T16:19:43.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:43 vm05.local ceph-mon[108543]: daemon mds.cephfs.vm05.sqhria is now active in filesystem cephfs as rank 0 2026-03-09T16:19:43.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:43 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:43.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:43 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:43.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:43 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:43.776 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:43 vm03.local ceph-mon[133973]: mds.? 
[v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] up:rejoin 2026-03-09T16:19:43.776 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:43 vm03.local ceph-mon[133973]: fsmap cephfs:1/1 {0=cephfs.vm05.sqhria=up:rejoin} 2 up:standby 2026-03-09T16:19:43.776 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:43 vm03.local ceph-mon[133973]: daemon mds.cephfs.vm05.sqhria is now active in filesystem cephfs as rank 0 2026-03-09T16:19:43.776 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:43 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:43.776 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:43 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:43.776 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:43 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:44.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:44 vm05.local ceph-mon[108543]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T16:19:44.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:44 vm05.local ceph-mon[108543]: mds.? [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] up:active 2026-03-09T16:19:44.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:44 vm05.local ceph-mon[108543]: mds.? [v2:192.168.123.103:6828/2230073446,v1:192.168.123.103:6829/2230073446] up:boot 2026-03-09T16:19:44.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:44 vm05.local ceph-mon[108543]: fsmap cephfs:1 {0=cephfs.vm05.sqhria=up:active} 3 up:standby 2026-03-09T16:19:44.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:44 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kntrco"}]: dispatch 2026-03-09T16:19:44.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:44 vm05.local ceph-mon[108543]: pgmap v120: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 102 B/s wr, 7 op/s 2026-03-09T16:19:44.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:44 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:44.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:44 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:44.818 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:44 vm03.local ceph-mon[133973]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T16:19:44.818 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:44 vm03.local ceph-mon[133973]: mds.? [v2:192.168.123.105:6826/1138709798,v1:192.168.123.105:6827/1138709798] up:active 2026-03-09T16:19:44.818 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:44 vm03.local ceph-mon[133973]: mds.? 
[v2:192.168.123.103:6828/2230073446,v1:192.168.123.103:6829/2230073446] up:boot 2026-03-09T16:19:44.818 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:44 vm03.local ceph-mon[133973]: fsmap cephfs:1 {0=cephfs.vm05.sqhria=up:active} 3 up:standby 2026-03-09T16:19:44.818 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:44 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.kntrco"}]: dispatch 2026-03-09T16:19:44.818 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:44 vm03.local ceph-mon[133973]: pgmap v120: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 102 B/s wr, 7 op/s 2026-03-09T16:19:44.818 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:44 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:44.818 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:44 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:46.115 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:45 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:46.115 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:45 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:46.115 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:45 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:46.115 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:45 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:46.115 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:45 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:46.115 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:45 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:19:46.115 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:45 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:46.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:45 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:46.152 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:45 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:46.152 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:45 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:46.152 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:45 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:46.152 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:45 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:46.152 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:45 vm03.local ceph-mon[133973]: from='mgr.34104 
192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:19:46.152 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:45 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:46.823 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:46 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:46.823 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:46 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:46.823 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:46 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:46.823 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:46 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:46.823 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:46 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:46.823 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:46 vm05.local ceph-mon[108543]: pgmap v121: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 204 B/s wr, 11 op/s 2026-03-09T16:19:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:46 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:46 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:46 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:46 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:46 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:47.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:46 vm03.local ceph-mon[133973]: pgmap v121: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 204 B/s wr, 11 op/s 2026-03-09T16:19:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:47 vm03.local ceph-mon[133973]: Upgrade: Updating mds.cephfs.vm05.jgzfvu 2026-03-09T16:19:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:47 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:47 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": 
"mds.cephfs.vm05.jgzfvu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:19:48.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:47 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:48.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:47 vm03.local ceph-mon[133973]: Deploying daemon mds.cephfs.vm05.jgzfvu on vm05 2026-03-09T16:19:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:47 vm05.local ceph-mon[108543]: Upgrade: Updating mds.cephfs.vm05.jgzfvu 2026-03-09T16:19:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:47 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:47 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.jgzfvu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:19:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:47 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:48.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:47 vm05.local ceph-mon[108543]: Deploying daemon mds.cephfs.vm05.jgzfvu on vm05 2026-03-09T16:19:49.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:48 vm05.local ceph-mon[108543]: osdmap e79: 6 total, 6 up, 6 in 2026-03-09T16:19:49.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:48 vm05.local ceph-mon[108543]: fsmap cephfs:1 {0=cephfs.vm05.sqhria=up:active} 2 up:standby 2026-03-09T16:19:49.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:48 vm05.local ceph-mon[108543]: pgmap v123: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 204 B/s wr, 11 op/s 2026-03-09T16:19:49.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:48 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:49.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:48 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:49.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:48 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:49.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:48 vm03.local ceph-mon[133973]: osdmap e79: 6 total, 6 up, 6 in 2026-03-09T16:19:49.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:48 vm03.local ceph-mon[133973]: fsmap cephfs:1 {0=cephfs.vm05.sqhria=up:active} 2 up:standby 2026-03-09T16:19:49.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:48 vm03.local ceph-mon[133973]: pgmap v123: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 204 B/s wr, 11 op/s 2026-03-09T16:19:49.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:48 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:49.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:48 
vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:49.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:48 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:50.925 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:50 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:50.925 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:50 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:50.925 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:50 vm05.local ceph-mon[108543]: mds.? [v2:192.168.123.105:6824/3112850580,v1:192.168.123.105:6825/3112850580] up:boot 2026-03-09T16:19:50.925 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:50 vm05.local ceph-mon[108543]: fsmap cephfs:1 {0=cephfs.vm05.sqhria=up:active} 3 up:standby 2026-03-09T16:19:50.925 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:50 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.jgzfvu"}]: dispatch 2026-03-09T16:19:50.925 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:50 vm05.local ceph-mon[108543]: pgmap v124: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 6.3 MiB/s rd, 5.0 KiB/s wr, 7 op/s 2026-03-09T16:19:50.925 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:50 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:50.925 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:50 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:51.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:50 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:51.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:50 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:51.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:50 vm03.local ceph-mon[133973]: mds.? 
[v2:192.168.123.105:6824/3112850580,v1:192.168.123.105:6825/3112850580] up:boot 2026-03-09T16:19:51.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:50 vm03.local ceph-mon[133973]: fsmap cephfs:1 {0=cephfs.vm05.sqhria=up:active} 3 up:standby 2026-03-09T16:19:51.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:50 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.jgzfvu"}]: dispatch 2026-03-09T16:19:51.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:50 vm03.local ceph-mon[133973]: pgmap v124: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 6.3 MiB/s rd, 5.0 KiB/s wr, 7 op/s 2026-03-09T16:19:51.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:50 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:51.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:50 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.071+0000 7fd47e326640 1 -- 192.168.123.103:0/3857565879 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd478072af0 msgr2=0x7fd47810ba70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.071+0000 7fd47e326640 1 --2- 192.168.123.103:0/3857565879 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd478072af0 0x7fd47810ba70 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fd468007920 tx=0x7fd468030040 comp rx=0 tx=0).stop 2026-03-09T16:19:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.071+0000 7fd47e326640 1 -- 192.168.123.103:0/3857565879 shutdown_connections 2026-03-09T16:19:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.071+0000 7fd47e326640 1 --2- 192.168.123.103:0/3857565879 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd478072af0 0x7fd47810ba70 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.071+0000 7fd47e326640 1 --2- 192.168.123.103:0/3857565879 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd478072140 0x7fd478072520 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.071+0000 7fd47e326640 1 -- 192.168.123.103:0/3857565879 >> 192.168.123.103:0/3857565879 conn(0x7fd47806c7e0 msgr2=0x7fd47806cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.071+0000 7fd47e326640 1 -- 192.168.123.103:0/3857565879 shutdown_connections 2026-03-09T16:19:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.072+0000 7fd47e326640 1 -- 192.168.123.103:0/3857565879 wait complete. 
2026-03-09T16:19:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.072+0000 7fd47e326640 1 Processor -- start 2026-03-09T16:19:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.072+0000 7fd47e326640 1 -- start start 2026-03-09T16:19:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.072+0000 7fd47e326640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd478072140 0x7fd478133350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.072+0000 7fd47e326640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd478072af0 0x7fd478133890 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.072+0000 7fd47e326640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd478133f20 con 0x7fd478072af0 2026-03-09T16:19:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.072+0000 7fd47e326640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd478137d20 con 0x7fd478072140 2026-03-09T16:19:52.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.073+0000 7fd4777fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd478072af0 0x7fd478133890 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:52.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.073+0000 7fd4777fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd478072af0 0x7fd478133890 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44916/0 (socket says 192.168.123.103:44916) 2026-03-09T16:19:52.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.073+0000 7fd4777fe640 1 -- 192.168.123.103:0/3438388270 learned_addr learned my addr 192.168.123.103:0/3438388270 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:19:52.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.073+0000 7fd477fff640 1 --2- 192.168.123.103:0/3438388270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd478072140 0x7fd478133350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:52.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.073+0000 7fd4777fe640 1 -- 192.168.123.103:0/3438388270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd478072140 msgr2=0x7fd478133350 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.073+0000 7fd4777fe640 1 --2- 192.168.123.103:0/3438388270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd478072140 0x7fd478133350 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.074+0000 7fd4777fe640 1 -- 192.168.123.103:0/3438388270 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fd4680075d0 con 0x7fd478072af0 2026-03-09T16:19:52.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.074+0000 7fd4777fe640 1 --2- 192.168.123.103:0/3438388270 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd478072af0 0x7fd478133890 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7fd468033040 tx=0x7fd468002c80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:52.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.074+0000 7fd4757fa640 1 -- 192.168.123.103:0/3438388270 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd468004300 con 0x7fd478072af0 2026-03-09T16:19:52.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.074+0000 7fd47e326640 1 -- 192.168.123.103:0/3438388270 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd478137f40 con 0x7fd478072af0 2026-03-09T16:19:52.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.074+0000 7fd47e326640 1 -- 192.168.123.103:0/3438388270 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd478138430 con 0x7fd478072af0 2026-03-09T16:19:52.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.075+0000 7fd4757fa640 1 -- 192.168.123.103:0/3438388270 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd4680045d0 con 0x7fd478072af0 2026-03-09T16:19:52.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.075+0000 7fd4757fa640 1 -- 192.168.123.103:0/3438388270 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd468041900 con 0x7fd478072af0 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03[133969]: 2026-03-09T16:19:51.824+0000 7f80f28e8640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.sqhria", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T16:19:52.076 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T16:19:52.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: osdmap e80: 6 total, 6 up, 6 in 2026-03-09T16:19:52.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: Standby daemon mds.cephfs.vm03.kntrco assigned to filesystem cephfs as rank 0 2026-03-09T16:19:52.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T16:19:52.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T16:19:52.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:51 vm03.local ceph-mon[133973]: fsmap cephfs:1/1 {0=cephfs.vm03.kntrco=up:replay} 2 up:standby 2026-03-09T16:19:52.077 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.076+0000 7fd47e326640 1 -- 192.168.123.103:0/3438388270 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd478108570 con 0x7fd478072af0 2026-03-09T16:19:52.083 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.081+0000 7fd4757fa640 1 -- 192.168.123.103:0/3438388270 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd468039680 con 0x7fd478072af0 2026-03-09T16:19:52.083 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.082+0000 7fd4757fa640 1 --2- 192.168.123.103:0/3438388270 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fd45c0779b0 0x7fd45c079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.083 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.082+0000 7fd4757fa640 1 -- 192.168.123.103:0/3438388270 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fd4680be6c0 con 0x7fd478072af0 2026-03-09T16:19:52.083 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.082+0000 7fd4757fa640 1 -- 192.168.123.103:0/3438388270 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd4680beb40 con 0x7fd478072af0 2026-03-09T16:19:52.083 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.082+0000 7fd477fff640 1 --2- 192.168.123.103:0/3438388270 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fd45c0779b0 0x7fd45c079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:52.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.113+0000 7fd477fff640 1 --2- 192.168.123.103:0/3438388270 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fd45c0779b0 0x7fd45c079e70 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fd47000b460 tx=0x7fd47000d040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:52.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.192+0000 7fd47e326640 1 -- 192.168.123.103:0/3438388270 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd478079550 con 0x7fd45c0779b0 2026-03-09T16:19:52.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.197+0000 7fd4757fa640 1 -- 192.168.123.103:0/3438388270 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7fd478079550 con 0x7fd45c0779b0 2026-03-09T16:19:52.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.199+0000 7fd456ffd640 1 -- 192.168.123.103:0/3438388270 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fd45c0779b0 msgr2=0x7fd45c079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.200+0000 7fd456ffd640 1 --2- 192.168.123.103:0/3438388270 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fd45c0779b0 0x7fd45c079e70 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fd47000b460 tx=0x7fd47000d040 comp rx=0 tx=0).stop 2026-03-09T16:19:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.200+0000 7fd456ffd640 1 -- 192.168.123.103:0/3438388270 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd478072af0 msgr2=0x7fd478133890 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.200+0000 7fd456ffd640 1 --2- 192.168.123.103:0/3438388270 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd478072af0 0x7fd478133890 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7fd468033040 tx=0x7fd468002c80 comp rx=0 tx=0).stop 2026-03-09T16:19:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.200+0000 7fd456ffd640 1 -- 192.168.123.103:0/3438388270 shutdown_connections 2026-03-09T16:19:52.201 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.200+0000 7fd456ffd640 1 --2- 192.168.123.103:0/3438388270 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fd45c0779b0 0x7fd45c079e70 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.200+0000 7fd456ffd640 1 --2- 192.168.123.103:0/3438388270 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd478072af0 0x7fd478133890 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.200+0000 7fd456ffd640 1 --2- 192.168.123.103:0/3438388270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd478072140 0x7fd478133350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.200+0000 7fd456ffd640 1 -- 192.168.123.103:0/3438388270 >> 192.168.123.103:0/3438388270 conn(0x7fd47806c7e0 msgr2=0x7fd47806fab0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.200+0000 7fd456ffd640 1 -- 192.168.123.103:0/3438388270 shutdown_connections 2026-03-09T16:19:52.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.200+0000 7fd456ffd640 1 -- 192.168.123.103:0/3438388270 wait complete. 2026-03-09T16:19:52.212 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:19:52.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.268+0000 7f1c9b997640 1 -- 192.168.123.103:0/1887129502 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c94100310 msgr2=0x7f1c941006f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.268+0000 7f1c9b997640 1 --2- 192.168.123.103:0/1887129502 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c94100310 0x7f1c941006f0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f1c840099b0 tx=0x7f1c8402f240 comp rx=0 tx=0).stop 2026-03-09T16:19:52.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.269+0000 7f1c9b997640 1 -- 192.168.123.103:0/1887129502 shutdown_connections 2026-03-09T16:19:52.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.269+0000 7f1c9b997640 1 --2- 192.168.123.103:0/1887129502 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1c94100cc0 0x7f1c94104190 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.269+0000 7f1c9b997640 1 --2- 192.168.123.103:0/1887129502 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c94100310 0x7f1c941006f0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.269+0000 7f1c9b997640 1 -- 192.168.123.103:0/1887129502 >> 192.168.123.103:0/1887129502 conn(0x7f1c940fa4a0 msgr2=0x7f1c940fc8c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:52.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.269+0000 7f1c9b997640 1 -- 192.168.123.103:0/1887129502 shutdown_connections 2026-03-09T16:19:52.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.269+0000 
7f1c9b997640 1 -- 192.168.123.103:0/1887129502 wait complete. 2026-03-09T16:19:52.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.270+0000 7f1c9b997640 1 Processor -- start 2026-03-09T16:19:52.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.270+0000 7f1c9b997640 1 -- start start 2026-03-09T16:19:52.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.271+0000 7f1c9b997640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1c94100310 0x7f1c94198340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.271+0000 7f1c9b997640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c94100cc0 0x7f1c94198880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.271+0000 7f1c9970c640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1c94100310 0x7f1c94198340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:52.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.271+0000 7f1c98f0b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c94100cc0 0x7f1c94198880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:52.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.271+0000 7f1c9970c640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1c94100310 0x7f1c94198340 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:44288/0 (socket says 192.168.123.103:44288) 2026-03-09T16:19:52.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.271+0000 7f1c98f0b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c94100cc0 0x7f1c94198880 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44926/0 (socket says 192.168.123.103:44926) 2026-03-09T16:19:52.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.271+0000 7f1c9970c640 1 -- 192.168.123.103:0/1774700637 learned_addr learned my addr 192.168.123.103:0/1774700637 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:19:52.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.271+0000 7f1c9b997640 1 -- 192.168.123.103:0/1774700637 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1c94198f10 con 0x7f1c94100cc0 2026-03-09T16:19:52.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.271+0000 7f1c9b997640 1 -- 192.168.123.103:0/1774700637 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1c9419cc80 con 0x7f1c94100310 2026-03-09T16:19:52.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.272+0000 7f1c9970c640 1 -- 192.168.123.103:0/1774700637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c94100cc0 msgr2=0x7f1c94198880 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.273 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.272+0000 7f1c9970c640 1 --2- 192.168.123.103:0/1774700637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c94100cc0 0x7f1c94198880 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.272+0000 7f1c9970c640 1 -- 192.168.123.103:0/1774700637 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1c84009660 con 0x7f1c94100310 2026-03-09T16:19:52.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.272+0000 7f1c9970c640 1 --2- 192.168.123.103:0/1774700637 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1c94100310 0x7f1c94198340 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f1c84009ae0 tx=0x7f1c84031cf0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:52.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.273+0000 7f1c827fc640 1 -- 192.168.123.103:0/1774700637 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1c8403d070 con 0x7f1c94100310 2026-03-09T16:19:52.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.273+0000 7f1c9b997640 1 -- 192.168.123.103:0/1774700637 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1c9419cf00 con 0x7f1c94100310 2026-03-09T16:19:52.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.273+0000 7f1c9b997640 1 -- 192.168.123.103:0/1774700637 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1c9419d470 con 0x7f1c94100310 2026-03-09T16:19:52.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.274+0000 7f1c827fc640 1 -- 192.168.123.103:0/1774700637 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f1c840043f0 con 0x7f1c94100310 2026-03-09T16:19:52.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.274+0000 7f1c827fc640 1 -- 192.168.123.103:0/1774700637 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1c84031280 con 0x7f1c94100310 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config 
dump", "format": "json"}]: dispatch 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.sqhria", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: osdmap e80: 6 total, 6 up, 6 in 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: Standby daemon mds.cephfs.vm03.kntrco assigned to filesystem cephfs as rank 0 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T16:19:52.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:51 vm05.local ceph-mon[108543]: fsmap cephfs:1/1 {0=cephfs.vm03.kntrco=up:replay} 2 up:standby 2026-03-09T16:19:52.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.275+0000 7f1c827fc640 1 -- 192.168.123.103:0/1774700637 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1c840388c0 con 0x7f1c94100310 2026-03-09T16:19:52.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.275+0000 7f1c827fc640 1 --2- 192.168.123.103:0/1774700637 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1c6c0778e0 0x7f1c6c079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.275+0000 
7f1c827fc640 1 -- 192.168.123.103:0/1774700637 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f1c840be910 con 0x7f1c94100310 2026-03-09T16:19:52.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.276+0000 7f1c98f0b640 1 --2- 192.168.123.103:0/1774700637 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1c6c0778e0 0x7f1c6c079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:52.277 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.276+0000 7f1c9b997640 1 -- 192.168.123.103:0/1774700637 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1c940fea90 con 0x7f1c94100310 2026-03-09T16:19:52.277 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.277+0000 7f1c98f0b640 1 --2- 192.168.123.103:0/1774700637 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1c6c0778e0 0x7f1c6c079da0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f1c941998f0 tx=0x7f1c88009210 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:52.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.280+0000 7f1c827fc640 1 -- 192.168.123.103:0/1774700637 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1c84086f40 con 0x7f1c94100310 2026-03-09T16:19:52.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.400+0000 7f1c9b997640 1 -- 192.168.123.103:0/1774700637 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1c9410d9b0 con 0x7f1c6c0778e0 2026-03-09T16:19:52.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.401+0000 7f1c827fc640 1 -- 192.168.123.103:0/1774700637 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f1c9410d9b0 con 0x7f1c6c0778e0 2026-03-09T16:19:52.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.408+0000 7f1c63fff640 1 -- 192.168.123.103:0/1774700637 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1c6c0778e0 msgr2=0x7f1c6c079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.408+0000 7f1c63fff640 1 --2- 192.168.123.103:0/1774700637 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1c6c0778e0 0x7f1c6c079da0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f1c941998f0 tx=0x7f1c88009210 comp rx=0 tx=0).stop 2026-03-09T16:19:52.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.408+0000 7f1c63fff640 1 -- 192.168.123.103:0/1774700637 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1c94100310 msgr2=0x7f1c94198340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.408+0000 7f1c63fff640 1 --2- 192.168.123.103:0/1774700637 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1c94100310 0x7f1c94198340 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f1c84009ae0 tx=0x7f1c84031cf0 comp 
rx=0 tx=0).stop 2026-03-09T16:19:52.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.411+0000 7f1c63fff640 1 -- 192.168.123.103:0/1774700637 shutdown_connections 2026-03-09T16:19:52.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.411+0000 7f1c63fff640 1 --2- 192.168.123.103:0/1774700637 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1c6c0778e0 0x7f1c6c079da0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.411+0000 7f1c63fff640 1 --2- 192.168.123.103:0/1774700637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c94100cc0 0x7f1c94198880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.411+0000 7f1c63fff640 1 --2- 192.168.123.103:0/1774700637 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1c94100310 0x7f1c94198340 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.411+0000 7f1c63fff640 1 -- 192.168.123.103:0/1774700637 >> 192.168.123.103:0/1774700637 conn(0x7f1c940fa4a0 msgr2=0x7f1c940faa30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:52.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.411+0000 7f1c63fff640 1 -- 192.168.123.103:0/1774700637 shutdown_connections 2026-03-09T16:19:52.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.411+0000 7f1c63fff640 1 -- 192.168.123.103:0/1774700637 wait complete. 2026-03-09T16:19:52.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.482+0000 7f1b99e9a640 1 -- 192.168.123.103:0/1752664328 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b94072120 msgr2=0x7f1b94072500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.482+0000 7f1b99e9a640 1 --2- 192.168.123.103:0/1752664328 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b94072120 0x7f1b94072500 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f1b7c0099b0 tx=0x7f1b7c02f220 comp rx=0 tx=0).stop 2026-03-09T16:19:52.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.483+0000 7f1b99e9a640 1 -- 192.168.123.103:0/1752664328 shutdown_connections 2026-03-09T16:19:52.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.483+0000 7f1b99e9a640 1 --2- 192.168.123.103:0/1752664328 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b94072a40 0x7f1b9410ca90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.483+0000 7f1b99e9a640 1 --2- 192.168.123.103:0/1752664328 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b94072120 0x7f1b94072500 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.483+0000 7f1b99e9a640 1 -- 192.168.123.103:0/1752664328 >> 192.168.123.103:0/1752664328 conn(0x7f1b9406c7d0 msgr2=0x7f1b9406cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:52.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.483+0000 7f1b99e9a640 1 -- 
192.168.123.103:0/1752664328 shutdown_connections 2026-03-09T16:19:52.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.483+0000 7f1b99e9a640 1 -- 192.168.123.103:0/1752664328 wait complete. 2026-03-09T16:19:52.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.483+0000 7f1b99e9a640 1 Processor -- start 2026-03-09T16:19:52.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.483+0000 7f1b99e9a640 1 -- start start 2026-03-09T16:19:52.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.484+0000 7f1b99e9a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b94072a40 0x7f1b941a74a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.484+0000 7f1b99e9a640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b941a79e0 0x7f1b941abde0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.484+0000 7f1b99e9a640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1b941a7fe0 con 0x7f1b94072a40 2026-03-09T16:19:52.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.484+0000 7f1b99e9a640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1b941a8150 con 0x7f1b941a79e0 2026-03-09T16:19:52.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.484+0000 7f1b98e98640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b94072a40 0x7f1b941a74a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:52.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.484+0000 7f1b93fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b941a79e0 0x7f1b941abde0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:52.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.484+0000 7f1b98e98640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b94072a40 0x7f1b941a74a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44942/0 (socket says 192.168.123.103:44942) 2026-03-09T16:19:52.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.484+0000 7f1b98e98640 1 -- 192.168.123.103:0/2746580937 learned_addr learned my addr 192.168.123.103:0/2746580937 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:19:52.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.485+0000 7f1b93fff640 1 -- 192.168.123.103:0/2746580937 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b94072a40 msgr2=0x7f1b941a74a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.485+0000 7f1b93fff640 1 --2- 192.168.123.103:0/2746580937 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b94072a40 0x7f1b941a74a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.485 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.485+0000 7f1b93fff640 1 -- 192.168.123.103:0/2746580937 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1b7c009660 con 0x7f1b941a79e0 2026-03-09T16:19:52.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.485+0000 7f1b98e98640 1 --2- 192.168.123.103:0/2746580937 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b94072a40 0x7f1b941a74a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T16:19:52.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.485+0000 7f1b93fff640 1 --2- 192.168.123.103:0/2746580937 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b941a79e0 0x7f1b941abde0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f1b8c00efc0 tx=0x7f1b8c00c490 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:52.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.485+0000 7f1b91ffb640 1 -- 192.168.123.103:0/2746580937 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1b8c009280 con 0x7f1b941a79e0 2026-03-09T16:19:52.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.485+0000 7f1b99e9a640 1 -- 192.168.123.103:0/2746580937 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1b941ac3e0 con 0x7f1b941a79e0 2026-03-09T16:19:52.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.485+0000 7f1b91ffb640 1 -- 192.168.123.103:0/2746580937 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f1b8c00f040 con 0x7f1b941a79e0 2026-03-09T16:19:52.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.485+0000 7f1b91ffb640 1 -- 192.168.123.103:0/2746580937 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1b8c007500 con 0x7f1b941a79e0 2026-03-09T16:19:52.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.486+0000 7f1b99e9a640 1 -- 192.168.123.103:0/2746580937 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1b941ac930 con 0x7f1b941a79e0 2026-03-09T16:19:52.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.487+0000 7f1b91ffb640 1 -- 192.168.123.103:0/2746580937 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1b8c007660 con 0x7f1b941a79e0 2026-03-09T16:19:52.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.487+0000 7f1b91ffb640 1 --2- 192.168.123.103:0/2746580937 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1b700778e0 0x7f1b70079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.487+0000 7f1b91ffb640 1 -- 192.168.123.103:0/2746580937 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f1b8c0998f0 con 0x7f1b941a79e0 2026-03-09T16:19:52.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.488+0000 7f1b98e98640 1 --2- 192.168.123.103:0/2746580937 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1b700778e0 0x7f1b70079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 
comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:52.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.488+0000 7f1b99e9a640 1 -- 192.168.123.103:0/2746580937 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1b9410f9f0 con 0x7f1b941a79e0 2026-03-09T16:19:52.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.488+0000 7f1b98e98640 1 --2- 192.168.123.103:0/2746580937 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1b700778e0 0x7f1b70079da0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f1b7c005bb0 tx=0x7f1b7c03a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:52.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.492+0000 7f1b91ffb640 1 -- 192.168.123.103:0/2746580937 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1b8c061fd0 con 0x7f1b941a79e0 2026-03-09T16:19:52.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.594+0000 7f1b99e9a640 1 -- 192.168.123.103:0/2746580937 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f1b941a8800 con 0x7f1b700778e0 2026-03-09T16:19:52.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.601+0000 7f1b91ffb640 1 -- 192.168.123.103:0/2746580937 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3600 (secure 0 0 0) 0x7f1b941a8800 con 0x7f1b700778e0 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (4m) 8s ago 9m 25.4M - 0.25.0 c8568f914cd2 61c29cd7a09d 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (9m) 8s ago 9m 9.93M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 6555290daeb9 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (9m) 2s ago 9m 10.5M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 23ca0ac664fd 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (2m) 8s ago 9m 7830k - 19.2.3-678-ge911bdeb 654f31e6858e 03c86bd1bf32 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (2m) 2s ago 9m 7830k - 19.2.3-678-ge911bdeb 654f31e6858e 192f6dbc3145 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (3m) 8s ago 9m 91.0M - 10.4.0 c8b91775d855 6f4f55eef4bb 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (9s) 8s ago 7m 12.4M - 19.2.3-678-ge911bdeb 654f31e6858e feea99babe02 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (30s) 8s ago 7m 18.4M - 19.2.3-678-ge911bdeb 654f31e6858e a895bdb15107 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (3s) 2s ago 7m 16.7M - 19.2.3-678-ge911bdeb 654f31e6858e 44438ec5f534 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (7m) 2s ago 7m 98.1M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 e7155e6e0a47 
2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:8443,9283,8765 running (5m) 8s ago 10m 612M - 19.2.3-678-ge911bdeb 654f31e6858e f10e9f43c355 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (4m) 2s ago 9m 497M - 19.2.3-678-ge911bdeb 654f31e6858e 5276dc4902e9 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (3m) 8s ago 10m 63.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f90a2e8dc751 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (2m) 2s ago 9m 51.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b6d6af84a66d 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (4m) 8s ago 9m 9991k - 1.7.0 72c9c2088986 73da4350a8ed 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (4m) 2s ago 9m 9810k - 1.7.0 72c9c2088986 0be807a191b0 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (2m) 8s ago 8m 131M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fba6e40f54d4 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (2m) 8s ago 8m 111M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9e86c92fc9cd 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (115s) 8s ago 8m 98.1M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 2e666ccd4bf7 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (88s) 2s ago 8m 155M 4096M 19.2.3-678-ge911bdeb 654f31e6858e c052610d74d5 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (64s) 2s ago 8m 124M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4115e4720b89 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (38s) 2s ago 7m 124M 4096M 19.2.3-678-ge911bdeb 654f31e6858e d93569840b13 2026-03-09T16:19:52.602 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (4m) 8s ago 9m 51.8M - 2.51.0 1d3b7f56885b ce88dd379864 2026-03-09T16:19:52.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.604+0000 7f1b637fe640 1 -- 192.168.123.103:0/2746580937 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1b700778e0 msgr2=0x7f1b70079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.604+0000 7f1b637fe640 1 --2- 192.168.123.103:0/2746580937 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1b700778e0 0x7f1b70079da0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f1b7c005bb0 tx=0x7f1b7c03a040 comp rx=0 tx=0).stop 2026-03-09T16:19:52.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.604+0000 7f1b637fe640 1 -- 192.168.123.103:0/2746580937 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b941a79e0 msgr2=0x7f1b941abde0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.604+0000 7f1b637fe640 1 --2- 192.168.123.103:0/2746580937 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b941a79e0 0x7f1b941abde0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f1b8c00efc0 tx=0x7f1b8c00c490 comp rx=0 tx=0).stop 2026-03-09T16:19:52.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.605+0000 7f1b637fe640 1 -- 
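(Editor's note on the `ceph orch ps` listing above: most daemons already report 19.2.3-678-ge911bdeb, while mds.cephfs.vm05.sqhria and the two ceph-exporter daemons still report 18.2.7-1055-gab47f43c, i.e. the cluster is mid-way through the staggered upgrade. A small sketch of summarising that state programmatically, assuming `ceph orch ps --format json` emits one record per daemon with a "version" field; the helper is ours, not part of the suite.)

# Hedged sketch: count daemon versions from `ceph orch ps --format json`.
import json
import subprocess
from collections import Counter

def daemon_versions():
    out = subprocess.check_output(
        ["ceph", "orch", "ps", "--format", "json"], text=True)
    daemons = json.loads(out)
    # "version" is assumed to match the VERSION column of the table above.
    return Counter(d.get("version") or "unknown" for d in daemons)

if __name__ == "__main__":
    for version, count in daemon_versions().most_common():
        print(f"{count:3d} x {version}")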
192.168.123.103:0/2746580937 shutdown_connections 2026-03-09T16:19:52.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.605+0000 7f1b637fe640 1 --2- 192.168.123.103:0/2746580937 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f1b700778e0 0x7f1b70079da0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.605+0000 7f1b637fe640 1 --2- 192.168.123.103:0/2746580937 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b941a79e0 0x7f1b941abde0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.605+0000 7f1b637fe640 1 --2- 192.168.123.103:0/2746580937 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b94072a40 0x7f1b941a74a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.605+0000 7f1b637fe640 1 -- 192.168.123.103:0/2746580937 >> 192.168.123.103:0/2746580937 conn(0x7f1b9406c7d0 msgr2=0x7f1b94070eb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:52.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.606+0000 7f1b637fe640 1 -- 192.168.123.103:0/2746580937 shutdown_connections 2026-03-09T16:19:52.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.606+0000 7f1b637fe640 1 -- 192.168.123.103:0/2746580937 wait complete. 2026-03-09T16:19:52.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.673+0000 7f56f4885640 1 -- 192.168.123.103:0/2582004846 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56f0102800 msgr2=0x7f56f0102c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.673+0000 7f56f4885640 1 --2- 192.168.123.103:0/2582004846 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56f0102800 0x7f56f0102c60 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f56e00099b0 tx=0x7f56e002f240 comp rx=0 tx=0).stop 2026-03-09T16:19:52.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.674+0000 7f56f4885640 1 -- 192.168.123.103:0/2582004846 shutdown_connections 2026-03-09T16:19:52.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.674+0000 7f56f4885640 1 --2- 192.168.123.103:0/2582004846 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56f0102800 0x7f56f0102c60 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.674+0000 7f56f4885640 1 --2- 192.168.123.103:0/2582004846 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56f0108800 0x7f56f0108be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.674+0000 7f56f4885640 1 -- 192.168.123.103:0/2582004846 >> 192.168.123.103:0/2582004846 conn(0x7f56f00fe540 msgr2=0x7f56f0100960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:52.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.674+0000 7f56f4885640 1 -- 192.168.123.103:0/2582004846 shutdown_connections 2026-03-09T16:19:52.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.674+0000 
7f56f4885640 1 -- 192.168.123.103:0/2582004846 wait complete. 2026-03-09T16:19:52.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.674+0000 7f56f4885640 1 Processor -- start 2026-03-09T16:19:52.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.674+0000 7f56f4885640 1 -- start start 2026-03-09T16:19:52.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.675+0000 7f56f4885640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56f0102800 0x7f56f01a04f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.675+0000 7f56f4885640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56f0108800 0x7f56f01a0a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.675+0000 7f56f4885640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f56f01a1050 con 0x7f56f0108800 2026-03-09T16:19:52.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.675+0000 7f56f4885640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f56f019a5e0 con 0x7f56f0102800 2026-03-09T16:19:52.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.675+0000 7f56ef7fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56f0102800 0x7f56f01a04f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:52.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.675+0000 7f56ef7fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56f0102800 0x7f56f01a04f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:44326/0 (socket says 192.168.123.103:44326) 2026-03-09T16:19:52.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.675+0000 7f56ef7fe640 1 -- 192.168.123.103:0/407918214 learned_addr learned my addr 192.168.123.103:0/407918214 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:19:52.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.675+0000 7f56ef7fe640 1 -- 192.168.123.103:0/407918214 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56f0108800 msgr2=0x7f56f01a0a30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.675+0000 7f56ef7fe640 1 --2- 192.168.123.103:0/407918214 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56f0108800 0x7f56f01a0a30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.675+0000 7f56ef7fe640 1 -- 192.168.123.103:0/407918214 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f56e0009660 con 0x7f56f0102800 2026-03-09T16:19:52.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.676+0000 7f56ef7fe640 1 --2- 192.168.123.103:0/407918214 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56f0102800 0x7f56f01a04f0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto 
rx=0x7f56dc00e990 tx=0x7f56dc00ee60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:52.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.676+0000 7f56ecff9640 1 -- 192.168.123.103:0/407918214 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f56dc00cd30 con 0x7f56f0102800 2026-03-09T16:19:52.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.676+0000 7f56ecff9640 1 -- 192.168.123.103:0/407918214 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f56dc00ce90 con 0x7f56f0102800 2026-03-09T16:19:52.677 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.676+0000 7f56ecff9640 1 -- 192.168.123.103:0/407918214 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f56dc010640 con 0x7f56f0102800 2026-03-09T16:19:52.677 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.676+0000 7f56f4885640 1 -- 192.168.123.103:0/407918214 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f56f019a8c0 con 0x7f56f0102800 2026-03-09T16:19:52.677 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.677+0000 7f56f4885640 1 -- 192.168.123.103:0/407918214 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f56f019ade0 con 0x7f56f0102800 2026-03-09T16:19:52.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.677+0000 7f56f4885640 1 -- 192.168.123.103:0/407918214 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f56f0103f40 con 0x7f56f0102800 2026-03-09T16:19:52.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.679+0000 7f56ecff9640 1 -- 192.168.123.103:0/407918214 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f56dc0026e0 con 0x7f56f0102800 2026-03-09T16:19:52.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.679+0000 7f56ecff9640 1 --2- 192.168.123.103:0/407918214 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f56c80776d0 0x7f56c8079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.680+0000 7f56ecff9640 1 -- 192.168.123.103:0/407918214 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f56dc01d030 con 0x7f56f0102800 2026-03-09T16:19:52.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.680+0000 7f56eeffd640 1 --2- 192.168.123.103:0/407918214 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f56c80776d0 0x7f56c8079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:52.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.680+0000 7f56ecff9640 1 -- 192.168.123.103:0/407918214 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f56dc0625e0 con 0x7f56f0102800 2026-03-09T16:19:52.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.688+0000 7f56eeffd640 1 --2- 192.168.123.103:0/407918214 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f56c80776d0 0x7f56c8079b90 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f56f019ba50 tx=0x7f56e0038530 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:52.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.811+0000 7f56f4885640 1 -- 192.168.123.103:0/407918214 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f56f010fab0 con 0x7f56f0102800 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.812+0000 7f56ecff9640 1 -- 192.168.123.103:0/407918214 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f56dc061d30 con 0x7f56f0102800 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 3 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 13 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T16:19:52.813 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:19:52.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.815+0000 7f56d27fc640 1 -- 192.168.123.103:0/407918214 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f56c80776d0 msgr2=0x7f56c8079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.815+0000 7f56d27fc640 1 --2- 192.168.123.103:0/407918214 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f56c80776d0 0x7f56c8079b90 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f56f019ba50 tx=0x7f56e0038530 comp rx=0 tx=0).stop 2026-03-09T16:19:52.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.816+0000 7f56d27fc640 1 -- 192.168.123.103:0/407918214 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56f0102800 msgr2=0x7f56f01a04f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.816 
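(Editor's note on the `ceph versions` output above: all 13 daemons, 2 mon, 2 mgr, 6 osd and 3 mds, report 19.2.3-678-ge911bdeb. A sketch of asserting that convergence, assuming `ceph versions` prints the same JSON shape as shown, with an "overall" map of version string to daemon count; the helper is ours.)

# Hedged sketch: assert that `ceph versions` reports a single version overall.
import json
import subprocess

def overall_versions():
    out = subprocess.check_output(["ceph", "versions"], text=True)
    return json.loads(out)["overall"]

if __name__ == "__main__":
    overall = overall_versions()
    assert len(overall) == 1, f"mixed versions still present: {overall}"
    print("converged on:", next(iter(overall)))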
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.816+0000 7f56d27fc640 1 --2- 192.168.123.103:0/407918214 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56f0102800 0x7f56f01a04f0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f56dc00e990 tx=0x7f56dc00ee60 comp rx=0 tx=0).stop 2026-03-09T16:19:52.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.816+0000 7f56d27fc640 1 -- 192.168.123.103:0/407918214 shutdown_connections 2026-03-09T16:19:52.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.816+0000 7f56d27fc640 1 --2- 192.168.123.103:0/407918214 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f56c80776d0 0x7f56c8079b90 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.816+0000 7f56d27fc640 1 --2- 192.168.123.103:0/407918214 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56f0108800 0x7f56f01a0a30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.816+0000 7f56d27fc640 1 --2- 192.168.123.103:0/407918214 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56f0102800 0x7f56f01a04f0 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.816+0000 7f56d27fc640 1 -- 192.168.123.103:0/407918214 >> 192.168.123.103:0/407918214 conn(0x7f56f00fe540 msgr2=0x7f56f010c7c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:52.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.816+0000 7f56d27fc640 1 -- 192.168.123.103:0/407918214 shutdown_connections 2026-03-09T16:19:52.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.816+0000 7f56d27fc640 1 -- 192.168.123.103:0/407918214 wait complete. 
2026-03-09T16:19:52.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.873+0000 7fa84eb02640 1 -- 192.168.123.103:0/2599593024 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8480729d0 msgr2=0x7fa84810b9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.873+0000 7fa84eb02640 1 --2- 192.168.123.103:0/2599593024 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8480729d0 0x7fa84810b9f0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7fa84000b0a0 tx=0x7fa84002f550 comp rx=0 tx=0).stop 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.873+0000 7fa84eb02640 1 -- 192.168.123.103:0/2599593024 shutdown_connections 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.873+0000 7fa84eb02640 1 --2- 192.168.123.103:0/2599593024 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8480729d0 0x7fa84810b9f0 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.873+0000 7fa84eb02640 1 --2- 192.168.123.103:0/2599593024 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa8480720b0 0x7fa848072490 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.873+0000 7fa84eb02640 1 -- 192.168.123.103:0/2599593024 >> 192.168.123.103:0/2599593024 conn(0x7fa84806c7e0 msgr2=0x7fa84806cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.873+0000 7fa84eb02640 1 -- 192.168.123.103:0/2599593024 shutdown_connections 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.874+0000 7fa84eb02640 1 -- 192.168.123.103:0/2599593024 wait complete. 
2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.874+0000 7fa84eb02640 1 Processor -- start 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.874+0000 7fa84eb02640 1 -- start start 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.874+0000 7fa84eb02640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8481a7750 0x7fa8481a7b30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.874+0000 7fa84eb02640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa8481a8070 0x7fa8481aafe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.874+0000 7fa84eb02640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa8481ab5b0 con 0x7fa8481a7750 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.874+0000 7fa84eb02640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa8481ab6f0 con 0x7fa8481a8070 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.874+0000 7fa847fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa8481a8070 0x7fa8481aafe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.874+0000 7fa847fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa8481a8070 0x7fa8481aafe0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:44340/0 (socket says 192.168.123.103:44340) 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.874+0000 7fa847fff640 1 -- 192.168.123.103:0/2451139800 learned_addr learned my addr 192.168.123.103:0/2451139800 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:19:52.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.875+0000 7fa847fff640 1 -- 192.168.123.103:0/2451139800 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8481a7750 msgr2=0x7fa8481a7b30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:52.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.875+0000 7fa847fff640 1 --2- 192.168.123.103:0/2451139800 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8481a7750 0x7fa8481a7b30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:52.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.875+0000 7fa847fff640 1 -- 192.168.123.103:0/2451139800 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa840009d00 con 0x7fa8481a8070 2026-03-09T16:19:52.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.875+0000 7fa847fff640 1 --2- 192.168.123.103:0/2451139800 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa8481a8070 0x7fa8481aafe0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fa840009fd0 tx=0x7fa840009510 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:52.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.875+0000 7fa845ffb640 1 -- 192.168.123.103:0/2451139800 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa84003c070 con 0x7fa8481a8070 2026-03-09T16:19:52.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.875+0000 7fa84eb02640 1 -- 192.168.123.103:0/2451139800 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa8481ab970 con 0x7fa8481a8070 2026-03-09T16:19:52.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.875+0000 7fa84eb02640 1 -- 192.168.123.103:0/2451139800 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa8480773c0 con 0x7fa8481a8070 2026-03-09T16:19:52.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.876+0000 7fa845ffb640 1 -- 192.168.123.103:0/2451139800 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa840032070 con 0x7fa8481a8070 2026-03-09T16:19:52.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.876+0000 7fa845ffb640 1 -- 192.168.123.103:0/2451139800 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa840037640 con 0x7fa8481a8070 2026-03-09T16:19:52.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.876+0000 7fa84eb02640 1 -- 192.168.123.103:0/2451139800 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa848108570 con 0x7fa8481a8070 2026-03-09T16:19:52.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.877+0000 7fa845ffb640 1 -- 192.168.123.103:0/2451139800 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa840004410 con 0x7fa8481a8070 2026-03-09T16:19:52.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.878+0000 7fa845ffb640 1 --2- 192.168.123.103:0/2451139800 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa82c077980 0x7fa82c079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:52.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.878+0000 7fa845ffb640 1 -- 192.168.123.103:0/2451139800 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fa8400bee30 con 0x7fa8481a8070 2026-03-09T16:19:52.881 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.880+0000 7fa845ffb640 1 -- 192.168.123.103:0/2451139800 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa8400874e0 con 0x7fa8481a8070 2026-03-09T16:19:52.881 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.880+0000 7fa84c877640 1 --2- 192.168.123.103:0/2451139800 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa82c077980 0x7fa82c079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:52.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:52.909+0000 7fa84c877640 1 --2- 192.168.123.103:0/2451139800 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa82c077980 0x7fa82c079e40 
secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fa84806d990 tx=0x7fa830006cf0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:53.006 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.005+0000 7fa84eb02640 1 -- 192.168.123.103:0/2451139800 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fa8481a8ae0 con 0x7fa8481a8070 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.006+0000 7fa845ffb640 1 -- 192.168.123.103:0/2451139800 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 27 v27) v1 ==== 76+0+1787 (secure 0 0 0) 0x7fa840086c30 con 0x7fa8481a8070 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:e27 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:btime 2026-03-09T16:19:51:831005+0000 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:epoch 27 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T16:12:12.560035+0000 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T16:19:51.831000+0000 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-09T16:19:53.007 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 80 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:in 0 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:up {0=34272} 2026-03-09T16:19:53.008 
INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members: 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kntrco{0:34272} state up:replay seq 1 join_fscid=1 addr [v2:192.168.123.103:6828/2230073446,v1:192.168.123.103:6829/2230073446] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons: 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.jgzfvu{-1:34276} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6824/3112850580,v1:192.168.123.105:6825/3112850580] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kygyjl{-1:44223} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.103:6826/892320051,v1:192.168.123.103:6827/892320051] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T16:19:53.008 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 27 2026-03-09T16:19:53.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.009+0000 7fa8177fe640 1 -- 192.168.123.103:0/2451139800 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa82c077980 msgr2=0x7fa82c079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:53.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.009+0000 7fa8177fe640 1 --2- 192.168.123.103:0/2451139800 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa82c077980 0x7fa82c079e40 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fa84806d990 tx=0x7fa830006cf0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.009+0000 7fa8177fe640 1 -- 192.168.123.103:0/2451139800 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa8481a8070 msgr2=0x7fa8481aafe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:53.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.009+0000 7fa8177fe640 1 --2- 192.168.123.103:0/2451139800 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa8481a8070 0x7fa8481aafe0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fa840009fd0 tx=0x7fa840009510 comp rx=0 tx=0).stop 2026-03-09T16:19:53.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.009+0000 7fa8177fe640 1 -- 192.168.123.103:0/2451139800 shutdown_connections 2026-03-09T16:19:53.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.009+0000 7fa8177fe640 1 --2- 192.168.123.103:0/2451139800 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fa82c077980 0x7fa82c079e40 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.010+0000 7fa8177fe640 1 --2- 192.168.123.103:0/2451139800 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa8481a8070 0x7fa8481aafe0 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.010+0000 7fa8177fe640 1 --2- 192.168.123.103:0/2451139800 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8481a7750 0x7fa8481a7b30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.010+0000 7fa8177fe640 1 -- 192.168.123.103:0/2451139800 >> 192.168.123.103:0/2451139800 conn(0x7fa84806c7e0 msgr2=0x7fa84806eec0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:53.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.010+0000 7fa8177fe640 1 -- 192.168.123.103:0/2451139800 shutdown_connections 2026-03-09T16:19:53.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.010+0000 7fa8177fe640 1 -- 192.168.123.103:0/2451139800 wait complete. 2026-03-09T16:19:53.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.073+0000 7f252d6f4640 1 -- 192.168.123.103:0/3514044527 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2528072140 msgr2=0x7f2528072520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:53.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.073+0000 7f252d6f4640 1 --2- 192.168.123.103:0/3514044527 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2528072140 0x7f2528072520 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f2518008880 tx=0x7f251802eeb0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.073+0000 7f252d6f4640 1 -- 192.168.123.103:0/3514044527 shutdown_connections 2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.073+0000 7f252d6f4640 1 --2- 192.168.123.103:0/3514044527 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2528072af0 0x7f252810ba70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.073+0000 7f252d6f4640 1 --2- 192.168.123.103:0/3514044527 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2528072140 0x7f2528072520 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.073+0000 7f252d6f4640 1 -- 192.168.123.103:0/3514044527 >> 192.168.123.103:0/3514044527 conn(0x7f252806c7e0 msgr2=0x7f252806cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.073+0000 7f252d6f4640 1 -- 192.168.123.103:0/3514044527 shutdown_connections 2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.073+0000 7f252d6f4640 1 -- 192.168.123.103:0/3514044527 wait complete. 
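[Editor's note, not part of the captured log] The client session above issues "fs dump" against the mons and prints the fsmap (epoch 27, max_mds 1, rank 0 in up:replay, two standbys) to confirm where the MDS daemons stand mid-upgrade. A minimal sketch of the same check done by hand follows; it assumes only a working `ceph` CLI on the admin host, and the JSON field names it reads ("filesystems", "mdsmap", "info", "standbys") are based on recent releases and may differ slightly by version.

import json
import subprocess

def mds_states(fs_name="cephfs"):
    # Ask the mons for the fsmap in JSON instead of the plain-text dump shown above.
    out = subprocess.check_output(
        ["ceph", "fs", "dump", "--format", "json"], text=True)
    dump = json.loads(out)
    states = {}
    for fs in dump.get("filesystems", []):
        mdsmap = fs.get("mdsmap", {})
        if mdsmap.get("fs_name") != fs_name:
            continue
        for info in mdsmap.get("info", {}).values():
            # e.g. "cephfs.vm03.kntrco" -> "up:replay"
            states[info["name"]] = info["state"]
    for sb in dump.get("standbys", []):
        states[sb["name"]] = sb.get("state", "up:standby")
    return states

if __name__ == "__main__":
    print(mds_states())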
2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.074+0000 7f252d6f4640 1 Processor -- start 2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.074+0000 7f252d6f4640 1 -- start start 2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.074+0000 7f252d6f4640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2528072af0 0x7f252807d470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.074+0000 7f252d6f4640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f252807d9b0 0x7f252807de10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.074+0000 7f252d6f4640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2528084490 con 0x7f2528072af0 2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.074+0000 7f252d6f4640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2528084600 con 0x7f252807d9b0 2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.074+0000 7f25267fc640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f252807d9b0 0x7f252807de10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.074+0000 7f25267fc640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f252807d9b0 0x7f252807de10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:44368/0 (socket says 192.168.123.103:44368) 2026-03-09T16:19:53.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.074+0000 7f25267fc640 1 -- 192.168.123.103:0/4170076055 learned_addr learned my addr 192.168.123.103:0/4170076055 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:19:53.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.075+0000 7f25267fc640 1 -- 192.168.123.103:0/4170076055 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2528072af0 msgr2=0x7f252807d470 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:53.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.075+0000 7f25267fc640 1 --2- 192.168.123.103:0/4170076055 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2528072af0 0x7f252807d470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.075+0000 7f25267fc640 1 -- 192.168.123.103:0/4170076055 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2518008530 con 0x7f252807d9b0 2026-03-09T16:19:53.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.075+0000 7f25267fc640 1 --2- 192.168.123.103:0/4170076055 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f252807d9b0 0x7f252807de10 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f25200103f0 tx=0x7f25200108c0 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:53.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.079+0000 7f2507fff640 1 -- 192.168.123.103:0/4170076055 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f252000b490 con 0x7f252807d9b0 2026-03-09T16:19:53.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.079+0000 7f252d6f4640 1 -- 192.168.123.103:0/4170076055 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2528082060 con 0x7f252807d9b0 2026-03-09T16:19:53.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.079+0000 7f252d6f4640 1 -- 192.168.123.103:0/4170076055 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f25280825b0 con 0x7f252807d9b0 2026-03-09T16:19:53.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.079+0000 7f2507fff640 1 -- 192.168.123.103:0/4170076055 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f2520011040 con 0x7f252807d9b0 2026-03-09T16:19:53.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.079+0000 7f2507fff640 1 -- 192.168.123.103:0/4170076055 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f25200042e0 con 0x7f252807d9b0 2026-03-09T16:19:53.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.084+0000 7f252d6f4640 1 -- 192.168.123.103:0/4170076055 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2528108570 con 0x7f252807d9b0 2026-03-09T16:19:53.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.084+0000 7f2507fff640 1 -- 192.168.123.103:0/4170076055 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2520009890 con 0x7f252807d9b0 2026-03-09T16:19:53.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.085+0000 7f2507fff640 1 --2- 192.168.123.103:0/4170076055 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f250c077a50 0x7f250c079f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:53.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.085+0000 7f2507fff640 1 -- 192.168.123.103:0/4170076055 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f252009ba40 con 0x7f252807d9b0 2026-03-09T16:19:53.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.085+0000 7f2526ffd640 1 --2- 192.168.123.103:0/4170076055 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f250c077a50 0x7f250c079f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:53.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.085+0000 7f2526ffd640 1 --2- 192.168.123.103:0/4170076055 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f250c077a50 0x7f250c079f10 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f2518008880 tx=0x7f25180047e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:53.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.087+0000 7f2507fff640 1 -- 192.168.123.103:0/4170076055 <== 
mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f25200641a0 con 0x7f252807d9b0 2026-03-09T16:19:53.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:52 vm03.local ceph-mon[133973]: Upgrade: Updating mds.cephfs.vm05.sqhria 2026-03-09T16:19:53.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:52 vm03.local ceph-mon[133973]: Deploying daemon mds.cephfs.vm05.sqhria on vm05 2026-03-09T16:19:53.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:52 vm03.local ceph-mon[133973]: pgmap v126: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 3.5 MiB/s rd, 6.1 KiB/s wr, 7 op/s 2026-03-09T16:19:53.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:52 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/407918214' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:53.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.194+0000 7f252d6f4640 1 -- 192.168.123.103:0/4170076055 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f25280797b0 con 0x7f250c077a50 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.196+0000 7f2507fff640 1 -- 192.168.123.103:0/4170076055 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f25280797b0 con 0x7f250c077a50 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stdout: "osd", 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "15/23 daemons upgraded", 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading mds daemons", 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T16:19:53.196 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:19:53.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.198+0000 7f2505ffb640 1 -- 192.168.123.103:0/4170076055 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f250c077a50 msgr2=0x7f250c079f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:53.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.198+0000 7f2505ffb640 1 --2- 192.168.123.103:0/4170076055 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f250c077a50 0x7f250c079f10 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f2518008880 tx=0x7f25180047e0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.199 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.198+0000 7f2505ffb640 1 -- 192.168.123.103:0/4170076055 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f252807d9b0 msgr2=0x7f252807de10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:53.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.198+0000 7f2505ffb640 1 --2- 192.168.123.103:0/4170076055 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f252807d9b0 0x7f252807de10 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f25200103f0 tx=0x7f25200108c0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.198+0000 7f2505ffb640 1 -- 192.168.123.103:0/4170076055 shutdown_connections 2026-03-09T16:19:53.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.198+0000 7f2505ffb640 1 --2- 192.168.123.103:0/4170076055 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f250c077a50 0x7f250c079f10 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.198+0000 7f2505ffb640 1 --2- 192.168.123.103:0/4170076055 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f252807d9b0 0x7f252807de10 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.198+0000 7f2505ffb640 1 --2- 192.168.123.103:0/4170076055 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2528072af0 0x7f252807d470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.198+0000 7f2505ffb640 1 -- 192.168.123.103:0/4170076055 >> 192.168.123.103:0/4170076055 conn(0x7f252806c7e0 msgr2=0x7f252806f430 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:53.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.198+0000 7f2505ffb640 1 -- 192.168.123.103:0/4170076055 shutdown_connections 2026-03-09T16:19:53.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.198+0000 7f2505ffb640 1 -- 192.168.123.103:0/4170076055 wait complete. 
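[Editor's note, not part of the captured log] The "orch upgrade status" reply above is plain JSON on stdout (target_image, in_progress, which, services_complete, progress, message, is_paused), which is what makes the polling done by this staggered-upgrade test straightforward. A minimal sketch of such a polling loop, assuming only a `ceph` CLI on PATH:

import json
import subprocess
import time

def wait_for_upgrade(poll_seconds=30):
    # Poll the orchestrator until it reports the upgrade finished (or paused).
    while True:
        out = subprocess.check_output(
            ["ceph", "orch", "upgrade", "status"], text=True)
        status = json.loads(out)   # same shape as the JSON printed in the log above
        print(status.get("progress"), "-", status.get("message"))
        if status.get("is_paused"):
            raise RuntimeError("upgrade paused: %s" % status.get("message"))
        if not status.get("in_progress"):
            return status
        time.sleep(poll_seconds)

if __name__ == "__main__":
    wait_for_upgrade()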
2026-03-09T16:19:53.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.255+0000 7f5e85618640 1 -- 192.168.123.103:0/2723873009 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5e80072af0 msgr2=0x7f5e8010ba70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:53.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.255+0000 7f5e85618640 1 --2- 192.168.123.103:0/2723873009 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5e80072af0 0x7f5e8010ba70 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f5e7800b600 tx=0x7f5e78030670 comp rx=0 tx=0).stop 2026-03-09T16:19:53.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.255+0000 7f5e85618640 1 -- 192.168.123.103:0/2723873009 shutdown_connections 2026-03-09T16:19:53.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.255+0000 7f5e85618640 1 --2- 192.168.123.103:0/2723873009 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5e80072af0 0x7f5e8010ba70 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.255+0000 7f5e85618640 1 --2- 192.168.123.103:0/2723873009 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e80072140 0x7f5e80072520 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.255+0000 7f5e85618640 1 -- 192.168.123.103:0/2723873009 >> 192.168.123.103:0/2723873009 conn(0x7f5e8006c7e0 msgr2=0x7f5e8006cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.255+0000 7f5e85618640 1 -- 192.168.123.103:0/2723873009 shutdown_connections 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.255+0000 7f5e85618640 1 -- 192.168.123.103:0/2723873009 wait complete. 
2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.255+0000 7f5e85618640 1 Processor -- start 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.255+0000 7f5e85618640 1 -- start start 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.255+0000 7f5e85618640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e80072140 0x7f5e801332b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.255+0000 7f5e85618640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5e801337f0 0x7f5e8007e8e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.255+0000 7f5e85618640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5e80133df0 con 0x7f5e801337f0 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.255+0000 7f5e85618640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5e80133f60 con 0x7f5e80072140 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.256+0000 7f5e7e7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5e801337f0 0x7f5e8007e8e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.256+0000 7f5e7e7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5e801337f0 0x7f5e8007e8e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:45018/0 (socket says 192.168.123.103:45018) 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.256+0000 7f5e7effd640 1 --2- 192.168.123.103:0/2255217994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e80072140 0x7f5e801332b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.256+0000 7f5e7e7fc640 1 -- 192.168.123.103:0/2255217994 learned_addr learned my addr 192.168.123.103:0/2255217994 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.256+0000 7f5e7effd640 1 -- 192.168.123.103:0/2255217994 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5e801337f0 msgr2=0x7f5e8007e8e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.256+0000 7f5e7effd640 1 --2- 192.168.123.103:0/2255217994 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5e801337f0 0x7f5e8007e8e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.256+0000 7f5e7effd640 1 -- 192.168.123.103:0/2255217994 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f5e78009d00 con 0x7f5e80072140 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.256+0000 7f5e7effd640 1 --2- 192.168.123.103:0/2255217994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e80072140 0x7f5e801332b0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f5e7000b4d0 tx=0x7f5e7000b9a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:53.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.256+0000 7f5e5ffff640 1 -- 192.168.123.103:0/2255217994 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5e70004300 con 0x7f5e80072140 2026-03-09T16:19:53.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.256+0000 7f5e85618640 1 -- 192.168.123.103:0/2255217994 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5e8007eee0 con 0x7f5e80072140 2026-03-09T16:19:53.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.256+0000 7f5e85618640 1 -- 192.168.123.103:0/2255217994 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5e8007f430 con 0x7f5e80072140 2026-03-09T16:19:53.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.257+0000 7f5e5ffff640 1 -- 192.168.123.103:0/2255217994 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5e70004460 con 0x7f5e80072140 2026-03-09T16:19:53.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.257+0000 7f5e5ffff640 1 -- 192.168.123.103:0/2255217994 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5e70010be0 con 0x7f5e80072140 2026-03-09T16:19:53.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.257+0000 7f5e85618640 1 -- 192.168.123.103:0/2255217994 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5e80108570 con 0x7f5e80072140 2026-03-09T16:19:53.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.258+0000 7f5e5ffff640 1 -- 192.168.123.103:0/2255217994 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5e70002780 con 0x7f5e80072140 2026-03-09T16:19:53.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.259+0000 7f5e5ffff640 1 --2- 192.168.123.103:0/2255217994 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5e600778e0 0x7f5e60079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:19:53.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.259+0000 7f5e7e7fc640 1 --2- 192.168.123.103:0/2255217994 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5e600778e0 0x7f5e60079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:19:53.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.259+0000 7f5e5ffff640 1 -- 192.168.123.103:0/2255217994 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f5e70099ae0 con 0x7f5e80072140 2026-03-09T16:19:53.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.259+0000 7f5e7e7fc640 1 --2- 192.168.123.103:0/2255217994 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5e600778e0 0x7f5e60079da0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f5e78002790 tx=0x7f5e78009be0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:19:53.261 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.260+0000 7f5e5ffff640 1 -- 192.168.123.103:0/2255217994 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5e70062110 con 0x7f5e80072140 2026-03-09T16:19:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:52 vm05.local ceph-mon[108543]: Upgrade: Updating mds.cephfs.vm05.sqhria 2026-03-09T16:19:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:52 vm05.local ceph-mon[108543]: Deploying daemon mds.cephfs.vm05.sqhria on vm05 2026-03-09T16:19:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:52 vm05.local ceph-mon[108543]: pgmap v126: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 3.5 MiB/s rd, 6.1 KiB/s wr, 7 op/s 2026-03-09T16:19:53.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:52 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/407918214' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:19:53.399 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.398+0000 7f5e85618640 1 -- 192.168.123.103:0/2255217994 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f5e80072af0 con 0x7f5e80072140 2026-03-09T16:19:53.399 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.398+0000 7f5e5ffff640 1 -- 192.168.123.103:0/2255217994 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+297 (secure 0 0 0) 0x7f5e70061860 con 0x7f5e80072140 2026-03-09T16:19:53.400 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem is degraded; 1 filesystem with deprecated feature inline_data 2026-03-09T16:19:53.400 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_DEGRADED: 1 filesystem is degraded 2026-03-09T16:19:53.400 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs is degraded 2026-03-09T16:19:53.400 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T16:19:53.400 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled. 
2026-03-09T16:19:53.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.401+0000 7f5e5dffb640 1 -- 192.168.123.103:0/2255217994 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5e600778e0 msgr2=0x7f5e60079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:53.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.401+0000 7f5e5dffb640 1 --2- 192.168.123.103:0/2255217994 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5e600778e0 0x7f5e60079da0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f5e78002790 tx=0x7f5e78009be0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.401+0000 7f5e5dffb640 1 -- 192.168.123.103:0/2255217994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e80072140 msgr2=0x7f5e801332b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:19:53.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.401+0000 7f5e5dffb640 1 --2- 192.168.123.103:0/2255217994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e80072140 0x7f5e801332b0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f5e7000b4d0 tx=0x7f5e7000b9a0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.402+0000 7f5e5dffb640 1 -- 192.168.123.103:0/2255217994 shutdown_connections 2026-03-09T16:19:53.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.402+0000 7f5e5dffb640 1 --2- 192.168.123.103:0/2255217994 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5e600778e0 0x7f5e60079da0 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.402+0000 7f5e5dffb640 1 --2- 192.168.123.103:0/2255217994 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5e801337f0 0x7f5e8007e8e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.402+0000 7f5e5dffb640 1 --2- 192.168.123.103:0/2255217994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e80072140 0x7f5e801332b0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:19:53.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.402+0000 7f5e5dffb640 1 -- 192.168.123.103:0/2255217994 >> 192.168.123.103:0/2255217994 conn(0x7f5e8006c7e0 msgr2=0x7f5e8006f960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:19:53.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.402+0000 7f5e5dffb640 1 -- 192.168.123.103:0/2255217994 shutdown_connections 2026-03-09T16:19:53.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:19:53.402+0000 7f5e5dffb640 1 -- 192.168.123.103:0/2255217994 wait complete. 
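[Editor's note, not part of the captured log] The "health detail" output a few entries back reports HEALTH_WARN with FS_DEGRADED (rank 0 still replaying) and FS_INLINE_DATA_DEPRECATED, both of which this run treats as expected rather than as failures. A minimal sketch of that kind of allowlist check, assuming a `ceph` CLI on PATH; the JSON layout read here (top-level "checks" keyed by check name) matches current releases but is an assumption:

import json
import subprocess

# Warnings the run above tolerates while the upgraded MDS comes back up.
EXPECTED = {"FS_DEGRADED", "FS_INLINE_DATA_DEPRECATED"}

def unexpected_health_checks():
    out = subprocess.check_output(
        ["ceph", "health", "detail", "--format", "json"], text=True)
    health = json.loads(out)
    checks = health.get("checks", {})   # e.g. {"FS_DEGRADED": {...}, ...}
    return sorted(name for name in checks if name not in EXPECTED)

if __name__ == "__main__":
    bad = unexpected_health_checks()
    if bad:
        raise RuntimeError("unexpected health checks: %s" % ", ".join(bad))
    print("cluster health within expected bounds")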
2026-03-09T16:19:54.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:53 vm05.local ceph-mon[108543]: from='client.34280 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:19:54.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:53 vm05.local ceph-mon[108543]: from='client.44235 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:19:54.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:53 vm05.local ceph-mon[108543]: from='client.44239 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:19:54.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:53 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/2451139800' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:19:54.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:53 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/2255217994' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:19:54.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:53 vm03.local ceph-mon[133973]: from='client.34280 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:19:54.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:53 vm03.local ceph-mon[133973]: from='client.44235 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:19:54.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:53 vm03.local ceph-mon[133973]: from='client.44239 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:19:54.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:53 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/2451139800' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:19:54.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:53 vm03.local ceph-mon[133973]: from='client.? 
192.168.123.103:0/2255217994' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:19:55.403 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:55 vm03.local ceph-mon[133973]: from='client.44249 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:19:55.404 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:55 vm03.local ceph-mon[133973]: pgmap v127: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 2.5 MiB/s rd, 6.0 KiB/s wr, 3 op/s 2026-03-09T16:19:55.404 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:55 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:19:55.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:55 vm05.local ceph-mon[108543]: from='client.44249 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:19:55.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:55 vm05.local ceph-mon[108543]: pgmap v127: 65 pgs: 65 active+clean; 254 MiB data, 975 MiB used, 119 GiB / 120 GiB avail; 2.5 MiB/s rd, 6.0 KiB/s wr, 3 op/s 2026-03-09T16:19:55.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:55 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:19:56.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:56 vm05.local ceph-mon[108543]: pgmap v128: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 16 MiB/s rd, 5.9 KiB/s wr, 8 op/s 2026-03-09T16:19:56.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:56 vm03.local ceph-mon[133973]: pgmap v128: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 16 MiB/s rd, 5.9 KiB/s wr, 8 op/s 2026-03-09T16:19:58.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:58 vm03.local ceph-mon[133973]: mds.? [v2:192.168.123.103:6828/2230073446,v1:192.168.123.103:6829/2230073446] up:reconnect 2026-03-09T16:19:58.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:58 vm03.local ceph-mon[133973]: fsmap cephfs:1/1 {0=cephfs.vm03.kntrco=up:reconnect} 2 up:standby 2026-03-09T16:19:58.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:58 vm03.local ceph-mon[133973]: reconnect by client.14522 192.168.144.1:0/2261305767 after 0 2026-03-09T16:19:58.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:58 vm03.local ceph-mon[133973]: reconnect by client.24311 192.168.144.1:0/4162745798 after 0 2026-03-09T16:19:58.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:58 vm03.local ceph-mon[133973]: pgmap v129: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 13 MiB/s rd, 4.8 KiB/s wr, 7 op/s 2026-03-09T16:19:58.698 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:58 vm05.local ceph-mon[108543]: mds.? 
[v2:192.168.123.103:6828/2230073446,v1:192.168.123.103:6829/2230073446] up:reconnect 2026-03-09T16:19:58.698 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:58 vm05.local ceph-mon[108543]: fsmap cephfs:1/1 {0=cephfs.vm03.kntrco=up:reconnect} 2 up:standby 2026-03-09T16:19:58.698 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:58 vm05.local ceph-mon[108543]: reconnect by client.14522 192.168.144.1:0/2261305767 after 0 2026-03-09T16:19:58.698 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:58 vm05.local ceph-mon[108543]: reconnect by client.24311 192.168.144.1:0/4162745798 after 0 2026-03-09T16:19:58.698 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:58 vm05.local ceph-mon[108543]: pgmap v129: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 13 MiB/s rd, 4.8 KiB/s wr, 7 op/s 2026-03-09T16:19:59.306 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:59 vm05.local ceph-mon[108543]: mds.? [v2:192.168.123.103:6828/2230073446,v1:192.168.123.103:6829/2230073446] up:rejoin 2026-03-09T16:19:59.306 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:59 vm05.local ceph-mon[108543]: fsmap cephfs:1/1 {0=cephfs.vm03.kntrco=up:rejoin} 2 up:standby 2026-03-09T16:19:59.306 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:59 vm05.local ceph-mon[108543]: daemon mds.cephfs.vm03.kntrco is now active in filesystem cephfs as rank 0 2026-03-09T16:19:59.306 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:59 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:59.306 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:59 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:59.306 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:19:59 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:19:59.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:59 vm03.local ceph-mon[133973]: mds.? [v2:192.168.123.103:6828/2230073446,v1:192.168.123.103:6829/2230073446] up:rejoin 2026-03-09T16:19:59.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:59 vm03.local ceph-mon[133973]: fsmap cephfs:1/1 {0=cephfs.vm03.kntrco=up:rejoin} 2 up:standby 2026-03-09T16:19:59.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:59 vm03.local ceph-mon[133973]: daemon mds.cephfs.vm03.kntrco is now active in filesystem cephfs as rank 0 2026-03-09T16:19:59.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:59 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:59.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:59 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:19:59.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:19:59 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:20:00.638 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:00 vm05.local ceph-mon[108543]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T16:20:00.638 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:00 vm05.local ceph-mon[108543]: mds.? 
[v2:192.168.123.103:6828/2230073446,v1:192.168.123.103:6829/2230073446] up:active 2026-03-09T16:20:00.638 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:00 vm05.local ceph-mon[108543]: mds.? [v2:192.168.123.105:6826/2319193942,v1:192.168.123.105:6827/2319193942] up:boot 2026-03-09T16:20:00.638 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:00 vm05.local ceph-mon[108543]: fsmap cephfs:1 {0=cephfs.vm03.kntrco=up:active} 3 up:standby 2026-03-09T16:20:00.638 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:00 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.sqhria"}]: dispatch 2026-03-09T16:20:00.638 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:00 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:00.638 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:00 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:00.638 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:00 vm05.local ceph-mon[108543]: pgmap v130: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 204 B/s wr, 8 op/s 2026-03-09T16:20:00.638 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:00 vm05.local ceph-mon[108543]: overall HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T16:20:00.638 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:00 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:00.638 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:00 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:00.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:00 vm03.local ceph-mon[133973]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T16:20:00.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:00 vm03.local ceph-mon[133973]: mds.? [v2:192.168.123.103:6828/2230073446,v1:192.168.123.103:6829/2230073446] up:active 2026-03-09T16:20:00.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:00 vm03.local ceph-mon[133973]: mds.? 
[v2:192.168.123.105:6826/2319193942,v1:192.168.123.105:6827/2319193942] up:boot 2026-03-09T16:20:00.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:00 vm03.local ceph-mon[133973]: fsmap cephfs:1 {0=cephfs.vm03.kntrco=up:active} 3 up:standby 2026-03-09T16:20:00.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:00 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.sqhria"}]: dispatch 2026-03-09T16:20:00.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:00 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:00.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:00 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:00.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:00 vm03.local ceph-mon[133973]: pgmap v130: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 204 B/s wr, 8 op/s 2026-03-09T16:20:00.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:00 vm03.local ceph-mon[133973]: overall HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T16:20:00.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:00 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:00.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:00 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 
vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.kntrco"}]: dispatch 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.kntrco"}]': finished 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.kygyjl"}]: dispatch 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.kygyjl"}]': finished 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.jgzfvu"}]: dispatch 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.jgzfvu"}]': finished 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.sqhria"}]: dispatch 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.sqhria"}]': finished 2026-03-09T16:20:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:01 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch 2026-03-09T16:20:01.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 
16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:01.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:01.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:01.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:01.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:20:01.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:20:01.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:01.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:20:01.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:01.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:01.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:01.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:01.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:01.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:01.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.kntrco"}]: dispatch 2026-03-09T16:20:01.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.kntrco"}]': finished 2026-03-09T16:20:01.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 
cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.kygyjl"}]: dispatch 2026-03-09T16:20:01.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.kygyjl"}]': finished 2026-03-09T16:20:01.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.jgzfvu"}]: dispatch 2026-03-09T16:20:01.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.jgzfvu"}]': finished 2026-03-09T16:20:01.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.sqhria"}]: dispatch 2026-03-09T16:20:01.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.sqhria"}]': finished 2026-03-09T16:20:01.891 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:01 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch 2026-03-09T16:20:03.134 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:02 vm03.local ceph-mon[133973]: Upgrade: Setting container_image for all mds 2026-03-09T16:20:03.134 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:02 vm03.local ceph-mon[133973]: Upgrade: Setting filesystem cephfs Joinable 2026-03-09T16:20:03.134 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:02 vm03.local ceph-mon[133973]: pgmap v131: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 24 MiB/s rd, 201 B/s wr, 9 op/s 2026-03-09T16:20:03.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:02 vm05.local ceph-mon[108543]: Upgrade: Setting container_image for all mds 2026-03-09T16:20:03.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:02 vm05.local ceph-mon[108543]: Upgrade: Setting filesystem cephfs Joinable 2026-03-09T16:20:03.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:02 vm05.local ceph-mon[108543]: pgmap v131: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 24 MiB/s rd, 201 B/s wr, 9 op/s 2026-03-09T16:20:04.021 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished 2026-03-09T16:20:04.021 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:03 vm03.local ceph-mon[133973]: fsmap cephfs:1 {0=cephfs.vm03.kntrco=up:active} 3 up:standby 2026-03-09T16:20:04.021 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:04.021 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:03 vm03.local ceph-mon[133973]: Upgrade: Setting container_image for all rgw 2026-03-09T16:20:04.021 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:04.021 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:04.021 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:03 vm03.local ceph-mon[133973]: Upgrade: Setting container_image for all rbd-mirror 2026-03-09T16:20:04.021 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:04.021 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:04.021 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:04.021 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:20:04.021 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:03 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:20:04.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished 2026-03-09T16:20:04.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:03 vm05.local ceph-mon[108543]: fsmap cephfs:1 {0=cephfs.vm03.kntrco=up:active} 3 up:standby 2026-03-09T16:20:04.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:04.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:03 vm05.local ceph-mon[108543]: Upgrade: Setting container_image for all rgw 2026-03-09T16:20:04.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:04.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:04.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:03 vm05.local ceph-mon[108543]: Upgrade: Setting container_image for all rbd-mirror 2026-03-09T16:20:04.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:04.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:03 vm05.local ceph-mon[108543]: from='mgr.34104 
192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:04.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:04.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:20:04.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:03 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:20:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:04 vm05.local ceph-mon[108543]: Upgrade: Updating ceph-exporter.vm03 (1/2) 2026-03-09T16:20:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:04 vm05.local ceph-mon[108543]: Deploying daemon ceph-exporter.vm03 on vm03 2026-03-09T16:20:05.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:04 vm05.local ceph-mon[108543]: pgmap v132: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 4.2 KiB/s wr, 10 op/s 2026-03-09T16:20:05.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:04 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:05.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:04 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:05.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:04 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:05.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:04 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:20:05.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:04 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:20:05.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:04 vm03.local ceph-mon[133973]: Upgrade: Updating ceph-exporter.vm03 (1/2) 2026-03-09T16:20:05.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:04 vm03.local ceph-mon[133973]: Deploying daemon ceph-exporter.vm03 on vm03 2026-03-09T16:20:05.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:04 vm03.local ceph-mon[133973]: pgmap v132: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 4.2 KiB/s wr, 10 op/s 2026-03-09T16:20:05.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:04 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:05.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:04 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:05.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:04 vm03.local 
ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:05.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:04 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T16:20:05.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:04 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:20:06.100 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:05 vm03.local ceph-mon[133973]: Upgrade: Updating ceph-exporter.vm05 (2/2) 2026-03-09T16:20:06.100 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:05 vm03.local ceph-mon[133973]: Deploying daemon ceph-exporter.vm05 on vm05 2026-03-09T16:20:06.100 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:05 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:06.100 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:05 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:06.100 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:05 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:20:06.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:05 vm05.local ceph-mon[108543]: Upgrade: Updating ceph-exporter.vm05 (2/2) 2026-03-09T16:20:06.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:05 vm05.local ceph-mon[108543]: Deploying daemon ceph-exporter.vm05 on vm05 2026-03-09T16:20:06.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:05 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:06.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:05 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:06.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:05 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:20:06.872 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:06 vm03.local ceph-mon[133973]: pgmap v133: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 4.2 KiB/s wr, 10 op/s 2026-03-09T16:20:06.872 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:06 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:06.872 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:06 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:06.872 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:06 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:06.872 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:06 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.125 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:06 vm05.local 
ceph-mon[108543]: pgmap v133: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 4.2 KiB/s wr, 10 op/s 2026-03-09T16:20:07.125 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:06 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.125 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:06 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.125 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:06 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.125 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:06 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.881 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:07 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.881 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:07 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.881 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:07 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.881 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:07 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.881 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:07 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.881 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:07 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:07 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:07 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:07 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:07 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:07 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:07.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:07 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:09.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:08 vm03.local ceph-mon[133973]: pgmap v134: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 9.6 MiB/s rd, 4.2 KiB/s wr, 7 op/s 2026-03-09T16:20:09.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:09.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:08 vm03.local ceph-mon[133973]: from='mgr.34104 
192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:09.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:20:09.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:20:09.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:09.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:20:09.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:09.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:09.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:09.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:09.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:09.141 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:08 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch 2026-03-09T16:20:09.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:08 vm05.local ceph-mon[108543]: pgmap v134: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 9.6 MiB/s rd, 4.2 KiB/s wr, 7 op/s 2026-03-09T16:20:09.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:09.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:09.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:20:09.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:20:09.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' 
entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:09.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:20:09.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:09.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:09.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:09.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:09.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:09.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:08 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: Upgrade: Setting filesystem cephfs Joinable 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: fsmap cephfs:1 {0=cephfs.vm03.kntrco=up:active} 3 up:standby 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.391 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm03"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm03"}]': finished 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm05"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm05"}]': finished 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 
192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' 
entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-09T16:20:10.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-09T16:20:10.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:20:10.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:20:10.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:20:10.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:20:10.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:20:10.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:20:10.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.392 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: Upgrade: Setting filesystem cephfs Joinable 
2026-03-09T16:20:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:20:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished 2026-03-09T16:20:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: fsmap cephfs:1 {0=cephfs.vm03.kntrco=up:active} 3 up:standby 2026-03-09T16:20:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm03"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm03"}]': finished 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm05"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm05"}]': finished 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local 
ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": 
"container_image", "who": "client.crash"}]': finished 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 
2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' 
entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:10.527 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:20:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:11 vm03.local ceph-mon[133973]: Upgrade: Setting container_image for all ceph-exporter 2026-03-09T16:20:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:11 vm03.local ceph-mon[133973]: Upgrade: Setting container_image for all iscsi 2026-03-09T16:20:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:11 vm03.local ceph-mon[133973]: Upgrade: Setting container_image for all nfs 2026-03-09T16:20:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:11 vm03.local ceph-mon[133973]: Upgrade: Setting container_image for all nvmeof 2026-03-09T16:20:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:11 vm03.local ceph-mon[133973]: Upgrade: Finalizing container_image settings 2026-03-09T16:20:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:11 vm03.local ceph-mon[133973]: Upgrade: Complete! 
2026-03-09T16:20:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:11 vm03.local ceph-mon[133973]: pgmap v135: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 9.6 MiB/s rd, 4.2 KiB/s wr, 7 op/s 2026-03-09T16:20:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:11 vm05.local ceph-mon[108543]: Upgrade: Setting container_image for all ceph-exporter 2026-03-09T16:20:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:11 vm05.local ceph-mon[108543]: Upgrade: Setting container_image for all iscsi 2026-03-09T16:20:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:11 vm05.local ceph-mon[108543]: Upgrade: Setting container_image for all nfs 2026-03-09T16:20:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:11 vm05.local ceph-mon[108543]: Upgrade: Setting container_image for all nvmeof 2026-03-09T16:20:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:11 vm05.local ceph-mon[108543]: Upgrade: Finalizing container_image settings 2026-03-09T16:20:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:11 vm05.local ceph-mon[108543]: Upgrade: Complete! 2026-03-09T16:20:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:11 vm05.local ceph-mon[108543]: pgmap v135: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 9.6 MiB/s rd, 4.2 KiB/s wr, 7 op/s 2026-03-09T16:20:13.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:13 vm03.local ceph-mon[133973]: pgmap v136: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 2.4 KiB/s rd, 4.0 KiB/s wr, 3 op/s 2026-03-09T16:20:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:13 vm05.local ceph-mon[108543]: pgmap v136: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 2.4 KiB/s rd, 4.0 KiB/s wr, 3 op/s 2026-03-09T16:20:15.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:15 vm03.local ceph-mon[133973]: pgmap v137: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 2.2 KiB/s rd, 4.0 KiB/s wr, 3 op/s 2026-03-09T16:20:15.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:15 vm05.local ceph-mon[108543]: pgmap v137: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 2.2 KiB/s rd, 4.0 KiB/s wr, 3 op/s 2026-03-09T16:20:17.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:17 vm03.local ceph-mon[133973]: pgmap v138: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 0 B/s wr, 1 op/s 2026-03-09T16:20:17.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:17 vm05.local ceph-mon[108543]: pgmap v138: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 0 B/s wr, 1 op/s 2026-03-09T16:20:19.348 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:19 vm03.local ceph-mon[133973]: pgmap v139: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:19.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:19 vm05.local ceph-mon[108543]: pgmap v139: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:21.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:21 vm03.local ceph-mon[133973]: pgmap v140: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:21.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:21 vm05.local ceph-mon[108543]: pgmap v140: 65 pgs: 65 active+clean; 
254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:23.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:23 vm03.local ceph-mon[133973]: pgmap v141: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:23.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.461+0000 7f28831df640 1 -- 192.168.123.103:0/2784689489 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f287c072340 msgr2=0x7f287c072720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:23.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.461+0000 7f28831df640 1 --2- 192.168.123.103:0/2784689489 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f287c072340 0x7f287c072720 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f28680099b0 tx=0x7f286802f240 comp rx=0 tx=0).stop 2026-03-09T16:20:23.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.462+0000 7f28831df640 1 -- 192.168.123.103:0/2784689489 shutdown_connections 2026-03-09T16:20:23.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.462+0000 7f28831df640 1 --2- 192.168.123.103:0/2784689489 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f287c072cf0 0x7f287c10cd90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:23.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.462+0000 7f28831df640 1 --2- 192.168.123.103:0/2784689489 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f287c072340 0x7f287c072720 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:23.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.462+0000 7f28831df640 1 -- 192.168.123.103:0/2784689489 >> 192.168.123.103:0/2784689489 conn(0x7f287c06b7f0 msgr2=0x7f287c06bc00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:20:23.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.463+0000 7f28831df640 1 -- 192.168.123.103:0/2784689489 shutdown_connections 2026-03-09T16:20:23.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.463+0000 7f28831df640 1 -- 192.168.123.103:0/2784689489 wait complete. 
2026-03-09T16:20:23.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.463+0000 7f28831df640 1 Processor -- start 2026-03-09T16:20:23.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.463+0000 7f28831df640 1 -- start start 2026-03-09T16:20:23.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.463+0000 7f28831df640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f287c072340 0x7f287c1ad560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:23.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.463+0000 7f28831df640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f287c072cf0 0x7f287c1adaa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:23.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.463+0000 7f28831df640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f287c1ae0c0 con 0x7f287c072cf0 2026-03-09T16:20:23.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.463+0000 7f28831df640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f287c1a7650 con 0x7f287c072340 2026-03-09T16:20:23.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.464+0000 7f2880f54640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f287c072340 0x7f287c1ad560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:20:23.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.464+0000 7f2880f54640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f287c072340 0x7f287c1ad560 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:33742/0 (socket says 192.168.123.103:33742) 2026-03-09T16:20:23.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.464+0000 7f2880f54640 1 -- 192.168.123.103:0/3278232894 learned_addr learned my addr 192.168.123.103:0/3278232894 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:20:23.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.464+0000 7f2880f54640 1 -- 192.168.123.103:0/3278232894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f287c072cf0 msgr2=0x7f287c1adaa0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:20:23.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.464+0000 7f287bfff640 1 --2- 192.168.123.103:0/3278232894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f287c072cf0 0x7f287c1adaa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:20:23.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.464+0000 7f2880f54640 1 --2- 192.168.123.103:0/3278232894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f287c072cf0 0x7f287c1adaa0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:23.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.464+0000 7f2880f54640 1 -- 192.168.123.103:0/3278232894 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2868009660 con 
0x7f287c072340 2026-03-09T16:20:23.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.464+0000 7f287bfff640 1 --2- 192.168.123.103:0/3278232894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f287c072cf0 0x7f287c1adaa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:20:23.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.465+0000 7f2880f54640 1 --2- 192.168.123.103:0/3278232894 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f287c072340 0x7f287c1ad560 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f2868005ec0 tx=0x7f2868004300 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:20:23.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.465+0000 7f2879ffb640 1 -- 192.168.123.103:0/3278232894 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f286803d070 con 0x7f287c072340 2026-03-09T16:20:23.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.465+0000 7f28831df640 1 -- 192.168.123.103:0/3278232894 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f287c1a78d0 con 0x7f287c072340 2026-03-09T16:20:23.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.465+0000 7f28831df640 1 -- 192.168.123.103:0/3278232894 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f287c1a7e40 con 0x7f287c072340 2026-03-09T16:20:23.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.465+0000 7f2879ffb640 1 -- 192.168.123.103:0/3278232894 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f28680043f0 con 0x7f287c072340 2026-03-09T16:20:23.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.466+0000 7f2879ffb640 1 -- 192.168.123.103:0/3278232894 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f28680418a0 con 0x7f287c072340 2026-03-09T16:20:23.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.466+0000 7f286f7fe640 1 -- 192.168.123.103:0/3278232894 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2840005350 con 0x7f287c072340 2026-03-09T16:20:23.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.467+0000 7f2879ffb640 1 -- 192.168.123.103:0/3278232894 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2868049050 con 0x7f287c072340 2026-03-09T16:20:23.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.467+0000 7f2879ffb640 1 --2- 192.168.123.103:0/3278232894 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f28540776d0 0x7f2854079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:23.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.467+0000 7f2879ffb640 1 -- 192.168.123.103:0/3278232894 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f28680beb00 con 0x7f287c072340 2026-03-09T16:20:23.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.467+0000 7f287bfff640 1 --2- 192.168.123.103:0/3278232894 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f28540776d0 0x7f2854079b90 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:20:23.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.468+0000 7f287bfff640 1 --2- 192.168.123.103:0/3278232894 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f28540776d0 0x7f2854079b90 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f2860006fd0 tx=0x7f2860008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:20:23.470 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.469+0000 7f2879ffb640 1 -- 192.168.123.103:0/3278232894 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2868086ae0 con 0x7f287c072340 2026-03-09T16:20:23.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:23 vm05.local ceph-mon[108543]: pgmap v141: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:23.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.573+0000 7f286f7fe640 1 -- 192.168.123.103:0/3278232894 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f2840002bf0 con 0x7f28540776d0 2026-03-09T16:20:23.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.574+0000 7f2879ffb640 1 -- 192.168.123.103:0/3278232894 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f2840002bf0 con 0x7f28540776d0 2026-03-09T16:20:23.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.576+0000 7f286f7fe640 1 -- 192.168.123.103:0/3278232894 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f28540776d0 msgr2=0x7f2854079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:23.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.576+0000 7f286f7fe640 1 --2- 192.168.123.103:0/3278232894 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f28540776d0 0x7f2854079b90 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f2860006fd0 tx=0x7f2860008040 comp rx=0 tx=0).stop 2026-03-09T16:20:23.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.577+0000 7f286f7fe640 1 -- 192.168.123.103:0/3278232894 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f287c072340 msgr2=0x7f287c1ad560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:23.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.577+0000 7f286f7fe640 1 --2- 192.168.123.103:0/3278232894 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f287c072340 0x7f287c1ad560 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f2868005ec0 tx=0x7f2868004300 comp rx=0 tx=0).stop 2026-03-09T16:20:23.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.577+0000 7f286f7fe640 1 -- 192.168.123.103:0/3278232894 shutdown_connections 2026-03-09T16:20:23.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.577+0000 7f286f7fe640 1 --2- 192.168.123.103:0/3278232894 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f28540776d0 0x7f2854079b90 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:23.577 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.577+0000 7f286f7fe640 1 --2- 192.168.123.103:0/3278232894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f287c072cf0 0x7f287c1adaa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:23.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.577+0000 7f286f7fe640 1 --2- 192.168.123.103:0/3278232894 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f287c072340 0x7f287c1ad560 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:23.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.577+0000 7f286f7fe640 1 -- 192.168.123.103:0/3278232894 >> 192.168.123.103:0/3278232894 conn(0x7f287c06b7f0 msgr2=0x7f287c10df40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:20:23.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.577+0000 7f286f7fe640 1 -- 192.168.123.103:0/3278232894 shutdown_connections 2026-03-09T16:20:23.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.577+0000 7f286f7fe640 1 -- 192.168.123.103:0/3278232894 wait complete. 2026-03-09T16:20:23.628 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-09T16:20:23.769 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:20:23.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.991+0000 7f38fddaa640 1 -- 192.168.123.103:0/105750651 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38f81029d0 msgr2=0x7f38f8102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:23.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.991+0000 7f38fddaa640 1 --2- 192.168.123.103:0/105750651 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38f81029d0 0x7f38f8102e30 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f38e4009a00 tx=0x7f38e402f280 comp rx=0 tx=0).stop 2026-03-09T16:20:23.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.991+0000 7f38fddaa640 1 -- 192.168.123.103:0/105750651 shutdown_connections 2026-03-09T16:20:23.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.991+0000 7f38fddaa640 1 --2- 192.168.123.103:0/105750651 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38f81029d0 0x7f38f8102e30 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:23.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.991+0000 7f38fddaa640 1 --2- 192.168.123.103:0/105750651 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38f81089d0 0x7f38f8108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:23.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.991+0000 7f38fddaa640 1 -- 192.168.123.103:0/105750651 >> 192.168.123.103:0/105750651 conn(0x7f38f80fe710 msgr2=0x7f38f8100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:20:23.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.992+0000 7f38fddaa640 1 -- 192.168.123.103:0/105750651 shutdown_connections 2026-03-09T16:20:23.992 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.992+0000 7f38fddaa640 1 -- 192.168.123.103:0/105750651 wait complete. 2026-03-09T16:20:23.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.992+0000 7f38fddaa640 1 Processor -- start 2026-03-09T16:20:23.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.992+0000 7f38fddaa640 1 -- start start 2026-03-09T16:20:23.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.993+0000 7f38fddaa640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38f81029d0 0x7f38f81a0620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:23.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.993+0000 7f38fddaa640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38f81089d0 0x7f38f81a0b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:23.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.993+0000 7f38fddaa640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38f81a1180 con 0x7f38f81029d0 2026-03-09T16:20:23.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.993+0000 7f38fddaa640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38f819a710 con 0x7f38f81089d0 2026-03-09T16:20:23.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.993+0000 7f38f77fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38f81029d0 0x7f38f81a0620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:20:23.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.993+0000 7f38f77fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38f81029d0 0x7f38f81a0620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:37240/0 (socket says 192.168.123.103:37240) 2026-03-09T16:20:23.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.993+0000 7f38f77fe640 1 -- 192.168.123.103:0/4124761219 learned_addr learned my addr 192.168.123.103:0/4124761219 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:20:23.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.993+0000 7f38f77fe640 1 -- 192.168.123.103:0/4124761219 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38f81089d0 msgr2=0x7f38f81a0b60 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T16:20:23.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.993+0000 7f38f77fe640 1 --2- 192.168.123.103:0/4124761219 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38f81089d0 0x7f38f81a0b60 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:23.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.993+0000 7f38f77fe640 1 -- 192.168.123.103:0/4124761219 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f38e4009660 con 0x7f38f81029d0 2026-03-09T16:20:23.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.993+0000 7f38f77fe640 1 --2- 192.168.123.103:0/4124761219 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38f81029d0 
0x7f38f81a0620 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f38e800e9b0 tx=0x7f38e800ee80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:20:23.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.994+0000 7f38f4ff9640 1 -- 192.168.123.103:0/4124761219 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38e800cd90 con 0x7f38f81029d0 2026-03-09T16:20:23.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.994+0000 7f38f4ff9640 1 -- 192.168.123.103:0/4124761219 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f38e8004590 con 0x7f38f81029d0 2026-03-09T16:20:23.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.994+0000 7f38f4ff9640 1 -- 192.168.123.103:0/4124761219 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38e8002a00 con 0x7f38f81029d0 2026-03-09T16:20:23.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.994+0000 7f38fddaa640 1 -- 192.168.123.103:0/4124761219 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f38f819a990 con 0x7f38f81029d0 2026-03-09T16:20:23.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.994+0000 7f38fddaa640 1 -- 192.168.123.103:0/4124761219 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f38f819aee0 con 0x7f38f81029d0 2026-03-09T16:20:23.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.996+0000 7f38f4ff9640 1 -- 192.168.123.103:0/4124761219 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f38e8002ba0 con 0x7f38f81029d0 2026-03-09T16:20:23.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.996+0000 7f38f4ff9640 1 --2- 192.168.123.103:0/4124761219 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f38d00778e0 0x7f38d0079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:23.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.996+0000 7f38f4ff9640 1 -- 192.168.123.103:0/4124761219 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f38e80a2e20 con 0x7f38f81029d0 2026-03-09T16:20:23.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.996+0000 7f38f6ffd640 1 --2- 192.168.123.103:0/4124761219 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f38d00778e0 0x7f38d0079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:20:23.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.997+0000 7f38f6ffd640 1 --2- 192.168.123.103:0/4124761219 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f38d00778e0 0x7f38d0079da0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f38e40040c0 tx=0x7f38e40023d0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:20:23.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:23.997+0000 7f38fddaa640 1 -- 192.168.123.103:0/4124761219 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f38c4005350 con 0x7f38f81029d0 2026-03-09T16:20:24.000 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.000+0000 7f38f4ff9640 1 -- 192.168.123.103:0/4124761219 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f38e806c1b0 con 0x7f38f81029d0 2026-03-09T16:20:24.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.107+0000 7f38fddaa640 1 -- 192.168.123.103:0/4124761219 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f38c4002bf0 con 0x7f38d00778e0 2026-03-09T16:20:24.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.113+0000 7f38f4ff9640 1 -- 192.168.123.103:0/4124761219 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f38c4002bf0 con 0x7f38d00778e0 2026-03-09T16:20:24.113 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T16:20:24.113 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (4m) 17s ago 10m 25.4M - 0.25.0 c8568f914cd2 61c29cd7a09d 2026-03-09T16:20:24.113 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (20s) 17s ago 10m 9781k - 19.2.3-678-ge911bdeb 654f31e6858e 9faa71252568 2026-03-09T16:20:24.113 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (18s) 17s ago 9m 9713k - 19.2.3-678-ge911bdeb 654f31e6858e 7d4b674187be 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (3m) 17s ago 10m 7830k - 19.2.3-678-ge911bdeb 654f31e6858e 03c86bd1bf32 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (3m) 17s ago 9m 7830k - 19.2.3-678-ge911bdeb 654f31e6858e 192f6dbc3145 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (4m) 17s ago 10m 91.2M - 10.4.0 c8b91775d855 6f4f55eef4bb 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (41s) 17s ago 8m 98.4M - 19.2.3-678-ge911bdeb 654f31e6858e feea99babe02 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (61s) 17s ago 8m 18.6M - 19.2.3-678-ge911bdeb 654f31e6858e a895bdb15107 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (35s) 17s ago 8m 18.9M - 19.2.3-678-ge911bdeb 654f31e6858e 44438ec5f534 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (25s) 17s ago 8m 17.0M - 19.2.3-678-ge911bdeb 654f31e6858e 0443da67e5de 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:8443,9283,8765 running (5m) 17s ago 10m 614M - 19.2.3-678-ge911bdeb 654f31e6858e f10e9f43c355 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (5m) 17s ago 9m 497M - 19.2.3-678-ge911bdeb 654f31e6858e 5276dc4902e9 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (3m) 17s ago 10m 64.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f90a2e8dc751 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (3m) 17s ago 9m 53.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b6d6af84a66d 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (5m) 17s 
ago 10m 9.77M - 1.7.0 72c9c2088986 73da4350a8ed 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (5m) 17s ago 9m 9.79M - 1.7.0 72c9c2088986 0be807a191b0 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (2m) 17s ago 9m 131M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fba6e40f54d4 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (2m) 17s ago 9m 112M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9e86c92fc9cd 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (2m) 17s ago 9m 99.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 2e666ccd4bf7 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (119s) 17s ago 8m 155M 4096M 19.2.3-678-ge911bdeb 654f31e6858e c052610d74d5 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (95s) 17s ago 8m 127M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4115e4720b89 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (70s) 17s ago 8m 126M 4096M 19.2.3-678-ge911bdeb 654f31e6858e d93569840b13 2026-03-09T16:20:24.114 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (4m) 17s ago 9m 52.0M - 2.51.0 1d3b7f56885b ce88dd379864 2026-03-09T16:20:24.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.115+0000 7f38fddaa640 1 -- 192.168.123.103:0/4124761219 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f38d00778e0 msgr2=0x7f38d0079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:24.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.115+0000 7f38fddaa640 1 --2- 192.168.123.103:0/4124761219 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f38d00778e0 0x7f38d0079da0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f38e40040c0 tx=0x7f38e40023d0 comp rx=0 tx=0).stop 2026-03-09T16:20:24.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.115+0000 7f38fddaa640 1 -- 192.168.123.103:0/4124761219 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38f81029d0 msgr2=0x7f38f81a0620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:24.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.115+0000 7f38fddaa640 1 --2- 192.168.123.103:0/4124761219 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38f81029d0 0x7f38f81a0620 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f38e800e9b0 tx=0x7f38e800ee80 comp rx=0 tx=0).stop 2026-03-09T16:20:24.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.115+0000 7f38fddaa640 1 -- 192.168.123.103:0/4124761219 shutdown_connections 2026-03-09T16:20:24.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.115+0000 7f38fddaa640 1 --2- 192.168.123.103:0/4124761219 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f38d00778e0 0x7f38d0079da0 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:24.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.115+0000 7f38fddaa640 1 --2- 192.168.123.103:0/4124761219 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38f81089d0 0x7f38f81a0b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:24.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.115+0000 
7f38fddaa640 1 --2- 192.168.123.103:0/4124761219 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38f81029d0 0x7f38f81a0620 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:24.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.115+0000 7f38fddaa640 1 -- 192.168.123.103:0/4124761219 >> 192.168.123.103:0/4124761219 conn(0x7f38f80fe710 msgr2=0x7f38f810c990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:20:24.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.115+0000 7f38fddaa640 1 -- 192.168.123.103:0/4124761219 shutdown_connections 2026-03-09T16:20:24.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.115+0000 7f38fddaa640 1 -- 192.168.123.103:0/4124761219 wait complete. 2026-03-09T16:20:24.162 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade status' 2026-03-09T16:20:24.314 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:20:24.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:20:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:20:24.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.543+0000 7f38213ab640 1 -- 192.168.123.103:0/51705896 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f381c1029d0 msgr2=0x7f381c102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:24.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.543+0000 7f38213ab640 1 --2- 192.168.123.103:0/51705896 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f381c1029d0 0x7f381c102e30 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f3810009a00 tx=0x7f381002f280 comp rx=0 tx=0).stop 2026-03-09T16:20:24.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.544+0000 7f38213ab640 1 -- 192.168.123.103:0/51705896 shutdown_connections 2026-03-09T16:20:24.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.544+0000 7f38213ab640 1 --2- 192.168.123.103:0/51705896 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f381c1029d0 0x7f381c102e30 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:24.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.544+0000 7f38213ab640 1 --2- 192.168.123.103:0/51705896 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f381c1089d0 0x7f381c108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:24.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.544+0000 7f38213ab640 1 -- 192.168.123.103:0/51705896 >> 192.168.123.103:0/51705896 conn(0x7f381c0fe710 msgr2=0x7f381c100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:20:24.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.544+0000 
7f38213ab640 1 -- 192.168.123.103:0/51705896 shutdown_connections 2026-03-09T16:20:24.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.544+0000 7f38213ab640 1 -- 192.168.123.103:0/51705896 wait complete. 2026-03-09T16:20:24.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.544+0000 7f38213ab640 1 Processor -- start 2026-03-09T16:20:24.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.544+0000 7f38213ab640 1 -- start start 2026-03-09T16:20:24.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.545+0000 7f38213ab640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f381c1029d0 0x7f381c1a4c20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:24.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.545+0000 7f38213ab640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f381c1089d0 0x7f381c1a5160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:24.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.545+0000 7f38213ab640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f381c0784e0 con 0x7f381c1089d0 2026-03-09T16:20:24.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.545+0000 7f38213ab640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f381c078650 con 0x7f381c1029d0 2026-03-09T16:20:24.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.545+0000 7f381affd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f381c1029d0 0x7f381c1a4c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:20:24.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.545+0000 7f381a7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f381c1089d0 0x7f381c1a5160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:20:24.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.545+0000 7f381a7fc640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f381c1089d0 0x7f381c1a5160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:37254/0 (socket says 192.168.123.103:37254) 2026-03-09T16:20:24.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.545+0000 7f381a7fc640 1 -- 192.168.123.103:0/920059888 learned_addr learned my addr 192.168.123.103:0/920059888 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:20:24.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.545+0000 7f381a7fc640 1 -- 192.168.123.103:0/920059888 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f381c1029d0 msgr2=0x7f381c1a4c20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:24.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.545+0000 7f381a7fc640 1 --2- 192.168.123.103:0/920059888 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f381c1029d0 0x7f381c1a4c20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:24.546 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.545+0000 7f381a7fc640 1 -- 192.168.123.103:0/920059888 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3810009660 con 0x7f381c1089d0 2026-03-09T16:20:24.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.545+0000 7f381affd640 1 --2- 192.168.123.103:0/920059888 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f381c1029d0 0x7f381c1a4c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:20:24.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.545+0000 7f381a7fc640 1 --2- 192.168.123.103:0/920059888 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f381c1089d0 0x7f381c1a5160 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f381002f790 tx=0x7f3810004300 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:20:24.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.546+0000 7f37f3fff640 1 -- 192.168.123.103:0/920059888 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f381002fae0 con 0x7f381c1089d0 2026-03-09T16:20:24.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.546+0000 7f38213ab640 1 -- 192.168.123.103:0/920059888 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f381c0788d0 con 0x7f381c1089d0 2026-03-09T16:20:24.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.546+0000 7f38213ab640 1 -- 192.168.123.103:0/920059888 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f381c078dc0 con 0x7f381c1089d0 2026-03-09T16:20:24.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.546+0000 7f37f3fff640 1 -- 192.168.123.103:0/920059888 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f381002fc40 con 0x7f381c1089d0 2026-03-09T16:20:24.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.546+0000 7f37f3fff640 1 -- 192.168.123.103:0/920059888 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38100417e0 con 0x7f381c1089d0 2026-03-09T16:20:24.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.547+0000 7f37f1ffb640 1 -- 192.168.123.103:0/920059888 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f381c104110 con 0x7f381c1089d0 2026-03-09T16:20:24.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.548+0000 7f37f3fff640 1 -- 192.168.123.103:0/920059888 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f381003f070 con 0x7f381c1089d0 2026-03-09T16:20:24.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.548+0000 7f37f3fff640 1 --2- 192.168.123.103:0/920059888 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f37e80779b0 0x7f37e8079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:24.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.548+0000 7f37f3fff640 1 -- 192.168.123.103:0/920059888 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f38100be310 con 0x7f381c1089d0 2026-03-09T16:20:24.551 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.550+0000 7f381affd640 1 --2- 192.168.123.103:0/920059888 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f37e80779b0 0x7f37e8079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:20:24.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.551+0000 7f37f3fff640 1 -- 192.168.123.103:0/920059888 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f38100869c0 con 0x7f381c1089d0 2026-03-09T16:20:24.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.551+0000 7f381affd640 1 --2- 192.168.123.103:0/920059888 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f37e80779b0 0x7f37e8079e70 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f381c079860 tx=0x7f3804009290 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:20:24.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.653+0000 7f37f1ffb640 1 -- 192.168.123.103:0/920059888 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f381c079900 con 0x7f37e80779b0 2026-03-09T16:20:24.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.654+0000 7f37f3fff640 1 -- 192.168.123.103:0/920059888 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f381c079900 con 0x7f37e80779b0 2026-03-09T16:20:24.655 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:20:24.655 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": null, 2026-03-09T16:20:24.655 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": false, 2026-03-09T16:20:24.655 INFO:teuthology.orchestra.run.vm03.stdout: "which": "", 2026-03-09T16:20:24.655 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [], 2026-03-09T16:20:24.655 INFO:teuthology.orchestra.run.vm03.stdout: "progress": null, 2026-03-09T16:20:24.655 INFO:teuthology.orchestra.run.vm03.stdout: "message": "", 2026-03-09T16:20:24.655 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T16:20:24.655 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:20:24.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.656+0000 7f37f1ffb640 1 -- 192.168.123.103:0/920059888 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f37e80779b0 msgr2=0x7f37e8079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:24.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.656+0000 7f37f1ffb640 1 --2- 192.168.123.103:0/920059888 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f37e80779b0 0x7f37e8079e70 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f381c079860 tx=0x7f3804009290 comp rx=0 tx=0).stop 2026-03-09T16:20:24.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.657+0000 7f37f1ffb640 1 -- 192.168.123.103:0/920059888 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f381c1089d0 msgr2=0x7f381c1a5160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:24.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.657+0000 
7f37f1ffb640 1 --2- 192.168.123.103:0/920059888 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f381c1089d0 0x7f381c1a5160 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f381002f790 tx=0x7f3810004300 comp rx=0 tx=0).stop 2026-03-09T16:20:24.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.657+0000 7f37f1ffb640 1 -- 192.168.123.103:0/920059888 shutdown_connections 2026-03-09T16:20:24.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.657+0000 7f37f1ffb640 1 --2- 192.168.123.103:0/920059888 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f37e80779b0 0x7f37e8079e70 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:24.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.657+0000 7f37f1ffb640 1 --2- 192.168.123.103:0/920059888 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f381c1089d0 0x7f381c1a5160 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:24.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.657+0000 7f37f1ffb640 1 --2- 192.168.123.103:0/920059888 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f381c1029d0 0x7f381c1a4c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:24.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.657+0000 7f37f1ffb640 1 -- 192.168.123.103:0/920059888 >> 192.168.123.103:0/920059888 conn(0x7f381c0fe710 msgr2=0x7f381c0feaf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:20:24.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.657+0000 7f37f1ffb640 1 -- 192.168.123.103:0/920059888 shutdown_connections 2026-03-09T16:20:24.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:24.657+0000 7f37f1ffb640 1 -- 192.168.123.103:0/920059888 wait complete. 
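The `ceph orch ps` table above lists every daemon as running, with the ceph daemons on 19.2.3-678-ge911bdeb and the monitoring-stack containers (alertmanager, grafana, node-exporter, prometheus) on their own versions, and the following `ceph orch upgrade status` returns in_progress: false. A sketch of the same daemon check in machine-readable form, assuming `ceph orch ps --format json` run inside a cephadm shell; the field names (daemon_type, daemon_id, status_desc, version) are assumptions about cephadm's JSON output, not taken from this log:

    # Assert every daemon is running and every *ceph* daemon reports the
    # expected build; monitoring-stack daemons keep their own versions.
    import json
    import subprocess

    CEPH_DAEMON_TYPES = {"mon", "mgr", "mds", "osd", "crash", "ceph-exporter"}

    def daemons():
        out = subprocess.check_output(["ceph", "orch", "ps", "--format", "json"])
        return json.loads(out)

    def assert_all_running(expected_version_prefix):
        bad = []
        for d in daemons():
            name = "%s.%s" % (d.get("daemon_type"), d.get("daemon_id"))
            if d.get("status_desc") != "running":
                bad.append((name, d.get("status_desc")))
            elif (d.get("daemon_type") in CEPH_DAEMON_TYPES
                  and not str(d.get("version", "")).startswith(expected_version_prefix)):
                bad.append((name, d.get("version")))
        assert not bad, "daemons not running/upgraded: %r" % bad

    # e.g. assert_all_running("19.2.3-678") for the listing above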
2026-03-09T16:20:24.718 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph health detail' 2026-03-09T16:20:24.859 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:20:25.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.100+0000 7f260ec77640 1 -- 192.168.123.103:0/1950568346 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2608075720 msgr2=0x7f2608075b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:25.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.100+0000 7f260ec77640 1 --2- 192.168.123.103:0/1950568346 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2608075720 0x7f2608075b00 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f25f0009a00 tx=0x7f25f002f280 comp rx=0 tx=0).stop 2026-03-09T16:20:25.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.101+0000 7f260ec77640 1 -- 192.168.123.103:0/1950568346 shutdown_connections 2026-03-09T16:20:25.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.101+0000 7f260ec77640 1 --2- 192.168.123.103:0/1950568346 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2608076040 0x7f2608111330 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:25.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.101+0000 7f260ec77640 1 --2- 192.168.123.103:0/1950568346 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2608075720 0x7f2608075b00 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:25.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.101+0000 7f260ec77640 1 -- 192.168.123.103:0/1950568346 >> 192.168.123.103:0/1950568346 conn(0x7f26080fe710 msgr2=0x7f2608100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:20:25.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.101+0000 7f260ec77640 1 -- 192.168.123.103:0/1950568346 shutdown_connections 2026-03-09T16:20:25.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.101+0000 7f260ec77640 1 -- 192.168.123.103:0/1950568346 wait complete. 
2026-03-09T16:20:25.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.102+0000 7f260ec77640 1 Processor -- start 2026-03-09T16:20:25.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.102+0000 7f260ec77640 1 -- start start 2026-03-09T16:20:25.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.102+0000 7f260ec77640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2608075720 0x7f260819ee10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:25.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.102+0000 7f260ec77640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2608076040 0x7f260819f350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:25.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.102+0000 7f260ec77640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f260819f9e0 con 0x7f2608076040 2026-03-09T16:20:25.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.102+0000 7f260ec77640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f26081a3750 con 0x7f2608075720 2026-03-09T16:20:25.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.102+0000 7f260c9ec640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2608075720 0x7f260819ee10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:20:25.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.102+0000 7f260c9ec640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2608075720 0x7f260819ee10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:33796/0 (socket says 192.168.123.103:33796) 2026-03-09T16:20:25.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.102+0000 7f260c9ec640 1 -- 192.168.123.103:0/2740075507 learned_addr learned my addr 192.168.123.103:0/2740075507 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:20:25.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.102+0000 7f25fffff640 1 --2- 192.168.123.103:0/2740075507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2608076040 0x7f260819f350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:20:25.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.103+0000 7f260c9ec640 1 -- 192.168.123.103:0/2740075507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2608076040 msgr2=0x7f260819f350 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:25.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.103+0000 7f260c9ec640 1 --2- 192.168.123.103:0/2740075507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2608076040 0x7f260819f350 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:25.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.103+0000 7f260c9ec640 1 -- 192.168.123.103:0/2740075507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f25f0009660 con 0x7f2608075720 2026-03-09T16:20:25.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.103+0000 7f25fffff640 1 --2- 192.168.123.103:0/2740075507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2608076040 0x7f260819f350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T16:20:25.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.103+0000 7f260c9ec640 1 --2- 192.168.123.103:0/2740075507 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2608075720 0x7f260819ee10 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f25f002f790 tx=0x7f25f0004060 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:20:25.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.103+0000 7f25fdffb640 1 -- 192.168.123.103:0/2740075507 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f25f0004410 con 0x7f2608075720 2026-03-09T16:20:25.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.103+0000 7f260ec77640 1 -- 192.168.123.103:0/2740075507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f26081a39d0 con 0x7f2608075720 2026-03-09T16:20:25.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.103+0000 7f260ec77640 1 -- 192.168.123.103:0/2740075507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f26081a3ec0 con 0x7f2608075720 2026-03-09T16:20:25.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.103+0000 7f25fdffb640 1 -- 192.168.123.103:0/2740075507 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f25f0038730 con 0x7f2608075720 2026-03-09T16:20:25.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.103+0000 7f25fdffb640 1 -- 192.168.123.103:0/2740075507 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f25f0041780 con 0x7f2608075720 2026-03-09T16:20:25.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.105+0000 7f25fdffb640 1 -- 192.168.123.103:0/2740075507 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f25f003f070 con 0x7f2608075720 2026-03-09T16:20:25.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.105+0000 7f260ec77640 1 -- 192.168.123.103:0/2740075507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f25d4005350 con 0x7f2608075720 2026-03-09T16:20:25.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.105+0000 7f25fdffb640 1 --2- 192.168.123.103:0/2740075507 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f25d8077890 0x7f25d8079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:25.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.105+0000 7f25fdffb640 1 -- 192.168.123.103:0/2740075507 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f25f00be330 con 0x7f2608075720 2026-03-09T16:20:25.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.106+0000 7f25fffff640 1 --2- 192.168.123.103:0/2740075507 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f25d8077890 
0x7f25d8079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:20:25.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.106+0000 7f25fffff640 1 --2- 192.168.123.103:0/2740075507 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f25d8077890 0x7f25d8079d50 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f26081a03c0 tx=0x7f25f8009290 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:20:25.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.108+0000 7f25fdffb640 1 -- 192.168.123.103:0/2740075507 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f25f0086340 con 0x7f2608075720 2026-03-09T16:20:25.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:25 vm03.local ceph-mon[133973]: from='client.44259 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:20:25.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:25 vm03.local ceph-mon[133973]: pgmap v142: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:25.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:25 vm03.local ceph-mon[133973]: from='client.34310 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:20:25.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.239+0000 7f260ec77640 1 -- 192.168.123.103:0/2740075507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f25d40051c0 con 0x7f2608075720 2026-03-09T16:20:25.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.242+0000 7f25fdffb640 1 -- 192.168.123.103:0/2740075507 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7f25f0086160 con 0x7f2608075720 2026-03-09T16:20:25.245 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T16:20:25.245 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T16:20:25.245 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled. 
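`ceph health detail` here reports HEALTH_WARN solely because the cephfs filesystem still has the deprecated inline_data feature enabled. A minimal sketch of classifying such results, tolerating only checks whose names match a caller-supplied allow-list of regexes (the pattern in the example is taken from the output above; the allow-list mechanism itself is illustrative, not this job's configuration):

    # Return the health-check names NOT covered by the allow-list.
    import json
    import re
    import subprocess

    def health():
        out = subprocess.check_output(
            ["ceph", "health", "detail", "--format", "json"])
        return json.loads(out)

    def unexpected_checks(allowed_patterns):
        h = health()
        if h.get("status") == "HEALTH_OK":
            return []
        allowed = [re.compile(p) for p in allowed_patterns]
        return [name for name in h.get("checks", {})
                if not any(p.search(name) for p in allowed)]

    # unexpected_checks([r"FS_INLINE_DATA_DEPRECATED"]) -> [] for the state above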
2026-03-09T16:20:25.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.246+0000 7f260ec77640 1 -- 192.168.123.103:0/2740075507 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f25d8077890 msgr2=0x7f25d8079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:25.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.246+0000 7f260ec77640 1 --2- 192.168.123.103:0/2740075507 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f25d8077890 0x7f25d8079d50 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f26081a03c0 tx=0x7f25f8009290 comp rx=0 tx=0).stop 2026-03-09T16:20:25.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.246+0000 7f260ec77640 1 -- 192.168.123.103:0/2740075507 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2608075720 msgr2=0x7f260819ee10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:25.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.246+0000 7f260ec77640 1 --2- 192.168.123.103:0/2740075507 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2608075720 0x7f260819ee10 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f25f002f790 tx=0x7f25f0004060 comp rx=0 tx=0).stop 2026-03-09T16:20:25.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.247+0000 7f260ec77640 1 -- 192.168.123.103:0/2740075507 shutdown_connections 2026-03-09T16:20:25.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.247+0000 7f260ec77640 1 --2- 192.168.123.103:0/2740075507 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f25d8077890 0x7f25d8079d50 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:25.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.247+0000 7f260ec77640 1 --2- 192.168.123.103:0/2740075507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2608076040 0x7f260819f350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:25.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.247+0000 7f260ec77640 1 --2- 192.168.123.103:0/2740075507 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2608075720 0x7f260819ee10 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:25.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.247+0000 7f260ec77640 1 -- 192.168.123.103:0/2740075507 >> 192.168.123.103:0/2740075507 conn(0x7f26080fe710 msgr2=0x7f26080ffe60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:20:25.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.247+0000 7f260ec77640 1 -- 192.168.123.103:0/2740075507 shutdown_connections 2026-03-09T16:20:25.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.247+0000 7f260ec77640 1 -- 192.168.123.103:0/2740075507 wait complete. 
2026-03-09T16:20:25.288 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-09T16:20:25.427 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:25 vm05.local ceph-mon[108543]: from='client.44259 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:20:25.427 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:25 vm05.local ceph-mon[108543]: pgmap v142: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:25.427 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:25 vm05.local ceph-mon[108543]: from='client.34310 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:20:25.450 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:20:25.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.711+0000 7f81bebfa640 1 -- 192.168.123.103:0/3143924742 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81b81029d0 msgr2=0x7f81b8102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:25.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.711+0000 7f81bebfa640 1 --2- 192.168.123.103:0/3143924742 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81b81029d0 0x7f81b8102e30 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f81a80099b0 tx=0x7f81a802f220 comp rx=0 tx=0).stop 2026-03-09T16:20:25.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.712+0000 7f81bebfa640 1 -- 192.168.123.103:0/3143924742 shutdown_connections 2026-03-09T16:20:25.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.712+0000 7f81bebfa640 1 --2- 192.168.123.103:0/3143924742 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81b81029d0 0x7f81b8102e30 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:25.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.712+0000 7f81bebfa640 1 --2- 192.168.123.103:0/3143924742 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f81b81089d0 0x7f81b8108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:25.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.712+0000 7f81bebfa640 1 -- 192.168.123.103:0/3143924742 >> 192.168.123.103:0/3143924742 conn(0x7f81b80fe710 msgr2=0x7f81b8100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:20:25.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.712+0000 7f81bebfa640 1 -- 192.168.123.103:0/3143924742 shutdown_connections 2026-03-09T16:20:25.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.712+0000 7f81bebfa640 1 -- 192.168.123.103:0/3143924742 wait complete. 
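[editor note] Each check in this sequence is shelled through `cephadm shell` with the same image, conf, keyring, fsid and env, only the trailing command changing. A hedged helper that rebuilds the same invocation; the literal paths, image and fsid are copied from the DEBUG line above, while the cephadm_shell wrapper name itself is illustrative.

    import subprocess

    CEPHADM = "/home/ubuntu/cephtest/cephadm"
    IMAGE = "quay.ceph.io/ceph-ci/ceph:reef"
    FSID = "2b05df78-1bd2-11f1-83c0-c950214d6edc"
    SHA1 = "e911bdebe5c8faa3800735d1568fcdca65db60df"

    def cephadm_shell(cmd: str) -> str:
        """Run one ceph CLI command inside a throwaway `cephadm shell` container."""
        argv = [
            "sudo", CEPHADM, "--image", IMAGE, "shell",
            "-c", "/etc/ceph/ceph.conf",
            "-k", "/etc/ceph/ceph.client.admin.keyring",
            "--fsid", FSID,
            "-e", f"sha1={SHA1}",
            "--", "bash", "-c", cmd,
        ]
        return subprocess.check_output(argv, text=True)

    if __name__ == "__main__":
        # Same probe the log shows: every daemon should report a single ceph version.
        print(cephadm_shell("ceph versions"))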
2026-03-09T16:20:25.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.713+0000 7f81bebfa640 1 Processor -- start 2026-03-09T16:20:25.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.713+0000 7f81bebfa640 1 -- start start 2026-03-09T16:20:25.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.713+0000 7f81bebfa640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f81b81029d0 0x7f81b81a0660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:25.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.713+0000 7f81bebfa640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81b81089d0 0x7f81b81a0ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:25.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.713+0000 7f81bebfa640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f81b819a750 con 0x7f81b81089d0 2026-03-09T16:20:25.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.713+0000 7f81bebfa640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f81b819a8c0 con 0x7f81b81029d0 2026-03-09T16:20:25.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.714+0000 7f81affff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81b81089d0 0x7f81b81a0ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:20:25.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.714+0000 7f81affff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81b81089d0 0x7f81b81a0ba0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:37294/0 (socket says 192.168.123.103:37294) 2026-03-09T16:20:25.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.714+0000 7f81affff640 1 -- 192.168.123.103:0/3464804721 learned_addr learned my addr 192.168.123.103:0/3464804721 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:20:25.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.714+0000 7f81affff640 1 -- 192.168.123.103:0/3464804721 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f81b81029d0 msgr2=0x7f81b81a0660 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T16:20:25.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.714+0000 7f81affff640 1 --2- 192.168.123.103:0/3464804721 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f81b81029d0 0x7f81b81a0660 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:25.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.714+0000 7f81affff640 1 -- 192.168.123.103:0/3464804721 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f81a8009660 con 0x7f81b81089d0 2026-03-09T16:20:25.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.714+0000 7f81affff640 1 --2- 192.168.123.103:0/3464804721 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81b81089d0 0x7f81b81a0ba0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f81a8009980 tx=0x7f81a8004360 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:20:25.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.714+0000 7f81adffb640 1 -- 192.168.123.103:0/3464804721 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f81a803d070 con 0x7f81b81089d0 2026-03-09T16:20:25.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.715+0000 7f81adffb640 1 -- 192.168.123.103:0/3464804721 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f81a802fc90 con 0x7f81b81089d0 2026-03-09T16:20:25.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.715+0000 7f81bebfa640 1 -- 192.168.123.103:0/3464804721 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f81b819ab40 con 0x7f81b81089d0 2026-03-09T16:20:25.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.715+0000 7f81bebfa640 1 -- 192.168.123.103:0/3464804721 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f81b819b030 con 0x7f81b81089d0 2026-03-09T16:20:25.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.716+0000 7f81adffb640 1 -- 192.168.123.103:0/3464804721 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f81a80388e0 con 0x7f81b81089d0 2026-03-09T16:20:25.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.717+0000 7f81adffb640 1 -- 192.168.123.103:0/3464804721 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f81a8038cf0 con 0x7f81b81089d0 2026-03-09T16:20:25.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.717+0000 7f81bebfa640 1 -- 192.168.123.103:0/3464804721 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8184005350 con 0x7f81b81089d0 2026-03-09T16:20:25.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.718+0000 7f81adffb640 1 --2- 192.168.123.103:0/3464804721 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8188077890 0x7f8188079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:20:25.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.718+0000 7f81adffb640 1 -- 192.168.123.103:0/3464804721 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f81a80be9c0 con 0x7f81b81089d0 2026-03-09T16:20:25.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.721+0000 7f81bc96f640 1 --2- 192.168.123.103:0/3464804721 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8188077890 0x7f8188079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:20:25.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.721+0000 7f81adffb640 1 -- 192.168.123.103:0/3464804721 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f81a8086f70 con 0x7f81b81089d0 2026-03-09T16:20:25.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.721+0000 7f81bc96f640 1 --2- 192.168.123.103:0/3464804721 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8188077890 0x7f8188079d50 secure :-1 s=READY 
pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f81a0009800 tx=0x7f81a0006d20 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:20:25.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.859+0000 7f81bebfa640 1 -- 192.168.123.103:0/3464804721 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f8184005e10 con 0x7f81b81089d0 2026-03-09T16:20:25.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.859+0000 7f81adffb640 1 -- 192.168.123.103:0/3464804721 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f81a80866c0 con 0x7f81b81089d0 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T16:20:25.862 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:20:25.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.863+0000 7f81bebfa640 1 -- 192.168.123.103:0/3464804721 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8188077890 msgr2=0x7f8188079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:25.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.863+0000 7f81bebfa640 1 --2- 192.168.123.103:0/3464804721 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8188077890 0x7f8188079d50 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f81a0009800 tx=0x7f81a0006d20 comp rx=0 tx=0).stop 2026-03-09T16:20:25.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.863+0000 7f81bebfa640 1 -- 192.168.123.103:0/3464804721 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81b81089d0 msgr2=0x7f81b81a0ba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:20:25.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.863+0000 7f81bebfa640 1 --2- 192.168.123.103:0/3464804721 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81b81089d0 0x7f81b81a0ba0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f81a8009980 tx=0x7f81a8004360 comp rx=0 tx=0).stop 2026-03-09T16:20:25.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.863+0000 7f81bebfa640 1 -- 192.168.123.103:0/3464804721 shutdown_connections 2026-03-09T16:20:25.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.863+0000 7f81bebfa640 1 --2- 192.168.123.103:0/3464804721 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f8188077890 0x7f8188079d50 secure :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f81a0009800 tx=0x7f81a0006d20 comp rx=0 tx=0).stop 2026-03-09T16:20:25.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.863+0000 7f81bebfa640 1 --2- 192.168.123.103:0/3464804721 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81b81089d0 0x7f81b81a0ba0 secure :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f81a8009980 tx=0x7f81a8004360 comp rx=0 tx=0).stop 2026-03-09T16:20:25.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.863+0000 7f81bebfa640 1 --2- 192.168.123.103:0/3464804721 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f81b81029d0 0x7f81b81a0660 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:20:25.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.863+0000 7f81bebfa640 1 -- 192.168.123.103:0/3464804721 >> 192.168.123.103:0/3464804721 conn(0x7f81b80fe710 msgr2=0x7f81b810c990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:20:25.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.864+0000 7f81bebfa640 1 -- 192.168.123.103:0/3464804721 shutdown_connections 2026-03-09T16:20:25.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:20:25.864+0000 7f81bebfa640 1 -- 192.168.123.103:0/3464804721 wait complete. 2026-03-09T16:20:25.922 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'echo "wait for servicemap items w/ changing names to refresh"' 2026-03-09T16:20:26.081 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:20:26.131 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:26 vm03.local ceph-mon[133973]: from='client.34314 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:20:26.131 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:26 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/2740075507' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:20:26.131 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:26 vm03.local ceph-mon[133973]: from='client.? 
192.168.123.103:0/3464804721' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:26.376 INFO:teuthology.orchestra.run.vm03.stdout:wait for servicemap items w/ changing names to refresh 2026-03-09T16:20:26.429 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'sleep 60' 2026-03-09T16:20:26.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:26 vm05.local ceph-mon[108543]: from='client.34314 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:20:26.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:26 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/2740075507' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T16:20:26.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:26 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/3464804721' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:20:26.604 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:20:27.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:27 vm03.local ceph-mon[133973]: pgmap v143: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:27 vm05.local ceph-mon[108543]: pgmap v143: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:29.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:29 vm03.local ceph-mon[133973]: pgmap v144: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:29.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:29 vm05.local ceph-mon[108543]: pgmap v144: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:31.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:31 vm03.local ceph-mon[133973]: pgmap v145: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:31.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:31 vm05.local ceph-mon[108543]: pgmap v145: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:33.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:33 vm03.local ceph-mon[133973]: pgmap v146: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:33.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:33 vm05.local ceph-mon[108543]: pgmap v146: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:35 vm03.local ceph-mon[133973]: pgmap v147: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:35.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:35 vm05.local ceph-mon[108543]: pgmap v147: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:37.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:37 vm03.local ceph-mon[133973]: pgmap v148: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 
GiB avail 2026-03-09T16:20:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:37 vm05.local ceph-mon[108543]: pgmap v148: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:38.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:38 vm03.local ceph-mon[133973]: pgmap v149: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:38.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:38 vm05.local ceph-mon[108543]: pgmap v149: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:39 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:20:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:39 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:20:40.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:40 vm03.local ceph-mon[133973]: pgmap v150: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:40.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:40 vm05.local ceph-mon[108543]: pgmap v150: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:43.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:43 vm03.local ceph-mon[133973]: pgmap v151: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:43.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:43 vm05.local ceph-mon[108543]: pgmap v151: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:45.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:45 vm03.local ceph-mon[133973]: pgmap v152: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:45.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:45 vm05.local ceph-mon[108543]: pgmap v152: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:47 vm03.local ceph-mon[133973]: pgmap v153: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:47.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:47 vm05.local ceph-mon[108543]: pgmap v153: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:49.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:49 vm03.local ceph-mon[133973]: pgmap v154: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:49.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:49 vm05.local ceph-mon[108543]: pgmap v154: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:51.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:51 vm05.local ceph-mon[108543]: pgmap v155: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:51.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:51 vm03.local ceph-mon[133973]: pgmap v155: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:53.390 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:53 vm03.local ceph-mon[133973]: pgmap v156: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:53 vm05.local ceph-mon[108543]: pgmap v156: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:54.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:20:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:20:55.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:55 vm03.local ceph-mon[133973]: pgmap v157: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:55.428 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:55 vm05.local ceph-mon[108543]: pgmap v157: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:57.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:57 vm03.local ceph-mon[133973]: pgmap v158: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:57.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:57 vm05.local ceph-mon[108543]: pgmap v158: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:59.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:20:59 vm03.local ceph-mon[133973]: pgmap v159: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:20:59.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:20:59 vm05.local ceph-mon[108543]: pgmap v159: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:01.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:01 vm05.local ceph-mon[108543]: pgmap v160: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:01.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:01 vm03.local ceph-mon[133973]: pgmap v160: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:03.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:03 vm03.local ceph-mon[133973]: pgmap v161: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:03.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:03 vm05.local ceph-mon[108543]: pgmap v161: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:04.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:04 vm05.local ceph-mon[108543]: pgmap v162: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:04.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:04 vm03.local ceph-mon[133973]: pgmap v162: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:07.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:07 vm03.local ceph-mon[133973]: pgmap v163: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:07.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:07 
vm05.local ceph-mon[108543]: pgmap v163: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:09.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:09 vm03.local ceph-mon[133973]: pgmap v164: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:09.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:21:09.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:09 vm05.local ceph-mon[108543]: pgmap v164: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:09.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:21:10.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:21:10.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:21:10.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:21:10.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:21:10.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:21:10.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:21:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:21:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:21:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:21:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:21:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:21:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:21:11.390 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:11 vm03.local ceph-mon[133973]: pgmap v165: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:11 vm05.local ceph-mon[108543]: pgmap v165: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:13.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:13 vm03.local ceph-mon[133973]: pgmap v166: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:13 vm05.local ceph-mon[108543]: pgmap v166: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:15.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:15 vm03.local ceph-mon[133973]: pgmap v167: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:15.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:15 vm05.local ceph-mon[108543]: pgmap v167: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:17.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:17 vm03.local ceph-mon[133973]: pgmap v168: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:17.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:17 vm05.local ceph-mon[108543]: pgmap v168: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:19.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:19 vm03.local ceph-mon[133973]: pgmap v169: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:19.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:19 vm05.local ceph-mon[108543]: pgmap v169: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:21.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:21 vm03.local ceph-mon[133973]: pgmap v170: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:21.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:21 vm05.local ceph-mon[108543]: pgmap v170: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:23.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:23 vm03.local ceph-mon[133973]: pgmap v171: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:23.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:23 vm05.local ceph-mon[108543]: pgmap v171: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:24.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:21:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:21:25.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:25 vm03.local ceph-mon[133973]: pgmap v172: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:25.428 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:25 
vm05.local ceph-mon[108543]: pgmap v172: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:26.862 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-09T16:21:26.999 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:27.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.244+0000 7f20f5065640 1 -- 192.168.123.103:0/1263484999 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f20f00ffcf0 msgr2=0x7f20f01039d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:27.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.244+0000 7f20f5065640 1 --2- 192.168.123.103:0/1263484999 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f20f00ffcf0 0x7f20f01039d0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f20e4009a10 tx=0x7f20e4031530 comp rx=0 tx=0).stop 2026-03-09T16:21:27.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.245+0000 7f20f5065640 1 -- 192.168.123.103:0/1263484999 shutdown_connections 2026-03-09T16:21:27.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.245+0000 7f20f5065640 1 --2- 192.168.123.103:0/1263484999 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f20f00ffcf0 0x7f20f01039d0 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:27.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.245+0000 7f20f5065640 1 --2- 192.168.123.103:0/1263484999 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f20f00ff340 0x7f20f00ff720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:27.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.245+0000 7f20f5065640 1 -- 192.168.123.103:0/1263484999 >> 192.168.123.103:0/1263484999 conn(0x7f20f00fa5e0 msgr2=0x7f20f00fca00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:27.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.246+0000 7f20f5065640 1 -- 192.168.123.103:0/1263484999 shutdown_connections 2026-03-09T16:21:27.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.246+0000 7f20f5065640 1 -- 192.168.123.103:0/1263484999 wait complete. 
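[editor note] The `ceph orch ps` launched above prints the daemon table shown a little further down; after an upgrade step the same data is easier to verify in JSON form. A sketch, reusing the hypothetical cephadm_shell helper from the earlier note, that asserts every core daemon's version matches the release string seen in the table below; the field names follow `ceph orch ps --format json` output as I understand it, so treat them as an assumption.

    import json

    TARGET_VERSION = "19.2.3-678-ge911bdeb"  # version string the daemons report in the table below

    def assert_daemon_versions(target: str = TARGET_VERSION) -> None:
        """Verify every cephadm-managed Ceph daemon runs the expected version."""
        daemons = json.loads(cephadm_shell("ceph orch ps --format json"))
        stale = [
            f"{d['daemon_type']}.{d['daemon_id']}: {d.get('version')}"
            for d in daemons
            # Monitoring containers (grafana, prometheus, ...) carry their own versions.
            if d["daemon_type"] in {"mon", "mgr", "osd", "mds", "crash", "ceph-exporter"}
            and d.get("version") != target
        ]
        if stale:
            raise AssertionError("daemons not on %s: %s" % (target, ", ".join(stale)))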
2026-03-09T16:21:27.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.246+0000 7f20f5065640 1 Processor -- start 2026-03-09T16:21:27.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.246+0000 7f20f5065640 1 -- start start 2026-03-09T16:21:27.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.247+0000 7f20f5065640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f20f00ff340 0x7f20f01a0620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:27.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.247+0000 7f20f5065640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f20f00ffcf0 0x7f20f01a0b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:27.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.247+0000 7f20f5065640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f20f019a710 con 0x7f20f00ffcf0 2026-03-09T16:21:27.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.247+0000 7f20f5065640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f20f019a880 con 0x7f20f00ff340 2026-03-09T16:21:27.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.247+0000 7f20ee575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f20f00ffcf0 0x7f20f01a0b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:27.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.247+0000 7f20ee575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f20f00ffcf0 0x7f20f01a0b60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42962/0 (socket says 192.168.123.103:42962) 2026-03-09T16:21:27.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.247+0000 7f20ee575640 1 -- 192.168.123.103:0/529946546 learned_addr learned my addr 192.168.123.103:0/529946546 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:27.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.247+0000 7f20eed76640 1 --2- 192.168.123.103:0/529946546 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f20f00ff340 0x7f20f01a0620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:27.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.248+0000 7f20ee575640 1 -- 192.168.123.103:0/529946546 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f20f00ff340 msgr2=0x7f20f01a0620 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:27.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.248+0000 7f20ee575640 1 --2- 192.168.123.103:0/529946546 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f20f00ff340 0x7f20f01a0620 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:27.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.248+0000 7f20ee575640 1 -- 192.168.123.103:0/529946546 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f20e4009660 con 
0x7f20f00ffcf0 2026-03-09T16:21:27.249 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.248+0000 7f20ee575640 1 --2- 192.168.123.103:0/529946546 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f20f00ffcf0 0x7f20f01a0b60 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f20e4009630 tx=0x7f20e400e450 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:27.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.248+0000 7f20cffff640 1 -- 192.168.123.103:0/529946546 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f20e4004520 con 0x7f20f00ffcf0 2026-03-09T16:21:27.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.248+0000 7f20cffff640 1 -- 192.168.123.103:0/529946546 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f20e4002ba0 con 0x7f20f00ffcf0 2026-03-09T16:21:27.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.248+0000 7f20cffff640 1 -- 192.168.123.103:0/529946546 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f20e4039430 con 0x7f20f00ffcf0 2026-03-09T16:21:27.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.248+0000 7f20f5065640 1 -- 192.168.123.103:0/529946546 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f20f019ab00 con 0x7f20f00ffcf0 2026-03-09T16:21:27.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.248+0000 7f20f5065640 1 -- 192.168.123.103:0/529946546 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f20f019aff0 con 0x7f20f00ffcf0 2026-03-09T16:21:27.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.250+0000 7f20cffff640 1 -- 192.168.123.103:0/529946546 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f20e4039590 con 0x7f20f00ffcf0 2026-03-09T16:21:27.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.250+0000 7f20cffff640 1 --2- 192.168.123.103:0/529946546 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f20c80778e0 0x7f20c8079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:27.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.250+0000 7f20cffff640 1 -- 192.168.123.103:0/529946546 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f20e40c01b0 con 0x7f20f00ffcf0 2026-03-09T16:21:27.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.250+0000 7f20eed76640 1 --2- 192.168.123.103:0/529946546 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f20c80778e0 0x7f20c8079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:27.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.251+0000 7f20f5065640 1 -- 192.168.123.103:0/529946546 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f20bc005350 con 0x7f20f00ffcf0 2026-03-09T16:21:27.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.251+0000 7f20eed76640 1 --2- 192.168.123.103:0/529946546 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] 
conn(0x7f20c80778e0 0x7f20c8079da0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f20d800bd40 tx=0x7f20d800f040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:27.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.255+0000 7f20cffff640 1 -- 192.168.123.103:0/529946546 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f20e4088890 con 0x7f20f00ffcf0 2026-03-09T16:21:27.350 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:27 vm03.local ceph-mon[133973]: pgmap v173: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:27.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.349+0000 7f20f5065640 1 -- 192.168.123.103:0/529946546 --> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f20bc002bf0 con 0x7f20c80778e0 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.354+0000 7f20cffff640 1 -- 192.168.123.103:0/529946546 <== mgr.34104 v2:192.168.123.103:6800/2191779109 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f20bc002bf0 con 0x7f20c80778e0 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (5m) 80s ago 11m 25.4M - 0.25.0 c8568f914cd2 61c29cd7a09d 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (83s) 80s ago 11m 9781k - 19.2.3-678-ge911bdeb 654f31e6858e 9faa71252568 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm05 vm05 running (81s) 81s ago 10m 9713k - 19.2.3-678-ge911bdeb 654f31e6858e 7d4b674187be 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (4m) 80s ago 11m 7830k - 19.2.3-678-ge911bdeb 654f31e6858e 03c86bd1bf32 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm05 vm05 running (4m) 81s ago 10m 7830k - 19.2.3-678-ge911bdeb 654f31e6858e 192f6dbc3145 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (5m) 80s ago 11m 91.2M - 10.4.0 c8b91775d855 6f4f55eef4bb 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kntrco vm03 running (104s) 80s ago 9m 98.4M - 19.2.3-678-ge911bdeb 654f31e6858e feea99babe02 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.kygyjl vm03 running (2m) 80s ago 9m 18.6M - 19.2.3-678-ge911bdeb 654f31e6858e a895bdb15107 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.jgzfvu vm05 running (98s) 81s ago 9m 18.9M - 19.2.3-678-ge911bdeb 654f31e6858e 44438ec5f534 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm05.sqhria vm05 running (88s) 81s ago 9m 17.0M - 19.2.3-678-ge911bdeb 654f31e6858e 0443da67e5de 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.gbgzmu vm03 *:8443,9283,8765 running (6m) 80s ago 11m 614M - 19.2.3-678-ge911bdeb 654f31e6858e f10e9f43c355 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm05.dygxfv vm05 *:8443,9283,8765 running (6m) 81s ago 10m 497M - 19.2.3-678-ge911bdeb 654f31e6858e 5276dc4902e9 
2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (4m) 80s ago 11m 64.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f90a2e8dc751 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm05 vm05 running (4m) 81s ago 10m 53.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b6d6af84a66d 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (6m) 80s ago 11m 9.77M - 1.7.0 72c9c2088986 73da4350a8ed 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm05 vm05 *:9100 running (6m) 81s ago 10m 9.79M - 1.7.0 72c9c2088986 0be807a191b0 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (4m) 80s ago 10m 131M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fba6e40f54d4 2026-03-09T16:21:27.355 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (3m) 80s ago 10m 112M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9e86c92fc9cd 2026-03-09T16:21:27.356 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (3m) 80s ago 10m 99.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 2e666ccd4bf7 2026-03-09T16:21:27.356 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm05 running (3m) 81s ago 9m 155M 4096M 19.2.3-678-ge911bdeb 654f31e6858e c052610d74d5 2026-03-09T16:21:27.356 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm05 running (2m) 81s ago 9m 127M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4115e4720b89 2026-03-09T16:21:27.356 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm05 running (2m) 81s ago 9m 126M 4096M 19.2.3-678-ge911bdeb 654f31e6858e d93569840b13 2026-03-09T16:21:27.356 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (5m) 80s ago 10m 52.0M - 2.51.0 1d3b7f56885b ce88dd379864 2026-03-09T16:21:27.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.357+0000 7f20f5065640 1 -- 192.168.123.103:0/529946546 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f20c80778e0 msgr2=0x7f20c8079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:27.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.357+0000 7f20f5065640 1 --2- 192.168.123.103:0/529946546 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f20c80778e0 0x7f20c8079da0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f20d800bd40 tx=0x7f20d800f040 comp rx=0 tx=0).stop 2026-03-09T16:21:27.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.357+0000 7f20f5065640 1 -- 192.168.123.103:0/529946546 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f20f00ffcf0 msgr2=0x7f20f01a0b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:27.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.357+0000 7f20f5065640 1 --2- 192.168.123.103:0/529946546 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f20f00ffcf0 0x7f20f01a0b60 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f20e4009630 tx=0x7f20e400e450 comp rx=0 tx=0).stop 2026-03-09T16:21:27.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.357+0000 7f20f5065640 1 -- 192.168.123.103:0/529946546 shutdown_connections 2026-03-09T16:21:27.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.357+0000 7f20f5065640 1 --2- 192.168.123.103:0/529946546 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f20c80778e0 0x7f20c8079da0 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:27.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.357+0000 7f20f5065640 1 --2- 192.168.123.103:0/529946546 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f20f00ffcf0 0x7f20f01a0b60 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:27.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.357+0000 7f20f5065640 1 --2- 192.168.123.103:0/529946546 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f20f00ff340 0x7f20f01a0620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:27.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.357+0000 7f20f5065640 1 -- 192.168.123.103:0/529946546 >> 192.168.123.103:0/529946546 conn(0x7f20f00fa5e0 msgr2=0x7f20f00fbd80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:27.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.357+0000 7f20f5065640 1 -- 192.168.123.103:0/529946546 shutdown_connections 2026-03-09T16:21:27.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.357+0000 7f20f5065640 1 -- 192.168.123.103:0/529946546 wait complete. 2026-03-09T16:21:27.429 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-09T16:21:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:27 vm05.local ceph-mon[108543]: pgmap v173: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:27.581 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:27.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.815+0000 7fbd074a5640 1 -- 192.168.123.103:0/3589037891 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd00075720 msgr2=0x7fbd00075b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:27.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.815+0000 7fbd074a5640 1 --2- 192.168.123.103:0/3589037891 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd00075720 0x7fbd00075b00 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fbcf40099b0 tx=0x7fbcf402f240 comp rx=0 tx=0).stop 2026-03-09T16:21:27.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.816+0000 7fbd074a5640 1 -- 192.168.123.103:0/3589037891 shutdown_connections 2026-03-09T16:21:27.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.816+0000 7fbd074a5640 1 --2- 192.168.123.103:0/3589037891 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd00076040 0x7fbd00111330 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:27.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.816+0000 7fbd074a5640 1 --2- 192.168.123.103:0/3589037891 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd00075720 0x7fbd00075b00 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:27.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.816+0000 7fbd074a5640 1 -- 192.168.123.103:0/3589037891 >> 192.168.123.103:0/3589037891 conn(0x7fbd000fe710 
msgr2=0x7fbd00100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:27.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.816+0000 7fbd074a5640 1 -- 192.168.123.103:0/3589037891 shutdown_connections 2026-03-09T16:21:27.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.816+0000 7fbd074a5640 1 -- 192.168.123.103:0/3589037891 wait complete. 2026-03-09T16:21:27.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.817+0000 7fbd074a5640 1 Processor -- start 2026-03-09T16:21:27.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.817+0000 7fbd074a5640 1 -- start start 2026-03-09T16:21:27.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.817+0000 7fbd074a5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd00075720 0x7fbd0019ee70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:27.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.817+0000 7fbd074a5640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd00076040 0x7fbd0019f3b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:27.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.817+0000 7fbd074a5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd0019fa40 con 0x7fbd00075720 2026-03-09T16:21:27.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.817+0000 7fbd074a5640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd001a37b0 con 0x7fbd00076040 2026-03-09T16:21:27.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.817+0000 7fbd0521a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd00075720 0x7fbd0019ee70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:27.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.817+0000 7fbd0521a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd00075720 0x7fbd0019ee70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42972/0 (socket says 192.168.123.103:42972) 2026-03-09T16:21:27.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.817+0000 7fbd0521a640 1 -- 192.168.123.103:0/3672197671 learned_addr learned my addr 192.168.123.103:0/3672197671 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:27.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.817+0000 7fbd04a19640 1 --2- 192.168.123.103:0/3672197671 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd00076040 0x7fbd0019f3b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:27.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.818+0000 7fbd04a19640 1 -- 192.168.123.103:0/3672197671 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd00075720 msgr2=0x7fbd0019ee70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:27.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.818+0000 7fbd04a19640 1 --2- 192.168.123.103:0/3672197671 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7fbd00075720 0x7fbd0019ee70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:27.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.818+0000 7fbd04a19640 1 -- 192.168.123.103:0/3672197671 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbcf4009660 con 0x7fbd00076040 2026-03-09T16:21:27.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.818+0000 7fbd04a19640 1 --2- 192.168.123.103:0/3672197671 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd00076040 0x7fbd0019f3b0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fbcf000e970 tx=0x7fbcf000ee40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:27.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.818+0000 7fbcee7fc640 1 -- 192.168.123.103:0/3672197671 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbcf000ccb0 con 0x7fbd00076040 2026-03-09T16:21:27.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.818+0000 7fbd074a5640 1 -- 192.168.123.103:0/3672197671 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbd001a3a90 con 0x7fbd00076040 2026-03-09T16:21:27.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.818+0000 7fbd074a5640 1 -- 192.168.123.103:0/3672197671 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbd001a3fe0 con 0x7fbd00076040 2026-03-09T16:21:27.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.819+0000 7fbcee7fc640 1 -- 192.168.123.103:0/3672197671 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbcf0004590 con 0x7fbd00076040 2026-03-09T16:21:27.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.819+0000 7fbcee7fc640 1 -- 192.168.123.103:0/3672197671 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbcf0010640 con 0x7fbd00076040 2026-03-09T16:21:27.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.819+0000 7fbd074a5640 1 -- 192.168.123.103:0/3672197671 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbd00076e60 con 0x7fbd00076040 2026-03-09T16:21:27.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.820+0000 7fbcee7fc640 1 -- 192.168.123.103:0/3672197671 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fbcf00028e0 con 0x7fbd00076040 2026-03-09T16:21:27.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.820+0000 7fbcee7fc640 1 --2- 192.168.123.103:0/3672197671 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbcd40778e0 0x7fbcd4079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:27.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.820+0000 7fbd0521a640 1 --2- 192.168.123.103:0/3672197671 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbcd40778e0 0x7fbcd4079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:27.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.821+0000 7fbd0521a640 1 
--2- 192.168.123.103:0/3672197671 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbcd40778e0 0x7fbcd4079da0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fbcf40040c0 tx=0x7fbcf403a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:27.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.821+0000 7fbcee7fc640 1 -- 192.168.123.103:0/3672197671 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fbcf0014070 con 0x7fbd00076040 2026-03-09T16:21:27.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.823+0000 7fbcee7fc640 1 -- 192.168.123.103:0/3672197671 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbcf0062850 con 0x7fbd00076040 2026-03-09T16:21:27.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.955+0000 7fbd074a5640 1 -- 192.168.123.103:0/3672197671 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fbd001a0230 con 0x7fbd00076040 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.956+0000 7fbcee7fc640 1 -- 192.168.123.103:0/3672197671 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7fbcf0061fa0 con 0x7fbd00076040 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T16:21:27.957 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T16:21:27.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.958+0000 7fbd074a5640 1 -- 192.168.123.103:0/3672197671 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbcd40778e0 msgr2=0x7fbcd4079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:27.959 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.958+0000 7fbd074a5640 1 --2- 192.168.123.103:0/3672197671 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbcd40778e0 0x7fbcd4079da0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fbcf40040c0 tx=0x7fbcf403a040 comp rx=0 tx=0).stop 2026-03-09T16:21:27.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.958+0000 7fbd074a5640 1 -- 192.168.123.103:0/3672197671 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd00076040 msgr2=0x7fbd0019f3b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:27.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.958+0000 7fbd074a5640 1 --2- 192.168.123.103:0/3672197671 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd00076040 0x7fbd0019f3b0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fbcf000e970 tx=0x7fbcf000ee40 comp rx=0 tx=0).stop 2026-03-09T16:21:27.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.958+0000 7fbd074a5640 1 -- 192.168.123.103:0/3672197671 shutdown_connections 2026-03-09T16:21:27.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.958+0000 7fbd074a5640 1 --2- 192.168.123.103:0/3672197671 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbcd40778e0 0x7fbcd4079da0 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:27.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.958+0000 7fbd074a5640 1 --2- 192.168.123.103:0/3672197671 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd00076040 0x7fbd0019f3b0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:27.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.958+0000 7fbd074a5640 1 --2- 192.168.123.103:0/3672197671 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd00075720 0x7fbd0019ee70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:27.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.958+0000 7fbd074a5640 1 -- 192.168.123.103:0/3672197671 >> 192.168.123.103:0/3672197671 conn(0x7fbd000fe710 msgr2=0x7fbd000ffe30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:27.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.959+0000 7fbd074a5640 1 -- 192.168.123.103:0/3672197671 shutdown_connections 2026-03-09T16:21:27.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:27.959+0000 7fbd074a5640 1 -- 192.168.123.103:0/3672197671 wait complete. 2026-03-09T16:21:28.011 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | length == 1'"'"'' 2026-03-09T16:21:28.156 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:28.207 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:28 vm03.local ceph-mon[133973]: from='client.? 
192.168.123.103:0/3672197671' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:21:28.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.410+0000 7f786483f640 1 -- 192.168.123.103:0/532460916 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f785c100760 msgr2=0x7f785c100bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:28.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.410+0000 7f786483f640 1 --2- 192.168.123.103:0/532460916 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f785c100760 0x7f785c100bc0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f78480099b0 tx=0x7f784802f220 comp rx=0 tx=0).stop 2026-03-09T16:21:28.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.410+0000 7f786483f640 1 -- 192.168.123.103:0/532460916 shutdown_connections 2026-03-09T16:21:28.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.410+0000 7f786483f640 1 --2- 192.168.123.103:0/532460916 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f785c100760 0x7f785c100bc0 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:28.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.410+0000 7f786483f640 1 --2- 192.168.123.103:0/532460916 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f785c106760 0x7f785c106b40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:28.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.410+0000 7f786483f640 1 -- 192.168.123.103:0/532460916 >> 192.168.123.103:0/532460916 conn(0x7f785c0fc480 msgr2=0x7f785c0fe8a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:28.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.411+0000 7f786483f640 1 -- 192.168.123.103:0/532460916 shutdown_connections 2026-03-09T16:21:28.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.411+0000 7f786483f640 1 -- 192.168.123.103:0/532460916 wait complete. 
2026-03-09T16:21:28.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.411+0000 7f786483f640 1 Processor -- start 2026-03-09T16:21:28.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.411+0000 7f786483f640 1 -- start start 2026-03-09T16:21:28.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.411+0000 7f786483f640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f785c100760 0x7f785c1964f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:28.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.411+0000 7f786483f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f785c106760 0x7f785c196a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:28.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.411+0000 7f786483f640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f785c1970c0 con 0x7f785c106760 2026-03-09T16:21:28.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.411+0000 7f786483f640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f785c19ae30 con 0x7f785c100760 2026-03-09T16:21:28.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.412+0000 7f7861db3640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f785c106760 0x7f785c196a30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:28.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.412+0000 7f7861db3640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f785c106760 0x7f785c196a30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42986/0 (socket says 192.168.123.103:42986) 2026-03-09T16:21:28.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.412+0000 7f7861db3640 1 -- 192.168.123.103:0/1693359093 learned_addr learned my addr 192.168.123.103:0/1693359093 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:28.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.412+0000 7f7861db3640 1 -- 192.168.123.103:0/1693359093 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f785c100760 msgr2=0x7f785c1964f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:21:28.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.412+0000 7f78625b4640 1 --2- 192.168.123.103:0/1693359093 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f785c100760 0x7f785c1964f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:28.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.412+0000 7f7861db3640 1 --2- 192.168.123.103:0/1693359093 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f785c100760 0x7f785c1964f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:28.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.412+0000 7f7861db3640 1 -- 192.168.123.103:0/1693359093 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7848009660 con 
0x7f785c106760 2026-03-09T16:21:28.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.412+0000 7f7861db3640 1 --2- 192.168.123.103:0/1693359093 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f785c106760 0x7f785c196a30 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f7848002410 tx=0x7f7848004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:28.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.412+0000 7f78477fe640 1 -- 192.168.123.103:0/1693359093 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f784803d070 con 0x7f785c106760 2026-03-09T16:21:28.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.412+0000 7f78477fe640 1 -- 192.168.123.103:0/1693359093 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f78480043b0 con 0x7f785c106760 2026-03-09T16:21:28.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.412+0000 7f78477fe640 1 -- 192.168.123.103:0/1693359093 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7848041850 con 0x7f785c106760 2026-03-09T16:21:28.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.413+0000 7f786483f640 1 -- 192.168.123.103:0/1693359093 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f785c19b0b0 con 0x7f785c106760 2026-03-09T16:21:28.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.413+0000 7f786483f640 1 -- 192.168.123.103:0/1693359093 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f785c19b5a0 con 0x7f785c106760 2026-03-09T16:21:28.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.414+0000 7f78477fe640 1 -- 192.168.123.103:0/1693359093 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f784802fc90 con 0x7f785c106760 2026-03-09T16:21:28.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.414+0000 7f786483f640 1 -- 192.168.123.103:0/1693359093 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f785c101ea0 con 0x7f785c106760 2026-03-09T16:21:28.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.415+0000 7f78477fe640 1 --2- 192.168.123.103:0/1693359093 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f78380778e0 0x7f7838079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:28.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.415+0000 7f78477fe640 1 -- 192.168.123.103:0/1693359093 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f78480be6f0 con 0x7f785c106760 2026-03-09T16:21:28.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.415+0000 7f78625b4640 1 --2- 192.168.123.103:0/1693359093 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f78380778e0 0x7f7838079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:28.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.416+0000 7f78625b4640 1 --2- 192.168.123.103:0/1693359093 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f78380778e0 0x7f7838079da0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f785400a8b0 tx=0x7f7854008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:28.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.418+0000 7f78477fe640 1 -- 192.168.123.103:0/1693359093 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7848086d20 con 0x7f785c106760 2026-03-09T16:21:28.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:28 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/3672197671' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:21:28.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.552+0000 7f786483f640 1 -- 192.168.123.103:0/1693359093 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f785c100bc0 con 0x7f785c106760 2026-03-09T16:21:28.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.553+0000 7f78477fe640 1 -- 192.168.123.103:0/1693359093 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f7848086470 con 0x7f785c106760 2026-03-09T16:21:28.557 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.557+0000 7f786483f640 1 -- 192.168.123.103:0/1693359093 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f78380778e0 msgr2=0x7f7838079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:28.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.557+0000 7f786483f640 1 --2- 192.168.123.103:0/1693359093 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f78380778e0 0x7f7838079da0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f785400a8b0 tx=0x7f7854008040 comp rx=0 tx=0).stop 2026-03-09T16:21:28.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.557+0000 7f786483f640 1 -- 192.168.123.103:0/1693359093 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f785c106760 msgr2=0x7f785c196a30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:28.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.557+0000 7f786483f640 1 --2- 192.168.123.103:0/1693359093 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f785c106760 0x7f785c196a30 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f7848002410 tx=0x7f7848004290 comp rx=0 tx=0).stop 2026-03-09T16:21:28.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.557+0000 7f786483f640 1 -- 192.168.123.103:0/1693359093 shutdown_connections 2026-03-09T16:21:28.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.557+0000 7f786483f640 1 --2- 192.168.123.103:0/1693359093 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f78380778e0 0x7f7838079da0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:28.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.557+0000 7f786483f640 1 --2- 192.168.123.103:0/1693359093 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f785c106760 0x7f785c196a30 secure :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f7848002410 tx=0x7f7848004290 comp rx=0 tx=0).stop 
2026-03-09T16:21:28.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.557+0000 7f786483f640 1 --2- 192.168.123.103:0/1693359093 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f785c100760 0x7f785c1964f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:28.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.557+0000 7f786483f640 1 -- 192.168.123.103:0/1693359093 >> 192.168.123.103:0/1693359093 conn(0x7f785c0fc480 msgr2=0x7f785c10a720 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:28.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.557+0000 7f786483f640 1 -- 192.168.123.103:0/1693359093 shutdown_connections 2026-03-09T16:21:28.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.558+0000 7f786483f640 1 -- 192.168.123.103:0/1693359093 wait complete. 2026-03-09T16:21:28.566 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T16:21:28.621 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | keys'"'"' | grep $sha1' 2026-03-09T16:21:28.763 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:28.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.989+0000 7f7b17b7a640 1 -- 192.168.123.103:0/3144683238 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7b10075720 msgr2=0x7f7b10075b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:28.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.989+0000 7f7b17b7a640 1 --2- 192.168.123.103:0/3144683238 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7b10075720 0x7f7b10075b00 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f7b000099b0 tx=0x7f7b0002f220 comp rx=0 tx=0).stop 2026-03-09T16:21:28.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.990+0000 7f7b17b7a640 1 -- 192.168.123.103:0/3144683238 shutdown_connections 2026-03-09T16:21:28.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.990+0000 7f7b17b7a640 1 --2- 192.168.123.103:0/3144683238 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b10076040 0x7f7b10111360 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:28.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.990+0000 7f7b17b7a640 1 --2- 192.168.123.103:0/3144683238 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7b10075720 0x7f7b10075b00 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:28.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.990+0000 7f7b17b7a640 1 -- 192.168.123.103:0/3144683238 >> 192.168.123.103:0/3144683238 conn(0x7f7b100fe700 msgr2=0x7f7b10100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:28.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.990+0000 7f7b17b7a640 1 -- 192.168.123.103:0/3144683238 shutdown_connections 2026-03-09T16:21:28.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.990+0000 7f7b17b7a640 1 -- 192.168.123.103:0/3144683238 wait complete. 
2026-03-09T16:21:28.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.990+0000 7f7b17b7a640 1 Processor -- start 2026-03-09T16:21:28.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.991+0000 7f7b17b7a640 1 -- start start 2026-03-09T16:21:28.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.991+0000 7f7b17b7a640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b10075720 0x7f7b1010bd80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:28.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.991+0000 7f7b17b7a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7b10076040 0x7f7b1010c2c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:28.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.991+0000 7f7b17b7a640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7b1010c950 con 0x7f7b10076040 2026-03-09T16:21:28.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.991+0000 7f7b17b7a640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7b101ad2d0 con 0x7f7b10075720 2026-03-09T16:21:28.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.991+0000 7f7b150ee640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7b10076040 0x7f7b1010c2c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:28.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.991+0000 7f7b150ee640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7b10076040 0x7f7b1010c2c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56500/0 (socket says 192.168.123.103:56500) 2026-03-09T16:21:28.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.991+0000 7f7b150ee640 1 -- 192.168.123.103:0/4115073724 learned_addr learned my addr 192.168.123.103:0/4115073724 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:28.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.991+0000 7f7b150ee640 1 -- 192.168.123.103:0/4115073724 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b10075720 msgr2=0x7f7b1010bd80 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:21:28.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.991+0000 7f7b150ee640 1 --2- 192.168.123.103:0/4115073724 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b10075720 0x7f7b1010bd80 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:28.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.991+0000 7f7b150ee640 1 -- 192.168.123.103:0/4115073724 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7b00009660 con 0x7f7b10076040 2026-03-09T16:21:28.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.992+0000 7f7b150ee640 1 --2- 192.168.123.103:0/4115073724 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7b10076040 0x7f7b1010c2c0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f7b0400da30 tx=0x7f7b0400df00 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:28.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.992+0000 7f7afeffd640 1 -- 192.168.123.103:0/4115073724 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7b0400bb80 con 0x7f7b10076040 2026-03-09T16:21:28.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.992+0000 7f7afeffd640 1 -- 192.168.123.103:0/4115073724 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f7b04004590 con 0x7f7b10076040 2026-03-09T16:21:28.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.992+0000 7f7afeffd640 1 -- 192.168.123.103:0/4115073724 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7b04010460 con 0x7f7b10076040 2026-03-09T16:21:28.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.992+0000 7f7b17b7a640 1 -- 192.168.123.103:0/4115073724 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7b101ad4d0 con 0x7f7b10076040 2026-03-09T16:21:28.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.992+0000 7f7b17b7a640 1 -- 192.168.123.103:0/4115073724 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7b101ada20 con 0x7f7b10076040 2026-03-09T16:21:28.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.993+0000 7f7b17b7a640 1 -- 192.168.123.103:0/4115073724 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7b10076e60 con 0x7f7b10076040 2026-03-09T16:21:28.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.996+0000 7f7afeffd640 1 -- 192.168.123.103:0/4115073724 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7b0400bce0 con 0x7f7b10076040 2026-03-09T16:21:28.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.997+0000 7f7afeffd640 1 --2- 192.168.123.103:0/4115073724 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f7aec0778e0 0x7f7aec079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:28.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.997+0000 7f7afeffd640 1 -- 192.168.123.103:0/4115073724 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f7b0409a110 con 0x7f7b10076040 2026-03-09T16:21:28.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.997+0000 7f7afeffd640 1 -- 192.168.123.103:0/4115073724 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7b0409a520 con 0x7f7b10076040 2026-03-09T16:21:28.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.997+0000 7f7b158ef640 1 --2- 192.168.123.103:0/4115073724 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f7aec0778e0 0x7f7aec079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:28.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:28.997+0000 7f7b158ef640 1 --2- 192.168.123.103:0/4115073724 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f7aec0778e0 0x7f7aec079da0 secure :-1 s=READY 
pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f7b000099b0 tx=0x7f7b000023d0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:29.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.134+0000 7f7b17b7a640 1 -- 192.168.123.103:0/4115073724 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f7b101ad660 con 0x7f7b10076040 2026-03-09T16:21:29.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.135+0000 7f7afeffd640 1 -- 192.168.123.103:0/4115073724 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f7b101ad660 con 0x7f7b10076040 2026-03-09T16:21:29.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.137+0000 7f7b17b7a640 1 -- 192.168.123.103:0/4115073724 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f7aec0778e0 msgr2=0x7f7aec079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:29.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.137+0000 7f7b17b7a640 1 --2- 192.168.123.103:0/4115073724 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f7aec0778e0 0x7f7aec079da0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f7b000099b0 tx=0x7f7b000023d0 comp rx=0 tx=0).stop 2026-03-09T16:21:29.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.137+0000 7f7b17b7a640 1 -- 192.168.123.103:0/4115073724 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7b10076040 msgr2=0x7f7b1010c2c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:29.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.137+0000 7f7b17b7a640 1 --2- 192.168.123.103:0/4115073724 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7b10076040 0x7f7b1010c2c0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f7b0400da30 tx=0x7f7b0400df00 comp rx=0 tx=0).stop 2026-03-09T16:21:29.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.137+0000 7f7b17b7a640 1 -- 192.168.123.103:0/4115073724 shutdown_connections 2026-03-09T16:21:29.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.137+0000 7f7b17b7a640 1 --2- 192.168.123.103:0/4115073724 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f7aec0778e0 0x7f7aec079da0 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:29.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.137+0000 7f7b17b7a640 1 --2- 192.168.123.103:0/4115073724 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7b10076040 0x7f7b1010c2c0 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:29.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.137+0000 7f7b17b7a640 1 --2- 192.168.123.103:0/4115073724 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b10075720 0x7f7b1010bd80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:29.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.137+0000 7f7b17b7a640 1 -- 192.168.123.103:0/4115073724 >> 192.168.123.103:0/4115073724 conn(0x7f7b100fe700 msgr2=0x7f7b100ffde0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:29.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.137+0000 7f7b17b7a640 1 -- 
192.168.123.103:0/4115073724 shutdown_connections 2026-03-09T16:21:29.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.137+0000 7f7b17b7a640 1 -- 192.168.123.103:0/4115073724 wait complete. 2026-03-09T16:21:29.146 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)" 2026-03-09T16:21:29.202 DEBUG:teuthology.parallel:result is None 2026-03-09T16:21:29.203 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T16:21:29.206 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T16:21:29.206 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- bash -c 'ceph fs dump' 2026-03-09T16:21:29.357 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:29.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:29 vm03.local ceph-mon[133973]: from='client.34326 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:21:29.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:29 vm03.local ceph-mon[133973]: pgmap v174: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:29.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:29 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/1693359093' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:21:29.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:29 vm05.local ceph-mon[108543]: from='client.34326 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T16:21:29.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:29 vm05.local ceph-mon[108543]: pgmap v174: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:29.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:29 vm05.local ceph-mon[108543]: from='client.? 
192.168.123.103:0/1693359093' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:21:29.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.594+0000 7fc3bf4bb640 1 -- 192.168.123.103:0/151740120 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3b81029d0 msgr2=0x7fc3b8102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:29.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.594+0000 7fc3bf4bb640 1 --2- 192.168.123.103:0/151740120 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3b81029d0 0x7fc3b8102e30 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7fc3a80099b0 tx=0x7fc3a802f220 comp rx=0 tx=0).stop 2026-03-09T16:21:29.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.594+0000 7fc3bf4bb640 1 -- 192.168.123.103:0/151740120 shutdown_connections 2026-03-09T16:21:29.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.594+0000 7fc3bf4bb640 1 --2- 192.168.123.103:0/151740120 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3b81029d0 0x7fc3b8102e30 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:29.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.594+0000 7fc3bf4bb640 1 --2- 192.168.123.103:0/151740120 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc3b81089d0 0x7fc3b8108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:29.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.594+0000 7fc3bf4bb640 1 -- 192.168.123.103:0/151740120 >> 192.168.123.103:0/151740120 conn(0x7fc3b80fe710 msgr2=0x7fc3b8100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:29.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.595+0000 7fc3bf4bb640 1 -- 192.168.123.103:0/151740120 shutdown_connections 2026-03-09T16:21:29.595 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.595+0000 7fc3bf4bb640 1 -- 192.168.123.103:0/151740120 wait complete. 
2026-03-09T16:21:29.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.595+0000 7fc3bf4bb640 1 Processor -- start 2026-03-09T16:21:29.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.595+0000 7fc3bf4bb640 1 -- start start 2026-03-09T16:21:29.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.595+0000 7fc3bf4bb640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc3b81029d0 0x7fc3b8075700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:29.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.595+0000 7fc3bf4bb640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3b81089d0 0x7fc3b8075c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:29.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.595+0000 7fc3bf4bb640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc3b8079780 con 0x7fc3b81089d0 2026-03-09T16:21:29.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.595+0000 7fc3bf4bb640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc3b80798f0 con 0x7fc3b81029d0 2026-03-09T16:21:29.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.596+0000 7fc3bca2f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3b81089d0 0x7fc3b8075c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:29.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.596+0000 7fc3bca2f640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3b81089d0 0x7fc3b8075c40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56514/0 (socket says 192.168.123.103:56514) 2026-03-09T16:21:29.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.596+0000 7fc3bca2f640 1 -- 192.168.123.103:0/3664594487 learned_addr learned my addr 192.168.123.103:0/3664594487 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:29.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.596+0000 7fc3bca2f640 1 -- 192.168.123.103:0/3664594487 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc3b81029d0 msgr2=0x7fc3b8075700 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:21:29.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.596+0000 7fc3bca2f640 1 --2- 192.168.123.103:0/3664594487 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc3b81029d0 0x7fc3b8075700 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:29.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.596+0000 7fc3bca2f640 1 -- 192.168.123.103:0/3664594487 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc3a8009660 con 0x7fc3b81089d0 2026-03-09T16:21:29.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.596+0000 7fc3bca2f640 1 --2- 192.168.123.103:0/3664594487 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3b81089d0 0x7fc3b8075c40 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fc3a8009980 tx=0x7fc3a8031d20 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:29.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.596+0000 7fc3ae7fc640 1 -- 192.168.123.103:0/3664594487 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc3a803d070 con 0x7fc3b81089d0 2026-03-09T16:21:29.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.597+0000 7fc3bf4bb640 1 -- 192.168.123.103:0/3664594487 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc3b80761e0 con 0x7fc3b81089d0 2026-03-09T16:21:29.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.597+0000 7fc3bf4bb640 1 -- 192.168.123.103:0/3664594487 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc3b81a90d0 con 0x7fc3b81089d0 2026-03-09T16:21:29.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.598+0000 7fc3ae7fc640 1 -- 192.168.123.103:0/3664594487 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc3a80043d0 con 0x7fc3b81089d0 2026-03-09T16:21:29.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.598+0000 7fc3ae7fc640 1 -- 192.168.123.103:0/3664594487 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc3a8038d80 con 0x7fc3b81089d0 2026-03-09T16:21:29.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.598+0000 7fc3ae7fc640 1 -- 192.168.123.103:0/3664594487 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc3a8031070 con 0x7fc3b81089d0 2026-03-09T16:21:29.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.598+0000 7fc3ae7fc640 1 --2- 192.168.123.103:0/3664594487 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc38c077a80 0x7fc38c079f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:29.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.599+0000 7fc3ae7fc640 1 -- 192.168.123.103:0/3664594487 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fc3a80bf6e0 con 0x7fc3b81089d0 2026-03-09T16:21:29.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.599+0000 7fc3bd230640 1 --2- 192.168.123.103:0/3664594487 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc38c077a80 0x7fc38c079f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:29.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.599+0000 7fc3bd230640 1 --2- 192.168.123.103:0/3664594487 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc38c077a80 0x7fc38c079f40 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fc3a0005fd0 tx=0x7fc3a0009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:29.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.600+0000 7fc3bf4bb640 1 -- 192.168.123.103:0/3664594487 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc3b8104110 con 0x7fc3b81089d0 2026-03-09T16:21:29.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.602+0000 7fc3ae7fc640 1 -- 192.168.123.103:0/3664594487 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc3a8087d90 con 0x7fc3b81089d0 2026-03-09T16:21:29.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.722+0000 7fc3bf4bb640 1 -- 192.168.123.103:0/3664594487 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fc3b8102e30 con 0x7fc3b81089d0 2026-03-09T16:21:29.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.722+0000 7fc3ae7fc640 1 -- 192.168.123.103:0/3664594487 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 32 v32) v1 ==== 76+0+1974 (secure 0 0 0) 0x7fc3a80874e0 con 0x7fc3b81089d0 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:e32 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:btime 2026-03-09T16:20:09:253659+0000 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:epoch 32 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T16:12:12.560035+0000 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T16:20:08.326420+0000 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 80 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-09T16:21:29.724 INFO:teuthology.orchestra.run.vm03.stdout:in 0 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:up {0=34272} 2026-03-09T16:21:29.725 
INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 34272 members: 34272 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kntrco{0:34272} state up:active seq 7 join_fscid=1 addr [v2:192.168.123.103:6828/2230073446,v1:192.168.123.103:6829/2230073446] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons: 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.jgzfvu{-1:34276} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6824/3112850580,v1:192.168.123.105:6825/3112850580] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.kygyjl{-1:44223} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.103:6826/892320051,v1:192.168.123.103:6827/892320051] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm05.sqhria{-1:44255} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6826/2319193942,v1:192.168.123.105:6827/2319193942] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 32 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.725+0000 7fc3bf4bb640 1 -- 192.168.123.103:0/3664594487 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc38c077a80 msgr2=0x7fc38c079f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:29.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.725+0000 7fc3bf4bb640 1 --2- 192.168.123.103:0/3664594487 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc38c077a80 0x7fc38c079f40 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fc3a0005fd0 tx=0x7fc3a0009450 comp rx=0 tx=0).stop 2026-03-09T16:21:29.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.725+0000 7fc3bf4bb640 1 -- 192.168.123.103:0/3664594487 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3b81089d0 msgr2=0x7fc3b8075c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:29.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.725+0000 7fc3bf4bb640 1 --2- 192.168.123.103:0/3664594487 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3b81089d0 0x7fc3b8075c40 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fc3a8009980 tx=0x7fc3a8031d20 comp rx=0 tx=0).stop 2026-03-09T16:21:29.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.725+0000 
7fc3bf4bb640 1 -- 192.168.123.103:0/3664594487 shutdown_connections 2026-03-09T16:21:29.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.725+0000 7fc3bf4bb640 1 --2- 192.168.123.103:0/3664594487 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc38c077a80 0x7fc38c079f40 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:29.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.725+0000 7fc3bf4bb640 1 --2- 192.168.123.103:0/3664594487 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3b81089d0 0x7fc3b8075c40 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:29.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.725+0000 7fc3bf4bb640 1 --2- 192.168.123.103:0/3664594487 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc3b81029d0 0x7fc3b8075700 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:29.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.726+0000 7fc3bf4bb640 1 -- 192.168.123.103:0/3664594487 >> 192.168.123.103:0/3664594487 conn(0x7fc3b80fe710 msgr2=0x7fc3b810c920 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:29.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.726+0000 7fc3bf4bb640 1 -- 192.168.123.103:0/3664594487 shutdown_connections 2026-03-09T16:21:29.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:29.726+0000 7fc3bf4bb640 1 -- 192.168.123.103:0/3664594487 wait complete. 2026-03-09T16:21:29.767 INFO:teuthology.run_tasks:Running task fs.post_upgrade_checks... 2026-03-09T16:21:29.770 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 2026-03-09T16:21:29.912 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:30.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.159+0000 7f32f500c640 1 -- 192.168.123.103:0/1903635603 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f01089d0 msgr2=0x7f32f0108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:30.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.159+0000 7f32f500c640 1 --2- 192.168.123.103:0/1903635603 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f01089d0 0x7f32f0108db0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f32e40099b0 tx=0x7f32e402f220 comp rx=0 tx=0).stop 2026-03-09T16:21:30.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.161+0000 7f32f500c640 1 -- 192.168.123.103:0/1903635603 shutdown_connections 2026-03-09T16:21:30.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.161+0000 7f32f500c640 1 --2- 192.168.123.103:0/1903635603 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32f01029d0 0x7f32f0102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:30.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.161+0000 7f32f500c640 1 --2- 192.168.123.103:0/1903635603 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f01089d0 0x7f32f0108db0 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:30.162 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.161+0000 7f32f500c640 1 -- 192.168.123.103:0/1903635603 >> 192.168.123.103:0/1903635603 conn(0x7f32f00fe710 msgr2=0x7f32f0100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:30.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.161+0000 7f32f500c640 1 -- 192.168.123.103:0/1903635603 shutdown_connections 2026-03-09T16:21:30.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.161+0000 7f32f500c640 1 -- 192.168.123.103:0/1903635603 wait complete. 2026-03-09T16:21:30.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.161+0000 7f32f500c640 1 Processor -- start 2026-03-09T16:21:30.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.161+0000 7f32f500c640 1 -- start start 2026-03-09T16:21:30.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.162+0000 7f32f500c640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32f01029d0 0x7f32f01a0630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:30.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.162+0000 7f32f500c640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f01089d0 0x7f32f01a0b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:30.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.162+0000 7f32f500c640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f32f01a1190 con 0x7f32f01089d0 2026-03-09T16:21:30.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.162+0000 7f32f500c640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f32f019a770 con 0x7f32f01029d0 2026-03-09T16:21:30.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.162+0000 7f32eed76640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32f01029d0 0x7f32f01a0630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:30.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.162+0000 7f32eed76640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32f01029d0 0x7f32f01a0630 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:43308/0 (socket says 192.168.123.103:43308) 2026-03-09T16:21:30.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.162+0000 7f32eed76640 1 -- 192.168.123.103:0/3892682471 learned_addr learned my addr 192.168.123.103:0/3892682471 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:30.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.162+0000 7f32ee575640 1 --2- 192.168.123.103:0/3892682471 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f01089d0 0x7f32f01a0b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:30.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.162+0000 7f32eed76640 1 -- 192.168.123.103:0/3892682471 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f01089d0 msgr2=0x7f32f01a0b70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:30.163 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.162+0000 7f32eed76640 1 --2- 192.168.123.103:0/3892682471 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f01089d0 0x7f32f01a0b70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:30.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.162+0000 7f32eed76640 1 -- 192.168.123.103:0/3892682471 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f32e4009660 con 0x7f32f01029d0 2026-03-09T16:21:30.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.162+0000 7f32ee575640 1 --2- 192.168.123.103:0/3892682471 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f01089d0 0x7f32f01a0b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T16:21:30.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.162+0000 7f32eed76640 1 --2- 192.168.123.103:0/3892682471 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32f01029d0 0x7f32f01a0630 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f32e4009980 tx=0x7f32e4004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:30.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.163+0000 7f32cffff640 1 -- 192.168.123.103:0/3892682471 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f32e403d070 con 0x7f32f01029d0 2026-03-09T16:21:30.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.163+0000 7f32f500c640 1 -- 192.168.123.103:0/3892682471 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f32f019a9f0 con 0x7f32f01029d0 2026-03-09T16:21:30.164 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.163+0000 7f32cffff640 1 -- 192.168.123.103:0/3892682471 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f32e4004400 con 0x7f32f01029d0 2026-03-09T16:21:30.164 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.163+0000 7f32cffff640 1 -- 192.168.123.103:0/3892682471 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f32e40416b0 con 0x7f32f01029d0 2026-03-09T16:21:30.164 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.163+0000 7f32f500c640 1 -- 192.168.123.103:0/3892682471 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f32f019aee0 con 0x7f32f01029d0 2026-03-09T16:21:30.164 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.164+0000 7f32f500c640 1 -- 192.168.123.103:0/3892682471 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f32b4005350 con 0x7f32f01029d0 2026-03-09T16:21:30.166 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.164+0000 7f32cffff640 1 -- 192.168.123.103:0/3892682471 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f32e4038730 con 0x7f32f01029d0 2026-03-09T16:21:30.166 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.165+0000 7f32cffff640 1 --2- 192.168.123.103:0/3892682471 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f32c4077890 0x7f32c4079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 
tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:30.166 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.165+0000 7f32ee575640 1 --2- 192.168.123.103:0/3892682471 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f32c4077890 0x7f32c4079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:30.166 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.165+0000 7f32cffff640 1 -- 192.168.123.103:0/3892682471 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f32e408af50 con 0x7f32f01029d0 2026-03-09T16:21:30.166 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.165+0000 7f32ee575640 1 --2- 192.168.123.103:0/3892682471 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f32c4077890 0x7f32c4079d50 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f32d80046d0 tx=0x7f32d8004420 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:30.168 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.167+0000 7f32cffff640 1 -- 192.168.123.103:0/3892682471 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f32e40c3050 con 0x7f32f01029d0 2026-03-09T16:21:30.282 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:30 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/4115073724' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:21:30.282 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:30 vm03.local ceph-mon[133973]: from='client.? 
192.168.123.103:0/3664594487' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:21:30.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.281+0000 7f32f500c640 1 -- 192.168.123.103:0/3892682471 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f32b40051c0 con 0x7f32f01029d0 2026-03-09T16:21:30.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.282+0000 7f32cffff640 1 -- 192.168.123.103:0/3892682471 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 32 v32) v1 ==== 94+0+5261 (secure 0 0 0) 0x7f32e4086970 con 0x7f32f01029d0 2026-03-09T16:21:30.284 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:30.286 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":32,"btime":"2026-03-09T16:20:09:253659+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34276,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/3112850580","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":3112850580},{"type":"v1","addr":"192.168.123.105:6825","nonce":3112850580}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15},{"gid":44255,"name":"cephfs.vm05.sqhria","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/2319193942","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2319193942},{"type":"v1","addr":"192.168.123.105:6827","nonce":2319193942}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable 
ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":32,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:20:08.326420+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":80,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34272},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34272":{"gid":34272,"name":"cephfs.vm03.kntrco","rank":0,"incarnation":27,"state":"up:active","state_seq":7,"addr":"192.168.123.103:6829/2230073446","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2230073446},{"type":"v1","addr":"192.168.123.103:6829","nonce":2230073446}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34272,"qdb_cluster":[34272]},"id":1}]} 2026-03-09T16:21:30.286 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 32 2026-03-09T16:21:30.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.287+0000 7f32f500c640 1 -- 192.168.123.103:0/3892682471 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f32c4077890 msgr2=0x7f32c4079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:30.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.287+0000 7f32f500c640 1 --2- 192.168.123.103:0/3892682471 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f32c4077890 0x7f32c4079d50 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f32d80046d0 tx=0x7f32d8004420 comp rx=0 tx=0).stop 2026-03-09T16:21:30.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.287+0000 7f32f500c640 1 -- 192.168.123.103:0/3892682471 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32f01029d0 
msgr2=0x7f32f01a0630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:30.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.287+0000 7f32f500c640 1 --2- 192.168.123.103:0/3892682471 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32f01029d0 0x7f32f01a0630 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f32e4009980 tx=0x7f32e4004290 comp rx=0 tx=0).stop 2026-03-09T16:21:30.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.287+0000 7f32f500c640 1 -- 192.168.123.103:0/3892682471 shutdown_connections 2026-03-09T16:21:30.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.287+0000 7f32f500c640 1 --2- 192.168.123.103:0/3892682471 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f32c4077890 0x7f32c4079d50 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:30.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.287+0000 7f32f500c640 1 --2- 192.168.123.103:0/3892682471 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f01089d0 0x7f32f01a0b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:30.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.287+0000 7f32f500c640 1 --2- 192.168.123.103:0/3892682471 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f32f01029d0 0x7f32f01a0630 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:30.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.287+0000 7f32f500c640 1 -- 192.168.123.103:0/3892682471 >> 192.168.123.103:0/3892682471 conn(0x7f32f00fe710 msgr2=0x7f32f00feaf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:30.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.288+0000 7f32f500c640 1 -- 192.168.123.103:0/3892682471 shutdown_connections 2026-03-09T16:21:30.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.288+0000 7f32f500c640 1 -- 192.168.123.103:0/3892682471 wait complete. 2026-03-09T16:21:30.332 DEBUG:tasks.fs:checking fs fscid=1,name=cephfs state = {'epoch': 9, 'max_mds': 1, 'flags': 18} 2026-03-09T16:21:30.332 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 10 2026-03-09T16:21:30.469 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:30.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:30 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/4115073724' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T16:21:30.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:30 vm05.local ceph-mon[108543]: from='client.? 
192.168.123.103:0/3664594487' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T16:21:30.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.711+0000 7f88af59c640 1 -- 192.168.123.103:0/1425653685 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a81089d0 msgr2=0x7f88a8108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:30.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.711+0000 7f88af59c640 1 --2- 192.168.123.103:0/1425653685 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a81089d0 0x7f88a8108db0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f889c0098e0 tx=0x7f889c02f190 comp rx=0 tx=0).stop 2026-03-09T16:21:30.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.712+0000 7f88af59c640 1 -- 192.168.123.103:0/1425653685 shutdown_connections 2026-03-09T16:21:30.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.712+0000 7f88af59c640 1 --2- 192.168.123.103:0/1425653685 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88a81029d0 0x7f88a8102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:30.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.712+0000 7f88af59c640 1 --2- 192.168.123.103:0/1425653685 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a81089d0 0x7f88a8108db0 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:30.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.712+0000 7f88af59c640 1 -- 192.168.123.103:0/1425653685 >> 192.168.123.103:0/1425653685 conn(0x7f88a80fe710 msgr2=0x7f88a8100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:30.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.712+0000 7f88af59c640 1 -- 192.168.123.103:0/1425653685 shutdown_connections 2026-03-09T16:21:30.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.712+0000 7f88af59c640 1 -- 192.168.123.103:0/1425653685 wait complete. 
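The fs.post_upgrade_checks task above records the pre-upgrade fsmap state (epoch 9, max_mds 1) and then replays the map history epoch by epoch ("ceph fs dump --format=json 10", then 11, 12, ...) through the cephadm shell. A minimal sketch of that walk, reusing the FSID, container image, cephadm path and command form copied from the log lines; the helper name and the use of subprocess are illustrative, not the task's actual implementation:

```python
# Illustrative sketch only (not the teuthology task's code): reproduce the
# epoch-by-epoch fsmap walk shown in this log.  FSID, image, cephadm path and
# the "ceph fs dump --format=json <epoch>" form are taken from the log lines.
import json
import subprocess

FSID = "2b05df78-1bd2-11f1-83c0-c950214d6edc"
IMAGE = "quay.ceph.io/ceph-ci/ceph:reef"
CEPHADM = "/home/ubuntu/cephtest/cephadm"

def fs_dump(epoch=None):
    """Return the parsed fsmap, optionally for a specific historical epoch."""
    cmd = ["sudo", CEPHADM, "--image", IMAGE, "shell", "--fsid", FSID,
           "--", "ceph", "fs", "dump", "--format=json"]
    if epoch is not None:
        cmd.append(str(epoch))
    return json.loads(subprocess.check_output(cmd))

current = fs_dump()                          # epoch 32 in this run
for e in range(10, current["epoch"] + 1):    # pre-upgrade state was epoch 9
    fsmap = fs_dump(e)
    states = [info["state"]
              for fs in fsmap["filesystems"]
              for info in fs["mdsmap"]["info"].values()]
    print(e, states)
```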
2026-03-09T16:21:30.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.713+0000 7f88af59c640 1 Processor -- start 2026-03-09T16:21:30.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.713+0000 7f88af59c640 1 -- start start 2026-03-09T16:21:30.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.713+0000 7f88af59c640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a81029d0 0x7f88a81a0550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:30.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.713+0000 7f88af59c640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88a81089d0 0x7f88a81a0a90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:30.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.713+0000 7f88af59c640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f88a81a10b0 con 0x7f88a81089d0 2026-03-09T16:21:30.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.713+0000 7f88af59c640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f88a819a640 con 0x7f88a81029d0 2026-03-09T16:21:30.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.713+0000 7f88ad311640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a81029d0 0x7f88a81a0550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:30.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.713+0000 7f88ad311640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a81029d0 0x7f88a81a0550 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:43328/0 (socket says 192.168.123.103:43328) 2026-03-09T16:21:30.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.713+0000 7f88ad311640 1 -- 192.168.123.103:0/413419614 learned_addr learned my addr 192.168.123.103:0/413419614 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:30.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.714+0000 7f88acb10640 1 --2- 192.168.123.103:0/413419614 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88a81089d0 0x7f88a81a0a90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:30.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.714+0000 7f88ad311640 1 -- 192.168.123.103:0/413419614 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88a81089d0 msgr2=0x7f88a81a0a90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:30.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.714+0000 7f88ad311640 1 --2- 192.168.123.103:0/413419614 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88a81089d0 0x7f88a81a0a90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:30.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.714+0000 7f88ad311640 1 -- 192.168.123.103:0/413419614 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8898009660 con 
0x7f88a81029d0 2026-03-09T16:21:30.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.714+0000 7f88acb10640 1 --2- 192.168.123.103:0/413419614 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88a81089d0 0x7f88a81a0a90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:21:30.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.714+0000 7f88ad311640 1 --2- 192.168.123.103:0/413419614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a81029d0 0x7f88a81a0550 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f889c0098b0 tx=0x7f889c004550 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:30.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.714+0000 7f88967fc640 1 -- 192.168.123.103:0/413419614 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f889c03d070 con 0x7f88a81029d0 2026-03-09T16:21:30.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.714+0000 7f88af59c640 1 -- 192.168.123.103:0/413419614 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f889c009590 con 0x7f88a81029d0 2026-03-09T16:21:30.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.715+0000 7f88af59c640 1 -- 192.168.123.103:0/413419614 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f88a819ac20 con 0x7f88a81029d0 2026-03-09T16:21:30.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.715+0000 7f88967fc640 1 -- 192.168.123.103:0/413419614 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f889c02fbc0 con 0x7f88a81029d0 2026-03-09T16:21:30.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.715+0000 7f88967fc640 1 -- 192.168.123.103:0/413419614 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f889c041660 con 0x7f88a81029d0 2026-03-09T16:21:30.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.716+0000 7f88967fc640 1 -- 192.168.123.103:0/413419614 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f889c049050 con 0x7f88a81029d0 2026-03-09T16:21:30.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.716+0000 7f88967fc640 1 --2- 192.168.123.103:0/413419614 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f887c0778e0 0x7f887c079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:30.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.716+0000 7f88acb10640 1 --2- 192.168.123.103:0/413419614 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f887c0778e0 0x7f887c079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:30.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.717+0000 7f88967fc640 1 -- 192.168.123.103:0/413419614 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f889c0be3e0 con 0x7f88a81029d0 2026-03-09T16:21:30.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.717+0000 7f88acb10640 1 --2- 192.168.123.103:0/413419614 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f887c0778e0 0x7f887c079da0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f88a819bab0 tx=0x7f8898009340 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:30.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.717+0000 7f88af59c640 1 -- 192.168.123.103:0/413419614 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f88a8104110 con 0x7f88a81029d0 2026-03-09T16:21:30.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.720+0000 7f88967fc640 1 -- 192.168.123.103:0/413419614 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f889c086a10 con 0x7f88a81029d0 2026-03-09T16:21:30.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.840+0000 7f88af59c640 1 -- 192.168.123.103:0/413419614 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 10, "format": "json"} v 0) v1 -- 0x7f88a8102e30 con 0x7f88a81029d0 2026-03-09T16:21:30.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.842+0000 7f88967fc640 1 -- 192.168.123.103:0/413419614 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 10, "format": "json"}]=0 dumped fsmap epoch 10 v32) v1 ==== 107+0+4917 (secure 0 0 0) 0x7f889c049330 con 0x7f88a81029d0 2026-03-09T16:21:30.843 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:30.843 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":10,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14492,"name":"cephfs.vm03.kntrco","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/3419491835","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3419491835},{"type":"v1","addr":"192.168.123.103:6829","nonce":3419491835}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":7},{"gid":24287,"name":"cephfs.vm05.sqhria","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in 
separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":7},{"gid":24291,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/1621230713","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":1621230713},{"type":"v1","addr":"192.168.123.105:6825","nonce":1621230713}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":10,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:12:19.641182+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":41,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14476},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14476":{"gid":14476,"name":"cephfs.vm03.kygyjl","rank":0,"incarnation":9,"state":"up:reconnect","state_seq":3,"addr":"192.168.123.103:6827/1622851291","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":1622851291},{"type":"v1","addr":"192.168.123.103:6827","nonce":1622851291}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:30.843 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 10 2026-03-09T16:21:30.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.845+0000 7f88af59c640 1 -- 192.168.123.103:0/413419614 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f887c0778e0 msgr2=0x7f887c079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:30.845 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.845+0000 7f88af59c640 1 --2- 192.168.123.103:0/413419614 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f887c0778e0 0x7f887c079da0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f88a819bab0 tx=0x7f8898009340 comp rx=0 tx=0).stop 2026-03-09T16:21:30.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.845+0000 7f88af59c640 1 -- 192.168.123.103:0/413419614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a81029d0 msgr2=0x7f88a81a0550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:30.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.845+0000 7f88af59c640 1 --2- 192.168.123.103:0/413419614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a81029d0 0x7f88a81a0550 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f889c0098b0 tx=0x7f889c004550 comp rx=0 tx=0).stop 2026-03-09T16:21:30.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.845+0000 7f88af59c640 1 -- 192.168.123.103:0/413419614 shutdown_connections 2026-03-09T16:21:30.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.845+0000 7f88af59c640 1 --2- 192.168.123.103:0/413419614 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f887c0778e0 0x7f887c079da0 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:30.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.845+0000 7f88af59c640 1 --2- 192.168.123.103:0/413419614 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88a81089d0 0x7f88a81a0a90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:30.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.845+0000 7f88af59c640 1 --2- 192.168.123.103:0/413419614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a81029d0 0x7f88a81a0550 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:30.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.846+0000 7f88af59c640 1 -- 192.168.123.103:0/413419614 >> 192.168.123.103:0/413419614 conn(0x7f88a80fe710 msgr2=0x7f88a810b880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:30.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.846+0000 7f88af59c640 1 -- 192.168.123.103:0/413419614 shutdown_connections 2026-03-09T16:21:30.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:30.846+0000 7f88af59c640 1 -- 192.168.123.103:0/413419614 wait complete. 
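The epoch 10 dump above still carries the pre-upgrade compat set (no feature_11/feature_12) and shows rank 0 (gid 14476, cephfs.vm03.kygyjl) in up:reconnect. A hypothetical helper for pulling the rank-0 state and incompat features out of one of these dumps, so the up:reconnect -> up:rejoin -> up:active progression across epochs 10, 11 and 32 can be followed programmatically; the function name is illustrative, the field paths follow the JSON printed above:

```python
def rank0_state(fsmap):
    # Structure follows the JSON above: filesystems[0].mdsmap.up.mds_0 names
    # the gid holding rank 0, and info["gid_<gid>"] carries its daemon state.
    mdsmap = fsmap["filesystems"][0]["mdsmap"]
    gid = mdsmap["up"]["mds_0"]                    # 14476 at epoch 10
    info = mdsmap["info"]["gid_%d" % gid]
    return info["name"], info["state"], sorted(mdsmap["compat"]["incompat"])

# e.g. rank0_state(fs_dump(10)) -> ('cephfs.vm03.kygyjl', 'up:reconnect', [...])
```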
2026-03-09T16:21:30.890 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 11 2026-03-09T16:21:31.034 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:31.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.294+0000 7f345aae3640 1 -- 192.168.123.103:0/2220975561 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34541089d0 msgr2=0x7f3454108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:31.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.294+0000 7f345aae3640 1 --2- 192.168.123.103:0/2220975561 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34541089d0 0x7f3454108db0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f343c0099b0 tx=0x7f343c02f220 comp rx=0 tx=0).stop 2026-03-09T16:21:31.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.294+0000 7f345aae3640 1 -- 192.168.123.103:0/2220975561 shutdown_connections 2026-03-09T16:21:31.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.294+0000 7f345aae3640 1 --2- 192.168.123.103:0/2220975561 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34541029d0 0x7f3454102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:31.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.294+0000 7f345aae3640 1 --2- 192.168.123.103:0/2220975561 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34541089d0 0x7f3454108db0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:31.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.294+0000 7f345aae3640 1 -- 192.168.123.103:0/2220975561 >> 192.168.123.103:0/2220975561 conn(0x7f34540fe710 msgr2=0x7f3454100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:31.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.295+0000 7f345aae3640 1 -- 192.168.123.103:0/2220975561 shutdown_connections 2026-03-09T16:21:31.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.295+0000 7f345aae3640 1 -- 192.168.123.103:0/2220975561 wait complete. 
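Each of these shell invocations spins up a short-lived RADOS client (a fresh nonce address in the messenger trace) which wraps the request into a mon_command payload such as {"prefix": "fs dump", "epoch": 11, "format": "json"}. A minimal sketch of issuing the same payload directly through the python-rados bindings, assuming the bindings are available and ceph.conf plus an admin keyring are reachable at the default path inside the shell container (both are assumptions, not shown in the log):

```python
import json
import rados

cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")   # path is an assumption
cluster.connect()
ret, outbuf, outs = cluster.mon_command(
    json.dumps({"prefix": "fs dump", "epoch": 11, "format": "json"}), b"")
assert ret == 0
fsmap = json.loads(outbuf)   # same JSON the CLI prints to stdout above
print(outs)                  # status string, e.g. "dumped fsmap epoch 11"
cluster.shutdown()
```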
2026-03-09T16:21:31.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.295+0000 7f345aae3640 1 Processor -- start 2026-03-09T16:21:31.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.295+0000 7f345aae3640 1 -- start start 2026-03-09T16:21:31.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.296+0000 7f345aae3640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34541029d0 0x7f34541a0680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:31.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.296+0000 7f345aae3640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34541089d0 0x7f34541a0bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:31.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.296+0000 7f345aae3640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f345419a770 con 0x7f34541029d0 2026-03-09T16:21:31.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.296+0000 7f345aae3640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f345419a8e0 con 0x7f34541089d0 2026-03-09T16:21:31.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.296+0000 7f3458858640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34541029d0 0x7f34541a0680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:31.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.296+0000 7f3458858640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34541029d0 0x7f34541a0680 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56564/0 (socket says 192.168.123.103:56564) 2026-03-09T16:21:31.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.296+0000 7f3458858640 1 -- 192.168.123.103:0/1719375176 learned_addr learned my addr 192.168.123.103:0/1719375176 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:31.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.296+0000 7f3458858640 1 -- 192.168.123.103:0/1719375176 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34541089d0 msgr2=0x7f34541a0bc0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:21:31.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.296+0000 7f3458858640 1 --2- 192.168.123.103:0/1719375176 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34541089d0 0x7f34541a0bc0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:31.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.296+0000 7f3458858640 1 -- 192.168.123.103:0/1719375176 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f343c009660 con 0x7f34541029d0 2026-03-09T16:21:31.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.296+0000 7f3458858640 1 --2- 192.168.123.103:0/1719375176 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34541029d0 0x7f34541a0680 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f343c002410 tx=0x7f343c004290 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:31.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.297+0000 7f3449ffb640 1 -- 192.168.123.103:0/1719375176 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f343c03d070 con 0x7f34541029d0 2026-03-09T16:21:31.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.297+0000 7f3449ffb640 1 -- 192.168.123.103:0/1719375176 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f343c038730 con 0x7f34541029d0 2026-03-09T16:21:31.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.297+0000 7f3449ffb640 1 -- 192.168.123.103:0/1719375176 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f343c0416b0 con 0x7f34541029d0 2026-03-09T16:21:31.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.297+0000 7f345aae3640 1 -- 192.168.123.103:0/1719375176 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f345419ab60 con 0x7f34541029d0 2026-03-09T16:21:31.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.297+0000 7f345aae3640 1 -- 192.168.123.103:0/1719375176 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f345419aec0 con 0x7f34541029d0 2026-03-09T16:21:31.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.299+0000 7f3449ffb640 1 -- 192.168.123.103:0/1719375176 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f343c04b430 con 0x7f34541029d0 2026-03-09T16:21:31.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.299+0000 7f3449ffb640 1 --2- 192.168.123.103:0/1719375176 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f34240779b0 0x7f3424079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:31.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.299+0000 7f3449ffb640 1 -- 192.168.123.103:0/1719375176 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f343c0bf450 con 0x7f34541029d0 2026-03-09T16:21:31.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.300+0000 7f344bfff640 1 --2- 192.168.123.103:0/1719375176 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f34240779b0 0x7f3424079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:31.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.300+0000 7f344bfff640 1 --2- 192.168.123.103:0/1719375176 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f34240779b0 0x7f3424079e70 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f3444005fd0 tx=0x7f3444009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:31.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.300+0000 7f345aae3640 1 -- 192.168.123.103:0/1719375176 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3420005350 con 0x7f34541029d0 2026-03-09T16:21:31.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.303+0000 7f3449ffb640 1 -- 192.168.123.103:0/1719375176 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f343c087b00 con 0x7f34541029d0 2026-03-09T16:21:31.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:31 vm03.local ceph-mon[133973]: pgmap v175: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:31.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:31 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/3892682471' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T16:21:31.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:31 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/413419614' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 10, "format": "json"}]: dispatch 2026-03-09T16:21:31.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.417+0000 7f345aae3640 1 -- 192.168.123.103:0/1719375176 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 11, "format": "json"} v 0) v1 -- 0x7f34200051c0 con 0x7f34541029d0 2026-03-09T16:21:31.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.418+0000 7f3449ffb640 1 -- 192.168.123.103:0/1719375176 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 11, "format": "json"}]=0 dumped fsmap epoch 11 v32) v1 ==== 107+0+4914 (secure 0 0 0) 0x7f343c046090 con 0x7f34541029d0 2026-03-09T16:21:31.419 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:31.419 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":11,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14492,"name":"cephfs.vm03.kntrco","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.103:6829/3419491835","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3419491835},{"type":"v1","addr":"192.168.123.103:6829","nonce":3419491835}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24287,"name":"cephfs.vm05.sqhria","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24291,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/1621230713","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":1621230713},{"type":"v1","addr":"192.168.123.105:6825","nonce":1621230713}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":11,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:12:20.652924+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":41,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14476},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14476":{"gid":14476,"name":"cephfs.vm03.kygyjl","rank":0,"incarnation":9,"state":"up:rejoin","state_seq":4,"addr":"192.168.123.103:6827/1622851291","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":1622851291},{"type":"v1","addr":"192.168.123.103:6827","nonce":1622851291}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:31.419 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 11 2026-03-09T16:21:31.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.420+0000 7f345aae3640 1 -- 192.168.123.103:0/1719375176 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f34240779b0 msgr2=0x7f3424079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:31.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.420+0000 7f345aae3640 1 --2- 
192.168.123.103:0/1719375176 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f34240779b0 0x7f3424079e70 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f3444005fd0 tx=0x7f3444009450 comp rx=0 tx=0).stop 2026-03-09T16:21:31.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.421+0000 7f345aae3640 1 -- 192.168.123.103:0/1719375176 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34541029d0 msgr2=0x7f34541a0680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:31.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.421+0000 7f345aae3640 1 --2- 192.168.123.103:0/1719375176 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34541029d0 0x7f34541a0680 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f343c002410 tx=0x7f343c004290 comp rx=0 tx=0).stop 2026-03-09T16:21:31.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.421+0000 7f345aae3640 1 -- 192.168.123.103:0/1719375176 shutdown_connections 2026-03-09T16:21:31.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.421+0000 7f345aae3640 1 --2- 192.168.123.103:0/1719375176 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f34240779b0 0x7f3424079e70 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:31.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.421+0000 7f345aae3640 1 --2- 192.168.123.103:0/1719375176 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34541089d0 0x7f34541a0bc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:31.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.421+0000 7f345aae3640 1 --2- 192.168.123.103:0/1719375176 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34541029d0 0x7f34541a0680 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:31.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.421+0000 7f345aae3640 1 -- 192.168.123.103:0/1719375176 >> 192.168.123.103:0/1719375176 conn(0x7f34540fe710 msgr2=0x7f3454106550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:31.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.421+0000 7f345aae3640 1 -- 192.168.123.103:0/1719375176 shutdown_connections 2026-03-09T16:21:31.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.421+0000 7f345aae3640 1 -- 192.168.123.103:0/1719375176 wait complete. 2026-03-09T16:21:31.490 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 12 2026-03-09T16:21:31.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:31 vm05.local ceph-mon[108543]: pgmap v175: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:31.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:31 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/3892682471' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T16:21:31.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:31 vm05.local ceph-mon[108543]: from='client.? 
192.168.123.103:0/413419614' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 10, "format": "json"}]: dispatch 2026-03-09T16:21:31.628 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:31.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.870+0000 7f902b907640 1 -- 192.168.123.103:0/2426914851 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90241089d0 msgr2=0x7f9024108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:31.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.870+0000 7f902b907640 1 --2- 192.168.123.103:0/2426914851 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90241089d0 0x7f9024108db0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f900c0099b0 tx=0x7f900c02f220 comp rx=0 tx=0).stop 2026-03-09T16:21:31.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.871+0000 7f902b907640 1 -- 192.168.123.103:0/2426914851 shutdown_connections 2026-03-09T16:21:31.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.871+0000 7f902b907640 1 --2- 192.168.123.103:0/2426914851 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f90241029d0 0x7f9024102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:31.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.871+0000 7f902b907640 1 --2- 192.168.123.103:0/2426914851 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90241089d0 0x7f9024108db0 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:31.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.871+0000 7f902b907640 1 -- 192.168.123.103:0/2426914851 >> 192.168.123.103:0/2426914851 conn(0x7f90240fe710 msgr2=0x7f9024100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:31.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.871+0000 7f902b907640 1 -- 192.168.123.103:0/2426914851 shutdown_connections 2026-03-09T16:21:31.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.871+0000 7f902b907640 1 -- 192.168.123.103:0/2426914851 wait complete. 
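
The stretch of log above and below follows one pattern: the task runs a one-shot "sudo cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json <epoch>" on vm03 for successive epochs (the DEBUG lines ending in 12, 13, 14, 15), and each invocation is bracketed by the client's own connect / mark_down / shutdown_connections lines, which look like the CLI process's level-1 messenger debug output rather than errors. The epoch 11 dump above shows rank 0 held by cephfs.vm03.kygyjl (gid 14476) in up:rejoin; the epoch 12 dump below shows the same gid in up:active. Below is a minimal sketch, outside teuthology, of issuing one such dump and pulling out the rank-0 state; it only touches JSON fields that appear verbatim in the dumps here, the image and fsid are the ones printed in the log, the bare "cephadm" executable name is an assumption (this run uses /home/ubuntu/cephtest/cephadm), and dump_fsmap / rank0_state are hypothetical helper names, not teuthology or Ceph API.

#!/usr/bin/env python3
# Sketch only: re-issue the same "ceph fs dump" the task runs above and
# summarize the rank-0 MDS state. FSID and IMAGE are copied from the log.
import json
import subprocess

FSID = "2b05df78-1bd2-11f1-83c0-c950214d6edc"
IMAGE = "quay.ceph.io/ceph-ci/ceph:reef"

def dump_fsmap():
    """Run 'ceph fs dump --format=json' via cephadm shell and parse the JSON."""
    out = subprocess.check_output([
        "sudo", "cephadm", "--image", IMAGE, "shell", "--fsid", FSID,
        "--", "ceph", "fs", "dump", "--format=json",
    ])
    return json.loads(out)

def rank0_state(fsmap):
    """Return the state of the daemon holding rank 0, or None if nothing does."""
    mdsmap = fsmap["filesystems"][0]["mdsmap"]
    for info in mdsmap["info"].values():
        if info["rank"] == 0:
            return info["state"]
    return None

if __name__ == "__main__":
    fsmap = dump_fsmap()
    print("fsmap epoch", fsmap["epoch"], "rank 0:", rank0_state(fsmap))
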
2026-03-09T16:21:31.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.872+0000 7f902b907640 1 Processor -- start 2026-03-09T16:21:31.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.872+0000 7f902b907640 1 -- start start 2026-03-09T16:21:31.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.872+0000 7f902b907640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f90241029d0 0x7f90241a06a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:31.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.872+0000 7f902b907640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90241089d0 0x7f90241a0be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:31.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.872+0000 7f902b907640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f902419a790 con 0x7f90241089d0 2026-03-09T16:21:31.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.872+0000 7f902b907640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f902419a900 con 0x7f90241029d0 2026-03-09T16:21:31.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.872+0000 7f9028e7b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90241089d0 0x7f90241a0be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:31.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.872+0000 7f9028e7b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90241089d0 0x7f90241a0be0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56588/0 (socket says 192.168.123.103:56588) 2026-03-09T16:21:31.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.872+0000 7f9028e7b640 1 -- 192.168.123.103:0/850691557 learned_addr learned my addr 192.168.123.103:0/850691557 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:31.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.873+0000 7f902967c640 1 --2- 192.168.123.103:0/850691557 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f90241029d0 0x7f90241a06a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:31.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.873+0000 7f9028e7b640 1 -- 192.168.123.103:0/850691557 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f90241029d0 msgr2=0x7f90241a06a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:31.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.873+0000 7f9028e7b640 1 --2- 192.168.123.103:0/850691557 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f90241029d0 0x7f90241a06a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:31.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.873+0000 7f9028e7b640 1 -- 192.168.123.103:0/850691557 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f900c009660 con 
0x7f90241089d0 2026-03-09T16:21:31.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.873+0000 7f9028e7b640 1 --2- 192.168.123.103:0/850691557 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90241089d0 0x7f90241a0be0 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f901400da40 tx=0x7f901400df10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:31.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.874+0000 7f901a7fc640 1 -- 192.168.123.103:0/850691557 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9014004280 con 0x7f90241089d0 2026-03-09T16:21:31.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.874+0000 7f901a7fc640 1 -- 192.168.123.103:0/850691557 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f901400be10 con 0x7f90241089d0 2026-03-09T16:21:31.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.874+0000 7f902b907640 1 -- 192.168.123.103:0/850691557 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f902419abe0 con 0x7f90241089d0 2026-03-09T16:21:31.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.874+0000 7f902b907640 1 -- 192.168.123.103:0/850691557 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f902419b130 con 0x7f90241089d0 2026-03-09T16:21:31.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.874+0000 7f901a7fc640 1 -- 192.168.123.103:0/850691557 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9014005230 con 0x7f90241089d0 2026-03-09T16:21:31.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.876+0000 7f902b907640 1 -- 192.168.123.103:0/850691557 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8fec005350 con 0x7f90241089d0 2026-03-09T16:21:31.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.876+0000 7f901a7fc640 1 -- 192.168.123.103:0/850691557 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9014004930 con 0x7f90241089d0 2026-03-09T16:21:31.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.877+0000 7f901a7fc640 1 --2- 192.168.123.103:0/850691557 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f90000779b0 0x7f9000079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:31.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.877+0000 7f901a7fc640 1 -- 192.168.123.103:0/850691557 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f90140996b0 con 0x7f90241089d0 2026-03-09T16:21:31.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.877+0000 7f902967c640 1 --2- 192.168.123.103:0/850691557 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f90000779b0 0x7f9000079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:31.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.878+0000 7f902967c640 1 --2- 192.168.123.103:0/850691557 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] 
conn(0x7f90000779b0 0x7f9000079e70 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f900c005ec0 tx=0x7f900c03a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:31.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.879+0000 7f901a7fc640 1 -- 192.168.123.103:0/850691557 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9014061d60 con 0x7f90241089d0 2026-03-09T16:21:31.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:31.998+0000 7f902b907640 1 -- 192.168.123.103:0/850691557 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 12, "format": "json"} v 0) v1 -- 0x7f8fec0051c0 con 0x7f90241089d0 2026-03-09T16:21:32.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.000+0000 7f901a7fc640 1 -- 192.168.123.103:0/850691557 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 12, "format": "json"}]=0 dumped fsmap epoch 12 v32) v1 ==== 107+0+4914 (secure 0 0 0) 0x7f90140614b0 con 0x7f90241089d0 2026-03-09T16:21:32.001 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:32.001 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":12,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14492,"name":"cephfs.vm03.kntrco","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.103:6829/3419491835","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3419491835},{"type":"v1","addr":"192.168.123.103:6829","nonce":3419491835}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24287,"name":"cephfs.vm05.sqhria","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11},{"gid":24291,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/1621230713","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":1621230713},{"type":"v1","addr":"192.168.123.105:6825","nonce":1621230713}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":12,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:12:21.661284+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":41,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14476},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14476":{"gid":14476,"name":"cephfs.vm03.kygyjl","rank":0,"incarnation":9,"state":"up:active","state_seq":5,"addr":"192.168.123.103:6827/1622851291","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":1622851291},{"type":"v1","addr":"192.168.123.103:6827","nonce":1622851291}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:32.001 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 12 2026-03-09T16:21:32.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.002+0000 7f902b907640 1 -- 192.168.123.103:0/850691557 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f90000779b0 msgr2=0x7f9000079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:32.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.002+0000 7f902b907640 1 --2- 192.168.123.103:0/850691557 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f90000779b0 0x7f9000079e70 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto 
rx=0x7f900c005ec0 tx=0x7f900c03a040 comp rx=0 tx=0).stop 2026-03-09T16:21:32.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.002+0000 7f902b907640 1 -- 192.168.123.103:0/850691557 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90241089d0 msgr2=0x7f90241a0be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:32.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.002+0000 7f902b907640 1 --2- 192.168.123.103:0/850691557 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90241089d0 0x7f90241a0be0 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f901400da40 tx=0x7f901400df10 comp rx=0 tx=0).stop 2026-03-09T16:21:32.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.002+0000 7f902b907640 1 -- 192.168.123.103:0/850691557 shutdown_connections 2026-03-09T16:21:32.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.002+0000 7f902b907640 1 --2- 192.168.123.103:0/850691557 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f90000779b0 0x7f9000079e70 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:32.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.002+0000 7f902b907640 1 --2- 192.168.123.103:0/850691557 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90241089d0 0x7f90241a0be0 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:32.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.002+0000 7f902b907640 1 --2- 192.168.123.103:0/850691557 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f90241029d0 0x7f90241a06a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:32.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.002+0000 7f902b907640 1 -- 192.168.123.103:0/850691557 >> 192.168.123.103:0/850691557 conn(0x7f90240fe710 msgr2=0x7f90241001d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:32.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.003+0000 7f902b907640 1 -- 192.168.123.103:0/850691557 shutdown_connections 2026-03-09T16:21:32.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.003+0000 7f902b907640 1 -- 192.168.123.103:0/850691557 wait complete. 2026-03-09T16:21:32.066 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 13 2026-03-09T16:21:32.211 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:32.251 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:32 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/1719375176' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 11, "format": "json"}]: dispatch 2026-03-09T16:21:32.251 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:32 vm03.local ceph-mon[133973]: pgmap v176: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:32.251 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:32 vm03.local ceph-mon[133973]: from='client.? 
192.168.123.103:0/850691557' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-09T16:21:32.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.442+0000 7fe59a327640 1 -- 192.168.123.103:0/3420583096 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5941089d0 msgr2=0x7fe594108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:32.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.442+0000 7fe59a327640 1 --2- 192.168.123.103:0/3420583096 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5941089d0 0x7fe594108db0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fe578009a00 tx=0x7fe57802f280 comp rx=0 tx=0).stop 2026-03-09T16:21:32.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.443+0000 7fe59a327640 1 -- 192.168.123.103:0/3420583096 shutdown_connections 2026-03-09T16:21:32.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.443+0000 7fe59a327640 1 --2- 192.168.123.103:0/3420583096 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe5941029d0 0x7fe594102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:32.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.443+0000 7fe59a327640 1 --2- 192.168.123.103:0/3420583096 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5941089d0 0x7fe594108db0 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:32.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.443+0000 7fe59a327640 1 -- 192.168.123.103:0/3420583096 >> 192.168.123.103:0/3420583096 conn(0x7fe5940fe710 msgr2=0x7fe594100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:32.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.443+0000 7fe59a327640 1 -- 192.168.123.103:0/3420583096 shutdown_connections 2026-03-09T16:21:32.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.443+0000 7fe59a327640 1 -- 192.168.123.103:0/3420583096 wait complete. 
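
The journalctl lines from both mons show the same request arriving on the monitor side: cmd=[{"prefix": "fs dump", "epoch": N, "format": "json"}] dispatched for entity client.admin, which matches the mon_command payload visible in the stderr traffic above. As a sketch, the identical payload can be sent through the python-rados binding instead of the ceph CLI; this assumes python3-rados is installed and that a ceph.conf plus admin keyring are reachable from wherever it runs (the "epoch" argument is optional and omitted here).

# Sketch only (assumes python3-rados and a reachable admin keyring): send the
# same {"prefix": "fs dump", ...} payload the monitors log above via librados.
import json
import rados

cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")
cluster.connect()
try:
    cmd = json.dumps({"prefix": "fs dump", "format": "json"})
    ret, outbuf, outs = cluster.mon_command(cmd, b"")
    if ret != 0:
        raise RuntimeError(f"fs dump failed: ret={ret} {outs}")
    fsmap = json.loads(outbuf)
    print("fsmap epoch", fsmap["epoch"])
finally:
    cluster.shutdown()
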
2026-03-09T16:21:32.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.444+0000 7fe59a327640 1 Processor -- start 2026-03-09T16:21:32.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.444+0000 7fe59a327640 1 -- start start 2026-03-09T16:21:32.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.444+0000 7fe59a327640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe5941029d0 0x7fe5941a0600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:32.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.444+0000 7fe59a327640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5941089d0 0x7fe5941a0b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:32.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.444+0000 7fe59a327640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe5941a11d0 con 0x7fe5941089d0 2026-03-09T16:21:32.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.444+0000 7fe59a327640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe59419a710 con 0x7fe5941029d0 2026-03-09T16:21:32.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.445+0000 7fe593fff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe5941029d0 0x7fe5941a0600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:32.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.445+0000 7fe5937fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5941089d0 0x7fe5941a0b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:32.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.445+0000 7fe5937fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5941089d0 0x7fe5941a0b40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56604/0 (socket says 192.168.123.103:56604) 2026-03-09T16:21:32.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.445+0000 7fe5937fe640 1 -- 192.168.123.103:0/2702133835 learned_addr learned my addr 192.168.123.103:0/2702133835 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:32.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.445+0000 7fe5937fe640 1 -- 192.168.123.103:0/2702133835 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe5941029d0 msgr2=0x7fe5941a0600 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:32.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.445+0000 7fe5937fe640 1 --2- 192.168.123.103:0/2702133835 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe5941029d0 0x7fe5941a0600 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:32.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.445+0000 7fe5937fe640 1 -- 192.168.123.103:0/2702133835 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe578009660 con 0x7fe5941089d0 
2026-03-09T16:21:32.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.445+0000 7fe593fff640 1 --2- 192.168.123.103:0/2702133835 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe5941029d0 0x7fe5941a0600 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T16:21:32.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.445+0000 7fe5937fe640 1 --2- 192.168.123.103:0/2702133835 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5941089d0 0x7fe5941a0b40 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fe58000e9b0 tx=0x7fe58000ee80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:32.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.445+0000 7fe5917fa640 1 -- 192.168.123.103:0/2702133835 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe58000cd90 con 0x7fe5941089d0 2026-03-09T16:21:32.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.446+0000 7fe59a327640 1 -- 192.168.123.103:0/2702133835 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe59419a9f0 con 0x7fe5941089d0 2026-03-09T16:21:32.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.446+0000 7fe59a327640 1 -- 192.168.123.103:0/2702133835 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe59419af40 con 0x7fe5941089d0 2026-03-09T16:21:32.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.446+0000 7fe5917fa640 1 -- 192.168.123.103:0/2702133835 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe580004590 con 0x7fe5941089d0 2026-03-09T16:21:32.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.446+0000 7fe5917fa640 1 -- 192.168.123.103:0/2702133835 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe580010640 con 0x7fe5941089d0 2026-03-09T16:21:32.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.446+0000 7fe59a327640 1 -- 192.168.123.103:0/2702133835 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe594104110 con 0x7fe5941089d0 2026-03-09T16:21:32.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.449+0000 7fe5917fa640 1 -- 192.168.123.103:0/2702133835 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe5800040d0 con 0x7fe5941089d0 2026-03-09T16:21:32.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.450+0000 7fe5917fa640 1 --2- 192.168.123.103:0/2702133835 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe5640779b0 0x7fe564079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:32.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.450+0000 7fe593fff640 1 --2- 192.168.123.103:0/2702133835 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe5640779b0 0x7fe564079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:32.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.451+0000 7fe593fff640 1 --2- 192.168.123.103:0/2702133835 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe5640779b0 0x7fe564079e70 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fe578004580 tx=0x7fe578004060 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:32.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.451+0000 7fe5917fa640 1 -- 192.168.123.103:0/2702133835 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fe58001d030 con 0x7fe5941089d0 2026-03-09T16:21:32.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.451+0000 7fe5917fa640 1 -- 192.168.123.103:0/2702133835 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe58009a510 con 0x7fe5941089d0 2026-03-09T16:21:32.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:32 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/1719375176' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 11, "format": "json"}]: dispatch 2026-03-09T16:21:32.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:32 vm05.local ceph-mon[108543]: pgmap v176: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:32.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:32 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/850691557' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-09T16:21:32.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.563+0000 7fe59a327640 1 -- 192.168.123.103:0/2702133835 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 13, "format": "json"} v 0) v1 -- 0x7fe594102e30 con 0x7fe5941089d0 2026-03-09T16:21:32.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.566+0000 7fe5917fa640 1 -- 192.168.123.103:0/2702133835 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 13, "format": "json"}]=0 dumped fsmap epoch 13 v32) v1 ==== 107+0+4121 (secure 0 0 0) 0x7fe580062a50 con 0x7fe5941089d0 2026-03-09T16:21:32.567 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:32.567 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":13,"btime":"2026-03-09T16:19:18:202839+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14492,"name":"cephfs.vm03.kntrco","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.103:6829/3419491835","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3419491835},{"type":"v1","addr":"192.168.123.103:6829","nonce":3419491835}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in 
omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24287,"name":"cephfs.vm05.sqhria","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24291,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/1621230713","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":1621230713},{"type":"v1","addr":"192.168.123.105:6825","nonce":1621230713}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":13,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:18.201614+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":75,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:32.567 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 13 2026-03-09T16:21:32.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.568+0000 7fe59a327640 1 -- 192.168.123.103:0/2702133835 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe5640779b0 msgr2=0x7fe564079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:32.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.568+0000 7fe59a327640 1 --2- 192.168.123.103:0/2702133835 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe5640779b0 0x7fe564079e70 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fe578004580 tx=0x7fe578004060 comp rx=0 tx=0).stop 2026-03-09T16:21:32.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.569+0000 7fe59a327640 1 -- 192.168.123.103:0/2702133835 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5941089d0 msgr2=0x7fe5941a0b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:32.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.569+0000 7fe59a327640 1 --2- 192.168.123.103:0/2702133835 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5941089d0 0x7fe5941a0b40 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fe58000e9b0 tx=0x7fe58000ee80 comp rx=0 tx=0).stop 2026-03-09T16:21:32.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.569+0000 7fe59a327640 1 -- 192.168.123.103:0/2702133835 shutdown_connections 2026-03-09T16:21:32.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.569+0000 7fe59a327640 1 --2- 192.168.123.103:0/2702133835 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe5640779b0 0x7fe564079e70 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:32.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.569+0000 7fe59a327640 1 --2- 192.168.123.103:0/2702133835 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5941089d0 0x7fe5941a0b40 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:32.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.569+0000 7fe59a327640 1 --2- 192.168.123.103:0/2702133835 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe5941029d0 0x7fe5941a0600 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:32.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.569+0000 7fe59a327640 1 -- 192.168.123.103:0/2702133835 >> 192.168.123.103:0/2702133835 conn(0x7fe5940fe710 msgr2=0x7fe5940feaf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:32.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.569+0000 7fe59a327640 1 -- 192.168.123.103:0/2702133835 shutdown_connections 2026-03-09T16:21:32.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:32.569+0000 7fe59a327640 1 -- 192.168.123.103:0/2702133835 wait complete. 
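
The epoch 13 dump just above is the notable one: its mdsmap has "up":{}, "failed":[0] and an empty "info" map, so no daemon holds rank 0 and the rank is marked failed, while the three standbys are still listed. Presumably that is the intermediate state these repeated dumps are tracking before the next map shows a former standby taking the rank over. A small sketch of classifying that condition from the parsed JSON follows; the two sample dicts carry only the fields the check needs and mirror the values printed in the epoch 12 and epoch 13 dumps, they are not full mdsmaps.

# Sketch only: classify the rank-0 situation from a parsed mdsmap, the way one
# reads the epoch 12 and epoch 13 dumps above.
def classify_rank0(mdsmap):
    if 0 in mdsmap.get("failed", []) and not mdsmap.get("up"):
        return "failed (no daemon holds rank 0)"
    for info in mdsmap.get("info", {}).values():
        if info["rank"] == 0:
            return info["state"]
    return "rank 0 not present"

# Values below are taken from the dumps printed in this log.
epoch12 = {"up": {"mds_0": 14476}, "failed": [],
           "info": {"gid_14476": {"rank": 0, "state": "up:active"}}}
epoch13 = {"up": {}, "failed": [0], "info": {}}

print(classify_rank0(epoch12))  # up:active
print(classify_rank0(epoch13))  # failed (no daemon holds rank 0)
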
2026-03-09T16:21:32.631 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 14 2026-03-09T16:21:32.778 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:33.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.011+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2920969283 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d44100780 msgr2=0x7f9d44100be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:33.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.011+0000 7f9d49f2b640 1 --2- 192.168.123.103:0/2920969283 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d44100780 0x7f9d44100be0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f9d280099b0 tx=0x7f9d2802f220 comp rx=0 tx=0).stop 2026-03-09T16:21:33.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.012+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2920969283 shutdown_connections 2026-03-09T16:21:33.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.012+0000 7f9d49f2b640 1 --2- 192.168.123.103:0/2920969283 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d44100780 0x7f9d44100be0 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:33.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.012+0000 7f9d49f2b640 1 --2- 192.168.123.103:0/2920969283 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d44106780 0x7f9d44106b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:33.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.012+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2920969283 >> 192.168.123.103:0/2920969283 conn(0x7f9d440fc460 msgr2=0x7f9d440fe880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:33.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.012+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2920969283 shutdown_connections 2026-03-09T16:21:33.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.012+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2920969283 wait complete. 
2026-03-09T16:21:33.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.013+0000 7f9d49f2b640 1 Processor -- start 2026-03-09T16:21:33.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.013+0000 7f9d49f2b640 1 -- start start 2026-03-09T16:21:33.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.014+0000 7f9d49f2b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d44100780 0x7f9d44198770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:33.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.014+0000 7f9d49f2b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d44106780 0x7f9d44198cb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:33.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.014+0000 7f9d49f2b640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d44199390 con 0x7f9d44100780 2026-03-09T16:21:33.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.014+0000 7f9d49f2b640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d4419bfb0 con 0x7f9d44106780 2026-03-09T16:21:33.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.014+0000 7f9d42ffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d44106780 0x7f9d44198cb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:33.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.014+0000 7f9d42ffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d44106780 0x7f9d44198cb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:43404/0 (socket says 192.168.123.103:43404) 2026-03-09T16:21:33.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.014+0000 7f9d42ffd640 1 -- 192.168.123.103:0/2593251443 learned_addr learned my addr 192.168.123.103:0/2593251443 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:33.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.014+0000 7f9d437fe640 1 --2- 192.168.123.103:0/2593251443 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d44100780 0x7f9d44198770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:33.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.014+0000 7f9d42ffd640 1 -- 192.168.123.103:0/2593251443 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d44100780 msgr2=0x7f9d44198770 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:33.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.014+0000 7f9d42ffd640 1 --2- 192.168.123.103:0/2593251443 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d44100780 0x7f9d44198770 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:33.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.014+0000 7f9d42ffd640 1 -- 192.168.123.103:0/2593251443 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f9d28009660 con 0x7f9d44106780 2026-03-09T16:21:33.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.014+0000 7f9d437fe640 1 --2- 192.168.123.103:0/2593251443 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d44100780 0x7f9d44198770 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:21:33.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.014+0000 7f9d42ffd640 1 --2- 192.168.123.103:0/2593251443 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d44106780 0x7f9d44198cb0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f9d2802f730 tx=0x7f9d280043d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:33.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.015+0000 7f9d40ff9640 1 -- 192.168.123.103:0/2593251443 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d2803d070 con 0x7f9d44106780 2026-03-09T16:21:33.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.015+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2593251443 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9d4419c230 con 0x7f9d44106780 2026-03-09T16:21:33.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.015+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2593251443 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9d4419c720 con 0x7f9d44106780 2026-03-09T16:21:33.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.015+0000 7f9d40ff9640 1 -- 192.168.123.103:0/2593251443 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9d2802fc90 con 0x7f9d44106780 2026-03-09T16:21:33.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.015+0000 7f9d40ff9640 1 -- 192.168.123.103:0/2593251443 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d28041770 con 0x7f9d44106780 2026-03-09T16:21:33.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.015+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2593251443 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9d44101ec0 con 0x7f9d44106780 2026-03-09T16:21:33.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.016+0000 7f9d40ff9640 1 -- 192.168.123.103:0/2593251443 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9d28038730 con 0x7f9d44106780 2026-03-09T16:21:33.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.017+0000 7f9d40ff9640 1 --2- 192.168.123.103:0/2593251443 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9d1c0778e0 0x7f9d1c079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:33.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.017+0000 7f9d40ff9640 1 -- 192.168.123.103:0/2593251443 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f9d280be5c0 con 0x7f9d44106780 2026-03-09T16:21:33.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.017+0000 7f9d437fe640 1 --2- 192.168.123.103:0/2593251443 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9d1c0778e0 
0x7f9d1c079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:33.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.017+0000 7f9d437fe640 1 --2- 192.168.123.103:0/2593251443 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9d1c0778e0 0x7f9d1c079da0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f9d30006fd0 tx=0x7f9d30008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:33.020 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.019+0000 7f9d40ff9640 1 -- 192.168.123.103:0/2593251443 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9d28086ca0 con 0x7f9d44106780 2026-03-09T16:21:33.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.130+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2593251443 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 14, "format": "json"} v 0) v1 -- 0x7f9d4410dab0 con 0x7f9d44106780 2026-03-09T16:21:33.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.132+0000 7f9d40ff9640 1 -- 192.168.123.103:0/2593251443 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 14, "format": "json"}]=0 dumped fsmap epoch 14 v32) v1 ==== 107+0+4132 (secure 0 0 0) 0x7f9d280863f0 con 0x7f9d44106780 2026-03-09T16:21:33.134 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:33.134 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":14,"btime":"2026-03-09T16:19:18:213414+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24287,"name":"cephfs.vm05.sqhria","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24291,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/1621230713","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":1621230713},{"type":"v1","addr":"192.168.123.105:6825","nonce":1621230713}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in 
separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":14,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:18.213355+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":75,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14492},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14492":{"gid":14492,"name":"cephfs.vm03.kntrco","rank":0,"incarnation":14,"state":"up:replay","state_seq":2,"addr":"192.168.123.103:6829/3419491835","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3419491835},{"type":"v1","addr":"192.168.123.103:6829","nonce":3419491835}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:33.134 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 14 2026-03-09T16:21:33.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.136+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2593251443 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9d1c0778e0 msgr2=0x7f9d1c079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:33.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.136+0000 7f9d49f2b640 1 --2- 192.168.123.103:0/2593251443 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9d1c0778e0 0x7f9d1c079da0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f9d30006fd0 tx=0x7f9d30008040 comp rx=0 tx=0).stop 2026-03-09T16:21:33.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.136+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2593251443 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d44106780 msgr2=0x7f9d44198cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:33.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.136+0000 7f9d49f2b640 1 --2- 192.168.123.103:0/2593251443 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f9d44106780 0x7f9d44198cb0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f9d2802f730 tx=0x7f9d280043d0 comp rx=0 tx=0).stop 2026-03-09T16:21:33.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.136+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2593251443 shutdown_connections 2026-03-09T16:21:33.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.136+0000 7f9d49f2b640 1 --2- 192.168.123.103:0/2593251443 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9d1c0778e0 0x7f9d1c079da0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:33.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.136+0000 7f9d49f2b640 1 --2- 192.168.123.103:0/2593251443 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d44106780 0x7f9d44198cb0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:33.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.136+0000 7f9d49f2b640 1 --2- 192.168.123.103:0/2593251443 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d44100780 0x7f9d44198770 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:33.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.136+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2593251443 >> 192.168.123.103:0/2593251443 conn(0x7f9d440fc460 msgr2=0x7f9d4410a740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:33.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.137+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2593251443 shutdown_connections 2026-03-09T16:21:33.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.137+0000 7f9d49f2b640 1 -- 192.168.123.103:0/2593251443 wait complete. 2026-03-09T16:21:33.202 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 15 2026-03-09T16:21:33.348 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:33.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:33 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/2702133835' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-09T16:21:33.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:33 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/2593251443' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-09T16:21:33.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:33 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/2702133835' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-09T16:21:33.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:33 vm05.local ceph-mon[108543]: from='client.? 
192.168.123.103:0/2593251443' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-09T16:21:33.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.580+0000 7f069ca56640 1 -- 192.168.123.103:0/3532338780 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0698075720 msgr2=0x7f0698075b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:33.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.580+0000 7f069ca56640 1 --2- 192.168.123.103:0/3532338780 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0698075720 0x7f0698075b00 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f0680009a00 tx=0x7f068002f280 comp rx=0 tx=0).stop 2026-03-09T16:21:33.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.581+0000 7f069ca56640 1 -- 192.168.123.103:0/3532338780 shutdown_connections 2026-03-09T16:21:33.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.581+0000 7f069ca56640 1 --2- 192.168.123.103:0/3532338780 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0698076040 0x7f0698111330 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:33.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.581+0000 7f069ca56640 1 --2- 192.168.123.103:0/3532338780 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0698075720 0x7f0698075b00 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:33.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.581+0000 7f069ca56640 1 -- 192.168.123.103:0/3532338780 >> 192.168.123.103:0/3532338780 conn(0x7f06980fe710 msgr2=0x7f0698100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:33.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.581+0000 7f069ca56640 1 -- 192.168.123.103:0/3532338780 shutdown_connections 2026-03-09T16:21:33.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.581+0000 7f069ca56640 1 -- 192.168.123.103:0/3532338780 wait complete. 
2026-03-09T16:21:33.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.581+0000 7f069ca56640 1 Processor -- start 2026-03-09T16:21:33.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.581+0000 7f069ca56640 1 -- start start 2026-03-09T16:21:33.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.582+0000 7f069ca56640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0698075720 0x7f0698108090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:33.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.582+0000 7f069ca56640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0698076040 0x7f06981085d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:33.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.582+0000 7f069ca56640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f06981031b0 con 0x7f0698075720 2026-03-09T16:21:33.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.582+0000 7f069ca56640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0698103320 con 0x7f0698076040 2026-03-09T16:21:33.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.582+0000 7f0695d74640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0698076040 0x7f06981085d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:33.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.582+0000 7f0695d74640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0698076040 0x7f06981085d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:43422/0 (socket says 192.168.123.103:43422) 2026-03-09T16:21:33.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.582+0000 7f0695d74640 1 -- 192.168.123.103:0/2021832892 learned_addr learned my addr 192.168.123.103:0/2021832892 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:33.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.582+0000 7f0695d74640 1 -- 192.168.123.103:0/2021832892 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0698075720 msgr2=0x7f0698108090 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:33.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.582+0000 7f0696575640 1 --2- 192.168.123.103:0/2021832892 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0698075720 0x7f0698108090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:33.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.582+0000 7f0695d74640 1 --2- 192.168.123.103:0/2021832892 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0698075720 0x7f0698108090 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:33.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.582+0000 7f0695d74640 1 -- 192.168.123.103:0/2021832892 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f0680009660 con 0x7f0698076040 2026-03-09T16:21:33.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.582+0000 7f0696575640 1 --2- 192.168.123.103:0/2021832892 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0698075720 0x7f0698108090 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:21:33.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.583+0000 7f0695d74640 1 --2- 192.168.123.103:0/2021832892 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0698076040 0x7f06981085d0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f068c00e9b0 tx=0x7f068c00ee80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:33.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.583+0000 7f067f7fe640 1 -- 192.168.123.103:0/2021832892 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f068c00cd90 con 0x7f0698076040 2026-03-09T16:21:33.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.583+0000 7f069ca56640 1 -- 192.168.123.103:0/2021832892 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0698103600 con 0x7f0698076040 2026-03-09T16:21:33.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.583+0000 7f069ca56640 1 -- 192.168.123.103:0/2021832892 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0698103b50 con 0x7f0698076040 2026-03-09T16:21:33.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.584+0000 7f067f7fe640 1 -- 192.168.123.103:0/2021832892 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f068c004590 con 0x7f0698076040 2026-03-09T16:21:33.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.584+0000 7f067f7fe640 1 -- 192.168.123.103:0/2021832892 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f068c010640 con 0x7f0698076040 2026-03-09T16:21:33.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.584+0000 7f069ca56640 1 -- 192.168.123.103:0/2021832892 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0664005350 con 0x7f0698076040 2026-03-09T16:21:33.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.585+0000 7f067f7fe640 1 -- 192.168.123.103:0/2021832892 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f068c0040d0 con 0x7f0698076040 2026-03-09T16:21:33.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.585+0000 7f067f7fe640 1 --2- 192.168.123.103:0/2021832892 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f0668077890 0x7f0668079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:33.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.585+0000 7f067f7fe640 1 -- 192.168.123.103:0/2021832892 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f068c014070 con 0x7f0698076040 2026-03-09T16:21:33.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.586+0000 7f0696575640 1 --2- 192.168.123.103:0/2021832892 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f0668077890 
0x7f0668079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:33.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.586+0000 7f0696575640 1 --2- 192.168.123.103:0/2021832892 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f0668077890 0x7f0668079d50 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f06800040c0 tx=0x7f06800023d0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:33.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.588+0000 7f067f7fe640 1 -- 192.168.123.103:0/2021832892 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f068c062ad0 con 0x7f0698076040 2026-03-09T16:21:33.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.703+0000 7f069ca56640 1 -- 192.168.123.103:0/2021832892 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 15, "format": "json"} v 0) v1 -- 0x7f06640051c0 con 0x7f0698076040 2026-03-09T16:21:33.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.704+0000 7f067f7fe640 1 -- 192.168.123.103:0/2021832892 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 15, "format": "json"}]=0 dumped fsmap epoch 15 v32) v1 ==== 107+0+4980 (secure 0 0 0) 0x7f068c062220 con 0x7f0698076040 2026-03-09T16:21:33.705 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:33.706 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":15,"btime":"2026-03-09T16:19:23:531000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24287,"name":"cephfs.vm05.sqhria","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24291,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/1621230713","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":1621230713},{"type":"v1","addr":"192.168.123.105:6825","nonce":1621230713}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in 
separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":14,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:18.213355+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":75,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14492},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14492":{"gid":14492,"name":"cephfs.vm03.kntrco","rank":0,"incarnation":14,"state":"up:replay","state_seq":2,"addr":"192.168.123.103:6829/3419491835","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3419491835},{"type":"v1","addr":"192.168.123.103:6829","nonce":3419491835}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:33.706 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 15 2026-03-09T16:21:33.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.707+0000 7f069ca56640 1 -- 192.168.123.103:0/2021832892 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f0668077890 msgr2=0x7f0668079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:33.708 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.707+0000 7f069ca56640 1 --2- 192.168.123.103:0/2021832892 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f0668077890 0x7f0668079d50 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f06800040c0 tx=0x7f06800023d0 comp rx=0 tx=0).stop 2026-03-09T16:21:33.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.707+0000 7f069ca56640 1 -- 192.168.123.103:0/2021832892 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0698076040 msgr2=0x7f06981085d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:33.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.707+0000 7f069ca56640 1 --2- 192.168.123.103:0/2021832892 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0698076040 0x7f06981085d0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f068c00e9b0 tx=0x7f068c00ee80 comp rx=0 tx=0).stop 2026-03-09T16:21:33.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.707+0000 7f069ca56640 1 -- 192.168.123.103:0/2021832892 shutdown_connections 2026-03-09T16:21:33.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.707+0000 7f069ca56640 1 --2- 192.168.123.103:0/2021832892 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f0668077890 0x7f0668079d50 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:33.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.707+0000 7f069ca56640 1 --2- 192.168.123.103:0/2021832892 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0698076040 0x7f06981085d0 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:33.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.707+0000 7f069ca56640 1 --2- 192.168.123.103:0/2021832892 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0698075720 0x7f0698108090 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:33.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.707+0000 7f069ca56640 1 -- 192.168.123.103:0/2021832892 >> 192.168.123.103:0/2021832892 conn(0x7f06980fe710 msgr2=0x7f06980ff590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:33.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.708+0000 7f069ca56640 1 -- 192.168.123.103:0/2021832892 shutdown_connections 2026-03-09T16:21:33.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:33.708+0000 7f069ca56640 1 -- 192.168.123.103:0/2021832892 wait complete. 
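[editor's note] The block above shows the harness repeatedly running `ceph fs dump --format=json <epoch>` through `cephadm shell` and reading back the fsmap: at epoch 14/15 rank 0 ("cephfs.vm03.kntrco") is still in "up:replay", so polling continues with the next epoch. The sketch below is a minimal, hypothetical reconstruction of such a wait loop, not the teuthology task's actual code; the function names are illustrative, and it assumes (as the timings in this log suggest) that passing an epoch number to `fs dump` returns that fsmap epoch once the monitor has it. The FSID and image are taken from the commands logged above.

    # Hypothetical sketch of the polling the log shows (not teuthology's real code).
    import json
    import subprocess
    import time

    FSID = "2b05df78-1bd2-11f1-83c0-c950214d6edc"   # from the cephadm commands above
    IMAGE = "quay.ceph.io/ceph-ci/ceph:reef"

    def fs_dump(epoch):
        """Run `ceph fs dump --format=json <epoch>` inside a cephadm shell and parse stdout."""
        out = subprocess.check_output([
            "sudo", "/home/ubuntu/cephtest/cephadm", "--image", IMAGE,
            "shell", "--fsid", FSID, "--",
            "ceph", "fs", "dump", "--format=json", str(epoch),
        ])
        return json.loads(out)   # stderr (messenger debug) is not captured here

    def wait_for_rank0_state(target="up:active", start_epoch=14, timeout=600):
        """Poll successive fsmap epochs until rank 0 of the first filesystem reaches `target`."""
        deadline = time.time() + timeout
        epoch = start_epoch
        while time.time() < deadline:
            fsmap = fs_dump(epoch)
            mdsmap = fsmap["filesystems"][0]["mdsmap"]
            states = {i["rank"]: i["state"] for i in mdsmap["info"].values()}
            if states.get(0) == target:
                return fsmap
            epoch += 1           # request the next epoch on the next iteration
        raise TimeoutError("rank 0 never reached %s" % target)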
2026-03-09T16:21:33.767 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 16 2026-03-09T16:21:33.912 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:34.150 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.149+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1773398838 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85f00ff760 msgr2=0x7f85f00ffb40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:34.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.149+0000 7f85f4eb1640 1 --2- 192.168.123.103:0/1773398838 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85f00ff760 0x7f85f00ffb40 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f85dc0098e0 tx=0x7f85dc02f1e0 comp rx=0 tx=0).stop 2026-03-09T16:21:34.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.150+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1773398838 shutdown_connections 2026-03-09T16:21:34.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.150+0000 7f85f4eb1640 1 --2- 192.168.123.103:0/1773398838 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f0100080 0x7f85f0104550 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:34.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.150+0000 7f85f4eb1640 1 --2- 192.168.123.103:0/1773398838 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85f00ff760 0x7f85f00ffb40 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:34.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.150+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1773398838 >> 192.168.123.103:0/1773398838 conn(0x7f85f00fb3d0 msgr2=0x7f85f00fd7f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:34.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.150+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1773398838 shutdown_connections 2026-03-09T16:21:34.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.150+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1773398838 wait complete. 
2026-03-09T16:21:34.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.152+0000 7f85f4eb1640 1 Processor -- start 2026-03-09T16:21:34.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.152+0000 7f85f4eb1640 1 -- start start 2026-03-09T16:21:34.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.152+0000 7f85f4eb1640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f00ff760 0x7f85f019ee50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:34.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.152+0000 7f85f4eb1640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85f0100080 0x7f85f019f390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:34.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.152+0000 7f85f4eb1640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85f019fa20 con 0x7f85f00ff760 2026-03-09T16:21:34.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.152+0000 7f85f4eb1640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85f01a3790 con 0x7f85f0100080 2026-03-09T16:21:34.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.152+0000 7f85edd74640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85f0100080 0x7f85f019f390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:34.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.152+0000 7f85edd74640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85f0100080 0x7f85f019f390 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:43446/0 (socket says 192.168.123.103:43446) 2026-03-09T16:21:34.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.152+0000 7f85edd74640 1 -- 192.168.123.103:0/1838411124 learned_addr learned my addr 192.168.123.103:0/1838411124 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:34.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.153+0000 7f85edd74640 1 -- 192.168.123.103:0/1838411124 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f00ff760 msgr2=0x7f85f019ee50 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:21:34.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.154+0000 7f85ee575640 1 --2- 192.168.123.103:0/1838411124 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f00ff760 0x7f85f019ee50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:34.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.154+0000 7f85edd74640 1 --2- 192.168.123.103:0/1838411124 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f00ff760 0x7f85f019ee50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:34.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.154+0000 7f85edd74640 1 -- 192.168.123.103:0/1838411124 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f85e4009660 con 
0x7f85f0100080 2026-03-09T16:21:34.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.154+0000 7f85ee575640 1 --2- 192.168.123.103:0/1838411124 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f00ff760 0x7f85f019ee50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:21:34.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.154+0000 7f85edd74640 1 --2- 192.168.123.103:0/1838411124 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85f0100080 0x7f85f019f390 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f85e4002760 tx=0x7f85e4002c30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:34.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.154+0000 7f85d37fe640 1 -- 192.168.123.103:0/1838411124 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f85e400ecf0 con 0x7f85f0100080 2026-03-09T16:21:34.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.154+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1838411124 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f85dc009590 con 0x7f85f0100080 2026-03-09T16:21:34.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.154+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1838411124 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f85f01a3d50 con 0x7f85f0100080 2026-03-09T16:21:34.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.155+0000 7f85d37fe640 1 -- 192.168.123.103:0/1838411124 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f85e4002e90 con 0x7f85f0100080 2026-03-09T16:21:34.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.155+0000 7f85d37fe640 1 -- 192.168.123.103:0/1838411124 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f85e400f6f0 con 0x7f85f0100080 2026-03-09T16:21:34.157 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.156+0000 7f85d37fe640 1 -- 192.168.123.103:0/1838411124 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f85e4016070 con 0x7f85f0100080 2026-03-09T16:21:34.157 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.157+0000 7f85d37fe640 1 --2- 192.168.123.103:0/1838411124 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f85c00778e0 0x7f85c0079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:34.158 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.157+0000 7f85ee575640 1 --2- 192.168.123.103:0/1838411124 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f85c00778e0 0x7f85c0079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:34.158 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.157+0000 7f85d37fe640 1 -- 192.168.123.103:0/1838411124 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f85e409b1a0 con 0x7f85f0100080 2026-03-09T16:21:34.158 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.157+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1838411124 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f85b4005350 con 0x7f85f0100080 2026-03-09T16:21:34.158 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.158+0000 7f85ee575640 1 --2- 192.168.123.103:0/1838411124 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f85c00778e0 0x7f85c0079da0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f85dc002410 tx=0x7f85dc005c20 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:34.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.160+0000 7f85d37fe640 1 -- 192.168.123.103:0/1838411124 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f85e40637d0 con 0x7f85f0100080 2026-03-09T16:21:34.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.271+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1838411124 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 16, "format": "json"} v 0) v1 -- 0x7f85b40051c0 con 0x7f85f0100080 2026-03-09T16:21:34.272 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:34 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/2021832892' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-09T16:21:34.272 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:34 vm03.local ceph-mon[133973]: pgmap v177: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:34.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.272+0000 7f85d37fe640 1 -- 192.168.123.103:0/1838411124 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 16, "format": "json"}]=0 dumped fsmap epoch 16 v32) v1 ==== 107+0+4985 (secure 0 0 0) 0x7f85e4062f20 con 0x7f85f0100080 2026-03-09T16:21:34.275 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:34.276 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":16,"btime":"2026-03-09T16:19:25:560057+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24287,"name":"cephfs.vm05.sqhria","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11},{"gid":24291,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/1621230713","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":1621230713},{"type":"v1","addr":"192.168.123.105:6825","nonce":1621230713}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":16,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:24.865138+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":75,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14492},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14492":{"gid":14492,"name":"cephfs.vm03.kntrco","rank":0,"incarnation":14,"state":"up:reconnect","state_seq":109,"addr":"192.168.123.103:6829/3419491835","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3419491835},{"type":"v1","addr":"192.168.123.103:6829","nonce":3419491835}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:34.276 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 16 2026-03-09T16:21:34.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.277+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1838411124 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f85c00778e0 msgr2=0x7f85c0079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:34.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.277+0000 7f85f4eb1640 1 --2- 192.168.123.103:0/1838411124 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f85c00778e0 0x7f85c0079da0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f85dc002410 tx=0x7f85dc005c20 comp rx=0 tx=0).stop 2026-03-09T16:21:34.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.277+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1838411124 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85f0100080 msgr2=0x7f85f019f390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:34.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.277+0000 7f85f4eb1640 1 --2- 192.168.123.103:0/1838411124 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85f0100080 0x7f85f019f390 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f85e4002760 tx=0x7f85e4002c30 comp rx=0 tx=0).stop 2026-03-09T16:21:34.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.277+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1838411124 shutdown_connections 2026-03-09T16:21:34.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.277+0000 7f85f4eb1640 1 --2- 192.168.123.103:0/1838411124 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f85c00778e0 0x7f85c0079da0 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:34.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.277+0000 7f85f4eb1640 1 --2- 192.168.123.103:0/1838411124 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85f0100080 0x7f85f019f390 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:34.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.277+0000 7f85f4eb1640 1 --2- 192.168.123.103:0/1838411124 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f00ff760 0x7f85f019ee50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:34.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.277+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1838411124 >> 192.168.123.103:0/1838411124 conn(0x7f85f00fb3d0 msgr2=0x7f85f00fcd00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:34.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.278+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1838411124 shutdown_connections 2026-03-09T16:21:34.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.278+0000 7f85f4eb1640 1 -- 192.168.123.103:0/1838411124 wait complete. 
2026-03-09T16:21:34.337 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 17 2026-03-09T16:21:34.475 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:34.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:34 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/2021832892' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-09T16:21:34.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:34 vm05.local ceph-mon[108543]: pgmap v177: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:34.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.706+0000 7f856453a640 1 -- 192.168.123.103:0/4190335045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f855c073a50 msgr2=0x7f855c073e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:34.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.706+0000 7f856453a640 1 --2- 192.168.123.103:0/4190335045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f855c073a50 0x7f855c073e90 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f854c009a30 tx=0x7f854c02f380 comp rx=0 tx=0).stop 2026-03-09T16:21:34.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.707+0000 7f856453a640 1 -- 192.168.123.103:0/4190335045 shutdown_connections 2026-03-09T16:21:34.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.707+0000 7f856453a640 1 --2- 192.168.123.103:0/4190335045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f855c073a50 0x7f855c073e90 secure :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f854c009a30 tx=0x7f854c02f380 comp rx=0 tx=0).stop 2026-03-09T16:21:34.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.707+0000 7f856453a640 1 --2- 192.168.123.103:0/4190335045 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855c104650 0x7f855c073510 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:34.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.707+0000 7f856453a640 1 -- 192.168.123.103:0/4190335045 >> 192.168.123.103:0/4190335045 conn(0x7f855c0fc460 msgr2=0x7f855c0fe880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:34.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.707+0000 7f856453a640 1 -- 192.168.123.103:0/4190335045 shutdown_connections 2026-03-09T16:21:34.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.707+0000 7f856453a640 1 -- 192.168.123.103:0/4190335045 wait complete. 
2026-03-09T16:21:34.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.708+0000 7f856453a640 1 Processor -- start 2026-03-09T16:21:34.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.708+0000 7f856453a640 1 -- start start 2026-03-09T16:21:34.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.708+0000 7f856453a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f855c104650 0x7f855c19f0f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:34.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.708+0000 7f856453a640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855c19f630 0x7f855c1a3a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:34.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.708+0000 7f856453a640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f855c19fc30 con 0x7f855c104650 2026-03-09T16:21:34.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.708+0000 7f856453a640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f855c19fda0 con 0x7f855c19f630 2026-03-09T16:21:34.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.708+0000 7f85622af640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f855c104650 0x7f855c19f0f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:34.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.708+0000 7f85622af640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f855c104650 0x7f855c19f0f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56678/0 (socket says 192.168.123.103:56678) 2026-03-09T16:21:34.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.708+0000 7f85622af640 1 -- 192.168.123.103:0/1058981744 learned_addr learned my addr 192.168.123.103:0/1058981744 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:34.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.708+0000 7f85622af640 1 -- 192.168.123.103:0/1058981744 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855c19f630 msgr2=0x7f855c1a3a30 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T16:21:34.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.708+0000 7f85622af640 1 --2- 192.168.123.103:0/1058981744 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855c19f630 0x7f855c1a3a30 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:34.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.708+0000 7f85622af640 1 -- 192.168.123.103:0/1058981744 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f854c009660 con 0x7f855c104650 2026-03-09T16:21:34.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.709+0000 7f85622af640 1 --2- 192.168.123.103:0/1058981744 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f855c104650 0x7f855c19f0f0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f854400d8d0 tx=0x7f854400dda0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:34.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.709+0000 7f85537fe640 1 -- 192.168.123.103:0/1058981744 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8544004490 con 0x7f855c104650 2026-03-09T16:21:34.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.709+0000 7f85537fe640 1 -- 192.168.123.103:0/1058981744 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f854400bd00 con 0x7f855c104650 2026-03-09T16:21:34.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.709+0000 7f85537fe640 1 -- 192.168.123.103:0/1058981744 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8544010460 con 0x7f855c104650 2026-03-09T16:21:34.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.709+0000 7f856453a640 1 -- 192.168.123.103:0/1058981744 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f855c1a4030 con 0x7f855c104650 2026-03-09T16:21:34.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.709+0000 7f856453a640 1 -- 192.168.123.103:0/1058981744 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f855c1a4580 con 0x7f855c104650 2026-03-09T16:21:34.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.710+0000 7f85537fe640 1 -- 192.168.123.103:0/1058981744 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f85440105c0 con 0x7f855c104650 2026-03-09T16:21:34.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.711+0000 7f85537fe640 1 --2- 192.168.123.103:0/1058981744 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f85380779b0 0x7f8538079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:34.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.711+0000 7f85537fe640 1 -- 192.168.123.103:0/1058981744 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f854409acd0 con 0x7f855c104650 2026-03-09T16:21:34.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.712+0000 7f8561aae640 1 --2- 192.168.123.103:0/1058981744 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f85380779b0 0x7f8538079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:34.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.712+0000 7f856453a640 1 -- 192.168.123.103:0/1058981744 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f855c110c90 con 0x7f855c104650 2026-03-09T16:21:34.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.715+0000 7f8561aae640 1 --2- 192.168.123.103:0/1058981744 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f85380779b0 0x7f8538079e70 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f854c008000 tx=0x7f854c0023d0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:34.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.715+0000 7f85537fe640 1 -- 192.168.123.103:0/1058981744 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8544063380 con 0x7f855c104650 2026-03-09T16:21:34.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.827+0000 7f856453a640 1 -- 192.168.123.103:0/1058981744 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 17, "format": "json"} v 0) v1 -- 0x7f855c1a4860 con 0x7f855c104650 2026-03-09T16:21:34.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.827+0000 7f85537fe640 1 -- 192.168.123.103:0/1058981744 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 17, "format": "json"}]=0 dumped fsmap epoch 17 v32) v1 ==== 107+0+4982 (secure 0 0 0) 0x7f8544014090 con 0x7f855c104650 2026-03-09T16:21:34.828 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:34.828 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":17,"btime":"2026-03-09T16:19:26:562529+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24287,"name":"cephfs.vm05.sqhria","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24291,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/1621230713","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":1621230713},{"type":"v1","addr":"192.168.123.105:6825","nonce":1621230713}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file 
layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":17,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:25.569786+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":75,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14492},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14492":{"gid":14492,"name":"cephfs.vm03.kntrco","rank":0,"incarnation":14,"state":"up:rejoin","state_seq":110,"addr":"192.168.123.103:6829/3419491835","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3419491835},{"type":"v1","addr":"192.168.123.103:6829","nonce":3419491835}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:34.828 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 17 2026-03-09T16:21:34.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.830+0000 7f856453a640 1 -- 192.168.123.103:0/1058981744 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f85380779b0 msgr2=0x7f8538079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:34.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.830+0000 7f856453a640 1 --2- 192.168.123.103:0/1058981744 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f85380779b0 0x7f8538079e70 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f854c008000 tx=0x7f854c0023d0 comp rx=0 tx=0).stop 2026-03-09T16:21:34.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.830+0000 7f856453a640 1 -- 192.168.123.103:0/1058981744 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f855c104650 msgr2=0x7f855c19f0f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:34.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.830+0000 
7f856453a640 1 --2- 192.168.123.103:0/1058981744 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f855c104650 0x7f855c19f0f0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f854400d8d0 tx=0x7f854400dda0 comp rx=0 tx=0).stop 2026-03-09T16:21:34.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.830+0000 7f856453a640 1 -- 192.168.123.103:0/1058981744 shutdown_connections 2026-03-09T16:21:34.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.830+0000 7f856453a640 1 --2- 192.168.123.103:0/1058981744 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f85380779b0 0x7f8538079e70 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:34.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.830+0000 7f856453a640 1 --2- 192.168.123.103:0/1058981744 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855c19f630 0x7f855c1a3a30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:34.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.830+0000 7f856453a640 1 --2- 192.168.123.103:0/1058981744 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f855c104650 0x7f855c19f0f0 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:34.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.830+0000 7f856453a640 1 -- 192.168.123.103:0/1058981744 >> 192.168.123.103:0/1058981744 conn(0x7f855c0fc460 msgr2=0x7f855c0fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:34.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.831+0000 7f856453a640 1 -- 192.168.123.103:0/1058981744 shutdown_connections 2026-03-09T16:21:34.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:34.831+0000 7f856453a640 1 -- 192.168.123.103:0/1058981744 wait complete. 
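[editor's note] By epoch 17 rank 0 reports "up:rejoin", i.e. the usual replay -> reconnect -> rejoin progression toward "up:active" that this upgrade test is waiting out. The snippet below is only a rough ordering check one might layer on top of the polling loop; it is an assumption-laden simplification (states such as "up:resolve" and "up:clientreplay" can also appear and are not listed), not the test's verification logic.

    # Rough sketch: single-rank boot sequence as observed in this log; other MDS
    # states (e.g. up:resolve, up:clientreplay) are omitted for simplicity.
    MDS_BOOT_ORDER = ["up:replay", "up:reconnect", "up:rejoin", "up:active"]

    def progressed(prev_state, new_state):
        """True if the rank moved forward (or stayed put) in the simplified boot sequence."""
        return MDS_BOOT_ORDER.index(new_state) >= MDS_BOOT_ORDER.index(prev_state)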
2026-03-09T16:21:34.893 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 18 2026-03-09T16:21:35.036 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:35.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.276+0000 7f08e94f5640 1 -- 192.168.123.103:0/3354090917 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08e4072cf0 msgr2=0x7f08e410cd90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:35.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.276+0000 7f08e94f5640 1 --2- 192.168.123.103:0/3354090917 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08e4072cf0 0x7f08e410cd90 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f08d0009a00 tx=0x7f08d002f290 comp rx=0 tx=0).stop 2026-03-09T16:21:35.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.276+0000 7f08e94f5640 1 -- 192.168.123.103:0/3354090917 shutdown_connections 2026-03-09T16:21:35.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.276+0000 7f08e94f5640 1 --2- 192.168.123.103:0/3354090917 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08e4072cf0 0x7f08e410cd90 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:35.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.276+0000 7f08e94f5640 1 --2- 192.168.123.103:0/3354090917 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f08e4072340 0x7f08e4072720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:35.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.276+0000 7f08e94f5640 1 -- 192.168.123.103:0/3354090917 >> 192.168.123.103:0/3354090917 conn(0x7f08e406b7f0 msgr2=0x7f08e406bc00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:35.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.277+0000 7f08e94f5640 1 -- 192.168.123.103:0/3354090917 shutdown_connections 2026-03-09T16:21:35.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.277+0000 7f08e94f5640 1 -- 192.168.123.103:0/3354090917 wait complete. 
2026-03-09T16:21:35.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.278+0000 7f08e94f5640 1 Processor -- start 2026-03-09T16:21:35.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.278+0000 7f08e94f5640 1 -- start start 2026-03-09T16:21:35.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.278+0000 7f08e94f5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08e4072340 0x7f08e41ad680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:35.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.278+0000 7f08e94f5640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f08e4072cf0 0x7f08e41adbc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:35.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.278+0000 7f08e94f5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f08e41a7730 con 0x7f08e4072340 2026-03-09T16:21:35.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.278+0000 7f08e94f5640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f08e41a7870 con 0x7f08e4072cf0 2026-03-09T16:21:35.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.278+0000 7f08e2ffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08e4072340 0x7f08e41ad680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:35.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.278+0000 7f08e2ffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08e4072340 0x7f08e41ad680 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56688/0 (socket says 192.168.123.103:56688) 2026-03-09T16:21:35.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.278+0000 7f08e2ffd640 1 -- 192.168.123.103:0/3746077966 learned_addr learned my addr 192.168.123.103:0/3746077966 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:35.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.278+0000 7f08da5ff640 1 --2- 192.168.123.103:0/3746077966 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f08e4072cf0 0x7f08e41adbc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:35.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.278+0000 7f08e2ffd640 1 -- 192.168.123.103:0/3746077966 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f08e4072cf0 msgr2=0x7f08e41adbc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:35.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.278+0000 7f08e2ffd640 1 --2- 192.168.123.103:0/3746077966 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f08e4072cf0 0x7f08e41adbc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:35.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.278+0000 7f08e2ffd640 1 -- 192.168.123.103:0/3746077966 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f08d0009660 con 0x7f08e4072340 2026-03-09T16:21:35.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.279+0000 7f08da5ff640 1 --2- 192.168.123.103:0/3746077966 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f08e4072cf0 0x7f08e41adbc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:21:35.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.279+0000 7f08e2ffd640 1 --2- 192.168.123.103:0/3746077966 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08e4072340 0x7f08e41ad680 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f08cc00b4f0 tx=0x7f08cc00b9c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:35.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.279+0000 7f08e0ff9640 1 -- 192.168.123.103:0/3746077966 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f08cc004280 con 0x7f08e4072340 2026-03-09T16:21:35.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.279+0000 7f08e0ff9640 1 -- 192.168.123.103:0/3746077966 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f08cc0043e0 con 0x7f08e4072340 2026-03-09T16:21:35.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.279+0000 7f08e94f5640 1 -- 192.168.123.103:0/3746077966 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f08e41a7a70 con 0x7f08e4072340 2026-03-09T16:21:35.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.279+0000 7f08e0ff9640 1 -- 192.168.123.103:0/3746077966 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f08cc010af0 con 0x7f08e4072340 2026-03-09T16:21:35.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.280+0000 7f08e94f5640 1 -- 192.168.123.103:0/3746077966 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f08e41a7f70 con 0x7f08e4072340 2026-03-09T16:21:35.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.281+0000 7f08e0ff9640 1 -- 192.168.123.103:0/3746077966 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f08cc002780 con 0x7f08e4072340 2026-03-09T16:21:35.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.281+0000 7f08e94f5640 1 -- 192.168.123.103:0/3746077966 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f08e4108780 con 0x7f08e4072340 2026-03-09T16:21:35.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.281+0000 7f08e0ff9640 1 --2- 192.168.123.103:0/3746077966 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f08b80776d0 0x7f08b8079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:35.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.281+0000 7f08e0ff9640 1 -- 192.168.123.103:0/3746077966 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f08cc0990d0 con 0x7f08e4072340 2026-03-09T16:21:35.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.282+0000 7f08da5ff640 1 --2- 192.168.123.103:0/3746077966 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f08b80776d0 
0x7f08b8079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:35.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.282+0000 7f08da5ff640 1 --2- 192.168.123.103:0/3746077966 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f08b80776d0 0x7f08b8079b90 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f08d0038660 tx=0x7f08d0038470 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:35.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.284+0000 7f08e0ff9640 1 -- 192.168.123.103:0/3746077966 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f08cc0617c0 con 0x7f08e4072340 2026-03-09T16:21:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:35 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/1838411124' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-09T16:21:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:35 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/1058981744' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-09T16:21:35.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.396+0000 7f08e94f5640 1 -- 192.168.123.103:0/3746077966 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 18, "format": "json"} v 0) v1 -- 0x7f08e41a8cb0 con 0x7f08e4072340 2026-03-09T16:21:35.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.397+0000 7f08e0ff9640 1 -- 192.168.123.103:0/3746077966 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 18, "format": "json"}]=0 dumped fsmap epoch 18 v32) v1 ==== 107+0+4991 (secure 0 0 0) 0x7f08cc060f10 con 0x7f08e4072340 2026-03-09T16:21:35.397 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:35.398 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":18,"btime":"2026-03-09T16:19:27:564851+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24287,"name":"cephfs.vm05.sqhria","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11},{"gid":24291,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/1621230713","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":1621230713},{"type":"v1","addr":"192.168.123.105:6825","nonce":1621230713}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":18,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:27.564849+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":75,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14492},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14492":{"gid":14492,"name":"cephfs.vm03.kntrco","rank":0,"incarnation":14,"state":"up:active","state_seq":111,"addr":"192.168.123.103:6829/3419491835","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3419491835},{"type":"v1","addr":"192.168.123.103:6829","nonce":3419491835}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14492,"qdb_cluster":[14492]},"id":1}]} 2026-03-09T16:21:35.398 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 18 2026-03-09T16:21:35.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.399+0000 7f08e94f5640 1 -- 192.168.123.103:0/3746077966 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f08b80776d0 msgr2=0x7f08b8079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:35.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.399+0000 7f08e94f5640 1 --2- 192.168.123.103:0/3746077966 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f08b80776d0 0x7f08b8079b90 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f08d0038660 tx=0x7f08d0038470 comp rx=0 tx=0).stop 2026-03-09T16:21:35.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.399+0000 7f08e94f5640 1 -- 192.168.123.103:0/3746077966 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08e4072340 msgr2=0x7f08e41ad680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:35.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.399+0000 7f08e94f5640 1 --2- 192.168.123.103:0/3746077966 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08e4072340 0x7f08e41ad680 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f08cc00b4f0 tx=0x7f08cc00b9c0 comp rx=0 tx=0).stop 2026-03-09T16:21:35.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.400+0000 7f08e94f5640 1 -- 192.168.123.103:0/3746077966 shutdown_connections 2026-03-09T16:21:35.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.400+0000 7f08e94f5640 1 --2- 192.168.123.103:0/3746077966 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f08b80776d0 0x7f08b8079b90 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:35.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.400+0000 7f08e94f5640 1 --2- 192.168.123.103:0/3746077966 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f08e4072cf0 0x7f08e41adbc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:35.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.400+0000 7f08e94f5640 1 --2- 192.168.123.103:0/3746077966 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08e4072340 0x7f08e41ad680 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:35.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.400+0000 7f08e94f5640 1 -- 192.168.123.103:0/3746077966 >> 192.168.123.103:0/3746077966 conn(0x7f08e406b7f0 msgr2=0x7f08e410e010 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:35.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.400+0000 7f08e94f5640 1 -- 192.168.123.103:0/3746077966 shutdown_connections 2026-03-09T16:21:35.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.400+0000 7f08e94f5640 1 -- 192.168.123.103:0/3746077966 wait complete. 
2026-03-09T16:21:35.463 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 19 2026-03-09T16:21:35.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:35 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/1838411124' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-09T16:21:35.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:35 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/1058981744' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-09T16:21:35.607 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:35.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.832+0000 7fd888cac640 1 -- 192.168.123.103:0/4035298512 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd884072a40 msgr2=0x7fd88410ca90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:35.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.832+0000 7fd888cac640 1 --2- 192.168.123.103:0/4035298512 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd884072a40 0x7fd88410ca90 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fd870009a00 tx=0x7fd87002f270 comp rx=0 tx=0).stop 2026-03-09T16:21:35.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.832+0000 7fd888cac640 1 -- 192.168.123.103:0/4035298512 shutdown_connections 2026-03-09T16:21:35.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.832+0000 7fd888cac640 1 --2- 192.168.123.103:0/4035298512 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd884072a40 0x7fd88410ca90 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:35.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.832+0000 7fd888cac640 1 --2- 192.168.123.103:0/4035298512 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd884072120 0x7fd884072500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:35.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.832+0000 7fd888cac640 1 -- 192.168.123.103:0/4035298512 >> 192.168.123.103:0/4035298512 conn(0x7fd88406c7d0 msgr2=0x7fd88406cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:35.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.832+0000 7fd888cac640 1 -- 192.168.123.103:0/4035298512 shutdown_connections 2026-03-09T16:21:35.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.832+0000 7fd888cac640 1 -- 192.168.123.103:0/4035298512 wait complete. 
2026-03-09T16:21:35.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.833+0000 7fd888cac640 1 Processor -- start 2026-03-09T16:21:35.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.833+0000 7fd888cac640 1 -- start start 2026-03-09T16:21:35.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.833+0000 7fd888cac640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd884072120 0x7fd8841a7620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:35.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.834+0000 7fd8837fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd884072120 0x7fd8841a7620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:35.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.834+0000 7fd8837fe640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd884072120 0x7fd8841a7620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56706/0 (socket says 192.168.123.103:56706) 2026-03-09T16:21:35.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.834+0000 7fd888cac640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd884072a40 0x7fd8841a7b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:35.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.834+0000 7fd888cac640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8841a81f0 con 0x7fd884072120 2026-03-09T16:21:35.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.834+0000 7fd888cac640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8841abf60 con 0x7fd884072a40 2026-03-09T16:21:35.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.834+0000 7fd8837fe640 1 -- 192.168.123.103:0/2662054 learned_addr learned my addr 192.168.123.103:0/2662054 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:35.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.834+0000 7fd882ffd640 1 --2- 192.168.123.103:0/2662054 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd884072a40 0x7fd8841a7b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:35.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.834+0000 7fd8837fe640 1 -- 192.168.123.103:0/2662054 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd884072a40 msgr2=0x7fd8841a7b60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:35.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.834+0000 7fd8837fe640 1 --2- 192.168.123.103:0/2662054 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd884072a40 0x7fd8841a7b60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:35.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.834+0000 7fd8837fe640 1 -- 192.168.123.103:0/2662054 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd870009660 con 
0x7fd884072120 2026-03-09T16:21:35.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.834+0000 7fd8837fe640 1 --2- 192.168.123.103:0/2662054 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd884072120 0x7fd8841a7620 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7fd87400ece0 tx=0x7fd87400c6a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:35.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.835+0000 7fd880ff9640 1 -- 192.168.123.103:0/2662054 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd87400eea0 con 0x7fd884072120 2026-03-09T16:21:35.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.835+0000 7fd880ff9640 1 -- 192.168.123.103:0/2662054 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd874004590 con 0x7fd884072120 2026-03-09T16:21:35.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.835+0000 7fd888cac640 1 -- 192.168.123.103:0/2662054 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd8841ac240 con 0x7fd884072120 2026-03-09T16:21:35.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.835+0000 7fd888cac640 1 -- 192.168.123.103:0/2662054 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd8841ac790 con 0x7fd884072120 2026-03-09T16:21:35.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.836+0000 7fd880ff9640 1 -- 192.168.123.103:0/2662054 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd874010640 con 0x7fd884072120 2026-03-09T16:21:35.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.836+0000 7fd888cac640 1 -- 192.168.123.103:0/2662054 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd850005350 con 0x7fd884072120 2026-03-09T16:21:35.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.841+0000 7fd880ff9640 1 -- 192.168.123.103:0/2662054 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd8740026e0 con 0x7fd884072120 2026-03-09T16:21:35.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.842+0000 7fd880ff9640 1 --2- 192.168.123.103:0/2662054 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fd84c0776d0 0x7fd84c079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:35.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.842+0000 7fd880ff9640 1 -- 192.168.123.103:0/2662054 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fd874014070 con 0x7fd884072120 2026-03-09T16:21:35.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.842+0000 7fd880ff9640 1 -- 192.168.123.103:0/2662054 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd87409a290 con 0x7fd884072120 2026-03-09T16:21:35.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.843+0000 7fd882ffd640 1 --2- 192.168.123.103:0/2662054 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fd84c0776d0 0x7fd84c079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 
comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:35.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.846+0000 7fd882ffd640 1 --2- 192.168.123.103:0/2662054 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fd84c0776d0 0x7fd84c079b90 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7fd8841a8bd0 tx=0x7fd870005c50 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:35.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.957+0000 7fd888cac640 1 -- 192.168.123.103:0/2662054 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 19, "format": "json"} v 0) v1 -- 0x7fd8500058d0 con 0x7fd884072120 2026-03-09T16:21:35.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.960+0000 7fd880ff9640 1 -- 192.168.123.103:0/2662054 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 19, "format": "json"}]=0 dumped fsmap epoch 19 v32) v1 ==== 107+0+4186 (secure 0 0 0) 0x7fd8740626e0 con 0x7fd884072120 2026-03-09T16:21:35.961 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:35.961 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":19,"btime":"2026-03-09T16:19:37:067810+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24287,"name":"cephfs.vm05.sqhria","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24291,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/1621230713","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":1621230713},{"type":"v1","addr":"192.168.123.105:6825","nonce":1621230713}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":10},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":19,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:37.067809+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:35.961 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 19 2026-03-09T16:21:35.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.963+0000 7fd888cac640 1 -- 192.168.123.103:0/2662054 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fd84c0776d0 msgr2=0x7fd84c079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:35.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.963+0000 7fd888cac640 1 --2- 192.168.123.103:0/2662054 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fd84c0776d0 0x7fd84c079b90 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7fd8841a8bd0 tx=0x7fd870005c50 comp rx=0 tx=0).stop 2026-03-09T16:21:35.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.963+0000 7fd888cac640 1 -- 192.168.123.103:0/2662054 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd884072120 msgr2=0x7fd8841a7620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:35.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.963+0000 7fd888cac640 1 --2- 192.168.123.103:0/2662054 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd884072120 0x7fd8841a7620 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7fd87400ece0 tx=0x7fd87400c6a0 comp rx=0 tx=0).stop 2026-03-09T16:21:35.964 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.963+0000 7fd888cac640 1 -- 192.168.123.103:0/2662054 shutdown_connections 2026-03-09T16:21:35.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.963+0000 7fd888cac640 1 --2- 192.168.123.103:0/2662054 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fd84c0776d0 0x7fd84c079b90 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:35.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.963+0000 7fd888cac640 1 --2- 192.168.123.103:0/2662054 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd884072a40 0x7fd8841a7b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:35.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.963+0000 7fd888cac640 1 --2- 192.168.123.103:0/2662054 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd884072120 0x7fd8841a7620 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:35.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.963+0000 7fd888cac640 1 -- 192.168.123.103:0/2662054 >> 192.168.123.103:0/2662054 conn(0x7fd88406c7d0 msgr2=0x7fd88410dd40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:35.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.963+0000 7fd888cac640 1 -- 192.168.123.103:0/2662054 shutdown_connections 2026-03-09T16:21:35.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:35.963+0000 7fd888cac640 1 -- 192.168.123.103:0/2662054 wait complete. 2026-03-09T16:21:36.031 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 20 2026-03-09T16:21:36.182 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:36.223 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:36 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/3746077966' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-09T16:21:36.223 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:36 vm03.local ceph-mon[133973]: from='client.? 
192.168.123.103:0/2662054' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-09T16:21:36.223 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:36 vm03.local ceph-mon[133973]: pgmap v178: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:36.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.412+0000 7f5bc817b640 1 -- 192.168.123.103:0/2759316456 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bc01089d0 msgr2=0x7f5bc0108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:36.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.412+0000 7f5bc817b640 1 --2- 192.168.123.103:0/2759316456 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bc01089d0 0x7f5bc0108db0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f5bb00099b0 tx=0x7f5bb002f220 comp rx=0 tx=0).stop 2026-03-09T16:21:36.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.413+0000 7f5bc817b640 1 -- 192.168.123.103:0/2759316456 shutdown_connections 2026-03-09T16:21:36.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.413+0000 7f5bc817b640 1 --2- 192.168.123.103:0/2759316456 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5bc01029d0 0x7f5bc0102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:36.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.413+0000 7f5bc817b640 1 --2- 192.168.123.103:0/2759316456 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bc01089d0 0x7f5bc0108db0 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:36.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.413+0000 7f5bc817b640 1 -- 192.168.123.103:0/2759316456 >> 192.168.123.103:0/2759316456 conn(0x7f5bc00fe710 msgr2=0x7f5bc0100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:36.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.413+0000 7f5bc817b640 1 -- 192.168.123.103:0/2759316456 shutdown_connections 2026-03-09T16:21:36.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.413+0000 7f5bc817b640 1 -- 192.168.123.103:0/2759316456 wait complete. 
2026-03-09T16:21:36.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.414+0000 7f5bc817b640 1 Processor -- start 2026-03-09T16:21:36.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.414+0000 7f5bc817b640 1 -- start start 2026-03-09T16:21:36.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.414+0000 7f5bc817b640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bc01029d0 0x7f5bc01a0700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:36.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.414+0000 7f5bc817b640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5bc01089d0 0x7f5bc01a0c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:36.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.414+0000 7f5bc817b640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5bc019a7f0 con 0x7f5bc01029d0 2026-03-09T16:21:36.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.414+0000 7f5bc817b640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5bc019a960 con 0x7f5bc01089d0 2026-03-09T16:21:36.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.415+0000 7f5bc56ef640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5bc01089d0 0x7f5bc01a0c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:36.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.415+0000 7f5bc56ef640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5bc01089d0 0x7f5bc01a0c40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:43504/0 (socket says 192.168.123.103:43504) 2026-03-09T16:21:36.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.415+0000 7f5bc56ef640 1 -- 192.168.123.103:0/564201037 learned_addr learned my addr 192.168.123.103:0/564201037 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:36.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.415+0000 7f5bc5ef0640 1 --2- 192.168.123.103:0/564201037 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bc01029d0 0x7f5bc01a0700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:36.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.415+0000 7f5bc56ef640 1 -- 192.168.123.103:0/564201037 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bc01029d0 msgr2=0x7f5bc01a0700 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:36.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.415+0000 7f5bc56ef640 1 --2- 192.168.123.103:0/564201037 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bc01029d0 0x7f5bc01a0700 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:36.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.415+0000 7f5bc56ef640 1 -- 192.168.123.103:0/564201037 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5bb0009660 con 
0x7f5bc01089d0 2026-03-09T16:21:36.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.415+0000 7f5bc5ef0640 1 --2- 192.168.123.103:0/564201037 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bc01029d0 0x7f5bc01a0700 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T16:21:36.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.415+0000 7f5bc56ef640 1 --2- 192.168.123.103:0/564201037 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5bc01089d0 0x7f5bc01a0c40 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f5bb400b700 tx=0x7f5bb400bbd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:36.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.415+0000 7f5baeffd640 1 -- 192.168.123.103:0/564201037 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5bb400be90 con 0x7f5bc01089d0 2026-03-09T16:21:36.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.416+0000 7f5bc817b640 1 -- 192.168.123.103:0/564201037 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5bc019ac40 con 0x7f5bc01089d0 2026-03-09T16:21:36.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.416+0000 7f5bc817b640 1 -- 192.168.123.103:0/564201037 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5bc019b190 con 0x7f5bc01089d0 2026-03-09T16:21:36.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.416+0000 7f5baeffd640 1 -- 192.168.123.103:0/564201037 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5bb4002ba0 con 0x7f5bc01089d0 2026-03-09T16:21:36.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.416+0000 7f5baeffd640 1 -- 192.168.123.103:0/564201037 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5bb400ca40 con 0x7f5bc01089d0 2026-03-09T16:21:36.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.417+0000 7f5baeffd640 1 -- 192.168.123.103:0/564201037 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5bb400cc80 con 0x7f5bc01089d0 2026-03-09T16:21:36.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.417+0000 7f5baeffd640 1 --2- 192.168.123.103:0/564201037 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5b940778e0 0x7f5b94079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:36.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.418+0000 7f5baeffd640 1 -- 192.168.123.103:0/564201037 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f5bb4099040 con 0x7f5bc01089d0 2026-03-09T16:21:36.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.418+0000 7f5bc817b640 1 -- 192.168.123.103:0/564201037 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5bc0104110 con 0x7f5bc01089d0 2026-03-09T16:21:36.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.418+0000 7f5bc5ef0640 1 --2- 192.168.123.103:0/564201037 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5b940778e0 0x7f5b94079da0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:36.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.418+0000 7f5bc5ef0640 1 --2- 192.168.123.103:0/564201037 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5b940778e0 0x7f5b94079da0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f5bb0002410 tx=0x7f5bb003a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:36.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.421+0000 7f5baeffd640 1 -- 192.168.123.103:0/564201037 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5bb40627b0 con 0x7f5bc01089d0 2026-03-09T16:21:36.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:36 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/3746077966' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-09T16:21:36.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:36 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/2662054' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-09T16:21:36.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:36 vm05.local ceph-mon[108543]: pgmap v178: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:36.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.537+0000 7f5bc817b640 1 -- 192.168.123.103:0/564201037 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 20, "format": "json"} v 0) v1 -- 0x7f5bc019bd00 con 0x7f5bc01089d0 2026-03-09T16:21:36.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.538+0000 7f5baeffd640 1 -- 192.168.123.103:0/564201037 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 20, "format": "json"}]=0 dumped fsmap epoch 20 v32) v1 ==== 107+0+4197 (secure 0 0 0) 0x7f5bb4061f00 con 0x7f5bc01089d0 2026-03-09T16:21:36.539 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:36.539 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":20,"btime":"2026-03-09T16:19:37:075943+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24291,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/1621230713","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":1621230713},{"type":"v1","addr":"192.168.123.105:6825","nonce":1621230713}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses 
inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":20,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:37.075929+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24287},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24287":{"gid":24287,"name":"cephfs.vm05.sqhria","rank":0,"incarnation":20,"state":"up:replay","state_seq":2,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:36.539 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 20 2026-03-09T16:21:36.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.541+0000 7f5bc817b640 1 -- 192.168.123.103:0/564201037 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5b940778e0 msgr2=0x7f5b94079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:36.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.541+0000 7f5bc817b640 1 --2- 192.168.123.103:0/564201037 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5b940778e0 0x7f5b94079da0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f5bb0002410 tx=0x7f5bb003a040 comp rx=0 tx=0).stop 2026-03-09T16:21:36.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.541+0000 7f5bc817b640 1 -- 192.168.123.103:0/564201037 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5bc01089d0 msgr2=0x7f5bc01a0c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:36.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.541+0000 7f5bc817b640 1 --2- 192.168.123.103:0/564201037 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5bc01089d0 0x7f5bc01a0c40 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f5bb400b700 tx=0x7f5bb400bbd0 comp rx=0 tx=0).stop 2026-03-09T16:21:36.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.541+0000 7f5bc817b640 1 -- 192.168.123.103:0/564201037 shutdown_connections 2026-03-09T16:21:36.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.541+0000 7f5bc817b640 1 --2- 192.168.123.103:0/564201037 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5b940778e0 0x7f5b94079da0 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:36.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.541+0000 7f5bc817b640 1 --2- 192.168.123.103:0/564201037 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5bc01089d0 0x7f5bc01a0c40 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:36.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.541+0000 7f5bc817b640 1 --2- 192.168.123.103:0/564201037 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bc01029d0 0x7f5bc01a0700 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:36.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.541+0000 7f5bc817b640 1 -- 192.168.123.103:0/564201037 >> 192.168.123.103:0/564201037 conn(0x7f5bc00fe710 msgr2=0x7f5bc010c9e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:36.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.542+0000 7f5bc817b640 1 -- 192.168.123.103:0/564201037 shutdown_connections 2026-03-09T16:21:36.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.542+0000 7f5bc817b640 1 -- 192.168.123.103:0/564201037 wait complete. 
2026-03-09T16:21:36.585 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 21 2026-03-09T16:21:36.729 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:36.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.970+0000 7f8988759640 1 -- 192.168.123.103:0/669697351 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89801089d0 msgr2=0x7f8980108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:36.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.970+0000 7f8988759640 1 --2- 192.168.123.103:0/669697351 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89801089d0 0x7f8980108db0 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f89740099b0 tx=0x7f897402f240 comp rx=0 tx=0).stop 2026-03-09T16:21:36.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.971+0000 7f8988759640 1 -- 192.168.123.103:0/669697351 shutdown_connections 2026-03-09T16:21:36.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.971+0000 7f8988759640 1 --2- 192.168.123.103:0/669697351 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89801029d0 0x7f8980102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:36.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.971+0000 7f8988759640 1 --2- 192.168.123.103:0/669697351 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89801089d0 0x7f8980108db0 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:36.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.971+0000 7f8988759640 1 -- 192.168.123.103:0/669697351 >> 192.168.123.103:0/669697351 conn(0x7f89800fe710 msgr2=0x7f8980100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:36.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.971+0000 7f8988759640 1 -- 192.168.123.103:0/669697351 shutdown_connections 2026-03-09T16:21:36.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.971+0000 7f8988759640 1 -- 192.168.123.103:0/669697351 wait complete. 
2026-03-09T16:21:36.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.972+0000 7f8988759640 1 Processor -- start 2026-03-09T16:21:36.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.972+0000 7f8988759640 1 -- start start 2026-03-09T16:21:36.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.972+0000 7f8988759640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89801029d0 0x7f89801a0600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:36.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.972+0000 7f8988759640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89801089d0 0x7f89801a0b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:36.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.972+0000 7f8988759640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89801a1160 con 0x7f89801089d0 2026-03-09T16:21:36.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.972+0000 7f8988759640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f898019a6f0 con 0x7f89801029d0 2026-03-09T16:21:36.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.972+0000 7f8985ccd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89801089d0 0x7f89801a0b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:36.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.973+0000 7f8985ccd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89801089d0 0x7f89801a0b40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56746/0 (socket says 192.168.123.103:56746) 2026-03-09T16:21:36.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.973+0000 7f8985ccd640 1 -- 192.168.123.103:0/4074825854 learned_addr learned my addr 192.168.123.103:0/4074825854 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:36.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.973+0000 7f89864ce640 1 --2- 192.168.123.103:0/4074825854 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89801029d0 0x7f89801a0600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:36.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.973+0000 7f8985ccd640 1 -- 192.168.123.103:0/4074825854 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89801029d0 msgr2=0x7f89801a0600 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:36.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.973+0000 7f8985ccd640 1 --2- 192.168.123.103:0/4074825854 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89801029d0 0x7f89801a0600 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:36.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.973+0000 7f8985ccd640 1 -- 192.168.123.103:0/4074825854 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f8974009660 con 0x7f89801089d0 2026-03-09T16:21:36.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.973+0000 7f8985ccd640 1 --2- 192.168.123.103:0/4074825854 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89801089d0 0x7f89801a0b40 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f897000e990 tx=0x7f897000ee60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:36.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.973+0000 7f89677fe640 1 -- 192.168.123.103:0/4074825854 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f897000cd30 con 0x7f89801089d0 2026-03-09T16:21:36.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.973+0000 7f8988759640 1 -- 192.168.123.103:0/4074825854 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f898019a9d0 con 0x7f89801089d0 2026-03-09T16:21:36.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.973+0000 7f8988759640 1 -- 192.168.123.103:0/4074825854 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f898019af20 con 0x7f89801089d0 2026-03-09T16:21:36.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.974+0000 7f89677fe640 1 -- 192.168.123.103:0/4074825854 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f897000ce90 con 0x7f89801089d0 2026-03-09T16:21:36.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.974+0000 7f89677fe640 1 -- 192.168.123.103:0/4074825854 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8970022640 con 0x7f89801089d0 2026-03-09T16:21:36.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.975+0000 7f89677fe640 1 -- 192.168.123.103:0/4074825854 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f89700227a0 con 0x7f89801089d0 2026-03-09T16:21:36.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.975+0000 7f89677fe640 1 --2- 192.168.123.103:0/4074825854 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f89540778e0 0x7f8954079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:36.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.976+0000 7f89864ce640 1 --2- 192.168.123.103:0/4074825854 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f89540778e0 0x7f8954079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:36.977 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.976+0000 7f89864ce640 1 --2- 192.168.123.103:0/4074825854 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f89540778e0 0x7f8954079da0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f89740040c0 tx=0x7f897403a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:36.977 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.976+0000 7f89677fe640 1 -- 192.168.123.103:0/4074825854 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f8970014070 con 0x7f89801089d0 2026-03-09T16:21:36.977 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.976+0000 7f8988759640 1 -- 192.168.123.103:0/4074825854 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8948005350 con 0x7f89801089d0 2026-03-09T16:21:36.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:36.979+0000 7f89677fe640 1 -- 192.168.123.103:0/4074825854 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f89700639b0 con 0x7f89801089d0 2026-03-09T16:21:37.090 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.088+0000 7f8988759640 1 -- 192.168.123.103:0/4074825854 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 21, "format": "json"} v 0) v1 -- 0x7f89480051c0 con 0x7f89801089d0 2026-03-09T16:21:37.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.090+0000 7f89677fe640 1 -- 192.168.123.103:0/4074825854 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 21, "format": "json"}]=0 dumped fsmap epoch 21 v32) v1 ==== 107+0+4202 (secure 0 0 0) 0x7f8970063100 con 0x7f89801089d0 2026-03-09T16:21:37.092 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:37.092 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":21,"btime":"2026-03-09T16:19:41:447378+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24291,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/1621230713","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":1621230713},{"type":"v1","addr":"192.168.123.105:6825","nonce":1621230713}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":21,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:40.605156+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24287},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24287":{"gid":24287,"name":"cephfs.vm05.sqhria","rank":0,"incarnation":20,"state":"up:reconnect","state_seq":113,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:37.092 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 21 2026-03-09T16:21:37.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.093+0000 7f8988759640 1 -- 192.168.123.103:0/4074825854 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f89540778e0 msgr2=0x7f8954079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:37.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.093+0000 7f8988759640 1 --2- 192.168.123.103:0/4074825854 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f89540778e0 0x7f8954079da0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f89740040c0 tx=0x7f897403a040 comp rx=0 tx=0).stop 2026-03-09T16:21:37.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.093+0000 7f8988759640 1 -- 192.168.123.103:0/4074825854 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89801089d0 msgr2=0x7f89801a0b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:37.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.093+0000 7f8988759640 1 --2- 192.168.123.103:0/4074825854 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89801089d0 0x7f89801a0b40 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f897000e990 tx=0x7f897000ee60 comp rx=0 tx=0).stop 2026-03-09T16:21:37.094 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.094+0000 7f8988759640 1 -- 192.168.123.103:0/4074825854 shutdown_connections 2026-03-09T16:21:37.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.094+0000 7f8988759640 1 --2- 192.168.123.103:0/4074825854 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f89540778e0 0x7f8954079da0 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:37.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.094+0000 7f8988759640 1 --2- 192.168.123.103:0/4074825854 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f89801089d0 0x7f89801a0b40 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:37.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.094+0000 7f8988759640 1 --2- 192.168.123.103:0/4074825854 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89801029d0 0x7f89801a0600 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:37.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.094+0000 7f8988759640 1 -- 192.168.123.103:0/4074825854 >> 192.168.123.103:0/4074825854 conn(0x7f89800fe710 msgr2=0x7f898010b860 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:37.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.094+0000 7f8988759640 1 -- 192.168.123.103:0/4074825854 shutdown_connections 2026-03-09T16:21:37.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.095+0000 7f8988759640 1 -- 192.168.123.103:0/4074825854 wait complete. 2026-03-09T16:21:37.159 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 22 2026-03-09T16:21:37.304 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:37 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/564201037' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-09T16:21:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:37 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/4074825854' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-09T16:21:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:37 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/564201037' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-09T16:21:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:37 vm05.local ceph-mon[108543]: from='client.? 
192.168.123.103:0/4074825854' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-09T16:21:37.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.533+0000 7f30baaf9640 1 -- 192.168.123.103:0/2932050461 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30b41089d0 msgr2=0x7f30b4108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:37.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.533+0000 7f30baaf9640 1 --2- 192.168.123.103:0/2932050461 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30b41089d0 0x7f30b4108db0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f309c0099b0 tx=0x7f309c02f220 comp rx=0 tx=0).stop 2026-03-09T16:21:37.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.534+0000 7f30baaf9640 1 -- 192.168.123.103:0/2932050461 shutdown_connections 2026-03-09T16:21:37.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.534+0000 7f30baaf9640 1 --2- 192.168.123.103:0/2932050461 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30b41029d0 0x7f30b4102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:37.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.534+0000 7f30baaf9640 1 --2- 192.168.123.103:0/2932050461 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30b41089d0 0x7f30b4108db0 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:37.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.534+0000 7f30baaf9640 1 -- 192.168.123.103:0/2932050461 >> 192.168.123.103:0/2932050461 conn(0x7f30b40fe710 msgr2=0x7f30b4100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:37.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.534+0000 7f30baaf9640 1 -- 192.168.123.103:0/2932050461 shutdown_connections 2026-03-09T16:21:37.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.535+0000 7f30baaf9640 1 -- 192.168.123.103:0/2932050461 wait complete. 
2026-03-09T16:21:37.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.535+0000 7f30baaf9640 1 Processor -- start 2026-03-09T16:21:37.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.535+0000 7f30baaf9640 1 -- start start 2026-03-09T16:21:37.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.535+0000 7f30baaf9640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30b41029d0 0x7f30b41a0600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:37.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.535+0000 7f30baaf9640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30b41089d0 0x7f30b41a0b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:37.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.535+0000 7f30baaf9640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f30b41a10d0 con 0x7f30b41029d0 2026-03-09T16:21:37.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.535+0000 7f30baaf9640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f30b419a6f0 con 0x7f30b41089d0 2026-03-09T16:21:37.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.536+0000 7f30abfff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30b41089d0 0x7f30b41a0b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:37.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.536+0000 7f30b886e640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30b41029d0 0x7f30b41a0600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:37.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.536+0000 7f30abfff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30b41089d0 0x7f30b41a0b40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:43548/0 (socket says 192.168.123.103:43548) 2026-03-09T16:21:37.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.536+0000 7f30abfff640 1 -- 192.168.123.103:0/2135699842 learned_addr learned my addr 192.168.123.103:0/2135699842 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:37.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.536+0000 7f30b886e640 1 -- 192.168.123.103:0/2135699842 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30b41089d0 msgr2=0x7f30b41a0b40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:37.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.536+0000 7f30b886e640 1 --2- 192.168.123.103:0/2135699842 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30b41089d0 0x7f30b41a0b40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:37.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.536+0000 7f30b886e640 1 -- 192.168.123.103:0/2135699842 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f309c009660 con 0x7f30b41029d0 
2026-03-09T16:21:37.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.536+0000 7f30abfff640 1 --2- 192.168.123.103:0/2135699842 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30b41089d0 0x7f30b41a0b40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T16:21:37.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.536+0000 7f30b886e640 1 --2- 192.168.123.103:0/2135699842 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30b41029d0 0x7f30b41a0600 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f309c0099b0 tx=0x7f309c004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:37.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.537+0000 7f30a9ffb640 1 -- 192.168.123.103:0/2135699842 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f309c03d070 con 0x7f30b41029d0 2026-03-09T16:21:37.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.537+0000 7f30a9ffb640 1 -- 192.168.123.103:0/2135699842 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f309c038730 con 0x7f30b41029d0 2026-03-09T16:21:37.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.537+0000 7f30a9ffb640 1 -- 192.168.123.103:0/2135699842 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f309c041740 con 0x7f30b41029d0 2026-03-09T16:21:37.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.537+0000 7f30baaf9640 1 -- 192.168.123.103:0/2135699842 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f30b419a970 con 0x7f30b41029d0 2026-03-09T16:21:37.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.537+0000 7f30baaf9640 1 -- 192.168.123.103:0/2135699842 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f30b419ade0 con 0x7f30b41029d0 2026-03-09T16:21:37.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.538+0000 7f30a9ffb640 1 -- 192.168.123.103:0/2135699842 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f309c02fa80 con 0x7f30b41029d0 2026-03-09T16:21:37.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.538+0000 7f30baaf9640 1 -- 192.168.123.103:0/2135699842 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3080005350 con 0x7f30b41029d0 2026-03-09T16:21:37.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.540+0000 7f30a9ffb640 1 --2- 192.168.123.103:0/2135699842 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f30840776d0 0x7f3084079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:37.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.540+0000 7f30a9ffb640 1 -- 192.168.123.103:0/2135699842 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f309c0be5a0 con 0x7f30b41029d0 2026-03-09T16:21:37.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.541+0000 7f30a9ffb640 1 -- 192.168.123.103:0/2135699842 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7f309c086c50 con 0x7f30b41029d0 2026-03-09T16:21:37.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.541+0000 7f30abfff640 1 --2- 192.168.123.103:0/2135699842 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f30840776d0 0x7f3084079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:37.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.542+0000 7f30abfff640 1 --2- 192.168.123.103:0/2135699842 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f30840776d0 0x7f3084079b90 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f30a40046d0 tx=0x7f30a4004420 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:37.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.652+0000 7f30baaf9640 1 -- 192.168.123.103:0/2135699842 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 22, "format": "json"} v 0) v1 -- 0x7f3080005600 con 0x7f30b41029d0 2026-03-09T16:21:37.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.652+0000 7f30a9ffb640 1 -- 192.168.123.103:0/2135699842 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 22, "format": "json"}]=0 dumped fsmap epoch 22 v32) v1 ==== 107+0+4199 (secure 0 0 0) 0x7f309c0863a0 con 0x7f30b41029d0 2026-03-09T16:21:37.653 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:37.653 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":22,"btime":"2026-03-09T16:19:42:510832+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24291,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/1621230713","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":1621230713},{"type":"v1","addr":"192.168.123.105:6825","nonce":1621230713}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds 
uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":22,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:41.515671+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24287},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24287":{"gid":24287,"name":"cephfs.vm05.sqhria","rank":0,"incarnation":20,"state":"up:rejoin","state_seq":114,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:37.653 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 22 2026-03-09T16:21:37.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.654+0000 7f30baaf9640 1 -- 192.168.123.103:0/2135699842 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f30840776d0 msgr2=0x7f3084079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:37.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.654+0000 7f30baaf9640 1 --2- 192.168.123.103:0/2135699842 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f30840776d0 0x7f3084079b90 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f30a40046d0 tx=0x7f30a4004420 comp rx=0 tx=0).stop 2026-03-09T16:21:37.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.654+0000 7f30baaf9640 1 -- 192.168.123.103:0/2135699842 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30b41029d0 msgr2=0x7f30b41a0600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:37.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.654+0000 7f30baaf9640 1 --2- 192.168.123.103:0/2135699842 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30b41029d0 0x7f30b41a0600 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f309c0099b0 tx=0x7f309c004290 comp rx=0 tx=0).stop 2026-03-09T16:21:37.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.655+0000 7f30baaf9640 1 -- 192.168.123.103:0/2135699842 shutdown_connections 2026-03-09T16:21:37.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.655+0000 7f30baaf9640 1 --2- 192.168.123.103:0/2135699842 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f30840776d0 0x7f3084079b90 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:37.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.655+0000 7f30baaf9640 1 --2- 192.168.123.103:0/2135699842 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f30b41089d0 0x7f30b41a0b40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:37.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.655+0000 7f30baaf9640 1 --2- 192.168.123.103:0/2135699842 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f30b41029d0 0x7f30b41a0600 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:37.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.655+0000 7f30baaf9640 1 -- 192.168.123.103:0/2135699842 >> 192.168.123.103:0/2135699842 conn(0x7f30b40fe710 msgr2=0x7f30b410b860 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:37.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.655+0000 7f30baaf9640 1 -- 192.168.123.103:0/2135699842 shutdown_connections 2026-03-09T16:21:37.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:37.655+0000 7f30baaf9640 1 -- 192.168.123.103:0/2135699842 wait complete. 
2026-03-09T16:21:37.699 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 23 2026-03-09T16:21:37.837 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:38.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.083+0000 7fbe30f67640 1 -- 192.168.123.103:0/2620856306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe2c076040 msgr2=0x7fbe2c111330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:38.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.083+0000 7fbe30f67640 1 --2- 192.168.123.103:0/2620856306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe2c076040 0x7fbe2c111330 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fbe200099b0 tx=0x7fbe2002f260 comp rx=0 tx=0).stop 2026-03-09T16:21:38.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.084+0000 7fbe30f67640 1 -- 192.168.123.103:0/2620856306 shutdown_connections 2026-03-09T16:21:38.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.084+0000 7fbe30f67640 1 --2- 192.168.123.103:0/2620856306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe2c076040 0x7fbe2c111330 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:38.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.084+0000 7fbe30f67640 1 --2- 192.168.123.103:0/2620856306 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe2c075720 0x7fbe2c075b00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:38.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.084+0000 7fbe30f67640 1 -- 192.168.123.103:0/2620856306 >> 192.168.123.103:0/2620856306 conn(0x7fbe2c0fe710 msgr2=0x7fbe2c100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:38.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.084+0000 7fbe30f67640 1 -- 192.168.123.103:0/2620856306 shutdown_connections 2026-03-09T16:21:38.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.084+0000 7fbe30f67640 1 -- 192.168.123.103:0/2620856306 wait complete. 
2026-03-09T16:21:38.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.084+0000 7fbe30f67640 1 Processor -- start 2026-03-09T16:21:38.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.085+0000 7fbe30f67640 1 -- start start 2026-03-09T16:21:38.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.085+0000 7fbe30f67640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe2c075720 0x7fbe2c19ee80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:38.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.085+0000 7fbe30f67640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe2c076040 0x7fbe2c19f3c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:38.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.085+0000 7fbe30f67640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbe2c19fa50 con 0x7fbe2c076040 2026-03-09T16:21:38.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.085+0000 7fbe30f67640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbe2c1a37c0 con 0x7fbe2c075720 2026-03-09T16:21:38.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.085+0000 7fbe29d74640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe2c076040 0x7fbe2c19f3c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:38.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.085+0000 7fbe29d74640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe2c076040 0x7fbe2c19f3c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56784/0 (socket says 192.168.123.103:56784) 2026-03-09T16:21:38.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.085+0000 7fbe29d74640 1 -- 192.168.123.103:0/2122535618 learned_addr learned my addr 192.168.123.103:0/2122535618 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:38.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.085+0000 7fbe29d74640 1 -- 192.168.123.103:0/2122535618 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe2c075720 msgr2=0x7fbe2c19ee80 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T16:21:38.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.085+0000 7fbe29d74640 1 --2- 192.168.123.103:0/2122535618 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe2c075720 0x7fbe2c19ee80 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:38.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.085+0000 7fbe29d74640 1 -- 192.168.123.103:0/2122535618 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbe20009660 con 0x7fbe2c076040 2026-03-09T16:21:38.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.085+0000 7fbe29d74640 1 --2- 192.168.123.103:0/2122535618 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe2c076040 0x7fbe2c19f3c0 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7fbe20002410 tx=0x7fbe20004290 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:38.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.086+0000 7fbe137fe640 1 -- 192.168.123.103:0/2122535618 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbe2003d070 con 0x7fbe2c076040 2026-03-09T16:21:38.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.086+0000 7fbe137fe640 1 -- 192.168.123.103:0/2122535618 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbe200043b0 con 0x7fbe2c076040 2026-03-09T16:21:38.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.086+0000 7fbe137fe640 1 -- 192.168.123.103:0/2122535618 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbe20041850 con 0x7fbe2c076040 2026-03-09T16:21:38.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.086+0000 7fbe30f67640 1 -- 192.168.123.103:0/2122535618 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbe2c1a3a40 con 0x7fbe2c076040 2026-03-09T16:21:38.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.086+0000 7fbe30f67640 1 -- 192.168.123.103:0/2122535618 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbe2c1a3f30 con 0x7fbe2c076040 2026-03-09T16:21:38.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.087+0000 7fbe30f67640 1 -- 192.168.123.103:0/2122535618 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbdf0005350 con 0x7fbe2c076040 2026-03-09T16:21:38.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.090+0000 7fbe137fe640 1 -- 192.168.123.103:0/2122535618 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fbe2002fc90 con 0x7fbe2c076040 2026-03-09T16:21:38.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.090+0000 7fbe137fe640 1 --2- 192.168.123.103:0/2122535618 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbe040778e0 0x7fbe04079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:38.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.091+0000 7fbe137fe640 1 -- 192.168.123.103:0/2122535618 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fbe200bec60 con 0x7fbe2c076040 2026-03-09T16:21:38.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.091+0000 7fbe137fe640 1 -- 192.168.123.103:0/2122535618 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbe200bf0e0 con 0x7fbe2c076040 2026-03-09T16:21:38.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.091+0000 7fbe2a575640 1 --2- 192.168.123.103:0/2122535618 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbe040778e0 0x7fbe04079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:38.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.095+0000 7fbe2a575640 1 --2- 192.168.123.103:0/2122535618 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbe040778e0 0x7fbe04079da0 secure :-1 s=READY 
pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fbe140097b0 tx=0x7fbe14006d20 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:38.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.207+0000 7fbe30f67640 1 -- 192.168.123.103:0/2122535618 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 23, "format": "json"} v 0) v1 -- 0x7fbdf00051c0 con 0x7fbe2c076040 2026-03-09T16:21:38.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.208+0000 7fbe137fe640 1 -- 192.168.123.103:0/2122535618 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 23, "format": "json"}]=0 dumped fsmap epoch 23 v32) v1 ==== 107+0+5059 (secure 0 0 0) 0x7fbe20087310 con 0x7fbe2c076040 2026-03-09T16:21:38.208 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:38.209 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":23,"btime":"2026-03-09T16:19:43:516826+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24291,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/1621230713","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":1621230713},{"type":"v1","addr":"192.168.123.105:6825","nonce":1621230713}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":34272,"name":"cephfs.vm03.kntrco","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2230073446","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2230073446},{"type":"v1","addr":"192.168.123.103:6829","nonce":2230073446}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base 
v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":23,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:43.516826+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24287},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24287":{"gid":24287,"name":"cephfs.vm05.sqhria","rank":0,"incarnation":20,"state":"up:active","state_seq":115,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24287,"qdb_cluster":[24287]},"id":1}]} 2026-03-09T16:21:38.209 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 23 2026-03-09T16:21:38.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.210+0000 7fbe30f67640 1 -- 192.168.123.103:0/2122535618 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbe040778e0 msgr2=0x7fbe04079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:38.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.210+0000 7fbe30f67640 1 --2- 192.168.123.103:0/2122535618 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbe040778e0 0x7fbe04079da0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fbe140097b0 tx=0x7fbe14006d20 comp rx=0 tx=0).stop 2026-03-09T16:21:38.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.210+0000 7fbe30f67640 1 -- 192.168.123.103:0/2122535618 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe2c076040 msgr2=0x7fbe2c19f3c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:38.211 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.210+0000 7fbe30f67640 1 --2- 192.168.123.103:0/2122535618 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe2c076040 0x7fbe2c19f3c0 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7fbe20002410 tx=0x7fbe20004290 comp rx=0 tx=0).stop 2026-03-09T16:21:38.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.210+0000 7fbe30f67640 1 -- 192.168.123.103:0/2122535618 shutdown_connections 2026-03-09T16:21:38.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.210+0000 7fbe30f67640 1 --2- 192.168.123.103:0/2122535618 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fbe040778e0 0x7fbe04079da0 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:38.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.210+0000 7fbe30f67640 1 --2- 192.168.123.103:0/2122535618 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe2c076040 0x7fbe2c19f3c0 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:38.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.211+0000 7fbe30f67640 1 --2- 192.168.123.103:0/2122535618 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe2c075720 0x7fbe2c19ee80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:38.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.211+0000 7fbe30f67640 1 -- 192.168.123.103:0/2122535618 >> 192.168.123.103:0/2122535618 conn(0x7fbe2c0fe710 msgr2=0x7fbe2c0ffdf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:38.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.211+0000 7fbe30f67640 1 -- 192.168.123.103:0/2122535618 shutdown_connections 2026-03-09T16:21:38.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.211+0000 7fbe30f67640 1 -- 192.168.123.103:0/2122535618 wait complete. 2026-03-09T16:21:38.270 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 24 2026-03-09T16:21:38.406 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:38.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:38 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/2135699842' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-09T16:21:38.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:38 vm03.local ceph-mon[133973]: pgmap v179: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:38.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:38 vm05.local ceph-mon[108543]: from='client.? 
192.168.123.103:0/2135699842' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-09T16:21:38.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:38 vm05.local ceph-mon[108543]: pgmap v179: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:38.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.652+0000 7f9c9ace6640 1 -- 192.168.123.103:0/1260181077 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c94102a00 msgr2=0x7f9c94102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:38.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.652+0000 7f9c9ace6640 1 --2- 192.168.123.103:0/1260181077 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c94102a00 0x7f9c94102e60 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f9c84009a00 tx=0x7f9c8402f290 comp rx=0 tx=0).stop 2026-03-09T16:21:38.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.653+0000 7f9c9ace6640 1 -- 192.168.123.103:0/1260181077 shutdown_connections 2026-03-09T16:21:38.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.653+0000 7f9c9ace6640 1 --2- 192.168.123.103:0/1260181077 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c94102a00 0x7f9c94102e60 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:38.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.653+0000 7f9c9ace6640 1 --2- 192.168.123.103:0/1260181077 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c94108a00 0x7f9c94108de0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:38.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.653+0000 7f9c9ace6640 1 -- 192.168.123.103:0/1260181077 >> 192.168.123.103:0/1260181077 conn(0x7f9c940fe700 msgr2=0x7f9c94100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:38.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.653+0000 7f9c9ace6640 1 -- 192.168.123.103:0/1260181077 shutdown_connections 2026-03-09T16:21:38.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.653+0000 7f9c9ace6640 1 -- 192.168.123.103:0/1260181077 wait complete. 
2026-03-09T16:21:38.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.654+0000 7f9c9ace6640 1 Processor -- start 2026-03-09T16:21:38.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.654+0000 7f9c9ace6640 1 -- start start 2026-03-09T16:21:38.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.654+0000 7f9c9ace6640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c94102a00 0x7f9c941a06b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:38.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.654+0000 7f9c9ace6640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c94108a00 0x7f9c941a0bf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:38.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.654+0000 7f9c9ace6640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9c941a1240 con 0x7f9c94108a00 2026-03-09T16:21:38.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.654+0000 7f9c9ace6640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9c9419a7a0 con 0x7f9c94102a00 2026-03-09T16:21:38.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.654+0000 7f9c8bfff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c94108a00 0x7f9c941a0bf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:38.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.654+0000 7f9c8bfff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c94108a00 0x7f9c941a0bf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36648/0 (socket says 192.168.123.103:36648) 2026-03-09T16:21:38.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.654+0000 7f9c8bfff640 1 -- 192.168.123.103:0/907426843 learned_addr learned my addr 192.168.123.103:0/907426843 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:38.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.654+0000 7f9c98a5b640 1 --2- 192.168.123.103:0/907426843 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c94102a00 0x7f9c941a06b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:38.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.655+0000 7f9c8bfff640 1 -- 192.168.123.103:0/907426843 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c94102a00 msgr2=0x7f9c941a06b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:38.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.655+0000 7f9c8bfff640 1 --2- 192.168.123.103:0/907426843 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c94102a00 0x7f9c941a06b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:38.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.655+0000 7f9c8bfff640 1 -- 192.168.123.103:0/907426843 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9c84009660 con 
0x7f9c94108a00 2026-03-09T16:21:38.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.655+0000 7f9c8bfff640 1 --2- 192.168.123.103:0/907426843 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c94108a00 0x7f9c941a0bf0 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f9c840386a0 tx=0x7f9c840386d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:38.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.655+0000 7f9c89ffb640 1 -- 192.168.123.103:0/907426843 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9c8403d070 con 0x7f9c94108a00 2026-03-09T16:21:38.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.655+0000 7f9c89ffb640 1 -- 192.168.123.103:0/907426843 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9c84002a50 con 0x7f9c94108a00 2026-03-09T16:21:38.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.655+0000 7f9c89ffb640 1 -- 192.168.123.103:0/907426843 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9c84031110 con 0x7f9c94108a00 2026-03-09T16:21:38.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.655+0000 7f9c9ace6640 1 -- 192.168.123.103:0/907426843 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9c9419aa20 con 0x7f9c94108a00 2026-03-09T16:21:38.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.656+0000 7f9c9ace6640 1 -- 192.168.123.103:0/907426843 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9c9419ae90 con 0x7f9c94108a00 2026-03-09T16:21:38.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.657+0000 7f9c89ffb640 1 -- 192.168.123.103:0/907426843 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9c8402faf0 con 0x7f9c94108a00 2026-03-09T16:21:38.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.657+0000 7f9c9ace6640 1 -- 192.168.123.103:0/907426843 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9c60005350 con 0x7f9c94108a00 2026-03-09T16:21:38.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.659+0000 7f9c89ffb640 1 --2- 192.168.123.103:0/907426843 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9c640776d0 0x7f9c64079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:38.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.659+0000 7f9c98a5b640 1 --2- 192.168.123.103:0/907426843 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9c640776d0 0x7f9c64079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:38.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.659+0000 7f9c89ffb640 1 -- 192.168.123.103:0/907426843 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f9c840be9c0 con 0x7f9c94108a00 2026-03-09T16:21:38.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.659+0000 7f9c98a5b640 1 --2- 192.168.123.103:0/907426843 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] 
conn(0x7f9c640776d0 0x7f9c64079b90 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f9c7c004620 tx=0x7f9c7c00a400 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:38.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.661+0000 7f9c89ffb640 1 -- 192.168.123.103:0/907426843 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9c84087070 con 0x7f9c94108a00 2026-03-09T16:21:38.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.770+0000 7f9c9ace6640 1 -- 192.168.123.103:0/907426843 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 24, "format": "json"} v 0) v1 -- 0x7f9c60005600 con 0x7f9c94108a00 2026-03-09T16:21:38.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.773+0000 7f9c89ffb640 1 -- 192.168.123.103:0/907426843 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 24, "format": "json"}]=0 dumped fsmap epoch 24 v32) v1 ==== 107+0+4276 (secure 0 0 0) 0x7f9c840867c0 con 0x7f9c94108a00 2026-03-09T16:21:38.774 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:38.774 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":24,"btime":"2026-03-09T16:19:47:884222+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34272,"name":"cephfs.vm03.kntrco","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2230073446","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2230073446},{"type":"v1","addr":"192.168.123.103:6829","nonce":2230073446}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":23,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:43.516826+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24287},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24287":{"gid":24287,"name":"cephfs.vm05.sqhria","rank":0,"incarnation":20,"state":"up:active","state_seq":115,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24287,"qdb_cluster":[24287]},"id":1}]} 2026-03-09T16:21:38.774 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 24 2026-03-09T16:21:38.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.775+0000 7f9c9ace6640 1 -- 192.168.123.103:0/907426843 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9c640776d0 msgr2=0x7f9c64079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.775+0000 7f9c9ace6640 1 --2- 192.168.123.103:0/907426843 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9c640776d0 0x7f9c64079b90 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f9c7c004620 tx=0x7f9c7c00a400 comp rx=0 tx=0).stop 2026-03-09T16:21:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.775+0000 7f9c9ace6640 1 -- 192.168.123.103:0/907426843 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c94108a00 msgr2=0x7f9c941a0bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.775+0000 7f9c9ace6640 1 --2- 192.168.123.103:0/907426843 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c94108a00 0x7f9c941a0bf0 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f9c840386a0 tx=0x7f9c840386d0 comp rx=0 tx=0).stop 2026-03-09T16:21:38.776 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.775+0000 7f9c9ace6640 1 -- 192.168.123.103:0/907426843 shutdown_connections 2026-03-09T16:21:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.775+0000 7f9c9ace6640 1 --2- 192.168.123.103:0/907426843 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f9c640776d0 0x7f9c64079b90 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.775+0000 7f9c9ace6640 1 --2- 192.168.123.103:0/907426843 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c94108a00 0x7f9c941a0bf0 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.775+0000 7f9c9ace6640 1 --2- 192.168.123.103:0/907426843 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c94102a00 0x7f9c941a06b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.775+0000 7f9c9ace6640 1 -- 192.168.123.103:0/907426843 >> 192.168.123.103:0/907426843 conn(0x7f9c940fe700 msgr2=0x7f9c9410c9c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.775+0000 7f9c9ace6640 1 -- 192.168.123.103:0/907426843 shutdown_connections 2026-03-09T16:21:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:38.775+0000 7f9c9ace6640 1 -- 192.168.123.103:0/907426843 wait complete. 2026-03-09T16:21:38.832 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 25 2026-03-09T16:21:38.978 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:39.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.224+0000 7eff2d94a640 1 -- 192.168.123.103:0/2912549267 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff28075720 msgr2=0x7eff28075b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:39.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.224+0000 7eff2d94a640 1 --2- 192.168.123.103:0/2912549267 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff28075720 0x7eff28075b00 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7eff10009a00 tx=0x7eff1002f270 comp rx=0 tx=0).stop 2026-03-09T16:21:39.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.224+0000 7eff2d94a640 1 -- 192.168.123.103:0/2912549267 shutdown_connections 2026-03-09T16:21:39.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.224+0000 7eff2d94a640 1 --2- 192.168.123.103:0/2912549267 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff28076040 0x7eff28111330 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:39.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.224+0000 7eff2d94a640 1 --2- 192.168.123.103:0/2912549267 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff28075720 0x7eff28075b00 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:39.225 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.224+0000 7eff2d94a640 1 -- 192.168.123.103:0/2912549267 >> 192.168.123.103:0/2912549267 conn(0x7eff280fe710 msgr2=0x7eff28100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:39.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.225+0000 7eff2d94a640 1 -- 192.168.123.103:0/2912549267 shutdown_connections 2026-03-09T16:21:39.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.225+0000 7eff2d94a640 1 -- 192.168.123.103:0/2912549267 wait complete. 2026-03-09T16:21:39.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.225+0000 7eff2d94a640 1 Processor -- start 2026-03-09T16:21:39.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.225+0000 7eff2d94a640 1 -- start start 2026-03-09T16:21:39.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.225+0000 7eff2d94a640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff28075720 0x7eff2819ee50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:39.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.225+0000 7eff2d94a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff28076040 0x7eff2819f390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:39.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.225+0000 7eff2d94a640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7eff2819fa20 con 0x7eff28076040 2026-03-09T16:21:39.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.226+0000 7eff2d94a640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7eff281a3740 con 0x7eff28075720 2026-03-09T16:21:39.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.226+0000 7eff26ffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff28075720 0x7eff2819ee50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:39.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.226+0000 7eff26ffd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff28075720 0x7eff2819ee50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:55388/0 (socket says 192.168.123.103:55388) 2026-03-09T16:21:39.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.226+0000 7eff26ffd640 1 -- 192.168.123.103:0/3794197570 learned_addr learned my addr 192.168.123.103:0/3794197570 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:39.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.226+0000 7eff26ffd640 1 -- 192.168.123.103:0/3794197570 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff28076040 msgr2=0x7eff2819f390 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T16:21:39.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.226+0000 7eff267fc640 1 --2- 192.168.123.103:0/3794197570 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff28076040 0x7eff2819f390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:39.227 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.227+0000 7eff26ffd640 1 --2- 192.168.123.103:0/3794197570 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff28076040 0x7eff2819f390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:39.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.227+0000 7eff26ffd640 1 -- 192.168.123.103:0/3794197570 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7eff10009660 con 0x7eff28075720 2026-03-09T16:21:39.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.227+0000 7eff26ffd640 1 --2- 192.168.123.103:0/3794197570 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff28075720 0x7eff2819ee50 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7eff1002f780 tx=0x7eff10004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:39.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.227+0000 7eff267fc640 1 --2- 192.168.123.103:0/3794197570 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff28076040 0x7eff2819f390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T16:21:39.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.227+0000 7eff2c948640 1 -- 192.168.123.103:0/3794197570 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7eff10004430 con 0x7eff28075720 2026-03-09T16:21:39.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.227+0000 7eff2c948640 1 -- 192.168.123.103:0/3794197570 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7eff10002e10 con 0x7eff28075720 2026-03-09T16:21:39.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.227+0000 7eff2c948640 1 -- 192.168.123.103:0/3794197570 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7eff10041820 con 0x7eff28075720 2026-03-09T16:21:39.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.227+0000 7eff2d94a640 1 -- 192.168.123.103:0/3794197570 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7eff281a39c0 con 0x7eff28075720 2026-03-09T16:21:39.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.227+0000 7eff2d94a640 1 -- 192.168.123.103:0/3794197570 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7eff281a3eb0 con 0x7eff28075720 2026-03-09T16:21:39.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.228+0000 7eff2c948640 1 -- 192.168.123.103:0/3794197570 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7eff1003f070 con 0x7eff28075720 2026-03-09T16:21:39.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.229+0000 7eff2d94a640 1 -- 192.168.123.103:0/3794197570 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7eff28076e60 con 0x7eff28075720 2026-03-09T16:21:39.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.229+0000 7eff2c948640 1 --2- 192.168.123.103:0/3794197570 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efef8077890 0x7efef8079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 
comp rx=0 tx=0).connect 2026-03-09T16:21:39.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.229+0000 7eff2c948640 1 -- 192.168.123.103:0/3794197570 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7eff100be3a0 con 0x7eff28075720 2026-03-09T16:21:39.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.230+0000 7eff267fc640 1 --2- 192.168.123.103:0/3794197570 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efef8077890 0x7efef8079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:39.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.230+0000 7eff267fc640 1 --2- 192.168.123.103:0/3794197570 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efef8077890 0x7efef8079d50 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7eff281a03b0 tx=0x7eff14005f50 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:39.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.232+0000 7eff2c948640 1 -- 192.168.123.103:0/3794197570 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7eff100863b0 con 0x7eff28075720 2026-03-09T16:21:39.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:39 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/2122535618' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-09T16:21:39.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:39 vm03.local ceph-mon[133973]: from='client.? 
192.168.123.103:0/907426843' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-09T16:21:39.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:39 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:21:39.347 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.345+0000 7eff2d94a640 1 -- 192.168.123.103:0/3794197570 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 25, "format": "json"} v 0) v1 -- 0x7eff281141d0 con 0x7eff28075720 2026-03-09T16:21:39.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.346+0000 7eff2c948640 1 -- 192.168.123.103:0/3794197570 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 25, "format": "json"}]=0 dumped fsmap epoch 25 v32) v1 ==== 107+0+5127 (secure 0 0 0) 0x7eff100861d0 con 0x7eff28075720 2026-03-09T16:21:39.349 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:39.349 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":25,"btime":"2026-03-09T16:19:49:742772+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34272,"name":"cephfs.vm03.kntrco","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2230073446","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2230073446},{"type":"v1","addr":"192.168.123.103:6829","nonce":2230073446}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23},{"gid":34276,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/3112850580","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":3112850580},{"type":"v1","addr":"192.168.123.105:6825","nonce":3112850580}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":25},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":23,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:43.516826+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24287},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24287":{"gid":24287,"name":"cephfs.vm05.sqhria","rank":0,"incarnation":20,"state":"up:active","state_seq":115,"addr":"192.168.123.105:6827/1138709798","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":1138709798},{"type":"v1","addr":"192.168.123.105:6827","nonce":1138709798}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24287,"qdb_cluster":[24287]},"id":1}]} 2026-03-09T16:21:39.349 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 25 2026-03-09T16:21:39.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.350+0000 7eff2d94a640 1 -- 192.168.123.103:0/3794197570 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efef8077890 msgr2=0x7efef8079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:39.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.350+0000 7eff2d94a640 1 --2- 192.168.123.103:0/3794197570 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] 
conn(0x7efef8077890 0x7efef8079d50 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7eff281a03b0 tx=0x7eff14005f50 comp rx=0 tx=0).stop 2026-03-09T16:21:39.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.351+0000 7eff2d94a640 1 -- 192.168.123.103:0/3794197570 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff28075720 msgr2=0x7eff2819ee50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:39.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.351+0000 7eff2d94a640 1 --2- 192.168.123.103:0/3794197570 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff28075720 0x7eff2819ee50 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7eff1002f780 tx=0x7eff10004290 comp rx=0 tx=0).stop 2026-03-09T16:21:39.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.351+0000 7eff2d94a640 1 -- 192.168.123.103:0/3794197570 shutdown_connections 2026-03-09T16:21:39.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.351+0000 7eff2d94a640 1 --2- 192.168.123.103:0/3794197570 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efef8077890 0x7efef8079d50 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:39.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.351+0000 7eff2d94a640 1 --2- 192.168.123.103:0/3794197570 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7eff28076040 0x7eff2819f390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:39.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.351+0000 7eff2d94a640 1 --2- 192.168.123.103:0/3794197570 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7eff28075720 0x7eff2819ee50 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:39.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.351+0000 7eff2d94a640 1 -- 192.168.123.103:0/3794197570 >> 192.168.123.103:0/3794197570 conn(0x7eff280fe710 msgr2=0x7eff280ffdc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:39.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.352+0000 7eff2d94a640 1 -- 192.168.123.103:0/3794197570 shutdown_connections 2026-03-09T16:21:39.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.352+0000 7eff2d94a640 1 -- 192.168.123.103:0/3794197570 wait complete. 2026-03-09T16:21:39.414 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 26 2026-03-09T16:21:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:39 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/2122535618' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-09T16:21:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:39 vm05.local ceph-mon[108543]: from='client.? 
192.168.123.103:0/907426843' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-09T16:21:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:39 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:21:39.575 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:39.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.822+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/3546934173 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb2f81029d0 msgr2=0x7fb2f8102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:39.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.822+0000 7fb2fe2bd640 1 --2- 192.168.123.103:0/3546934173 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb2f81029d0 0x7fb2f8102e30 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7fb2e40099b0 tx=0x7fb2e402f220 comp rx=0 tx=0).stop 2026-03-09T16:21:39.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.822+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/3546934173 shutdown_connections 2026-03-09T16:21:39.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.822+0000 7fb2fe2bd640 1 --2- 192.168.123.103:0/3546934173 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb2f81029d0 0x7fb2f8102e30 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:39.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.822+0000 7fb2fe2bd640 1 --2- 192.168.123.103:0/3546934173 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2f81089d0 0x7fb2f8108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:39.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.822+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/3546934173 >> 192.168.123.103:0/3546934173 conn(0x7fb2f80fe710 msgr2=0x7fb2f8100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:39.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.823+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/3546934173 shutdown_connections 2026-03-09T16:21:39.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.823+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/3546934173 wait complete. 
2026-03-09T16:21:39.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.823+0000 7fb2fe2bd640 1 Processor -- start 2026-03-09T16:21:39.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.823+0000 7fb2fe2bd640 1 -- start start 2026-03-09T16:21:39.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.824+0000 7fb2fe2bd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb2f81029d0 0x7fb2f81a0650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:39.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.824+0000 7fb2fe2bd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2f81089d0 0x7fb2f81a0b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:39.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.824+0000 7fb2fe2bd640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb2f819a790 con 0x7fb2f81029d0 2026-03-09T16:21:39.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.824+0000 7fb2fe2bd640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb2f819a900 con 0x7fb2f81089d0 2026-03-09T16:21:39.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.824+0000 7fb2f77fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2f81089d0 0x7fb2f81a0b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:39.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.824+0000 7fb2f7fff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb2f81029d0 0x7fb2f81a0650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:39.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.824+0000 7fb2f77fe640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2f81089d0 0x7fb2f81a0b90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:55410/0 (socket says 192.168.123.103:55410) 2026-03-09T16:21:39.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.824+0000 7fb2f77fe640 1 -- 192.168.123.103:0/2802469318 learned_addr learned my addr 192.168.123.103:0/2802469318 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:39.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.824+0000 7fb2f77fe640 1 -- 192.168.123.103:0/2802469318 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb2f81029d0 msgr2=0x7fb2f81a0650 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:39.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.824+0000 7fb2f77fe640 1 --2- 192.168.123.103:0/2802469318 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb2f81029d0 0x7fb2f81a0650 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:39.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.824+0000 7fb2f77fe640 1 -- 192.168.123.103:0/2802469318 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb2e4009660 con 0x7fb2f81089d0 
2026-03-09T16:21:39.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.824+0000 7fb2f7fff640 1 --2- 192.168.123.103:0/2802469318 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb2f81029d0 0x7fb2f81a0650 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T16:21:39.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.825+0000 7fb2f77fe640 1 --2- 192.168.123.103:0/2802469318 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2f81089d0 0x7fb2f81a0b90 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fb2e4002410 tx=0x7fb2e4004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:39.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.825+0000 7fb2f57fa640 1 -- 192.168.123.103:0/2802469318 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb2e403d070 con 0x7fb2f81089d0 2026-03-09T16:21:39.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.825+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/2802469318 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb2f819ab80 con 0x7fb2f81089d0 2026-03-09T16:21:39.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.825+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/2802469318 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb2f819b070 con 0x7fb2f81089d0 2026-03-09T16:21:39.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.826+0000 7fb2f57fa640 1 -- 192.168.123.103:0/2802469318 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb2e40043b0 con 0x7fb2f81089d0 2026-03-09T16:21:39.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.826+0000 7fb2f57fa640 1 -- 192.168.123.103:0/2802469318 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb2e404a630 con 0x7fb2f81089d0 2026-03-09T16:21:39.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.826+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/2802469318 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb2f8104110 con 0x7fb2f81089d0 2026-03-09T16:21:39.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.827+0000 7fb2f57fa640 1 -- 192.168.123.103:0/2802469318 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb2e4038730 con 0x7fb2f81089d0 2026-03-09T16:21:39.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.827+0000 7fb2f57fa640 1 --2- 192.168.123.103:0/2802469318 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fb2d00778e0 0x7fb2d0079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:39.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.827+0000 7fb2f57fa640 1 -- 192.168.123.103:0/2802469318 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fb2e40be5d0 con 0x7fb2f81089d0 2026-03-09T16:21:39.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.828+0000 7fb2f7fff640 1 --2- 192.168.123.103:0/2802469318 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fb2d00778e0 0x7fb2d0079da0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:39.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.828+0000 7fb2f7fff640 1 --2- 192.168.123.103:0/2802469318 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fb2d00778e0 0x7fb2d0079da0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7fb2dc0059c0 tx=0x7fb2dc00a380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:39.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.830+0000 7fb2f57fa640 1 -- 192.168.123.103:0/2802469318 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb2e4086cb0 con 0x7fb2f81089d0 2026-03-09T16:21:39.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.951+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/2802469318 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 26, "format": "json"} v 0) v1 -- 0x7fb2f819bcf0 con 0x7fb2f81089d0 2026-03-09T16:21:39.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.952+0000 7fb2f57fa640 1 -- 192.168.123.103:0/2802469318 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 26, "format": "json"}]=0 dumped fsmap epoch 26 v32) v1 ==== 107+0+4322 (secure 0 0 0) 0x7fb2e4086400 con 0x7fb2f81089d0 2026-03-09T16:21:39.953 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:39.953 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":26,"btime":"2026-03-09T16:19:51:825476+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34272,"name":"cephfs.vm03.kntrco","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2230073446","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2230073446},{"type":"v1","addr":"192.168.123.103:6829","nonce":2230073446}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23},{"gid":34276,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/3112850580","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":3112850580},{"type":"v1","addr":"192.168.123.105:6825","nonce":3112850580}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file 
layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":26,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:51.825475+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":80,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:39.953 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 26 2026-03-09T16:21:39.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.955+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/2802469318 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fb2d00778e0 msgr2=0x7fb2d0079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:39.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.955+0000 7fb2fe2bd640 1 --2- 192.168.123.103:0/2802469318 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fb2d00778e0 0x7fb2d0079da0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7fb2dc0059c0 tx=0x7fb2dc00a380 comp rx=0 tx=0).stop 2026-03-09T16:21:39.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.955+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/2802469318 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2f81089d0 msgr2=0x7fb2f81a0b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:39.955 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.955+0000 7fb2fe2bd640 1 --2- 192.168.123.103:0/2802469318 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2f81089d0 0x7fb2f81a0b90 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fb2e4002410 tx=0x7fb2e4004290 comp rx=0 tx=0).stop 2026-03-09T16:21:39.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.955+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/2802469318 shutdown_connections 2026-03-09T16:21:39.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.955+0000 7fb2fe2bd640 1 --2- 192.168.123.103:0/2802469318 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fb2d00778e0 0x7fb2d0079da0 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:39.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.955+0000 7fb2fe2bd640 1 --2- 192.168.123.103:0/2802469318 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2f81089d0 0x7fb2f81a0b90 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:39.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.955+0000 7fb2fe2bd640 1 --2- 192.168.123.103:0/2802469318 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb2f81029d0 0x7fb2f81a0650 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:39.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.955+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/2802469318 >> 192.168.123.103:0/2802469318 conn(0x7fb2f80fe710 msgr2=0x7fb2f810c990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:39.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.955+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/2802469318 shutdown_connections 2026-03-09T16:21:39.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:39.955+0000 7fb2fe2bd640 1 -- 192.168.123.103:0/2802469318 wait complete. 2026-03-09T16:21:40.018 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 27 2026-03-09T16:21:40.163 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:40.223 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:40 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/3794197570' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-09T16:21:40.223 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:40 vm03.local ceph-mon[133973]: from='client.? 
192.168.123.103:0/2802469318' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-09T16:21:40.223 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:40 vm03.local ceph-mon[133973]: pgmap v180: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:40.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.423+0000 7fc80c321640 1 -- 192.168.123.103:0/840908181 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc8041029d0 msgr2=0x7fc804102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:40.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.423+0000 7fc80c321640 1 --2- 192.168.123.103:0/840908181 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc8041029d0 0x7fc804102e30 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7fc7f40099b0 tx=0x7fc7f402f220 comp rx=0 tx=0).stop 2026-03-09T16:21:40.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.423+0000 7fc80c321640 1 -- 192.168.123.103:0/840908181 shutdown_connections 2026-03-09T16:21:40.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.423+0000 7fc80c321640 1 --2- 192.168.123.103:0/840908181 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc8041029d0 0x7fc804102e30 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:40.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.423+0000 7fc80c321640 1 --2- 192.168.123.103:0/840908181 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc8041089d0 0x7fc804108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:40.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.423+0000 7fc80c321640 1 -- 192.168.123.103:0/840908181 >> 192.168.123.103:0/840908181 conn(0x7fc8040fe710 msgr2=0x7fc804100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:40.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.424+0000 7fc80c321640 1 -- 192.168.123.103:0/840908181 shutdown_connections 2026-03-09T16:21:40.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.424+0000 7fc80c321640 1 -- 192.168.123.103:0/840908181 wait complete. 
2026-03-09T16:21:40.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.424+0000 7fc80c321640 1 Processor -- start 2026-03-09T16:21:40.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.424+0000 7fc80c321640 1 -- start start 2026-03-09T16:21:40.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.424+0000 7fc80c321640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc8041029d0 0x7fc8041a06e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:40.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.425+0000 7fc80a096640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc8041029d0 0x7fc8041a06e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:40.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.425+0000 7fc80a096640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc8041029d0 0x7fc8041a06e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36726/0 (socket says 192.168.123.103:36726) 2026-03-09T16:21:40.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.425+0000 7fc80c321640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc8041089d0 0x7fc8041a0c20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:40.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.425+0000 7fc80c321640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc80419a7d0 con 0x7fc8041029d0 2026-03-09T16:21:40.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.425+0000 7fc80c321640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc80419a940 con 0x7fc8041089d0 2026-03-09T16:21:40.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.425+0000 7fc80a096640 1 -- 192.168.123.103:0/3099093551 learned_addr learned my addr 192.168.123.103:0/3099093551 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:40.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.425+0000 7fc809895640 1 --2- 192.168.123.103:0/3099093551 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc8041089d0 0x7fc8041a0c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:40.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.425+0000 7fc80a096640 1 -- 192.168.123.103:0/3099093551 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc8041089d0 msgr2=0x7fc8041a0c20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:40.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.425+0000 7fc80a096640 1 --2- 192.168.123.103:0/3099093551 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc8041089d0 0x7fc8041a0c20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:40.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.425+0000 7fc80a096640 1 -- 192.168.123.103:0/3099093551 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fc7f4009660 con 0x7fc8041029d0 2026-03-09T16:21:40.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.425+0000 7fc809895640 1 --2- 192.168.123.103:0/3099093551 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc8041089d0 0x7fc8041a0c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T16:21:40.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.425+0000 7fc80a096640 1 --2- 192.168.123.103:0/3099093551 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc8041029d0 0x7fc8041a06e0 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7fc7ec00b790 tx=0x7fc7ec00bc60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:40.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.426+0000 7fc7fb7fe640 1 -- 192.168.123.103:0/3099093551 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc7ec004070 con 0x7fc8041029d0 2026-03-09T16:21:40.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.426+0000 7fc80c321640 1 -- 192.168.123.103:0/3099093551 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc80419ac20 con 0x7fc8041029d0 2026-03-09T16:21:40.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.426+0000 7fc80c321640 1 -- 192.168.123.103:0/3099093551 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc80419b170 con 0x7fc8041029d0 2026-03-09T16:21:40.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.426+0000 7fc7fb7fe640 1 -- 192.168.123.103:0/3099093551 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc7ec0026e0 con 0x7fc8041029d0 2026-03-09T16:21:40.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.426+0000 7fc7fb7fe640 1 -- 192.168.123.103:0/3099093551 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc7ec00cb30 con 0x7fc8041029d0 2026-03-09T16:21:40.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.429+0000 7fc7fb7fe640 1 -- 192.168.123.103:0/3099093551 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc7ec00cc90 con 0x7fc8041029d0 2026-03-09T16:21:40.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.429+0000 7fc7fb7fe640 1 --2- 192.168.123.103:0/3099093551 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc7dc0778e0 0x7fc7dc079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:40.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.430+0000 7fc7fb7fe640 1 -- 192.168.123.103:0/3099093551 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fc7ec09a0b0 con 0x7fc8041029d0 2026-03-09T16:21:40.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.432+0000 7fc80c321640 1 -- 192.168.123.103:0/3099093551 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc804104110 con 0x7fc8041029d0 2026-03-09T16:21:40.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.438+0000 7fc809895640 1 --2- 192.168.123.103:0/3099093551 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc7dc0778e0 
0x7fc7dc079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:40.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.441+0000 7fc809895640 1 --2- 192.168.123.103:0/3099093551 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc7dc0778e0 0x7fc7dc079da0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7fc80419bc40 tx=0x7fc7f403a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:40.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.441+0000 7fc7fb7fe640 1 -- 192.168.123.103:0/3099093551 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc7ec0627f0 con 0x7fc8041029d0 2026-03-09T16:21:40.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:40 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/3794197570' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-09T16:21:40.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:40 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/2802469318' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-09T16:21:40.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:40 vm05.local ceph-mon[108543]: pgmap v180: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:40.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.557+0000 7fc80c321640 1 -- 192.168.123.103:0/3099093551 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 27, "format": "json"} v 0) v1 -- 0x7fc804102e70 con 0x7fc8041029d0 2026-03-09T16:21:40.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.558+0000 7fc7fb7fe640 1 -- 192.168.123.103:0/3099093551 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 27, "format": "json"}]=0 dumped fsmap epoch 27 v32) v1 ==== 107+0+4401 (secure 0 0 0) 0x7fc7ec061f40 con 0x7fc8041029d0 2026-03-09T16:21:40.559 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:40.559 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":27,"btime":"2026-03-09T16:19:51:831005+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34276,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/3112850580","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":3112850580},{"type":"v1","addr":"192.168.123.105:6825","nonce":3112850580}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored 
in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":27,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:51.831000+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":80,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34272},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34272":{"gid":34272,"name":"cephfs.vm03.kntrco","rank":0,"incarnation":27,"state":"up:replay","state_seq":1,"addr":"192.168.123.103:6829/2230073446","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2230073446},{"type":"v1","addr":"192.168.123.103:6829","nonce":2230073446}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:40.559 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 27 2026-03-09T16:21:40.560 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.560+0000 7fc80c321640 1 -- 192.168.123.103:0/3099093551 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] 
conn(0x7fc7dc0778e0 msgr2=0x7fc7dc079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:40.560 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.560+0000 7fc80c321640 1 --2- 192.168.123.103:0/3099093551 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc7dc0778e0 0x7fc7dc079da0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7fc80419bc40 tx=0x7fc7f403a040 comp rx=0 tx=0).stop 2026-03-09T16:21:40.560 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.560+0000 7fc80c321640 1 -- 192.168.123.103:0/3099093551 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc8041029d0 msgr2=0x7fc8041a06e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:40.560 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.560+0000 7fc80c321640 1 --2- 192.168.123.103:0/3099093551 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc8041029d0 0x7fc8041a06e0 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7fc7ec00b790 tx=0x7fc7ec00bc60 comp rx=0 tx=0).stop 2026-03-09T16:21:40.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.560+0000 7fc80c321640 1 -- 192.168.123.103:0/3099093551 shutdown_connections 2026-03-09T16:21:40.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.560+0000 7fc80c321640 1 --2- 192.168.123.103:0/3099093551 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fc7dc0778e0 0x7fc7dc079da0 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:40.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.560+0000 7fc80c321640 1 --2- 192.168.123.103:0/3099093551 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc8041089d0 0x7fc8041a0c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:40.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.560+0000 7fc80c321640 1 --2- 192.168.123.103:0/3099093551 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc8041029d0 0x7fc8041a06e0 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:40.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.560+0000 7fc80c321640 1 -- 192.168.123.103:0/3099093551 >> 192.168.123.103:0/3099093551 conn(0x7fc8040fe710 msgr2=0x7fc80410c9e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:40.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.560+0000 7fc80c321640 1 -- 192.168.123.103:0/3099093551 shutdown_connections 2026-03-09T16:21:40.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.560+0000 7fc80c321640 1 -- 192.168.123.103:0/3099093551 wait complete. 
2026-03-09T16:21:40.620 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 28 2026-03-09T16:21:40.759 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.999+0000 7fe63ffff640 1 -- 192.168.123.103:0/1913522176 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6400ff190 msgr2=0x7fe64010c7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:40.999+0000 7fe63ffff640 1 --2- 192.168.123.103:0/1913522176 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6400ff190 0x7fe64010c7f0 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7fe628009a30 tx=0x7fe62802f380 comp rx=0 tx=0).stop 2026-03-09T16:21:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.000+0000 7fe63ffff640 1 -- 192.168.123.103:0/1913522176 shutdown_connections 2026-03-09T16:21:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.000+0000 7fe63ffff640 1 --2- 192.168.123.103:0/1913522176 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6400ff190 0x7fe64010c7f0 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.000+0000 7fe63ffff640 1 --2- 192.168.123.103:0/1913522176 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe6400fe870 0x7fe6400fec50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.000+0000 7fe63ffff640 1 -- 192.168.123.103:0/1913522176 >> 192.168.123.103:0/1913522176 conn(0x7fe6400fa4a0 msgr2=0x7fe6400fc8c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.000+0000 7fe63ffff640 1 -- 192.168.123.103:0/1913522176 shutdown_connections 2026-03-09T16:21:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.000+0000 7fe63ffff640 1 -- 192.168.123.103:0/1913522176 wait complete. 
2026-03-09T16:21:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.001+0000 7fe63ffff640 1 Processor -- start 2026-03-09T16:21:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.001+0000 7fe63ffff640 1 -- start start 2026-03-09T16:21:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.001+0000 7fe63ffff640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe6400fe870 0x7fe64019a730 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:41.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.001+0000 7fe63ffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6400ff190 0x7fe64019ac70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:41.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.001+0000 7fe63ffff640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe64019b350 con 0x7fe6400ff190 2026-03-09T16:21:41.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.001+0000 7fe63ffff640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe64019dfc0 con 0x7fe6400fe870 2026-03-09T16:21:41.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.001+0000 7fe63effd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe6400fe870 0x7fe64019a730 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:41.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.001+0000 7fe63effd640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe6400fe870 0x7fe64019a730 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.103:55444/0 (socket says 192.168.123.103:55444) 2026-03-09T16:21:41.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.001+0000 7fe63effd640 1 -- 192.168.123.103:0/630800010 learned_addr learned my addr 192.168.123.103:0/630800010 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:41.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.002+0000 7fe63e7fc640 1 --2- 192.168.123.103:0/630800010 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6400ff190 0x7fe64019ac70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:41.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.002+0000 7fe63effd640 1 -- 192.168.123.103:0/630800010 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6400ff190 msgr2=0x7fe64019ac70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:41.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.002+0000 7fe63effd640 1 --2- 192.168.123.103:0/630800010 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6400ff190 0x7fe64019ac70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:41.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.002+0000 7fe63effd640 1 -- 192.168.123.103:0/630800010 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe628009660 con 
0x7fe6400fe870 2026-03-09T16:21:41.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.002+0000 7fe63e7fc640 1 --2- 192.168.123.103:0/630800010 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6400ff190 0x7fe64019ac70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T16:21:41.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.002+0000 7fe63effd640 1 --2- 192.168.123.103:0/630800010 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe6400fe870 0x7fe64019a730 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fe63400e9b0 tx=0x7fe63400ee80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:41.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.002+0000 7fe61ffff640 1 -- 192.168.123.103:0/630800010 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe63400cd90 con 0x7fe6400fe870 2026-03-09T16:21:41.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.003+0000 7fe61ffff640 1 -- 192.168.123.103:0/630800010 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe634004590 con 0x7fe6400fe870 2026-03-09T16:21:41.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.003+0000 7fe63ffff640 1 -- 192.168.123.103:0/630800010 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe64019e2a0 con 0x7fe6400fe870 2026-03-09T16:21:41.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.003+0000 7fe61ffff640 1 -- 192.168.123.103:0/630800010 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe634010640 con 0x7fe6400fe870 2026-03-09T16:21:41.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.003+0000 7fe63ffff640 1 -- 192.168.123.103:0/630800010 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe64019e7c0 con 0x7fe6400fe870 2026-03-09T16:21:41.005 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.004+0000 7fe61ffff640 1 -- 192.168.123.103:0/630800010 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe6340040d0 con 0x7fe6400fe870 2026-03-09T16:21:41.005 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.004+0000 7fe63ffff640 1 -- 192.168.123.103:0/630800010 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe604005350 con 0x7fe6400fe870 2026-03-09T16:21:41.005 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.004+0000 7fe61ffff640 1 --2- 192.168.123.103:0/630800010 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe614077890 0x7fe614079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:41.005 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.004+0000 7fe61ffff640 1 -- 192.168.123.103:0/630800010 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fe634014070 con 0x7fe6400fe870 2026-03-09T16:21:41.005 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.005+0000 7fe63e7fc640 1 --2- 192.168.123.103:0/630800010 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe614077890 0x7fe614079d50 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:41.006 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.005+0000 7fe63e7fc640 1 --2- 192.168.123.103:0/630800010 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe614077890 0x7fe614079d50 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fe64019bde0 tx=0x7fe6280047c0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:41.008 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.007+0000 7fe61ffff640 1 -- 192.168.123.103:0/630800010 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe634062a80 con 0x7fe6400fe870 2026-03-09T16:21:41.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.120+0000 7fe63ffff640 1 -- 192.168.123.103:0/630800010 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 28, "format": "json"} v 0) v1 -- 0x7fe6040051c0 con 0x7fe6400fe870 2026-03-09T16:21:41.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.122+0000 7fe61ffff640 1 -- 192.168.123.103:0/630800010 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 28, "format": "json"}]=0 dumped fsmap epoch 28 v32) v1 ==== 107+0+4404 (secure 0 0 0) 0x7fe6340621d0 con 0x7fe6400fe870 2026-03-09T16:21:41.124 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:41.124 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":28,"btime":"2026-03-09T16:19:57:168038+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34276,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/3112850580","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":3112850580},{"type":"v1","addr":"192.168.123.105:6825","nonce":3112850580}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on 
dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":28,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:57.008569+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":80,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34272},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34272":{"gid":34272,"name":"cephfs.vm03.kntrco","rank":0,"incarnation":27,"state":"up:reconnect","state_seq":5,"addr":"192.168.123.103:6829/2230073446","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2230073446},{"type":"v1","addr":"192.168.123.103:6829","nonce":2230073446}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:41.124 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 28 2026-03-09T16:21:41.126 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.125+0000 7fe63ffff640 1 -- 192.168.123.103:0/630800010 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe614077890 msgr2=0x7fe614079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:41.126 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.125+0000 7fe63ffff640 1 --2- 192.168.123.103:0/630800010 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe614077890 0x7fe614079d50 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fe64019bde0 tx=0x7fe6280047c0 comp rx=0 tx=0).stop 2026-03-09T16:21:41.126 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.126+0000 7fe63ffff640 1 -- 192.168.123.103:0/630800010 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe6400fe870 msgr2=0x7fe64019a730 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:41.126 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.126+0000 7fe63ffff640 1 --2- 192.168.123.103:0/630800010 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe6400fe870 0x7fe64019a730 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fe63400e9b0 tx=0x7fe63400ee80 comp rx=0 tx=0).stop 2026-03-09T16:21:41.126 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.126+0000 7fe63ffff640 1 -- 192.168.123.103:0/630800010 shutdown_connections 2026-03-09T16:21:41.127 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.126+0000 7fe63ffff640 1 --2- 192.168.123.103:0/630800010 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7fe614077890 0x7fe614079d50 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:41.127 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.126+0000 7fe63ffff640 1 --2- 192.168.123.103:0/630800010 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6400ff190 0x7fe64019ac70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:41.127 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.126+0000 7fe63ffff640 1 --2- 192.168.123.103:0/630800010 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe6400fe870 0x7fe64019a730 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:41.127 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.126+0000 7fe63ffff640 1 -- 192.168.123.103:0/630800010 >> 192.168.123.103:0/630800010 conn(0x7fe6400fa4a0 msgr2=0x7fe6400fbb80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:41.127 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.126+0000 7fe63ffff640 1 -- 192.168.123.103:0/630800010 shutdown_connections 2026-03-09T16:21:41.127 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.126+0000 7fe63ffff640 1 -- 192.168.123.103:0/630800010 wait complete. 2026-03-09T16:21:41.227 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 29 2026-03-09T16:21:41.380 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:41.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:41 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/3099093551' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-09T16:21:41.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:41 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/630800010' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-09T16:21:41.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:41 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/3099093551' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-09T16:21:41.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:41 vm05.local ceph-mon[108543]: from='client.? 
192.168.123.103:0/630800010' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-09T16:21:41.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.629+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2767061453 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb01029d0 msgr2=0x7f3bb0102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:41.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.629+0000 7f3bb5c00640 1 --2- 192.168.123.103:0/2767061453 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb01029d0 0x7f3bb0102e30 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7f3b9c009a00 tx=0x7f3b9c02f280 comp rx=0 tx=0).stop 2026-03-09T16:21:41.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.629+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2767061453 shutdown_connections 2026-03-09T16:21:41.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.629+0000 7f3bb5c00640 1 --2- 192.168.123.103:0/2767061453 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb01029d0 0x7f3bb0102e30 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:41.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.629+0000 7f3bb5c00640 1 --2- 192.168.123.103:0/2767061453 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bb01089d0 0x7f3bb0108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:41.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.629+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2767061453 >> 192.168.123.103:0/2767061453 conn(0x7f3bb00fe710 msgr2=0x7f3bb0100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:41.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.629+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2767061453 shutdown_connections 2026-03-09T16:21:41.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.630+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2767061453 wait complete. 
2026-03-09T16:21:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.630+0000 7f3bb5c00640 1 Processor -- start 2026-03-09T16:21:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.630+0000 7f3bb5c00640 1 -- start start 2026-03-09T16:21:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.630+0000 7f3bb5c00640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bb01029d0 0x7f3bb0075700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.630+0000 7f3bb5c00640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb01089d0 0x7f3bb0075c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.630+0000 7f3bb5c00640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3bb0079870 con 0x7f3bb01089d0 2026-03-09T16:21:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.630+0000 7f3bb5c00640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3bb00799e0 con 0x7f3bb01029d0 2026-03-09T16:21:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.631+0000 7f3baeffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb01089d0 0x7f3bb0075c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.631+0000 7f3baeffd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb01089d0 0x7f3bb0075c40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36766/0 (socket says 192.168.123.103:36766) 2026-03-09T16:21:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.631+0000 7f3baeffd640 1 -- 192.168.123.103:0/2515709609 learned_addr learned my addr 192.168.123.103:0/2515709609 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.631+0000 7f3baf7fe640 1 --2- 192.168.123.103:0/2515709609 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bb01029d0 0x7f3bb0075700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:41.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.631+0000 7f3baeffd640 1 -- 192.168.123.103:0/2515709609 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bb01029d0 msgr2=0x7f3bb0075700 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:41.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.631+0000 7f3baeffd640 1 --2- 192.168.123.103:0/2515709609 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bb01029d0 0x7f3bb0075700 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:41.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.631+0000 7f3baeffd640 1 -- 192.168.123.103:0/2515709609 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f3b9c009660 con 0x7f3bb01089d0 2026-03-09T16:21:41.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.631+0000 7f3baf7fe640 1 --2- 192.168.123.103:0/2515709609 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bb01029d0 0x7f3bb0075700 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T16:21:41.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.631+0000 7f3baeffd640 1 --2- 192.168.123.103:0/2515709609 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb01089d0 0x7f3bb0075c40 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f3b9c02f790 tx=0x7f3b9c004300 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:41.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.632+0000 7f3bacff9640 1 -- 192.168.123.103:0/2515709609 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3b9c02fae0 con 0x7f3bb01089d0 2026-03-09T16:21:41.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.632+0000 7f3bacff9640 1 -- 192.168.123.103:0/2515709609 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3b9c02fc40 con 0x7f3bb01089d0 2026-03-09T16:21:41.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.632+0000 7f3bacff9640 1 -- 192.168.123.103:0/2515709609 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3b9c0418c0 con 0x7f3bb01089d0 2026-03-09T16:21:41.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.632+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2515709609 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3bb0076240 con 0x7f3bb01089d0 2026-03-09T16:21:41.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.632+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2515709609 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3bb0076560 con 0x7f3bb01089d0 2026-03-09T16:21:41.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.632+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2515709609 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3b7c005350 con 0x7f3bb01089d0 2026-03-09T16:21:41.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.636+0000 7f3bacff9640 1 -- 192.168.123.103:0/2515709609 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3b9c03f070 con 0x7f3bb01089d0 2026-03-09T16:21:41.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.636+0000 7f3bacff9640 1 --2- 192.168.123.103:0/2515709609 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f3b880779b0 0x7f3b88079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:41.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.636+0000 7f3bacff9640 1 -- 192.168.123.103:0/2515709609 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f3b9c0bedf0 con 0x7f3bb01089d0 2026-03-09T16:21:41.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.637+0000 7f3baf7fe640 1 --2- 192.168.123.103:0/2515709609 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f3b880779b0 
0x7f3b88079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:41.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.637+0000 7f3bacff9640 1 -- 192.168.123.103:0/2515709609 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3b9c0bf270 con 0x7f3bb01089d0 2026-03-09T16:21:41.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.637+0000 7f3baf7fe640 1 --2- 192.168.123.103:0/2515709609 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f3b880779b0 0x7f3b88079e70 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f3ba0002730 tx=0x7f3ba0009290 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.752+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2515709609 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 29, "format": "json"} v 0) v1 -- 0x7f3b7c0051c0 con 0x7f3bb01089d0 2026-03-09T16:21:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.756+0000 7f3bacff9640 1 -- 192.168.123.103:0/2515709609 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 29, "format": "json"}]=0 dumped fsmap epoch 29 v32) v1 ==== 107+0+4401 (secure 0 0 0) 0x7f3b9c0874a0 con 0x7f3bb01089d0 2026-03-09T16:21:41.756 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:41.756 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":29,"btime":"2026-03-09T16:19:58:295909+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34276,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/3112850580","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":3112850580},{"type":"v1","addr":"192.168.123.105:6825","nonce":3112850580}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable 
ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":29,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:57.300712+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":80,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34272},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34272":{"gid":34272,"name":"cephfs.vm03.kntrco","rank":0,"incarnation":27,"state":"up:rejoin","state_seq":6,"addr":"192.168.123.103:6829/2230073446","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2230073446},{"type":"v1","addr":"192.168.123.103:6829","nonce":2230073446}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T16:21:41.757 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 29 2026-03-09T16:21:41.758 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.758+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2515709609 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f3b880779b0 msgr2=0x7f3b88079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:41.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.758+0000 7f3bb5c00640 1 --2- 192.168.123.103:0/2515709609 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f3b880779b0 0x7f3b88079e70 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f3ba0002730 tx=0x7f3ba0009290 comp rx=0 tx=0).stop 2026-03-09T16:21:41.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.758+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2515709609 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb01089d0 
msgr2=0x7f3bb0075c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:41.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.758+0000 7f3bb5c00640 1 --2- 192.168.123.103:0/2515709609 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb01089d0 0x7f3bb0075c40 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f3b9c02f790 tx=0x7f3b9c004300 comp rx=0 tx=0).stop 2026-03-09T16:21:41.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.758+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2515709609 shutdown_connections 2026-03-09T16:21:41.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.758+0000 7f3bb5c00640 1 --2- 192.168.123.103:0/2515709609 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f3b880779b0 0x7f3b88079e70 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:41.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.759+0000 7f3bb5c00640 1 --2- 192.168.123.103:0/2515709609 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3bb01089d0 0x7f3bb0075c40 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:41.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.759+0000 7f3bb5c00640 1 --2- 192.168.123.103:0/2515709609 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bb01029d0 0x7f3bb0075700 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:41.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.759+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2515709609 >> 192.168.123.103:0/2515709609 conn(0x7f3bb00fe710 msgr2=0x7f3bb00feaf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:41.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.759+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2515709609 shutdown_connections 2026-03-09T16:21:41.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:41.759+0000 7f3bb5c00640 1 -- 192.168.123.103:0/2515709609 wait complete. 
2026-03-09T16:21:41.817 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 30 2026-03-09T16:21:41.962 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:42.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.192+0000 7eff00f61640 1 -- 192.168.123.103:0/953133376 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efefc1029d0 msgr2=0x7efefc102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:42.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.192+0000 7eff00f61640 1 --2- 192.168.123.103:0/953133376 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efefc1029d0 0x7efefc102e30 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7efef00099b0 tx=0x7efef002f220 comp rx=0 tx=0).stop 2026-03-09T16:21:42.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.192+0000 7eff00f61640 1 -- 192.168.123.103:0/953133376 shutdown_connections 2026-03-09T16:21:42.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.192+0000 7eff00f61640 1 --2- 192.168.123.103:0/953133376 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efefc1029d0 0x7efefc102e30 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:42.193 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.192+0000 7eff00f61640 1 --2- 192.168.123.103:0/953133376 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efefc1089d0 0x7efefc108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:42.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.192+0000 7eff00f61640 1 -- 192.168.123.103:0/953133376 >> 192.168.123.103:0/953133376 conn(0x7efefc0fe710 msgr2=0x7efefc100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:42.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.193+0000 7eff00f61640 1 -- 192.168.123.103:0/953133376 shutdown_connections 2026-03-09T16:21:42.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.193+0000 7eff00f61640 1 -- 192.168.123.103:0/953133376 wait complete. 
2026-03-09T16:21:42.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.193+0000 7eff00f61640 1 Processor -- start 2026-03-09T16:21:42.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.193+0000 7eff00f61640 1 -- start start 2026-03-09T16:21:42.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.194+0000 7eff00f61640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efefc1029d0 0x7efefc19a9e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:42.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.194+0000 7efefa575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efefc1029d0 0x7efefc19a9e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:42.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.194+0000 7efefa575640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efefc1029d0 0x7efefc19a9e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36780/0 (socket says 192.168.123.103:36780) 2026-03-09T16:21:42.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.194+0000 7efefa575640 1 -- 192.168.123.103:0/3049965658 learned_addr learned my addr 192.168.123.103:0/3049965658 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:42.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.194+0000 7eff00f61640 1 --2- 192.168.123.103:0/3049965658 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efefc1089d0 0x7efefc19af20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:42.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.194+0000 7eff00f61640 1 -- 192.168.123.103:0/3049965658 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efefc19b600 con 0x7efefc1029d0 2026-03-09T16:21:42.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.194+0000 7eff00f61640 1 -- 192.168.123.103:0/3049965658 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efefc19f350 con 0x7efefc1089d0 2026-03-09T16:21:42.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.194+0000 7efef9d74640 1 --2- 192.168.123.103:0/3049965658 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efefc1089d0 0x7efefc19af20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:42.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.195+0000 7efefa575640 1 -- 192.168.123.103:0/3049965658 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efefc1089d0 msgr2=0x7efefc19af20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:42.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.195+0000 7efefa575640 1 --2- 192.168.123.103:0/3049965658 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efefc1089d0 0x7efefc19af20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:42.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.195+0000 7efefa575640 1 -- 192.168.123.103:0/3049965658 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efef0009660 con 0x7efefc1029d0 2026-03-09T16:21:42.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.195+0000 7efef9d74640 1 --2- 192.168.123.103:0/3049965658 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efefc1089d0 0x7efefc19af20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T16:21:42.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.195+0000 7efefa575640 1 --2- 192.168.123.103:0/3049965658 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efefc1029d0 0x7efefc19a9e0 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7efee800b790 tx=0x7efee800bc60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:42.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.196+0000 7efee77fe640 1 -- 192.168.123.103:0/3049965658 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efee8004070 con 0x7efefc1029d0 2026-03-09T16:21:42.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.196+0000 7efee77fe640 1 -- 192.168.123.103:0/3049965658 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7efee80026e0 con 0x7efefc1029d0 2026-03-09T16:21:42.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.196+0000 7efee77fe640 1 -- 192.168.123.103:0/3049965658 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efee800ca10 con 0x7efefc1029d0 2026-03-09T16:21:42.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.196+0000 7eff00f61640 1 -- 192.168.123.103:0/3049965658 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efefc19f630 con 0x7efefc1029d0 2026-03-09T16:21:42.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.196+0000 7eff00f61640 1 -- 192.168.123.103:0/3049965658 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efefc19fb00 con 0x7efefc1029d0 2026-03-09T16:21:42.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.198+0000 7efee77fe640 1 -- 192.168.123.103:0/3049965658 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7efee800cb70 con 0x7efefc1029d0 2026-03-09T16:21:42.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.198+0000 7efee77fe640 1 --2- 192.168.123.103:0/3049965658 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efecc077680 0x7efecc079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:42.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.198+0000 7efee77fe640 1 -- 192.168.123.103:0/3049965658 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7efee801a030 con 0x7efefc1029d0 2026-03-09T16:21:42.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.198+0000 7efef9d74640 1 --2- 192.168.123.103:0/3049965658 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efecc077680 0x7efecc079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:42.199 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.198+0000 7eff00f61640 1 -- 192.168.123.103:0/3049965658 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efec8005350 con 0x7efefc1029d0 2026-03-09T16:21:42.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.199+0000 7efef9d74640 1 --2- 192.168.123.103:0/3049965658 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efecc077680 0x7efecc079b40 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7efefc19bf90 tx=0x7efef003a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:42.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.201+0000 7efee77fe640 1 -- 192.168.123.103:0/3049965658 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7efee8060e10 con 0x7efefc1029d0 2026-03-09T16:21:42.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.318+0000 7eff00f61640 1 -- 192.168.123.103:0/3049965658 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 30, "format": "json"} v 0) v1 -- 0x7efec8005600 con 0x7efefc1029d0 2026-03-09T16:21:42.319 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:42 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/2515709609' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-09T16:21:42.319 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:42 vm03.local ceph-mon[133973]: pgmap v181: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:42.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.322+0000 7efee77fe640 1 -- 192.168.123.103:0/3049965658 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 30, "format": "json"}]=0 dumped fsmap epoch 30 v32) v1 ==== 107+0+5261 (secure 0 0 0) 0x7efee8083070 con 0x7efefc1029d0 2026-03-09T16:21:42.322 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:42.322 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":30,"btime":"2026-03-09T16:19:59:303143+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34276,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/3112850580","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":3112850580},{"type":"v1","addr":"192.168.123.105:6825","nonce":3112850580}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15},{"gid":44255,"name":"cephfs.vm05.sqhria","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/2319193942","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2319193942},{"type":"v1","addr":"192.168.123.105:6827","nonce":2319193942}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":30,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:19:59.303142+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":80,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34272},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34272":{"gid":34272,"name":"cephfs.vm03.kntrco","rank":0,"incarnation":27,"state":"up:active","state_seq":7,"addr":"192.168.123.103:6829/2230073446","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2230073446},{"type":"v1","addr":"192.168.123.103:6829","nonce":2230073446}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34272,"qdb_cluster":[34272]},"id":1}]} 2026-03-09T16:21:42.322 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 30 2026-03-09T16:21:42.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.324+0000 7eff00f61640 1 -- 192.168.123.103:0/3049965658 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efecc077680 msgr2=0x7efecc079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:42.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.324+0000 7eff00f61640 1 --2- 192.168.123.103:0/3049965658 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efecc077680 0x7efecc079b40 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7efefc19bf90 tx=0x7efef003a040 comp rx=0 tx=0).stop 2026-03-09T16:21:42.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.324+0000 7eff00f61640 1 -- 192.168.123.103:0/3049965658 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efefc1029d0 msgr2=0x7efefc19a9e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:42.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.324+0000 7eff00f61640 1 --2- 192.168.123.103:0/3049965658 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efefc1029d0 0x7efefc19a9e0 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7efee800b790 tx=0x7efee800bc60 comp rx=0 tx=0).stop 2026-03-09T16:21:42.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.324+0000 7eff00f61640 1 -- 192.168.123.103:0/3049965658 shutdown_connections 2026-03-09T16:21:42.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.324+0000 7eff00f61640 1 --2- 192.168.123.103:0/3049965658 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7efecc077680 0x7efecc079b40 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:42.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.324+0000 7eff00f61640 1 --2- 192.168.123.103:0/3049965658 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efefc1089d0 0x7efefc19af20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:42.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.324+0000 7eff00f61640 1 --2- 192.168.123.103:0/3049965658 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efefc1029d0 0x7efefc19a9e0 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:42.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.324+0000 7eff00f61640 1 -- 192.168.123.103:0/3049965658 >> 192.168.123.103:0/3049965658 conn(0x7efefc0fe710 msgr2=0x7efefc10c990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:42.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.325+0000 7eff00f61640 1 -- 192.168.123.103:0/3049965658 shutdown_connections 2026-03-09T16:21:42.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.325+0000 7eff00f61640 1 -- 
192.168.123.103:0/3049965658 wait complete. 2026-03-09T16:21:42.369 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph fs dump --format=json 31 2026-03-09T16:21:42.506 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:42.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:42 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/2515709609' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-09T16:21:42.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:42 vm05.local ceph-mon[108543]: pgmap v181: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:42.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.720+0000 7f5d5529a640 1 -- 192.168.123.103:0/3689126763 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d501089d0 msgr2=0x7f5d50108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:42.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.720+0000 7f5d5529a640 1 --2- 192.168.123.103:0/3689126763 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d501089d0 0x7f5d50108db0 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f5d38009a00 tx=0x7f5d3802f280 comp rx=0 tx=0).stop 2026-03-09T16:21:42.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.721+0000 7f5d5529a640 1 -- 192.168.123.103:0/3689126763 shutdown_connections 2026-03-09T16:21:42.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.721+0000 7f5d5529a640 1 --2- 192.168.123.103:0/3689126763 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d501029d0 0x7f5d50102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:42.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.721+0000 7f5d5529a640 1 --2- 192.168.123.103:0/3689126763 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d501089d0 0x7f5d50108db0 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:42.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.721+0000 7f5d5529a640 1 -- 192.168.123.103:0/3689126763 >> 192.168.123.103:0/3689126763 conn(0x7f5d500fe710 msgr2=0x7f5d50100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:42.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.721+0000 7f5d5529a640 1 -- 192.168.123.103:0/3689126763 shutdown_connections 2026-03-09T16:21:42.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.722+0000 7f5d5529a640 1 -- 192.168.123.103:0/3689126763 wait complete. 
2026-03-09T16:21:42.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.722+0000 7f5d5529a640 1 Processor -- start 2026-03-09T16:21:42.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.722+0000 7f5d5529a640 1 -- start start 2026-03-09T16:21:42.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.723+0000 7f5d5529a640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d501029d0 0x7f5d501a0620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:42.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.723+0000 7f5d4effd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d501029d0 0x7f5d501a0620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:42.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.723+0000 7f5d5529a640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d501089d0 0x7f5d501a0b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:42.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.723+0000 7f5d5529a640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d501a1180 con 0x7f5d501029d0 2026-03-09T16:21:42.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.723+0000 7f5d4effd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d501029d0 0x7f5d501a0620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36796/0 (socket says 192.168.123.103:36796) 2026-03-09T16:21:42.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.723+0000 7f5d4effd640 1 -- 192.168.123.103:0/2904501050 learned_addr learned my addr 192.168.123.103:0/2904501050 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:42.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.723+0000 7f5d5529a640 1 -- 192.168.123.103:0/2904501050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d5019a710 con 0x7f5d501089d0 2026-03-09T16:21:42.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.723+0000 7f5d4e7fc640 1 --2- 192.168.123.103:0/2904501050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d501089d0 0x7f5d501a0b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:42.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.724+0000 7f5d4e7fc640 1 -- 192.168.123.103:0/2904501050 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d501029d0 msgr2=0x7f5d501a0620 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:42.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.724+0000 7f5d4e7fc640 1 --2- 192.168.123.103:0/2904501050 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d501029d0 0x7f5d501a0620 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:42.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.724+0000 7f5d4e7fc640 1 -- 192.168.123.103:0/2904501050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5d38009660 con 0x7f5d501089d0 2026-03-09T16:21:42.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.724+0000 7f5d4effd640 1 --2- 192.168.123.103:0/2904501050 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d501029d0 0x7f5d501a0620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T16:21:42.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.724+0000 7f5d4e7fc640 1 --2- 192.168.123.103:0/2904501050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d501089d0 0x7f5d501a0b60 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f5d4400d8d0 tx=0x7f5d4400dda0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:42.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.724+0000 7f5d2ffff640 1 -- 192.168.123.103:0/2904501050 <== mon.1 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d44004490 con 0x7f5d501089d0 2026-03-09T16:21:42.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.724+0000 7f5d2ffff640 1 -- 192.168.123.103:0/2904501050 <== mon.1 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5d44007e70 con 0x7f5d501089d0 2026-03-09T16:21:42.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.724+0000 7f5d2ffff640 1 -- 192.168.123.103:0/2904501050 <== mon.1 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d44005230 con 0x7f5d501089d0 2026-03-09T16:21:42.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.724+0000 7f5d5529a640 1 -- 192.168.123.103:0/2904501050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5d5019a9f0 con 0x7f5d501089d0 2026-03-09T16:21:42.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.724+0000 7f5d5529a640 1 -- 192.168.123.103:0/2904501050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5d5019af40 con 0x7f5d501089d0 2026-03-09T16:21:42.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.725+0000 7f5d5529a640 1 -- 192.168.123.103:0/2904501050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5d14005350 con 0x7f5d501089d0 2026-03-09T16:21:42.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.726+0000 7f5d2ffff640 1 -- 192.168.123.103:0/2904501050 <== mon.1 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5d44005390 con 0x7f5d501089d0 2026-03-09T16:21:42.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.726+0000 7f5d2ffff640 1 --2- 192.168.123.103:0/2904501050 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5d240779b0 0x7f5d24079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:42.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.726+0000 7f5d2ffff640 1 -- 192.168.123.103:0/2904501050 <== mon.1 v2:192.168.123.105:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f5d44099cc0 con 0x7f5d501089d0 2026-03-09T16:21:42.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.727+0000 7f5d4effd640 1 --2- 192.168.123.103:0/2904501050 >> 
[v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5d240779b0 0x7f5d24079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:42.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.727+0000 7f5d4effd640 1 --2- 192.168.123.103:0/2904501050 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5d240779b0 0x7f5d24079e70 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f5d38002410 tx=0x7f5d38005c50 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:42.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.728+0000 7f5d2ffff640 1 -- 192.168.123.103:0/2904501050 <== mon.1 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5d44062370 con 0x7f5d501089d0 2026-03-09T16:21:42.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.844+0000 7f5d5529a640 1 -- 192.168.123.103:0/2904501050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 31, "format": "json"} v 0) v1 -- 0x7f5d140051c0 con 0x7f5d501089d0 2026-03-09T16:21:42.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.845+0000 7f5d2ffff640 1 -- 192.168.123.103:0/2904501050 <== mon.1 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 31, "format": "json"}]=0 dumped fsmap epoch 31 v32) v1 ==== 107+0+5261 (secure 0 0 0) 0x7f5d44061ac0 con 0x7f5d501089d0 2026-03-09T16:21:42.847 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:21:42.847 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":31,"btime":"2026-03-09T16:20:02:424356+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34276,"name":"cephfs.vm05.jgzfvu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6825/3112850580","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":3112850580},{"type":"v1","addr":"192.168.123.105:6825","nonce":3112850580}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":25},{"gid":44223,"name":"cephfs.vm03.kygyjl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/892320051","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":892320051},{"type":"v1","addr":"192.168.123.103:6827","nonce":892320051}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":15},{"gid":44255,"name":"cephfs.vm05.sqhria","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/2319193942","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2319193942},{"type":"v1","addr":"192.168.123.105:6827","nonce":2319193942}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":31,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T16:12:12.560035+0000","modified":"2026-03-09T16:20:01.427018+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":80,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34272},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34272":{"gid":34272,"name":"cephfs.vm03.kntrco","rank":0,"incarnation":27,"state":"up:active","state_seq":7,"addr":"192.168.123.103:6829/2230073446","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2230073446},{"type":"v1","addr":"192.168.123.103:6829","nonce":2230073446}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds 
uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34272,"qdb_cluster":[34272]},"id":1}]} 2026-03-09T16:21:42.847 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 31 2026-03-09T16:21:42.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.849+0000 7f5d5529a640 1 -- 192.168.123.103:0/2904501050 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5d240779b0 msgr2=0x7f5d24079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:42.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.849+0000 7f5d5529a640 1 --2- 192.168.123.103:0/2904501050 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5d240779b0 0x7f5d24079e70 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f5d38002410 tx=0x7f5d38005c50 comp rx=0 tx=0).stop 2026-03-09T16:21:42.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.849+0000 7f5d5529a640 1 -- 192.168.123.103:0/2904501050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d501089d0 msgr2=0x7f5d501a0b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:42.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.849+0000 7f5d5529a640 1 --2- 192.168.123.103:0/2904501050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d501089d0 0x7f5d501a0b60 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f5d4400d8d0 tx=0x7f5d4400dda0 comp rx=0 tx=0).stop 2026-03-09T16:21:42.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.849+0000 7f5d5529a640 1 -- 192.168.123.103:0/2904501050 shutdown_connections 2026-03-09T16:21:42.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.849+0000 7f5d5529a640 1 --2- 192.168.123.103:0/2904501050 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5d240779b0 0x7f5d24079e70 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:42.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.850+0000 7f5d5529a640 1 --2- 192.168.123.103:0/2904501050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d501089d0 0x7f5d501a0b60 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:42.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.850+0000 7f5d5529a640 1 --2- 192.168.123.103:0/2904501050 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d501029d0 0x7f5d501a0620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:42.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.850+0000 7f5d5529a640 1 -- 192.168.123.103:0/2904501050 >> 192.168.123.103:0/2904501050 conn(0x7f5d500fe710 msgr2=0x7f5d50106550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:42.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.850+0000 7f5d5529a640 1 -- 192.168.123.103:0/2904501050 shutdown_connections 2026-03-09T16:21:42.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:42.850+0000 7f5d5529a640 1 -- 192.168.123.103:0/2904501050 wait complete. 
2026-03-09T16:21:42.914 DEBUG:teuthology.run_tasks:Unwinding manager ceph-fuse 2026-03-09T16:21:42.917 INFO:tasks.ceph_fuse:Unmounting ceph-fuse clients... 2026-03-09T16:21:42.917 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:21:42.917 DEBUG:teuthology.orchestra.run.vm03:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-09T16:21:42.935 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T16:21:42.935 DEBUG:teuthology.orchestra.run.vm03:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-09T16:21:42.993 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd blocklist ls 2026-03-09T16:21:43.181 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config 2026-03-09T16:21:43.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.417+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/552410270 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b58076040 msgr2=0x7f2b58111330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:43.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.417+0000 7f2b5ecd5640 1 --2- 192.168.123.103:0/552410270 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b58076040 0x7f2b58111330 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f2b48009a00 tx=0x7f2b4802f280 comp rx=0 tx=0).stop 2026-03-09T16:21:43.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.418+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/552410270 shutdown_connections 2026-03-09T16:21:43.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.418+0000 7f2b5ecd5640 1 --2- 192.168.123.103:0/552410270 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b58076040 0x7f2b58111330 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:43.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.418+0000 7f2b5ecd5640 1 --2- 192.168.123.103:0/552410270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b58075720 0x7f2b58075b00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:43.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.418+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/552410270 >> 192.168.123.103:0/552410270 conn(0x7f2b580fe710 msgr2=0x7f2b58100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:43.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.418+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/552410270 shutdown_connections 2026-03-09T16:21:43.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.419+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/552410270 wait complete. 
2026-03-09T16:21:43.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.419+0000 7f2b5ecd5640 1 Processor -- start 2026-03-09T16:21:43.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.419+0000 7f2b5ecd5640 1 -- start start 2026-03-09T16:21:43.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.420+0000 7f2b5ecd5640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b58075720 0x7f2b5819ee00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:43.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.420+0000 7f2b5ecd5640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b58076040 0x7f2b5819f340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:43.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.420+0000 7f2b5ecd5640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2b5819f9d0 con 0x7f2b58076040 2026-03-09T16:21:43.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.420+0000 7f2b5ecd5640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2b581a3740 con 0x7f2b58075720 2026-03-09T16:21:43.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.420+0000 7f2b4ffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b58076040 0x7f2b5819f340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:43.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.420+0000 7f2b4ffff640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b58076040 0x7f2b5819f340 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36806/0 (socket says 192.168.123.103:36806) 2026-03-09T16:21:43.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.420+0000 7f2b4ffff640 1 -- 192.168.123.103:0/3250352585 learned_addr learned my addr 192.168.123.103:0/3250352585 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:43.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.420+0000 7f2b5ca4a640 1 --2- 192.168.123.103:0/3250352585 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b58075720 0x7f2b5819ee00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:43.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.420+0000 7f2b4ffff640 1 -- 192.168.123.103:0/3250352585 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b58075720 msgr2=0x7f2b5819ee00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:43.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.420+0000 7f2b4ffff640 1 --2- 192.168.123.103:0/3250352585 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b58075720 0x7f2b5819ee00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:43.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.420+0000 7f2b4ffff640 1 -- 192.168.123.103:0/3250352585 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f2b48009660 con 0x7f2b58076040 2026-03-09T16:21:43.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.421+0000 7f2b4ffff640 1 --2- 192.168.123.103:0/3250352585 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b58076040 0x7f2b5819f340 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f2b48002c20 tx=0x7f2b48002c50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:43.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.421+0000 7f2b4dffb640 1 -- 192.168.123.103:0/3250352585 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2b48031c00 con 0x7f2b58076040 2026-03-09T16:21:43.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.421+0000 7f2b4dffb640 1 -- 192.168.123.103:0/3250352585 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f2b48031d60 con 0x7f2b58076040 2026-03-09T16:21:43.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.421+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/3250352585 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2b581a39c0 con 0x7f2b58076040 2026-03-09T16:21:43.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.423+0000 7f2b4dffb640 1 -- 192.168.123.103:0/3250352585 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2b48038470 con 0x7f2b58076040 2026-03-09T16:21:43.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.423+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/3250352585 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2b581a3eb0 con 0x7f2b58076040 2026-03-09T16:21:43.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.424+0000 7f2b4dffb640 1 -- 192.168.123.103:0/3250352585 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2b4803f070 con 0x7f2b58076040 2026-03-09T16:21:43.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.424+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/3250352585 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2b58076e60 con 0x7f2b58076040 2026-03-09T16:21:43.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.427+0000 7f2b4dffb640 1 --2- 192.168.123.103:0/3250352585 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f2b280777a0 0x7f2b28079c60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:43.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.427+0000 7f2b5ca4a640 1 --2- 192.168.123.103:0/3250352585 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f2b280777a0 0x7f2b28079c60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:43.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.428+0000 7f2b5ca4a640 1 --2- 192.168.123.103:0/3250352585 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f2b280777a0 0x7f2b28079c60 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f2b40004520 tx=0x7f2b40009290 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:43.428 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.428+0000 7f2b4dffb640 1 -- 192.168.123.103:0/3250352585 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f2b48031000 con 0x7f2b58076040 2026-03-09T16:21:43.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.428+0000 7f2b4dffb640 1 -- 192.168.123.103:0/3250352585 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2b480c4050 con 0x7f2b58076040 2026-03-09T16:21:43.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:43 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/3049965658' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-09T16:21:43.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:43 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/2904501050' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-09T16:21:43.530 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:43 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/3049965658' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-09T16:21:43.530 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:43 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/2904501050' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-09T16:21:43.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.526+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/3250352585 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7f2b58075b00 con 0x7f2b58076040 2026-03-09T16:21:43.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.529+0000 7f2b4dffb640 1 -- 192.168.123.103:0/3250352585 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 43 entries v80) v1 ==== 81+0+2677 (secure 0 0 0) 0x7f2b48087810 con 0x7f2b58076040 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:6827/1138709798 2026-03-10T16:19:51.825288+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6827/1622851291 2026-03-10T16:19:18.201597+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6826/1622851291 2026-03-10T16:19:18.201597+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3251037420 2026-03-10T16:16:53.865477+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3925759916 2026-03-10T16:16:53.865477+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/808004487 2026-03-10T16:16:53.865477+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2256909910 2026-03-10T16:16:53.865477+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/3168090362 2026-03-10T16:14:20.413279+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:0/1463602741 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/4159093290 2026-03-10T16:09:45.518986+0000 2026-03-09T16:21:43.531 
INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:0/2277604392 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/4285644309 2026-03-10T16:09:57.470573+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2298651818 2026-03-10T16:09:45.518986+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/384698677 2026-03-10T16:09:45.518986+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:0/551325590 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1811194494 2026-03-10T16:10:33.810609+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:6825/3577484575 2026-03-10T16:12:18.628672+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/906769150 2026-03-10T16:14:20.413279+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/3405276359 2026-03-10T16:10:33.810609+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2364062892 2026-03-10T16:16:53.865477+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:6824/3577484575 2026-03-10T16:12:18.628672+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/4285644309 2026-03-10T16:09:57.470573+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6828/3419491835 2026-03-10T16:19:37.067629+0000 2026-03-09T16:21:43.531 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2646707583 2026-03-10T16:09:45.518986+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1314893658 2026-03-10T16:14:20.413279+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/4159093290 2026-03-10T16:09:45.518986+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/3405276359 2026-03-10T16:10:33.810609+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/553055862 2026-03-10T16:09:57.470573+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2831546175 2026-03-10T16:09:57.470573+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/808004487 2026-03-10T16:16:53.865477+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1922548321 2026-03-10T16:10:33.810609+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/3168090362 2026-03-10T16:14:20.413279+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:0/3810151125 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3754062357 2026-03-10T16:14:20.413279+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:6828/2751989419 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:0/4258831734 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:6826/1138709798 2026-03-10T16:19:51.825288+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:0/4280840102 
2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6829/3419491835 2026-03-10T16:19:37.067629+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:6829/2751989419 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3979296636 2026-03-10T16:10:33.810609+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1871615672 2026-03-10T16:09:57.470573+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1273452242 2026-03-10T16:14:20.413279+0000 2026-03-09T16:21:43.532 INFO:teuthology.orchestra.run.vm03.stderr:listed 43 entries 2026-03-09T16:21:43.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.533+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/3250352585 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f2b280777a0 msgr2=0x7f2b28079c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:43.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.533+0000 7f2b5ecd5640 1 --2- 192.168.123.103:0/3250352585 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f2b280777a0 0x7f2b28079c60 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f2b40004520 tx=0x7f2b40009290 comp rx=0 tx=0).stop 2026-03-09T16:21:43.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.533+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/3250352585 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b58076040 msgr2=0x7f2b5819f340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:43.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.533+0000 7f2b5ecd5640 1 --2- 192.168.123.103:0/3250352585 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b58076040 0x7f2b5819f340 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f2b48002c20 tx=0x7f2b48002c50 comp rx=0 tx=0).stop 2026-03-09T16:21:43.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.533+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/3250352585 shutdown_connections 2026-03-09T16:21:43.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.533+0000 7f2b5ecd5640 1 --2- 192.168.123.103:0/3250352585 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f2b280777a0 0x7f2b28079c60 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:43.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.533+0000 7f2b5ecd5640 1 --2- 192.168.123.103:0/3250352585 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2b58076040 0x7f2b5819f340 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:43.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.533+0000 7f2b5ecd5640 1 --2- 192.168.123.103:0/3250352585 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b58075720 0x7f2b5819ee00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:43.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.533+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/3250352585 >> 192.168.123.103:0/3250352585 conn(0x7f2b580fe710 msgr2=0x7f2b580ffdf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:43.534 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.533+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/3250352585 shutdown_connections
2026-03-09T16:21:43.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:43.534+0000 7f2b5ecd5640 1 -- 192.168.123.103:0/3250352585 wait complete.
2026-03-09T16:21:43.596 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-09T16:21:43.596 DEBUG:teuthology.orchestra.run.vm03:> dd if=/proc/self/mounts of=/dev/stdout
2026-03-09T16:21:43.613 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph osd blocklist ls
2026-03-09T16:21:43.794 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config
2026-03-09T16:21:44.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.026+0000 7f5d61fb9640 1 -- 192.168.123.103:0/2701029162 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d5c106780 msgr2=0x7f5d5c106b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T16:21:44.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.026+0000 7f5d61fb9640 1 --2- 192.168.123.103:0/2701029162 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d5c106780 0x7f5d5c106b60 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7f5d4c0099b0 tx=0x7f5d4c02f220 comp rx=0 tx=0).stop
2026-03-09T16:21:44.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.027+0000 7f5d61fb9640 1 -- 192.168.123.103:0/2701029162 shutdown_connections
2026-03-09T16:21:44.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.027+0000 7f5d61fb9640 1 --2- 192.168.123.103:0/2701029162 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d5c100780 0x7f5d5c100be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T16:21:44.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.027+0000 7f5d61fb9640 1 --2- 192.168.123.103:0/2701029162 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d5c106780 0x7f5d5c106b60 unknown :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T16:21:44.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.027+0000 7f5d61fb9640 1 -- 192.168.123.103:0/2701029162 >> 192.168.123.103:0/2701029162 conn(0x7f5d5c0fc460 msgr2=0x7f5d5c0fe880 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T16:21:44.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.028+0000 7f5d61fb9640 1 -- 192.168.123.103:0/2701029162 shutdown_connections
2026-03-09T16:21:44.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.028+0000 7f5d61fb9640 1 -- 192.168.123.103:0/2701029162 wait complete.
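The "ceph osd blocklist ls" output captured above prints one blocklist entry per line, an instance address of the form ip:port/nonce followed by the expiry timestamp, with a "listed 43 entries" summary on stderr. A minimal sketch, assuming only that format, of parsing and counting such output in Python; the helper name and the direct subprocess invocation are illustrative assumptions, not part of the teuthology task (which runs the command through "cephadm ... shell" as shown above):

    # Illustrative sketch only: parse "ceph osd blocklist ls" stdout, where each
    # line is "<addr> <expiry>", e.g.
    # "192.168.123.103:0/3251037420 2026-03-10T16:16:53.865477+0000".
    import subprocess
    from datetime import datetime

    def parse_blocklist(stdout):
        entries = []
        for line in stdout.splitlines():
            parts = line.split()
            if len(parts) != 2:
                continue  # ignore anything that is not "<addr> <timestamp>"
            addr, expiry = parts
            entries.append((addr, datetime.strptime(expiry, "%Y-%m-%dT%H:%M:%S.%f%z")))
        return entries

    # Hypothetical direct invocation for the example; the log above obtains the
    # same listing via "sudo .../cephadm ... shell -- ceph osd blocklist ls".
    result = subprocess.run(["ceph", "osd", "blocklist", "ls"],
                            capture_output=True, text=True, check=True)
    print(len(parse_blocklist(result.stdout)), "blocklisted entries")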
2026-03-09T16:21:44.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.028+0000 7f5d61fb9640 1 Processor -- start 2026-03-09T16:21:44.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.028+0000 7f5d61fb9640 1 -- start start 2026-03-09T16:21:44.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.028+0000 7f5d61fb9640 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d5c100780 0x7f5d5c196490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:44.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.029+0000 7f5d61fb9640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d5c106780 0x7f5d5c1969d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:44.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.029+0000 7f5d61fb9640 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d5c197060 con 0x7f5d5c106780 2026-03-09T16:21:44.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.029+0000 7f5d61fb9640 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d5c19add0 con 0x7f5d5c100780 2026-03-09T16:21:44.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.029+0000 7f5d5affd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d5c106780 0x7f5d5c1969d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:44.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.029+0000 7f5d5affd640 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d5c106780 0x7f5d5c1969d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36818/0 (socket says 192.168.123.103:36818) 2026-03-09T16:21:44.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.029+0000 7f5d5affd640 1 -- 192.168.123.103:0/309080309 learned_addr learned my addr 192.168.123.103:0/309080309 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T16:21:44.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.029+0000 7f5d5b7fe640 1 --2- 192.168.123.103:0/309080309 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d5c100780 0x7f5d5c196490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:44.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.029+0000 7f5d5affd640 1 -- 192.168.123.103:0/309080309 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d5c100780 msgr2=0x7f5d5c196490 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:44.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.029+0000 7f5d5affd640 1 --2- 192.168.123.103:0/309080309 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d5c100780 0x7f5d5c196490 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:44.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.029+0000 7f5d5affd640 1 -- 192.168.123.103:0/309080309 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5d4c009660 con 
0x7f5d5c106780 2026-03-09T16:21:44.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.029+0000 7f5d5affd640 1 --2- 192.168.123.103:0/309080309 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d5c106780 0x7f5d5c1969d0 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f5d4800b790 tx=0x7f5d4800bc60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:44.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.030+0000 7f5d58ff9640 1 -- 192.168.123.103:0/309080309 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d48004070 con 0x7f5d5c106780 2026-03-09T16:21:44.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.030+0000 7f5d58ff9640 1 -- 192.168.123.103:0/309080309 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5d480026e0 con 0x7f5d5c106780 2026-03-09T16:21:44.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.030+0000 7f5d58ff9640 1 -- 192.168.123.103:0/309080309 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d4800cb30 con 0x7f5d5c106780 2026-03-09T16:21:44.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.030+0000 7f5d61fb9640 1 -- 192.168.123.103:0/309080309 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5d5c19b0b0 con 0x7f5d5c106780 2026-03-09T16:21:44.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.030+0000 7f5d61fb9640 1 -- 192.168.123.103:0/309080309 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5d5c19b580 con 0x7f5d5c106780 2026-03-09T16:21:44.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.031+0000 7f5d61fb9640 1 -- 192.168.123.103:0/309080309 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5d20005350 con 0x7f5d5c106780 2026-03-09T16:21:44.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.035+0000 7f5d58ff9640 1 -- 192.168.123.103:0/309080309 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5d4800cc90 con 0x7f5d5c106780 2026-03-09T16:21:44.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.035+0000 7f5d58ff9640 1 --2- 192.168.123.103:0/309080309 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5d300777a0 0x7f5d30079c60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T16:21:44.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.035+0000 7f5d58ff9640 1 -- 192.168.123.103:0/309080309 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f5d48098970 con 0x7f5d5c106780 2026-03-09T16:21:44.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.035+0000 7f5d58ff9640 1 -- 192.168.123.103:0/309080309 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5d4809c2a0 con 0x7f5d5c106780 2026-03-09T16:21:44.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.035+0000 7f5d5b7fe640 1 --2- 192.168.123.103:0/309080309 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5d300777a0 0x7f5d30079c60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T16:21:44.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.036+0000 7f5d5b7fe640 1 --2- 192.168.123.103:0/309080309 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5d300777a0 0x7f5d30079c60 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f5d4c005ec0 tx=0x7f5d4c03a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T16:21:44.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.135+0000 7f5d61fb9640 1 -- 192.168.123.103:0/309080309 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7f5d20005b80 con 0x7f5d5c106780 2026-03-09T16:21:44.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.136+0000 7f5d58ff9640 1 -- 192.168.123.103:0/309080309 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 43 entries v80) v1 ==== 81+0+2677 (secure 0 0 0) 0x7f5d48061130 con 0x7f5d5c106780 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:6827/1138709798 2026-03-10T16:19:51.825288+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6827/1622851291 2026-03-10T16:19:18.201597+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6826/1622851291 2026-03-10T16:19:18.201597+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3251037420 2026-03-10T16:16:53.865477+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3925759916 2026-03-10T16:16:53.865477+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/808004487 2026-03-10T16:16:53.865477+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2256909910 2026-03-10T16:16:53.865477+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/3168090362 2026-03-10T16:14:20.413279+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:0/1463602741 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/4159093290 2026-03-10T16:09:45.518986+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:0/2277604392 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/4285644309 2026-03-10T16:09:57.470573+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2298651818 2026-03-10T16:09:45.518986+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/384698677 2026-03-10T16:09:45.518986+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:0/551325590 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1811194494 2026-03-10T16:10:33.810609+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:6825/3577484575 2026-03-10T16:12:18.628672+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/906769150 2026-03-10T16:14:20.413279+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/3405276359 
2026-03-10T16:10:33.810609+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2364062892 2026-03-10T16:16:53.865477+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:6824/3577484575 2026-03-10T16:12:18.628672+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/4285644309 2026-03-10T16:09:57.470573+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6828/3419491835 2026-03-10T16:19:37.067629+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2646707583 2026-03-10T16:09:45.518986+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1314893658 2026-03-10T16:14:20.413279+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/4159093290 2026-03-10T16:09:45.518986+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/3405276359 2026-03-10T16:10:33.810609+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/553055862 2026-03-10T16:09:57.470573+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2831546175 2026-03-10T16:09:57.470573+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/808004487 2026-03-10T16:16:53.865477+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1922548321 2026-03-10T16:10:33.810609+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/3168090362 2026-03-10T16:14:20.413279+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:0/3810151125 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3754062357 2026-03-10T16:14:20.413279+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:6828/2751989419 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:0/4258831734 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:6826/1138709798 2026-03-10T16:19:51.825288+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:0/4280840102 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6829/3419491835 2026-03-10T16:19:37.067629+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.105:6829/2751989419 2026-03-10T16:14:43.117286+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3979296636 2026-03-10T16:10:33.810609+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1871615672 2026-03-10T16:09:57.470573+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1273452242 2026-03-10T16:14:20.413279+0000 2026-03-09T16:21:44.139 INFO:teuthology.orchestra.run.vm03.stderr:listed 43 entries 2026-03-09T16:21:44.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.141+0000 7f5d61fb9640 1 -- 192.168.123.103:0/309080309 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5d300777a0 msgr2=0x7f5d30079c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:44.141 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.141+0000 7f5d61fb9640 1 --2- 192.168.123.103:0/309080309 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5d300777a0 0x7f5d30079c60 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f5d4c005ec0 tx=0x7f5d4c03a040 comp rx=0 tx=0).stop 2026-03-09T16:21:44.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.141+0000 7f5d61fb9640 1 -- 192.168.123.103:0/309080309 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d5c106780 msgr2=0x7f5d5c1969d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T16:21:44.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.141+0000 7f5d61fb9640 1 --2- 192.168.123.103:0/309080309 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d5c106780 0x7f5d5c1969d0 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f5d4800b790 tx=0x7f5d4800bc60 comp rx=0 tx=0).stop 2026-03-09T16:21:44.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.141+0000 7f5d61fb9640 1 -- 192.168.123.103:0/309080309 shutdown_connections 2026-03-09T16:21:44.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.141+0000 7f5d61fb9640 1 --2- 192.168.123.103:0/309080309 >> [v2:192.168.123.103:6800/2191779109,v1:192.168.123.103:6801/2191779109] conn(0x7f5d300777a0 0x7f5d30079c60 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:44.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.141+0000 7f5d61fb9640 1 --2- 192.168.123.103:0/309080309 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d5c106780 0x7f5d5c1969d0 unknown :-1 s=CLOSED pgs=156 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:44.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.141+0000 7f5d61fb9640 1 --2- 192.168.123.103:0/309080309 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d5c100780 0x7f5d5c196490 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T16:21:44.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.141+0000 7f5d61fb9640 1 -- 192.168.123.103:0/309080309 >> 192.168.123.103:0/309080309 conn(0x7f5d5c0fc460 msgr2=0x7f5d5c10a6f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T16:21:44.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.141+0000 7f5d61fb9640 1 -- 192.168.123.103:0/309080309 shutdown_connections 2026-03-09T16:21:44.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T16:21:44.141+0000 7f5d61fb9640 1 -- 192.168.123.103:0/309080309 wait complete. 2026-03-09T16:21:44.208 INFO:tasks.cephfs.fuse_mount:Running fusermount -u on ubuntu@vm03.local... 2026-03-09T16:21:44.208 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T16:21:44.209 DEBUG:teuthology.orchestra.run.vm03:> sudo fusermount -u /home/ubuntu/cephtest/mnt.0 2026-03-09T16:21:44.238 INFO:teuthology.orchestra.run:waiting for 300 2026-03-09T16:21:44.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:44 vm03.local ceph-mon[133973]: from='client.? 
192.168.123.103:0/3250352585' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-09T16:21:44.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:44 vm03.local ceph-mon[133973]: pgmap v182: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:44.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:44 vm03.local ceph-mon[133973]: from='client.? 192.168.123.103:0/309080309' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-09T16:21:44.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:44 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/3250352585' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-09T16:21:44.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:44 vm05.local ceph-mon[108543]: pgmap v182: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:44.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:44 vm05.local ceph-mon[108543]: from='client.? 192.168.123.103:0/309080309' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-09T16:21:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:47 vm03.local ceph-mon[133973]: pgmap v183: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:47.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:47 vm05.local ceph-mon[108543]: pgmap v183: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:49.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:49 vm03.local ceph-mon[133973]: pgmap v184: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:49.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:49 vm05.local ceph-mon[108543]: pgmap v184: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:51.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:51 vm03.local ceph-mon[133973]: pgmap v185: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:51.430 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:51 vm05.local ceph-mon[108543]: pgmap v185: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:53.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:53 vm03.local ceph-mon[133973]: pgmap v186: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:53 vm05.local ceph-mon[108543]: pgmap v186: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:54.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:21:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:21:55.363 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:55 vm03.local ceph-mon[133973]: pgmap v187: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:55.427 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:55 vm05.local 
ceph-mon[108543]: pgmap v187: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:57.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:57 vm03.local ceph-mon[133973]: pgmap v188: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:57.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:57 vm05.local ceph-mon[108543]: pgmap v188: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:59.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:21:59 vm03.local ceph-mon[133973]: pgmap v189: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:21:59.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:21:59 vm05.local ceph-mon[108543]: pgmap v189: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:01.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:01 vm03.local ceph-mon[133973]: pgmap v190: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:01.523 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:01 vm05.local ceph-mon[108543]: pgmap v190: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:03.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:03 vm03.local ceph-mon[133973]: pgmap v191: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:03.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:03 vm05.local ceph-mon[108543]: pgmap v191: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:04.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:04 vm05.local ceph-mon[108543]: pgmap v192: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:04.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:04 vm03.local ceph-mon[133973]: pgmap v192: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:07.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:07 vm03.local ceph-mon[133973]: pgmap v193: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:07.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:07 vm05.local ceph-mon[108543]: pgmap v193: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:09.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:09 vm03.local ceph-mon[133973]: pgmap v194: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:09.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:22:09.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:09 vm05.local ceph-mon[108543]: pgmap v194: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:09.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:22:10.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:10 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' 
entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:22:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:10 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:22:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:11 vm03.local ceph-mon[133973]: pgmap v195: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:22:11.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:22:11.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:22:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:11 vm05.local ceph-mon[108543]: pgmap v195: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:22:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:22:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:22:13.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:13 vm03.local ceph-mon[133973]: pgmap v196: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:13.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:13 vm05.local ceph-mon[108543]: pgmap v196: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:15.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:15 vm03.local ceph-mon[133973]: pgmap v197: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:15.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:15 vm05.local ceph-mon[108543]: pgmap v197: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:17.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:17 vm03.local ceph-mon[133973]: pgmap v198: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:17.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:17 vm05.local ceph-mon[108543]: pgmap v198: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:19.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:19 vm03.local ceph-mon[133973]: pgmap v199: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:19.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:19 vm05.local 
ceph-mon[108543]: pgmap v199: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:21.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:21 vm03.local ceph-mon[133973]: pgmap v200: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:21.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:21 vm05.local ceph-mon[108543]: pgmap v200: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:23.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:23 vm03.local ceph-mon[133973]: pgmap v201: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:23.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:23 vm05.local ceph-mon[108543]: pgmap v201: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:24.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:22:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:22:25.382 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:25 vm03.local ceph-mon[133973]: pgmap v202: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:25.427 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:25 vm05.local ceph-mon[108543]: pgmap v202: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:26.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:26 vm03.local ceph-mon[133973]: pgmap v203: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:26.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:26 vm05.local ceph-mon[108543]: pgmap v203: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:29.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:29 vm03.local ceph-mon[133973]: pgmap v204: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:29.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:29 vm05.local ceph-mon[108543]: pgmap v204: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:31.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:31 vm03.local ceph-mon[133973]: pgmap v205: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:31.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:31 vm05.local ceph-mon[108543]: pgmap v205: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:33.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:33 vm03.local ceph-mon[133973]: pgmap v206: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:33.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:33 vm05.local ceph-mon[108543]: pgmap v206: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:35 vm03.local ceph-mon[133973]: pgmap v207: 65 pgs: 65 active+clean; 254 MiB 
data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:35.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:35 vm05.local ceph-mon[108543]: pgmap v207: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:37.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:37 vm03.local ceph-mon[133973]: pgmap v208: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:37 vm05.local ceph-mon[108543]: pgmap v208: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:39 vm03.local ceph-mon[133973]: pgmap v209: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:39 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:22:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:39 vm05.local ceph-mon[108543]: pgmap v209: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:39 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:22:41.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:41 vm03.local ceph-mon[133973]: pgmap v210: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:41.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:41 vm05.local ceph-mon[108543]: pgmap v210: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:43.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:43 vm03.local ceph-mon[133973]: pgmap v211: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:43.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:43 vm05.local ceph-mon[108543]: pgmap v211: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:44.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:44 vm05.local ceph-mon[108543]: pgmap v212: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:44 vm03.local ceph-mon[133973]: pgmap v212: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:47.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:47 vm05.local ceph-mon[108543]: pgmap v213: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:47 vm03.local ceph-mon[133973]: pgmap v213: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:49.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:49 vm03.local ceph-mon[133973]: pgmap v214: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:49.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:49 vm05.local ceph-mon[108543]: pgmap v214: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 
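The recurring journalctl "pgmap vN" lines in this window all report the same state: 65 PGs, all active+clean, 254 MiB of data, 976 MiB used, and 119 GiB of 120 GiB available. A short sketch, assuming only the line format shown in the log, of extracting that summary and asserting that every PG is active+clean:

    # Sketch: pull the PG counts out of a "pgmap vN: ..." summary line and check
    # that every PG is active+clean. The sample line is copied from the log above.
    import re

    line = ("pgmap v214: 65 pgs: 65 active+clean; "
            "254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail")
    m = re.search(r"pgmap v(\d+): (\d+) pgs: (\d+) active\+clean", line)
    assert m is not None
    version, total, clean = (int(x) for x in m.groups())
    assert total == clean, f"pgmap v{version}: {total - clean} PGs not active+clean"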
2026-03-09T16:22:51.140 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:51 vm03.local ceph-mon[133973]: pgmap v215: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:51 vm05.local ceph-mon[108543]: pgmap v215: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:53.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:53 vm03.local ceph-mon[133973]: pgmap v216: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:53 vm05.local ceph-mon[108543]: pgmap v216: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:54.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:22:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:22:55.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:55 vm03.local ceph-mon[133973]: pgmap v217: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:55.427 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:55 vm05.local ceph-mon[108543]: pgmap v217: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:57.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:57 vm05.local ceph-mon[108543]: pgmap v218: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:57.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:57 vm03.local ceph-mon[133973]: pgmap v218: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:59.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:22:59 vm03.local ceph-mon[133973]: pgmap v219: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:22:59.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:22:59 vm05.local ceph-mon[108543]: pgmap v219: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:01.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:01 vm03.local ceph-mon[133973]: pgmap v220: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:01 vm05.local ceph-mon[108543]: pgmap v220: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:03.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:03 vm03.local ceph-mon[133973]: pgmap v221: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:03.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:03 vm05.local ceph-mon[108543]: pgmap v221: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:05 vm03.local ceph-mon[133973]: pgmap v222: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:05.526 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:05 vm05.local ceph-mon[108543]: pgmap v222: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:07.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:07 vm03.local ceph-mon[133973]: pgmap v223: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:07.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:07 vm05.local ceph-mon[108543]: pgmap v223: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:09.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:09 vm03.local ceph-mon[133973]: pgmap v224: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:09.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:23:09.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:09 vm05.local ceph-mon[108543]: pgmap v224: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:09.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:23:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:10 vm05.local ceph-mon[108543]: pgmap v225: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:10.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:10 vm03.local ceph-mon[133973]: pgmap v225: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:23:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:23:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:23:11.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:23:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:23:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:23:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:23:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 
09 16:23:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:23:12.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:12 vm05.local ceph-mon[108543]: pgmap v226: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:12.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:12 vm03.local ceph-mon[133973]: pgmap v226: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:15.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:15 vm03.local ceph-mon[133973]: pgmap v227: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:15.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:15 vm05.local ceph-mon[108543]: pgmap v227: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:17.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:17 vm03.local ceph-mon[133973]: pgmap v228: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:17.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:17 vm05.local ceph-mon[108543]: pgmap v228: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:19.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:19 vm03.local ceph-mon[133973]: pgmap v229: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:19.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:19 vm05.local ceph-mon[108543]: pgmap v229: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:20.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:20 vm03.local ceph-mon[133973]: pgmap v230: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:20.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:20 vm05.local ceph-mon[108543]: pgmap v230: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:23.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:23 vm03.local ceph-mon[133973]: pgmap v231: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:23.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:23 vm05.local ceph-mon[108543]: pgmap v231: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:24.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:23:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:23:25.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:25 vm03.local ceph-mon[133973]: pgmap v232: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:25.427 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:25 vm05.local ceph-mon[108543]: pgmap v232: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:27.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:27 vm03.local ceph-mon[133973]: pgmap v233: 65 pgs: 65 active+clean; 254 MiB 
data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:27 vm05.local ceph-mon[108543]: pgmap v233: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:29.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:29 vm03.local ceph-mon[133973]: pgmap v234: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:29.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:29 vm05.local ceph-mon[108543]: pgmap v234: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:31.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:31 vm03.local ceph-mon[133973]: pgmap v235: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:31.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:31 vm05.local ceph-mon[108543]: pgmap v235: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:33.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:33 vm03.local ceph-mon[133973]: pgmap v236: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:33.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:33 vm05.local ceph-mon[108543]: pgmap v236: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:35 vm03.local ceph-mon[133973]: pgmap v237: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:35.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:35 vm05.local ceph-mon[108543]: pgmap v237: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:37.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:37 vm03.local ceph-mon[133973]: pgmap v238: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:37 vm05.local ceph-mon[108543]: pgmap v238: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:39 vm03.local ceph-mon[133973]: pgmap v239: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:39 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:23:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:39 vm05.local ceph-mon[108543]: pgmap v239: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:39 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:23:41.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:41 vm03.local ceph-mon[133973]: pgmap v240: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:41.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:41 vm05.local ceph-mon[108543]: pgmap v240: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 
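Alongside the pgmap summaries, the mon audit lines above show the mgr (mgr.34104 at 192.168.123.103:0/1902774, entity mgr.vm03.gbgzmu) dispatching "osd blocklist ls" roughly every 15 seconds and a "config dump" / "config generate-minimal-conf" / "auth get client.admin" batch about once a minute. A hedged sketch of tallying those dispatches from a saved journalctl capture, based only on the audit-line format shown here; the input file name is an assumption:

    # Sketch: count which command prefixes the mgr dispatched to the mon, using
    # the audit-line format above, e.g.
    # cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
    import re
    from collections import Counter

    counts = Counter()
    with open("ceph-mon.vm03.journal.log") as f:   # hypothetical capture file
        for line in f:
            m = re.search(r'cmd=\[\{"prefix": "([^"]+)"', line)
            if m and line.rstrip().endswith("dispatch"):
                counts[m.group(1)] += 1

    for prefix, n in counts.most_common():
        print(f"{n:5d}  {prefix}")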
2026-03-09T16:23:43.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:43 vm05.local ceph-mon[108543]: pgmap v241: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:43.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:43 vm03.local ceph-mon[133973]: pgmap v241: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:44.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:44 vm05.local ceph-mon[108543]: pgmap v242: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:44 vm03.local ceph-mon[133973]: pgmap v242: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:47.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:47 vm03.local ceph-mon[133973]: pgmap v243: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:47.440 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:47 vm05.local ceph-mon[108543]: pgmap v243: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:49.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:49 vm03.local ceph-mon[133973]: pgmap v244: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:49.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:49 vm05.local ceph-mon[108543]: pgmap v244: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:51.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:51 vm03.local ceph-mon[133973]: pgmap v245: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:51 vm05.local ceph-mon[108543]: pgmap v245: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:53.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:53 vm03.local ceph-mon[133973]: pgmap v246: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:53 vm05.local ceph-mon[108543]: pgmap v246: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:54.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:23:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:23:55.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:55 vm03.local ceph-mon[133973]: pgmap v247: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:55.427 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:55 vm05.local ceph-mon[108543]: pgmap v247: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:57.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:57 vm03.local ceph-mon[133973]: pgmap v248: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:57.526 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:57 vm05.local ceph-mon[108543]: pgmap v248: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:58.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:23:58 vm05.local ceph-mon[108543]: pgmap v249: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:23:58.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:23:58 vm03.local ceph-mon[133973]: pgmap v249: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:01.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:01 vm03.local ceph-mon[133973]: pgmap v250: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:01 vm05.local ceph-mon[108543]: pgmap v250: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:03.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:03 vm03.local ceph-mon[133973]: pgmap v251: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:03.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:03 vm05.local ceph-mon[108543]: pgmap v251: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:05 vm03.local ceph-mon[133973]: pgmap v252: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:05.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:05 vm05.local ceph-mon[108543]: pgmap v252: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:07.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:07 vm03.local ceph-mon[133973]: pgmap v253: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:07.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:07 vm05.local ceph-mon[108543]: pgmap v253: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:08.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:08 vm05.local ceph-mon[108543]: pgmap v254: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:08.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:08 vm03.local ceph-mon[133973]: pgmap v254: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:09.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:24:09.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:24:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:10 vm05.local ceph-mon[108543]: pgmap v255: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:10.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:10 vm03.local ceph-mon[133973]: pgmap v255: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:11.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:11 
vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:24:11.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:24:11.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:24:11.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:24:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:24:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:24:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:24:11.776 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:24:12.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:12 vm03.local ceph-mon[133973]: pgmap v256: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:12.655 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:12 vm05.local ceph-mon[108543]: pgmap v256: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:15.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:15 vm03.local ceph-mon[133973]: pgmap v257: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:15.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:15 vm05.local ceph-mon[108543]: pgmap v257: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:17.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:17 vm03.local ceph-mon[133973]: pgmap v258: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:17.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:17 vm05.local ceph-mon[108543]: pgmap v258: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:19.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:19 vm03.local ceph-mon[133973]: pgmap v259: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:19.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:19 vm05.local ceph-mon[108543]: pgmap v259: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:21.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:21 vm05.local ceph-mon[108543]: pgmap v260: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:21.640 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:21 vm03.local ceph-mon[133973]: pgmap v260: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:22.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:22 vm05.local ceph-mon[108543]: pgmap v261: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:22.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:22 vm03.local ceph-mon[133973]: pgmap v261: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:24:24.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:24:25.427 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:25 vm05.local ceph-mon[108543]: pgmap v262: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:25.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:25 vm03.local ceph-mon[133973]: pgmap v262: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:26.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:26 vm05.local ceph-mon[108543]: pgmap v263: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:26.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:26 vm03.local ceph-mon[133973]: pgmap v263: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:29.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:29 vm03.local ceph-mon[133973]: pgmap v264: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:29.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:29 vm05.local ceph-mon[108543]: pgmap v264: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:31.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:31 vm03.local ceph-mon[133973]: pgmap v265: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:31.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:31 vm05.local ceph-mon[108543]: pgmap v265: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:33.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:33 vm03.local ceph-mon[133973]: pgmap v266: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:33.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:33 vm05.local ceph-mon[108543]: pgmap v266: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:35 vm03.local ceph-mon[133973]: pgmap v267: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:35.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:35 vm05.local ceph-mon[108543]: pgmap v267: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:37.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:37 
vm03.local ceph-mon[133973]: pgmap v268: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:37 vm05.local ceph-mon[108543]: pgmap v268: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:39 vm03.local ceph-mon[133973]: pgmap v269: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:39 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:24:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:39 vm05.local ceph-mon[108543]: pgmap v269: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:39 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:24:41.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:41 vm03.local ceph-mon[133973]: pgmap v270: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:41.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:41 vm05.local ceph-mon[108543]: pgmap v270: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:43.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:43 vm05.local ceph-mon[108543]: pgmap v271: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:43.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:43 vm03.local ceph-mon[133973]: pgmap v271: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:45.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:45 vm05.local ceph-mon[108543]: pgmap v272: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:45.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:45 vm03.local ceph-mon[133973]: pgmap v272: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:46.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:46 vm05.local ceph-mon[108543]: pgmap v273: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:46.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:46 vm03.local ceph-mon[133973]: pgmap v273: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:49.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:49 vm03.local ceph-mon[133973]: pgmap v274: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:49.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:49 vm05.local ceph-mon[108543]: pgmap v274: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:51.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:51 vm03.local ceph-mon[133973]: pgmap v275: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:51 vm05.local ceph-mon[108543]: pgmap v275: 65 pgs: 65 active+clean; 
254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:53.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:53 vm03.local ceph-mon[133973]: pgmap v276: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:53 vm05.local ceph-mon[108543]: pgmap v276: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:54.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:24:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:24:55.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:55 vm03.local ceph-mon[133973]: pgmap v277: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:55.426 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:55 vm05.local ceph-mon[108543]: pgmap v277: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:57.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:57 vm03.local ceph-mon[133973]: pgmap v278: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:57.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:57 vm05.local ceph-mon[108543]: pgmap v278: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:59.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:24:59 vm03.local ceph-mon[133973]: pgmap v279: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:24:59.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:24:59 vm05.local ceph-mon[108543]: pgmap v279: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:01.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:01 vm03.local ceph-mon[133973]: pgmap v280: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:01 vm05.local ceph-mon[108543]: pgmap v280: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:03.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:03 vm03.local ceph-mon[133973]: pgmap v281: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:03.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:03 vm05.local ceph-mon[108543]: pgmap v281: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:05 vm03.local ceph-mon[133973]: pgmap v282: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:05.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:05 vm05.local ceph-mon[108543]: pgmap v282: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:07.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:07 vm05.local ceph-mon[108543]: pgmap v283: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 
2026-03-09T16:25:07.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:07 vm03.local ceph-mon[133973]: pgmap v283: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:08.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:08 vm05.local ceph-mon[108543]: pgmap v284: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:08.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:08 vm03.local ceph-mon[133973]: pgmap v284: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:09.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:25:09.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:25:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:10 vm05.local ceph-mon[108543]: pgmap v285: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:10.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:10 vm03.local ceph-mon[133973]: pgmap v285: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:11.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:25:11.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:25:12.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:25:12.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:25:12.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:25:12.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:12 vm05.local ceph-mon[108543]: pgmap v286: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:12.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:25:12.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:25:12.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:25:12.640 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:12 vm03.local ceph-mon[133973]: pgmap v286: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:15.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:15 vm03.local ceph-mon[133973]: pgmap v287: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:15.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:15 vm05.local ceph-mon[108543]: pgmap v287: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:17.351 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:17 vm03.local ceph-mon[133973]: pgmap v288: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:17.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:17 vm05.local ceph-mon[108543]: pgmap v288: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:19.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:19 vm03.local ceph-mon[133973]: pgmap v289: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:19.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:19 vm05.local ceph-mon[108543]: pgmap v289: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:21.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:21 vm03.local ceph-mon[133973]: pgmap v290: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:21.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:21 vm05.local ceph-mon[108543]: pgmap v290: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:23.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:23 vm03.local ceph-mon[133973]: pgmap v291: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:23.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:23 vm05.local ceph-mon[108543]: pgmap v291: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:24.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:25:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:25:25.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:25 vm03.local ceph-mon[133973]: pgmap v292: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:25.427 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:25 vm05.local ceph-mon[108543]: pgmap v292: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:27.391 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:27 vm03.local ceph-mon[133973]: pgmap v293: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:27 vm05.local ceph-mon[108543]: pgmap v293: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:29.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:29 
vm05.local ceph-mon[108543]: pgmap v294: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:29.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:29 vm03.local ceph-mon[133973]: pgmap v294: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:30.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:30 vm05.local ceph-mon[108543]: pgmap v295: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:30.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:30 vm03.local ceph-mon[133973]: pgmap v295: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:33.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:33 vm03.local ceph-mon[133973]: pgmap v296: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:33.409 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:33 vm05.local ceph-mon[108543]: pgmap v296: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:35.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:35 vm05.local ceph-mon[108543]: pgmap v297: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:35.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:35 vm03.local ceph-mon[133973]: pgmap v297: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:37.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:36 vm05.local ceph-mon[108543]: pgmap v298: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:37.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:36 vm03.local ceph-mon[133973]: pgmap v298: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:39 vm03.local ceph-mon[133973]: pgmap v299: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:39 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:25:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:39 vm05.local ceph-mon[108543]: pgmap v299: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:39.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:39 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:25:41.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:41 vm03.local ceph-mon[133973]: pgmap v300: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:41.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:41 vm05.local ceph-mon[108543]: pgmap v300: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:43.511 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:43 vm05.local ceph-mon[108543]: pgmap v301: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:43.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:43 vm03.local ceph-mon[133973]: pgmap v301: 65 pgs: 65 active+clean; 
254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:44.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:44 vm05.local ceph-mon[108543]: pgmap v302: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:44.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:44 vm03.local ceph-mon[133973]: pgmap v302: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:46.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:46 vm05.local ceph-mon[108543]: pgmap v303: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:46.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:46 vm03.local ceph-mon[133973]: pgmap v303: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:49.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:49 vm03.local ceph-mon[133973]: pgmap v304: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:49.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:49 vm05.local ceph-mon[108543]: pgmap v304: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:51.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:51 vm03.local ceph-mon[133973]: pgmap v305: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:51.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:51 vm05.local ceph-mon[108543]: pgmap v305: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:53.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:53 vm03.local ceph-mon[133973]: pgmap v306: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:53.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:53 vm05.local ceph-mon[108543]: pgmap v306: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:54.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:54 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:25:54.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:54 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:25:55.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:55 vm03.local ceph-mon[133973]: pgmap v307: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:55.427 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:55 vm05.local ceph-mon[108543]: pgmap v307: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:57.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:57 vm03.local ceph-mon[133973]: pgmap v308: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:57.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:57 vm05.local ceph-mon[108543]: pgmap v308: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:25:59.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:25:59 vm03.local ceph-mon[133973]: pgmap v309: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 
2026-03-09T16:25:59.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:25:59 vm05.local ceph-mon[108543]: pgmap v309: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:01.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:01 vm03.local ceph-mon[133973]: pgmap v310: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:01.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:01 vm05.local ceph-mon[108543]: pgmap v310: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:03.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:03 vm03.local ceph-mon[133973]: pgmap v311: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:03.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:03 vm05.local ceph-mon[108543]: pgmap v311: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:05.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:05 vm05.local ceph-mon[108543]: pgmap v312: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:05.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:05 vm03.local ceph-mon[133973]: pgmap v312: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:07.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:07 vm05.local ceph-mon[108543]: pgmap v313: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:07.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:07 vm03.local ceph-mon[133973]: pgmap v313: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:08.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:08 vm05.local ceph-mon[108543]: pgmap v314: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:08.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:08 vm03.local ceph-mon[133973]: pgmap v314: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:09.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:09 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:26:09.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:09 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:26:10.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:10 vm05.local ceph-mon[108543]: pgmap v315: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:10.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:10 vm03.local ceph-mon[133973]: pgmap v315: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:11.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:11 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T16:26:12.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:11 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config dump", "format": 
"json"}]: dispatch 2026-03-09T16:26:12.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:26:12.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:26:12.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:12 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:26:12.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:12 vm03.local ceph-mon[133973]: pgmap v316: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:13.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T16:26:13.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T16:26:13.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:12 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' 2026-03-09T16:26:13.026 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:12 vm05.local ceph-mon[108543]: pgmap v316: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:15.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:15 vm03.local ceph-mon[133973]: pgmap v317: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:15.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:15 vm05.local ceph-mon[108543]: pgmap v317: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:17.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:17 vm03.local ceph-mon[133973]: pgmap v318: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:17.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:17 vm05.local ceph-mon[108543]: pgmap v318: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:19.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:19 vm03.local ceph-mon[133973]: pgmap v319: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:19.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:19 vm05.local ceph-mon[108543]: pgmap v319: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:21.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:21 vm03.local ceph-mon[133973]: pgmap v320: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:21.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:21 vm05.local ceph-mon[108543]: pgmap v320: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:23.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:23 vm03.local ceph-mon[133973]: pgmap v321: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 
2026-03-09T16:26:23.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:23 vm05.local ceph-mon[108543]: pgmap v321: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:24.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:24 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:26:24.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:24 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T16:26:25.427 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:25 vm05.local ceph-mon[108543]: pgmap v322: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:25.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:25 vm03.local ceph-mon[133973]: pgmap v322: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:27.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:27 vm05.local ceph-mon[108543]: pgmap v323: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:27.640 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:27 vm03.local ceph-mon[133973]: pgmap v323: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:28.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:28 vm03.local ceph-mon[133973]: pgmap v324: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:28.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:28 vm05.local ceph-mon[108543]: pgmap v324: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:31.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:31 vm03.local ceph-mon[133973]: pgmap v325: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:31.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:31 vm05.local ceph-mon[108543]: pgmap v325: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:33.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:33 vm03.local ceph-mon[133973]: pgmap v326: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:33.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:33 vm05.local ceph-mon[108543]: pgmap v326: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:35 vm03.local ceph-mon[133973]: pgmap v327: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:35.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:35 vm05.local ceph-mon[108543]: pgmap v327: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:37.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:37 vm03.local ceph-mon[133973]: pgmap v328: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:37.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:37 vm05.local ceph-mon[108543]: pgmap v328: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail 2026-03-09T16:26:39.276 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:39 vm05.local ceph-mon[108543]: pgmap v329: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail
2026-03-09T16:26:39.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:39 vm05.local ceph-mon[108543]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T16:26:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:39 vm03.local ceph-mon[133973]: pgmap v329: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail
2026-03-09T16:26:39.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:39 vm03.local ceph-mon[133973]: from='mgr.34104 192.168.123.103:0/1902774' entity='mgr.vm03.gbgzmu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T16:26:41.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:41 vm03.local ceph-mon[133973]: pgmap v330: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail
2026-03-09T16:26:41.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:41 vm05.local ceph-mon[108543]: pgmap v330: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail
2026-03-09T16:26:43.288 ERROR:tasks.cephfs.fuse_mount:process failed to terminate after unmount. This probably indicates a bug within ceph-fuse.
2026-03-09T16:26:43.288 ERROR:teuthology.run_tasks:Manager failed: ceph-fuse
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-09T16:26:43.289 DEBUG:teuthology.run_tasks:Unwinding manager cephadm
2026-03-09T16:26:43.292 INFO:tasks.cephadm:Teardown begin
2026-03-09T16:26:43.292 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested
    yield vars
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephadm.py", line 2252, in task
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-09T16:26:43.292 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-09T16:26:43.325 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-09T16:26:43.356 INFO:tasks.cephadm:Disabling cephadm mgr module
2026-03-09T16:26:43.356 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc -- ceph mgr module disable cephadm
2026-03-09T16:26:43.389 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:43 vm03.local ceph-mon[133973]: pgmap v331: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail
2026-03-09T16:26:43.519 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/mon.vm03/config
2026-03-09T16:26:43.526 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:43 vm05.local ceph-mon[108543]: pgmap v331: 65 pgs: 65 active+clean; 254 MiB data, 976 MiB used, 119 GiB / 120 GiB avail
2026-03-09T16:26:43.690 INFO:teuthology.orchestra.run.vm03.stderr:Error: statfs /etc/ceph/ceph.client.admin.keyring: no such file or directory
2026-03-09T16:26:43.706 DEBUG:teuthology.orchestra.run:got remote process result: 125
2026-03-09T16:26:43.707 INFO:tasks.cephadm:Cleaning up testdir ceph.* files...
2026-03-09T16:26:43.707 DEBUG:teuthology.orchestra.run.vm03:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub
2026-03-09T16:26:43.762 DEBUG:teuthology.orchestra.run.vm05:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub
2026-03-09T16:26:43.781 INFO:tasks.cephadm:Stopping all daemons...
2026-03-09T16:26:43.781 INFO:tasks.cephadm.mon.vm03:Stopping mon.vm03...
2026-03-09T16:26:43.781 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm03
2026-03-09T16:26:44.008 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:43 vm03.local systemd[1]: Stopping Ceph mon.vm03 for 2b05df78-1bd2-11f1-83c0-c950214d6edc...
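The MaxWhileTries above comes from a bounded wait loop: run.wait() keeps re-checking the ceph-fuse daemon and gives up once the maximum number of tries is exhausted (here 50 tries over 300 seconds). A minimal, self-contained sketch of that pattern, using hypothetical names and not teuthology's actual implementation, assuming 50 tries at roughly 6-second intervals:

    import time

    class MaxWhileTries(Exception):
        """Raised when a bounded wait loop gives up (illustrative only)."""

    def wait_for(condition, tries=50, sleep=6):
        # Poll `condition` up to `tries` times, sleeping between attempts.
        # 50 tries * 6 s ~= 300 s, matching the timeout reported in the log.
        for attempt in range(1, tries + 1):
            if condition():
                return attempt
            time.sleep(sleep)
        raise MaxWhileTries(f"reached maximum tries ({tries}) "
                            f"after waiting for {tries * sleep} seconds")

    # Example: waiting for a daemon process to exit after an unmount request;
    # proc.poll() returning non-None would mean the process has terminated.
    #   wait_for(lambda: proc.poll() is not None)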
2026-03-09T16:26:44.008 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:43 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03[133969]: 2026-03-09T16:26:43.910+0000 7f80f80f3640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm03 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:26:44.008 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 16:26:43 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm03[133969]: 2026-03-09T16:26:43.910+0000 7f80f80f3640 -1 mon.vm03@0(leader) e3 *** Got Signal Terminated *** 2026-03-09T16:26:44.096 DEBUG:teuthology.orchestra.run.vm03:> sudo pkill -f 'journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm03.service' 2026-03-09T16:26:44.133 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T16:26:44.133 INFO:tasks.cephadm.mon.vm03:Stopped mon.vm03 2026-03-09T16:26:44.133 INFO:tasks.cephadm.mon.vm05:Stopping mon.vm05... 2026-03-09T16:26:44.133 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm05 2026-03-09T16:26:44.369 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:44 vm05.local systemd[1]: Stopping Ceph mon.vm05 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 2026-03-09T16:26:44.369 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:44 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm05[108539]: 2026-03-09T16:26:44.245+0000 7fefa0d5e640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm05 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:26:44.369 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:44 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm05[108539]: 2026-03-09T16:26:44.245+0000 7fefa0d5e640 -1 mon.vm05@1(peon) e3 *** Got Signal Terminated *** 2026-03-09T16:26:44.369 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:44 vm05.local podman[139270]: 2026-03-09 16:26:44.288111711 +0000 UTC m=+0.055161057 container died b6d6af84a66daf439a819a594bf59d3b645350890a0cf600f1a98a172826883b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm05, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223) 2026-03-09T16:26:44.369 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:44 vm05.local podman[139270]: 2026-03-09 16:26:44.310735918 +0000 UTC m=+0.077785275 container remove 
b6d6af84a66daf439a819a594bf59d3b645350890a0cf600f1a98a172826883b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm05, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) 2026-03-09T16:26:44.369 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 09 16:26:44 vm05.local bash[139270]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-mon-vm05 2026-03-09T16:26:44.386 DEBUG:teuthology.orchestra.run.vm05:> sudo pkill -f 'journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@mon.vm05.service' 2026-03-09T16:26:44.423 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T16:26:44.423 INFO:tasks.cephadm.mon.vm05:Stopped mon.vm05 2026-03-09T16:26:44.423 INFO:tasks.cephadm.osd.0:Stopping osd.0... 2026-03-09T16:26:44.423 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.0 2026-03-09T16:26:44.641 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:26:44 vm03.local systemd[1]: Stopping Ceph osd.0 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 2026-03-09T16:26:44.641 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:26:44 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0[142998]: 2026-03-09T16:26:44.534+0000 7f0d0b53c640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:26:44.641 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:26:44 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0[142998]: 2026-03-09T16:26:44.534+0000 7f0d0b53c640 -1 osd.0 80 *** Got signal Terminated *** 2026-03-09T16:26:44.641 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:26:44 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0[142998]: 2026-03-09T16:26:44.534+0000 7f0d0b53c640 -1 osd.0 80 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T16:26:49.827 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:26:49 vm03.local podman[173744]: 2026-03-09 16:26:49.565642007 +0000 UTC m=+5.045324145 container died fba6e40f54d4f4cf3c4cbbceb541f0f1a126b91176cffbe0663d024a4e3bd9ca (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , 
org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T16:26:49.827 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:26:49 vm03.local podman[173744]: 2026-03-09 16:26:49.594102434 +0000 UTC m=+5.073784572 container remove fba6e40f54d4f4cf3c4cbbceb541f0f1a126b91176cffbe0663d024a4e3bd9ca (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T16:26:49.827 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:26:49 vm03.local bash[173744]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0 2026-03-09T16:26:49.827 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:26:49 vm03.local podman[173809]: 2026-03-09 16:26:49.734804126 +0000 UTC m=+0.017202768 container create e03cd2453157691d3fa2664f0725436ae704d683f9f3baafd2fff75e69dedc59 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T16:26:49.827 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:26:49 vm03.local podman[173809]: 2026-03-09 16:26:49.780761121 +0000 UTC m=+0.063159763 container init e03cd2453157691d3fa2664f0725436ae704d683f9f3baafd2fff75e69dedc59 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-deactivate, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T16:26:49.827 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:26:49 vm03.local podman[173809]: 2026-03-09 16:26:49.785798749 +0000 UTC 
m=+0.068197382 container start e03cd2453157691d3fa2664f0725436ae704d683f9f3baafd2fff75e69dedc59 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-deactivate, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default) 2026-03-09T16:26:49.827 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 16:26:49 vm03.local podman[173809]: 2026-03-09 16:26:49.78701206 +0000 UTC m=+0.069410702 container attach e03cd2453157691d3fa2664f0725436ae704d683f9f3baafd2fff75e69dedc59 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-0-deactivate, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223) 2026-03-09T16:26:49.955 DEBUG:teuthology.orchestra.run.vm03:> sudo pkill -f 'journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.0.service' 2026-03-09T16:26:49.991 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T16:26:49.991 INFO:tasks.cephadm.osd.0:Stopped osd.0 2026-03-09T16:26:49.991 INFO:tasks.cephadm.osd.1:Stopping osd.1... 2026-03-09T16:26:49.991 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.1 2026-03-09T16:26:50.140 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:26:50 vm03.local systemd[1]: Stopping Ceph osd.1 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 
2026-03-09T16:26:50.640 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:26:50 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1[147788]: 2026-03-09T16:26:50.143+0000 7fcaa0795640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:26:50.640 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:26:50 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1[147788]: 2026-03-09T16:26:50.143+0000 7fcaa0795640 -1 osd.1 80 *** Got signal Terminated *** 2026-03-09T16:26:50.641 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:26:50 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1[147788]: 2026-03-09T16:26:50.143+0000 7fcaa0795640 -1 osd.1 80 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T16:26:55.438 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:26:55 vm03.local podman[173904]: 2026-03-09 16:26:55.17387497 +0000 UTC m=+5.047245170 container died 9e86c92fc9cd9f12dd8c5176dd68cab6f7a21eb72e2409f58c6f60a2c3f2455c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2) 2026-03-09T16:26:55.438 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:26:55 vm03.local podman[173904]: 2026-03-09 16:26:55.198423737 +0000 UTC m=+5.071793937 container remove 9e86c92fc9cd9f12dd8c5176dd68cab6f7a21eb72e2409f58c6f60a2c3f2455c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T16:26:55.438 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:26:55 vm03.local bash[173904]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1 2026-03-09T16:26:55.438 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:26:55 vm03.local podman[173981]: 2026-03-09 16:26:55.347102628 +0000 UTC m=+0.019815067 container create 4592d4819e0b688f2cac541e3a67b6b334a826163e1f0659a73f78e7390c4dab (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-deactivate, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T16:26:55.438 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:26:55 vm03.local podman[173981]: 2026-03-09 16:26:55.383824748 +0000 UTC m=+0.056537198 container init 4592d4819e0b688f2cac541e3a67b6b334a826163e1f0659a73f78e7390c4dab (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-deactivate, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, OSD_FLAVOR=default) 2026-03-09T16:26:55.438 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:26:55 vm03.local podman[173981]: 2026-03-09 16:26:55.386607517 +0000 UTC m=+0.059319946 container start 4592d4819e0b688f2cac541e3a67b6b334a826163e1f0659a73f78e7390c4dab (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-deactivate, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T16:26:55.439 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 16:26:55 vm03.local podman[173981]: 2026-03-09 16:26:55.389902474 +0000 UTC m=+0.062614913 container attach 4592d4819e0b688f2cac541e3a67b6b334a826163e1f0659a73f78e7390c4dab (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-1-deactivate, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T16:26:55.556 DEBUG:teuthology.orchestra.run.vm03:> sudo pkill -f 'journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.1.service' 2026-03-09T16:26:55.588 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T16:26:55.588 INFO:tasks.cephadm.osd.1:Stopped osd.1 2026-03-09T16:26:55.588 INFO:tasks.cephadm.osd.2:Stopping osd.2... 2026-03-09T16:26:55.588 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.2 2026-03-09T16:26:55.720 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:26:55 vm03.local systemd[1]: Stopping Ceph osd.2 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 2026-03-09T16:26:56.140 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:26:55 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2[152639]: 2026-03-09T16:26:55.719+0000 7fdda26c0640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:26:56.140 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:26:55 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2[152639]: 2026-03-09T16:26:55.719+0000 7fdda26c0640 -1 osd.2 80 *** Got signal Terminated *** 2026-03-09T16:26:56.140 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:26:55 vm03.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2[152639]: 2026-03-09T16:26:55.719+0000 7fdda26c0640 -1 osd.2 80 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T16:27:01.029 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:27:00 vm03.local podman[174078]: 2026-03-09 16:27:00.751224592 +0000 UTC m=+5.044404035 container died 2e666ccd4bf7899b3ebe55258b3f8d81df7ebf8f6b6c4cae6ba5fa2559bcae7c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T16:27:01.029 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:27:00 vm03.local podman[174078]: 2026-03-09 16:27:00.775786613 +0000 UTC m=+5.068966056 container remove 2e666ccd4bf7899b3ebe55258b3f8d81df7ebf8f6b6c4cae6ba5fa2559bcae7c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T16:27:01.029 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:27:00 vm03.local bash[174078]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2 2026-03-09T16:27:01.029 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:27:00 vm03.local podman[174146]: 2026-03-09 16:27:00.939155771 +0000 UTC m=+0.019044727 container create cfe75a462ad7c40fe2f06819c632d55d618477ea96ed0984624abcd46e7079e8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-deactivate, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default) 2026-03-09T16:27:01.029 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:27:00 vm03.local podman[174146]: 2026-03-09 16:27:00.986823627 +0000 UTC m=+0.066712583 container init cfe75a462ad7c40fe2f06819c632d55d618477ea96ed0984624abcd46e7079e8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, OSD_FLAVOR=default, ceph=True) 2026-03-09T16:27:01.029 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:27:00 vm03.local podman[174146]: 2026-03-09 16:27:00.991065075 +0000 UTC m=+0.070954031 container start cfe75a462ad7c40fe2f06819c632d55d618477ea96ed0984624abcd46e7079e8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T16:27:01.029 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:27:00 vm03.local podman[174146]: 2026-03-09 16:27:00.99591334 +0000 UTC m=+0.075802285 container attach cfe75a462ad7c40fe2f06819c632d55d618477ea96ed0984624abcd46e7079e8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-2-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0) 2026-03-09T16:27:01.029 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 16:27:01 vm03.local podman[174146]: 2026-03-09 16:27:00.930326816 +0000 UTC m=+0.010215782 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:27:01.157 DEBUG:teuthology.orchestra.run.vm03:> sudo pkill -f 'journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.2.service' 2026-03-09T16:27:01.193 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T16:27:01.193 INFO:tasks.cephadm.osd.2:Stopped osd.2 2026-03-09T16:27:01.193 INFO:tasks.cephadm.osd.3:Stopping osd.3... 2026-03-09T16:27:01.193 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.3 2026-03-09T16:27:01.526 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:27:01 vm05.local systemd[1]: Stopping Ceph osd.3 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 
2026-03-09T16:27:01.526 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:27:01 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3[116220]: 2026-03-09T16:27:01.295+0000 7f7ea7a11640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:27:01.526 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:27:01 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3[116220]: 2026-03-09T16:27:01.295+0000 7f7ea7a11640 -1 osd.3 80 *** Got signal Terminated *** 2026-03-09T16:27:01.526 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:27:01 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3[116220]: 2026-03-09T16:27:01.295+0000 7f7ea7a11640 -1 osd.3 80 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T16:27:06.598 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:27:06 vm05.local podman[139388]: 2026-03-09 16:27:06.333644182 +0000 UTC m=+5.050143789 container died c052610d74d5d164c540ff0275b7dc94403a3b29408868e729a32cd7e6882091 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default) 2026-03-09T16:27:06.598 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:27:06 vm05.local podman[139388]: 2026-03-09 16:27:06.356143474 +0000 UTC m=+5.072643081 container remove c052610d74d5d164c540ff0275b7dc94403a3b29408868e729a32cd7e6882091 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T16:27:06.599 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:27:06 vm05.local bash[139388]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3 2026-03-09T16:27:06.599 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:27:06 vm05.local podman[139458]: 2026-03-09 16:27:06.506075758 +0000 UTC m=+0.016253073 container create 4d53e1b71c141911e10a31f230a44b1b0774239522bbdca488f01c1eadce5250 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-deactivate, 
org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default) 2026-03-09T16:27:06.599 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:27:06 vm05.local podman[139458]: 2026-03-09 16:27:06.546609195 +0000 UTC m=+0.056786510 container init 4d53e1b71c141911e10a31f230a44b1b0774239522bbdca488f01c1eadce5250 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-deactivate, ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T16:27:06.599 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:27:06 vm05.local podman[139458]: 2026-03-09 16:27:06.550192963 +0000 UTC m=+0.060370269 container start 4d53e1b71c141911e10a31f230a44b1b0774239522bbdca488f01c1eadce5250 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) 2026-03-09T16:27:06.599 INFO:journalctl@ceph.osd.3.vm05.stdout:Mar 09 16:27:06 vm05.local podman[139458]: 2026-03-09 16:27:06.551364787 +0000 UTC m=+0.061542102 container attach 4d53e1b71c141911e10a31f230a44b1b0774239522bbdca488f01c1eadce5250 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-3-deactivate, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T16:27:06.713 DEBUG:teuthology.orchestra.run.vm05:> sudo pkill -f 'journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.3.service' 2026-03-09T16:27:06.750 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T16:27:06.750 INFO:tasks.cephadm.osd.3:Stopped osd.3 2026-03-09T16:27:06.750 INFO:tasks.cephadm.osd.4:Stopping osd.4... 2026-03-09T16:27:06.750 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.4 2026-03-09T16:27:06.897 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:27:06 vm05.local systemd[1]: Stopping Ceph osd.4 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 2026-03-09T16:27:07.276 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:27:06 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4[120548]: 2026-03-09T16:27:06.896+0000 7f83d87f0640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:27:07.276 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:27:06 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4[120548]: 2026-03-09T16:27:06.896+0000 7f83d87f0640 -1 osd.4 80 *** Got signal Terminated *** 2026-03-09T16:27:07.276 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:27:06 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4[120548]: 2026-03-09T16:27:06.896+0000 7f83d87f0640 -1 osd.4 80 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T16:27:12.198 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:27:11 vm05.local podman[139553]: 2026-03-09 16:27:11.931224897 +0000 UTC m=+5.048944865 container died 4115e4720b892c0ebdaf2ba23a5cd3d7b508713d063d0c1af354be329d564f20 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS) 2026-03-09T16:27:12.199 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:27:11 vm05.local podman[139553]: 2026-03-09 16:27:11.952702527 +0000 UTC m=+5.070422495 container remove 4115e4720b892c0ebdaf2ba23a5cd3d7b508713d063d0c1af354be329d564f20 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T16:27:12.199 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:27:11 vm05.local bash[139553]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4 2026-03-09T16:27:12.199 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:27:12 vm05.local podman[139621]: 2026-03-09 16:27:12.107142248 +0000 UTC m=+0.019755037 container create c8e942f7ab1e797dba4e4517c209b2c1a355feedbda6b33030de590b40d8615d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-deactivate, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, CEPH_REF=squid, OSD_FLAVOR=default) 2026-03-09T16:27:12.199 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:27:12 vm05.local podman[139621]: 2026-03-09 16:27:12.150362514 +0000 UTC m=+0.062975303 container init c8e942f7ab1e797dba4e4517c209b2c1a355feedbda6b33030de590b40d8615d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T16:27:12.199 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:27:12 vm05.local podman[139621]: 2026-03-09 16:27:12.15407868 +0000 UTC m=+0.066691469 container start c8e942f7ab1e797dba4e4517c209b2c1a355feedbda6b33030de590b40d8615d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-deactivate, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0) 2026-03-09T16:27:12.199 INFO:journalctl@ceph.osd.4.vm05.stdout:Mar 09 16:27:12 vm05.local podman[139621]: 2026-03-09 16:27:12.155061189 +0000 UTC m=+0.067673978 container attach c8e942f7ab1e797dba4e4517c209b2c1a355feedbda6b33030de590b40d8615d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-4-deactivate, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.build-date=20260223, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid) 2026-03-09T16:27:12.346 DEBUG:teuthology.orchestra.run.vm05:> sudo pkill -f 'journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.4.service' 2026-03-09T16:27:12.385 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T16:27:12.385 INFO:tasks.cephadm.osd.4:Stopped osd.4 2026-03-09T16:27:12.385 INFO:tasks.cephadm.osd.5:Stopping osd.5... 2026-03-09T16:27:12.385 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.5 2026-03-09T16:27:12.460 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:12 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[124742]: 2026-03-09T16:27:12.350+0000 7fcc57ab9640 -1 osd.5 80 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-09T16:26:47.813248+0000 front 2026-03-09T16:26:47.813207+0000 (oldest deadline 2026-03-09T16:27:11.912784+0000) 2026-03-09T16:27:12.460 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:12 vm05.local systemd[1]: Stopping Ceph osd.5 for 2b05df78-1bd2-11f1-83c0-c950214d6edc... 
2026-03-09T16:27:12.776 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:12 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[124742]: 2026-03-09T16:27:12.523+0000 7fcc5bcb2640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T16:27:12.776 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:12 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[124742]: 2026-03-09T16:27:12.523+0000 7fcc5bcb2640 -1 osd.5 80 *** Got signal Terminated *** 2026-03-09T16:27:12.776 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:12 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[124742]: 2026-03-09T16:27:12.523+0000 7fcc5bcb2640 -1 osd.5 80 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T16:27:13.776 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:13 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[124742]: 2026-03-09T16:27:13.365+0000 7fcc57ab9640 -1 osd.5 80 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-09T16:26:47.813248+0000 front 2026-03-09T16:26:47.813207+0000 (oldest deadline 2026-03-09T16:27:11.912784+0000) 2026-03-09T16:27:14.776 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:14 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[124742]: 2026-03-09T16:27:14.389+0000 7fcc57ab9640 -1 osd.5 80 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-09T16:26:47.813248+0000 front 2026-03-09T16:26:47.813207+0000 (oldest deadline 2026-03-09T16:27:11.912784+0000) 2026-03-09T16:27:15.776 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:15 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[124742]: 2026-03-09T16:27:15.357+0000 7fcc57ab9640 -1 osd.5 80 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-09T16:26:47.813248+0000 front 2026-03-09T16:26:47.813207+0000 (oldest deadline 2026-03-09T16:27:11.912784+0000) 2026-03-09T16:27:16.776 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:16 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[124742]: 2026-03-09T16:27:16.367+0000 7fcc57ab9640 -1 osd.5 80 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-09T16:26:47.813248+0000 front 2026-03-09T16:26:47.813207+0000 (oldest deadline 2026-03-09T16:27:11.912784+0000) 2026-03-09T16:27:17.733 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:17 vm05.local ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5[124742]: 2026-03-09T16:27:17.361+0000 7fcc57ab9640 -1 osd.5 80 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-09T16:26:47.813248+0000 front 2026-03-09T16:26:47.813207+0000 (oldest deadline 2026-03-09T16:27:11.912784+0000) 2026-03-09T16:27:17.733 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:17 vm05.local podman[139721]: 2026-03-09 16:27:17.56043958 +0000 UTC m=+5.051747130 container died d93569840b13eade2a9a2c481bc6891f0a7b9e7d517d37373f009afcca5a64cb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T16:27:17.733 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:17 vm05.local podman[139721]: 2026-03-09 16:27:17.582491627 +0000 UTC m=+5.073799176 container remove d93569840b13eade2a9a2c481bc6891f0a7b9e7d517d37373f009afcca5a64cb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T16:27:17.733 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:17 vm05.local bash[139721]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5 2026-03-09T16:27:17.969 DEBUG:teuthology.orchestra.run.vm05:> sudo pkill -f 'journalctl -f -n 0 -u ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.5.service' 2026-03-09T16:27:18.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:17 vm05.local podman[139789]: 2026-03-09 16:27:17.732549124 +0000 UTC m=+0.016218889 container create 2cc80576abc264c283212ba4ec0cb828f2a553c4f330b28729a09c6065ac2f05 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-deactivate, org.label-schema.build-date=20260223, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T16:27:18.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:17 vm05.local podman[139789]: 2026-03-09 16:27:17.785238104 +0000 UTC m=+0.068907879 container init 2cc80576abc264c283212ba4ec0cb828f2a553c4f330b28729a09c6065ac2f05 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base 
Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T16:27:18.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:17 vm05.local podman[139789]: 2026-03-09 16:27:17.788368563 +0000 UTC m=+0.072038338 container start 2cc80576abc264c283212ba4ec0cb828f2a553c4f330b28729a09c6065ac2f05 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-deactivate, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T16:27:18.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:17 vm05.local podman[139789]: 2026-03-09 16:27:17.791099435 +0000 UTC m=+0.074769210 container attach 2cc80576abc264c283212ba4ec0cb828f2a553c4f330b28729a09c6065ac2f05 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T16:27:18.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:17 vm05.local podman[139789]: 2026-03-09 16:27:17.726132263 +0000 UTC m=+0.009802038 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T16:27:18.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:17 vm05.local podman[139807]: 2026-03-09 16:27:17.936290624 +0000 UTC m=+0.011879937 container died 2cc80576abc264c283212ba4ec0cb828f2a553c4f330b28729a09c6065ac2f05 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-deactivate, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, 
CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T16:27:18.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:17 vm05.local podman[139807]: 2026-03-09 16:27:17.953041106 +0000 UTC m=+0.028630409 container remove 2cc80576abc264c283212ba4ec0cb828f2a553c4f330b28729a09c6065ac2f05 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc-osd-5-deactivate, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=squid) 2026-03-09T16:27:18.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:17 vm05.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.5.service: Deactivated successfully. 2026-03-09T16:27:18.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:17 vm05.local systemd[1]: Stopped Ceph osd.5 for 2b05df78-1bd2-11f1-83c0-c950214d6edc. 2026-03-09T16:27:18.026 INFO:journalctl@ceph.osd.5.vm05.stdout:Mar 09 16:27:17 vm05.local systemd[1]: ceph-2b05df78-1bd2-11f1-83c0-c950214d6edc@osd.5.service: Consumed 5.075s CPU time. 2026-03-09T16:27:18.043 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T16:27:18.043 INFO:tasks.cephadm.osd.5:Stopped osd.5 2026-03-09T16:27:18.044 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc --force --keep-logs 2026-03-09T16:27:18.143 INFO:teuthology.orchestra.run.vm03.stdout:Deleting cluster with fsid: 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:27:19.511 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm03.stderr:ceph-fuse[98426]: fuse finished with error 0 and tester_r 0 2026-03-09T16:27:30.207 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc --force --keep-logs 2026-03-09T16:27:30.306 INFO:teuthology.orchestra.run.vm05.stdout:Deleting cluster with fsid: 2b05df78-1bd2-11f1-83c0-c950214d6edc 2026-03-09T16:27:35.118 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-09T16:27:35.148 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-09T16:27:35.177 INFO:tasks.cephadm:Archiving crash dumps... 2026-03-09T16:27:35.177 DEBUG:teuthology.misc:Transferring archived files from vm03:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/crash to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/542/remote/vm03/crash 2026-03-09T16:27:35.177 DEBUG:teuthology.orchestra.run.vm03:> sudo tar c -f - -C /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/crash -- . 
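
The teardown above repeats the same two-step pattern for every daemon: `sudo systemctl stop ceph-<fsid>@<daemon>` on the owning node, then killing the `journalctl -f -n 0 -u <unit>.service` follower that was attached to that unit. A minimal sketch of that loop follows, assuming plain ssh access to the nodes; it is illustrative only and not teuthology's actual implementation (teuthology drives the remotes through its own orchestra layer), with the host/daemon placement copied from this run.

import subprocess

FSID = "2b05df78-1bd2-11f1-83c0-c950214d6edc"  # fsid of this run

def stop_daemon(host, daemon):
    # Stop the cephadm-managed unit, then kill the journalctl follower that
    # was attached to it -- the same two commands the log shows per daemon.
    unit = f"ceph-{FSID}@{daemon}"
    subprocess.run(["ssh", host, f"sudo systemctl stop {unit}"], check=True)
    subprocess.run(
        ["ssh", host, f"sudo pkill -f 'journalctl -f -n 0 -u {unit}.service'"],
        check=False,  # pkill exits non-zero when nothing matched
    )

# host/daemon placement as seen in this run
for host, daemon in [("vm05", "mon.vm05"), ("vm03", "osd.0"), ("vm03", "osd.1"),
                     ("vm03", "osd.2"), ("vm05", "osd.3"), ("vm05", "osd.4"),
                     ("vm05", "osd.5")]:
    stop_daemon(host, daemon)
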
2026-03-09T16:27:35.215 INFO:teuthology.orchestra.run.vm03.stderr:tar: /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/crash: Cannot open: No such file or directory 2026-03-09T16:27:35.215 INFO:teuthology.orchestra.run.vm03.stderr:tar: Error is not recoverable: exiting now 2026-03-09T16:27:35.216 DEBUG:teuthology.misc:Transferring archived files from vm05:/var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/crash to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/542/remote/vm05/crash 2026-03-09T16:27:35.216 DEBUG:teuthology.orchestra.run.vm05:> sudo tar c -f - -C /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/crash -- . 2026-03-09T16:27:35.243 INFO:teuthology.orchestra.run.vm05.stderr:tar: /var/lib/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/crash: Cannot open: No such file or directory 2026-03-09T16:27:35.243 INFO:teuthology.orchestra.run.vm05.stderr:tar: Error is not recoverable: exiting now 2026-03-09T16:27:35.244 INFO:tasks.cephadm:Checking cluster log for badness... 2026-03-09T16:27:35.244 DEBUG:teuthology.orchestra.run.vm03:> sudo egrep '\[ERR\]|\[WRN\]|\[SEC\]' /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v FS_DEGRADED | egrep -v 'filesystem is degraded' | egrep -v FS_INLINE_DATA_DEPRECATED | egrep -v FS_WITH_FAILED_MDS | egrep -v MDS_ALL_DOWN | egrep -v 'filesystem is offline' | egrep -v 'is offline because no MDS' | egrep -v MDS_DAMAGE | egrep -v MDS_DEGRADED | egrep -v MDS_FAILED | egrep -v MDS_INSUFFICIENT_STANDBY | egrep -v MDS_UP_LESS_THAN_MAX | egrep -v 'online, but wants' | egrep -v 'filesystem is online with fewer MDS than max_mds' | egrep -v POOL_APP_NOT_ENABLED | egrep -v 'do not have an application enabled' | egrep -v 'overall HEALTH_' | egrep -v 'Replacing daemon' | egrep -v 'deprecated feature inline_data' | egrep -v MGR_MODULE_ERROR | egrep -v OSD_DOWN | egrep -v 'osds down' | egrep -v 'overall HEALTH_' | egrep -v '\(OSD_DOWN\)' | egrep -v '\(OSD_' | egrep -v 'but it is still running' | egrep -v 'is not responding' | egrep -v MON_DOWN | egrep -v PG_AVAILABILITY | egrep -v PG_DEGRADED | egrep -v 'Reduced data availability' | egrep -v 'Degraded data redundancy' | egrep -v 'pg .* is stuck inactive' | egrep -v 'pg .* is .*degraded' | egrep -v 'pg .* is stuck peering' | head -n 1 2026-03-09T16:27:35.318 INFO:tasks.cephadm:Compressing logs... 
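
The "Checking cluster log for badness" step above is a single shell pipeline: one `egrep` selects `[ERR]`/`[WRN]`/`[SEC]` lines from the cluster log, one `egrep -v` is appended per ignorelist entry, and `head -n 1` keeps only the first surviving line (any output marks the run as failed). A small sketch of how such a pipeline could be assembled from an ignorelist, assuming a `badness_check_cmd` helper of this shape exists only for illustration (it is not teuthology's helper), with the ignorelist abbreviated:

import shlex

def badness_check_cmd(cluster_log, ignorelist):
    # Select ERR/WRN/SEC lines, drop everything matching the ignorelist,
    # keep only the first surviving line.
    parts = [f"sudo egrep '\\[ERR\\]|\\[WRN\\]|\\[SEC\\]' {shlex.quote(cluster_log)}"]
    parts += [f"egrep -v {shlex.quote(pat)}" for pat in ignorelist]
    parts.append("head -n 1")
    return " | ".join(parts)

cmd = badness_check_cmd(
    "/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.log",
    [r"\(MDS_ALL_DOWN\)", "POOL_APP_NOT_ENABLED", "overall HEALTH_"],  # abbreviated
)
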
2026-03-09T16:27:35.318 DEBUG:teuthology.orchestra.run.vm03:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose --
2026-03-09T16:27:35.320 DEBUG:teuthology.orchestra.run.vm05:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose --
2026-03-09T16:27:35.340 INFO:teuthology.orchestra.run.vm03.stderr:find: ‘/var/log/rbd-target-api’: No such file or directory
2026-03-09T16:27:35.341 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/cephadm.log
2026-03-09T16:27:35.341 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mon.vm03.log
2026-03-09T16:27:35.341 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.log
2026-03-09T16:27:35.342 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/cephadm.log: /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mon.vm03.log: gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mgr.vm03.gbgzmu.log
2026-03-09T16:27:35.347 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/cephadm.log
2026-03-09T16:27:35.348 INFO:teuthology.orchestra.run.vm05.stderr:find: ‘/var/log/rbd-target-api’: No such file or directory
2026-03-09T16:27:35.349 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-volume.log
2026-03-09T16:27:35.349 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-client.ceph-exporter.vm05.log
2026-03-09T16:27:35.349 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.log: 87.3% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.log.gz
2026-03-09T16:27:35.350 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/cephadm.log: /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-volume.log: 92.7% -- replaced with /var/log/ceph/cephadm.log.gz
2026-03-09T16:27:35.351 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mgr.vm05.dygxfv.log
2026-03-09T16:27:35.352 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-client.ceph-exporter.vm05.log: 93.9% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-client.ceph-exporter.vm05.log.gz
2026-03-09T16:27:35.352 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mon.vm05.log
2026-03-09T16:27:35.352 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.audit.log
2026-03-09T16:27:35.353 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mgr.vm05.dygxfv.log: gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.audit.log
2026-03-09T16:27:35.355 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mgr.vm03.gbgzmu.log: 90.7% -- replaced with /var/log/ceph/cephadm.log.gz
2026-03-09T16:27:35.355 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.cephadm.log
2026-03-09T16:27:35.358 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.audit.log: 91.4% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.audit.log.gz
2026-03-09T16:27:35.358 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-volume.log
2026-03-09T16:27:35.358 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mon.vm05.log: 92.9% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-volume.log.gz
2026-03-09T16:27:35.359 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.cephadm.log: 85.6% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.cephadm.log.gz
2026-03-09T16:27:35.359 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-client.ceph-exporter.vm03.log
2026-03-09T16:27:35.359 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.log
2026-03-09T16:27:35.360 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-volume.log: gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.0.log
2026-03-09T16:27:35.362 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.audit.log: 91.6% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.audit.log.gz
2026-03-09T16:27:35.364 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.cephadm.log
2026-03-09T16:27:35.364 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.log: gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.3.log
2026-03-09T16:27:35.365 INFO:teuthology.orchestra.run.vm05.stderr: 87.4% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.log.gz
2026-03-09T16:27:35.366 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.cephadm.log: 85.3% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph.cephadm.log.gz
2026-03-09T16:27:35.367 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-client.ceph-exporter.vm03.log: 93.9% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-client.ceph-exporter.vm03.log.gz
2026-03-09T16:27:35.368 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.1.log
2026-03-09T16:27:35.369 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.0.log: gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.2.log
2026-03-09T16:27:35.369 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.1.log: gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mds.cephfs.vm03.kygyjl.log
2026-03-09T16:27:35.370 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mds.cephfs.vm03.kntrco.log
2026-03-09T16:27:35.375 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.3.log: gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.4.log
2026-03-09T16:27:35.376 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.2.log: /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mds.cephfs.vm03.kygyjl.log: 92.7% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-volume.log.gz
2026-03-09T16:27:35.384 INFO:teuthology.orchestra.run.vm05.stderr: 89.1% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mgr.vm05.dygxfv.log.gz
2026-03-09T16:27:35.385 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.5.log
2026-03-09T16:27:35.399 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.4.log: gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mds.cephfs.vm05.jgzfvu.log
2026-03-09T16:27:35.401 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.5.log: gzip -5 --verbose -- /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mds.cephfs.vm05.sqhria.log
2026-03-09T16:27:35.402 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.0.log
2026-03-09T16:27:35.413 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mds.cephfs.vm05.jgzfvu.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.1.log
2026-03-09T16:27:35.419 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mds.cephfs.vm05.sqhria.log: 91.8% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mds.cephfs.vm05.jgzfvu.log.gz
2026-03-09T16:27:35.815 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.1.log: 92.2% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mon.vm05.log.gz
2026-03-09T16:27:35.984 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mds.cephfs.vm03.kntrco.log: /var/log/ceph/ceph-client.0.log: 89.4% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mgr.vm03.gbgzmu.log.gz
2026-03-09T16:27:36.802 INFO:teuthology.orchestra.run.vm03.stderr: 90.6% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mon.vm03.log.gz
2026-03-09T16:27:40.965 INFO:teuthology.orchestra.run.vm05.stderr: 93.9% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.4.log.gz
2026-03-09T16:27:42.480 INFO:teuthology.orchestra.run.vm05.stderr: 93.9% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.5.log.gz
2026-03-09T16:27:43.202 INFO:teuthology.orchestra.run.vm03.stderr: 93.9% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.2.log.gz
2026-03-09T16:27:43.205 INFO:teuthology.orchestra.run.vm05.stderr: 93.9% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.3.log.gz
2026-03-09T16:27:43.566 INFO:teuthology.orchestra.run.vm05.stderr: 94.9% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mds.cephfs.vm05.sqhria.log.gz
2026-03-09T16:27:43.618 INFO:teuthology.orchestra.run.vm03.stderr: 93.8% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.0.log.gz
2026-03-09T16:27:44.097 INFO:teuthology.orchestra.run.vm03.stderr: 93.9% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-osd.1.log.gz
2026-03-09T16:27:47.945 INFO:teuthology.orchestra.run.vm05.stderr:gzip: /var/log/ceph/ceph-client.1.log: file size changed while zipping
2026-03-09T16:27:47.945 INFO:teuthology.orchestra.run.vm05.stderr: 93.2% -- replaced with /var/log/ceph/ceph-client.1.log.gz
2026-03-09T16:27:47.946 INFO:teuthology.orchestra.run.vm05.stderr:
2026-03-09T16:27:47.946 INFO:teuthology.orchestra.run.vm05.stderr:real 0m12.610s
2026-03-09T16:27:47.946 INFO:teuthology.orchestra.run.vm05.stderr:user 0m19.835s
2026-03-09T16:27:47.946 INFO:teuthology.orchestra.run.vm05.stderr:sys 0m0.836s
2026-03-09T16:27:51.628 INFO:teuthology.orchestra.run.vm03.stderr: 94.8% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mds.cephfs.vm03.kntrco.log.gz
2026-03-09T16:27:52.287 INFO:teuthology.orchestra.run.vm03.stderr:gzip: /var/log/ceph/ceph-client.0.log: file size changed while zipping
2026-03-09T16:27:52.416 INFO:teuthology.orchestra.run.vm03.stderr: 93.3% -- replaced with /var/log/ceph/ceph-client.0.log.gz
2026-03-09T16:28:46.933 INFO:teuthology.orchestra.run.vm03.stderr: 93.0% -- replaced with /var/log/ceph/2b05df78-1bd2-11f1-83c0-c950214d6edc/ceph-mds.cephfs.vm03.kygyjl.log.gz
2026-03-09T16:28:46.937 INFO:teuthology.orchestra.run.vm03.stderr:
2026-03-09T16:28:46.937 INFO:teuthology.orchestra.run.vm03.stderr:real 1m11.606s
2026-03-09T16:28:46.937 INFO:teuthology.orchestra.run.vm03.stderr:user 1m21.739s
2026-03-09T16:28:46.937 INFO:teuthology.orchestra.run.vm03.stderr:sys 0m5.474s
2026-03-09T16:28:46.938 INFO:tasks.cephadm:Archiving logs...
2026-03-09T16:28:46.938 DEBUG:teuthology.misc:Transferring archived files from vm03:/var/log/ceph to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/542/remote/vm03/log
2026-03-09T16:28:46.938 DEBUG:teuthology.orchestra.run.vm03:> sudo tar c -f - -C /var/log/ceph -- .
2026-03-09T16:28:51.229 DEBUG:teuthology.misc:Transferring archived files from vm05:/var/log/ceph to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/542/remote/vm05/log
2026-03-09T16:28:51.229 DEBUG:teuthology.orchestra.run.vm05:> sudo tar c -f - -C /var/log/ceph -- .
2026-03-09T16:28:52.300 INFO:tasks.cephadm:Removing cluster...
2026-03-09T16:28:52.300 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc --force
2026-03-09T16:28:52.463 INFO:teuthology.orchestra.run.vm03.stdout:Deleting cluster with fsid: 2b05df78-1bd2-11f1-83c0-c950214d6edc
2026-03-09T16:28:53.199 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 2b05df78-1bd2-11f1-83c0-c950214d6edc --force
2026-03-09T16:28:53.300 INFO:teuthology.orchestra.run.vm05.stdout:Deleting cluster with fsid: 2b05df78-1bd2-11f1-83c0-c950214d6edc
2026-03-09T16:28:53.575 INFO:tasks.cephadm:Removing cephadm ...
2026-03-09T16:28:53.575 DEBUG:teuthology.orchestra.run.vm03:> rm -rf /home/ubuntu/cephtest/cephadm
2026-03-09T16:28:53.596 DEBUG:teuthology.orchestra.run.vm05:> rm -rf /home/ubuntu/cephtest/cephadm
2026-03-09T16:28:53.614 INFO:tasks.cephadm:Teardown complete
2026-03-09T16:28:53.615 DEBUG:teuthology.run_tasks:Unwinding manager install
2026-03-09T16:28:53.617 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested
    yield vars
  File "/home/teuthos/teuthology/teuthology/task/install/__init__.py", line 644, in task
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-09T16:28:53.618 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer...
2026-03-09T16:28:53.618 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-09T16:28:53.638 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-09T16:28:53.693 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system.
2026-03-09T16:28:53.693 DEBUG:teuthology.orchestra.run.vm03:>
2026-03-09T16:28:53.693 DEBUG:teuthology.orchestra.run.vm03:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do
2026-03-09T16:28:53.693 DEBUG:teuthology.orchestra.run.vm03:> sudo yum -y remove $d || true
2026-03-09T16:28:53.693 DEBUG:teuthology.orchestra.run.vm03:> done
2026-03-09T16:28:53.698 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system.
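The MaxWhileTries traceback above is the actual failure of this job: the ceph-fuse daemon never exited during umount_wait, and the retry helper gave up after 50 tries over 300 seconds, i.e. roughly one check every 6 seconds. The Python sketch below only illustrates that bounded-retry pattern; wait_until, the local MaxWhileTries class, and daemon_has_exited are hypothetical stand-ins, not teuthology's own API.

# Illustrative sketch of a bounded-retry wait, assuming nothing beyond what
# the traceback shows (50 tries, ~6 s apart, then MaxWhileTries is raised).
import time


class MaxWhileTries(Exception):
    """Raised when the retry budget is exhausted (stand-in for teuthology's exception)."""


def wait_until(condition, tries=50, sleep=6):
    """Poll `condition` until it returns True; raise after `tries` attempts."""
    for attempt in range(1, tries + 1):
        if condition():
            return attempt
        time.sleep(sleep)
    raise MaxWhileTries(
        f"reached maximum tries ({tries}) after waiting for {tries * sleep} seconds"
    )


if __name__ == "__main__":
    # A condition that never becomes true makes the helper raise, much like the
    # stuck fuse daemon did while the install task was being unwound.
    daemon_has_exited = lambda: False
    try:
        wait_until(daemon_has_exited, tries=3, sleep=0)
    except MaxWhileTries as exc:
        print(exc)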
2026-03-09T16:28:53.698 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-09T16:28:53.698 DEBUG:teuthology.orchestra.run.vm05:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-09T16:28:53.698 DEBUG:teuthology.orchestra.run.vm05:> sudo yum -y remove $d || true 2026-03-09T16:28:53.698 DEBUG:teuthology.orchestra.run.vm05:> done 2026-03-09T16:28:53.976 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:28:53.976 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:28:53.976 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size 2026-03-09T16:28:53.976 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:28:53.976 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-09T16:28:53.976 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 31 M 2026-03-09T16:28:53.976 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies: 2026-03-09T16:28:53.976 INFO:teuthology.orchestra.run.vm05.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-09T16:28:53.977 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:53.977 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-09T16:28:53.977 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:28:53.977 INFO:teuthology.orchestra.run.vm05.stdout:Remove 2 Packages 2026-03-09T16:28:53.977 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:53.977 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 31 M 2026-03-09T16:28:53.977 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-09T16:28:53.981 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-09T16:28:53.981 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-09T16:28:53.997 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-09T16:28:53.997 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-09T16:28:54.030 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-09T16:28:54.033 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 
2026-03-09T16:28:54.034 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:28:54.034 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size 2026-03-09T16:28:54.034 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:28:54.034 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T16:28:54.034 INFO:teuthology.orchestra.run.vm03.stdout: ceph-radosgw x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 31 M 2026-03-09T16:28:54.034 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies: 2026-03-09T16:28:54.034 INFO:teuthology.orchestra.run.vm03.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-09T16:28:54.034 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:54.034 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T16:28:54.034 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:28:54.034 INFO:teuthology.orchestra.run.vm03.stdout:Remove 2 Packages 2026-03-09T16:28:54.034 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:54.034 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 31 M 2026-03-09T16:28:54.034 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T16:28:54.038 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T16:28:54.038 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T16:28:54.053 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 2026-03-09T16:28:54.054 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T16:28:54.054 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T16:28:54.054 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:28:54.055 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-09T16:28:54.055 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-09T16:28:54.055 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 2026-03-09T16:28:54.055 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:54.057 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T16:28:54.069 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T16:28:54.084 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T16:28:54.087 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T16:28:54.112 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T16:28:54.112 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:28:54.112 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 
2026-03-09T16:28:54.112 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-09T16:28:54.112 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 2026-03-09T16:28:54.112 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:54.114 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T16:28:54.123 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T16:28:54.144 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T16:28:54.170 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T16:28:54.170 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T16:28:54.234 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T16:28:54.234 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T16:28:54.235 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T16:28:54.235 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:54.235 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-09T16:28:54.235 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 mailcap-2.1.49-5.el9.noarch 2026-03-09T16:28:54.235 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:54.235 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:28:54.294 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T16:28:54.294 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:54.294 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T16:28:54.294 INFO:teuthology.orchestra.run.vm03.stdout: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 mailcap-2.1.49-5.el9.noarch 2026-03-09T16:28:54.294 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:54.294 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:28:54.469 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 
2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 149 M 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies: 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout:Remove 4 Packages 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 151 M 2026-03-09T16:28:54.470 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-09T16:28:54.473 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-09T16:28:54.473 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-09T16:28:54.499 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-09T16:28:54.499 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-09T16:28:54.509 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 
2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout: ceph-test x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 149 M 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies: 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout:Remove 4 Packages 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 151 M 2026-03-09T16:28:54.510 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T16:28:54.513 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T16:28:54.513 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T16:28:54.540 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 
2026-03-09T16:28:54.540 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T16:28:54.548 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-09T16:28:54.554 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 1/4 2026-03-09T16:28:54.557 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4 2026-03-09T16:28:54.563 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4 2026-03-09T16:28:54.588 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T16:28:54.594 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T16:28:54.602 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 1/4 2026-03-09T16:28:54.605 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4 2026-03-09T16:28:54.609 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4 2026-03-09T16:28:54.625 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T16:28:54.663 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T16:28:54.663 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 1/4 2026-03-09T16:28:54.663 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4 2026-03-09T16:28:54.663 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4 2026-03-09T16:28:54.699 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T16:28:54.699 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 1/4 2026-03-09T16:28:54.699 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4 2026-03-09T16:28:54.699 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4 2026-03-09T16:28:54.720 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4 2026-03-09T16:28:54.720 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:54.720 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-09T16:28:54.720 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 libxslt-1.1.34-12.el9.x86_64 2026-03-09T16:28:54.720 INFO:teuthology.orchestra.run.vm05.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64 2026-03-09T16:28:54.720 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:54.720 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:28:54.759 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4 2026-03-09T16:28:54.760 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:54.760 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T16:28:54.760 INFO:teuthology.orchestra.run.vm03.stdout: ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 libxslt-1.1.34-12.el9.x86_64 2026-03-09T16:28:54.760 INFO:teuthology.orchestra.run.vm03.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64 2026-03-09T16:28:54.760 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:54.760 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:28:54.947 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 
2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout: ceph x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 0 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies: 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mds x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 6.4 M 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 18 M 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout: ceph-osd x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 58 M 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout:Remove 8 Packages 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 84 M 2026-03-09T16:28:54.948 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-09T16:28:54.951 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-09T16:28:54.951 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-09T16:28:54.974 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-09T16:28:54.974 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-09T16:28:54.995 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 
2026-03-09T16:28:54.996 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:28:54.996 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size 2026-03-09T16:28:54.996 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:28:54.996 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T16:28:54.996 INFO:teuthology.orchestra.run.vm03.stdout: ceph x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 0 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies: 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mds x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 6.4 M 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 18 M 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout: ceph-osd x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 58 M 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout:Remove 8 Packages 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 84 M 2026-03-09T16:28:54.997 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T16:28:55.000 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T16:28:55.000 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T16:28:55.013 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-09T16:28:55.015 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 1/8 2026-03-09T16:28:55.024 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 2026-03-09T16:28:55.024 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T16:28:55.037 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8 2026-03-09T16:28:55.037 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:28:55.037 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-09T16:28:55.037 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target". 2026-03-09T16:28:55.037 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target". 
2026-03-09T16:28:55.037 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:55.040 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8 2026-03-09T16:28:55.049 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8 2026-03-09T16:28:55.063 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T16:28:55.064 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 1/8 2026-03-09T16:28:55.064 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-09T16:28:55.064 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service". 2026-03-09T16:28:55.065 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:55.066 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-09T16:28:55.084 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8 2026-03-09T16:28:55.084 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:28:55.084 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-09T16:28:55.084 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target". 2026-03-09T16:28:55.084 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target". 2026-03-09T16:28:55.084 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:55.086 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-09T16:28:55.086 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8 2026-03-09T16:28:55.089 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8 2026-03-09T16:28:55.092 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8 2026-03-09T16:28:55.094 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8 2026-03-09T16:28:55.095 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8 2026-03-09T16:28:55.108 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-09T16:28:55.108 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service". 2026-03-09T16:28:55.108 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:55.109 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-09T16:28:55.116 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 7/8 2026-03-09T16:28:55.116 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:28:55.116 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-09T16:28:55.116 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target". 
2026-03-09T16:28:55.116 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target". 2026-03-09T16:28:55.116 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:55.117 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 7/8 2026-03-09T16:28:55.126 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 7/8 2026-03-09T16:28:55.133 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-09T16:28:55.136 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8 2026-03-09T16:28:55.138 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8 2026-03-09T16:28:55.140 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8 2026-03-09T16:28:55.147 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/8 2026-03-09T16:28:55.147 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:28:55.147 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-09T16:28:55.147 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target". 2026-03-09T16:28:55.147 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target". 2026-03-09T16:28:55.147 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:55.148 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/8 2026-03-09T16:28:55.161 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 7/8 2026-03-09T16:28:55.161 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:28:55.161 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-09T16:28:55.161 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target". 2026-03-09T16:28:55.161 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target". 2026-03-09T16:28:55.161 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:55.161 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 7/8 2026-03-09T16:28:55.170 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 7/8 2026-03-09T16:28:55.190 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/8 2026-03-09T16:28:55.190 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:28:55.190 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-09T16:28:55.190 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target". 2026-03-09T16:28:55.190 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target". 
2026-03-09T16:28:55.190 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:55.191 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/8 2026-03-09T16:28:55.244 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/8 2026-03-09T16:28:55.245 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 1/8 2026-03-09T16:28:55.245 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8 2026-03-09T16:28:55.245 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 3/8 2026-03-09T16:28:55.245 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 4/8 2026-03-09T16:28:55.245 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8 2026-03-09T16:28:55.245 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8 2026-03-09T16:28:55.245 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8 2026-03-09T16:28:55.288 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/8 2026-03-09T16:28:55.288 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 1/8 2026-03-09T16:28:55.288 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8 2026-03-09T16:28:55.288 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 3/8 2026-03-09T16:28:55.289 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 4/8 2026-03-09T16:28:55.289 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8 2026-03-09T16:28:55.289 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8 2026-03-09T16:28:55.289 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8 2026-03-09T16:28:55.299 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8 2026-03-09T16:28:55.299 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:55.299 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-09T16:28:55.299 INFO:teuthology.orchestra.run.vm05.stdout: ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:28:55.299 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:28:55.299 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:28:55.299 INFO:teuthology.orchestra.run.vm05.stdout: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:28:55.299 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs-1.1.0-3.el9.x86_64 2026-03-09T16:28:55.299 INFO:teuthology.orchestra.run.vm05.stdout: libconfig-1.7.2-9.el9.x86_64 2026-03-09T16:28:55.299 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-09T16:28:55.299 INFO:teuthology.orchestra.run.vm05.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-09T16:28:55.299 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:55.299 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 
2026-03-09T16:28:55.338 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8 2026-03-09T16:28:55.338 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:55.339 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T16:28:55.339 INFO:teuthology.orchestra.run.vm03.stdout: ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:28:55.339 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:28:55.339 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:28:55.339 INFO:teuthology.orchestra.run.vm03.stdout: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:28:55.339 INFO:teuthology.orchestra.run.vm03.stdout: ledmon-libs-1.1.0-3.el9.x86_64 2026-03-09T16:28:55.339 INFO:teuthology.orchestra.run.vm03.stdout: libconfig-1.7.2-9.el9.x86_64 2026-03-09T16:28:55.339 INFO:teuthology.orchestra.run.vm03.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-09T16:28:55.339 INFO:teuthology.orchestra.run.vm03.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-09T16:28:55.339 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:55.339 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:28:55.511 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout:============================================================================================ 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout:============================================================================================ 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 22 M 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages: 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 395 k 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 4.5 M 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 736 k 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 87 M 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 66 M 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 563 k 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 12 M 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies: 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 71 M 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 355 k 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 1.5 M 2026-03-09T16:28:55.516 
INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 52 k 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 138 k 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 438 k 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 1.5 M 2026-03-09T16:28:55.516 INFO:teuthology.orchestra.run.vm05.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 640 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M 
2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k 2026-03-09T16:28:55.517 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru 
noarch 0.7-16.el9 @epel 83 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout:============================================================================================ 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout:Remove 84 Packages 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 515 M 2026-03-09T16:28:55.518 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-09T16:28:55.541 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-09T16:28:55.541 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-09T16:28:55.563 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 
2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout:============================================================================================ 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout:============================================================================================ 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 22 M 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout:Removing dependent packages: 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: ceph-immutable-object-cache x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 395 k 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 4.5 M 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-cephadm noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 736 k 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-dashboard noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 87 M 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 66 M 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-rook noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 563 k 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: rbd-mirror x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 12 M 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies: 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 71 M 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: ceph-grafana-dashboards noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 355 k 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 1.5 M 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: ceph-prometheus-alerts noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 52 k 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: ceph-selinux x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 138 k 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: libcephsqlite x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 438 k 2026-03-09T16:28:55.568 INFO:teuthology.orchestra.run.vm03.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: 
libradosstriper1 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 1.5 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 640 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M 2026-03-09T16:28:55.569 
INFO:teuthology.orchestra.run.vm03.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M 2026-03-09T16:28:55.569 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k 2026-03-09T16:28:55.570 
INFO:teuthology.orchestra.run.vm03.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M 2026-03-09T16:28:55.570 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k 2026-03-09T16:28:55.570 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:55.570 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T16:28:55.570 INFO:teuthology.orchestra.run.vm03.stdout:============================================================================================ 2026-03-09T16:28:55.570 INFO:teuthology.orchestra.run.vm03.stdout:Remove 84 Packages 2026-03-09T16:28:55.570 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:55.570 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 515 M 2026-03-09T16:28:55.570 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T16:28:55.594 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T16:28:55.594 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T16:28:55.653 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-09T16:28:55.653 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-09T16:28:55.705 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 2026-03-09T16:28:55.705 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T16:28:55.795 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-09T16:28:55.795 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 1/84 2026-03-09T16:28:55.804 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 1/84 2026-03-09T16:28:55.822 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84 2026-03-09T16:28:55.822 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:28:55.822 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-09T16:28:55.822 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target". 2026-03-09T16:28:55.822 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target". 2026-03-09T16:28:55.822 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:55.823 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84 2026-03-09T16:28:55.840 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84 2026-03-09T16:28:55.847 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T16:28:55.847 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 1/84 2026-03-09T16:28:55.854 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 1/84 2026-03-09T16:28:55.872 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84 2026-03-09T16:28:55.872 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:28:55.872 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 
2026-03-09T16:28:55.872 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target". 2026-03-09T16:28:55.872 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target". 2026-03-09T16:28:55.873 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:55.873 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84 2026-03-09T16:28:55.887 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84 2026-03-09T16:28:55.899 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9. 3/84 2026-03-09T16:28:55.921 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 4/84 2026-03-09T16:28:55.921 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noa 5/84 2026-03-09T16:28:55.933 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noa 5/84 2026-03-09T16:28:55.939 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84 2026-03-09T16:28:55.940 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarc 7/84 2026-03-09T16:28:55.941 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9. 3/84 2026-03-09T16:28:55.954 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarc 7/84 2026-03-09T16:28:55.961 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84 2026-03-09T16:28:55.964 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84 2026-03-09T16:28:55.967 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84 2026-03-09T16:28:55.968 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 4/84 2026-03-09T16:28:55.968 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noa 5/84 2026-03-09T16:28:55.972 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84 2026-03-09T16:28:55.977 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84 2026-03-09T16:28:55.983 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noa 5/84 2026-03-09T16:28:55.986 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84 2026-03-09T16:28:55.988 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84 2026-03-09T16:28:55.988 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarc 7/84 2026-03-09T16:28:56.000 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84 2026-03-09T16:28:56.002 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarc 7/84 2026-03-09T16:28:56.007 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84 2026-03-09T16:28:56.009 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84 
2026-03-09T16:28:56.013 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84 2026-03-09T16:28:56.015 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84 2026-03-09T16:28:56.019 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84 2026-03-09T16:28:56.021 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84 2026-03-09T16:28:56.026 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84 2026-03-09T16:28:56.026 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84 2026-03-09T16:28:56.036 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84 2026-03-09T16:28:56.050 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84 2026-03-09T16:28:56.056 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84 2026-03-09T16:28:56.059 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84 2026-03-09T16:28:56.066 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84 2026-03-09T16:28:56.068 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84 2026-03-09T16:28:56.068 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84 2026-03-09T16:28:56.075 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84 2026-03-09T16:28:56.078 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84 2026-03-09T16:28:56.086 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84 2026-03-09T16:28:56.086 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f 23/84 2026-03-09T16:28:56.096 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f 23/84 2026-03-09T16:28:56.109 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84 2026-03-09T16:28:56.117 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84 2026-03-09T16:28:56.120 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84 2026-03-09T16:28:56.131 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84 2026-03-09T16:28:56.140 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84 2026-03-09T16:28:56.140 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f 23/84 2026-03-09T16:28:56.148 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f 23/84 2026-03-09T16:28:56.193 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84 2026-03-09T16:28:56.223 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84 2026-03-09T16:28:56.241 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84 2026-03-09T16:28:56.247 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : 
python3-cryptography-36.0.1-5.el9.x86_64 27/84 2026-03-09T16:28:56.250 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 28/84 2026-03-09T16:28:56.255 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84 2026-03-09T16:28:56.272 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 29/84 2026-03-09T16:28:56.272 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:28:56.272 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-09T16:28:56.272 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target". 2026-03-09T16:28:56.272 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target". 2026-03-09T16:28:56.272 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:56.273 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 29/84 2026-03-09T16:28:56.285 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84 2026-03-09T16:28:56.287 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 29/84 2026-03-09T16:28:56.292 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 30/84 2026-03-09T16:28:56.295 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 31/84 2026-03-09T16:28:56.298 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 32/84 2026-03-09T16:28:56.302 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 33/84 2026-03-09T16:28:56.303 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84 2026-03-09T16:28:56.305 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 34/84 2026-03-09T16:28:56.309 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 27/84 2026-03-09T16:28:56.310 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84 2026-03-09T16:28:56.312 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 28/84 2026-03-09T16:28:56.315 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 36/84 2026-03-09T16:28:56.335 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 29/84 2026-03-09T16:28:56.335 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:28:56.335 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-09T16:28:56.335 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target". 2026-03-09T16:28:56.335 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target". 
2026-03-09T16:28:56.335 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:56.336 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 29/84 2026-03-09T16:28:56.349 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 29/84 2026-03-09T16:28:56.353 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 30/84 2026-03-09T16:28:56.355 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 31/84 2026-03-09T16:28:56.358 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 32/84 2026-03-09T16:28:56.361 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 33/84 2026-03-09T16:28:56.364 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 37/84 2026-03-09T16:28:56.366 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 34/84 2026-03-09T16:28:56.371 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84 2026-03-09T16:28:56.376 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 36/84 2026-03-09T16:28:56.377 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 38/84 2026-03-09T16:28:56.381 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 39/84 2026-03-09T16:28:56.384 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 40/84 2026-03-09T16:28:56.386 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 41/84 2026-03-09T16:28:56.388 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 42/84 2026-03-09T16:28:56.408 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 43/84 2026-03-09T16:28:56.408 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:28:56.408 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 
2026-03-09T16:28:56.408 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:56.409 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 43/84 2026-03-09T16:28:56.419 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 43/84 2026-03-09T16:28:56.421 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 44/84 2026-03-09T16:28:56.423 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 45/84 2026-03-09T16:28:56.428 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 37/84 2026-03-09T16:28:56.428 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-ply-3.11-14.el9.noarch 46/84 2026-03-09T16:28:56.431 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 47/84 2026-03-09T16:28:56.433 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 48/84 2026-03-09T16:28:56.437 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 49/84 2026-03-09T16:28:56.439 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 38/84 2026-03-09T16:28:56.440 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 50/84 2026-03-09T16:28:56.442 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 39/84 2026-03-09T16:28:56.444 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 51/84 2026-03-09T16:28:56.445 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 40/84 2026-03-09T16:28:56.448 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 41/84 2026-03-09T16:28:56.450 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 42/84 2026-03-09T16:28:56.453 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 52/84 2026-03-09T16:28:56.459 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 53/84 2026-03-09T16:28:56.461 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 54/84 2026-03-09T16:28:56.465 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 55/84 2026-03-09T16:28:56.468 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 56/84 2026-03-09T16:28:56.471 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 43/84 2026-03-09T16:28:56.471 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T16:28:56.471 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 
2026-03-09T16:28:56.471 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:56.472 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 43/84 2026-03-09T16:28:56.476 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 57/84 2026-03-09T16:28:56.483 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 58/84 2026-03-09T16:28:56.483 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 43/84 2026-03-09T16:28:56.486 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 44/84 2026-03-09T16:28:56.488 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 45/84 2026-03-09T16:28:56.490 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 59/84 2026-03-09T16:28:56.491 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-ply-3.11-14.el9.noarch 46/84 2026-03-09T16:28:56.493 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 47/84 2026-03-09T16:28:56.495 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 60/84 2026-03-09T16:28:56.495 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 48/84 2026-03-09T16:28:56.498 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 49/84 2026-03-09T16:28:56.501 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 50/84 2026-03-09T16:28:56.502 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 61/84 2026-03-09T16:28:56.505 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 51/84 2026-03-09T16:28:56.506 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 62/84 2026-03-09T16:28:56.509 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 63/84 2026-03-09T16:28:56.513 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 64/84 2026-03-09T16:28:56.513 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 52/84 2026-03-09T16:28:56.519 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 53/84 2026-03-09T16:28:56.521 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 54/84 2026-03-09T16:28:56.522 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 65/84 2026-03-09T16:28:56.524 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 55/84 2026-03-09T16:28:56.527 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 66/84 2026-03-09T16:28:56.527 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 56/84 2026-03-09T16:28:56.529 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el 67/84 2026-03-09T16:28:56.533 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 57/84 2026-03-09T16:28:56.536 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9 68/84 2026-03-09T16:28:56.539 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : 
python3-pyasn1-0.4.8-7.el9.noarch 58/84 2026-03-09T16:28:56.543 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 69/84 2026-03-09T16:28:56.545 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 59/84 2026-03-09T16:28:56.547 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 70/84 2026-03-09T16:28:56.550 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 71/84 2026-03-09T16:28:56.550 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 60/84 2026-03-09T16:28:56.556 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 61/84 2026-03-09T16:28:56.560 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 62/84 2026-03-09T16:28:56.564 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 63/84 2026-03-09T16:28:56.567 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 64/84 2026-03-09T16:28:56.570 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 72/84 2026-03-09T16:28:56.570 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service". 2026-03-09T16:28:56.570 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:28:56.576 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 72/84 2026-03-09T16:28:56.576 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 65/84 2026-03-09T16:28:56.580 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 66/84 2026-03-09T16:28:56.582 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el 67/84 2026-03-09T16:28:56.590 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9 68/84 2026-03-09T16:28:56.597 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 69/84 2026-03-09T16:28:56.598 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 72/84 2026-03-09T16:28:56.598 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 73/84 2026-03-09T16:28:56.602 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 70/84 2026-03-09T16:28:56.604 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 71/84 2026-03-09T16:28:56.623 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 72/84 2026-03-09T16:28:56.623 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service". 
2026-03-09T16:28:56.624 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:28:56.630 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 72/84 2026-03-09T16:28:56.648 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 72/84 2026-03-09T16:28:56.648 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 73/84 2026-03-09T16:29:02.734 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 73/84 2026-03-09T16:29:02.734 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /sys 2026-03-09T16:29:02.734 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /proc 2026-03-09T16:29:02.734 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /mnt 2026-03-09T16:29:02.734 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /var/tmp 2026-03-09T16:29:02.734 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /home 2026-03-09T16:29:02.734 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /root 2026-03-09T16:29:02.734 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /tmp 2026-03-09T16:29:02.734 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:02.747 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 74/84 2026-03-09T16:29:02.785 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 74/84 2026-03-09T16:29:02.790 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x8 75/84 2026-03-09T16:29:02.794 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 76/84 2026-03-09T16:29:02.798 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 77/84 2026-03-09T16:29:02.801 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 78/84 2026-03-09T16:29:02.804 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84 2026-03-09T16:29:02.805 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_6 80/84 2026-03-09T16:29:02.819 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_6 80/84 2026-03-09T16:29:02.822 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84 2026-03-09T16:29:02.826 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84 2026-03-09T16:29:02.829 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84 2026-03-09T16:29:02.829 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 84/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 84/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 1/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el 3/84 2026-03-09T16:29:02.939 
INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 4/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 5/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarc 6/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noa 7/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f 8/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9. 9/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 10/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9 11/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 12/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84 2026-03-09T16:29:02.939 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84 2026-03-09T16:29:02.940 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84 2026-03-09T16:29:02.940 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 17/84 2026-03-09T16:29:02.940 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 18/84 2026-03-09T16:29:02.940 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 19/84 2026-03-09T16:29:02.940 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 20/84 2026-03-09T16:29:02.940 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 21/84 2026-03-09T16:29:02.940 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_6 22/84 2026-03-09T16:29:02.940 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 23/84 2026-03-09T16:29:02.940 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 24/84 2026-03-09T16:29:02.940 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 25/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 26/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 27/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 28/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 29/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 30/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 31/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: 
Verifying : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x8 32/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 33/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 34/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 35/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 36/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 37/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 38/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 39/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 40/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 41/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 42/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 43/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 44/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 45/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 46/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 47/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 48/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 49/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 50/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 2026-03-09T16:29:02.942 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-09T16:29:02.943 
INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-09T16:29:02.943 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-09T16:29:02.961 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 73/84 2026-03-09T16:29:02.962 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /sys 2026-03-09T16:29:02.962 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /proc 2026-03-09T16:29:02.962 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /mnt 2026-03-09T16:29:02.962 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /var/tmp 2026-03-09T16:29:02.962 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /home 2026-03-09T16:29:02.962 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /root 2026-03-09T16:29:02.962 
INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /tmp 2026-03-09T16:29:02.962 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:02.977 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 74/84 2026-03-09T16:29:03.008 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 74/84 2026-03-09T16:29:03.012 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x8 75/84 2026-03-09T16:29:03.015 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 76/84 2026-03-09T16:29:03.019 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 77/84 2026-03-09T16:29:03.023 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 78/84 2026-03-09T16:29:03.025 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84 2026-03-09T16:29:03.025 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_6 80/84 2026-03-09T16:29:03.026 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 84/84 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: 
libgfortran-11.5.0-14.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-09T16:29:03.027 INFO:teuthology.orchestra.run.vm03.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-09T16:29:03.028 
INFO:teuthology.orchestra.run.vm03.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-ply-3.11-14.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:03.028 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 
2026-03-09T16:29:03.046 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_6 80/84 2026-03-09T16:29:03.048 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84 2026-03-09T16:29:03.055 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84 2026-03-09T16:29:03.059 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84 2026-03-09T16:29:03.059 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 84/84 2026-03-09T16:29:03.163 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 84/84 2026-03-09T16:29:03.163 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 1/84 2026-03-09T16:29:03.163 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84 2026-03-09T16:29:03.163 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el 3/84 2026-03-09T16:29:03.163 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 4/84 2026-03-09T16:29:03.163 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 5/84 2026-03-09T16:29:03.163 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarc 6/84 2026-03-09T16:29:03.163 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noa 7/84 2026-03-09T16:29:03.163 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f 8/84 2026-03-09T16:29:03.163 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9. 
9/84 2026-03-09T16:29:03.163 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 10/84 2026-03-09T16:29:03.163 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9 11/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 12/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 17/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 18/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 19/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 20/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 21/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_6 22/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 23/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 24/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 25/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 26/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 27/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 28/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 29/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 30/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 31/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x8 32/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 33/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 34/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 35/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 36/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 37/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : 
python3-cryptography-36.0.1-5.el9.x86_64 38/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 39/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 40/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 41/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 42/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 43/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 44/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 45/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 46/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 47/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 48/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 49/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 50/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 2026-03-09T16:29:03.164 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: 
Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-09T16:29:03.165 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-09T16:29:03.166 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-09T16:29:03.166 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-09T16:29:03.244 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:03.244 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:03.244 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size 2026-03-09T16:29:03.244 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:03.244 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T16:29:03.244 INFO:teuthology.orchestra.run.vm03.stdout: cephadm noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 218 k 2026-03-09T16:29:03.244 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:03.244 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T16:29:03.244 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:03.244 INFO:teuthology.orchestra.run.vm03.stdout:Remove 1 Package 2026-03-09T16:29:03.244 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:03.244 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 218 k 2026-03-09T16:29:03.245 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T16:29:03.246 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 
2026-03-09T16:29:03.246 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T16:29:03.248 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 2026-03-09T16:29:03.248 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T16:29:03.249 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 84/84 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-09T16:29:03.250 
INFO:teuthology.orchestra.run.vm05.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-09T16:29:03.250 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply-3.11-14.el9.noarch 
2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:03.251 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:03.267 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T16:29:03.267 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 1/1 2026-03-09T16:29:03.385 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 1/1 2026-03-09T16:29:03.431 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 1/1 2026-03-09T16:29:03.432 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:03.432 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T16:29:03.432 INFO:teuthology.orchestra.run.vm03.stdout: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.432 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:03.432 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:03.489 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 
2026-03-09T16:29:03.489 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:03.489 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size 2026-03-09T16:29:03.489 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:03.489 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-09T16:29:03.489 INFO:teuthology.orchestra.run.vm05.stdout: cephadm noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 218 k 2026-03-09T16:29:03.489 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:03.489 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-09T16:29:03.489 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:03.489 INFO:teuthology.orchestra.run.vm05.stdout:Remove 1 Package 2026-03-09T16:29:03.489 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:03.489 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 218 k 2026-03-09T16:29:03.489 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-09T16:29:03.491 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-09T16:29:03.491 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-09T16:29:03.493 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-09T16:29:03.493 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-09T16:29:03.513 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-09T16:29:03.513 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 1/1 2026-03-09T16:29:03.638 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-immutable-object-cache 2026-03-09T16:29:03.639 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T16:29:03.642 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:03.643 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T16:29:03.643 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:03.649 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 1/1 2026-03-09T16:29:03.698 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 1/1 2026-03-09T16:29:03.698 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:03.698 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-09T16:29:03.698 INFO:teuthology.orchestra.run.vm05.stdout: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T16:29:03.698 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:03.698 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:03.844 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr 2026-03-09T16:29:03.844 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T16:29:03.847 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:03.848 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T16:29:03.848 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 
2026-03-09T16:29:03.912 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-immutable-object-cache 2026-03-09T16:29:03.913 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:03.915 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:03.916 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:03.916 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:04.042 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr-dashboard 2026-03-09T16:29:04.043 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T16:29:04.045 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:04.046 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T16:29:04.046 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:04.096 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr 2026-03-09T16:29:04.097 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:04.099 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:04.100 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:04.100 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:04.232 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr-diskprediction-local 2026-03-09T16:29:04.233 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T16:29:04.235 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:04.236 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T16:29:04.236 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:04.286 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-dashboard 2026-03-09T16:29:04.286 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:04.291 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:04.292 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:04.292 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:04.446 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr-rook 2026-03-09T16:29:04.446 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T16:29:04.449 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:04.450 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T16:29:04.450 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:04.505 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-diskprediction-local 2026-03-09T16:29:04.505 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:04.509 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:04.510 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:04.510 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:04.650 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr-cephadm 2026-03-09T16:29:04.651 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T16:29:04.654 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:04.654 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 
2026-03-09T16:29:04.654 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:04.686 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-rook 2026-03-09T16:29:04.687 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:04.689 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:04.690 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:04.690 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:04.847 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:04.847 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:04.847 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size 2026-03-09T16:29:04.847 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:04.847 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T16:29:04.847 INFO:teuthology.orchestra.run.vm03.stdout: ceph-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 2.5 M 2026-03-09T16:29:04.848 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:04.848 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T16:29:04.848 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:04.848 INFO:teuthology.orchestra.run.vm03.stdout:Remove 1 Package 2026-03-09T16:29:04.848 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:04.848 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 2.5 M 2026-03-09T16:29:04.848 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T16:29:04.849 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T16:29:04.849 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T16:29:04.860 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 2026-03-09T16:29:04.860 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T16:29:04.870 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-cephadm 2026-03-09T16:29:04.870 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:04.873 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:04.873 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:04.873 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:04.886 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T16:29:04.902 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 1/1 2026-03-09T16:29:04.981 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 1/1 2026-03-09T16:29:05.037 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 1/1 2026-03-09T16:29:05.038 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:05.038 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T16:29:05.038 INFO:teuthology.orchestra.run.vm03.stdout: ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:05.038 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:05.038 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 
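Note on the records above: the alternating "No match for argument: <pkg>" / "No packages marked for removal" / "Nothing to do" lines show that this removal step issues one dnf transaction per package name on each node, and that a package which is already gone (for example ceph-mgr and its modules, which were pulled out earlier together with ceph-base) is simply skipped rather than treated as an error. A minimal sketch of that per-package pattern follows; the host and package lists are illustrative examples, not the task's actual configuration, and this is not the teuthology implementation itself.

    import subprocess

    # Hypothetical illustration of the per-package removal pattern seen in the
    # log above; hosts and package names are examples only.
    HOSTS = ["vm03", "vm05"]
    PACKAGES = ["cephadm", "ceph-immutable-object-cache", "ceph-mgr",
                "ceph-mgr-dashboard", "ceph-mgr-rook", "ceph-fuse"]

    for host in HOSTS:
        for pkg in PACKAGES:
            # One dnf transaction per package. When the package is already
            # absent, dnf reports "No packages marked for removal" and, as in
            # the log above, the step still ends with "Complete!", so the
            # loop just moves on to the next package.
            subprocess.run(["ssh", host, "sudo", "dnf", "-y", "remove", pkg])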
2026-03-09T16:29:05.091 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:05.092 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:05.092 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size 2026-03-09T16:29:05.092 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:05.092 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-09T16:29:05.092 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 2.5 M 2026-03-09T16:29:05.092 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:05.092 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-09T16:29:05.092 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:05.092 INFO:teuthology.orchestra.run.vm05.stdout:Remove 1 Package 2026-03-09T16:29:05.092 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:05.092 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 2.5 M 2026-03-09T16:29:05.092 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-09T16:29:05.094 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-09T16:29:05.094 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-09T16:29:05.105 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-09T16:29:05.105 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-09T16:29:05.132 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-09T16:29:05.148 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 1/1 2026-03-09T16:29:05.219 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 1/1 2026-03-09T16:29:05.265 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 
2026-03-09T16:29:05.265 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:05.265 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repo Size 2026-03-09T16:29:05.265 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:05.265 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T16:29:05.265 INFO:teuthology.orchestra.run.vm03.stdout: librados-devel x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 456 k 2026-03-09T16:29:05.265 INFO:teuthology.orchestra.run.vm03.stdout:Removing dependent packages: 2026-03-09T16:29:05.265 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-devel x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 139 k 2026-03-09T16:29:05.265 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:05.265 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T16:29:05.265 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:05.265 INFO:teuthology.orchestra.run.vm03.stdout:Remove 2 Packages 2026-03-09T16:29:05.265 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:05.266 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 595 k 2026-03-09T16:29:05.266 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T16:29:05.268 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T16:29:05.268 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T16:29:05.269 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 1/1 2026-03-09T16:29:05.269 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:05.269 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-09T16:29:05.269 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:05.269 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:05.269 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:05.280 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 
2026-03-09T16:29:05.280 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T16:29:05.309 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T16:29:05.312 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T16:29:05.326 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2/2 2026-03-09T16:29:05.394 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2/2 2026-03-09T16:29:05.394 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T16:29:05.439 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2/2 2026-03-09T16:29:05.439 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:05.439 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T16:29:05.440 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:05.440 INFO:teuthology.orchestra.run.vm03.stdout: librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:05.440 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:05.440 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:05.482 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:05.483 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:05.483 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repo Size 2026-03-09T16:29:05.483 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:05.483 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-09T16:29:05.483 INFO:teuthology.orchestra.run.vm05.stdout: librados-devel x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 456 k 2026-03-09T16:29:05.483 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages: 2026-03-09T16:29:05.483 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 139 k 2026-03-09T16:29:05.483 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:05.483 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-09T16:29:05.483 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:05.483 INFO:teuthology.orchestra.run.vm05.stdout:Remove 2 Packages 2026-03-09T16:29:05.483 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:05.483 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 595 k 2026-03-09T16:29:05.483 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-09T16:29:05.485 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-09T16:29:05.485 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-09T16:29:05.497 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 
2026-03-09T16:29:05.497 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-09T16:29:05.526 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-09T16:29:05.529 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T16:29:05.544 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2/2 2026-03-09T16:29:05.609 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2/2 2026-03-09T16:29:05.609 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T16:29:05.652 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repo Size 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs2 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 2.0 M 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout:Removing dependent packages: 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout: python3-cephfs x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 505 k 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies: 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 186 k 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout:Remove 3 Packages 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2/2 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm05.stdout: librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 2.6 M 2026-03-09T16:29:05.653 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T16:29:05.655 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T16:29:05.655 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T16:29:05.668 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 
2026-03-09T16:29:05.668 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T16:29:05.704 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T16:29:05.707 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64 1/3 2026-03-09T16:29:05.709 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x8 2/3 2026-03-09T16:29:05.709 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 3/3 2026-03-09T16:29:05.777 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 3/3 2026-03-09T16:29:05.777 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 1/3 2026-03-09T16:29:05.777 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x8 2/3 2026-03-09T16:29:05.822 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64 3/3 2026-03-09T16:29:05.822 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:05.823 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T16:29:05.823 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:05.823 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:05.823 INFO:teuthology.orchestra.run.vm03.stdout: python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:05.823 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:05.823 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:05.873 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 
2026-03-09T16:29:05.873 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:05.873 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repo Size 2026-03-09T16:29:05.873 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:05.873 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-09T16:29:05.873 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 2.0 M 2026-03-09T16:29:05.873 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages: 2026-03-09T16:29:05.873 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 505 k 2026-03-09T16:29:05.873 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies: 2026-03-09T16:29:05.874 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 186 k 2026-03-09T16:29:05.874 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:05.874 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-09T16:29:05.874 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:05.874 INFO:teuthology.orchestra.run.vm05.stdout:Remove 3 Packages 2026-03-09T16:29:05.874 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:05.874 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 2.6 M 2026-03-09T16:29:05.874 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-09T16:29:05.875 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-09T16:29:05.875 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-09T16:29:05.888 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-09T16:29:05.889 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-09T16:29:05.916 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-09T16:29:05.919 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64 1/3 2026-03-09T16:29:05.920 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x8 2/3 2026-03-09T16:29:05.920 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 3/3 2026-03-09T16:29:05.983 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 3/3 2026-03-09T16:29:05.983 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 1/3 2026-03-09T16:29:05.983 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x8 2/3 2026-03-09T16:29:06.005 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: libcephfs-devel 2026-03-09T16:29:06.005 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T16:29:06.008 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:06.008 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T16:29:06.008 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 
2026-03-09T16:29:06.028 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64 3/3 2026-03-09T16:29:06.028 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:06.028 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-09T16:29:06.028 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.028 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.028 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.028 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:06.028 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:06.203 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: librados2 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 12 M 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout:Removing dependent packages: 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: python3-rados x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 1.1 M 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: python3-rbd x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 1.1 M 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: python3-rgw x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 265 k 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: rbd-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 231 k 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: rbd-nbd x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 490 k 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies: 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: librbd1 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 12 M 2026-03-09T16:29:06.204 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M 2026-03-09T16:29:06.205 INFO:teuthology.orchestra.run.vm03.stdout: librgw2 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 16 M 2026-03-09T16:29:06.205 INFO:teuthology.orchestra.run.vm03.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M 2026-03-09T16:29:06.205 INFO:teuthology.orchestra.run.vm03.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 
2.8 M 2026-03-09T16:29:06.205 INFO:teuthology.orchestra.run.vm03.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k 2026-03-09T16:29:06.205 INFO:teuthology.orchestra.run.vm03.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M 2026-03-09T16:29:06.205 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:06.205 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T16:29:06.205 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T16:29:06.205 INFO:teuthology.orchestra.run.vm03.stdout:Remove 19 Packages 2026-03-09T16:29:06.205 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:06.205 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 73 M 2026-03-09T16:29:06.205 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T16:29:06.209 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T16:29:06.209 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T16:29:06.229 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: libcephfs-devel 2026-03-09T16:29:06.230 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:06.233 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:06.234 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:06.234 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:06.234 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 2026-03-09T16:29:06.234 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T16:29:06.278 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T16:29:06.281 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 1/19 2026-03-09T16:29:06.283 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 2/19 2026-03-09T16:29:06.286 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 3/19 2026-03-09T16:29:06.286 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/19 2026-03-09T16:29:06.302 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/19 2026-03-09T16:29:06.304 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/19 2026-03-09T16:29:06.306 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 6/19 2026-03-09T16:29:06.308 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 7/19 2026-03-09T16:29:06.310 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/19 2026-03-09T16:29:06.313 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 9/19 2026-03-09T16:29:06.313 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 10/19 2026-03-09T16:29:06.328 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 10/19 2026-03-09T16:29:06.328 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 11/19 2026-03-09T16:29:06.328 INFO:teuthology.orchestra.run.vm03.stdout:warning: file /etc/ceph: remove failed: No such file or directory 2026-03-09T16:29:06.328 
INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:06.342 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 11/19 2026-03-09T16:29:06.345 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 12/19 2026-03-09T16:29:06.349 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : re2-1:20211101-20.el9.x86_64 13/19 2026-03-09T16:29:06.354 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 14/19 2026-03-09T16:29:06.357 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 15/19 2026-03-09T16:29:06.360 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 16/19 2026-03-09T16:29:06.364 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 17/19 2026-03-09T16:29:06.366 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 18/19 2026-03-09T16:29:06.381 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 19/19 2026-03-09T16:29:06.436 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: librados2 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 12 M 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages: 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 1.1 M 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 1.1 M 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 265 k 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 231 k 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 490 k 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies: 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: librbd1 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 12 M 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M 
2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: librgw2 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 16 M 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout:Remove 19 Packages 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 73 M 2026-03-09T16:29:06.438 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-09T16:29:06.442 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-09T16:29:06.442 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-09T16:29:06.453 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 19/19 2026-03-09T16:29:06.453 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 2/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 3/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 4/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 5/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 6/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 7/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 8/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 9/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 10/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 11/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 12/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 13/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 14/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 15/19 2026-03-09T16:29:06.454 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 16/19 2026-03-09T16:29:06.455 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : 
rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 17/19 2026-03-09T16:29:06.455 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : re2-1:20211101-20.el9.x86_64 18/19 2026-03-09T16:29:06.467 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-09T16:29:06.467 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-09T16:29:06.497 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 19/19 2026-03-09T16:29:06.497 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:06.497 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T16:29:06.497 INFO:teuthology.orchestra.run.vm03.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: re2-1:20211101-20.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T16:29:06.498 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 
2026-03-09T16:29:06.514 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-09T16:29:06.518 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 1/19 2026-03-09T16:29:06.520 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 2/19 2026-03-09T16:29:06.523 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 3/19 2026-03-09T16:29:06.523 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/19 2026-03-09T16:29:06.538 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/19 2026-03-09T16:29:06.541 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/19 2026-03-09T16:29:06.544 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 6/19 2026-03-09T16:29:06.546 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 7/19 2026-03-09T16:29:06.549 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/19 2026-03-09T16:29:06.552 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 9/19 2026-03-09T16:29:06.552 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 10/19 2026-03-09T16:29:06.567 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 10/19 2026-03-09T16:29:06.567 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 11/19 2026-03-09T16:29:06.567 INFO:teuthology.orchestra.run.vm05.stdout:warning: file /etc/ceph: remove failed: No such file or directory 2026-03-09T16:29:06.567 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:06.585 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 11/19 2026-03-09T16:29:06.588 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 12/19 2026-03-09T16:29:06.594 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : re2-1:20211101-20.el9.x86_64 13/19 2026-03-09T16:29:06.599 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 14/19 2026-03-09T16:29:06.604 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 15/19 2026-03-09T16:29:06.607 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 16/19 2026-03-09T16:29:06.610 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 17/19 2026-03-09T16:29:06.612 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 18/19 2026-03-09T16:29:06.629 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 19/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 19/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 2/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 3/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : 
libpmemobj-1.12.1-1.el9.x86_64 4/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 5/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 6/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 7/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 8/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 9/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 10/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 11/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 12/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 13/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 14/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 15/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 16/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 17/19 2026-03-09T16:29:06.709 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : re2-1:20211101-20.el9.x86_64 18/19 2026-03-09T16:29:06.721 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: librbd1 2026-03-09T16:29:06.721 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T16:29:06.724 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:06.725 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T16:29:06.725 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 
2026-03-09T16:29:06.752 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 19/19 2026-03-09T16:29:06.752 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:06.752 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: re2-1:20211101-20.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-09T16:29:06.753 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:06.902 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: python3-rados 2026-03-09T16:29:06.902 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T16:29:06.905 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:06.906 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T16:29:06.906 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:06.965 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: librbd1 2026-03-09T16:29:06.965 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:06.968 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:06.969 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:06.969 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:07.098 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: python3-rgw 2026-03-09T16:29:07.098 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 
2026-03-09T16:29:07.101 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:07.102 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T16:29:07.102 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:07.171 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-rados 2026-03-09T16:29:07.171 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:07.174 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:07.175 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:07.175 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:07.271 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: python3-cephfs 2026-03-09T16:29:07.272 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T16:29:07.274 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:07.275 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T16:29:07.275 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:07.353 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-rgw 2026-03-09T16:29:07.353 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:07.356 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:07.357 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:07.357 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:07.467 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: python3-rbd 2026-03-09T16:29:07.467 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T16:29:07.470 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:07.471 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T16:29:07.471 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:07.548 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-cephfs 2026-03-09T16:29:07.548 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:07.552 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:07.552 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:07.552 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:07.652 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: rbd-fuse 2026-03-09T16:29:07.652 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T16:29:07.655 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:07.656 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T16:29:07.656 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:07.731 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-rbd 2026-03-09T16:29:07.731 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:07.735 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:07.736 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:07.736 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 
2026-03-09T16:29:07.836 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: rbd-mirror 2026-03-09T16:29:07.837 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T16:29:07.840 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:07.841 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T16:29:07.841 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:07.920 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: rbd-fuse 2026-03-09T16:29:07.922 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:07.923 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:07.924 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:07.924 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:08.025 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: rbd-nbd 2026-03-09T16:29:08.025 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T16:29:08.028 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T16:29:08.029 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T16:29:08.029 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T16:29:08.061 DEBUG:teuthology.orchestra.run.vm03:> sudo yum clean all 2026-03-09T16:29:08.106 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: rbd-mirror 2026-03-09T16:29:08.106 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:08.109 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:08.110 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:08.110 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-09T16:29:08.193 INFO:teuthology.orchestra.run.vm03.stdout:56 files removed 2026-03-09T16:29:08.219 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f /etc/yum.repos.d/ceph.repo 2026-03-09T16:29:08.247 DEBUG:teuthology.orchestra.run.vm03:> sudo yum clean expire-cache 2026-03-09T16:29:08.303 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: rbd-nbd 2026-03-09T16:29:08.303 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-09T16:29:08.307 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-09T16:29:08.307 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-09T16:29:08.307 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 
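The long run of dnf output above (one full removal transaction per host, then a string of "No match for argument ... Nothing to do" results) is what per-package removal looks like once the first transaction has already erased the dependent packages. A minimal sketch of that behaviour, with an illustrative package list and a hypothetical run_on_host() helper rather than teuthology's actual install-task code:

    import subprocess

    # Illustrative subset of the package names seen in the log above.
    CEPH_PACKAGES = [
        "librados2", "librbd1", "python3-rados", "python3-rbd", "python3-rgw",
        "python3-cephfs", "rbd-fuse", "rbd-mirror", "rbd-nbd", "libcephfs-devel",
    ]

    def run_on_host(args):
        # Stand-in for running the command on a remote test node.
        return subprocess.run(args, check=False)

    def remove_ceph_packages():
        for name in CEPH_PACKAGES:
            # "dnf -y remove <name>" succeeds even when the package is already
            # gone; dnf just prints "No match for argument" and "Nothing to do",
            # which is the pattern repeated on vm03 and vm05 above.
            run_on_host(["sudo", "dnf", "-y", "remove", name])

Removing one name at a time keeps the step idempotent: the first matching package pulls its dependents into a single transaction, and every later call is a harmless no-op.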
2026-03-09T16:29:08.336 DEBUG:teuthology.orchestra.run.vm05:> sudo yum clean all
2026-03-09T16:29:08.415 INFO:teuthology.orchestra.run.vm03.stdout:Cache was expired
2026-03-09T16:29:08.415 INFO:teuthology.orchestra.run.vm03.stdout:0 files removed
2026-03-09T16:29:08.439 DEBUG:teuthology.parallel:result is None
2026-03-09T16:29:08.472 INFO:teuthology.orchestra.run.vm05.stdout:56 files removed
2026-03-09T16:29:08.496 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-09T16:29:08.520 DEBUG:teuthology.orchestra.run.vm05:> sudo yum clean expire-cache
2026-03-09T16:29:08.691 INFO:teuthology.orchestra.run.vm05.stdout:Cache was expired
2026-03-09T16:29:08.691 INFO:teuthology.orchestra.run.vm05.stdout:0 files removed
2026-03-09T16:29:08.712 DEBUG:teuthology.parallel:result is None
2026-03-09T16:29:08.712 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm03.local
2026-03-09T16:29:08.713 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm05.local
2026-03-09T16:29:08.713 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-09T16:29:08.713 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-09T16:29:08.740 DEBUG:teuthology.orchestra.run.vm05:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-09T16:29:08.743 DEBUG:teuthology.orchestra.run.vm03:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-09T16:29:08.811 DEBUG:teuthology.parallel:result is None
2026-03-09T16:29:08.813 DEBUG:teuthology.parallel:result is None
2026-03-09T16:29:08.813 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-09T16:29:08.816 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-09T16:29:08.816 DEBUG:teuthology.orchestra.run.vm03:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-09T16:29:08.853 DEBUG:teuthology.orchestra.run.vm05:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-09T16:29:08.867 INFO:teuthology.orchestra.run.vm03.stderr:bash: line 1: ntpq: command not found
2026-03-09T16:29:08.871 INFO:teuthology.orchestra.run.vm05.stderr:bash: line 1: ntpq: command not found
2026-03-09T16:29:08.875 INFO:teuthology.orchestra.run.vm05.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-09T16:29:08.875 INFO:teuthology.orchestra.run.vm05.stdout:===============================================================================
2026-03-09T16:29:08.875 INFO:teuthology.orchestra.run.vm05.stdout:^+ formularfetischisten.de 2 7 377 96 -81us[ -88us] +/- 40ms
2026-03-09T16:29:08.875 INFO:teuthology.orchestra.run.vm05.stdout:^* sv1.ggsrv.de 2 6 377 29 +26us[ +22us] +/- 18ms
2026-03-09T16:29:08.875 INFO:teuthology.orchestra.run.vm05.stdout:^+ 217.145.111.106 2 7 377 98 -55us[ -62us] +/- 66ms
2026-03-09T16:29:08.875 INFO:teuthology.orchestra.run.vm05.stdout:^- mx03.fischl-online.de 2 6 377 31 +2987us[+2984us] +/- 63ms
2026-03-09T16:29:08.875 INFO:teuthology.orchestra.run.vm03.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-09T16:29:08.875 INFO:teuthology.orchestra.run.vm03.stdout:===============================================================================
2026-03-09T16:29:08.875 INFO:teuthology.orchestra.run.vm03.stdout:^+ formularfetischisten.de 2 7 377 95 -86us[ -103us] +/- 40ms
2026-03-09T16:29:08.875 INFO:teuthology.orchestra.run.vm03.stdout:^* sv1.ggsrv.de 2 6 377 29 +27us[ +24us] +/- 18ms
2026-03-09T16:29:08.875 INFO:teuthology.orchestra.run.vm03.stdout:^+ 217.145.111.106 2 7 377 94 -80us[ -97us] +/- 66ms
2026-03-09T16:29:08.875 INFO:teuthology.orchestra.run.vm03.stdout:^- mx03.fischl-online.de 2 6 377 30 +2990us[+2987us] +/- 63ms
2026-03-09T16:29:08.877 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-09T16:29:08.880 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-09T16:29:08.880 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-09T16:29:08.883 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-09T16:29:08.885 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-09T16:29:08.888 INFO:teuthology.task.internal:Duration was 1454.584539 seconds
2026-03-09T16:29:08.888 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-09T16:29:08.891 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-09T16:29:08.891 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-09T16:29:08.919 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-09T16:29:08.960 INFO:teuthology.orchestra.run.vm03.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-09T16:29:08.960 INFO:teuthology.orchestra.run.vm05.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-09T16:29:09.306 INFO:teuthology.task.internal.syslog:Checking logs for errors...
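The final clock-skew check a few entries above falls back from ntpq to chronyc (ntpq is not installed on these CentOS 9 nodes, so the chronyc sources table is what gets recorded), and the trailing "|| true" keeps teardown from failing if neither tool exists. A rough sketch of that fallback, assuming a generic run-over-SSH helper rather than teuthology's orchestra API:

    import subprocess

    def check_clock_skew(host):
        # Same fallback chain as the logged command: prefer ntpq, fall back to
        # chronyc, and never fail teardown if neither tool is present.
        cmd = ("PATH=/usr/bin:/usr/sbin ntpq -p || "
               "PATH=/usr/bin:/usr/sbin chronyc sources || true")
        result = subprocess.run(["ssh", host, cmd], capture_output=True, text=True)
        return result.stdout

    for host in ("vm03.local", "vm05.local"):
        print(check_clock_skew(host))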
2026-03-09T16:29:09.306 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm03.local
2026-03-09T16:29:09.306 DEBUG:teuthology.orchestra.run.vm03:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-09T16:29:09.336 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm05.local
2026-03-09T16:29:09.336 DEBUG:teuthology.orchestra.run.vm05:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-09T16:29:09.371 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-09T16:29:09.371 DEBUG:teuthology.orchestra.run.vm03:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T16:29:09.380 DEBUG:teuthology.orchestra.run.vm05:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T16:29:10.159 INFO:teuthology.task.internal.syslog:Compressing syslogs...
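The kern.log scan above is easiest to read as one "interesting" pattern followed by a chain of exclusions, keeping only the first surviving line: any output at all marks the job. A sketch of how such a pipeline could be assembled from an ignore list; the helper name and the trimmed exclusion list are illustrative, not teuthology's exact code:

    from shlex import quote

    INTERESTING = r"\bBUG\b|\bINFO\b|\bDEADLOCK\b"
    IGNORE = [
        "task .* blocked for more than .* seconds",
        "lockdep is turned off",
        "ceph-create-keys: INFO",
        "ceph-crash",
        # ... remaining exclusions from the logged command omitted for brevity
    ]

    def kern_log_scan(path="/home/ubuntu/cephtest/archive/syslog/kern.log"):
        cmd = f"grep -E --binary-files=text {quote(INTERESTING)} {quote(path)}"
        for pattern in IGNORE:
            cmd += f" | grep -v {quote(pattern)}"
        # head -n 1: a single surviving line is enough to flag the run.
        return cmd + " | head -n 1"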
2026-03-09T16:29:10.160 DEBUG:teuthology.orchestra.run.vm03:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-09T16:29:10.161 DEBUG:teuthology.orchestra.run.vm05:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-09T16:29:10.186 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-09T16:29:10.186 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-09T16:29:10.187 INFO:teuthology.orchestra.run.vm03.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: gzip -5 --verbose -- 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-09T16:29:10.187 INFO:teuthology.orchestra.run.vm03.stderr: /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T16:29:10.187 INFO:teuthology.orchestra.run.vm03.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-09T16:29:10.188 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-09T16:29:10.189 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-09T16:29:10.189 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5/home/ubuntu/cephtest/archive/syslog/kern.log: --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T16:29:10.189 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-09T16:29:10.190 INFO:teuthology.orchestra.run.vm05.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-09T16:29:10.356 INFO:teuthology.orchestra.run.vm05.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 97.9% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-09T16:29:10.383 INFO:teuthology.orchestra.run.vm03.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 97.1% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-09T16:29:10.385 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-09T16:29:10.389 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-09T16:29:10.389 DEBUG:teuthology.orchestra.run.vm03:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-09T16:29:10.453 DEBUG:teuthology.orchestra.run.vm05:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-09T16:29:10.481 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-09T16:29:10.485 DEBUG:teuthology.orchestra.run.vm03:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-09T16:29:10.496 DEBUG:teuthology.orchestra.run.vm05:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-09T16:29:10.524 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern = core
2026-03-09T16:29:10.552 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern = core
2026-03-09T16:29:10.567 DEBUG:teuthology.orchestra.run.vm03:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-09T16:29:10.595 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T16:29:10.595 DEBUG:teuthology.orchestra.run.vm05:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-09T16:29:10.625 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T16:29:10.626 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-09T16:29:10.629 INFO:teuthology.task.internal:Transferring archived files...
2026-03-09T16:29:10.629 DEBUG:teuthology.misc:Transferring archived files from vm03:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/542/remote/vm03
2026-03-09T16:29:10.629 DEBUG:teuthology.orchestra.run.vm03:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-09T16:29:10.668 DEBUG:teuthology.misc:Transferring archived files from vm05:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/542/remote/vm05
2026-03-09T16:29:10.668 DEBUG:teuthology.orchestra.run.vm05:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-09T16:29:10.705 INFO:teuthology.task.internal:Removing archive directory...
2026-03-09T16:29:10.705 DEBUG:teuthology.orchestra.run.vm03:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-09T16:29:10.708 DEBUG:teuthology.orchestra.run.vm05:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-09T16:29:10.763 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-09T16:29:10.767 INFO:teuthology.task.internal:Not uploading archives.
2026-03-09T16:29:10.767 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-09T16:29:10.770 INFO:teuthology.task.internal:Tidying up after the test...
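The coredump unwind above resets kernel.core_pattern, prunes cores attributed to systemd-sysusers (expected noise on these images), and only removes the directory if nothing real remains; the follow-up "test -e" returning 1 on both nodes means no cores were kept. A rough Python equivalent of the pruning logic, offered only as a reading aid for the shell one-liner, not as teuthology's implementation:

    import subprocess
    from pathlib import Path

    def prune_coredumps(coredump_dir="/home/ubuntu/cephtest/archive/coredump"):
        root = Path(coredump_dir)
        for core in root.glob("**/*"):
            if not core.is_file():
                continue
            # Drop cores identified as systemd-sysusers crashes; keep the rest.
            file_type = subprocess.run(
                ["file", str(core)], capture_output=True, text=True).stdout
            if "systemd-sysusers" in file_type:
                core.unlink()
        try:
            root.rmdir()  # succeeds only if no real cores were kept
        except OSError:
            pass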
2026-03-09T16:29:10.770 DEBUG:teuthology.orchestra.run.vm03:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-09T16:29:10.772 DEBUG:teuthology.orchestra.run.vm05:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-09T16:29:10.787 INFO:teuthology.orchestra.run.vm03.stdout: 8532144 0 drwxr-xr-x 3 ubuntu ubuntu 19 Mar 9 16:29 /home/ubuntu/cephtest
2026-03-09T16:29:10.787 INFO:teuthology.orchestra.run.vm03.stdout: 50577507 0 d--------- 2 ubuntu ubuntu 6 Mar 9 16:12 /home/ubuntu/cephtest/mnt.0
2026-03-09T16:29:10.787 INFO:teuthology.orchestra.run.vm03.stderr:find: ‘/home/ubuntu/cephtest/mnt.0’: Permission denied
2026-03-09T16:29:10.787 INFO:teuthology.orchestra.run.vm03.stderr:rmdir: failed to remove '/home/ubuntu/cephtest': Directory not empty
2026-03-09T16:29:10.806 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T16:29:10.806 ERROR:teuthology.run_tasks:Manager failed: internal.base
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 48, in base
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 53, in base
    run.wait(
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 485, in wait
    proc.wait()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 161, in wait
    self._raise_for_status()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 181, in _raise_for_status
    raise CommandFailedError(
teuthology.exceptions.CommandFailedError: Command failed on vm03 with status 1: 'find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest'
2026-03-09T16:29:10.807 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-09T16:29:10.810 DEBUG:teuthology.run_tasks:Exception was not quenched, exiting: MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-09T16:29:10.811 INFO:teuthology.run:Summary data:
description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/yes kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{reef} 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/no 3-inline/yes 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
duration: 1454.584538936615
failure_reason: reached maximum tries (50) after waiting for 300 seconds
flavor: default
owner: kyr
status: fail
success: false
2026-03-09T16:29:10.811 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-09T16:29:10.838 INFO:teuthology.run:FAIL
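The failure_reason in the summary comes from the first traceback above: ceph_fuse's umount_wait() waits on the fuse daemon process through teuthology's bounded-wait helper, the client never exited within the budget, and the stale mnt.0 mount point then left /home/ubuntu/cephtest non-empty, so the internal.base cleanup failed as well. A simplified sketch of that bounded-wait pattern, with illustrative names rather than the exact contextutil API:

    import time

    class MaxWhileTries(Exception):
        pass

    def wait_until(condition, tries=50, timeout=300, sleep=6):
        """Poll `condition` until it returns True or the try/timeout budget runs out."""
        start = time.time()
        for attempt in range(1, tries + 1):
            if condition():
                return attempt
            if time.time() - start >= timeout:
                break
            time.sleep(sleep)
        raise MaxWhileTries(
            f"reached maximum tries ({tries}) after waiting for {timeout} seconds")

    # e.g. wait_until(lambda: not fuse_daemon_running()) would reproduce the
    # error string recorded in failure_reason above if the daemon never exits.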